# [Official] AMD R9 Radeon FURY / NANO / X / Pro DUO FIJI Owners Club



## hyp36rmax

*Members Form*

https://docs.google.com/forms/d/1nmupQHqtF7msLmrb99hO835-OiwqrHDGKmAXxpL_wrY/viewform?embedded=true

*Members List*

https://docs.google.com/spreadsheets/d/1tz-37WCYD1DOhu9LCV3N6pkrWqu4ThHIAZfUlBs2LDY/pubhtml?widget=true&headers=false


----------



## hyp36rmax

*Resources and Helpful Information*



Spoiler: AMD Radeon R9 FURY X Video Overviews



*AMD Radeon™ R9 Fury X Graphics: Product Overview:*






*Introducing the AMD Radeon™ R9 Fury X GPU: Pushing the boundaries of what is possible:*






*AMD Radeon™ R9 Fury X Graphics: Industrial Design*











Spoiler: Personalizing your AMD Radeon R9 Fury X graphics card





The AMD Radeon™ R9 Fury X graphics card industrial design was created with the goal of embodying a professional, elegant and simple design. Using multiple pieces of die-cast aluminum finished in black nickel and soft-touch black, the full metal construction makes the graphics card feel as good as it looks. We are extremely pleased with the outcome of the design, but we also understand there are always different design ideas out there.

During the process of creating the industrial design for the AMD Radeon™ R9 Fury X graphics card, we encountered a variety of unique perspectives within AMD on how it should look. These differing opinions made us think: what if we could enable our customers to apply their own creativity to our design? To do this we incorporated a removable front plate on the AMD Radeon™ R9 Fury X graphics card to allow for customer creativity. Below you will find a link to download the 3D model of the face plate to help get you started on designing your own 3D-printed or CNC front plate.

Please take all necessary precautions prior to removing the front plate from the graphics card; these include but are not limited to:


- Do not remove the front plate while the graphics card is installed inside a system
- Do not remove the front plate while the graphics card is powered or operational
- Ensure your workspace is clear of debris and that appropriate electrostatic discharge (ESD) protection is in place
- The front plate can be removed by removing the four hex screws from the front of the graphics card as illustrated below
- Do not remove any other screws or modify any other components on the graphics card
- Use a proper hex key or screwdriver to remove the screws from the front plate to avoid damaging the screws
- When reinstalling the screws, do not over-tighten them




Be sure to share with us your creations and we'll highlight some of our favorites.



*Download the front plate 3D model HERE*

IMPORTANT: AMD's product warranty does not cover damage to your graphics card or system caused in whole or in part by removing, modifying or reinstalling the AMD Radeon Fury X faceplate, which activities you agree to carry out at your own risk. AMD will not provide replacement faceplates for any faceplates lost or damaged, nor will AMD be liable for any damages to the graphics card or your system caused during the removal, modification or reinstallation of the faceplate.

RadeonR9FuryXfrontplate.zip (295 KB .zip file)


*Source:* Link





Spoiler: Signature Code:






Spoiler: Official AMD R9 Radeon FURY / NANO / X/ X2 Owners Club



Code:




[center][url="http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club"][B]Official AMD R9 Radeon FURY / NANO / X/ X2 Owners Club[/B][/url][/center]


----------



## hyp36rmax

*Reviews and Benchmarks*




*AMD R9 Radeon Fury X Reviews*


| Site Name | Article Name | Date | Link |
| --- | --- | --- | --- |
| TechPowerUp | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Toms Hardware | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Hardware Canucks | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Guru 3D | The Radeon R9 Fury X Analyzed: AMD Unleashes Fury | 06/24/15 | Link |
| Bit-Tech | AMD Radeon R9 Fury X Review | 06/24/15 | Link |
| OverClock3D.net | AMD R9 Fury X Review | 06/24/15 | Link |
| HardOCP | AMD Radeon R9 Fury X Video Card Review | 06/24/15 | Link |
| Hexus | AMD Radeon R9 Fury X 4GB | 06/24/15 | Link |
| VMOD Tech | AMD RADEON™ R9 FURY X 4GB HBM 4096-bit Review | 06/24/15 | Link |
| TechFrag | AMD Radeon R9 Fury X Official Benchmark Results Released | 06/24/15 | Link |
| Hardware.FR | AMD Radeon R9 Fury X : le GPU Fiji et sa mémoire HBM en test | 06/24/15 | Link |
| SWE Clockers | AMD Radeon R9 Fury X | 06/24/15 | Link |
| Hardware.INFO | AMD Radeon R9 Fury X review: AMD's new flagship graphics card | 06/24/15 | Link |
| Jagat Review | Review Radeon R9 Fury X: AMD Gaming VGA Best When It! | 06/24/15 | Link |
| HispaZone | AMD Radeon R9 Fury X Series | 06/24/15 | Link |
| Forbes | Radeon R9 Fury X Review: This Is AMD At Their Best | 06/24/15 | Link |
| PC Gamer | AMD Radeon R9 Fury X tested: not quite a 980 Ti killer | 06/24/15 | Link |
| TechReport | AMD's Radeon R9 Fury X graphics card reviewed: The red team vents its Fury | 06/24/15 | Link |
| PC World | AMD Radeon R9 Fury X graphics card review: AMD's long-awaited 4K powerhouse | 06/24/15 | Link |





*AMD R9 Radeon Fury Reviews*


| Site Name | Article Name | Date | Link |
| --- | --- | --- | --- |





*AMD R9 Radeon Nano Reviews*


| Site Name | Article Name | Date | Link |
| --- | --- | --- | --- |





*AMD R9 Radeon Fury X Dual Reviews*


| Site Name | Article Name | Date | Link |
| --- | --- | --- | --- |


----------



## gagac1971

nice thread, subbed...


----------



## hyp36rmax

Are we getting closer...?


----------



## DividebyZERO

I want to join this thread officially so bad I'd sell my grandma if it made it happen faster. While I am unsure about HBM effects on throughput vs vram size and high resolutions with games. I am hoping for the best on that front. If not I may be joining 290x 8gb until 3xx comes with 8 gb


----------



## hyp36rmax

Quote:


> Originally Posted by *DividebyZERO*
> 
> I want to join this thread officially so bad I'd sell my grandma if it made it happen faster. While I am unsure about HBM effects on throughput vs vram size and high resolutions with games. I am hoping for the best on that front. If not I may be joining 290x 8gb until 3xx comes with 8 gb


Same here! I'm looking forward to a possible announcement at E3 and any information from Computex 2015.


----------



## hyp36rmax

*+ Added the official "AMD High Bandwidth Memory (HBM)" material to the "What is HBM" section of the OP*


----------



## Roaches

Subbed for some future goodness


----------



## DividebyZERO

Just curious, since pricing isn't official yet: what price range would be acceptable for people? Say it was an 8GB 390X? Or, since we don't know, even 4GB?

I am looking at wanting to get 3 of these, but of course price may change that.


----------



## glenn37216

$849.00 (390X) is a quote coming from insiders at Newegg and some Chinese distribution outlets. Sources are the same ones who leaked the correct 295X2 price months before it came out.

Personally, I wouldn't buy these cards at launch. AMD has a history of taking their time perfecting drivers for newly released cards: delayed drivers for AAA titles, buggy crossfire profiles... Knowing AMD's history, we have a lot of frustration to look forward to.


----------



## DividebyZERO

Yeah, I am used to AMD now, plus I'm locked into AMD with my setup. $849 is really too much for my tastes. That said, I bought my 290's on launch day and liked them even with driver issues. I don't think people will have any issues with one of these 390X's, which will most likely be the common usage scenario. I don't expect it to be perfect out of the gate; it's just the way the PC market is nowadays.

If they do cost $850 then I may hold off unless they are just too irresistible on performance. Really my biggest factors are going to be price and extreme-resolution performance, 4K and up. So I also wonder what the secondary top card will be like price/perf-wise.


----------



## glenn37216

I don't think the 390X will be released at the same time as the 390. If I remember correctly it will be a few weeks later than the other 3xx series cards. Price-wise I believe the 390 will be a sub-$600.00 card, so that could be affordable if the performance in 4K is there.

- I'm waiting on the 395X2 ($1500.00 MSRP) which will be out months later. By then the drivers and crossfire bugs should be ironed out... (hopefully)


----------



## Sgt Bilko

I think something just popped up on the radar














https://twitter.com/i/web/status/601739457960763392


----------



## jackalopeater

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I think something just popped up on the radar
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://twitter.com/i/web/status/601739457960763392


I want to touch it's tra-la-la!


----------



## DividebyZERO

Mmmmmmm


----------



## hyp36rmax

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I think something just popped up on the radar
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://twitter.com/i/web/status/601739457960763392


So much awesome! I'm in for two! Maybe even three if they drop a 395X2 and a 390X for some Tri-fire in an mATX build. I wonder what the power requirements will be.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I think something just popped up on the radar
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://twitter.com/i/web/status/601739457960763392
> 
> 
> 
> So much awesome! I'm in for two! Maybe even three if they drop a 395X2 and a 390X for some Tri-fire in an mATX build
> 
> 
> 
> 
> 
> 
> 
> I wonder what the power requirements will be.

Judging by how small it is, HBM really made an impact on the card's physical size, so I imagine power consumption might be down as well









Loving the Metal shroud though.......so much awesome in that alone


----------



## DarthBaggins

I wana play w/ one


----------



## hyp36rmax

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Judging by how small it is HBM really made an impact on the Cards physical size so i imagine power consumption might be down as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Loving the Metal shroud though.......so much awesome in that alone


With AMD using an interposer to stack the HBM right next to the GPU, it sure is pretty cool to see how it affects the overall design of the card. I'm looking forward to an R9 395X2 dual-GPU PCB being about 10.5 inches. Imagine that! hahaha


----------



## Sgt Bilko

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Judging by how small it is HBM really made an impact on the Cards physical size so i imagine power consumption might be down as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Loving the Metal shroud though.......so much awesome in that alone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With AMD maximizing the HBM with an interposer to stack the ram within the GPU, it sure is pretty cool to see how it affects the overall design of the GPU. I'm looking forward to a R9 395X2 dual GPU PCB being about 10.5 inches. Imagine that! hahaha

Now that would be awesome.....definitely means no more sacrificing power for a tiny LAN rig anymore!


----------



## zealord

we are close. very close now









I hope Lisa Su doesn't go mental with the price


----------



## DividebyZERO

Quote:


> Originally Posted by *zealord*
> 
> we are close. very close now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope Lisa Su doesn't go mental with the price


Depending on where you sit on that issue, the competition sets the bar high on price. So chances are from that standpoint it will be less.

I will be ready to put these bad boys to the test and it won't be a measly 4k or 5k


----------



## $k1||z_r0k

leaked specs from WCCFT guys...



source: http://wccftech.com/amd-fiji-die-reconstructed-hbms-huge-gpu-uncovered/


----------



## ssateneth

I just want to see the naked PCB and die. I want to know exactly how much real estate is freed, what the die looks like, whether we can expect a super-short graphics card - all of this!

BTW, for the people asking/begging for 8GB versions: it absolutely will NOT happen until the 490X (or later). This is first-generation HBM, and first-generation HBM has already had its specs and limitations set in stone, which include a 4GB maximum capacity. I'm too lazy to google it for other people, but later generations will get a higher total capacity.
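The 4GB ceiling mentioned above falls straight out of gen-1 HBM arithmetic. A minimal sketch using the publicly documented HBM1/Fiji figures (4-die stacks, 2 Gb dies, four 1024-bit stacks at 1 Gb/s per pin) - these numbers come from public HBM material, not from this thread:

```python
# First-generation HBM arithmetic for a Fiji-class card (published HBM1 figures).
dies_per_stack = 4           # HBM1 stacks are 4 DRAM dies high
die_capacity_gbit = 2        # 2 Gb per die in gen-1 HBM
stacks_on_interposer = 4     # Fiji pairs the GPU with four stacks

stack_capacity_gb = dies_per_stack * die_capacity_gbit / 8   # 1 GB per stack
total_vram_gb = stacks_on_interposer * stack_capacity_gb     # 4 GB hard ceiling

bus_width_per_stack = 1024   # bits per stack
pin_rate_gbps = 1.0          # 1 Gb/s per pin (500 MHz DDR)
bw_per_stack_gbs = bus_width_per_stack * pin_rate_gbps / 8   # 128 GB/s
total_bw_gbs = stacks_on_interposer * bw_per_stack_gbs       # 512 GB/s

print(f"{total_vram_gb:.0f} GB VRAM, {total_bw_gbs:.0f} GB/s")  # -> 4 GB VRAM, 512 GB/s
```

So with gen-1 parts, more capacity would mean more stacks on an already-crowded interposer, which is why 8GB has to wait for later HBM generations.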


----------



## DividebyZERO

Quote:


> Originally Posted by *ssateneth*
> 
> *I just want to see the naked PCB and die.* I want to know exactly how much real estate is freed, what the die looks like, can we expect a super short gfx card, all of this!
> 
> BTW, people asking/begging for 8GB versions, it absolutely will NOT happen until 490X (or later). This is first generation HBM and first generation HBM has already had its specs and limitations set in stone, which include 4GB maximum capacity. I'm too lazy to google it for other people, but later generations will get a higher total capacity.


Not gonna lie, I read the bold part above and laughed. I know you meant the GPU core, but I kept thinking die as in death from seeing a naked PCB.


----------



## bobbavet

I don't think I have been so excited for a GPU since the HD6990 & GTX690. The thought of two of these pre-waterblocked cards side by side is rustling my jimmies!


----------



## tp4tissue

I'm not really that excited about HBM.

Cuz... it's cool and all, and makes the card smaller, that's great... but performance wasn't really bottlenecked by memory bandwidth.

The shader increase ultimately makes up the bulk of the increased drawing performance.

So in that sense, it's still a linear race to GET BIG.

Of course, the next step is them stacking more shaders vertically.

Here's hoping we get a good 50% bump in games from the 390X.
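On the bandwidth point, the raw numbers are easy to sanity-check. A back-of-envelope comparison of the 290X's published GDDR5 configuration against the rumored Fiji HBM setup (figures taken from public spec sheets, not confirmed anywhere in this thread):

```python
# Rough peak-memory-bandwidth comparison: Hawaii's GDDR5 vs. Fiji's rumored HBM.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: bus width x per-pin rate / 8."""
    return bus_width_bits * data_rate_gbps / 8

r9_290x = bandwidth_gbs(512, 5.0)    # 512-bit GDDR5 at 5 Gbps -> 320 GB/s
fiji_hbm = bandwidth_gbs(4096, 1.0)  # 4 x 1024-bit HBM stacks at 1 Gbps -> 512 GB/s

print(f"290X: {r9_290x:.0f} GB/s, Fiji: {fiji_hbm:.0f} GB/s "
      f"({fiji_hbm / r9_290x:.1f}x)")
```

A 1.6x jump in raw bandwidth is substantial, which supports the point above: the shader count, not bandwidth, is where the bulk of any gaming uplift would have to come from.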


----------



## DividebyZERO

Yeah, I'm really hoping for some reviews soon, because there's so much unknown right now.


----------



## $k1||z_r0k

The 390/390X will not be Fiji but in fact rebranded Hawaii (Grenada) cards...

http://wccftech.com/amd-radeon-r9-390x-hawaii-8-gb-gddr5-spotted-radeon-r9-380-tonga-r9-370-pitcairn-rebrands/

So I think we need a new Fiji and Fiji X Information and Owners Club thread.


----------



## hyp36rmax

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> 390/390X will not be Fiji but in fact Hawaii rebranded (Grenada) cards...
> 
> http://wccftech.com/amd-radeon-r9-390x-hawaii-8-gb-gddr5-spotted-radeon-r9-380-tonga-r9-370-pitcairn-rebrands/
> 
> so i think we need a new Fiji and Fiji X Information and Owners Club thread.


Thank you for the insight. I will adjust the thread title accordingly to reflect FIJI once we have official news from AMD. With that said, this is still a WCCFTech rumor.

If this rumor becomes fact, my Sapphire VAPOR-X R9 290X 8GB Crossfire cards are still relevant hahaha. However, the goal of this thread is FIJI.


----------



## semitope

Really hope these aren't rebrands. I mean, if they "rebrand" Hawaii and it's faster than the 980, OK I guess. They are likely going for full DX12 compatibility and FreeSync compatibility, so I doubt it's going to be business as usual. Tahiti will likely be gone, Hawaii modified. Who knows what Fiji is.


----------



## flopper

Quote:


> Originally Posted by *semitope*
> 
> Really hope these aren't rebrands. I mean, if they "rebrand" hawaii and its faster than 980, ok I guess. They are likely going for full dx12 compatibility and freesync compatibility so I doubt its going to be busines as usual. tahiti will likely be gone. Hawaii modified. Who knows what fiji is


At higher resolutions the 290X is the better choice vs the 980.
Soon we'll know what the new stuff is about.


----------



## jackalopeater

It's coming #AMDRTP


----------



## ladcrooks

alright then! Who's got the time machine - Dr Who?


----------



## The Mac

One week left

latest rumor is they will be 4xx, as 3xx was given to OEMs.


----------



## hyp36rmax

Quote:


> Originally Posted by *The Mac*
> 
> One week left
> 
> latest rumor is they will be 4xx, as 3xx was given to OEMs.


This makes sense, considering a re-branded 290X as a 390X would be in line with OEMs - a similar occurrence to the HD 7000 / HD 8000 series. One week to find out for sure!


----------



## The Mac




----------



## hyp36rmax

I saw that also. Soon!!!


----------



## Newbie2009

Have to say if the new 3xx series are all rebrands apart from the top two I will be disappoint.


----------



## the9quad

If the top card is as fast as a Titan X and comes in at <$800, and I can sell these 290X's for $150 each, I will get one lol. Would love to be able to say goodbye and good riddance to crossfire.


----------



## ElectroGeek007

Hopefully soon-to-be Fiji owner, so excited!







The battle between this and the 980ti will be something to behold.





----------



## hamzta09

Any idea when we get to see -real- benchmarks?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Any idea when we get to see -real- benchmarks?


probably when it's released?


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> probably when it's released?


And that is?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> probably when it's released?
> 
> 
> 
> And that is?

Whenever AMD decide to release it, Either Computex or E3 I'd say.


----------



## Rickles

Quote:


> Originally Posted by *hamzta09*
> 
> Any idea when we get to see -real- benchmarks?


Launch day, whether it's a soft launch or hardware is actually available.

I'll probably be in on one from whoever gives the longest warranty. My 7970 experience has been mostly awful.


----------



## OkanG

Getting excited to see what AMD has to offer. Would be nice if this thread could contain all the info when it gets leaked/released







Subbed


----------



## Newbie2009

Would have to be something special for me to buy, like the HD5870 launch.


----------



## Legion123

some more pics:

http://www.overclock3d.net/articles/gpu_displays/amd_fiji_pictured_again/1

I can't wait, it's supposed to be released the 3rd week of June. I was going to buy a Titan X this week but I think I will wait a few more weeks - can't wait to get rid of a crossfire setup that never works properly.


----------



## Sumner Rol

I don't think I've been this excited for a GPU launch in a long time. I'm just exhausted with Nvidia and their sales practices. They did the same thing with the 980 as they did with the 780. They release the card as their flagship, then 9 months later: "Hey guys, here's a ti version that's much better than the flagship! We had the technology to make the original just as good, but we like to reward our loyal customers who buy our cards on release by making them feel like fools."

I'm so glad I didn't buy into the 9xx series. The 970, even with the memory debacle, was and is a solid card. But then you have the 980. A good card yes, but then they bring out the Titan X and wave that in our faces. Now before the new card smell fades from the Titan X we have the 980ti on imminent release. And if the leaked info is true the 980ti is around 23% more powerful than an OC'd 980 and only about 5% less powerful than a Titan X.

And yes, I know Nvidia is a business, and a business needs to make money. But the way they are running their business is going to have an adverse effect on customer loyalty, especially if AMD can really deliver with the 390/390X. In my opinion Team Green has become solely about the green, and I look forward to adding more red to my system.


----------



## Newbie2009

Quote:


> Originally Posted by *Sumner Rol*
> 
> I don't think I've been this excited for a GPU launch in a long time. I'm just exhausted with Nvidia and their sales practices. They did the same thing with the 980 as they did with the 780. They release the card as their flagship then 9mos later: "Hey guys here's a ti version that's much better than the flagship! We had the technology to make the original just as good but we like to reward our loyal customers who buy our cards on release by making them feel like fools."
> 
> I'm so glad I didn't buy into the 9xx series. The 970, even with the memory debacle, was and is a solid card. But then you have the 980. A good card yes, but then they bring out the Titan X and wave that in our faces. Now before the new card smell fades from the Titan X we have the 980ti on imminent release. And if the leaked info is true the 980ti is around 23% more powerful than an OC'd 980 and only about 5% less powerful than a Titan X.
> 
> And yes I know Nvidia is a business, and a business needs to make money. But the way they are running their business is going to have an adverse affect on customer loyalty, especially if AMD can really deliver with the 390/390x. In my opinion Team Green has become solely about the green, and I look forward to adding more red to my system.


Yeah, even diehard green people find it hard to defend them these days. The new Titan does perform well though; I'd give it a 3.5 out of 4.


----------



## velocityx

And then there's the 980 Metal? What's that about?


----------



## zealord

Quote:


> Originally Posted by *The Mac*


That could be the rebrands. I've heard the 390X could be a rebranded 290X with 8GB of GDDR5 VRAM or something. Wish he had said Fiji high-end GPU.


----------



## Sumner Rol

Quote:


> Originally Posted by *Newbie2009*
> 
> Yeah even diehard green people find it hard to defend them these days. The new titan does perform well though, i'd give it a 3.5 out of 4.


It's true their products and technology are good, it's just the manner in which they sell the products that bothers me.


----------



## hyp36rmax

Quote:


> Originally Posted by *Sumner Rol*
> 
> I don't think I've been this excited for a GPU launch in a long time. I'm just exhausted with Nvidia and their sales practices. They did the same thing with the 980 as they did with the 780. They release the card as their flagship then 9mos later: "Hey guys here's a ti version that's much better than the flagship! We had the technology to make the original just as good but we like to reward our loyal customers who buy our cards on release by making them feel like fools."
> 
> I'm so glad I didn't buy into the 9xx series. The 970, even with the memory debacle, was and is a solid card. But then you have the 980. A good card yes, but then they bring out the Titan X and wave that in our faces. Now before the new card smell fades from the Titan X we have the 980ti on imminent release. And if the leaked info is true the 980ti is around 23% more powerful than an OC'd 980 and only about 5% less powerful than a Titan X.
> 
> And yes I know Nvidia is a business, and a business needs to make money. But the way they are running their business is going to have an adverse affect on customer loyalty, especially if AMD can really deliver with the 390/390x. In my opinion Team Green has become solely about the green, and I look forward to adding more red to my system.


Definitely a rehash of the high end GTX 700 series with the GTX 980 -> GTX 980Ti. As an owner of a GTX 780Ti, GTX 970, Crossfire HD7970's, and Crossfire R9 290X's (all on different builds in my household) I can totally understand. Both AMD and Nvidia can really deliver in performance, however Nvidia's business tactics lately have been pretty sketchy.


----------



## hamzta09

So... are all the cards just rebrands? What's the hype for, if so?

Where's the Fiji? Will it be another Titan X-style card for $1200? But with a mere 4GB of RAM?


----------



## hyp36rmax

Quote:


> Originally Posted by *hamzta09*
> 
> So.. are all cards just rebrands? Whats the hype for if so?
> 
> Wheres the Fiji? Will it be another TitanX style card for $1200? But with a mere 4GB ram?


We'll have solid information once Computex 2015 and E3 2015 have commenced. Rumors say the whole 300 series are rebrands, possibly OEM-only like the HD8000 series, and that the FIJI GPU is taking on a moniker to challenge Nvidia's Titan series exclusively. At this point it could be any of those three, and then some. Time will tell...


----------



## Newbie2009

Quote:


> Originally Posted by *hyp36rmax*
> 
> We'll have solid information once Computex 2015 and E3 2015 have commenced. Rumors say that the whole 300 Series are rebrands, possibly OEM such as the HD8000 series, as well as the FIJI GPU taking on a moniker to challenge Nvidia's Titan Series exclusively. At this point it can be any of those three and some. Time will tell...


Nothing at Computex apparently; E3 launch.

http://videocardz.com/55734/amd-radeon-300-series-to-arrive-on-june-16th


----------



## hyp36rmax

Quote:


> Originally Posted by *Newbie2009*
> 
> Nothing in computex apparently, E3 Launch.
> 
> http://videocardz.com/55734/amd-radeon-300-series-to-arrive-on-june-16th


Of course, but I'm sure we'll get *some* information at Computex 2015, as that is just a couple of weeks from E3 (06/16 - 06/18). The closer we get to that date, the more pertinent the information. Exciting times indeed, friends!


----------



## Sumner Rol

Quote:


> Originally Posted by *Newbie2009*
> 
> Nothing in computex apparently, E3 Launch.
> 
> http://videocardz.com/55734/amd-radeon-300-series-to-arrive-on-june-16th


Looking that way:

http://www.ign.com/wikis/e3/PC_Gaming_Show_at_E3_2015


----------



## The Mac

as opposed to this?

http://www.amdcomputex.com.tw/

" AMD executives and special guests will be introducing new, comprehensive details on AMD's 2015 product lineup."


----------



## hyp36rmax

Quote:


> Originally Posted by *The Mac*
> 
> as opposed to this?
> 
> http://www.amdcomputex.com.tw/
> 
> " AMD executives and special guests will be introducing new, comprehensive details on AMD's 2015 product lineup."


Yep along with

Quote:


> AMD would like to cordially invite you to our press conference at Computex 2015. You will get the opportunity to see AMD's latest products and leading-edge technologies, while experiencing immersive visual, computing, and gaming demonstrations. *During this event, AMD executives and special guests will be introducing new, comprehensive details on AMD's 2015 product lineup.*


----------



## The Mac

I think they are going to announce reference designs at Computex and AIB designs at E3.


----------



## semitope

Yeah, sounds like info at Computex, then everything else at E3. Not sure why, but if it's to ride the E3 train and get people buying right away, it makes some sort of sense. They had better be launched for sale right after or during the conference.

The fact that they are keeping everything so tight suggests it's a real big deal to me. They are likely doing everything at GloFo, so that's a change. That process is better, and they have to redo all their cards for it to some degree, so maybe some good improvements across the board. They are unlikely to be "rebrands" in the truest sense if they have to be redone at GlobalFoundries with improvements; a card may resemble a particular chip, but that's not really a rebrand.

we're in for something good.









Square Enix is in that list. I think we are going to see DX12 stuff, and Deus Ex with TressFX might be there.


----------



## hamzta09

Quote:


> Originally Posted by *semitope*
> 
> yeah. sounds like info at computex then everything else at e3. Not sure why but if its to ride on the e3 train and get people buying right away then makes some sort of sense. They better be launched for sale right after or during the conference.
> 
> The fact that they are keeping everything so tight suggests its a real big deal to me. They are likely doing everything at glofo so thats a change. That process is better and they have to redo all their cards for it to some degree so maybe some good improvements across the board. They are unlikely to be "rebrands" in the truest sense if they have to be at global foundries with improvements. they may resemble a particular chip, but thats not really a rebrand.
> 
> we're in for something good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Square enix is in that list. I think we are going to see dx 12 stuff and deus ex with tressFX might be there.


AMD is doing E3 with PCGamer and other PC Devs.


----------



## bobbavet

Quote:


> Originally Posted by *hamzta09*
> 
> AMD is doing E3 with PCGamer and other PC Devs.


This confirms what I believe will happen.

That Fiji XT will be released with Star Wars Battlefront gameplay.

With SWBF being "optimised" for HBM. hahahha. Stick it, Green. lols


----------



## hamzta09

Quote:


> Originally Posted by *bobbavet*
> 
> This confirms what I believe will happen.
> 
> That Fiji XT will be released with Star Wars Battlefront gameplay.
> 
> With SWBF being "optimised" for HBM. hahahha. Stick it Green. lols


Guess thats why DICE got Fiji.


----------



## Legion123

It's going to be called Fury??!!

http://www.overclock3d.net/articles/gpu_displays/amd_fiji_might_be_renamed_amd_fury/1


----------



## Sgt Bilko

Quote:


> Originally Posted by *Legion123*
> 
> its going to be called fury??!!
> 
> http://www.overclock3d.net/articles/gpu_displays/amd_fiji_might_be_renamed_amd_fury/1


I like it!


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I like it!


But but... couldn't they do better naming because because ya know marketing... its just not as good as the other guys names... but but... why waste money changing the name and having to advertise a new name... but but...

Hehehe, it could be called anything as long as its performance speaks for itself for me.


----------



## OkanG

Quote:


> Originally Posted by *DividebyZERO*
> 
> But but... couldn't they do better naming because because ya know marketing... its just not as good as the other guys names... but but... why waste money changing the name and having to advertise a new name... but but...
> 
> Hehehe, it could be called anything as long as its performance speaks for itself for me.


AMD Toothbender Dragonslayer 9001x_2135 edition 3 Coming this Summer to a Newegg near you!


----------



## DividebyZERO

Quote:


> Originally Posted by *OkanG*
> 
> AMD Toothbender Dragonslayer 9001x_2135 edition 3 Coming this Summer to a Newegg near you!


Now we're talking lol


----------



## BinaryDemon

Quote:


> Originally Posted by *Legion123*
> 
> its going to be called fury??!!
> 
> http://www.overclock3d.net/articles/gpu_displays/amd_fiji_might_be_renamed_amd_fury/1


Yep. Looks like it's time to rename this club, given that the R9 390/390X cards don't appear to be Fiji-based anymore.


----------



## hyp36rmax

Quote:


> Originally Posted by *BinaryDemon*
> 
> Yep. Looks like it's time to rename this Club given that the R9 390/390x cards dont appear to be Fiji based anymore.


Looks like it! If all this information about AMD resurrecting the FURY series is correct, it would make a very nice throwback to their monster card from the late '90s.





The new Fury MAXX could possibly be the dual-FIJI card, with the single-FIJI card being called the FURY PRO.


----------



## The Mac

http://item.taobao.com/item.htm?spm=2013.1.20141001.2.N99McW&id=36935228083&scm=1007.10115.6103.i44869761305&pvid=f0d3a4de-2c1e-4b65-8747-45cdece72c9c

scroll halfway down the page.


----------



## hamzta09

Quote:


> Originally Posted by *The Mac*
> 
> http://item.taobao.com/item.htm?spm=2013.1.20141001.2.N99McW&id=36935228083&scm=1007.10115.6103.i44869761305&pvid=f0d3a4de-2c1e-4b65-8747-45cdece72c9c
> 
> scroll halfway down the page.


8GB HBM.

Well I wonder what their source is.

Probably placeholder specs tho.


----------



## The Mac

Most likely from TPU.


----------



## szeged

Nice work on the club thread, hyp36rmax.

Subbing to this just in case AMD does blow me away enough to ditch the Titans.


----------



## hyp36rmax

Quote:


> Originally Posted by *szeged*
> 
> nice work on the club thread hyp36rmax.
> 
> subbing to this just in case amd does blow me away enough to ditch the titans


Thank you sir! Looking forward to this GPU as well. Things are about to get interesting haha.


----------



## OkanG

The 390x looks very.. short









Also, I wonder how much VRAM the 390 will have if the 390x has 8GB


----------



## The Mac

latest rumor is it isn't a 390X; it's a new brand outside the normal nomenclature, i.e. "Radeon Fury"

Which means there may not be a "cut down" Fiji.


----------



## hyp36rmax

Quote:


> Originally Posted by *The Mac*
> 
> latest rumor is it isn't a 390X; it's a new brand outside the normal nomenclature, i.e. "Radeon Fury"
> 
> Which means there may not be a "cut down" Fiji.


They may have three SKUs ready for the Radeon Fury series: Fury (Fiji), Fury Pro (Fiji XT), and Fury MAXX (dual-GPU)


----------



## bobbavet

Anyone find benchmarks of the original Fury? Particularly against the opposition.
Ya just don't go around "name dropping" an old card.


----------



## hyp36rmax

Quote:


> Originally Posted by *bobbavet*
> 
> Anyone find benchmarks of original Fury? Particularly against opposition.
> Ya just don't go around "name dropping" an old card.


Haha yea! I found some old Tom's Hardware Benches from the original Ati Rage Fury Pro: *Link*

These are different times though, and those results wouldn't really reflect what to expect from the new FIJI.


----------



## OkanG

Quote:


> Originally Posted by *hyp36rmax*
> 
> Haha yea! I found some old Tom's Hardware Benches from the original Ati Rage Fury Pro: *Link*
> 
> These are new times and would not really reflect what to expect with the new FIJI.


Damn that's interesting. I feel like such a youngster now









It gets beat in Quake 3 Arena on High Quality in 640x480 by the Creative Labs TNT2. I guess Creative Labs > AMD?









http://www.tomshardware.com/reviews/ati-rage-fury-pro-review,133-9.html


----------



## hyp36rmax

Quote:


> Originally Posted by *OkanG*
> 
> Damn that's interesting. I feel like such a youngster now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It gets beat in Quake 3 Arena on High Quality in 640x480 by the Creative Labs TNT2. I guess Creative Labs > AMD?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.tomshardware.com/reviews/ati-rage-fury-pro-review,133-9.html


The Creative Labs TNT2 is built around an Nvidia GPU.


----------



## OkanG

Quote:


> Originally Posted by *hyp36rmax*
> 
> The Creative Labs TNT2 is an Nvidia GPU.


Cool, had no idea







What were Matrox and Voodoo then? Did they just get shoved out of the business by ATi and Nvidia?


----------



## hyp36rmax

Quote:


> Originally Posted by *OkanG*
> 
> Cool, had no idea
> 
> 
> 
> 
> 
> 
> 
> What's Matrox and Voodoo then? Did they just get shoved out of the business by ATi and Nvidia?


Matrox still exists in the GPU market, now focused outside the gaming industry, while 3dfx's Voodoo line succumbed to mismanagement and the company was later bought by Nvidia. ATi, as we now know, is AMD.


----------



## The Mac

Quote:


> Originally Posted by *hyp36rmax*
> 
> They may have three skus ready for the Radeon Fury Series: Fury (Fiji), Fury Pro (Fiji XT), and Fury MAXX (Dual-GPU)


The problem with that is it may create market confusion.

They already have the numbered models with 2 SKUs each. Adding 3 more outside that schema would be confusing.

Having just one ultimate model, a la Titan, would be less confusing.


----------



## gatygun

I would far rather the dual GPUs be called 395X2 / 495X2 so they're easy to recognize.

The worst naming move Nvidia made was the dual-GPU Titan card, the Titan Z or something. With the Titan X out now it's just really confusing, especially when more of those Titan (letter) cards come out.

I hope the 395X2 is 2.5x faster than a single Titan and costs 1000 euros, with water cooling and 12+ GB of memory. But yeah, that's probably not gonna happen


----------



## $k1||z_r0k

leaked pic of die (with TIM still on to protect NDA)


----------



## ssateneth

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> 
> 
> leaked pic of die (with TIM still on to protect NDA)


What a tease!


----------



## DividebyZERO

if it's legit, is that 4GB of HBM?


----------



## szeged

it's legit imo, and yes, that is 4GB of HBM.


----------



## DividebyZERO

Then if so, I guess it's going to come down to exactly how it performs? Not sure that changes a VRAM limit, even if it's faster?

I guess it still leaves many questions too, such as whether there is more than one model of Fiji; from what I've read there are 2 versions. So if there are 2, then maybe there is still a possibility for more HBM.


----------



## ssateneth

Quote:


> Originally Posted by *DividebyZERO*
> 
> Then if so i guess it's going to come down to exactly how it performs? Not sure that will change a vram limit even if its faster?
> 
> I guess it still leaves many question also, such as is there more than one model of Fiji and from what i've read there is 2 versions. So if 2, then maybe there is still a possibility for more HBM.


Radeon Fury is using 1st-generation HBM. It is absolutely set in stone that it will ONLY get 4GB of HBM. No more, no less. Not even future versions using the same Fiji chip; it is a hardware limitation of 1st-gen HBM. Radeon Fury 2 (the chip after Fiji) will likely use 2nd-generation HBM, which will probably have a wider I/O path and higher total capacity thanks to taller stacks or additional stacks, if the memory amount per HBM stack remains the same.
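For anyone who wants the arithmetic behind that ceiling, it falls straight out of the publicly quoted HBM1 figures (1 GB per stack, a 1024-bit interface per stack at 1 Gbps per pin, four stacks on Fiji's interposer); a quick sketch:

```python
# Back-of-envelope HBM1 numbers for Fiji, using the publicly quoted
# JEDEC/AMD figures (not insider info).
stacks = 4                 # stacks on the interposer
gb_per_stack = 1           # first-gen HBM stacks top out at 1 GB
bus_bits_per_stack = 1024  # 1024-bit interface per stack
gbps_per_pin = 1.0         # 500 MHz, double data rate

capacity_gb = stacks * gb_per_stack
total_bus_bits = stacks * bus_bits_per_stack
bandwidth_gbs = total_bus_bits * gbps_per_pin / 8  # bits -> bytes

print(capacity_gb)     # 4 GB total, hence the hard cap
print(total_bus_bits)  # 4096-bit aggregate bus
print(bandwidth_gbs)   # 512.0 GB/s peak
```

So more capacity would literally require either bigger stacks (HBM2) or more stacks, which would mean a different interposer design.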


----------



## DividebyZERO

Quote:


> Originally Posted by *ssateneth*
> 
> Radeon Fury is using 1st-generation HBM. It is absolutely set in stone that it will ONLY get 4GB of HBM. No more, no less. Not even future versions using the same Fiji chip; it is a hardware limitation of 1st-gen HBM. Radeon Fury 2 (the chip after Fiji) will likely use 2nd-generation HBM, which will probably have a wider I/O path and higher total capacity thanks to taller stacks or additional stacks, if the memory amount per HBM stack remains the same.


See, I have heard different from someone who should have some inside info. Obviously I take it all with a grain of salt. Time will tell, but it does suck to wait so long with Nvidia dropping the TX and Ti


----------



## hamzta09

The fury is now aimed at the 980 Ti rather than Titan X.

http://www.sweclockers.com/nyhet/20607-amd-radeon-fiji-positioneras-mot-geforce-gtx-980-ti

Nvidia really wants to get rid of AMD and gain a monopoly.


----------



## gatygun

Would be hilarious if the Fury is going to be 2x 390X cores and 8GB of HBM, fully addressable under DX12, at an $850 price point.

One can only dream


----------



## hyp36rmax

Quote:


> Originally Posted by *hamzta09*
> 
> The fury is now aimed at the 980 Ti rather than Titan X.
> 
> http://www.sweclockers.com/nyhet/20607-amd-radeon-fiji-positioneras-mot-geforce-gtx-980-ti
> 
> Nvidia really wants to get rid of AMD and gain Monopoly.


SweClockers is usually on point. With that said, I'd be totally OK with AMD focusing on the GTX 980 Ti, with the R9 FURY priced competitively starting around $600+. If it does indeed compete with the Titan X as well, that would be icing on the cake.

*Translated from the SweClockers article*

Quote:


> That Nvidia moved up the launch of the GeForce GTX 980 Ti to make life miserable for AMD and put a spoke in the wheels of the next-generation Radeon is hardly a secret. The idea is to put the competitor at a disadvantage from the outset and force a financially vulnerable AMD to once again cut into its margins.
> 
> Now it looks like that strategy may be about to succeed. Sources tell SweClockers that AMD will reposition its new flagship Radeon "Fiji" to meet the GeForce GTX 980 Ti in the price range of "around $600". That is far from what was originally planned.
> 
> According to partner manufacturers, Radeon "Fiji" was instead to be marketed as a more reasonable alternative to the GeForce GTX Titan X. The new graphics card would offer similar performance at a lower price tag, though not as low as the $649 the GeForce GTX 980 Ti costs at present.
> 
> Additional salt in the wound comes from AMD struggling with low yields, i.e. the percentage of usable chips from production, and it may be forced to postpone the introduction of the scaled-down versions of the Fiji graphics processor. Although the flagship steals most of the attention, it is usually the downsized sibling models that account for the vast majority of sales.
> 
> The launch of the new Radeon 300 series takes place in connection with the E3 games expo in mid-June.


----------



## gatygun

It's good to hear that there is competition though; because of AMD, Nvidia made a Titan X lite for cheaper.

Gonna be interesting what prices the AMD cards are going to bring. I hope it's possible to get 2x Furys on a single card for 1000 euros, or a single Fury for 499. If it costs more I would rather move towards Nvidia.


----------



## Agent Smith1984

If it's set in stone that we will never see more than 4GB on this card, then it may be better to purchase two 8GB 290Xs when the prices go down.... just sayin'...

That'd give you tons of horsepower, and plenty of VRAM to boot.


----------



## gatygun

First we need to see how DX12 develops; maybe VRAM really will get pooled across cards soon in games. 2x 4GB would already be enough then, or maybe a 4GB + 8GB combo. Who knows though. The future is not clear enough yet


----------



## hyp36rmax

Quote:


> Originally Posted by *gatygun*
> 
> It's good to hear that there is competition though; because of AMD, Nvidia made a Titan X lite for cheaper.
> 
> Gonna be interesting what prices the AMD cards are going to bring. I hope it's possible to get 2x Furys on a single card for 1000 euros, or a single Fury for 499. If it costs more I would rather move towards Nvidia.


I'd love to see a Dual-GPU R9 Radeon FURY MAXX using FIJI.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If it's set in stone that we will never see more than 4GB on this card, then it may be better to purchase two 8GB 290Xs when the prices go down.... just sayin'...
> 
> That'd give you tons of horsepower, and plenty of VRAM to boot.


I have two R9 290X 8GB cards in CrossFire right now and it's awesome; however, I have yet to see a game use the extra RAM beyond 4GB, even at 4K in titles such as Project CARS and The Witcher 3. It does in Mortal Kombat X, but only because of a memory leak lol!

Quote:


> Originally Posted by *gatygun*
> 
> First we need to see how DX12 develops; maybe VRAM really will get pooled across cards soon in games. 2x 4GB would already be enough then, or maybe a 4GB + 8GB combo. Who knows though. The future is not clear enough yet


This would be cool if implemented correctly; I believe it's all in the developers' hands to support this feature. Time will tell if they will finally jump on board with multi-GPU setups.


----------



## Shatun-Bear

Quote:


> Originally Posted by *hamzta09*
> 
> The fury is now aimed at the 980 Ti rather than Titan X.
> 
> http://www.sweclockers.com/nyhet/20607-amd-radeon-fiji-positioneras-mot-geforce-gtx-980-ti
> 
> Nvidia really wants to get rid of AMD and gain Monopoly.


Yeah, I reckon the Fury card being released this month is the 4GB HBM one. That will trade blows with the 980 Ti. Initially I was disappointed, for AMD and for competition's sake, that they couldn't also get their 8GB HBM Fury card out this month too, as that has very likely been pushed back to August (so say the rumours).

But thinking about it, that 8GB HBM card is going to be an absolute beast. I fully expect it to trump the Titan X handily. And the thing is, if they get that card out in August, there isn't going to be an answer from Nvidia for several months. I mean, when would the next Fury-beating card be out? Wouldn't that be Pascal next year? That would mean AMD could remain top dog for a while with the bona fide fastest single GPU out there, obviously excluding dual-GPU cards.

Speaking of dual GPUs, what makes the outlook seem rosier still for AMD are the rumours that they have an insane 2x 8GB HBM card (a Fiji version of the 295X2) scheduled for release too, likely this year.


----------



## bobbavet

"the rumours they have an insane 2x 8GB HBM card (Fiji version of 295x2) scheduled for release too, likely this year."

Yeh thats the Maxx but I think it will have 8gb (2x4) in total, not 2x8gb

This is the card I am waiting for.


----------



## Sgt Bilko

16th June is the launch date










https://twitter.com/i/web/status/605928817576669186


----------



## neurotix

I am fully prepared to be completely let down with the performance of these cards.

I want to root for AMD, and of course I still have AMD GPUs in my system, but I have a nagging feeling that these cards might not even be comparable to a 980, let alone the 980ti or Titan X.

With all the setbacks, low yield rumors and so on... and that's just it, everything we have is rumors- I'm not getting my hopes up.

Of course, we could get something really awesome with 128 ROPs and 352 TMUs or something crazy like that, but I doubt it.

Anyone remember the lackluster 6970, or Bulldozer? Yeah...


----------



## DividebyZERO

Quote:


> Originally Posted by *neurotix*
> 
> I am fully prepared to be completely let down with the performance of these cards.
> 
> I want to root for AMD, and of course I still have AMD GPUs in my system, but I have a nagging feeling that these cards might not even be comparable to a 980, let alone the 980ti or Titan X.
> 
> With all the setbacks, low yield rumors and so on... and that's just it, everything we have is rumors- I'm not getting my hopes up.
> 
> Of course, we could get something really awesome with 128 ROPs and 352 TMUs or something crazy like that, but I doubt it.
> 
> Anyone remember the lackluster 6970, or Bulldozer? Yeah...


HBM has to have advantages over GDDR5, or the next-gen cards wouldn't be using it. Why are people expecting it to be worse than a 980? Rumors are just that: rumors. I don't see them releasing something slower than their last-gen flagship. People just need to step back, take a breath, and wait a few more days; the reviews and such will come.

OCN is getting overrun by rumor mills these days. It's kinda making me steer away from the news threads now.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 16th June is the launch date
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> __ https://twitter.com/i/web/status/605928817576669186


Now we have a release date


----------



## MikeMike86

Quote:


> Originally Posted by *neurotix*
> 
> I am fully prepared to be completely let down with the performance of these cards.
> 
> I want to root for AMD, and of course I still have AMD GPUs in my system, but I have a nagging feeling that these cards might not even be comparable to a 980, let alone the 980ti or Titan X.
> 
> With all the setbacks, low yield rumors and so on... and that's just it, everything we have is rumors- I'm not getting my hopes up.
> 
> Of course, we could get something really awesome with 128 ROPs and 352 TMUs or something crazy like that, but I doubt it.
> 
> Anyone remember the lackluster 6970, or Bulldozer? Yeah...


Results like these give me hope AMD will bring it with DX12.
Either way, Nvidia is hurting themselves with the 980 Ti since it comes so close to the Titan X, and the Titan X is merely a gaming card, unlike its predecessors.
I really don't see Nvidia going that low on price for the 980 Ti and not getting beat by the Fury; they're too money hungry, and they'd have to drop prices across the board to really wipe AMD out.

I'm going to be optimistic and say the 390X, with a fully unlocked Hawaii, will compare to the 980, the Fury will beat the 980 Ti, and the Fury X will beat the Titan X.
I mean, they did name it Fury X; sounds like a taunt to me, lol..

Worst case scenario, they trade blows throughout and AMD goes $100 less on each card to compete with Nvidia.. The Fury X is still rumored to be $850 as of a couple of days ago, aiming at the Titan X, not the 980 Ti.

Another optimistic thought is that DX12 will close the gap, at least in gaming, between AMD's and Intel's mainstream processors.


----------



## Casey Ryback

Quote:


> Originally Posted by *neurotix*
> 
> but I have a nagging feeling that these cards might not even be comparable to a 980, let alone the 980ti or Titan X.


It's rumoured to have 4096 cores, though, versus the 2816 on the 290X, so I don't see how it won't be at least comparable to a GTX 980.
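To put rough numbers on the core-count argument (a sketch only: 2 FLOPs per core per clock is the standard FMA figure for GCN, but the 1.0 GHz clocks below are placeholder assumptions, since Fiji's clocks weren't public):

```python
# Theoretical FP32 throughput from shader count x clock.
def tflops(cores: int, ghz: float) -> float:
    return cores * 2 * ghz / 1000  # 2 FLOPs per core per clock (FMA)

fiji_rumored = tflops(4096, 1.0)  # assumed clock
hawaii_290x = tflops(2816, 1.0)   # assumed clock

print(round(fiji_rumored, 3))                # 8.192 TFLOPS
print(round(hawaii_290x, 3))                 # 5.632 TFLOPS
print(round(fiji_rumored / hawaii_290x, 2))  # 1.45x on paper
```

At equal clocks that is roughly 45% more raw shader throughput than the 290X, which is already GTX 980 territory on paper; real-game scaling is of course another matter.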


----------



## hyp36rmax

> A new era of PC Gaming. Coming *06.16.15.* #AMD300


https://t.co/vSGmJSZ8FW



> - AMD Radeon Graphics (@AMDRadeon)


----------



## flopper

Quote:


> Originally Posted by *neurotix*
> 
> I am fully prepared to be completely let down with the performance of these cards.
> 
> I want to root for AMD, and of course I still have AMD GPUs in my system, but I have a nagging feeling that these cards might not even be comparable to a 980, let alone the 980ti or Titan X.
> .


Not happy?

It'll be the card to own this year.
Buy it and you'll be happy


----------



## DividebyZERO

Quote:


> Originally Posted by *Casey Ryback*
> 
> It's rumoured to have 4096 cores though over the 2816 on the 290X, so I don't see how it won't be at least comparable to a gtx 980.


You guys post like the GTX 980 is way faster than the 290X. If you'd said 980 Ti, then I would understand what you're saying


----------



## Casey Ryback

Quote:


> Originally Posted by *DividebyZERO*
> 
> You guys post like the GTX 980 is way faster than the 290X. If you'd said 980 Ti, then I would understand what you're saying


I was just replying to someone who said it may not even be comparable to a 980.

Honestly I think it will compete with the 980ti/Titan X


----------



## flopper

Quote:


> Originally Posted by *Casey Ryback*
> 
> I was just replying to someone who said it may not even be comparable to a 980.
> 
> Honestly I think it will compete with the 980ti/Titan X


of course it's the fastest card in the world, so there is no competition.
compete means the standings change, like in a race, but this isn't a race.


----------



## Casey Ryback

Quote:


> Originally Posted by *flopper*
> 
> of course its the fastest card in the world so there is no competiton.
> compete means values changes like in a race but this isnt a race.


I have no idea what you're going on about in relation to my posts.


----------



## Zealon

Close up of the Fiji die from the event:


----------



## bobbavet

Hands on with one of PowerColor's next-gen Radeon R9 390X video cards


----------



## Casey Ryback

Quote:


> Originally Posted by *bobbavet*
> 
> Hands on with one of PowerColor's next-gen Radeon R9 390X video cards


So in theory it will be an 8GB 290X with a huge overclock?

I'm a bit worried about the price with that kind of cooling solution.


----------



## hamzta09

Fiji is the "worlds fastest GPU" now.

http://www.sweclockers.com/nyhet/20618-amd-visar-radeon-fiji-varldens-snabbaste-grafikprocessor


----------



## Alastair

Quote:


> Originally Posted by *bobbavet*
> 
> Hands on with one of PowerColor's next-gen Radeon R9 390X video cards


Wow. Really classy, PowerColor, really classy! AMD starts making good-looking GPUs with the 295X2 and the leaked pics of Fury, and all you have to show for it is a Devil edition that looks like it was designed by a 13-year-old who got bored and doodled in his exam time. (I hope you didn't steal that poor child's dreams and crush his future! Shame on you, PowerColor! Give that kid's intellectual property back!)


----------



## Alastair

In other news: looks like I will be getting the Fury Pro, or whatever the cut-down Fiji die will be called. The price should be good, and with VRAM pooling I can get the full 8GB of goodness once I have the gold to afford a second card. My dear old 6850s, it looks like your time has come. I have enjoyed my 5 years with you. You have treated me so well!


----------



## gatygun

Quote:


> Originally Posted by *Alastair*
> 
> In other news. Looks like I will be getting Fury Pro or whatever the cut down Fiji die will be called. Price seems should be good and with VRAM pooling I can get the full 8GB of goodness when I have the gold to afford a second card. My dear old 6850's. It looks like your time has come. I have enjoyed my 5 years with you. You have treated me so well!


What performance do you get in The Witcher 3 on high/ultra with your current setup?


----------



## Alastair

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> In other news. Looks like I will be getting Fury Pro or whatever the cut down Fiji die will be called. Price seems should be good and with VRAM pooling I can get the full 8GB of goodness when I have the gold to afford a second card. My dear old 6850's. It looks like your time has come. I have enjoyed my 5 years with you. You have treated me so well!
> 
> 
> 
> what performance you get on witcher 3 on high / ultra with your current setup?
Click to expand...

I don't, 'cause I don't have The Witcher. I am saving every penny I have for new cards. YET THE URGE TO PURCHASE WITCHER! MUST RESIST!!!


----------



## The Mac

Quote:


> Originally Posted by *Casey Ryback*
> 
> So in theory it will be an 8GB 290X with a huge overclock?
> 
> I'm a bit worried about the price with that kind of cooling solution.


no, in theory it's a respun Hawaii, aka Grenada.

So new silicon, possibly upgraded to GCN 1.2 (4K VSR, PLP Eyefinity, better tess performance)


----------



## Casey Ryback

Quote:


> Originally Posted by *The Mac*
> 
> no, in theory it's a respun Hawaii, aka Grenada.
> 
> So new silicon, possibly upgraded to GCN 1.2 (4K VSR, PLP Eyefinity, better tess performance)


Is that tessellation performance advantage only while using DX12, though? Or does it simply mean better cards, i.e. a 390 will beat a 290 clock for clock?

(probably asking questions nobody knows yet)

Think I'll settle on a 390/390X and skip the HBM gen 1.


----------



## The Mac

tess is a hardware function, it has nothing to do with dx version.


----------



## BradleyW

Quote:


> Originally Posted by *The Mac*
> 
> tess is a hardware function, it has nothing to do with dx version.


Tessellation can be software optimized and may perform better in general on DX12. Just like any other function. Executed in hardware, programmed in software.


----------



## The Mac

you know what i meant.

Any hardware function can be poorly implemented.

The tessellator itself has been upgraded in hardware.


----------



## djsatane

Quote:


> Originally Posted by *bobbavet*
> 
> Hands on with one of PowerColor's next-gen Radeon R9 390X video cards


Article has now been removed so anything that was said there should be considered rumor for now...


----------



## boredmug

Lol.. well WHAT DID IT SAY?


----------



## Casey Ryback

Quote:


> Originally Posted by *The Mac*
> 
> Any hardware function can be poorly implemented.
> 
> The tessellator itself has been upgraded in hardware.


Yeah I thought as much, hopefully they get a decent boost in performance from the improvements and clock speed bump.


----------



## BradleyW

Quote:


> Originally Posted by *The Mac*
> 
> you know what i meant.
> 
> Any hardware function can be poorly implemented.
> 
> The tessellator itself has been upgraded in hardware.


But you say it has nothing to do with the DX version. That's wrong. Although not many people know it, tessellation existed back in the DX9 era, but it runs quicker from the first official build of DX11 onwards. GPU manufacturers then started to implement better dedicated hardware tessellation units once heavier tessellation became viable thanks to DX11's performance boost over DX9.


----------



## MikeMike86

I just remembered this: link to earlier chats about fat Hawaiis.
I just hope they made them more efficient. I don't care much about power consumption, being from the Midwest, but I can completely understand the cost affecting a purchase in bigger cities, where a kWh can cost up to double what I pay.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MikeMike86*
> 
> I just remember this: link to earlier chats about fat Hawaii's.
> I just hope they made them more efficient, I don't care much about power consumption being from the Midwest but I can completely understand the cost affecting a purchase in bigger cities where a kWh can cost up to double what I pay.


Perspective: I pay $0.31 per kWh and I still don't care, but I know others do.


----------



## p4inkill3r

You'd need to be running at full load for many hours per day to make a substantial difference in your electricity bill.
Anyway, arguing over $50/year in savings when debating which $500+ video card to buy is like not taking a Lamborghini over a Ferrari because there's a 2 MPG difference.
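The math backs that up; a quick sketch with made-up illustrative figures (a hypothetical 100 W draw difference, 4 hours of gaming a day, $0.15/kWh; plug in your own numbers):

```python
# Annual running-cost difference between two cards.
watt_delta = 100      # hypothetical extra watts under load
hours_per_day = 4     # assumed gaming time
price_per_kwh = 0.15  # assumed electricity price in USD

kwh_per_year = watt_delta / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh

print(round(kwh_per_year, 1))   # 146.0 kWh
print(round(cost_per_year, 2))  # 21.9 dollars/year
```

Even at double that electricity rate it's still well under $50/year against a $500+ purchase.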


----------



## MikeMike86

Hey, I only paid $240 for my 290X new, so the difference wasn't worth it to go with a 970 at $300. Some people looking at current prices do take it into account though: $300 vs $300, and 2-year-old tech vs 1-year-old tech.. not to mention I've had good luck with AMD the past couple of generations, while my older Nvidia cards don't run games as well as they used to due to lack of support.

But yeah, it is perspective, and that's all I hear from the other side of the fence; the main complaints are that the cards aren't efficient and the tech is older...

So shipping efficient, newly released tech should shut them up, but if the cards come out requiring 100 watts more, people will continue to complain.. It's all about getting them to taste the fruit punch; once they do, I'm pretty sure many more will join the red side.

I do it for the prices personally; I'm a bargain shopper, and Nvidia is greedy as hell.


----------



## flopper

Listening to people saying power is important: then they surely won't buy the 980 Ti, as it can draw 400W when pushed.
Or any other high-end card, for that matter.

Great lineup expected from AMD, and the Fury for DX12 and Win 10, oh my.
The savings have started, as it's a Fury I want.


----------



## Casey Ryback

Quote:


> Originally Posted by *flopper*
> 
> Listening to people saying power is important then they dont buy the 980ti for sure as it draws 400w.


Seen the kingpin 980ti?

http://www.legitreviews.com/evga-geforce-gtx-980-ti-kingpin-video-card-coming_164793

Uses two 8 pin and one 6 pin PCI connectors. Crazy.


----------



## tp4tissue

Quote:


> Originally Posted by *Casey Ryback*
> 
> Seen the kingpin 980ti?
> 
> http://www.legitreviews.com/evga-geforce-gtx-980-ti-kingpin-video-card-coming_164793
> 
> Uses two 8 pin and one 6 pin PCI connectors. Crazy.


You know those race cars that use up an engine within a race or two..

That's what the KingPin is..


----------



## Alastair

Quote:


> Originally Posted by *tp4tissue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Casey Ryback*
> 
> Seen the kingpin 980ti?
> 
> http://www.legitreviews.com/evga-geforce-gtx-980-ti-kingpin-video-card-coming_164793
> 
> Uses two 8 pin and one 6 pin PCI connectors. Crazy.
> 
> 
> 
> You know those race cars that use up an engine within a race or two..
> 
> That's what the KingPin is..
Click to expand...

If you're using the Kingpin for LN2, yes, maybe. But if you are using it like a normal person, then no. However, normal people can just get the SSC or SC or whatever they are called, cards that EVGA makes.


----------



## MikeMike86

720 watts on one card? That's insane...
As for the 980 Ti being power hungry, it still consumes less power than the 290X http://www.anandtech.com/bench/product/1438?vs=1496 and it throws down at 4K, unlike the regular 980. (400 watts is total system power consumption.)

Also, on performance per watt it looks like the 980 is in the lead (seen here), so I guess I don't understand the comment about not getting a high-end GPU.
I suspect the 750 Ti is better yet, but I can't find a direct comparison between high- and low-end cards, and you can't SLI them.

I'm a numbers-and-facts-based guy, not trying to offend you...
The 290X still reminds me of a GTX 480, which was built like a tank and consumed power like a Razor scooter with a 400 lb dude on it.


----------



## DividebyZERO

What are you guys talking about.


----------



## jerrolds

Kinda bummed that the rumours so far say the Fury Fiji only has HDMI/DP outputs and no DVI. I'm using the QNIX 1440p monitor overclocked past 60 Hz, which only accepts a single dual-link DVI connection. Afaik there are no DP-to-dual-link-DVI active adapters with enough bandwidth for that refresh rate.

Looks like I might go back to the green team









Owned the last 3 Radeon flagships....6970, 7970, 290X - all crossfired at one point hah

If this is the case, I'm hoping prices for the 980 Ti go down at the Fury launch
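For anyone wondering why no passive adapter can save a DVI-only QNIX: single-link DVI (the most a passive HDMI adapter can carry) tops out at a 165 MHz pixel clock. A sketch using illustrative reduced-blanking timings for the common ~96 Hz 1440p overclock (assumed figures, not the panel's exact EDID values):

```python
# Required pixel clock vs the single-link DVI limit.
h_total, v_total = 2720, 1481  # illustrative CVT-RB-style total timings
refresh_hz = 96                # assumed overclocked refresh rate

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
single_link_limit_mhz = 165    # single-link DVI/TMDS ceiling

print(round(pixel_clock_mhz))                   # 387 MHz needed
print(pixel_clock_mhz > single_link_limit_mhz)  # True: dual link required
```

Hence dual-link DVI on the monitor side, and why only an active DP adapter with enough bandwidth (if one existed) could drive it.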


----------



## tsm106

Quote:


> Originally Posted by *MikeMike86*
> 
> 720watts on one card? That's insane...
> *As for the 980Ti being power hungry, it still consumes less power than the 290x* http://www.anandtech.com/bench/product/1438?vs=1496 and it throws down at 4k unlike the regular 980. (400watts is total system power consumption.)
> 
> Also performance per watt it looks like the 980 is in the lead (seen here), so I guess I don't understand the comment about not getting a high end gpu.
> I suspect the 750ti is better yet but I can't find a direct comparison for high to low end cards and you can't sli them.
> 
> I'm a numbers and facts based guy, not trying to offend you...
> The 290x still reminds me of a gtx480 which was built like a tank and consumed power like a Razer scooter with a 400lb dude on it.


Dude, you gotta stop using those AnandTech charts. You know they are using a reference-cooled card, right, and you compare that to a brand-new Ti? Even putting aside that it's not a fair comparison, using a ref card adds insult to injury. Just saying...


----------



## The Mac

Quote:


> Originally Posted by *jerrolds*
> 
> Kinda bummed that the rumours so far say the Fury Fiji only has HDMI/DP outputs and no DVI. I'm using the QNIX 1440p monitor overclocked past 60 Hz, which only accepts a single dual-link DVI connection. Afaik there are no DP-to-dual-link-DVI active adapters with enough bandwidth for that refresh rate.
> 
> Looks like I might go back to the green team
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Owned the last 3 Radeon flagships....6970, 7970, 290X - all crossfired at one point hah
> 
> If this is the case, im hoping prices for the 980ti go down on Fury launch


it's supposed to be HDMI 2.0, which means you can get a dual-link passive adapter no problem.


----------



## Dotachin

Quote:


> Originally Posted by *The Mac*
> 
> it's supposed to be HDMI 2.0, which means you can get a dual-link passive adapter no problem.


Would the adapter be included? I haven't seen any hdmi 2.0 adapters yet.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> Dude, you gotta stop using those stupid annandtech charts. You know they are using a reference coolered card right, and you compare that to a brand new ti? Even putting aside that that is not a fair comparison, it further adds insult by using a ref card. Just saying...


Yeah, even the Fiji info and owners thread isn't going to be safe from this crap. Would be nice to actually compare Fiji to the 980 Ti. June 16th is close; then we can compare apples to apples


----------



## DNMock

Hey guys, any word on the possibility of DP 1.3 being on the Fury XT, or whatever the HBM Fiji card is called nowadays?


----------



## jerrolds

Quote:


> Originally Posted by *The Mac*
> 
> it's supposed to be HDMI 2.0, which means you can get a dual-link passive adapter no problem.


Interesting ok sweet - hopefully Fury Fiji is a good margin faster than 980ti


----------



## tsm106

Quote:


> Originally Posted by *DNMock*
> 
> Hey guys, any word on the possibility of DP 1.3 being on the Fury XT, or whatever the HBM Fiji card is called nowadays?


50/50. Though, to be cautious, I wouldn't get my hopes up. The DP 1.3 ports only became available recently, and AMD would have needed them much earlier for the Fury cards to be in production. Timeline-wise, and with NDAs, it just doesn't seem like they had enough lead time.


----------



## The Mac

Quote:


> Originally Posted by *Dotachin*
> 
> Would the adapter be included? I haven't seen any hdmi 2.0 adapters yet.


Probably not, but that would depend on the AIB.

http://www.amazon.com/Belkin-DVI-HDMI-Adapter-Supports/dp/B0013FA8LM


----------



## Dotachin

Quote:


> Originally Posted by *The Mac*
> 
> Probably not, but that would depend on the AIB.
> 
> http://www.amazon.com/Belkin-DVI-HDMI-Adapter-Supports/dp/B0013FA8LM


"The middle two vertical rows are missing, meaning for any newbies out there, that this is a "single link" adapter"

No way that one will work.

"supports" probably means "compatible"


----------



## The Mac

Edit: after further research, I don't think it's possible, as it has the same limitation as DP.

Passive adapters work because a compatible signal is already present on the connector.

HDMI carries only one signal regardless of resolution/refresh, so an active solution is the only choice.
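A rough pixel-clock estimate shows why this matters for the QNIX-style 1440p high-refresh panels discussed above. The blanking overheads below are approximations, not exact CVT-RB timings:

```python
# Rough pixel-clock check for the DVI adapter discussion above.
# Assumptions: ~6% horizontal and ~3% vertical blanking overhead
# (reduced-blanking style timings; exact CVT-RB values differ slightly).

def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.06, v_blank=1.03):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * h_blank * height * v_blank * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165.0  # spec limit for one TMDS link
DUAL_LINK_DVI_MHZ = 330.0    # two TMDS links

clk60 = pixel_clock_mhz(2560, 1440, 60)  # ~241 MHz: already needs dual link
clk96 = pixel_clock_mhz(2560, 1440, 96)  # ~386 MHz: past even the DL-DVI spec
print(round(clk60, 1), round(clk96, 1))
```

So even 2560x1440 at 60 Hz is beyond a single-link adapter, which is why only an active dual-link solution (or driving the panel out of spec with tight custom timings) works here.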


----------



## Dotachin

Well that's a shame, thanks anyway.


----------



## tsm106

Quote:


> Originally Posted by *Dotachin*
> 
> Well that's a shame, thanks anyway.


There won't be any for a while. And when they do come out, they will probably be as unreliable as the active adapters they supersede, unfortunately. Btw, true HDMI 2.0 ports are only just appearing on home audio receivers, so I suspect the wait for true 2.0 adapters will be a long one.

http://www.pioneerelectronics.com/PUSA/Home/AV-Receivers/Elite+Receivers/VSX-90
Quote:


> Full Bandwidth HDMI (*4K UltraHD 60P/4:4:4*) with HDCP 2.2


99% of the current receivers advertising 4k are full of crap.


----------



## Newbie2009

http://www.eteknix.com/powercolor-clarifies-details-on-amd-390x-photos-at-computex/

PowerColor Clarifies Details on AMD 390X Photos at Computex


----------



## The Mac

old news, that was yesterday.

pay attention

lol


----------



## Newbie2009

Quote:


> Originally Posted by *The Mac*
> 
> old news, that was yesterday.
> 
> pay attention
> 
> lol


My bad. I'm trying to not pay too much attention so I can avoid the upgrade bug.


----------



## tsm106

Quote:


> Originally Posted by *Newbie2009*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Mac*
> 
> old news, that was yesterday.
> 
> pay attention
> 
> lol
> 
> 
> 
> My bad. I'm trying to not pay too much attention so I can avoid the upgrade bug.
Click to expand...

Look at the pcb number, it's a revised 290x pcb. Don't worry though, you're not the only one who didn't notice.

http://www.ekwb.com/news/503/19/New-Full-Cover-water-block-for-new-Radeon-R9-290X-graphics-cards/


----------



## The Mac

At the time, the rumor mongers explained that away as Grenada being potentially pin-compatible with Hawaii.


----------



## snow cakes

oh boy, i might have to sell my brand new xfx r9 290x for one of these bad boys


----------



## frunction

This card plus freesync is getting me interested

I have been on Nvidia for several years (6950 Crossfire was maybe the last ATI setup I ran for any period of time).

I just don't know how I feel about AMD as company. How are the drivers? Do AMD components play nice with Intel chips and Windows these days?


----------



## The Mac

No issues whatsoever if you go single card.

Crossfire, just like SLI, has its issues.

DX12 should fix a lot of that, but that doesn't help with DX11 and older games.


----------



## DNMock

Quote:


> Originally Posted by *tsm106*
> 
> 50/50. Though, to be cautious I wouldn't get my hopes up. The ports were out not long ago but the Fury cards would have had to have them in possession much longer in order to be in production. Timeline wise and with NDAs it just doesn't seem like they had ample time to get them with enough leeway for production.


Dang, looks like my dreams of 120hz 4k are just gonna have to stay on hold for another year or two.


----------



## Casey Ryback

Quote:


> Originally Posted by *frunction*
> 
> I just don't know how I feel about AMD as company. How are the drivers? Do AMD components play nice with Intel chips and Windows these days?


AMD as a company are great; for the position they are in, they still pump out good cards for the dollar.

I swear by them for graphics, but then again I never run multiple GPUs or displays, so I'm not sure what their drivers are like in those situations.

For a single card, my 7950 and now 7970 have been really good (a mining-boom upgrade for $20).

I run an Intel CPU and an AMD GPU; there are no problems there. Pretty sure that's an old way of thinking.


----------



## jerrolds

Quote:


> Originally Posted by *The Mac*
> 
> Edit: after further research, I don't think it's possible, as it has the same limitation as DP.
> 
> Passive adapters work because a compatible signal is already present on the connector.
> 
> HDMI carries only one signal regardless of resolution/refresh, so an active solution is the only choice.


Dammit, looks like I'm going back to Nvidia for the first time since the 460ti, lol.

Unless the rumour is true that the Acer ultra-wide IPS monitor might actually be 100Hz instead of 75Hz.


----------



## hamzta09

290X rebrand, awyeah AMD... how about something new in the same price range of a 980, to drop down the price on that and also the 980 Ti..


----------



## Alastair

Quote:


> Originally Posted by *hamzta09*
> 
> 290X rebrand, awyeah AMD... how about something new in the same price range of a 980, to drop down the price on that and also the 980 Ti..


Fury won't be a rebrand. And AMD isn't going to release an entire slew of new 28nm cards when 16/14nm FinFET is around the corner; it would be a massive waste of R&D. What you might find is that the Hawaii rebrand has improved power efficiency compared to the older 290s. But in all reality, Hawaii is already extremely competitive with Nvidia's GM204. The power consumption argument is a moot point, and if it really concerns people, then just turn your PC off when you don't use it.


----------



## flopper

Quote:


> Originally Posted by *Alastair*
> 
> Fury won't be a rebrand. And AMD isn't going to release an entire slew of new 28nm cards when 16/14nm FinFET is around the corner; it would be a massive waste of R&D. What you might find is that the Hawaii rebrand has improved power efficiency compared to the older 290s. But in all reality, Hawaii is already extremely competitive with Nvidia's GM204. The power consumption argument is a moot point, and if it really concerns people, then just turn your PC off when you don't use it.


Yeah, and 14/16nm is an unknown: how much yield it will have, and whether any unforeseen issues will be discovered.
Seems like a superb lineup for AMD with the GloFo node.
Soon we'll know.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hamzta09*
> 
> 290X rebrand, awyeah AMD... how about something new in the same price range of a 980, to drop down the price on that and also the 980 Ti..
> 
> 
> 
> Fury won't be a rebrand. And AMD isn't going to release an entire slew of new 28nm cards when 16/14nm FinFET is around the corner; it would be a massive waste of R&D. What you might find is that the Hawaii rebrand has improved power efficiency compared to the older 290s. But in all reality, Hawaii is already extremely competitive with Nvidia's GM204. The power consumption argument is a moot point, and if it really concerns people, then just turn your PC off when you don't use it.
Click to expand...

Very true this.

390x will most likely have higher clocks and maybe lower power consumption over the 290x not to mention a lovely 8GB of vram


----------



## glenn37216

OMG. As of right now, current builds of Fiji are slower than a 980 Ti. *** AMD. Some cards are already arriving at review sites (Chinese sites).


----------



## rdr09

Quote:


> Originally Posted by *glenn37216*
> 
> Omg. As of right now current builds of fiji are slower than a 980ti. *** amd. some cards are already arriving at review sites. (Chinese sites)


OMG, Fiji is faster than my 290, and I'd rather stay red.


----------



## flopper

Quote:


> Originally Posted by *rdr09*
> 
> OMG, Fiji is faster than my 290, and I'd rather stay red.


Yeah, who wants to buy from a company that sells 4GB (oh, sorry, 3.5GB) cards as 4GB?
Lies and deceit are common Nvidia traits, and if people want to support that, well, good for them.

Fiji can't come fast enough.


----------



## rdr09

Quote:


> Originally Posted by *flopper*
> 
> yea who want to buy from a company that sells 4gb oh sry 3.5gb cards as 4gb?
> lies and deceit are common Nvidia traits and if people want to support such well good for them.
> 
> Fiji cant come fast enough.


no comment.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> yea who want to buy from a company that sells 4gb oh sry 3.5gb cards as 4gb?
> lies and deceit are common Nvidia traits and if people want to support such well good for them.
> 
> Fiji cant come fast enough.
> 
> 
> 
> no comment.
Click to expand...

No _memory_ of this ever taking place?


----------



## The Mac

Quote:


> Originally Posted by *glenn37216*
> 
> Omg. As of right now current builds of fiji are slower than a 980ti. *** amd. some cards are already arriving at review sites. (Chinese sites)


Such reputable sources.

I'll believe any of that FUD when it's posted on a site I actually trust.

Until it's on the shelves, it's all just rumor.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very true this.
> 
> 390x will most likely have higher clocks and maybe lower power consumption over the 290x not to mention a lovely 8GB of vram


50 or so MHz higher clock on the core...

The 290X also has an 8GB model, so...


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Very true this.
> 
> 390x will most likely have higher clocks and maybe lower power consumption over the 290x not to mention a lovely 8GB of vram
> 
> 50 or so Mhz higher clock on Core...
> 
> 290X also has an 8GB model so..
Click to expand...

Nothing has been confirmed as yet.

It might just be a factory-overclocked 290x 8GB, but then again it may not be.

Either way, with a rumoured MSRP of $329 for the R9 390, it makes for a very good option for 4K gaming with two of them in Crossfire.


----------



## DividebyZERO

What has been "confirmed" is tons of speculation and guesses. I've got to admit it's amusing watching all this negativity and wild stuff thrown out as fact. Not much longer to wait, though.


----------



## PontiacGTX

Quote:


> Originally Posted by *DividebyZERO*
> 
> What has been confirmed is tons of speculation and guesses. I got to admit its amusing watching all this negativity and wild stuff thrown out as fact. Not much longer to wait though.


they can add the 4096 SPs, 64 CUs, 256 TMUs, and 4GB of 4-Hi HBM (1024-bit per stack) to the OP


----------



## tsm106

Hold on to your hats and glasses guys, this next week is gonna be a wild ride.


----------



## p4inkill3r

Quote:


> Originally Posted by *tsm106*
> 
> Hold on to your hats and glasses guys, this next week is gonna be a wild ride.


Not to mention the furor that will erupt when the first reviews come out and Fury becomes the fastest GPU in the world.


----------



## hamzta09

http://www.sweclockers.com/nyhet/20665-amd-radeon-fiji-dops-till-fury-kostar-over-10-000-kronor

Priced like a Titan X and slower than a Titan X apparently.


----------



## tsm106

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.sweclockers.com/nyhet/20665-amd-radeon-fiji-dops-till-fury-kostar-over-10-000-kronor
> 
> Priced like a Titan X and slower than a Titan X apparently.


And where are these dopey swedes getting their numbers from?


----------



## hyp36rmax

Quote:


> Originally Posted by *PontiacGTX*
> 
> they can add the 4096SP,64CU ,256TMUs ,4GB 1024Bit HBM 4Hi in the OP


This information will go up once AMD releases this information. Do you have a solid source for this?


----------



## zealord

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.sweclockers.com/nyhet/20665-amd-radeon-fiji-dops-till-fury-kostar-over-10-000-kronor
> 
> Priced like a Titan X and slower than a Titan X apparently.


Google Translate isn't working that well; are you sure about the performance? The price seems a bit lower than the Titan X's. This news is so random.

AMD knows they can't release a card slower than a Titan X for the same price as the Titan X.

I think Big Fiji is going to be $849-899, 4GB HBM, with the same performance as a Titan X at 1080p/1440p and faster at 4K and above.


----------



## hamzta09

Quote:


> Originally Posted by *tsm106*
> 
> And where are these dopey swedes getting their numbers from?


Same as dopey americans.
Quote:


> Originally Posted by *zealord*
> 
> Google translate not working that well, are you sure about the performance? Price seems a bit lower than Titan X. This news is so random.
> 
> AMD knows they can't release a card slower than a Titan X for the same price as the Titan X.
> 
> I think the Big Fiji is going to be 849-899$ 4GB HBM same performance as Titan X on 1080p/1440p and faster on 4K and above.


The performance is based on PCPer.

The price is 10,000 kr according to documents, so that's roughly 1,200 dollars. Strip out the EU tax and such and you get 1,000.


----------



## zealord

Quote:


> Originally Posted by *hamzta09*
> 
> Same as dopey americans.
> The performance is based on PCper.
> 
> Price is 10000 kr according to documents, so thats roughly 1200 dollars. Rid EU Tax and sh.. you get 1000


On the same site there is a price comparison further down with the Titan X, and it ranges from 11,300-12,000 kr. You can't convert that directly. If the Titan X is ~11,500 kr there and $1000 in the US, and the Fury X is ~10,000 kr according to rumours, then the Fury is probably around $850-900.
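The back-of-the-envelope conversion both posts are doing can be sketched like this; the exchange rate is an assumption (roughly 8.3 SEK per USD in mid-2015), and local retail markup is ignored:

```python
# Sketch of the SEK -> USD price conversion from the posts above.
# Assumptions: ~8.3 SEK per USD (mid-2015), Swedish VAT of 25%,
# and no allowance for local retail markup.

SEK_PER_USD = 8.3
VAT = 0.25

def sek_price_to_usd(sek_incl_vat: float) -> float:
    """Strip VAT from a Swedish retail price, then convert to USD."""
    return sek_incl_vat / (1 + VAT) / SEK_PER_USD

fury_est = sek_price_to_usd(10_000)    # ~960 USD
titanx_est = sek_price_to_usd(11_500)  # ~1,100 USD
print(round(fury_est), round(titanx_est))
```

So a naive 10,000 kr → $1,200 conversion overshoots; once VAT comes off, the implied US price lands near the Titan X's $1,000, and any local markup would pull the estimate lower still.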


----------



## szeged

Bold move from AMD to price it like the Titan X. That hopefully means the performance matches or beats it. Hope it doesn't turn into an FX-9590 situation again.


----------



## tsm106

for taking stupid rumors posted on websites as apparently truth.


----------



## PontiacGTX

Quote:


> Originally Posted by *hyp36rmax*
> 
> This information will go up once AMD releases this information. Do you have a solid source for this?


It follows from GCN: each compute unit has 64 stream processors, so 4096 SPs / 64 = 64 CUs, and each CU carries 4 TMUs, so 64 CUs x 4 = 256 TMUs. The HBM is 4-Hi: four stacks, each with four DRAM layers (two 128-bit channels per layer, 2Gb density) on a 1024-bit interface at 1 Gbps effective per pin, which gives 1024 bits x 1 Gbps / 8 = 128 GB/s per stack, or 512 GB/s across the four stacks.

http://en.wikipedia.org/wiki/Graphics_Core_Next
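The arithmetic above can be double-checked with a quick sketch (the 1 Gbps effective rate and four 1024-bit stacks are the rumoured first-gen HBM figures, not confirmed specs):

```python
# Sanity check of the HBM bandwidth arithmetic in the post above.
# Assumptions: 4 stacks, a 1024-bit interface per stack, and a
# 1 Gbps effective data rate per pin (rumoured first-gen HBM figures).

def stack_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Bandwidth of one stack in GB/s: bits * Gbps per pin / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

per_stack = stack_bandwidth_gbs(1024, 1.0)  # 128.0 GB/s per stack
total = 4 * per_stack                       # 512.0 GB/s across 4 stacks
print(per_stack, total)
```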


----------



## hyp36rmax

Quote:


> Originally Posted by *PontiacGTX*
> 
> It follows from GCN: each compute unit has 64 stream processors, so 4096 SPs / 64 = 64 CUs, and each CU carries 4 TMUs, so 64 CUs x 4 = 256 TMUs. The HBM is 4-Hi: four stacks, each with four DRAM layers (two 128-bit channels per layer, 2Gb density) on a 1024-bit interface at 1 Gbps effective per pin, which gives 1024 bits x 1 Gbps / 8 = 128 GB/s per stack, or 512 GB/s across the four stacks.
> 
> http://en.wikipedia.org/wiki/Graphics_Core_Next


I appreciate the effort, I'll wait for AMD on 06/16/15 at E3.


----------



## hamzta09

Quote:


> Originally Posted by *tsm106*
> 
> for taking stupid rumors posted on websites as apparently truth.


Oh, look at the troll.
Sweclockers is as reliable as any other HW site. They're not rumors; they're taken from AMD.

And here you have some benches.
http://www.sweclockers.com/nyhet/20663-radeon-fiji-med-4-096-streamprocessorer-skymtas-i-prestandatest


----------



## The Mac

OMG,

That comment is flat-out ridiculous.

You are batguano crazy if you think any of these rumors are anything other than clickbait.

It's fun to debate all this garbage, but until it's on the shelf, it's rumor.


----------



## Peter Nixeus

Hoping the rumor of Star Wars: Battlefront being bundled with Fury is true!


----------



## flopper

Quote:


> Originally Posted by *szeged*
> 
> Bold move from amd to price it like the Titan x. That hopefully means the performance matches or beats it. Hope it doesn't turn into a fx 9590 situation again.


10% above the Titan X in performance,
so even if I don't pay that price for a card, it seems priced according to performance.


----------



## frunction

Hopefully the air-cooled versions land somewhere between the 980 Ti and Titan X in price, with better performance. If it turns out okay, I'm also factoring in that a FreeSync monitor will be much cheaper than G-Sync.

However, with the new memory and all, it'll probably be well over $1k at release.


----------



## hamzta09

Quote:


> Originally Posted by *The Mac*
> 
> OMG,
> 
> that comment is flat out ridiculous
> 
> you are batguano crazy if you think any of these rumors are anything other than clickbait.
> 
> Its fun to debate all this garbage, but until its on the shelf, its rumor.


You really think it's gonna wreck the Titan X?


----------



## The Mac

Maybe, maybe not.

Just like that same site last week said it was crap, and the week before that they said it was on par.

Till an official review comes online, I'm not picking a side.

Every site has said it both ways so they can say "see, we were right" no matter how it falls out.

If you throw enough crap at the wall, something will stick.


----------



## hamzta09

Quote:


> Originally Posted by *The Mac*
> 
> Maybe, maybe not.
> 
> Just like that same site last week said it was crap, and the week before that they said it was on par.
> 
> Till an official review comes online, I'm not picking a side.
> 
> Every site has said it both ways so they can say "see, we were right" no matter how it falls out.
> 
> If you throw enough crap at the wall, something will stick.


Id like to see where it says its crap.


----------



## The Mac

Google it, I haven't got the patience.


----------



## tsm106

Quote:


> Originally Posted by *The Mac*
> 
> google it, i havent got the patience.


----------



## hamzta09

Quote:


> Originally Posted by *The Mac*
> 
> google it, i havent got the patience.


You said "Just like that same site last week said it was crap."

I can't find anything in the Swec archives, so...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Peter Nixeus*
> 
> Hoping getting Star Wars: Battlefront bundled with a Fury is true!


As do I!


----------



## jerrolds

Quote:


> Originally Posted by *glenn37216*
> 
> Omg. As of right now current builds of fiji are slower than a 980ti. *** amd. some cards are already arriving at review sites. (Chinese sites)


If this is true, I'm actually glad, since Fiji reportedly only supports DP/HDMI outputs and my QNIX only has a single DL-DVI input. Looks like I'm back to the green team, hoping the Fiji launch lowers 980 Ti prices.


----------



## kcuestag

Quote:


> Originally Posted by *Sgt Bilko*
> 
> As do I!


That'd be awesome.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *glenn37216*
> 
> Omg. As of right now current builds of fiji are slower than a 980ti. *** amd. some cards are already arriving at review sites. (Chinese sites)
> 
> 
> 
> If this is true im actually glad - since the fiji reportedly only supports dp/hdmi outputs and my qnix only had a single dl-dvi input. Looks like im back to green team, hoping Fiji launch lowers 980ti prices.
Click to expand...

I'm in the same boat, but if I do pick one up, it'll either come with an adapter or I'll just grab one for it. No big deal either way.


----------



## Scotty99

So wait, the 390/390X are still old GPUs?

http://www.digitaltrends.com/computing/radeon-300-pricing-leak/

/confused, I thought the 390 stuff was supposed to compete with the Titan X/980 Ti?


----------



## p4inkill3r

Quote:


> Originally Posted by *Scotty99*
> 
> So wait, the 390/390x is still old GPU's?
> 
> http://www.digitaltrends.com/computing/radeon-300-pricing-leak/
> 
> /confused, i thought 390 stuff was supposed to compete with titanx/980ti?


There's nothing that confirms that speculation officially.


----------



## DividebyZERO

Quote:


> Originally Posted by *p4inkill3r*
> 
> There's nothing that confirms that speculation officially.


you forgot about the Klingon, Romulan, Ferengi, and Borg semi-partial tech review third-party affiliates of distant time-traveling future-reporting websites that confirm we all have big imaginations.

EDITED: Removed real world descriptors of any race as to be neutral and not offend anyone who might read this post. (Hamzta09)


----------



## Scotty99

Digital Trends is a pretty reputable website; how are they getting all the specifics of which cores the cards are using?

Just suppose for a second they are correct: what in the world is AMD doing? ANOTHER round of rebadges? I'm so sick of this; I wanna give AMD my money, but they keep disappointing.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scotty99*
> 
> Digital trends is a pretty reputable website, how are they getting all the specifics of what cores the cards are using?
> 
> Just speculate for a second they are correct, what in the world is AMD doing? ANOTHER round of rebadges???? Im so sick of this, i wanna give AMD my money but they keep disappointing.


Then buy a Fury.


----------



## Scotty99

Quote:


> Originally Posted by *p4inkill3r*
> 
> Then buy a Fury.


When is that coming? For months people assumed the 390X would be it, right? Now look at what's happening.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scotty99*
> 
> Quote:
> 
> 
> 
> Originally Posted by *p4inkill3r*
> 
> Then buy a Fury.
> 
> 
> 
> When is that coming? For months people assumed the 390x would be it right? Now look at whats happening.
Click to expand...

People assumed... that's the issue.


----------



## tsm106

^^Yep

AMD has released almost zero specifications on the Fury. Everything you read is a rumor or speculation.

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> for taking stupid rumors posted on websites as apparently truth.
> 
> 
> 
> Oh look at Troll.
> Sweclockers is as Reliable as any other HW site. They're not rumors as they're taken from AMD.
> 
> And here you have some benches.
> http://www.sweclockers.com/nyhet/20663-radeon-fiji-med-4-096-streamprocessorer-skymtas-i-prestandatest
Click to expand...


----------



## hamzta09

Quote:


> Originally Posted by *DividebyZERO*
> 
> you forgot about the Chinese,Korean,African,Swedish semi partial tech review third party affiliates of distant time traveling future reporting websites that confirm we all have a big imagination.


Gotta love how you didn't mention American.

But I guess we're all communists and liars outside of America...

The 390X will compete with the 980, that's known. It's known it's a rebrand.

Stop pretending otherwise.

XFX even posted official specs on their site.


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> XFX even posted official specs on their site.


They posted a box/card picture, not the specs. But it did say 8GB GDDR5, so at least that much is confirmed.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> People assumed......thats the issue


You people don't keep up with anything at all, do you?

AMD will show off the cards at E3... which is June 19th.


----------



## hamzta09

Quote:


> Originally Posted by *Forceman*
> 
> They posted a box/card picture, not the specs. But it did say 8GB GDDR5, so at least that much is confirmed.


Quote:


> The XFX Radeon R9 390X Double Dissipation comes with the Grenada (Hawaii) core which packs 2816 stream processors, 176 texture mapping units and 64 ROPs. The card comes with 8 GB of GDDR5 VRAM which is clocked at 6 GHz as opposed to 5 GHz on the Radeon R9 290X. The memory operates along a 512-bit interface and pumps out 384 GB/s bandwidth. There are no clock speeds given except the older specs from Hawaii GPU but it is expected that the card will get a 50 MHz core bump (reference) so this card may be further factory over clocked. From a design perspective, the card features the XFX Double Dissipation cooler with a custom PCB design that has two fans to push air down a central Direct Contact base heatsink system incorporated with several heatpipes. Display outputs include Dual-DVI, HDMI and a Display Port. Card may be provided power by either 8+8 or 8+6 Pin connectors, *expect a price range around $449 US when the cards officially hit the market on 16th June.*

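The bandwidth figure in the quoted spec checks out with a quick sketch (the 512-bit bus and 6 Gbps effective GDDR5 are the rumoured numbers from the quote, not confirmed specs):

```python
# Checking the quoted 390X spec: bus width (bits) times effective
# data rate (Gbps), divided by 8 bits per byte, gives GB/s.
# The 512-bit / 6 Gbps and 5 Gbps figures come from the rumour above.

def gddr5_bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """Memory bandwidth in GB/s for a GDDR5 bus."""
    return bus_width_bits * effective_gbps / 8

bw_390x = gddr5_bandwidth_gbs(512, 6.0)  # 384.0 GB/s, matching the quote
bw_290x = gddr5_bandwidth_gbs(512, 5.0)  # 320.0 GB/s on the older 290X
print(bw_390x, bw_290x)
```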

----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> ^^Yep
> 
> AMD has released almost zero specifications on the Fury. Everything you read is a rumor or speculation.


You keep hitting yourself like that, you're gonna get drain bamage.
Quote:


> Originally Posted by *hamzta09*
> 
> Gotta love how you didnt mention American.
> 
> But I guess, we're all communists and liars outside of America...
> 
> 390X will compete with 980, thats known. Its known its a rebrand.
> 
> Stop pretending otherwise.
> 
> XFX even posted official specs on their site.


I never meant to imply anything of that sort; next time I will use fictitious descriptors.

Fixed it for you:
http://www.overclock.net/t/1547314/amd-r9-radeon-fury-fiji-gpu-information-and-owners-club/210#post_24017455


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*


I went to the XFX webpage while it was still live; the only thing 390X on it was that single picture. The rest of the page was 290X info. The text you posted is rumor/speculation; it wasn't on the page.

Edit: not saying those specs aren't correct, just that they weren't in the XFX leak.


----------



## hyp36rmax

Everything regarding the R9 Radeon FURY up to this point is speculation without a solid source from AMD, with the exception of the name "R9 Radeon FURY" and HBM. How about we stop polluting this thread with nonsense until AMD announces specs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> People assumed......thats the issue
> 
> 
> 
> You people dont keep up with anything at all do you?
> 
> AMD will show off the cards at E3.. which is June 19th.
Click to expand...

And here I thought it was the 16th, judging by the giant backdrop at Computex....

The issue is you keep posting rumours like they are fact, and claiming it's just a 290x with a name change (which is what a rebrand actually is).

If there have been any changes at all, then it's an upgraded model and not a rebrand; I don't understand why that's so hard for people to grasp.
Quote:


> Originally Posted by *hyp36rmax*
> 
> Everything regarding the R9 Radeon FURY up to this point is speculation without a solid source from AMD with the exception of the name "R9 Radeon FURY" and HBM. How about we stop polluting this thread with nonsense until AMD announces specs.


^Yup....


----------



## The Mac

The fail is strong with this thread...

lol

Until it comes directly from AMD, and that won't happen till 9AM PDT on the 16th (NOT the 19th), it's pure gossip regardless of how reputable the source is.

It's the same sites regurgitating the same stuff, over and over again.

Everyone seems to have an "official unnamed source."

Yeah, right....

Well, my mom's hairdresser's dog groomer knows someone who cleans the toilets at AMD, and they said everyone is flat-out wrong...

And he's a reliable source.

lol


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> You people dont keep up with anything at all do you?
> 
> AMD will show off the cards at E3.. which is June 19th.



AMD officially announced the 16th as the date.

http://www.anandtech.com/show/9335/amd-confirms-june-16th-date-for-upcoming-gpu-announcement

Yet you were accusing people of not keeping up with anything, and in a superior manner at that.

Back under your bridge, sir/madam.


----------



## djsatane

If the 390 series are just rebrands and don't have HBM, then that's a big disappointment after all the HBM hype.


----------



## Casey Ryback

Quote:


> Originally Posted by *djsatane*
> 
> If the 390 series are just rebrands and don't have HBM, then that's a big disappointment after all the HBM hype.


Not really; AMD never hyped HBM as being on all the cards.

Just because random people on the internet hype things up doesn't make it true, or expected.

All that matters is whether they can compete on price/performance while supplying ample VRAM, which they will, and always have done.

Sure, it would be nice if all the cards were new tech, but if that's not possible, then so be it.

I'll be buying an R9 390/390X with 8GB, as I only like spending up to $400 on a card anyway. Nvidia pricing is a joke in Australia: GTX 970s are $500. That makes the choice a lot easier, as you can buy R9 290s for $350 atm.

Plus, 3.5GB of VRAM isn't enough, or at least won't be when I get my 1440p display; I like to keep my cards for some years.

For people who want high-end cards and will spend more, Fury will be available.


----------



## PontiacGTX

Quote:


> Originally Posted by *Casey Ryback*
> 
> Not really, AMD never hyped anything about HBM being on all cards.
> 
> Just because random people on the internet hype things up, doesn't make it true, or expected.
> 
> All that matters is if they can compete with price performance, whilst supplying ample vram, which they will, and always have done.
> 
> Sure it would be nice if all the cards were new tech, but if it's not possible then so be it.
> 
> I'll be buying an R9 390/390X with 8GB as I only like spending >$400 on a card anyway. nvidia pricing is a joke in australia though, gtx 970's are $500. Makes the choice a lot easier, as you can buy R9 290's for $350 atm.
> 
> Plus 3.5GB vram isn't enough, or at least won't be when I get my 1440p display, I like to keep my cards for some years
> 
> For people that want high end cards and spend more, fury will be available.


Hold on to your 7970; it could be fine until 2016, when you'll get better performance for your money.


----------



## Casey Ryback

Quote:


> Originally Posted by *PontiacGTX*
> 
> Hold on to your 7970; it could be fine until 2016, when you'll get better performance for your money.


I've been holding it for years, though.

Started with a Sapphire 7950 ($350),

> sold it to a miner for $310,

> bought a Gigabyte 7970 for $330.

I'd prefer to sell the 7970 before the warranty ends, preferably with a year+ still on it.

If you wait, you wait forever, and the 7970 just isn't cutting it. I think I've been running the 7000 series for 3+ years now.

I won't be buying an Nvidia card in 2016 anyway, as Australian prices suck so badly, plus they don't really supply the cards I'm looking for in the mid-price section of the market.

I don't see AMD releasing anything new till at least late 2016, and with the value I've had from AMD, I won't be jumping ship anytime soon.

Going up to 1440p shortly, and this 7970 just won't cut it anymore; I will wait till prices settle on the 300 series, though, probably giving it 3-4 months before buying.

If I had US market pricing, it would make my decision a lot tougher. For example, the 980 Ti is $1000 here... you get the picture.


----------



## PontiacGTX

Quote:


> Originally Posted by *Casey Ryback*
> 
> If you wait, you wait forever, and the 7970 just isn't cutting it. I think I've been running the 7000 series for 3+ years now.
> 
> I won't be buying an Nvidia card in 2016 anyway, as Australian prices suck so badly, plus they don't really supply the cards I'm looking for in the mid-price section of the market. I don't see AMD releasing anything new till at least late 2016.
> 
> Going up to 1440p shortly, and this 7970 just won't cut it anymore. I will wait till prices settle on the 300 series though; probably give it 3-4 months before buying.


If you wait, you do it for something you know is a good improvement for the money, unless you don't mind the power consumption.
I'm not talking about an Nvidia card; maybe an R9 470X or 480 would beat a 390/390X.


----------



## Casey Ryback

Quote:


> Originally Posted by *PontiacGTX*
> 
> if you wait, you do it for something you know is a good improvement for the money, not a small upgrade which would look small next year
> 
> I'm not talking about an Nvidia card; maybe an R9 470X or 480 would beat a 390X


But when will they be released... 2017? It's not exactly on the horizon. You're telling me to wait for the next series when this one hasn't even been released!

I probably won't be pulling the trigger if performance is identical to the current R9 290s, unless they are really cheap.

I'm hoping the rumours are true and they have slightly improved the tessellation units and efficiency. Along with the DX12 features, that might create quite a large gap between this card and the 7970 in those titles.

I am eagerly awaiting the benchmarks.


----------






## bobbavet

Quote:


> Originally Posted by *Peter Nixeus*
> 
> Hoping getting Star Wars: Battlefront bundled with a Fury is true!


Hey! I started that rumor, so it better be true. lols

Thought this would have been posted by now. Real or more Herp Derp?

AMD Radeon Fury X beats Titan X in leaked CompuBench OpenCL benchmark


----------



## bobbavet

*AMD Radeon Fury X And Fury Specs Confirmed - Immensely Powerful Cards Powered By Fiji, World's First HBM GPU*


----------



## Forceman

So it's "confirmed" but the clock speed is listed as >= 1050. Uh huh, confirmed.


----------



## hamzta09

So what you're saying is that AMD is no longer announcing anything at E3?

Then why are they there?
Quote:


> Originally Posted by *Casey Ryback*
> 
> 
> AMD officially announced the 16th as the date.
> 
> http://www.anandtech.com/show/9335/amd-confirms-june-16th-date-for-upcoming-gpu-announcement
> 
> Yet you are accusing people of not keeping up with anything, not only that but in a superior manner.
> 
> Back under your bridge sir/madam.


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> So what you're saying is that AMD is no longer announcing anything at E3?
> 
> Then why are they there?


You don't know the dates of E3 obviously?

https://www.e3expo.com/home

16th -18th June.


----------



## The Mac

It's during E3, close enough. 24 hrs of PC centric material, and a boatload of devs and AIBs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *The Mac*
> 
> It's during E3, close enough. 24 hrs of PC centric material, and a boatload of devs and AIBs.


And I will be glued to it for any and all info!









Seriously considering selling my 295x2 for one of these bad boys.....


----------



## 3930sabertooth

*This comment on AnandTech's website is worrying, and I hope this guy is completely wrong. I really want to support AMD, though.*

"well we already know that around CES. AMD revised the silicon. that is fact! as Titan was released, and AMD announced the first delay essentially. Went radio silent.
and then Lisa Su announced that Computex was the release for the new chip. We know this didnt happen. and a few things really got under my skin here.

The EA/DICE employee the posted that picture. was obviously a media ploy by AMD. AMD confirmed at computex that they dont even have bioses or clocks etc nailed down. and AMD didnt have a full unit for display even!! So that was a fake!! I'm very upset at this.

2nd. They missed another deadline here. and at E3 they will miss another. you cant have a card that far behind be ready in what 13 days! That's impossible! Remember they said they would be FOR SALE! There is NO WAY! Again another deadline. and really under my skin.

leaks coming out of Computex are. the AIB's were given a demo of an unfinished Fiji/ FURY whatever. and it performed worse than a Ti 980. So I would have hated to see the A revision. cause not beating a 980 vanilla. that wld have been really bad.

Instead realistically at the most we will see Fiji run as well as a 295 X2. and thats if they get every single thing 100% right! drivers, bios, water cooling. everything.

So realistically we may even see 85% of a 295X2 speed.

For perspective a Titan X is about 20-30% slower then 2 780Ti's in SLI.
And we all know they all use the same node at TSMC. or maybe AMD have global foundries making it. who knows then that makes things a bit harder to judge!"


----------



## Sgt Bilko

Found this while roaming the interwebz......thoughts?


----------



## Casey Ryback

Quote:


> Originally Posted by *3930sabertooth*
> 
> *
> The EA/DICE employee the posted that picture. was obviously a media ploy by AMD. AMD confirmed at computex that they dont even have bioses or clocks etc nailed down. and AMD didnt have a full unit for display even!! So that was a fake!! I'm very upset at this.
> 
> 2nd. They missed another deadline here. and at E3 they will miss another. you cant have a card that far behind be ready in what 13 days! That's impossible! Remember they said they would be FOR SALE! There is NO WAY! Again another deadline. and really under my skin.*


There's no way to prove whether the EA/DICE employee post was a stunt by AMD themselves. Accusations of 'a fake' are a little overboard.

If you get upset by delays in tech, then expect a lifetime of being upset lol.

I'm not certain they said Fury would be on sale on the 16th. I could be wrong, but afaik they said the 300 series would be released on the 16th, which it will be.

It sucks that it's coming a little later, but hey, it's not the first time tech has been delayed...


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Found this while roaming the interwebz......thoughts?


Godlike performance from the Fury: small, neat and awesome.


----------



## bobbavet

I was under the impression that the 300 series would be released at E3.

Then AMD confirmed a re-badged 300 series and announced the "Fury" range.

If "Fury" gameplay is not at least shown @ E3, along with paper specs and some benches, it's a fail afaic.

You don't go to a gunfight with a knife. Hence my thoughts on an E3 release of Fury and SWBF.

I'm thinking more a 980 Ti now to go with some 4K.

If I had the cash on hand, I would have done that by now.


----------



## hwoverclkd

well, AMD should deliver or go home








Quote:


> Originally Posted by *3930sabertooth*
> 
> *This comment on anandtechs website is worrying, and i hope this guy is completely wrong, I really want to support AMD though.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> "well we already know that around CES. AMD revised the silicon. that is fact! as Titan was released, and AMD announced the first delay essentially. Went radio silent.
> and then Lisa Su announced that Computex was the release for the new chip. We know this didnt happen. and a few things really got under my skin here.
> 
> The EA/DICE employee the posted that picture. was obviously a media ploy by AMD. AMD confirmed at computex that they dont even have bioses or clocks etc nailed down. and AMD didnt have a full unit for display even!! So that was a fake!! I'm very upset at this.
> 
> 2nd. They missed another deadline here. and at E3 they will miss another. you cant have a card that far behind be ready in what 13 days! That's impossible! Remember they said they would be FOR SALE! There is NO WAY! Again another deadline. and really under my skin.
> 
> leaks coming out of Computex are. the AIB's were given a demo of an unfinished Fiji/ FURY whatever. and it performed worse than a Ti 980. So I would have hated to see the A revision. cause not beating a 980 vanilla. that wld have been really bad.
> 
> Instead realistically at the most we will see Fiji run as well as a 295 X2. and thats if they get every single thing 100% right! drivers, bios, water cooling. everything.
> 
> So realistically we may even see 85% of a 295X2 speed.
> 
> For perspective a Titan X is about 20-30% slower then 2 780Ti's in SLI.
> And we all know they all use the same node at TSMC. or maybe AMD have global foundries making it. who knows then that makes things a bit harder to judge!"


----------



## Casey Ryback

Quote:


> Originally Posted by *bobbavet*
> 
> I was under the impression that the 300 series would be released at E3.
> 
> Then AMD confirmed a re-badged 300 series and announced the "Fury" range.
> 
> If "Fury" gameplay is not at least shown @ E3, along with paper specs and some benches, it's a fail afaic.
> 
> You don't go to a gunfight with a knife. Hence my thoughts on an E3 release of Fury and SWBF.
> 
> I'm thinking more a 980 Ti now to go with some 4K.
> 
> If I had the cash on hand, I would have done that by now.


Whether they always planned to have the 300 series separate from the Fury cards, only they would know.

Maybe they did it when they found out Fury wasn't going to be ready in time. Maybe it was always the plan.

It's so hard for me to actually remember the official announcements through the tides of rumours lol.

One can only hope Fury will be demonstrated at E3; it would be good for consumers to at least see what exactly is coming.


----------



## Casey Ryback

Quote:


> Originally Posted by *acupalypse*
> 
> well, AMD should deliver or go home


Wrong forum; this is OCN, not WCCFTech.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> well, AMD should deliver or go home
> 
> Wrong forum; this is OCN, not WCCFTech.

To be fair, WCCFTech has been pretty decent with their accuracy lately. Not going to take it all as fact, of course, but they are getting better.









But yeah, that was a stupid comment; of course Fury is going to deliver.


----------



## djsatane

So I wonder... do you think the actual Fury cards, and not the 290 series rebrands, will still come this year?


----------



## Sgt Bilko

Quote:


> Originally Posted by *djsatane*
> 
> So I wonder... do you think the actual Fury cards, and not the 290 series rebrands, will still come this year?


It's only a rebrand if nothing has been changed, imo, and yes, there's no reason to think that Fury won't launch alongside the rest of the 300 series.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Hold on to your hats and glasses guys, this next week is gonna be a wild ride.


I hope so... after the wait/delay and being beaten to the punch by NV, AMD really needs to bring out a halo product with lots o' wow factor. If _fury_ is as good as rumored, I'll be getting a couple... hopefully an SKU without some Fisher-Price CLC dangling off it. I pulled the CLC off my 295x2 as soon as a waterblock was available.
Quote:


> Originally Posted by *PontiacGTX*
> 
> hold your 7970. they could be fine untl 2016 when you will get a better performance for your money


Probably one of the top 3 gpus released in recent years. A landmark. I miss mine...


----------



## bobbavet

Hello! Hello! Wat we have ere?
http://www.tweaktown.com/news/45791/amd-rumored-bundling-star-wars-battlefront-each-fury-card/index.html


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Found this while roaming the interwebz......thoughts?


Looks pretty nice to me. Chances are I would rather custom-watercool, but I guess it depends on heat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bobbavet*
> 
> Hello! Hello! Wat we have ere?
> http://www.tweaktown.com/news/45791/amd-rumored-bundling-star-wars-battlefront-each-fury-card/index.html


WccfTech reported that first funnily enough








Quote:


> Our source was also able to confirm that AMD will bundle their top most cards with Star Wars Battlefront to sweeten the deal for Radeon users. This means that buyers who buy or pre-order the cards will immediately get $60 US worth of value on their purchase and gain access to a highly anticipated AAA title based. Cards under the Fiji based parts are not confirmed to be included in the bundle and will get Dirt: Rally instead.


http://wccftech.com/amd-radeon-fury-fiji-based-graphics-card-msi-pictured-roadmap-radeon-300-series-leaked/
Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Found this while roaming the interwebz......thoughts?
> 
> Looks pretty nice to me. Chances are I would rather custom-watercool, but I guess it depends on heat.

I doubt we'd just get a watercooled version and that's it; there will most likely be air-cooled cards as well.


----------



## bobbavet

So just say Fury X is 54% up on the 290X.
Where would that place it against the TX?

Going off to trawl through TX benches with the 290X.


----------



## bobbavet

WCCF may have reported it earlier, but I predicted it weeks ago. So excited.


----------



## DividebyZERO

It would be nice to know when they will go on sale, so hopefully I am not at work when they sell out. Then again, Nvidia has 100% market share, so maybe it won't be so bad.


----------



## Sgt Bilko

Quote:


> Originally Posted by *DividebyZERO*
> 
> It would be nice to know when they will go on sale, so hopefully I am not at work when they sell out. Then again, Nvidia has 100% market share, so maybe it won't be so bad.


Haha, I know the feeling. I was at work when the 290X went on sale here and I had no phone reception... had to stand on top of a truck to get enough signal to get my order in. Very lucky; snagged the last Sapphire 290X available on day one.


----------



## flopper

Quote:


> Originally Posted by *bobbavet*
> 
> So just say Fury X is 54% up on the 290X.
> Where would that place it against the TX?
> 
> Going off to trawl through TX benches with the 290X.


Depends on how they calculated those numbers: through game tests on average, or theoretical performance.
The Titan X is 43% faster on average than the 290X while costing 4x more.
The release of a $650 980 Ti shows the Fury is coming for Nvidia.
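Taking the two figures in this exchange at face value (both are rumors/averages from the thread, not measured benchmarks, so this is strictly a back-of-the-envelope sketch):

```python
# Back-of-the-envelope: combine the two percentages quoted in this thread.
# Rumor 1: Fury X is ~54% faster than a 290X.
# Rumor 2: Titan X is ~43% faster than a 290X on average.
r9_290x = 1.00               # use the 290X as the baseline
fury_x = r9_290x * 1.54      # rumored Fury X uplift
titan_x = r9_290x * 1.43     # quoted Titan X uplift

# If both numbers held, the Fury X's lead over the Titan X would be:
lead_pct = (fury_x / titan_x - 1) * 100
print(f"Fury X vs Titan X: {lead_pct:+.1f}%")  # roughly +7.7%
```

If both percentages were real, the Fury X would sit about 7-8% ahead of the Titan X; the whole comparison stands or falls on those two rumored inputs.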


----------



## The Mac

All that nonsense from the AnandTech comment is just that.

There was never an announcement by AMD that Fiji would be revealed at Computex. GPUs are never revealed at Computex.

That being said, Lisa Su did show the chip.

There is absolutely no proof whatsoever that they revised the silicon.

We only have the word of one AIB that they all got non-BIOS cards, and I'm not sure I believe the source.

I don't agree with a single word that guy posted.


----------



## jerrolds

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm in the same boat, but if I do pick one up it'll either come with an adapter or I'll just grab one for it; no big deal either way.


I don't think there are any active adapters that will do 120Hz at 1440p - I have a Dell/BizLink one and the highest I could hit was around 75Hz at 1440p. Unless you know of higher-bandwidth active adapters?
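For what it's worth, the ~75Hz ceiling described here is roughly what dual-link DVI's pixel clock allows at 1440p. A rough sketch (the CVT reduced-blanking frame totals and the ~330MHz dual-link limit below are approximations, not measured values for any particular adapter):

```python
# Rough pixel-clock math for a DisplayPort to dual-link DVI adapter at 2560x1440.
# Approximate CVT-RB (reduced blanking) frame totals for 2560x1440:
h_total = 2720           # 2560 active + horizontal blanking (approximate)
v_total = 1481           # 1440 active + vertical blanking (approximate)
dl_dvi_limit_hz = 330e6  # dual-link DVI pixel clock ceiling, ~2 x 165 MHz

# Pixel clock needed for 120 Hz at this resolution:
clock_120hz = h_total * v_total * 120
print(f"needed for 120 Hz: {clock_120hz / 1e6:.0f} MHz")    # ~483 MHz

# Maximum refresh rate the dual-link DVI limit can carry:
max_refresh = dl_dvi_limit_hz / (h_total * v_total)
print(f"dual-link DVI tops out near {max_refresh:.0f} Hz")  # ~82 Hz
```

That is why these adapters stall in the 75-80Hz range: 1440p at 120Hz needs roughly 480MHz of pixel clock, well past what dual-link DVI can deliver.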


----------



## zealord

http://videocardz.com/56225/amd-radeon-fury-x-3dmark-performance

It would fall into line with what we've heard and what I'd expect based on the specifications. But it might still be fake, though.


----------



## DNMock

Quote:


> Originally Posted by *zealord*
> 
> 
> 
> http://videocardz.com/56225/amd-radeon-fury-x-3dmark-performance
> 
> Would fall into line with what we've heard and what I'd expect them to be based on specifications. But might still be fake though


The 4GB, and the lack of a Fire Strike Ultra score on the CrossFire setup, are kind of concerning.


----------



## tsm106

Fake.


----------



## The Mac

I'm going with BS.


----------



## MiladEd

Quote:


> Originally Posted by *zealord*
> 
> 
> 
> http://videocardz.com/56225/amd-radeon-fury-x-3dmark-performance
> 
> Would fall into line with what we've heard and what I'd expect them to be based on specifications. But might still be fake though


It looks good. I just hope they price it competitively; then it will be a clear winner. It looks like AMD might make a serious comeback with this card. I hope.


----------



## tsm106

Quote:


> Originally Posted by *The Mac*
> 
> im going with BS.


^^Yep

The numbers WCCF are spouting are really suspect. At face value they land right in line with the existing performance spectrum, i.e. with expectations. If AMD had produced a conventional card, you could expect it to drop right where everyone predicted. But we don't know yet how good HBM is; we do know it's going to be something, since not only AMD but Nvidia too is banking on HBM as the future. Results that drop exactly into the ballpark you'd naturally expect are a little too convenient.


----------



## The Mac

For me it's less about that, and more about the fact that they have not legitimized their source.

Some random person named Lisa is not a valid source.


----------



## FrenzyBee

Quote:


> Originally Posted by *The Mac*
> 
> For me its less about that, and more about the fact that they have not legitimized their source.
> 
> Some random person named Lisa is not a valid source.


Pretty sure they meant Lisa Su, AMD's CEO.
Might be wrong tho.


----------



## tsm106

^^I know it's cute right? Lisa!


----------



## szeged

He was definitely trying to say Lisa Su, but probably left the last name out just in case someone found out he didn't have any contact with her; then he could be like, "oh no, I meant my good friend's brother's uncle's dog Lisa."


----------






## The Mac

Quote:


> Originally Posted by *FrenzyBee*
> 
> Pretty sure they meant Lisa Su, AMD's CEO.
> Might be wrong tho.


I can guarantee you any information he got did not come from Lisa Su.

He was clearly trying to imply this, however, which is where the fail is.


----------



## carlhil2

Know this guy? Lol..http://www.3dmark.com/fs/5036776


----------



## tsm106

Quote:


> Originally Posted by *carlhil2*
> 
> 
> Know this guy? Lol..http://www.3dmark.com/fs/5036776










@ WCCF

http://videocardz.com/56225/amd-radeon-fury-x-3dmark-performance

Lmao, their chart number doesn't even match the actual run. WTF.


----------



## devilhead

Quote:


> Originally Posted by *carlhil2*
> 
> 
> Know this guy? Lol..http://www.3dmark.com/fs/5036776


DigitalStorm https://www.youtube.com/user/digitalstormpc/featured
Looks like somebody is trolling them.


----------



## gatygun

Quote:


> Originally Posted by *3930sabertooth*
> 
> *This comment on anandtechs website is worrying, and i hope this guy is completely wrong, I really want to support AMD though.*
> 
> "well we already know that around CES. AMD revised the silicon. that is fact! as Titan was released, and AMD announced the first delay essentially. Went radio silent.
> and then Lisa Su announced that Computex was the release for the new chip. We know this didnt happen. and a few things really got under my skin here.
> 
> The EA/DICE employee the posted that picture. was obviously a media ploy by AMD. AMD confirmed at computex that they dont even have bioses or clocks etc nailed down. and AMD didnt have a full unit for display even!! So that was a fake!! I'm very upset at this.
> 
> 2nd. They missed another deadline here. and at E3 they will miss another. you cant have a card that far behind be ready in what 13 days! That's impossible! Remember they said they would be FOR SALE! There is NO WAY! Again another deadline. and really under my skin.
> 
> leaks coming out of Computex are. the AIB's were given a demo of an unfinished Fiji/ FURY whatever. and it performed worse than a Ti 980. So I would have hated to see the A revision. cause not beating a 980 vanilla. that wld have been really bad.
> 
> Instead realistically at the most we will see Fiji run as well as a 295 X2. and thats if they get every single thing 100% right! drivers, bios, water cooling. everything.
> 
> So realistically we may even see 85% of a 295X2 speed.
> 
> For perspective a Titan X is about 20-30% slower then 2 780Ti's in SLI.
> And we all know they all use the same node at TSMC. or maybe AMD have global foundries making it. who knows then that makes things a bit harder to judge!"


In my vision they need to drop HBM entirely. Go for GDDR5, push a 980 Ti-performance card for 450 euros with 8GB of GDDR5, and we have a winner. Build a slightly slower version for 100 euros less on top of that and be done.

Then build a 395X2 at 2x Titan X performance, with water cooling and 2x 8GB of VRAM, for 999 euros.

Launch a lineup with HBM2 next year.

If they're going to sell Fury cards for more than 500+ euros without any decent performance boost over a 980 Ti, and with only 4GB of VRAM, I don't see many people moving towards it at all.

These cards will probably be DOA then.

I hope they spent their time wisely. They are probably completely cornered by Nvidia atm, other than on price. So let's hope that's going somewhere.


----------



## PontiacGTX

Quote:


> Originally Posted by *gatygun*
> 
> In my vision they need to drop HBM entirely. Go for GDDR5, push a 980 Ti-performance card for 450 euros with 8GB of GDDR5, and we have a winner. Build a slightly slower version for 100 euros less on top of that and be done.
> 
> Then build a 395X2 at 2x Titan X performance, with water cooling and 2x 8GB of VRAM, for 999 euros.
> 
> Launch a lineup with HBM2 next year.
> 
> If they're going to sell Fury cards for more than 500+ euros without any decent performance boost over a 980 Ti, and with only 4GB of VRAM, I don't see many people moving towards it at all.
> 
> These cards will probably be DOA then.
> 
> I hope they spent their time wisely. They are probably completely cornered by Nvidia atm, other than on price. So let's hope that's going somewhere.


Why would they go with lower bandwidth?


----------



## flopper

Quote:


> Originally Posted by *PontiacGTX*
> 
> Why would they go with lower bandwidth?


yea LOL.
Fury Pro new tech 5 days to go


----------



## djsatane

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's only a rebrand if nothing has been changed imo and yes, there's no reason to think that Fury won't launch alongside the rest of the 300 series.


So people out there still think the actual HBM Fury cards will launch or be shown at this E3? From what I read last week, it seems things are shaping up to AMD only releasing the GDDR5 rebrands (the 300 series, or whatever it is), and we won't see HBM cards for a while...


----------



## Sgt Bilko

Quote:


> Originally Posted by *djsatane*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's only a rebrand if nothing has been changed imo and yes, there's no reason to think that Fury won't launch alongside the rest of the 300 series.
> 
> 
> 
> So people out there still think the actual HBM Fury cards will launch or be shown at this E3? From what I read last week, it seems things are shaping up to AMD only releasing the GDDR5 rebrands (the 300 series, or whatever it is), and we won't see HBM cards for a while...

As I said before... we have no reason to believe that the entire 300 series + Fury won't be officially announced/released at E3 on the 16th.

AMD is doing a 24hr livestream for the event, so I dare say a lot of stuff will be covered during it.


----------



## $k1||z_r0k

nobody posted this yet?










looks like the inside of a Corsair Graphite 600T White case...


----------



## Ultracarpet

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> nobody posted this yet?
> 
> looks like the inside of a Corsair Graphite 600T White case...


I think I would have to leave the side of my case off just to look at it.


----------



## glenn37216

I wonder if the Fury will handle PhysX and GameWorks titles better than the 200 series did? Are we in for more late driver releases and excuses? I'm a little antsy to find out.


----------



## Casey Ryback

Quote:


> Originally Posted by *glenn37216*
> 
> I wonder if the Fury will handle PhysX and GameWorks titles better than the 200 series did? Are we in for more late driver releases and excuses? I'm a little antsy to find out.


1. Doubt it.

2. We'll probably still get the late drivers, as they are still too far down in market share to support their cards as well as Nvidia does.

It's kind of a catch-22: if you want them to improve, you need to buy their cards and tell your friends to purchase them too.


----------



## TK421

Will there be custom-PCB versions of the AMD Fury?

I would like to see a Lightning from MSI.


----------



## frunction

Quote:


> Originally Posted by *TK421*
> 
> would there be custom pcb version of amd fury?
> 
> I would like to see a lightning from msi


The latest rumor is that the Fury X will be reference-only and water-cooled, while the regular Fury will allow non-reference designs.

http://www.sweclockers.com/nyhet/20684-radeon-fury-x-endast-med-vattenkylning-nedskalade-fury-anmals-saknad


----------



## zealord

new leaked picture :

http://i.imgur.com/UJU62xR.jpg


----------



## tsm106

Quote:


> Originally Posted by *zealord*
> 
> new leaked picture :
> 
> http://i.imgur.com/UJU62xR.jpg


I posted in the news thread, but the more I look at this pic, the more it looks like the same card as in the WCCF pics. However, the tube and fan choice is odd.


----------



## zealord

Quote:


> Originally Posted by *tsm106*
> 
> I posted in the news thread, but the more I look at this pic it is the same as the wccf pics. However the tubes and fan choice is odd.


Oh, my bad then. Didn't see it; I only just came home an hour ago.


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> I posted in the news thread, but the more I look at this pic it is the same as the wccf pics. However the tubes and fan choice is odd.


Looks like the angle of the shot makes it seem like it's not connected, when it actually lines up perfectly with the return hose to the GPU. Looks like another Gentle Typhoon being tested with this.


----------



## TK421

1. buy gpu
2. steal typhoon
3. return to store claiming it overheated

profit!?!11!


----------



## zeppoli

I mean, how funny would it be if this new 390X (Fury card, or whatever it's called)

is...

slower than the 980 Ti,
as much money as the 980 Ti,

and nothing that impressive in the new series.

I mean, IMO AMD needs to hit a home run here; if they won't be as fast as Nvidia's second top-tier lineup, they'd better be much, much cheaper.


----------



## Agent Smith1984

In my opinion, the card needed to have 6GB+ of VRAM and be the same price or less than two 290Xs...

Neither of those will be the case, so I'm going to hold off on this series...


----------



## The Mac

I'm all over a Fiji Pro if it comes in under 5 bills.

If not, my Hawaii Pro will have to hold me over till 14nm.


----------



## zeppoli

Quote:


> Originally Posted by *Agent Smith1984*
> 
> In my opinion, the card needed to have 6GB+ VRAM and be the same price or less than 2) 290x....
> 
> Neither of those will be the case, so I'm going to hold off on this series....


Well, we don't know this for a fact, do we?

I mean, I tested a GTX 970 against my top-of-the-line R9 290 and the 970 was about 30% faster overall.

I want AMD to hit it out of the park.

They have to be faster than the 980 Ti at the same price,

or very close; maybe 10% slower, but 30% cheaper.

Otherwise they might as well shut their doors for PC GPUs.


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> well we dont know this for a fact do we?
> 
> I mean I tested a GTX 970 against my top of the line R9 290 and the 970 was about 30% faster over all.
> 
> I want AMD to hit it out of the park
> 
> They have to be faster than the 980 ti at the same price
> 
> or very close, maybe 10% slower, but 30% cheaper.
> 
> Otherwise they might as well shut their doors for PC GPU';s


Wut? Even a 980 is not 30% faster than any of my 290s.

Edit: oh wait, you're the one having trouble with the 290s. Saw your thread.


----------



## zeppoli

Quote:


> Originally Posted by *rdr09*
> 
> wut? even a 980 is not 30% faster than any of my 290s.
> 
> Edit: oh, wait, you are the one having trouble with the 290s. saw your thread.


You're kidding right?









I can do this all day long. The 980 is almost exactly 30% faster, but regardless, the 970 in my system was about 30% or so. I'm just throwing that out there; bottom line, AMD needs to come out strong. There is no reason they should lag behind (pardon the pun).


----------



## tsm106

http://www.3dmark.com/compare/fs/4477985/fs/2889373

Fastest 970 listed on search vs my lowly 290X. The 970 should be what, 25% faster than my 290X? A 1600+ core clocked 970 on a chiller (look at that CPU OC lol) vs my 290X at 1320 core on water; sounds fair to me.

Oh look, fastest valid 980 sli on search vs...

http://www.3dmark.com/compare/fs/2940940/fs/3100892


----------



## DividebyZERO

it just keeps getting worse and worse..


----------



## tsm106

I could do this all day.


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/compare/fs/4477985/fs/2889373
> 
> Fastest 970 listed on search vs my lowly 290x. The 970 should be what, 25% faster than my 290x? 1600+ core clocked 970 on a chiller (look at that cpu oc lol) or so vs my 290x at 1320 core on water, sounds fare to me.


I'm already missing my reference 290X on water







http://www.3dmark.com/fs/3049351


----------



## tsm106

It's unfair that cpu physics lol!

http://www.3dmark.com/compare/fs/3049351/fs/2889373

What gpu clock were you at devil?
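The point about CPU physics being "unfair" is why people compare the graphics subscore rather than the overall Fire Strike number: the overall result folds in a CPU-bound physics test, so a big CPU overclock inflates it. A minimal sketch of the subscore comparison (the scores below are made-up placeholders, not the numbers from the linked runs):

```python
# Compare two Fire Strike runs by graphics subscore, not overall score,
# so a CPU-heavy physics result can't skew the GPU comparison.
# NOTE: these scores are placeholders, not the linked runs' actual results.
run_a = {"graphics": 14500, "physics": 18000}
run_b = {"graphics": 13200, "physics": 12000}

def pct_faster(a: int, b: int) -> float:
    """How much faster score a is than score b, in percent."""
    return (a / b - 1) * 100

print(f"graphics: {pct_faster(run_a['graphics'], run_b['graphics']):+.1f}%")
print(f"physics:  {pct_faster(run_a['physics'], run_b['physics']):+.1f}%")
```

In this made-up example, run A's GPU lead is under 10% even though its physics score is 50% higher; comparing overall scores would blur the two together.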


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> It's unfair that cpu physics lol!
> 
> http://www.3dmark.com/compare/fs/3049351/fs/2889373
> 
> What gpu clock were you at devil?


1360-1370mhz core /1700 memory


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> You're kidding right?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> I can do this all day long. 980 is almost exactly 30% faster, but regardless, the 970 in my system was about 30% or so, I'm just throwing that out there, bottom line AMD needs to come out strong, there is no reason they should lag behind (pardon the pun)


No, I'm not. Unless you are comparing a stock 290 with a 970 that's boosting to 1400?

Can you at least OC your 290 to 1250 core? That's like a 970 boosting to 1500.


----------



## tp4tissue

Quote:


> Originally Posted by *TK421*
> 
> 1. buy gpu
> 2. steal typhoon
> 3. return to store claiming it overheated
> 
> profit!?!11!


Yea man... No way you get a typhoon with this thing.. just no way


----------



## TK421

Quote:


> Originally Posted by *tp4tissue*
> 
> Yea man... No way you get a typhoon with this thing.. just no way


you never know :3


----------



## zeppoli

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/compare/fs/4477985/fs/2889373
> 
> Fastest 970 listed on search vs my lowly 290x. The 970 should be what, 25% faster than my 290x? 1600+ core clocked 970 on a chiller (look at that cpu oc lol) or so vs my 290x at 1320 core on water, sounds fare to me.
> 
> Oh look, fastest valid 980 sli on search vs...
> 
> http://www.3dmark.com/compare/fs/2940940/fs/3100892


I never play 3dmark to be honest, is it a good game? I see all sorts of videos being played, I might have to look into that.

But AAA games, like GTA V, Witcher 3, Skyrim fully modded, Watch Dogs, FSX, DCS and all sorts of others, show the 970 (and especially the 980) being quite a bit faster than a 290.

Just search GTX 970 benchmark, hit images, and you'll see our mighty R9 290s in there, but you need to look down.


----------



## zeppoli

Serious time: are you all actually paying for 3DMark to benchmark, or are you using the free version?

I just used the free version.

Intel Core i7-4790K @ 4.4
AMD Radeon R9 290 @ 1073 / 1414
MSI Z97 GAMING 5 (MS-7917)
64-bit Windows 8.1 (6.3.9600)

scored 12987, with just 1 gpu not 2

http://www.3dmark.com/3dm11/9923752


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> Serious time, are you all actually paying for 3dmark to benchmark? or are you using the free version?
> 
> I just used the free version
> 
> Intel Core i7-4790K @ 4.4
> AMD Radeon R9 290 @ 1073 / 1414
> MSI Z97 GAMING 5 (MS-7917)
> 64-bit Windows 8.1 (6.3.9600)
> 
> scored 12987, with just 1 gpu not 2
> 
> http://www.3dmark.com/3dm11/9923752


That's 3DMark 11.









no wonder your 290 gets beat . . . it's slow . . .

http://www.3dmark.com/3dm11/8776470


----------



## tsm106




----------



## zeppoli

oh is that what it is now?

so I have two of them, I'll switch them out,

then you'll say what?


----------



## Jpmboy

lol - what's going on here? The Fury has to be as good as, hopefully better than, a 980 and maybe knock on a Titan X or two. Otherwise it will be very disappointing. The 290X is old news; AMD needs some wow factor on this release considering the hype.
Not comparing it to a TX, but at least to a last-gen 980:
http://www.3dmark.com/fs/4213354
http://www.3dmark.com/3dm11/9500248
Fury has to do better. (period) ... please.









... still likin my 295x2 tho.


----------



## Agent Smith1984

I'm seeing some pretty dumb posts.....

Sigh..


----------



## The Mac

just now?

read back...

lol


----------



## DividebyZERO

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - *what's going on here?* The Fury has to be as good as, hopefully better than, a 980 and maybe knock on a Titan X or two. Otherwise it will be very disappointing. The 290X is old news; AMD needs some wow factor on this release considering the hype.
> Not comparing it to a TX, but at least to a last-gen 980:
> http://www.3dmark.com/fs/4213354
> http://www.3dmark.com/3dm11/9500248
> Fury has to do better. (period) ... please.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ... still likin my 295x2 tho.


I think it's like "LOOK, SQUIRREL!" - I wanted to say originally I thought AMD keeping quiet on this was excellent. The more I read on here about Fiji, though, the more I wish we already had official information from AMD. My brain is starting to hurt reading all this crap popping up now. I don't recall ever seeing so much crazy stuff on OCN.


----------



## F4ze0ne

Reviews in the next few weeks?

Quote:


> Yeah that is in my .plan however *the upcoming week or two three will be busy with the AMD stuff.* But likely end of the month and once I receive a third card I'll look into a 3-way SLI article.


http://forums.guru3d.com/showpost.php?p=5094022


----------



## Gumbi

Quote:


> Originally Posted by *F4ze0ne*
> 
> Reviews in the next few weeks?
> http://forums.guru3d.com/showpost.php?p=5094022


A 290X is as good or slightly faster than a 970, no way is a 970 30% faster.

However, in CPU limited games, such as WoW, nVidia takes the cake.


----------



## glenn37216

Quote:


> Originally Posted by *Gumbi*
> 
> A 290X is as good or slightly faster than a 970, no way is a 970 30% faster.
> 
> However, in CPU limited games, such as WoW, nVidia takes the cake.


Really depends on what game you're playing. I know for a fact my 290X DirectCU II is slower in all GameWorks titles than my EVGA 970 SC is. By how much is up for debate.


----------



## PontiacGTX

Quote:


> Originally Posted by *Gumbi*
> 
> A 290X is as good or slightly faster than a 970, no way is a 970 30% faster.
> 
> However, in CPU limited games, such as WoW, nVidia takes the cake.


In Crysis AMD is better than Nvidia, and in CoH and CoH2 AMD is better than Nvidia. So no.

http://www.anandtech.com/show/8568/the-geforce-gtx-970-review-feat-evga/5


----------



## the9quad

I do wish I could afford both camps so I could have the best of both worlds to be honest. It is a pain to be locked out of things as an AMD user, but such is life.


----------



## hyp36rmax

Quote:


> Originally Posted by *the9quad*
> 
> I do wish I could afford both camps so I could have the best of both worlds to be honest. It is a pain to be locked out of things as an AMD user, but such is life.


That's where I'm at with a GTX 970, GTX 780 Ti, Crossfire R9 290Xs, and Crossfire HD 7970s. It's like owning an Xbox One and a PS4 lol....


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> lol - what's going on here? The Fury has to be as good as, hopefully better than, a 980 and maybe knock on a Titan X or two. Otherwise it will be very disappointing. The 290X is old news; AMD needs some wow factor on this release considering the hype.
> Not comparing it to a TX, but at least to a last-gen 980:
> http://www.3dmark.com/fs/4213354
> http://www.3dmark.com/3dm11/9500248
> Fury has to do better. (period) ... please.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ... still likin my 295x2 tho.


Who called the ringer in?


----------



## Alastair

Quote:


> Originally Posted by *the9quad*
> 
> I do wish I could afford both camps so I could have the best of both worlds to be honest. It is a pain to be locked out of things as an AMD user, but such is life.


You know what is even worse? Nvidia screwing over their old customers. People who wanted to find a use for their old 8500 GTs or GTS 450s as PhysX cards got screwed by Nvidia, because they think your money is worthless if you own Nvidia and AMD at the same time. Screw Nvidia. Because of that they essentially rendered both my old cards useless, and for that I will always buy AMD, whether they are ahead or behind in the graphics game.


----------



## 2002dunx

Without compute Nvidia has no use to me......

Do hope AMD haven't followed their lead with the latest product....

When dammit ?

dunx


----------



## hamzta09

What I think the Fury is like..


Spoiler: Warning: Spoiler!






Spoiler: Warning: Spoiler!


----------



## F4ze0ne

New pics on PCPer...









http://www.pcper.com/news/Graphics-Cards/AMD-Radeon-Fury-X-Graphics-Card-Pictured-Uses-2-x-8-pin-Power



Spoiler: Warning: Spoiler!


----------



## hyp36rmax

Yes the fan looks like a Nidec Gentle Typhoon AP29/30/31 as you can see from my own shot in comparison. This is cool stuff indeed!


----------



## DNMock

Quote:


> Originally Posted by *hyp36rmax*
> 
> Yes the fan looks like a Nidec Gentle Typhoon AP29/30/31 as you can see from my own shot in comparison. This is cool stuff indeed!


I wouldn't be surprised if they were using a Nidec Servo fan. Hell, I bet their thermal pads are Fujipoly and they are using some Gelid Extreme or something for their paste as well.

If they are going flagship, they are going all the way with it.


----------



## Dotachin

Quote:


> Originally Posted by *F4ze0ne*
> 
> New pics on PCPer...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.pcper.com/news/Graphics-Cards/AMD-Radeon-Fury-X-Graphics-Card-Pictured-Uses-2-x-8-pin-Power
> 
> 
> 
> Spoiler: Warning: Spoiler!


No grills? Absolutely no reason not to place a DVI port or two there then. Shame on them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F4ze0ne*
> 
> New pics on PCPer...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.pcper.com/news/Graphics-Cards/AMD-Radeon-Fury-X-Graphics-Card-Pictured-Uses-2-x-8-pin-Power
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No grills? Absolutely no reason not to place a DVI port or two there then. Shame on them.

Tell you what, I'll grab some glue for you to stick some on then









I have no issue with them wanting to move forward and discontinue the DVI ports on the cards (and I use a Qnix Evo II atm).

DisplayPort is the future, and they've been wanting to do this for a long time now....


----------



## Mega Man

Correction: I have been wanting them to do this for a long time.
Moreover... why not single slot, AMD? Why not this?


----------



## Dotachin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tell you what, I'll grab some glue for you to stick some on then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have no issue with them wanting to move forward and discontinue the DVI ports on the cards (And i use a Qnix Evo II atm).
> 
> Displayport is the future and they've been wanting to do this for a long time now....


So put 6 DP ports then.


----------



## Agent Smith1984

One of my 2 290's are sold, selling the second end of week. Just waiting for prices and benchmarks. Either pulling the trigger on fury, or 980 ti....

May go for dual 8gb 290x/390x depending on how the numbers work out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Tell you what, I'll grab some glue for you to stick some on then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have no issue with them wanting to move forward and discontinue the DVI ports on the cards (And i use a Qnix Evo II atm).
> 
> Displayport is the future and they've been wanting to do this for a long time now....
> 
> 
> 
> So put 6 DP ports then.

That one is confusing; I'm assuming they are using HDMI on there as well for the same reason a lot of people want DVI: it's widely adopted.


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That one is confusing, I'm assuming they are using HDMI on there as well because of the same reason alot of people want DVI, it's widely adopted.


DVI is going away; it's EOL.


----------



## Dotachin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That one is confusing, I'm assuming they are using HDMI on there as well because of the same reason alot of people want DVI, it's widely adopted.


They can fit 6 DP (12 miniDP) and 2 HDMI in there. They put half. You praise them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That one is confusing, I'm assuming they are using HDMI on there as well because of the same reason alot of people want DVI, it's widely adopted.
> 
> 
> 
> They can fit 6 DP (12 miniDP) and 2 HDMI in there. They put half. You praise them.

Praise them, did I?

Must have missed that part. You asked why no DVI; I answered.....


----------



## Dotachin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Praise them did i?
> 
> must have missed that part, you asked why no DVI, i answered.....


You said there was no reason to do it. I say there is no reason not to do it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Praise them did i?
> 
> must have missed that part, you asked why no DVI, i answered.....
> 
> 
> 
> You said there was no reason to do it. I say there is no reason not to do it.

Someone else answered it for you:
Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That one is confusing, I'm assuming they are using HDMI on there as well because of the same reason alot of people want DVI, it's widely adopted.
> 
> 
> 
> dvi is going away its EOL.


----------



## Dotachin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Someone else answered it for you:


Yeah, it's so EOL that only half of us here on OCN use them and only 95% or so of monitors have them.


----------



## tsm106

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Someone else answered it for you:
> 
> 
> 
> Yeah it's so eol that only half of us here in ocn use them and only 95% or so of monitors have them.

21 users in that poll.









If ya aren't buying one get out of the thread then. Your whining isn't going to make a DVI port appear.


----------



## Dotachin

Quote:


> Originally Posted by *tsm106*
> 
> 21 users in that poll.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If ya aren't buying one get out of the thread then. Your whining isn't going to make a DVI port appear.


I want to buy it, but it seems I won't, since AMD decided that bare metal is the way to go with GPUs. And as far as I know, complaining isn't against the ToS.


----------



## p4inkill3r

Yep, this is the owner's club thread. There are 1034039480394 other threads for you to cry about stuff in.


----------



## Dotachin

Quote:


> Originally Posted by *p4inkill3r*
> 
> Yep, this is the owner's club thread. There are 1034039480394 other threads for you to cry about stuff in.


Are you an owner? Is anyone here an owner yet? I said I wanted to buy it.


----------



## Mega Man

Quote:


> Originally Posted by *Dotachin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Someone else answered it for you:
> 
> 
> 
> Yeah it's so eol that only half of us here in ocn use them and only 95% or so of monitors have them.

So? Buy an adapter, or... update your monitor.


----------



## p4inkill3r

Quote:


> Originally Posted by *Dotachin*
> 
> Are you an owner? is anyone here an owner yet? I said I wanted to buy it.


The usual tack is to not go to owners' threads and complain about the product, that's all.


----------



## Dotachin

Quote:


> Originally Posted by *p4inkill3r*
> 
> The usual tact is to not go to owner's threads and complain about the product, that's all.


That picture brought me here. You need me to not complain? OK:
I hope we get a variant with DVI ports and no wasted space.








Quote:


> Originally Posted by *Mega Man*
> 
> so ? buy an adapter, or .... update your monitor


Adapters won't let you overclock, and monitors better than my Qnix cost twice as much or more.


----------



## p4inkill3r

Looks like you're running into the limitations of your monitor.


----------



## szeged

DVI needs to be phased out like VGA finally was.

Yeah, it'll suck for those with the Korean monitors (I have one myself), but if you really want Fury X I think it might be time to move to a real monitor instead of the pile of rejects the Korean companies didn't want to let rot.


----------



## Dotachin

Quote:


> Originally Posted by *p4inkill3r*
> 
> Looks like you're running into the limitations of your monitor.


Yes, I like pushing the limits with my hardware. I think you do as well (FX-8320 @ 4.8GHz)


----------



## Dotachin

Quote:


> Originally Posted by *szeged*
> 
> dvi needs to be phased out like vga finally was.
> 
> yeah itll suck for those with the korean monitors ( i have one myself ) but if you really want fury x i think it might be time to move to a real monitor instead of the pile of rejects the korean companies didnt want to let rot.


I understand you can buy stuff, I respect that. I have to be more picky. If there was one reason not to put a ($0.20?) DVI port on Fiji (i.e. a benefit)... but there really isn't, is there?

edit: sorry for double posting I messed up.


----------



## szeged

You're gonna have to ask AMD on that one. I really don't know the reason they didn't include it; they could have put a single-slot DVI port on there and kept the card potentially single slot for real watercooling. Just my opinion that DVI needs to be phased out, as it is a dinosaur at this point.


----------



## Dotachin

Quote:


> Originally Posted by *szeged*
> 
> youre gonna have to ask amd on that one. I really dont know the reason they didnt include it, they could have put a single slot dvi port on there and kept the card potentially single slot for real watercooling. Just my opinion that dvi needs to be phased out as it is a dinosaur at this point.


I agree, it should be phased out, but from monitors first. That way would be swifter imo.

edit: I mean mainstream monitors


----------



## F4ze0ne

I'm sure Sapphire will include an adapter in their version. This was the case with the 295x2.


----------



## LegacyLG

Some websites show it is still not enough to beat the Titan X


----------



## flopper

Quote:


> Originally Posted by *LegacyLG*
> 
> Some websites show it is still not enough to beat the Titan X


It's called rumors for a reason.
You'd be surprised how wrong they can be


----------



## The Mac

As far as the port selection goes, Robert Hallock mentioned there are some compliance hoops you have to jump through once you get beyond 3 monitors due to EFI levels.

He didn't really specify what they were.


----------



## DNMock

Quote:


> Originally Posted by *LegacyLG*
> 
> Some websites show it is still not enough to beat the Titan X


Even if these folks got their hands on a Fury XT to play with, in all likelihood they don't have the updated drivers to get it running optimally. Also, most of these leaked comparisons are based on benchmark programs like 3DMark, which are notoriously weighted towards Nvidia.


----------



## LegacyLG

Fair play guys


----------



## trodas

*First Benchmarks of AMD's Fiji GPU Surface - OpenCL CompuBench Performance Beats Out the TITAN-X*
http://wccftech.com/benchmarks-amds-fiji-gpu-surface-open-cl-compubench-performance-beats-titanx/



So, unless they are going to look like fools in just 3 days, the Titan X is owned









...

BTW, this is no surprise. The compute power was always shown to be notably higher than the Titan X's, so people who think that this is fake should think twice now









However, this is not a game performance test, sadly. And I believe that is what interests *MOST* buyers.


----------



## LegacyLG

Any crossfire trodas for 4k/5k?


----------



## Kane2207

Quote:


> Originally Posted by *trodas*
> 
> *First Benchmarks of AMD's Fiji GPU Surface - Open CL CompuBench Performance Beats Out the TITAN-X*http://wccftech.com/benchmarks-amds-fiji-gpu-surface-open-cl-compubench-performance-beats-titanx/
> 
> 
> 
> So, unless they are going to look like fools in just 3 days, the Titan X is owned
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> BTW, this is no surprise. The compute power was always shown to be notably higher than the Titan X's, so people who think that this is fake should think twice now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, this is not a game performance test, sadly. And I believe that is what interests *MOST* buyers.


Been posted before in a different thread. Also, OpenCL is something AMD have always excelled at - but it's usually worth squat in actual gaming benchmarks.

Plus, don't cherry-pick the graph that fits your bias; the article clearly shows other benches where the Titan wins.

And I believe that 'early bench' is actually someone performing some 'calculations', so I'd use a pinch of salt if I were you.


----------



## xer0h0ur

The only recent testing of the Fury X I saw was with the 15.5 driver. Not a chance in hell this is the driver they'll use at launch, though. Even with 15.5 it beat the 980 Ti and was just behind the Titan X. Though I believe it beat the Titan X at 4K? Don't remember.


----------



## xer0h0ur

You people do realize DisplayPort is superior in every possible way to DVI, right? All it takes is a simple adapter to keep using your old DVI monitor if you're that stubborn about not joining the modern era.
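To put rough numbers on the "superior in every way" claim - a back-of-the-envelope sketch using the public DVI 1.0 and DisplayPort 1.2 spec figures, not anything posted in this thread - DP 1.2 carries a bit over twice the video payload of dual-link DVI:

```python
# Back-of-the-envelope link bandwidth: dual-link DVI vs DisplayPort 1.2.
# Figures come from the published specs (assumed here, not from this thread).

# Dual-link DVI: two TMDS links at a 165 MHz max pixel clock each,
# carrying 24 bits of pixel data per clock in total across the pair.
dvi_payload_gbps = 2 * 165e6 * 24 / 1e9        # ~7.92 Gbit/s of video data

# DisplayPort 1.2 (HBR2): 4 lanes x 5.4 Gbit/s raw, 8b/10b encoding
# leaves 80% of the raw rate for payload.
dp12_payload_gbps = 4 * 5.4 * (8 / 10)         # ~17.28 Gbit/s of video data

print(f"DL-DVI payload:  {dvi_payload_gbps:.2f} Gbit/s")
print(f"DP 1.2 payload:  {dp12_payload_gbps:.2f} Gbit/s")
print(f"ratio: {dp12_payload_gbps / dvi_payload_gbps:.1f}x")
```

That headroom is what makes 4K@60 and high-refresh 1440p practical over a single DP cable where DVI runs out of steam.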


----------



## Mega Man

+1


----------



## flopper

HBM talk with Hallock about AMD world leading technology

http://www.redgamingtech.com/amd-exclusive-interview-high-bandwidth-memory-tressfx-graphics-technology-with-robert-hallock/


----------



## the9quad

Quote:


> Originally Posted by *xer0h0ur*
> 
> You people do realize Displayport is superior in every possible way to DVI right? All it takes is a simple adapter to keep using your old DVI monitor if you're that stubborn to not join the modern era.


I don't think it's stubborn; it's just a matter of people with DVI-D-only monitors now facing not only the price of a card but a new monitor or adapter on top of it. So instead of being a $600-700 upgrade, it is now a $1300 upgrade, and not everyone wants to, or can afford to, make that kind of investment.

1440p 120Hz IPS monitors aren't cheap - the ones with DisplayPort, anyway.

For that price you could almost do two 980 Tis and keep the old monitor working just fine.

I'm not knocking AMD's choice or DisplayPort, just saying it's not like people are being "stubborn". They are just being realistic.


----------



## Gumbi

Quote:


> Originally Posted by *the9quad*
> 
> I dont think its stubborn, its just a matter of people having dvi-d only monitors who are now faced with not only the price of a card, but adding a new monitor on top of it or an adapter. So instead if it being a $600-700 upgrade it is now a $1300 upgrade, not everyone wants to or can afford to make that kind of investment.
> 
> 1440p 120hz ips monitors aren't cheap, the ones with displayport anyway.
> 
> For that price you could almost do two 980ti's and keep the old monitor working just fine.
> 
> Im not knocking amd's choice or displayport, just saying its not like people are being "stubborn". They are just being realistic.


Haven't you heard of adapters?


----------



## djsatane

Quote:


> Originally Posted by *flopper*
> 
> HBM talk with Hallock about AMD world leading technology
> 
> http://www.redgamingtech.com/amd-exclusive-interview-high-bandwidth-memory-tressfx-graphics-technology-with-robert-hallock/


And yet all I see confirmed to come out soon so far are the non-HBM cards - the rebrands plus GDDR5....


----------



## the9quad

Quote:


> Originally Posted by *Gumbi*
> 
> Haven't you heard of adapters?


Yeah, and they effectively make the 120Hz monitor capable of 75Hz at best on a good day. So yeah, no true DVI output means an upgrade to a new monitor if you want to keep the refresh rate.


----------



## Mega Man

You do know they make DL DVI adapters?


----------



## the9quad

Quote:


> Originally Posted by *Mega Man*
> 
> you do know they make DL dvi adapters ?


They make an adapter someone has gotten to work with Korean monitors at 120Hz? The ones I have seen have issues barely hitting 75Hz. That's the active DL ones, btw. If they do make ones that work with them, can you post the make and model? I'd be interested in one.


----------



## Mega Man

To my understanding, 75Hz = single link,

i.e. this:

http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=332-2272&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410

A dual-link DVI adapter is:

http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=470-AANW&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
http://www.monoprice.com/Product?p_id=6904&catargetid=320013720000066114&cadevice=c&kpid=106904&gclid=CPjslpOsj8YCFQyqaQodWSEAsw

According to this (and many others; first link I could find), it works fine. When I was first getting into Eyefinity I looked into getting a DL DVI adapter:

https://tinkertry.com/close-look-qnix-qx2700-led-2560x1440-27-inch-monitor
http://www.overclock.net/t/1406251/monoprice-zero-g-slim-vs-glass-panel-pro


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> to my understanding 75hz= single link
> 
> IE this
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=332-2272&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> 
> dual link DVI adapter is
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=470-AANW&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> http://www.monoprice.com/Product?p_id=6904&catargetid=320013720000066114&cadevice=c&kpid=106904&gclid=CPjslpOsj8YCFQyqaQodWSEAsw
> 
> according to this ( and many others first link i could fine), it works fine when i was first getting into eyefinity i looked into getting a DL DVI adapter
> 
> https://tinkertry.com/close-look-qnix-qx2700-led-2560x1440-27-inch-monitor
> http://www.overclock.net/t/1406251/monoprice-zero-g-slim-vs-glass-panel-pro


Doesn't ToastyX's AMD pixel clock patch utility help with this stuff as well? Yeah, I know it's third party and all, but I used it and it never seems to fail me. Perhaps I am ignorant of what exactly the whole DVI adapter argument going on right now is about.


----------



## the9quad

Quote:


> Originally Posted by *Mega Man*
> 
> to my understanding 75hz= single link
> 
> IE this
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=332-2272&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> 
> dual link DVI adapter is
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=470-AANW&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> http://www.monoprice.com/Product?p_id=6904&catargetid=320013720000066114&cadevice=c&kpid=106904&gclid=CPjslpOsj8YCFQyqaQodWSEAsw
> 
> according to this ( and many others first link i could fine), it works fine when i was first getting into eyefinity i looked into getting a DL DVI adapter
> 
> https://tinkertry.com/close-look-qnix-qx2700-led-2560x1440-27-inch-monitor
> http://www.overclock.net/t/1406251/monoprice-zero-g-slim-vs-glass-panel-pro


Yeah, look at the reviews for those adapters; they have issues at 60Hz, so I'm thinking 120Hz would be a no-go, but maybe they'd work. You still have to be willing to roll the dice, and if they don't work you either have a useless adapter and monitor or a 120Hz monitor stuck at 60Hz.

+rep man, appreciate it.


----------



## the9quad

Quote:


> Originally Posted by *DividebyZERO*
> 
> Doesn't ToastyX's AMD PIXEL CLOCK PATCH utility help with this stuff as well? Yeah i know it's third party and all, but i used it and it never seems to fail me. Perhaps i am ignorant to what exactly the whole dvi adapter argument that is going on right now.


You have to use that patcher to exceed the DVI-D limit; the issue is whether those active adapters can handle it. Seems to me they are flaky even at 60Hz based on reviews, so 120Hz is just asking for more problems.
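For anyone wondering why the patch is needed at all, the arithmetic is simple. A quick sketch - the 2720x1481 blanking totals below are approximate CVT-reduced-blanking figures for a 2560x1440 panel, assumed here rather than measured on any specific Qnix:

```python
# Required pixel clock = total horizontal pixels * total vertical lines * refresh.
# Active area 2560x1440; totals assume CVT reduced blanking (approximate).
h_total, v_total = 2720, 1481

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock in MHz needed to drive the panel at the given refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165   # DVI spec limit per TMDS link
DUAL_LINK_DVI_MHZ = 330     # two links ganged together

for hz in (60, 96, 120):
    clk = pixel_clock_mhz(hz)
    status = "exceeded" if clk > DUAL_LINK_DVI_MHZ else "OK"
    print(f"{hz:3d} Hz -> {clk:6.1f} MHz (DL-DVI spec limit {status})")
```

120Hz at 1440p needs roughly 480 MHz of pixel clock - well past the 330 MHz dual-link DVI spec - which is why overclocked Korean panels need the driver-side pixel clock patch and why active adapters rated to spec fall over.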


----------



## trodas

Quote:


> The only recent testing of Fury X I saw was with the 15.5 driver. Not a chance in hell this is the driver they use at launch though. Even with the 15.5 it beat 980 Ti and was just behind Titan X. Though I believe it beat Titan X at 4K? Don't remember.


I say it again - with all the technologically superior characteristics of the Fury, it would be tragic if AMD managed to screw it up and did not overtake the Titan X for the crown

Sure, it could happen, but I bet pretty high that AMD won't let that happen.
This is their chance (Pascal is NEXT year and only IF TSMC holds true to their new manufacturing process, which is ATM only a PROMISE... so Pascal and HBM2 RAM for Nvidia next year is only hypothetical - TSMC has failed to deliver on most of their promises and has a kinda bad track record, so... I don't want to be in the Nvidia guys' skin when TSMC fails to deliver again...) and they are not stupid enough to let it go.

And it would have to be a monstrous screw-up if they cannot beat the Titan X with all their technological advantages - HBM RAM, a 4096-bit-wide memory bus (first in history!), 4096 stream processors (world record!), 128 ROP units (simply Earthshaking) and reasonable clocks. Even if they end up a bit lower on O/C potential than the Titan X, they still have to beat it.

It would be a good thing to remember that AMD knows how to set records: the fastest overclocking CPUs are the AMD FX-8xxx:
http://hwbot.org/benchmark/cpu_frequency/halloffame

- AMD holds the first 4 places in maximum frequency
- AMD was the first to produce a 5GHz CPU
- AMD was the first to include an on-die memory controller on a CPU, now standard...
- AMD was the first to add 64-bit instructions to x86 CPUs, now standard...

So if AMD gets hold of a better manufacturing process than Nvidia, then Nvidia is screwed completely... (and given the TSMC situation this might happen sooner than people think...)


----------



## Casey Ryback

Quote:


> Originally Posted by *trodas*
> 
> And it would have to be a monstrous screw-up if they cannot beat the Titan X with all their technological advantages - HBM RAM, a 4096-bit-wide memory bus (first in history!), 4096 stream processors (world record!), 128 ROP units (simply Earthshaking) and reasonable clocks. Even if they end up a bit lower on O/C potential than the Titan X, they still have to beat it.
> 
> It would be a good thing to remember that AMD knows how to set records: the fastest overclocking CPUs are the AMD FX-8xxx:
> http://hwbot.org/benchmark/cpu_frequency/halloffame
> 
> - AMD holds the first 4 places in maximum frequency
> - AMD was the first to produce a 5GHz CPU
> - AMD was the first to include an on-die memory controller on a CPU, now standard...
> - AMD was the first to add 64-bit instructions to x86 CPUs, now standard...
> 
> So if AMD gets hold of a better manufacturing process than Nvidia, then Nvidia is screwed completely... (and given the TSMC situation this might happen sooner than people think...)


It's not going to be a monstrous screw-up; with that hardware it should smash the Titan X in compute performance and probably match it in gaming performance.

Why are AMD expected to take this so-called 'crown'? It's just a myth really; they will do what they can with the hardware they have.

Instead of dwelling on the past, when AMD and Nvidia would trade the 'crown' and be much closer in hardware performance, just accept those days are gone... for now.

Maybe AMD will be back on top in a couple of years... who knows. I'll support them if they can provide a decent product for a good price.

The top-end segment where the crown is won is honestly such a small part of GPU sales.


----------



## Imburnal

I want to rock 3 curved 1440p monitors with this thing called Fiji. How many Fijis do I need? By the way, count me in.


----------



## p4inkill3r

AMD's tenacity is frequently overlooked, I feel.
They have a fraction of the R&D budget yet are still able to compete with the juggernauts of Nvidia and Intel. _If_ Fiji takes the crown, is it an indictment of Nvidia's complacency? Is it a testament to AMD's ability to do more with less? Is it a combination of the two?

I don't know, but for all the crap AMD gets about their products, CPUs included, the fact that they are even able to trade blows with the most dominant corporations in their respective fields is amazing.


----------



## the9quad

They just don't have the market share needed to push their innovations, and more often than not those wither on the vine.


----------



## flopper

Quote:


> Originally Posted by *p4inkill3r*
> 
> AMD's tenacity is overlooked frequently, I feel.
> They have a fraction of the R&D yet still are able to compete with the juggernauts of nvidia and intel. _If_ Fiji takes the crown, is it an indictment of nvidia's complacency? Is it testament to AMD's ability to do more with less? Is it a combination of the two?
> 
> I don't know, but for all the crap AMD gets about their products, CPUs included, the fact that they are even able to trade blows with the most dominant corporations in their respective fields is amazing.


They have the better technology for DX12 and are going for Eyefinity, 4K and bigger resolutions.
Fury can't come fast enough


----------



## jerrolds

Quote:


> Originally Posted by *Mega Man*
> 
> to my understanding 75hz= single link
> 
> IE this
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=332-2272&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> 
> dual link DVI adapter is
> 
> http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=&cs=04&sku=470-AANW&ST=pla&dgc=ST&cid=262075&lid=4742361&acd=1230980794501410
> http://www.monoprice.com/Product?p_id=6904&catargetid=320013720000066114&cadevice=c&kpid=106904&gclid=CPjslpOsj8YCFQyqaQodWSEAsw
> 
> according to this ( and many others first link i could fine), it works fine when i was first getting into eyefinity i looked into getting a DL DVI adapter
> 
> https://tinkertry.com/close-look-qnix-qx2700-led-2560x1440-27-inch-monitor
> http://www.overclock.net/t/1406251/monoprice-zero-g-slim-vs-glass-panel-pro


There are no dual-link DVI active adapters that can hit [email protected] (460 MHz / ~10 GB/s) AFAIK. Mine (Accell) topped out at around 75 Hz.

I want to get Fiji, but I might have to switch to the green team since no monitor can beat my QNIX from a price-to-performance standpoint for me. The only one I was interested in as a possible upgrade was the Acer Predator ultrawide, but it's only 75 Hz.

The 144 Hz 1440p ones out now aren't worth it at 4x the price for me.


----------



## xer0h0ur

The main difference between Maxwell and GCN is that GCN is a massively parallel architecture with higher compute power and the ability to perform more tasks asynchronously from the graphics queue. This difference is not going to be noticed until DX12 and Vulkan take a foothold. Then people are going to realize just how gimped Maxwell 2's compute power is.

Hell, even AMD's years-old cards support feature level 12_0 in DX12, and nothing before Maxwell 2 supports it. Nvidia's claim that Maxwell 2 is the only one that supports full DX12 is also FUD, since they don't support resource binding tier 3; it does, however, support feature level 12_1, which so far none of AMD's cards have been confirmed to support.

This is a good article explaining the architectural lead AMD has over Nvidia going into the DX12 and Vulkan era: http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading
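As a toy illustration of why independent queues matter (made-up millisecond figures, not measurements from any real GPU), serialized work sums while asynchronous work overlaps:

```python
# Toy model of asynchronous shading: with one serialized queue the GPU
# finishes graphics work before starting compute work; with independent
# queues (as with GCN's asynchronous compute engines) the two can overlap.
# All task durations below are made-up illustrative numbers.

def frame_time_serial(graphics_ms: float, compute_ms: float) -> float:
    """One queue: compute work waits for graphics work to finish."""
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms: float, compute_ms: float) -> float:
    """Independent queues: the frame ends when the longest task does."""
    return max(graphics_ms, compute_ms)

g, c = 10.0, 6.0  # hypothetical per-frame workloads in milliseconds
print(frame_time_serial(g, c))  # 16.0
print(frame_time_async(g, c))   # 10.0
```

The idle-ALU time reclaimed by overlapping is exactly the difference between the two results; real gains depend on how much compute work a game actually submits.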


----------



## gatygun

Quote:


> Originally Posted by *xer0h0ur*
> 
> The main difference between Maxwell and GCN is that GCN is a massively parallel architecture with higher compute power and the ability to perform more tasks asynchronously from the graphics queue. This difference is not going to be noticed until DX12 and Vulkan take foothold. Then people are going to realize just how gimped of compute power Maxwell 2 is.
> 
> Hell, even AMD's years old cards support feature level 12_0 in DX12 and nothing before Maxwell 2 supports it. Nvidia's claims that Maxwell 2 is the only one that supports full DX12 is also FUD since they don't support resource binding tier 3, it does however support feature level 12_1 which so far none of AMD's cards have been confirmed to support.
> 
> This is good article explaining an architectural lead AMD has over Nvidia going into the DX12 and Vulkan era: http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading


Dunno, but I somehow get the feeling that my 290 was a good choice for the long haul.


----------



## xer0h0ur

As long as AMD doesn't abandon the GCN architecture going forward, you're likely to reap the benefits for a long time to come.


----------



## semitope

Quote:


> Power efficiency is an oft-used negative against the large-die Hawaii chips, but I've been playing with powertune settings and Furmark recently as an experiment to fit a "hot and noisy" AMD card into an SFF with limited cooling.
> 
> Actually, I stand by an earlier post I made that says I think AMD pushed Hawaii silicon too far.
> With both GPU-Z and Furmark able to report power consumptions, I can see a 100W reduction in power consumption on 290X cards for as little as 5% performance loss.
> 
> If you have a Hawaii card, I urge you to crank power limits down in the overdrive tab of CCC and see what the resulting clockspeed is under full load. Even in a worst-case scenario, I'm seeing a typical clockspeed of 850MHz with the slider all the way to the left at -50%
> 
> That means that Hawaii (the two samples I personally own, at least) can run at 850+MHz on only 145W (half the 290W TDP). As mentioned, that's a worst-case scenario using a power-virus like Furmark. Under real gaming situations (I was messing around with Alien Isolation on 1440p ultra settings) the clocks averaged about 925MHz yet my PC was inaudible; Fans that normally hum along at 55% were barely spinning at 30% during my gameplay.
> 
> As Nvidia has proved, you can make a 28nm chip run efficiently. I think the design of Hawaii holds up very well under vastly reduced power constraints - AMD just pushed it outside its comfort zone in order to get the most out of it.
> 
> In saying that, the "underpowered" 290X is around the same performance as my GTX970 and also the same cost - significantly higher than a GTX960 4GB. I don't know if die-harvested 290 cards deal with power limit caps like the cherry-picked 290X cards.


And that suggests it's actually energy efficient.

http://techreport.com/news/27996/4gb-gtx-960s-trickle-into-retail-channels?post=893388#893388

Comments
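Back-of-the-envelope arithmetic on the quoted figures (the poster's 290 W stock vs. ~145 W at the -50% PowerTune cap, with roughly 5% performance lost; these are that user's samples, not general data):

```python
# Rough perf-per-watt comparison using the figures from the quoted post.
# relative_perf is normalized to stock performance = 1.0.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

stock = perf_per_watt(1.00, 290.0)   # stock 290X, poster's figure
capped = perf_per_watt(0.95, 145.0)  # -50% power limit, ~5% perf loss

print(round(capped / stock, 2))  # 1.9, i.e. roughly 1.9x the efficiency
```

In other words, if those numbers hold, Hawaii dialed down trades 5% performance for roughly double the efficiency, which is the point being made above.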


----------



## xer0h0ur

Yeah, you can make a 28nm chip power efficient when it has a whopping 0.2 teraflops of FP64 compute power in its die. GCN has remained what it is: massively parallel. Pascal is going back in the direction Kepler was going, and it's not a coincidence. They know that Maxwell 2 is not an architecture they can stay with going forward.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah you can make a 28nm chip power efficient when it has a whopping 0.2 teraflops of FP64 compute power in its die. GCN has remained what it is, massively parallel. Pascal is going back into the direction Kepler was going and its not a coincidence. They know that Maxwell 2 is not an architecture they can remain with going forward.


It'll be fun for the people thinking Pascal will be so good when Nvidia has to redesign it like that.
I am not so hopeful for a smooth die-node change.
The next Fury could truly be a monster, though.


----------



## Evil Penguin

So... When do you all think it'll be available on newegg?


----------



## p4inkill3r

Quote:


> Originally Posted by *Evil Penguin*
> 
> So... When do you all think it'll be available on newegg?


Tomorrow, hopefully.


----------



## Forceman

Quote:


> Originally Posted by *p4inkill3r*
> 
> Tomorrow, hopefully.


I'm going to go out on a limb here and say there is no chance at all that Fury will be available tomorrow.


----------



## zealord

Quote:


> Originally Posted by *Forceman*
> 
> I'm going to go out on a limb here and say there is no chance at all that Fury will be available tomorrow.


I think you are spot on. We would've had a lot of leaks if that was the case.

If we are lucky then in 7-14 days they will be available. If we are lucky.

I say they just show the card tomorrow, say the name, and that's it.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> The main difference between Maxwell and GCN is that GCN is a massively parallel architecture with higher compute power and the ability to perform more tasks asynchronously from the graphics queue. This difference is not going to be noticed until DX12 and Vulkan take foothold. Then people are going to realize just how gimped of compute power Maxwell 2 is.
> 
> Hell, even AMD's years old cards support feature level 12_0 in DX12 and nothing before Maxwell 2 supports it. Nvidia's claims that Maxwell 2 is the only one that supports full DX12 is also FUD since they don't support resource binding tier 3, it does however support feature level 12_1 which so far none of AMD's cards have been confirmed to support.
> 
> This is good article explaining an architectural lead AMD has over Nvidia going into the DX12 and Vulkan era: http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading


http://mobile.extremetech.com/latest/223320-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver?origref=


----------



## xer0h0ur

So any particular reason you quoted me when nothing I said was wrong?

Edit: about the Maxwell 2 bit, yeah, apparently that chart lists the original Maxwell as supporting 12_0.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> So any particular reason you quoted me when nothing I said was wrong?


Because a lot of people have been quoting tier capability without including the feature-level requirement.

Guess it's down to the game engine after this. I heard the first two games with DX12 are going to be Arkham Knight and The Witcher 3.


----------



## xer0h0ur

Quote:


> Originally Posted by *cstkl1*
> 
> Cause alot of ppl been quoting tier capability without including feature level requirement.
> 
> Guess its down to game engine after this. Heard first two games with dx12 is gonna be arkham knights & witcher 3.


I would not use a game that wasn't built from the ground up for DX12 to compare. That is just me though; I don't trust patching DX12 into a game.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> I would not use a game that wasn't built from the ground up intended for DX12 to compare. That is just me though, I don't trust patching a game into DX12.


Why? DX feature levels are just more graphics fidelity/enhancements. In that respect a ground-up build is better, but DX12's main gain relative to previous DX versions is supposed to be reduced CPU overhead. So a patched-in DX12 path and a ground-up one would amount to the same thing there.


----------



## p4inkill3r

Quote:


> Originally Posted by *Forceman*
> 
> I'm going to go out on a limb here and say there is no chance at all that Fury will be available tomorrow.


I did add 'hopefully'.


----------



## xer0h0ur

I wouldn't downplay asynchronous compute but yes the largest wide sweeping improvement is multi-core CPU processing finally being able to handle entirely independent and different threads now.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> I wouldn't downplay asynchronous compute but yes the largest wide sweeping improvement is multi-core processing finally being able to handle entirely independent and different threads now.


Yet AMD's 7-series etc. fares badly on tessellation. The same debate is being repeated again, just like three years ago with the Windows 8 launch and DX11 feature levels.

We really have to see gaming performance. Personally, GPUs built from the ground up after a DX version launches are always better on support (the AMD DX10.1 saga, etc.). When DX12 comes out, let's see again: GCN 1.3 vs. Pascal.

BTW, does anybody know what time the NDA will be lifted?


----------



## xer0h0ur

Yeah I am not worried about tessellation. We can control the level of tessellation easily in our drivers. Nvidia just screws their own users and won't give users control over it since it wouldn't give them reason to hand them more money for a newer generation card. Tessellation was also vastly improved from Hawaii to Tonga and of course that is going into Fiji as well.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am not worried about tessellation. We can control the level of tessellation easily in our drivers. Nvidia just screws their own users and won't give users control over it since it wouldn't give them reason to hand them more money for a newer generation card. Tessellation was also vastly improved from Hawaii to Tonga and of course that is going into Fiji as well.


Funny. You're OK with AMD screwing you over by reducing a game engine's graphical fidelity, claiming that much realism isn't required. Everybody bought into the Mantle thing, which died out.
That's how the game is supposed to look.
And now you're stating as gospel truth what Fiji should have based on Tonga, while most of the 3xx refresh is based on Hawaii.

Lets see during the launch.


----------



## xer0h0ur

Is Fiji going to magically perform worse than Tonga or something? Are you on your knees praying to the based god for it? LOL

I give credit where it's due, but Nvidia has a habit of screwing everyone over. Even their own customers.


----------



## ssiperko

I will say, based on the video in the OP, that the memory architecture seems to be a big gain in and of itself... no?

Keep in mind I'm old and know that 28nm is ginormous compared to 20nm, but... the memory is impressive, yes?

SS


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> Is Fiji going to magically perform worse than Tonga or something? Are you on your knees praying to the based god for it? LOL
> 
> I give credit where its due, but Nvidia has a habit of screwing everyone over. Even their own customers.


Dude, you're hoping for something that hasn't been announced. I wait for things that exist. You're the one that's praying, same as those fools who keep quoting Pascal.

As for screwing people over: Nvidia consumers are nitpicking; basic gameplay works. I won't say the same for AMD, but you seem happy with the service AMD provides, and every time something goes wrong it's the game devs'/scalers'/everybody else's fault. Good job, AMD.


----------



## xer0h0ur

I am still skeptical that AMD can really overcome the 4GB HBM1 limitation on initial Fiji dies with its color compression and memory saving techniques whatever the hell that meant. I am just happy not very many games will eat up over 4GB of vRAM yet for a single monitor setup, even at 4K.


----------



## xer0h0ur

Quote:


> Originally Posted by *cstkl1*
> 
> Dude ure hoping for something thats hasnt been announced. I wait for things that is. Ure the one thats praying. Same as those fools who keep quoting pascal.
> 
> As for screwing over. Nvidia consumers are nitpicking. Basic gameplay works. I wont say amd but you seem to be happy with the services amd provide and everytime something goes wrong its the
> Game devs/scalers/ everybody elses fault. Good job amd.


Not even close, read the post I just made criticizing them. The difference is that I know that architectural improvements made on Tonga over Hawaii aren't going to magically disappear. That means tessellation improvements and color compression. You're cray cray if you honestly think those features are going to go up in smoke going from Tonga to Fiji. I don't need to wait for Fiji's official specifications to come out to take that to the bank. You are welcome to shove it in my face if I am wrong and I will eat the crow. I don't have a problem being wrong or admitting being wrong.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am still skeptical that AMD can really overcome the 4GB HBM1 limitation on initial Fiji dies with its color compression and memory saving techniques whatever the hell that meant. I am just happy not very many games will eat up over 4GB of vRAM yet for a single monitor setup, even at 4K.


Again, dude: don't know... everybody is speculating. Let's see how it does.
Me personally, I had issues with a Titan Black [email protected] at the fidelity I like gaming at. Tested a 970 and it was uber-stuttering.
I doubt the 4GB made a difference.

But Fiji has high bandwidth, so maybe something is different here. So seriously, let's see. Do you know what time the NDA will be lifted?

I don't care how Fiji performs in DX12; it's about games now. Same thing with Windows 10: it will be crap until we see a 10.1 or SP1, etc. If Fiji has Tonga's tessellation traits, awesome: it will slaughter GameWorks.

AMD/Nvidia GPUs never work well on a future DX.


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not even close, read the post I just made criticizing them. The difference is that I know that architectural improvements made on Tonga over Hawaii aren't going to magically disappear. That means tessellation improvements and color compression. You're cray cray if you honestly think those features are going to go up in smoke going from Tonga to Fiji. I don't need to wait for Fiji's official specifications to come out to take that to the bank. You are welcome to shove it in my face if I am wrong and I will eat the crow. I don't have a problem being wrong or admitting being wrong.


So the 390 refresh just made Tonga go up in smoke.
No, dude. I have no expectations; you do. Nvidia's Maxwell 2 Titan X DP situation already showed that any company's new products can take a turn. I don't view AMD any differently.

Let's see. For now the wow factor is that short card.
I am buying either this or a 980 Ti. That's a fact.


----------



## xer0h0ur

What? Don't you have a Titan X? Why in the world would you go to a 980 Ti?


----------



## cstkl1

Quote:


> Originally Posted by *xer0h0ur*
> 
> What? Don't you have a Titan X? Why in the world would you go to a 980 Ti?


Generally, over the last decade I've always had one rig from each camp. Now I need another in another place, as I seem to be spending a lot of time there, so the Fury X's compactness is appealing.

Pretty sure this is the dawn of the beastly, insanely small rig era.


----------



## xer0h0ur

I still need to see if the air-cooled version will remain as small. I have a feeling the air cooler on it will end up extending the length of the card instead of also being 19cm. If it's somehow still only 19cm long, then you're bang on right about the era of the mini beasts beginning.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> I still need to see if the air cooled version will remain as small.


Of course it won't.

It's just as power-hungry (maybe more, as they are now factory overclocked).

It would defy logic if they were not the same size; if they are smaller, they will probably be louder.


----------



## xer0h0ur

Quote:


> Originally Posted by *Casey Ryback*
> 
> Of course it won't.
> 
> It's just as power hungry (maybe more as they are now factory overclocked)
> 
> It will defy logic if they are not the same size, if they are smaller they will probably be louder.


What are you going on about? Read that post again....


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> What are you going on about? Read that post again....


lol, brain fade on my part... sigh.

Well, anyway, I highly doubt they will be able to tame such a beast with a small air cooler. The new chip is pretty big and has pretty high power consumption.

Then again, the PCB is a fair bit smaller, so maybe you're right that the length may not be an issue. It could be a 2.5- to 3-slot card, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> What are you going on about? Read that post again....
> 
> 
> 
> lol brain fade on my part....sigh.
> 
> Well anyway I highly doubt they will be able to tame such a beast with a small air cooler. The new chip is pretty big and has pretty high power consumption.

The card itself might be small but the cooler might be large...

I dunno, I'd love to see a 15cm-or-so air-cooled version, but yeah... I can dream, can't I?


----------



## DividebyZERO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The card itself might be small but the cooler might be large...
> 
> I dunno, id love to see a 15cm or so air cooled version but yeah.....i can dream can't i?


I don't know how I would feel about it. My PC-hardware-inept friends would come over and be like, "What is that?" They might mistake size for performance. Why do I sense a joke around the corner about size?


----------



## Creator

Man. AMD is so late that even the "owner's club" on OCN won't officially have an owner until like page 50.


----------



## semitope

Quote:


> Originally Posted by *xer0h0ur*
> 
> I still need to see if the air cooled version will remain as small. I have a feeling the air cooler on it will end up extending the length of the card instead of also being 19cm. If its some how still only 19cm long then you're bang on right about the era of the mini beasts beginning.


Should be longer. Possibly triple-fan.


----------



## xer0h0ur

Quote:


> Originally Posted by *Casey Ryback*
> 
> lol brain fade on my part....sigh.
> 
> Well anyway I highly doubt they will be able to tame such a beast with a small air cooler. The new chip is pretty big and has pretty high power consumption.
> 
> Then again the PCB is a fair bit smaller, maybe you're right the length may not be an issue, could be a 2.5 to 3 slot card though.


You're more than likely right about the length of the card, and AMD has said that their reference cooler for the air-cooled Fury X / Fury would be a triple-fan cooler. I can't imagine a triple-fan cooler that is only 19cm long. They have thankfully abandoned that hot-garbage blower-style cooler.


----------



## boredmug

If I got my hands on one of these it would have to be watercooled and not with the crappy AIO solution.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're more than likely right about the length of the card and AMD had said that their reference cooler for the air cooled Fury X / Fury would be a triple fan cooler. I can't imagine a triple fan cooler that is 19cm short. They have thankfully abandoned that hot garbage blower style cooler.


The PCB looks so short though, it will be odd having 1/3 of the heatsink hanging off the card with a fan on it.

I hope they go for a short chunky 2.5-3 slot cooler with 2 fans.


----------



## Ultracarpet

Quote:


> Originally Posted by *Casey Ryback*
> 
> lol brain fade on my part....sigh.
> 
> Well anyway I highly doubt they will be able to tame such a beast with a small air cooler. The new chip is pretty big and has pretty high power consumption.
> 
> Then again the PCB is a fair bit smaller, maybe you're right the length may not be an issue, could be a 2.5 to 3 slot card though.


Would be interesting to see a Fury / Fury Pro with lower clocks and voltage that could be tamed with a single fan: single-card performance somewhere between the 980 and the 980 Ti that could fit into a tiny little case.


----------



## Agent Smith1984

Well...

Just read that the PC Gaming show doesn't start until 5PM eastern standard time....
AMD is the first presenter, so I am guessing that's when Fury will be announced.

Better be good AMD, cause this 390X rebranding business has got you looking pretty bad right now....


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well...
> 
> Just read that the PC Gaming show doesn't start until 5PM eastern standard time....
> AMD is the first presenter, so I am guessing that's when Fury will be announced.
> 
> Better be good AMD, cause this 390X rebranding business has got you looking pretty bad right now....


Quote:


> Reminder: Webcast today
> 
> AMD Presents: A New Era of PC Gaming event live webcast.
> 
> Don't miss it! You can join us online at www.amd.com/graphics
> 
> Watch the event live via webcast today (06/16/15) @9am PT / 11am CT / Noon ET.
> 
> Tweet This #AMD300 Twitter


----------



## The Mac

The PC Gaming Show is the developer extravaganza hosted by PC Gamer; that's later in the evening.

The AMD webcast is at noon EST.


----------



## zeppoli

The Fury will likely be priced to compete with the 980 Ti, right? I mean, after all, the 980 Ti is pulling the same results/benchmarks as the Titan.

$700 IMO would be the high end for the Fury; no way they would think $1000 is a price to start at.

Right?


----------



## Agent Smith1984

I'm hoping for something like $629.99 with 980 Ti-esque performance.
If HBM VRAM is used (and filled) the same way that GDDR5 is, then it greatly hinders the card's marketability.

If, however, 4GB of HBM is as good as having 6+GB of GDDR5, then it should be a level playing field, with the scales tipping towards AMD at 4K resolution due to their much higher bandwidth.
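For reference, the published peak-bandwidth math behind that comparison (4096-bit HBM1 at 1 Gbps per pin on the Fury X vs. 384-bit GDDR5 at 7 Gbps per pin on the GTX 980 Ti):

```python
# Peak memory bandwidth in GB/s = bus width (bits) x per-pin data rate (Gbps) / 8.
# Published specs: Fury X HBM1 runs a 4096-bit bus at 1 Gbps/pin;
# the GTX 980 Ti's GDDR5 runs a 384-bit bus at 7 Gbps/pin.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(4096, 1.0))  # 512.0 GB/s  (Fury X)
print(bandwidth_gbs(384, 7.0))   # 336.0 GB/s  (GTX 980 Ti)
```

Note this is raw bandwidth only; whether it offsets the 4GB capacity limit is exactly the open question in this thread.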


----------



## zeppoli

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm hoping for something like $629.99 with 980 ti-esque performance.
> If HBM VRAM is used (and filled) the same way that GDDR5 is, then it greatly hinders the card's marketability.
> 
> If however, 4GB of HBM is as good as having 6+GB of GDDR5 then it should be a level playing field, with the scaling tipping towards AMD in 4K resolution due their much higher bandwidth.


I just need to wait for reviewers to compare the card; I only care about raw numbers, performance.

It needs to be faster than the 980 Ti at the same price, or close to/matching the 980 Ti and a little cheaper. Then I'm game.


----------



## jerrolds

Quote:


> Originally Posted by *zeppoli*
> 
> The fury will likely be priced to compete with the 980 ti right? I mean after all the 980 ti is pulling the same results/benchmarks as the titan.
> 
> 700.00 IMO would be the high end for the fury, no way they would think that 1000.00 dollars is a price to start at.
> 
> right?


If its performance is anywhere near the 980 Ti's, it needs to undercut it by a good margin to compete. Nvidia has the lion's share when it comes to enthusiast numbers; it's going to take a good price to get them to switch over. Brand loyalty is powerful.

I'd say it would have to be $549-599 if it's within 5% of the 980 Ti.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jerrolds*
> 
> If its performance is anywhere near 980ti, it needs to undercut it by a good margin to compete. Nvidia has the lions share when it comes to enthusiast numbers, going to take a good price to get them to switch over. Brand loyalty is powerful.
> 
> Id say it would have to be $549-599 if its within 5% of the 980ti


Agreed!


----------



## semitope

woooow. Single fan r9 nano fiji. looks cool


----------



## hyp36rmax

Updating OP as we go!


----------



## Gumbi

Edit: misquote


----------



## Sgt Bilko

Soooooo......I'm hyped, how about you guys?


----------



## Kokin

Wow $650 for the FuryX and $550 for the Fury. Nvidia is gonna have a huge price drop very soon.


----------



## hyp36rmax

Quote:


> Originally Posted by *Kokin*
> 
> Wow $650 for the FuryX and $550 for the Fury. Nvidia is gonna have a huge price drop very soon.


Daaaaaammmnnn!!! Nice! Now I'm curious about performance. *June 24, 2015*, along with a dual Fiji this fall.


----------



## tsm106

That NANO!!!!! I wonder if I could stick one into my daughter's H-Frame Mini?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kokin*
> 
> Wow $650 for the FuryX and $550 for the Fury. Nvidia is gonna have a huge price drop very soon.
> 
> 
> 
> Daaaaaammmnnn!!! Nice! Now I'm curious on performance. _*June 24, 2015*_

Oh yeah... I have a feeling my 295X2 might get sold pretty soon.


----------



## HiTechPixel

I'm already planning on going quadfire Fury X.


----------



## gatygun

Let's hope the cards are absolute beasts


----------



## Gumbi

Why would nVidia have to drop prices? Did they even give Fiji performance numbers yet?


----------



## DividebyZERO

What is the difference between the Fury X and the Fury? Sorry, I'm working so I haven't been able to watch the streams.


----------



## Kokin

Quote:


> Originally Posted by *tsm106*
> 
> That NANO!!!!! I wonder if I could stick one into my daughters h frame mini?


Yeah, I'm very curious about how this will perform, as it is supposed to have only half the power consumption of the R9 290X.
Quote:


> Originally Posted by *DividebyZERO*
> 
> What is the difference from fury x and fury? Sorry im working so i havent been able to watch the streams


Should be similar to the R9 290X vs. R9 290 split: the Fury will have disabled parts.

If rumors are true, the Fury X has 4000 stream processors, while the Fury has 3500-3600.


----------



## gatygun

Is there a 395X2 announced, one that contains 2x Fury chips?


----------



## Agent Smith1984

Wait, maybe I missed it with work and all, but did anyone see any info on the Fury X???

They just ended with the 3 series cards....


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> Is there a 395x2 announced? which contains 2x fury chips?


Not sure it's a 395x2 designation but there will be a dual fury card.


----------



## Gobigorgohome

Fury X for 650 USD, I'll take two!


----------



## gatygun

Quote:


> Originally Posted by *tsm106*
> 
> Not sure it's a 395x2 designation but there will be a dual fury card.


Nice, thanks.


----------



## Tom Brohanks

Lame, I thought reviews would be out today. When are the reviews?


----------



## p4inkill3r

I'll take one Fury X, please.


----------



## Hazardz

Quote:


> Originally Posted by *Kokin*
> 
> Yeah I'm very curious on how this will be performing, as it is supposed to only half the power consumption of the R9 290X.
> Should be similar to R9 290X and R9 290. The Fury will have disabled parts.
> 
> If rumors are true: the FuryX is 4000 stream processors, while the Fury is 3500~3600 stream processors.


http://wccftech.com/amd-radeon-r9-fury-r9-nano-fury-unveiled-fiji-gpu-based-hbm-powered-649-priced-small-form-factor-powerhouse/


Quote:


> Originally Posted by *Tom Brohanks*
> 
> Lame, I thought reviews would be out today. When are the reviews?


I guess that would be on the actual launch date, so next week.


----------



## semitope

They may be holding on to the Nano to clear 390X/290X stock; 429-549 is not a lot of room. Or they are waiting to have enough less-than-perfect Fury chips to cut down further than the Fury Pro. I'd watch out for Fury Nano BIOS unlocks, depending on how they bin those chips.


----------



## SPLWF

What are the chances of a Fury unlocking to a Fury X like the previous gens?


----------



## KeepWalkinG

I'll take one Fury X too


----------



## tsm106

Quote:


> Originally Posted by *SPLWF*
> 
> What are the chances of a Fury unlocking to a Fury X like the previous gens?


They'll probably use e-fuses here too, so not likely?


----------



## SPLWF

Quote:


> Originally Posted by *tsm106*
> 
> They'll probably use e-fuses here too, so not likely?


One can only hope it unlocks, we will see in the future....


----------



## velocityx

To be honest, as much as I love the cards (the power, the hardware, etc.), the biggest problem AMD has right now is its drivers. They offer no support; they didn't even mention how Mantle evolved and helped Vulkan because of DX12. TrueAudio? What is that? Sorry AMD, broken promises.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> to be honest, as much as I love the cards, the power, hardware etc, the biggest problem amd has right now is its drivers. they offer no support, they didnt even mention how mantle evolved and helped vulkan cuz dx12. true audio? wat is that? sorry amd, broken promises.


If you think it's greener on the other side, then... GL!


----------



## brucethemoose

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SPLWF*
> 
> What are the chances of a Fury unlocking to a Fury X like the previous gens?
> 
> 
> 
> They'll probably use e-fuses here too, so not likely?

Manufacturers can still stick Fury X chips into regular Fury cards if supply is low... But if that's the case, unlocks will be extremely rare.


----------



## Agent Smith1984

Looks like vanilla Fury will crush the 290X.

With such high frame-buffer bandwidth, will the size not matter as much?


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> if you think its greener on the other side, then . . . gl!


At least the drivers have less overhead. Who needs Mantle when multithreaded DX11 on the Nvidia side matches Mantle?

Just my 2c.


----------



## Agent Smith1984

Hmm....

A sooner launch date would be nice... I assumed we'd have cards on sale by Friday of this week.

Here's where this gets interesting....

295x2 is currently at $600.....

Will it drop even further? Even at $600 it's the best performance/dollar out there....

Will there be a 395x2 (relaunch of 295) or will there be a ~$1400 Fiji x2 released soon?

There's gotta be a lot of 290 stock still out there, and I can't see vendors moving it all that cheaply considering everything below Fiji is a rebrand.

Looks like 2 more years of mainstream Tahiti and Hawaii cards.......

What a testament to Tahiti....


----------



## Ashura

Interested to see how the Fury (Fiji Pro) performs & overclocks.
Could be good value for money.

I hope to join this club


----------



## zeppoli

People that are all excited right now: were any of you on the fence and now pushed over by the announcement? I mean, the price is what we expected, but what about the numbers!! I'm sure that's what we all must see.
If this card performs worse than the 980 Ti, are you still excited?


----------



## hyp36rmax

*Here are the slides from the announcement*



















Looks like the fan is not the Nidec Servo Gentle Typhoon AP29 we've seen in the latest leaks, but a Cooler Master Silencio FP 120, the same fan used on the Nepton 120XL and 240M coolers and in the Silencio 652S chassis.


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmm....
> 
> A sooner launch date would be nice... I assumed we'd have cards on sale by Friday of this week.
> 
> Here's where this gets interesting....
> 
> 295x2 is currently at $600.....
> 
> Will it drop even further? Even at $600 it's the best performance/dollar out there....
> 
> Will there be a 395x2 (relaunch of 295) or will there be a ~$1400 Fiji x2 released soon?
> 
> There's gotta be a lot of 290 stock still out there, and I can't see vendors moving it all that cheaply considering everything below Fiji is a rebrand.
> 
> Looks like 2 more years of mainstream Tahiti and Hawaii cards.......
> 
> What a testament to Tahiti....


What would be interesting is a 2x Fury X card for $999 and a 395X2 for $499 or something.

The 295X2 will probably be retired by the end of this year, though, and replaced.

What gpu are you going to buy next?


----------



## zeppoli

Quote:


> Originally Posted by *gatygun*
> 
> What would be interesting is a 2x Fury X card for $999 and a 395X2 for $499 or something.
> 
> The 295X2 will probably be retired by the end of this year, though, and replaced.
> 
> What gpu are you going to buy next?


No thanks. Personally I'm done with CrossFire. I assume the 295X2 acts as a CrossFire platform, correct?

No more micro stutters for me! I want a top-tier card that gives me 60+ FPS @ 1440p on high/ultra!


----------



## Agent Smith1984

I just finished selling off my Tri-X 290 CrossFire set....

Those cards performed wonderfully...


----------



## flopper

14 July Fury pro Riddick edition.
I want one now....


----------



## jerrolds

Goddammit, $649 for the flagship? Grrr. Super bummed that it has no DVI out... I hope it's within 5% of the 980 Ti, which would force Nvidia to drop their prices. I don't think I'll be changing my monitor anytime soon.


----------



## brucethemoose

Quote:


> Originally Posted by *hyp36rmax*
> 
> *Here are the slides from the announcement*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like the fan is not the Nidec Servo Gentle Typhoon AP29 we've seen in the latest leaks, but a Cooler Master Silencio FP 120, the same fan used on the Nepton 120XL and 240M coolers and in the Silencio 652S chassis.


They used an AP29 on the stage.


----------



## F4ze0ne

Quote:


> Originally Posted by *hyp36rmax*


The Nano is a cute little guy ready to pounce.


----------



## zealord

The Nano is probably, performance-wise, somewhere between a GTX 980 and a 980 Ti?


----------



## hyp36rmax

Quote:


> Originally Posted by *brucethemoose*
> 
> They used an AP29 on the stage.


Looks like they may have variations. We'll know for sure on July 24. I can see the Cooler Master fan in the final product since AMD has a great relationship with CM; they worked together on the FX-9590 AIO package last year. I wouldn't mind an AP29 either.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> At least the drivers have less overhead. Who needs Mantle when multithreaded DX11 on the Nvidia side matches Mantle?
> 
> just my 2c.


I use mantle for smoothness not because my i7 cannot keep up with my 2 290s.


----------



## flopper

early but


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> I use mantle for smoothness not because my i7 cannot keep up with my 2 290s.


I wasn't only talking FPS; I was also talking frame latency, and there, DX MT in ForceWare is equal to Mantle in smoothness. That's the thing. I was so happy about Mantle being so smooth until I realized the green side has the same thing, but with DX11.

You won't believe it till you see it.


----------



## Hazardz

Quote:


> Originally Posted by *flopper*
> 
> early but


If that chart is true, then the Fury X starts getting crippled by only having 4GB of HBM as the resolution goes up, dropping below the Titan X and GTX 980 Ti at 5K. I assume AMD will ready 8GB models some time next year.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> I wasn't only talking FPS; I was also talking frame latency, and there, DX MT in ForceWare is equal to Mantle in smoothness. That's the thing. I was so happy about Mantle being so smooth until I realized the green side has the same thing, but with DX11.
> 
> You won't believe it till you see it.


I only use Mantle in BF4; even the rest of my games are smooth. There was a study showing Hawaii is smoother than Maxwell, except in some GameWorks titles. Now, if you play WoW, then I guess you really need to go green. But since you've made up your mind . . . wish you the best.


----------



## flopper

Quote:


> Originally Posted by *Hazardz*
> 
> If that chart is true, then the Fury X starts getting crippled by only having 4GB of HBM as the resolution goes up, dropping below the Titan X and GTX 980 Ti at 5K. I assume AMD will ready 8GB models some time next year.


HBM1 is a stopgap until the die shrink and HBM2 are ready, and we have no idea how that's going to work out; unforeseen problems with the die or RAM and we're delayed months.
I'm getting a Fury Pro for sure


----------



## hyp36rmax

*+ Added Video overview to OP*


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> I only use Mantle in BF4; even the rest of my games are smooth. There was a study showing Hawaii is smoother than Maxwell, except in some GameWorks titles. Now, if you play WoW, then I guess you really need to go green. But since you've made up your mind . . . wish you the best.


Funny how you worded that, as if my disappointment at the lack of support, as a customer who pretty much blindly bought two AMD cards on day one because he believed in the company (even though I already wasn't happy with the 6970 CrossFire I had), were my own user error. I've been using AMD cards since 2008.

;] GL to you too. Just because the cards are cheaper doesn't mean you should accept being screwed over by a lack of support.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> early but


If the chart is legit..... then the Fury Pro is going to be a big hit.


----------



## zeppoli

So it beats not just the 980 Ti but the Titan in Fire Strike, and not by a tiny amount.

Well, I just read that it sees a 45 FPS average in Tomb Raider on ultra, at 5K. FIVE K.

That's insane if true, and means it will pretty much blow the Titan away.
SIGN ME UP!! When can I get one?


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> Funny how you worded that, as if my disappointment at the lack of support, as a customer who pretty much blindly bought two AMD cards on day one because he believed in the company (even though I already wasn't happy with the 6970 CrossFire I had), were my own user error. I've been using AMD cards since 2008.
> 
> ;] GL to you too. Just because the cards are cheaper doesn't mean you should accept being screwed over by a lack of support.


What lack of support? To you, maybe. I never have to revert to an old driver. Now, go to the other side and see how many revert to old drivers, so you'll be prepared.


----------



## Kuivamaa

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If the chart is legit..... then the Fury Pro is going to be a big hit.


It does show the limitation of the design, if true. People that push more than 4K will most likely face issues. But I play at 1440p, so a single Fury X seems an amazing deal. It nearly has the horsepower of a pair of 290Xs without the complications. I hope it will be available in Europe soon.


----------



## hamzta09

Quote:


> Originally Posted by *hyp36rmax*
> 
> *+ Added Video overview to OP*


"4K Gaming"
"League of Legends"

lmao

Anyway in 4h roughly AMD along with PC Gamer and other devs will stream on here from E3.


----------



## p4inkill3r

Quote:


> People that push more than 4K will most likely face issues.


How many people are gaming @ 5K right now? Hardly anyone is even at 4K. By the time 5K is mainstream, these cards will be as obsolete as an AIW HD.


----------



## szeged

They made it seem like people are swarming to get 4K monitors, lol. I don't think 4K will be considered mainstream for a while yet, even though it's making good progress.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If the chart is legit..... then the Fury Pro is going to be a big hit.


it's about as fast as 2 290s at stock.


----------



## flopper

Quote:


> Originally Posted by *rdr09*
> 
> it's about as fast as 2 290s at stock.


amazingly good


----------



## Peter Nixeus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmm....
> 
> A sooner launch date would be nice... I assumed we'd have cards on sale by Friday of this week.
> 
> Here's where this gets interesting....
> 
> 295x2 is currently at $600.....
> 
> Will it drop even further? Even at $600 it's the best performance/dollar out there....
> 
> Will there be a 395x2 (relaunch of 295) or will there be a ~$1400 Fiji x2 released soon?
> 
> There's gotta be a lot of 290 stock still out there, and I can't see vendors moving it but so cheaply considering everything below Fiji is a rebrand.
> 
> Looks like 2 more years of mainstream tahiti and Hawaii cards.......
> 
> What a testament to tahiti....


If there is a lot of R9 200 series stock left, it is mainly due to the Los Angeles port strikes, which kept shipments stuck in the ports for several months. Source: our monitor shipments were delayed by up to 8 weeks!


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> it's about as fast as 2 290s at stock.


True... and at $550 (with no indication yet of the OC ceiling) that makes them a pretty sweet package.


----------



## CM Felinni

Quote:


> Originally Posted by *Peter Nixeus*
> 
> If there is a lot of R9 200 series stock left, it is mainly due to the Los Angeles port strikes, which kept shipments stuck in the ports for several months. Source: our monitor shipments were delayed by up to 8 weeks!


We experienced the same delay due to the Los Angeles Port Labor Strikes in Long Beach. It's a relief to finally have our products trickle in for our clients and customers.

*Source:* http://www.nytimes.com/2015/02/13/us/west-coast-labor-dispute-brings-crippling-delays-to-seaports.html?_r=1


----------



## Agent Smith1984

Is AMD not making much of the 4GB HBM limitations because it seems to scale all the way up to 4K, and once you surpass 4k you need a second card anyways?

And with that being the case, are they further counting on DX12 to make use of dual GPU VRAM?

Launching a flagship with 4GB when your new (old) second-tier card has a standard 8GB, and your competition just launched 12GB and 6GB units, seems either brave or stupid....

I personally am fine with it; it just seems like a bold move to rely on only 4GB. I can almost already see the 4K "hitching" videos hitting YouTube


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is AMD not making much of the 4GB HBM limitations because it seems to scale all the way up to 4K, and once you surpass 4k you need a second card anyways?
> 
> And with that being the case, are they further counting on DX12 to make use of dual GPU VRAM?
> 
> Launching a flagship with 4GB when your new (old) second-tier card has a standard 8GB, and your competition just launched 12GB and 6GB units, seems either brave or stupid....
> 
> I personally am fine with it; it just seems like a bold move to rely on only 4GB. I can almost already see the 4K "hitching" videos hitting YouTube


DX12 changes the memory footprint.
Developers have more control over stuff.
You still need dual cards for 4K, and while you can play with one card, it's not viable on any card except the Fury X2.
Nvidia's cards are EOL and obsolete now.
Old tech.


----------



## Kokin

Quote:


> Originally Posted by *szeged*
> 
> They made it seem like people are swarming to get 4K monitors, lol. I don't think 4K will be considered mainstream for a while yet, even though it's making good progress.


Very true, but I'll always take a big chunk of performance increase over an incremental one any day, especially with a fair price that will eventually lower.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is AMD not making much of the 4GB HBM limitations because it seems to scale all the way up to 4K, and once you surpass 4k you need a second card anyways?
> 
> And with that being the case, are they further counting on DX12 to make use of dual GPU VRAM?
> 
> Launching a flagship with 4GB when your new (old) second-tier card has a standard 8GB, and your competition just launched 12GB and 6GB units, seems either brave or stupid....
> 
> I personally am fine with it; it just seems like a bold move to rely on only 4GB. I can almost already see the 4K "hitching" videos hitting YouTube


It's only because they are launching a new bleeding-edge technology (HBM). Think of Fury as the "in-between" card for people going from 1080p/1440p to 4K. It's unrealistic for AMD to release higher-density memory on a new technology when 99% of gamers are still on 1080p or even lower. That said, when 2nd-gen HBM releases, the next-gen GPUs should have 8/16/32GB of VRAM, and 4K/5K/8K should be the "norm," per se.


----------



## Agent Smith1984

Decisions decisions....

Maybe my 2x 290s should've just stayed with daddy for a little longer....


----------



## magicc8ball

I really want to know what the Nano has in store for us. One or two of those could be a good stopgap between my 7970 and the HBM2 cards.


----------



## jerrolds

Goddammit, someone link me some DisplayPort -> dual-link DVI active adapters that can support 1440p@120Hz

Fully aboard the hype train.


----------



## zeppoli

Quote:


> Originally Posted by *jerrolds*
> 
> Goddammit, someone link me some DisplayPort -> dual-link DVI active adapters that can support 1440p@120Hz
> 
> Fully aboard the hype train.


I came here to ask this, but I don't need 120Hz; I'm OK with 60Hz. My Korean monitor won't OC; well, I haven't really tried, so I don't care


----------



## jerrolds

Quote:


> Originally Posted by *zeppoli*
> 
> I came here to ask this, but I don't need 120Hz; I'm OK with 60Hz. My Korean monitor won't OC; well, I haven't really tried, so I don't care


Oh, that one's easy. I've had http://www.amazon.com/Dell-BIZLINK-DisplayPort-Adapter-Powered/dp/B003XYBA72 for a couple of years now; it works at 1440p up to around 75Hz


----------



## magicc8ball

Quote:


> Originally Posted by *jerrolds*
> 
> Goddammit, someone link me some DisplayPort -> dual-link DVI active adapters that can support 1440p@120Hz
> 
> Fully aboard the hype train.


So Jerrolds, you might be out of luck: I think you can only get a 120Hz refresh through an adapter at 1080p, not 1440p.
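
For anyone curious why the adapter tops out, the limiting factor is the pixel clock the link has to carry. A rough sanity check (the ~330 MHz dual-link DVI ceiling is the spec limit; the ~12% blanking overhead is a simplifying assumption, not an exact CVT timing):

```python
# Rough pixel-clock check against the dual-link DVI limit (~330 MHz,
# i.e. 2 x 165 MHz TMDS links). Blanking overhead is approximated at
# ~12%; these are illustrative estimates, not exact CVT-RB timings.
DL_DVI_MAX_MHZ = 330

def pixel_clock_mhz(h, v, hz, overhead=1.12):
    """Active pixels * refresh rate, padded ~12% for blanking intervals."""
    return h * v * hz * overhead / 1e6

for (h, v, hz) in [(2560, 1440, 120), (2560, 1440, 60), (1920, 1080, 120)]:
    clk = pixel_clock_mhz(h, v, hz)
    verdict = "fits" if clk <= DL_DVI_MAX_MHZ else "exceeds DL-DVI"
    print(f"{h}x{v}@{hz}Hz ~ {clk:.0f} MHz -> {verdict}")
```

By this estimate 1440p@120Hz lands near 500 MHz, way past the dual-link limit, while 1440p@60Hz and 1080p@120Hz both fit, which matches what the adapters actually deliver.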


----------



## tsm106

Furry drivers for you. 15.20 coming out soon too.

The AMD Catalyst™ Software Suite contains the following:

AMD Catalyst™ Display Driver version 15.15

Supported Products

AMD Radeon™ R7 300 Series
AMD Radeon™ R9 300 Series
AMD Radeon™ R9 Fury X

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> Furry drivers for you. 15.20 coming out soon too.
> 
> The AMD Catalyst™ Software Suite contains the following:
> 
> AMD Catalyst™ Display Driver version 15.15
> 
> Supported Products
> 
> AMD Radeon™ R7 300 Series
> AMD Radeon™ R9 300 Series
> AMD Radeon™ R9 Fury X
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


Thanks! I will update the OP


----------



## tsm106

Ooh saw this on ocuk.


----------



## hyp36rmax

Looks like AMD designed their own AIO? No more Asetek? I'm curious how we're going to dissect this for custom GPU blocks.


----------



## tsm106

Quote:


> Originally Posted by *hyp36rmax*
> 
> Looks like AMD designed their own AIO? No more Asetek? I'm curious how we're going to dissect this for custom GPU blocks.


They already have full cold plates, like in the Quantum PC. Those blocks look just like an EK block, btw. I don't think this has any bearing on custom blocks.


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> They already have full cold plates, like in the Quantum PC. Those blocks look just like an EK block, btw. I don't think this has any bearing on custom blocks.


I noticed that also.


----------



## tsm106

Look at those cold plates on the bench table.


----------



## tsm106

lol so silly, "AMD Reveals Dual Fiji Board, World's Fastest Graphics Card - 17 TERAFLOPS Small Form Factor Behemoth"

It's alive!

http://wccftech.com/amd-dual-fiji-fury-graphics-card/


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> lol so silly, "AMD Reveals Dual Fiji Board, World's Fastest Graphics Card - 17 TERAFLOPS Small Form Factor Behemoth"
> 
> It's alive!
> 
> http://wccftech.com/amd-dual-fiji-fury-graphics-card/


Now that is the small form factor beast I was waiting for


----------



## bobbavet

Quote:


> Originally Posted by *tsm106*
> 
> lol so silly, "AMD Reveals Dual Fiji Board, World's Fastest Graphics Card - 17 TERAFLOPS Small Form Factor Behemoth"
> 
> It's alive!
> 
> http://wccftech.com/amd-dual-fiji-fury-graphics-card/


UBER BOOM!


----------



## DividebyZERO

Quote:


> Originally Posted by *Kuivamaa*
> 
> It does show the limitation of the design, if true. People that push more than 4k might will most likely face issues.But I play at 1440p so a single Fury X seems amazing deal. It nearly has the horsepower of a pair of 290X without the implications. I hope it will be available in Europe, soon.


Yeah, it's not looking good for me personally. Hopefully when reviews do come out they'll test these variables, as I'm only looking beyond 4K. I still guess that if the chart is right, VRAM size matters for sure. Looking at the 8K test, the 390X murders the Fury.

5K doesn't look much better, with the 980 Ti/TX being faster


----------



## magicc8ball

This is only about aesthetics, but I wish all the 8/6-pin power connectors would come out of the back of the PCB like on the dual-Fiji board...


----------



## bobbavet

I knew AMD wouldn't turn up without a "gun".
Now for some Star Wars Battlefront game play fully spec'd out!
AMD have just won my tax return $$$$$$
Well played AMD.


----------



## HiTechPixel

So the Fury X can't handle 5K. Guess I'm going with the Titan X after all.


----------



## szeged

Fury seemed to do fine at 5K in Sniper Elite 3.


----------



## hyp36rmax

Quote:


> Originally Posted by *HiTechPixel*
> 
> So the Fury X can't handle 5K. Guess I'm going with the Titan X after all.


Where is everyone getting the information that the Fury X does not perform at 5K? Does it really matter for most people this GPU generation? The majority of people on here have 1080p or 1440p monitors, with a handful at 4K. By the time 5K and 4K Eyefinity are even a thing, we'll probably be closer to the R9 500 series.


----------



## DividebyZERO

Quote:


> Originally Posted by *szeged*
> 
> Fury seemed to do fine at 5K in Sniper Elite 3.


Yeah, until we get official benchmarks, who knows. I guess we don't know when that will be either, except possibly around July 25th?
Quote:


> Originally Posted by *hyp36rmax*
> 
> Where is everyone getting the information that the Fury X does not perform at 5K? Does it really matter for most people this GPU generation? The majority of people on here have 1080p or 1440p monitors, with a handful at 4K. By the time 5K and 4K Eyefinity are even a thing, we'll probably be closer to the R9 500 series.


The chart linked earlier in this thread
Quote:


> Originally Posted by *flopper*
> 
> early but


----------



## HiTechPixel

Quote:


> Originally Posted by *szeged*
> 
> Fury seemed to do fine at 5K in Sniper Elite 3.


Well, yeah. I'm slightly confused. They had two displays hooked up to two separate computers, each with a Fury X in it. Both displays were the Dell UP2715K, a 5K monitor. One ran Tomb Raider and the other ran Sniper Elite. Lisa Su said that Tomb Raider ran admirably well on the Fury X. Then again, we don't know what settings they used, but I imagine she was talking about maximum settings.

Then there's the "leaked" benchmark. It could be fake. It could be numbers they pulled from their asses. And it could be true. It could also be driver-related: AMD simply didn't optimize the Fury X for Fire Strike at 5K and higher.

AH! I DON'T KNOW WHAT'S TRUE!
Quote:


> Originally Posted by *hyp36rmax*
> 
> Where is everyone getting the information that the Fury X does not perform at 5K? Does it really matter for most people this GPU generation? The majority of people on here have 1080p or 1440p monitors, with a handful at 4K. By the time 5K and 4K Eyefinity are even a thing, we'll probably be closer to the R9 500 series.


It matters to me because I have a 5K display hooked up to a cheap temporary GPU.


----------



## Redwoodz

Quote:


> Originally Posted by *HiTechPixel*
> 
> So the Fury X can't handle 5K. Guess I'm going with the Titan X after all.


Funny how the 390X blows away the 980Ti in that chart @8k.


----------



## hyp36rmax

Quote:


> Originally Posted by *HiTechPixel*
> 
> Well, yeah. I'm slightly confused. They had two displays hooked up to two separate computers, each with a Fury X in it. Both displays were the Dell UP2715K, a 5K monitor. One ran Tomb Raider and the other ran Sniper Elite. Lisa Su said that Tomb Raider ran admirably well on the Fury X. Then again, we don't know what settings they used, but I imagine she was talking about maximum settings.
> 
> Then there's the "leaked" benchmark. It could be fake. It could be numbers they pulled from their asses. And it could be true. It could also be driver-related: AMD simply didn't optimize the Fury X for Fire Strike at 5K and higher.
> 
> AH! I DON'T KNOW WHAT'S TRUE!
> It matters to me because I have a 5K display hooked up to a cheap temporary GPU.


That's cool! Which one? In your case, I suggest waiting for actual benchmarks from reputable sources to help with your decision.


----------



## Forceman

Quote:


> Originally Posted by *Redwoodz*
> 
> Funny how the 390X blows away the 980Ti in that chart @8k.


And the Fury, and the Fury X. Welcome to the VRAM limit.


----------



## tsm106

IIRC, Tomb Raider was run at 4K ultra settings and Sniper at 5K ultra. It hit 60 FPS average in TR and 45 FPS in Sniper.


----------



## HiTechPixel

Quote:


> Originally Posted by *hyp36rmax*
> 
> That's cool! Which one? In your case, I suggest waiting for actual benchmarks from reputable sources to help with your decision.


Which GPU? I think it was a GTX 960 from whichever brand was the cheapest. I basically just needed 2x DisplayPorts.

Yeah, I'm waiting and waiting for reviews and benchmarks from good sources. I'm also interested in knowing the DX12 featureset of the Fury X. If it fully supports it and all that.


----------



## hyp36rmax

Quote:


> Originally Posted by *HiTechPixel*
> 
> Which GPU? I think it was a GTX 960 from whichever brand was the cheapest. I basically just needed 2x DisplayPorts.
> 
> Yeah, I'm waiting and waiting for reviews and benchmarks from good sources. I'm also interested in knowing the DX12 featureset of the Fury X. If it fully supports it and all that.


Sorry I meant which monitor... The Dell or Apple?


----------



## HiTechPixel

Quote:


> Originally Posted by *hyp36rmax*
> 
> Sorry I meant which monitor... The Dell or Apple?


Ah, sorry, sorry! It's the Dell monitor. I bought it for work-related purposes but ended up taking it home to use as my main monitor because the quality was incredible. It's almost at the level of industrial NEC displays. The glossy panel helps a lot!


----------



## jdstock76

Quote:


> Originally Posted by *Redwoodz*
> 
> Funny how the 390X blows away the 980Ti in that chart @8k.


What's funny about it? I don't get your claim. Especially since he was talking about the TX.


----------



## magicc8ball

Quote:


> Originally Posted by *DividebyZERO*
> 
> Yeah, until we get official benchmarks who knows. I guess we don't know when that will be either except possibly around july 25th?
> The chart linked earlier in this thread


It will be way before the 25th; the Fury X is released next week, and since they've already announced it, I bet they're shipping them out to reviewers now. We won't see the Fury or the Nano by next week, though: the Fury isn't launched until July 14th, and there's no date for the Nano yet...


----------



## Casey Ryback

Dat Nano.........









2X performance per watt compared to 290X!!!

How many processors in the nano card?


----------



## diedo

so ... I'm buying the FuryX after reading proper reviews about it ....

i'm selling my kidney ...


----------



## hamzta09

Quote:


> Originally Posted by *diedo*
> 
> so ... I'm buying the FuryX after reading proper reviews about it ....
> 
> i'm selling my kidney ...


There are reviews?


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> Dat Nano.........
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2X performance per watt compared to 290X!!!
> 
> How many processors in the nano card?


Did they say it was 2x the performance of the 290X? If so, I didn't hear that....

Nothing has been released about the Nano, so all we have to go on is rumors once again...


----------



## Casey Ryback

Quote:


> Originally Posted by *diedo*
> 
> so ... I'm buying the FuryX after reading proper reviews about it ....
> 
> i'm selling my kidney ...


I don't even need a review to know I'm buying one.

1440p + Fury =


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> Did they say it was 2x the performance of the 290X? If so, I didn't hear that....
> 
> Nothing has been released about the Nano, so all we have to go on is rumors once again...


They stated that Fury was 1.5x performance PER WATT compared to the 290X.

Then Lisa Su, very excited, said she loved the Nano card and that it had 2x the performance PER WATT of the 290X.

It's also a 6" card with one fan on it, amazing for small form factor PCs.

Literally amazing work by AMD. They've had a tough few years, and to take a risk and innovate like this is awesome.

Did you watch E3 and just miss the part about the Nano card, or are you just trolling right now?


----------



## boredmug

It's exciting and nice to see. AMD is here in Austin, so in a way they're a hometown team for me. Most of my cards have been AMD. I think I'm going to wait for the next iteration, when it's been refined a little bit (more memory), and jump on the dual-GPU offering. Or... I'll probably just take you uber technophiles' used cast-offs when that generation releases.


----------



## hamzta09

24 min to go. AMD is there, so they may show something new, perhaps gameplay on one of their GPUs


----------



## xer0h0ur

Quote:


> Originally Posted by *velocityx*
> 
> At least the drivers have less overhead. Who needs Mantle when multithreaded DX11 on the Nvidia side matches Mantle?
> 
> just my 2c.


Apparently you haven't been paying attention to the development of the Windows 10 drivers. Your complaint is going to go right down the tubes as soon as they are official releases. DX CPU overhead has improved a lot, as the W10 driver has been their proving ground for improving it. Much like Nvidia didn't feel the need to jump on Mantle since they were able to optimize DX11 CPU overhead, AMD didn't feel the need to improve the CPU overhead in their DX11 drivers because of Mantle. Now that Mantle is taking on a different role, they have shifted to CPU overhead improvement. Everyone using Asder's modified W10 drivers under Win7/8.1 has noticed the gradual improvement.


----------



## Agent Smith1984

I'm sadly eyeballing all these GTX 980 owners dumping their cards for $400....
They're either going Ti or Fury... I may just gain from their loss if I see some $350 ones pop up


----------



## magicc8ball

Quote:


> Originally Posted by *boredmug*
> 
> It's exciting and nice to see. AMD is here in Austin, so in a way they're a hometown team for me. Most of my cards have been AMD. I think I'm going to wait for the next iteration, when it's been refined a little bit (more memory), and jump on the dual-GPU offering. Or... I'll probably just take you uber technophiles' used cast-offs when that generation releases.


Oh no, I watched it all and remember her talking about 1.5 and 2 times more performance per watt, but I did not hear her say the 290X.


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> Oh no, I watched it all and remember her talking about 1.5 and 2 times more performance per watt, but I did not hear her say the 290X.


Hmm what card was she talking about then? I'm sure she said the 290X.


----------



## rv8000

RIP my 290X Lightning, and here's to saying I wouldn't upgrade again


----------



## PontiacGTX

Quote:


> Originally Posted by *Casey Ryback*
> 
> Hmm what card was she talking about then? I'm sure she said the 290X.


295x2?


----------



## xer0h0ur

Quote:


> Originally Posted by *brucethemoose*
> 
> They used an AP29 on the stage.


Actually, just to clear this up: AMD intentionally used different fans on each batch of Fury X water-cooled cards that were sent out to reviewers and the like. They did this on purpose so they could readily identify who leaked information/pics. I don't think anyone knows with any certainty what the final product will actually carry, although if the card being shown off has the Nidec fan, then I would lean towards that being the fan.


----------



## Casey Ryback

Quote:


> Originally Posted by *PontiacGTX*
> 
> 295x2?


maybe









I'm positive she said 2X performance per watt vs 290X. The Nano is going to be an amazing card.

In general fury should OC well (one can hope)


----------



## tsm106

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> 295x2?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm positive she said 2X performance per watt vs 290X. The Nano is going to be an amazing card.
> 
> In general fury should OC well (one can hope)
Click to expand...

Yea the nano is 2x perf/watt and the fury x is 1.5x perf/watt.


----------



## magicc8ball

Quote:


> Originally Posted by *Casey Ryback*
> 
> maybe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm positive she said 2X performance per watt vs 290X. The Nano is going to be an amazing card.
> 
> In general fury should OC well (one can hope)


That makes no sense tho: why would they use only a single fan on that card if it is seriously going to have 2x the performance of the 290X? I'm going to rewatch it lol


----------



## Forceman

Quote:


> Originally Posted by *magicc8ball*
> 
> That makes no sense tho: why would they use only a single fan on that card if it is seriously going to have 2x the performance of the 290X? I'm going to rewatch it lol


It's performance *per watt*. Not performance. The 750Ti probably has 2x the perf/watt of a 290X also.


----------



## tsm106

Quote:


> Originally Posted by *magicc8ball*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Casey Ryback*
> 
> maybe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm positive she said 2X performance per watt vs 290X. The Nano is going to be an amazing card.
> 
> In general fury should OC well (one can hope)
> 
> 
> 
> That makes no sense tho: why would they use only a single fan on that card if it is seriously going to have 2x the performance of the 290X? I'm going to rewatch it lol
Click to expand...

Either it is the same perf as a 290X at half the watts, or it is twice the perf at the same watts. Also, if it is the same perf at half the watts, that doesn't automatically mean it will scale to twice the performance at the same watts as a 290X. And by that distinction, if it is 2x perf/watt, that does not mean it equals twice the performance of a 290X unless they specifically stated that, which they did not.
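The distinction can be sketched in a few lines of arithmetic. The 290X figures below are assumed round placeholders for illustration, not official numbers:

```python
# Illustrative only: what "2x perf/watt vs the 290X" does and does not imply.
# The 290X numbers here are assumed placeholders, not official figures.

R9_290X_PERF = 1.0     # normalized gaming performance
R9_290X_WATTS = 290.0  # assumed board power, for illustration

perf_per_watt_290x = R9_290X_PERF / R9_290X_WATTS
perf_per_watt_nano = 2.0 * perf_per_watt_290x  # the claimed 2x ratio

# Reading 1: same performance at half the power.
nano_watts_same_perf = R9_290X_PERF / perf_per_watt_nano  # 145.0 W

# Reading 2: twice the performance at the same power -- only true if the
# design scales linearly all the way up to 290 W, which the claim never says.
nano_perf_same_watts = perf_per_watt_nano * R9_290X_WATTS  # 2.0x

print(nano_watts_same_perf, nano_perf_same_watts)
```

Both readings satisfy "2x perf/watt", which is why the ratio alone says nothing about absolute frame rates.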


----------



## xer0h0ur

Higher performance per watt doesn't translate into higher gaming performance. There is no substitute for more stream processors / cuda cores, ROPs, TMUs etc etc.


----------



## Casey Ryback

Quote:


> Originally Posted by *magicc8ball*
> 
> That makes no sense tho: why would they use only a single fan on that card if it is seriously going to have 2x the performance of the 290X? I'm going to rewatch it lol


I think you're confusing 2x performance of the 290X, with 2x the performance per watt.

Whether or not the Nano is actually double the performance, it's going to use half the power of that architecture.

HBM and this power factor is allowing the card to be around half the size, and a lot cooler.

Hence the 6" card with a single fan. The Nano is going to be the go to card for small form factor builds on air.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> Higher performance per watt doesn't translate into higher gaming performance. There is no substitute for more stream processors / cuda cores, ROPs, TMUs etc etc.


Fury also has 4096 processors (vs 2816). So we can assume around 30% faster, with much lower power draw.

Whether HBM increases the overall performance too, we will know when benchmarks arrive.
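For the curious, the shader-count arithmetic behind that kind of ballpark looks like this. The scaling-efficiency factor is a made-up assumption for illustration, not an AMD figure:

```python
# Rough arithmetic behind a "~30% faster" ballpark from shader counts alone.
fury_shaders = 4096
r9_290x_shaders = 2816

shader_ratio = fury_shaders / r9_290x_shaders  # ~1.45, i.e. ~45% more shaders

# Extra shaders rarely translate 1:1 into frame rate; the efficiency factor
# below is an assumed placeholder, not a measured number.
scaling_efficiency = 0.66
estimated_uplift = (shader_ratio - 1.0) * scaling_efficiency  # ~0.30

print(f"{shader_ratio:.2f}x shaders -> roughly {estimated_uplift:.0%} faster")
```

With perfect scaling the same ratio would suggest ~45%, which matches the higher estimates floating around the thread.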


----------



## Casey Ryback

Quote:


> Originally Posted by *tsm106*
> 
> Either it is the same perf as a 290X at half the watts, or it is twice the perf at the same watts. Also, if it is the same perf at half the watts, that doesn't automatically mean it will scale to twice the performance at the same watts as a 290X. And by that distinction, if it is 2x perf/watt, that does not mean it equals twice the performance of a 290X unless they specifically stated that, which they did not.


Exactly, it won't necessarily have double the power at the same watts, but it should have dramatically less power draw.


----------



## xer0h0ur

Quote:


> Originally Posted by *Casey Ryback*
> 
> Fury also has 4096 processors (vs 2816). So we can assume around 30% faster, with much lower power draw.
> 
> Whether HBM increases the overall performance too, we will know when benchmarks arrive.


Yeah I know, they also increased the texture mapping units but left the ROPs the same amount as the 290X? WAT? That one threw me for a loop.

If anything, I really want to know how many ACEs they tossed into Fury. Did they stick with 8? Or did they extend their asynchronous compute lead over Maxwell 2?


----------



## Agent Smith1984

Here's to hoping the fury shreds as hard as i did my wife's strat in this video


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Here's to hoping the fury shreds as hard as i did my wife's strat in this video


Nice dude.

Your pup seemed to like it too lol.


----------



## hamzta09

The Nano is the world's fastest GPU (in that size)


----------



## Casey Ryback

Quote:


> Originally Posted by *hamzta09*
> 
> The Nano is the world's fastest GPU (in that size)


Two weeks ago who would've thought that AMD would be the king of SFF builds with an air cooled GPU?

AMD lived up to the hype


----------



## Balsagna

Quote:


> Originally Posted by *hamzta09*
> 
> The Nano is the world's fastest GPU (in that size)


I see what you did there


----------



## gatygun

That 2x Fury card will be 4x faster than what I got now, really interested in it. But the 4GB of VRAM for each GPU is really holding me back still.

Will probably wait for the next series card.


----------



## xer0h0ur

The Nano literally came out of left field. The name had been known already but people thought that was going to be the name for Fury X due to the 19cm length. They completely threw us a curveball.


----------



## Cutomz

Any guess on pricing for the Nano?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cutomz*
> 
> Any guess on pricing for the Nano?


$600 im guessing.....more than fury but less than fury x


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> The Nano literally came out of left field. The name had been known already but people thought that was going to be the name for Fury X due to the 19cm length. They completely threw us a curveball.


I know right?

Really excited to see what it can do


----------



## Agent Smith1984

I guess $450 and is 10% faster than a 290x 4gb card but still sells?


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I guess $450 and is 10% faster than a 290x 4gb card but still sells?


Too much competition against the 390X at that price. I'm guessing $500 and 980-level or a little better performance. But without some kind of performance leak it's hard to speculate.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Too much competition against the 390X at that price. I'm guessing $500 and 980-level or a little better performance. But without some kind of performance leak it's hard to speculate.


It all still hinges on overclocking headroom


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I guess $450 and is 10% faster than a 290x 4gb card but still sells?


It'll be slightly better than that I think.

Probably 15-20% faster with the 3500-ish processors.

Maybe $500?


----------



## xer0h0ur

Well boys, official benchmarking won't hit the net until Fury X's release day. Until then we will be living off speculation and leaks for a week.


----------



## gatygun

Isn't the only information about the Nano the 2x performance per watt compared to the 290X?

This could mean that the nano is a slow 750ti budget type of card.


----------



## Casey Ryback

Quote:


> Originally Posted by *gatygun*
> 
> Isn't the only information about the Nano the 2x performance per watt compared to the 290X?
> 
> This could mean that the nano is a slow 750ti budget type of card.


Hush now









Don't burst the nano bubble!

Honestly though I would expect it to be around 290 series at least.


----------



## Kokin

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well boys, official benchmarking won't hit the net until Fury X's release day. Until then we will be living off speculation and leaks for a week.


A week is an eternity for everyone here. The forums will be wrecked by then!


----------



## Balsagna

I held out on a 980TI to see what the red team brings to the table.

My dual GTX 680's (2gb version) are holding me back on my QNIX at 120hz :C


----------



## DividebyZERO

The Vram horse will be beaten so much it is not funny


----------



## xer0h0ur

Quote:


> Originally Posted by *DividebyZERO*
> 
> The Vram horse will be beaten so much it is not funny


LOL srsly. If that 5K / 8K bench floating around is legit then expect to see tons of talk about it. As if suddenly half the world owns 5K monitors.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL srsly. If that 5K / 8K bench floating around is legit then expect to see tons of talk about it. As if suddenly half the world owns 5K monitors.


That user may have been talking about the person above with 2 x 2GB 680's at 2560 x 1440p?

I'd have to agree vram would be an issue in many titles.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL srsly. If that 5K / 8K bench floating around is legit then expect to see tons of talk about it. As if suddenly half the world owns 5K monitors.


Not that I want to join the parade around that dead horse, however I was very interested in trying 4K VSR in 2x2 Eyefinity, even if only for a learning experience.
If only I could test these things before buying. Kinda hard to sink hundreds or thousands of dollars to find out it doesn't work at all, or barely. I guess I have moved on from overclocking to trying to find other ways to push limits. I think I lost my sanity somewhere.


----------



## MadRabbit

Lol. If the Fury X FS scores are right and it's beating the TX up until 5K, then Nvidia is in trouble atm.


----------



## neurotix

Hmm, so I'm basically shocked that the actual specs of these cards more or less match the rumored specs. I fully expected the cards to be 980 level of performance (perhaps with 96 ROPs), not 980ti level of performance. (In theory, anyway, we need to see the benches. All I really care about is the Anandtech review.)

I said I wouldn't upgrade my 290s until they had cards with twice as many ROPs, so I could get (theoretically) the performance of my Crossfire on a single card... and of course get two of them, so my setup would be twice as powerful.









128 ROPs, 256 TMUs and 4096 shaders (62% more shaders than my 290s). Seems like they really nailed this one.

Again, worried about VRAM like everyone else. I don't believe the color compression hype and all that other stuff they pushed with Tonga. If the 980ti is offering 6GB, so should AMD. AMD has traditionally been the one with more RAM, so it's a departure from the norm. Since I run 5760x1080 Eyefinity, I *really* need that VRAM. I don't even run my games with a ton of AA or post-processing either, but some of the more recent titles have been VRAM limited for my 290s.

Also, it will probably still take months to get the board partner's custom designs. I don't want that hybrid water crap in my system. Basically don't have the room for it, and with two cards it would be a real mess. (And if I'm buying two Fury-X I'm certainly not going to have the money for custom water on both cards. Some people aren't made of money and can only afford the cards, albeit with a real struggle and sacrifice.)

Personally, I'm thinking I'm going to skip this generation and wait for the next. My 290s hold up fine for most of the games I want to play. Now that the new cards will be out, as well as rebranded 390X/390, I will get less money if I sell these, and will probably have to come up with < $600 to afford two Fury-X.

Well, for anyone here who knows me, those are my thoughts.


----------



## xer0h0ur

Quote:


> Originally Posted by *neurotix*
> 
> Hmm, so I'm basically shocked that the actual specs of these cards more or less match the rumored specs. I fully expected the cards to be 980 level of performance (perhaps with 96 ROPs), not 980ti level of performance. (In theory, anyway, we need to see the benches. All I really care about is the Anandtech review.)
> 
> I said I wouldn't upgrade my 290s until they had cards with twice as many ROPs, so I could get (theoretically) the performance of my Crossfire on a single card... and of course get two of them, so my setup would be twice as powerful.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 128 ROPs, 256 TMUs and 4096 shaders (62% more shaders than my 290s). Seems like they really nailed this one.
> 
> Again, worried about VRAM like everyone else. I don't believe the color compression hype and all that other stuff they pushed with Tonga. If the 980ti is offering 6GB, so should AMD. AMD has traditionally been the one with more RAM, so it's a departure from the norm. Since I run 5760x1080 Eyefinity, I *really* need that VRAM. I don't even run my games with a ton of AA or post-processing either, but some of the more recent titles have been VRAM limited for my 290s.
> 
> Also, it will probably still take months to get the board partner's custom designs. I don't want that hybrid water crap in my system. Basically don't have the room for it, and with two cards it would be a real mess. (And if I'm buying two Fury-X I'm certainly not going to have the money for custom water on both cards. Some people aren't made of money and can only afford the cards, albeit with a real struggle and sacrifice.)
> 
> Personally, I'm thinking I'm going to skip this generation and wait for the next. My 290s hold up fine for most of the games I want to play. Now that the new cards will be out, as well as rebranded 390X/390, I will get less money if I sell these, and will probably have to come up with < $600 to afford two Fury-X.
> 
> Well, for anyone here who knows me, those are my thoughts.


Nope, as I just mentioned before, they didn't add a single ROP. It's the same number of ROPs in the 290X as there are in Fury XT.


----------



## zeppoli

Quote:


> Originally Posted by *Casey Ryback*
> 
> Two weeks ago who would've thought that AMD would be the king of SFF builds with an air cooled GPU?
> 
> AMD lived up to the hype


Wait, we can't say that yet. I want to see the numbers, specs and talk are one thing, how it performs is another.

I'm hoping though, was about to go nvidia, but I think I might just stay with AMD


----------



## jerrolds

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm sadly eyeballing all these GTX 980 owners dumping their cards for $400....
> They are either going Ti or Fury... I may just gain from their loss if I see some $350's pop up


I really hope this is the case - I'd pick up a $400 980 Ti, ha. Can't afford a Fury X and a new monitor to go with it atm.


----------



## Casey Ryback

Quote:


> Originally Posted by *zeppoli*
> 
> Wait, we can't say that yet. I want to see the numbers, specs and talk are one thing, how it performs is another.
> 
> I'm hoping though, was about to go nvidia, but I think I might just stay with AMD


True, although I'm making the assumption looking at GDDR5 PCBs vs the new HBM boards.

AMD can now fit a lot more on the smaller PCB, compared with older designs.


----------



## mikegray

What do you guys think:

Will there be any chance of adding a Fury X into a preexisting custom WC loop? It seems like it could, in theory, be easier (if not cheaper) than buying the air cooled Fury and adding an aftermarket water block ...


----------



## Casey Ryback

Quote:


> Originally Posted by *mikegray*
> 
> What do you guys think:
> 
> Will there be any chance of adding a Fury X into a preexisting custom WC loop? It seems like it could, in theory, be easier (if not cheaper) than buying the air cooled Fury and adding an aftermarket water block ...


Well, if the air-cooled Fury is the full Fiji with 4096 processors, then it's surely the go-to card for custom loops.

In other news

There was a farcry 4 slide at E3 showing Fury X performance.

4K Ultra settings* (not sure what the star implies here..........)

Average 54 fps

Min 43 fps

Titan X results from various websites.

Tom's 39.4 ave 33 min.

Anandtech 42.1 ave

This looks like the fury is well ahead............

But I'm not about to call the Fury a titan killer without knowing exactly why that * is there after the words ultra settings.
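Taking the quoted numbers at face value (different sites, settings, and that unexplained asterisk make this a very rough comparison), the implied lead works out as:

```python
# Rough comparison of the Far Cry 4 4K figures quoted in the post above.
# Different review setups make this apples-to-oranges; illustration only.
fury_x_avg = 54.0
titan_x_avgs = {"Tom's": 39.4, "AnandTech": 42.1}

for site, avg in titan_x_avgs.items():
    lead = fury_x_avg / avg - 1.0  # fractional lead on average fps
    print(f"vs {site} Titan X result: Fury X ~{lead:.0%} ahead")  # ~37% and ~28%
```

A ~28-37% spread on averages from mismatched test benches is exactly why the asterisk matters before calling it a Titan killer.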


----------



## szeged

probably ultra settings without AA since most reviewers leave AA off for 4k since 4k + AA = bye bye vram.


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> True, although I'm making the assumption looking at GDDR5 PCBs vs the new HBM boards.
> 
> AMD can now fit a lot more on the smaller PCB, compared with older designs.


I still remember the codename Longcat days when cards like the 5970 were super long. I bought a full tower just so I could house that monstrosity.


----------



## 1337LutZ

Not sure if this is legit, but this just got posted: (benchmarks)

http://iyd.kr/746


----------



## glenn37216

I think the only benchmarks that really matter to me are the Gameworks titles. I mean, it would really be depressing if the Fury benchmarks well in 3DMark but is slower than a 980 in Batman Arkham Knight. I sure hope Crossfire's frame pacing works well with the 300 series too. Anticipation is driving me sleepless. Need - facts - now so I can make up my purchasing mind......


----------



## rdr09

Quote:


> Originally Posted by *glenn37216*
> 
> I think the only benchmarks that really matter to me are the Gameworks titles. I mean, it would really be depressing if the Fury benchmarks well in 3DMark but is slower than a 980 in Batman Arkham Knight. I sure hope Crossfire's frame pacing works well with the 300 series too. Anticipation is driving me sleepless. Need - facts - now so I can make up my purchasing mind......


that's like picking up a rock and hitting yourself with it. gameworks . . . go green. lol


----------



## flopper

Best cards in the world presented and soon to be released.
It's like Christmas here.


----------



## Casey Ryback

Quote:


> Originally Posted by *glenn37216*
> 
> I mean, it would really be depressing if the Fury benchmarks well in 3DMark but is slower than a 980 in Batman Arkham Knight.


Do you mean Batman 'DLC' Knight?

I heard it's getting renamed. Either that or Batman Gouge Knight









http://i.imgur.com/btz1VKI.jpg

I don't support games with vast DLC programs anymore.


----------



## wstanci3

Damn that Nano is a lil' cutie.
If I was making a mini itx, that would be my card.

Can't wait for more Fury (non X) news. Want to know what they've skimmed off other than the AIO for $100 off...


----------



## gatygun

Quote:


> Originally Posted by *zeppoli*
> 
> no thanks. Personally I'm done with crossfire. I assume the 295x2 acts as a crossfire platform correct?
> 
> no more micro stutters for me! I want a top tier card to give me 60+ fps @1440P with high/ultra!


Too bad there is really no option at 1440p or even 1080p to get a solid 60fps on a single GPU without having to turn down settings. This is why Crossfire/SLI will always be a thing.


----------



## HiTechPixel

Quote:


> Originally Posted by *neurotix*
> 
> Also, it will probably still take months to get the board partner's custom designs. I don't want that hybrid water crap in my system. Basically don't have the room for it, and with two cards it would be a real mess. (And if I'm buying two Fury-X I'm certainly not going to have the money for custom water on both cards. Some people aren't made of money and can only afford the cards, albeit with a real struggle and sacrifice.)


There won't be third-party board designs or coolers for the Radeon Fury X. They (AMD and hardware sites) have already confirmed that there will only be a reference water-cooled design.


----------



## ozyo

now we need fury x lightning
what stupid name is that


----------



## white owl

Quote:


> Originally Posted by *ozyo*
> 
> now we need fury x lightning *twin frozer boost 2.1+ dethklok edition*
> what stupid name is that


I'd buy one.


----------



## ssateneth

FINALLY, some naked PCB shots, including a dual GPU version?!


----------



## Plonide

Quote:


> Originally Posted by *HiTechPixel*
> 
> There won't be third-party board designs or coolers for the Radeon Fury X. They (AMD and hardware sites) have already confirmed that there will only be a reference water-cooled design.


I really hope this isn't true; it feels like such wasted potential with the OC headroom available. Sure, you can OC it yourself, but the partners could make some impressive designs with a good OC out of the box.

According to this "leak" http://www.legitreviews.com/sapphire-lists-radeon-390x-390-380-370-and-360-video-cards_166023 Sapphire has plans for custom Fury cards; the listing was taken down shortly after.
Maybe there will only be overclocked versions from third-party vendors based on the reference design. Damn shame, I was hoping for more options regarding ports. Still have a DVI monitor.


----------



## ozyo

Quote:


> Originally Posted by *white owl*
> 
> I'd buy one.


going for 2


----------



## HaunteR

I'll wait for the Fury X double GPU.

That looks like a real contender to the Titan X, though we don't know the price yet?


----------



## Casey Ryback

Quote:


> Originally Posted by *HaunteR*
> 
> I'll wait for the Fury X double GPU.
> 
> That looks like a real contender to the Titan X, though we don't know the price yet?


A contender?

It'll go into a single PCIE slot vs 2 x titan X's in two slots. Not to mention a shorter length.

For a mini-itx build it will be insane. Probably still cooled by a single rad AIO.

Nvidia need something to contend with the Fury X - double GPU. You have it the wrong way round.

AMD will have that market segment for power in small form factors.

Unless 6-12GB of VRAM is required, in which case the advantage obviously goes to the 980 Ti or Titan X SLI.

If AMD can push out the 8GB+ HBM cards sooner than nvidia, then they will clearly dominate small form factor enthusiast machines.


----------



## HaunteR

Quote:


> Originally Posted by *Casey Ryback*
> 
> A contender?
> 
> It'll go into a single PCIE slot vs 2 x titan X's in two slots. Not to mention a shorter length.
> 
> For a mini-itx build it will be insane. Probably still cooled by a single rad AIO.
> 
> Nvidia need something to contend with the Fury X - double GPU. You have it the wrong way round.
> 
> AMD will have that market segment for power in small form factors.
> 
> Unless 6-12GB of VRAM is required, in which case the advantage obviously goes to the 980 Ti or Titan X SLI.
> 
> If AMD can push out the 8GB+ HBM cards sooner than nvidia, then they will clearly dominate small form factor enthusiast machines.


...Yeah I'll wait for the Benchmarks before I jump on that theory.

Gloating at E3 with misleading information is far too common nowadays to build hype; how can they even consider pre-orders before the full specs are benchmarked, with NDAs still in place? I'll reserve judgment until that time comes.


----------



## jdstock76

Quote:


> Originally Posted by *HaunteR*
> 
> ...Yeah I'll wait for the Benchmarks before I jump on that theory.
> 
> Gloating at E3 with misleading information is far too common nowadays to build hype; how can they even consider pre-orders before the full specs are benchmarked, with NDAs still in place? I'll reserve judgment until that time comes.


Wow! People are now saying a single Fury beats 2x TX? LoL!

Edit: I read further up. Still highly unlikely. But I would like to see the Fury kick some butt. Competition is good for everyone.


----------



## HaunteR

Quote:


> Originally Posted by *jdstock76*
> 
> Wow! People are now saying a single Fury beats 2x TX? LoL!
> 
> Edit: I read further up. Still highly unlikely. But I would like to see the Fury kick some butt. Competition is good for everyone.


Don't get me wrong, I reserved buying a 980 Ti because I wanted to see the specs of the 300 series.

Still a bit misguided to pre-order without benchmarks, at least for me. And thinking that a single card can beat 2x Titan Xs is a tall order. We'll see if it's possible.

Probably going to get a Fury card myself, if the Fury 2x IS as good as they claim, maybe that instead.


----------



## Newbie2009

Quote:


> Originally Posted by *HaunteR*
> 
> Don't get me wrong, I reserved buying a 980 Ti because I wanted to see the specs of the 300 series.
> 
> Still a bit misguided to pre-order without benchmarks, at least for me. And thinking that a single card can beat 2x Titan Xs is a tall order. We'll see if it's possible.
> 
> Probably going to get a Fury card myself, if the Fury 2x IS as good as they claim, maybe that instead.


I don't mind them pushing it out for a few weeks more so long as they have done a good job with drivers on launch.


----------



## HaunteR

Quote:


> Originally Posted by *Newbie2009*
> 
> I don't mind them pushing it out for a few weeks more so long as they have done a good job with drivers on launch.


I don't mind the wait either; I just want to see some benchmarks before they go on sale, since they will most likely be sold out.


----------



## SPLWF

So does anyone else see an issue with the AIO cooler in-line with the main VRMs? They are reducing the size of the rubber hose and, from the looks of it, it seems like it will affect water flow. Also, is the rubber hose being converted to copper then back to rubber?


----------



## PontiacGTX

Quote:


> Originally Posted by *Casey Ryback*
> 
> Fury also has 4096 processors (vs 2816). So we can assume around 30% faster, with much lower power draw.
> 
> Whether HBM increases the overall performance too, we will know when benchmarks arrive.


Indeed, it is 45% + 3% + frequency scaling + driver improvements + whatever performance increase HBM brings.
Quote:


> Originally Posted by *gatygun*
> 
> That 2x Fury card will be 4x faster than what I got now, really interested in it. But the 4GB of VRAM for each GPU is really holding me back still.
> 
> Will probably wait for the next series card.


If you are not using something above a 4K pixel count in a console port, 4GB is fine
Quote:


> Originally Posted by *Casey Ryback*
> 
> It'll be slightly better than that I think.
> 
> Probably 15-20% faster with the 3500-ish processors.
> 
> Maybe $500?


27% or above

Quote:


> Originally Posted by *neurotix*
> 
> Hmm, so I'm basically shocked that the actual specs of these cards more or less match the rumored specs. I fully expected the cards to be 980 level of performance.


Why would AMD waste their time with a 770/670 equivalent?








Quote:


> Originally Posted by *szeged*
> 
> probably ultra settings without AA since most reviewers leave AA off for 4k since 4k + AA = bye bye vram.


You don't need more than SMAA or FXAA for 4K, or even any AA on a small screen.


----------



## p4inkill3r

Quote:


> Originally Posted by *SPLWF*
> 
> So does anyone else see an issue with the AIO cooler in-line with the main VRMs? They are reducing the size of the rubber hose and, from the looks of it, it seems like it will affect water flow. Also, is the rubber hose being converted to copper then back to rubber?


----------



## Casey Ryback

Quote:


> Originally Posted by *HaunteR*
> 
> ...Yeah I'll wait for the Benchmarks before I jump on that theory.
> 
> Gloating at E3 with misleading information is far too common nowadays to build hype; how can they even consider pre-orders before the full specs are benchmarked, with NDAs still in place? I'll reserve judgment until that time comes.


I'm not sure what theory you mean?

Even if a single Fiji chip is slower than the Titan X, the dual Fiji will be the go-to card for mini cube cases etc. that can't fit large dual-card configurations. People love SFF builds these days: less space used, and portability.

That's the only point I was making; I wasn't saying a single Fury chip is going to beat the Titan. My post was a little confusing.

Example - the Project Quantum


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> There won't be third-party board designs or coolers for the Radeon Fury X. They (AMD and hardware sites) have already confirmed that there will only be a reference water-cooled design.


You mind sourcing that since it goes directly against what I have read?


----------



## SPLWF

Quote:


> Originally Posted by *p4inkill3r*


Thanks for that, didn't see that picture before. All my answers, answered in one picture, lol.


----------



## zeppoli

Does anyone think the card will come with a DVI to DisplayPort adapter? It would be smart IMO; I have to factor that in when deciding if I want the $650 Fury X or the $650 980 Ti.

I have a Korean monitor like many; shoot, most people probably don't even have monitors with DisplayPort, so maybe an adapter will be included?

If not, there has to be one cheaper than 70 dollars. I don't need it to do 240Hz or anything crazy, just 1440p at 60 or 75Hz at least


----------



## xer0h0ur

I can squash any rumors or hopes of HBM2 Fury/X. HBM2 won't be available until Q2 2016.


----------



## zeppoli

Quote:


> Originally Posted by *gatygun*
> 
> Too bad there is really no option at 1440p or even 1080p to get a solid 60fps on a single GPU without having to turn down settings. This is why Crossfire/SLI will always be a thing.


WAT!! lol, when I put my 290 on a single GPU it almost gets a 60FPS average in GTA V with settings on high/very high.
With BF4, I'm easily getting a 60FPS average on ultra.

A 980 Ti will run 1440p at 60fps at least with those two games on ultra.
Plus I'd rather see dips below 40FPS than take the horrible micro stutters I get with CF. Or worse, games that don't even support CF/SLI:
GTA V, DCS, FSX, IL-2, and so on and so on.


----------



## hamzta09

Quote:


> Originally Posted by *zeppoli*
> 
> WAT!! lol, when I put my 290 on a single GPU it almost gets a 60FPS average in GTA V with settings on high/very high.
> With BF4, I'm easily getting a 60FPS average on ultra.
> 
> A 980 Ti will run 1440p at 60fps at least with those two games on ultra.
> Plus I'd rather see dips below 40FPS than take the horrible micro stutters I get with CF. Or worse, games that don't even support CF/SLI:
> GTA V, DCS, FSX, IL-2, and so on and so on.


Average.. who wants a 60fps average? You want a 60 minimum.

GTA supports SLI/Xfire.


----------



## flopper

Quote:


> Originally Posted by *SPLWF*
> 
> So does anyone else see an issue with the AIO cooler in-line with the main VRMs? They are reducing the size of the rubber hose and, from the looks of it, it seems like it will affect water flow. Also, is the rubber hose being converted to copper then back to rubber?


The copper cools the VRMs also.


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> Average.. who wants a 60fps average? You want a 60 minimum.
> 
> GTA supports SLI/Xfire.


you'll figure out zep sooner or later. lol


----------



## Creator

That cooling setup looks so ghetto... Good thing ghetto mods usually work well.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can squash any rumors or hopes of HBM2 Fury/X. HBM2 won't be available until Q2 2016.


The 4GB limitation is probably not going to matter for high res.

From what AMD has been saying, 4GB of HBM works differently from 4GB of GDDR5.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> You mind sourcing that since it goes directly against what I have read?


You can google it yourself easily. But here's one from Sweclockers: http://www.sweclockers.com/nyhet/20684-radeon-fury-x-endast-med-vattenkylning-nedskalade-fury-anmals-saknad

*"Enligt källor till SweClockers är Radeon Fury X endast tänkt att släppas med sluten vattenkylning och ingen luftkyld referensdesign finns i planerna. Samtidigt understryks att partnertillverkare inte får göra några ingrepp på kortet, utan att det måste följa AMD:s specifikationer till punkt och pricka."*

*(Translation: "According to SweClockers' sources, the Radeon Fury X is only intended to be released with closed-loop water cooling, and no air-cooled reference design is in the plans. It is also emphasized that partner manufacturers may not make any modifications to the card; it must follow AMD's specifications to the letter.")*


----------



## hyp36rmax

*+Specifications added to OP*


----------



## gatygun

Quote:


> Originally Posted by *zeppoli*
> 
> WAT!! lol , when I put my 290 on single GPU it almost gets 60FPS average in GTA V, with settings high/very high.
> with BF4, I'm easily getting 60FPS average on ultra
> 
> a 980 ti, will run 1440P 60fps at least with those two games listed on ultra.
> Plus I rather see dips below 40FPS, than take the horrible micro stutters I get with CF. Or worse, games that don't even support CF/SLI
> GTA V, DCS, FSX, IL-2 , and so on and so on.


BF4 got released in October 2013; the game is practically 2 years old by now. It's not really a new game anymore by any means.

GTA 5 drops into the 40s constantly with a single Titan X at 1080p; even with 2x 980 SLI it will drop down to the low 50s at times, and that's without MSAA.
2x 290X will drop below 60.

A 980 Ti is slower than all those setups, so it won't hold 60 FPS on ultra at the lows (the lows are what matter, as they're what makes the game feel laggy at times).

The same goes for Witcher 3: not a single GPU can push it to 60 on ultra. The Fury X won't change this.

And these games are already released, let alone games in the future.

If you want to play games at 1080p/1440p on ultra at 60 FPS, you need two GPUs at minimum.

People in many forums keep talking about how these cards are overkill for 1080p, but in reality there is no single GPU that pushes a stable 60 FPS at 1080p, let alone higher resolutions with maxed settings.

This is why you are forced to SLI/crossfire cards to make that happen, and even then the question remains how long you can push the settings to the limit in newer games; mostly not long. There always have to be tradeoffs even with that.
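The average-versus-lows point is easy to see with a quick frame-time calculation (a minimal sketch; the frame times below are made up for illustration, not real benchmark data):

```python
# Minimal sketch: why a healthy FPS average can still feel laggy.
# Hypothetical frame times in milliseconds for ten frames (made-up numbers).
frame_times_ms = [12, 13, 12, 14, 13, 40, 12, 13, 38, 13]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)  # worst single frame

print(f"average: {avg_fps:.0f} FPS, minimum: {min_fps:.0f} FPS")
```

Here the average works out to roughly 56 FPS while the worst frame dips to 25 FPS, which is exactly the kind of stutter an average hides.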


----------



## xer0h0ur

Quote:


> Originally Posted by *Creator*
> 
> That cooling setup looks so ghetto... Good thing ghetto mods usually work well.


Frankly, all that matters is that the important things get cooled, and those are the VRMs, the GPU and the HBM. All of it is covered by this AIO cooler, so it looks fine to me. Hell, it's better than the crap Asetek made for the 295X2.
Quote:


> Originally Posted by *HiTechPixel*
> 
> You can google it yourself easily. But here's one from Sweclockers: http://www.sweclockers.com/nyhet/20684-radeon-fury-x-endast-med-vattenkylning-nedskalade-fury-anmals-saknad
> 
> *"Enligt källor till SweClockers är Radeon Fury X endast tänkt att släppas med sluten vattenkylning och ingen luftkyld referensdesign finns i planerna. Samtidigt understryks att partnertillverkare inte får göra några ingrepp på kortet, utan att det måste följa AMD:s specifikationer till punkt och pricka."*


Well, that one is useless to me. I don't speak/read whatever language that is. What I am most interested in, though, is this claim you made that AMD said it themselves. I have not seen this anywhere but from you. The reason this particularly doesn't make sense to me is that information came out that AMD ditched the blower-style cooler for a triple-fan cooler on the Fury/X.


----------



## Casey Ryback

Quote:


> Originally Posted by *gatygun*
> 
> People in many forums keep talking about how these cards are overkill for 1080p, but in the reality there is no single gpu that pushes a stable 60 fps on 1080p let alone hgiher resolutions with maxed settings.


That's because people want full AA and they're willing to fork out another $500 just to have it.

Many cards can run ultra textures and detail with AA off or reduced, and a couple of other minor tweaks.

The result is 99% of the eye candy without the crippled frame rates.


----------



## zeppoli

Quote:


> Originally Posted by *hamzta09*
> 
> Average.. who wants a 60fps average? You want a 60 minimum.
> 
> GTA support SLI/Xfire.


Exactly, this is why I want a better card than my single 290. He was saying you can't get 60 FPS at 1080p, which is just insane to say. I was getting 100+ FPS at 1080p ultra in BF4.
Regardless, I will see above 60 FPS at 1440p ultra on a single card.
If the Fury X can't do it, well, I know the 980 Ti can.


----------



## gatygun

Quote:


> Originally Posted by *Casey Ryback*
> 
> That's because people want full AA and they're willing to fork out another $500 just to have it.
> 
> Many cards can have ultra textures and detail with AA off or reduced, and a couple of other minor tweaks.
> 
> The result is 99% of the eye candy without the crippled frames.


Witcher 3 hardly pushes AA, and those examples are without AA, just FXAA, on GTA 5. At 1080p you would want at least 4x MSAA to reduce the jaggies though, so more performance impact right there.

BF4 is a
Quote:


> Originally Posted by *zeppoli*
> 
> Exactly, this is why I want a better card than my single 290.. He was saying you can't get 60fps with 1080 res. which is just insane to say. I was getting 100+FPS with 1080 ultra on BF4.
> regardless, I will see above 60FPS 1440P ultra on a single card.
> If the fury X can't do it, well I know the 980ti can.


The game is 2 years old; obviously I'm talking about current games, not older games.


----------



## Agent Smith1984

I think I may just order one of these, and get another in a month or so....
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=r9_390x-_-14-127-872-_-Product

That gives me 4GB additional VRAM over my 290 CF setup and 512 additional shaders, and it also appears to be a better clocker based on the stock clocks...

That 1100MHz default core and 1525MHz default VRAM is a nice out-of-the-box speed for a Hawaii.... I wonder if they improved the process for some more OC headroom, or will they still top out at 1175-1200 on most...


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Frankly all that matters is that the important things get cooled off and that is the VRMs, the GPU and the HBM. All of it is being covered by this AIO cooler so looks fine to me. Hell, its better than the crap Asetek made for the 295X2.
> Well that one is useless to me. I don't speak/read whatever language that is. What I am most interested in though is this claim you made that AMD said it themselves. I have not seen this anywhere but from you. The reason why this particularly doesn't make sense to me is because information came out that AMD ditched the blower style cooler for a triple fan cooler on Fury/X.


You're getting the Fury and Fury X mixed up. The Fury X will be AMD's answer to the Titan X. It will only come in a reference design with a reference cooler. The normal Fury will have third-party designs.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can squash any rumors or hopes of HBM2 Fury/X. HBM2 won't be available until Q2 2016.


Q1 seems more likely.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> You're getting the Fury and Fury X mixed up. The Fury X will be AMDs answer to the Titan X. It will only come in a reference design with a reference cooler. The normal Fury will have third-party designs.


No, I am not getting anything mixed up here. I am fully aware of what the three Fury-based cards are and their direct competition. Again: show me something where AMD is directly quoted as saying there won't be an air-cooled Fury X....


----------



## flopper

Quote:


> Originally Posted by *gatygun*
> 
> Witcher 3 hardly pushes AA and those examples are without AA just fxaa on gta 5, on 1080p you would want to have atleast 4x msaa to reduce the jaggies tho, so more performance impact right there.
> 
> BF4 is a
> Game is 2 years old, obviously i'm talking about current games and not older games.


Yeah, 1080p is still demanding, even in older games and DX11.
It's like the 1100-euro Titan X can't do it, so one has to ask who buys it then.

I'd rather mix settings to get the FPS I want, and the Fury seems to allow a higher min FPS, which is what's important to me.


----------



## jerrolds

Quote:


> Originally Posted by *xer0h0ur*
> 
> Frankly all that matters is that the important things get cooled off and that is the VRMs, the GPU and the HBM. All of it is being covered by this AIO cooler so looks fine to me. Hell, its better than the crap Asetek made for the 295X2.
> Well that one is useless to me. I don't speak/read whatever language that is. What I am most interested in though is this claim you made that AMD said it themselves. I have not seen this anywhere but from you. The reason why this particularly doesn't make sense to me is because information came out that AMD ditched the blower style cooler for a triple fan cooler on Fury/X.


Google translate? "According to SweClockers' sources, the Radeon Fury X is only meant to be released with closed water cooling, and no air-cooled reference design is in the plans. At the same time it is emphasized that partner manufacturers may not make any modifications to the card; it must follow AMD's specifications to the letter."

Looks like 3rd parties aren't allowed to make modifications to the card - I think us QNIX owners are hosed if we want [email protected]. 75Hz is possible tho with DL active adapters


----------



## xer0h0ur

Some random source is not AMD themselves, as he claimed. I also can't find anyone repeating this by googling "no air cooled Fury X".


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Some random source is not AMD themselves as he claimed. I also can't find anyone repeating this by googling "no air cooled Fury X"


It wasn't specific, but during the presentation Dr Su said that the Fiji family would have a liquid-cooled version, the Fury X, and an air-cooled version, the Fury. So that kind of implies that the Fury X is only water-cooled, although she didn't come right out and say it.


----------



## zeppoli

1080p, ULTRA settings, everything all the way up, everything but MSAA.

The 290X is averaging above 60 FPS, and the 980 is too;
the 980 Ti easily averages over 60 FPS on ultra at 1080p.


----------



## xer0h0ur

Did they present the air cooled Fury (obviously non X)? Or did they only show off the R9 Nano and the Fury X?


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did they present the air cooled Fury (obviously non X)? Or did they only show off the R9 Nano and the Fury X?


No, just mentioned it. They also didn't give any specs for it at all. Guess we'll have to wait for mid-July to find out.


----------



## xer0h0ur

Here is the main reason why it makes zero sense to me not to have an air-cooled Fury X. There are tons of people already complaining that they want a Fury X but can't fit its radiator into their system. This severely limits AMD's sales of the Fury X. Also, I don't ever remember AMD pulling that crap with AIBs by limiting them from creating custom PCBs or custom-cooled cards.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> Here is the main reason why it makes zero sense to me to not have a Fury X. There are tons of people that are already complaining that they want Fury X but can't add its radiator into the system. This severely limits AMD's sales on Fury X. Also, I don't ever remember AMD pulling that crap with AIBs by limiting them on creating custom PCBs or custom cooled cards.


There will be 3 other Fury cards.
So much whining.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> Here is the main reason why it makes zero sense to me to not have a Fury X. There are tons of people that are already complaining that they want Fury X but can't add its radiator into the system. This severely limits AMD's sales on Fury X. Also, I don't ever remember AMD pulling that crap with AIBs by limiting them on creating custom PCBs or custom cooled cards.


How many people are actually running custom loops? They're a fraction of a fraction of the market.


----------



## Agent Smith1984

Well, AMD seemed to have limited its partners on the 295X2 to reference designs, but PowerColor did a completely new design and popped out the Devil 13, so we could still see something pop up down the road, right?

And honestly, I can't see the X being sales-limited by its cooler....
People wanting to incorporate it into their own loops are already going to be buying blocks anyways.
And anyone sticking with the reference cooling should have no issue swapping their rear exhaust fan for the GPU cooling fan, since it will essentially do the same thing and cool the card.....


----------



## xer0h0ur

I said the AIO's radiator.... not adding it to an open loop. How the hell did you arrive at that based on what I said about people not having the space for a radiator....

Edit: I see why, missed putting air cooled on that post.


----------



## p4inkill3r

I assumed that was what you meant because that is a complaint I've seen since the announcement.
Since that's not what you meant, my point is moot.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Here is the main reason why it makes zero sense to me to not have an air cooled Fury X. There are tons of people that are already complaining that they want Fury X but can't add its radiator into the system. This severely limits AMD's sales on Fury X. Also, I don't ever remember AMD pulling that crap with AIBs by limiting them on creating custom PCBs or custom cooled cards.


Haha, wow, you're too much man. You should get a job at AMD or NVIDIA since you're so freaking smart. Freaking lol.


----------



## Amhro

I am really confused now.

I wanted a 290 or 290X (for 1080p max settings, sometimes a 4K TV, price around 300-400eur), but now those Fiji cards have made a little mess for me. Will there be any better choice in that price range now?

Or should I grab the 290 (320eur) / 290X (390eur)?


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> Haha, wow, you're too much man. You should get a job at AMD or NVIDIA since you're so freaking smart. Freaking lol.


Are you seriously this dense? How is it even remotely difficult to understand that you're limiting the market and sales of your top-end single-GPU graphics card by restricting it to a water-cooled solution, forcing a radiator upon you? It's not even an argument worth having. By far and away, most people buy air-cooled video cards.


----------



## zeppoli

The Fury X and Fury Nano are the exact same specs, only the Nano is small.. Correct?


----------



## Agent Smith1984

Quote:


> Originally Posted by *zeppoli*
> 
> The furyX and fury nano are exact same specs, only the nano is small.. Correct?


No confirmation on that at all yet....


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> The furyX and fury nano are exact same specs, only the nano is small.. Correct?


not enough information to confirm or deny that


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you seriously this dense? How is it even remotely difficult to understand that you're limiting your market and sales of your top end single gpu graphics card by limiting it to a water cooled solution forcing a radiator upon you? Its not even an argument worth having. By far and away most people buy air cooled video cards.


I guess that's why the R9 295X2 sold like crap. Oh wait, it didn't. It sells like hotcakes.


----------



## xer0h0ur

Nevermind. I now realize I am speaking with someone who is not open to possibilities or grounded in reality.


----------



## Forceman

Quote:


> Originally Posted by *zeppoli*
> 
> The furyX and fury nano are exact same specs, only the nano is small.. Correct?


That seems very unlikely. They'd have to underclock it a ton to hit that power envelope, and that would be a real waste of a full Fiji die. It'll almost certainly be a cut-down die.


----------



## zealord

I don't think the Nano has Fury X specs, because the Nano is 175W and the Fury X 275W. It would be a waste to have a big die with 4096 shaders and underclock it heavily to hit low power usage.

Pretty sure it is a heavily cut Fiji core.

Edit: looks like I was a couple of seconds too slow


----------



## fewness

Quote:


> Originally Posted by *zeppoli*
> 
> The furyX and fury nano are exact same specs, only the nano is small.. Correct?


How is that possible if the X needs an AIO water cooler but the Nano only requires one fan?


----------



## xer0h0ur

Did he edit the post? I thought he was comparing the standard Fury to the R9 Nano. Not a chance in hell it's the equivalent of the Fury X. How close it is to the standard Fury is the only question I have.


----------



## Agent Smith1984

I'm guessing the Nano will fill a gap right between the $400 390x and the $550 Fury.....

It will probably have around 3072-3328 shaders, and cost around $500..... Just a guess though


----------



## zeppoli

So the release date of the R9 3xx series is tomorrow.

Fury/Fury X will be in one week, June 24th.

Will any of the reviewers out there get it early to release impressions/benchmarks for us? Or are we going to have to wait for someone here to be our guinea pig?


----------



## Roaches

I kinda missed the paper launch yesterday. If it turns out to be a real beast when reviews surface, I'm afraid my Devil 13 CFX will have a very short service life. Might pick up 2 during the holiday season if the AIB solutions look attractive.


----------



## PontiacGTX

Quote:


> Originally Posted by *zealord*
> 
> I don't think the Nano has Fury X speccs, because the Nano is 175W and the FuryX 275W. It would be wasted to have a big die with 4096 shaders and underclock it heavily to have low power usage.
> 
> Pretty sure it is a heavily cut Fiji core.
> 
> Edit: looks like I was a couple of seconds too slow


Where did they say it was 175W?


----------



## MiladEd

Here:

http://i.imgur.com/WiNvXWn.png


----------



## Hazardz

Quote:


> Originally Posted by *zeppoli*
> 
> So release date of the R9 3xx series is tomorrow.
> 
> Fury/FuryX will be in one week, June 24th
> 
> Will any of the reviewers out there get it early to release experience/benchmarks for us? Or are we going to have to wait and someone here will be our guinea pig


I would guess there will be some more leaked benches coming out of Asia or something ahead of launch since they should already be in the distribution channels.


----------



## jerrolds

Quote:


> Originally Posted by *zeppoli*
> 
> So release date of the R9 3xx series is tomorrow.
> 
> Fury/FuryX will be in one week, June 24th
> 
> Will any of the reviewers out there get it early to release experience/benchmarks for us? Or are we going to have to wait and someone here will be our guinea pig


I would expect reviews to be made available by the end of this week. I know the 290X reviews were out at least a few days before retail launch.


----------



## hamzta09

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well that one is useless to me. I don't speak/read whatever language that is. What I am most interested in though is this claim you made that AMD said it themselves. I have not seen this anywhere but from you. The reason why this particularly doesn't make sense to me is because information came out that AMD ditched the blower style cooler for a triple fan cooler on Fury/X.


Had you read the URL you'd have known it's Swedish.

And if you had any knowledge of computers or the internet you could have:

https://translate.google.se/translate?sl=sv&tl=en&js=y&prev=_t&hl=sv&ie=UTF-8&u=http%3A%2F%2Fwww.sweclockers.com%2Fnyhet%2F20684-radeon-fury-x-endast-med-vattenkylning-nedskalade-fury-anmals-saknad&edit-text=


----------



## PontiacGTX

Quote:


> Originally Posted by *MiladEd*
> 
> Here:
> 
> http://i.imgur.com/WiNvXWn.png


Sounds like a 2816SP or 3072SP GPU to compete against the 980, for a $450-500 price tag.

3072/2816 SP, 4GB RAM, 1024-bit HBM, 64 ROPs, 192/176 TMUs, 48/44 CUs
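As a quick sanity check on those figures: GCN groups 64 stream processors per compute unit, so the speculated SP counts map directly onto the quoted CU counts (a minimal sketch using the numbers from the post above):

```python
# GCN compute units (CUs) each contain 64 stream processors (SPs),
# so a speculated SP count maps directly onto a CU count.
SP_PER_CU = 64
speculated_sp = (3072, 2816)

cu_counts = {sp: sp // SP_PER_CU for sp in speculated_sp}
for sp, cu in cu_counts.items():
    print(f"{sp} SP -> {cu} CUs")
```

That reproduces the 48/44 CU split quoted above.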


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> So release date of the R9 3xx series is tomorrow.
> 
> Fury/FuryX will be in one week, June 24th
> 
> Will any of the reviewers out there get it early to release experience/benchmarks for us? Or are we going to have to wait and someone here will be our guinea pig


Nope. Fury X launches in a week. Fury doesn't come out until next month and the Nano has no release date or specifications.

"The new Fury X flagship is launching on the 24th of June and will cost $650. Fury on the other hand will launch on the 12th of July and will cost $550."

http://wccftech.com/amd-officially-launches-radeon-fury-x-nano-650/


----------



## coelacanth

Quote:


> Originally Posted by *jerrolds*
> 
> Google translate? "According to sources of vBulletin is the Radeon Fury X only supposed to be released with closed water cooling and no air -cooled reference design are available in the plans. At the same time underlines that partner manufacturers may not make any interventions on the card , without the need to follow AMD's specifications to the letter"
> 
> Looks like 3rd party arent allowed to make interventions (modifications?) on the card - *i think us QNIX owners are hosed if we want [email protected]* 75hz is possible tho with dl active adapters


That wouldn't be good if that's the case. I need my 96-120Hz!!!


----------



## Agent Smith1984

It really just leaves anyone looking for max horsepower to either opt for 2x 390s or 1x Fury X, with both setups costing around the same; the performance edge goes to the two cards, while the usual quirks of crossfire (for some) remain the deterrent from going that route.


----------



## xer0h0ur

Quote:


> Originally Posted by *hamzta09*
> 
> Had you read the URL you'd known its Swedish.
> 
> And if you had any knowledge of computers or the internet you could have:
> 
> https://translate.google.se/translate?sl=sv&tl=en&js=y&prev=_t&hl=sv&ie=UTF-8&u=http%3A%2F%2Fwww.sweclockers.com%2Fnyhet%2F20684-radeon-fury-x-endast-med-vattenkylning-nedskalade-fury-anmals-saknad&edit-text=


Thanks for clearing that up in as condescending of a way as possible. Really showing off your superior intellect there bud.


----------



## Zmanster

This is according to Techpowerup's website:

AMD Radeon R9 Nano to Feature a Single PCIe Power Connector

AMD's Radeon R9 Nano is shaping up to be a more important card for AMD than even its flagship, the R9 Fury X. Some of the first pictures of the Fury X led us to believe that it could stay compact only because it's liquid-cooled. AMD disproved that notion, unveiling the Radeon R9 Nano, an extremely compact air-cooled graphics card with some stunning chops.

The Radeon R9 Nano is a feat similar to the NUC by Intel - to engineer a product that's surprisingly powerful for its size. The card is 6 inches long, 2 slots thick, and doesn't lug along any external radiator. AMD CEO Lisa Su, speaking at the company's E3 conference, stated that the R9 Nano will be faster than the Radeon R9 290X. That shouldn't surprise us, since it's a bigger chip; but it's the electrical specs that make this product exciting - a single 8-pin PCIe power input, with a typical board power rated at 175W (the Radeon R9 290X was rated at 275W). The card itself is as compact as some of the "ITX-friendly" custom-design boards launched in recent times. It uses a vapor-chamber-based air-cooling solution with a single fan. The Radeon R9 Nano will launch later this summer. It could compete with the GeForce GTX 970 in both performance and price.
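The single 8-pin figure is consistent with the PCIe power budget: the PCI Express slot supplies up to 75W and one 8-pin connector up to 150W, so the quoted 175W typical board power fits with room to spare (a quick sketch of that arithmetic):

```python
# Power-budget arithmetic for the quoted R9 Nano configuration.
PCIE_SLOT_W = 75     # max power from a PCI Express x16 slot
EIGHT_PIN_W = 150    # max power from one 8-pin PCIe connector
BOARD_POWER_W = 175  # typical board power quoted for the Nano

budget = PCIE_SLOT_W + EIGHT_PIN_W  # 225W available
headroom = budget - BOARD_POWER_W   # 50W of margin
print(f"available: {budget} W, headroom: {headroom} W")
```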


----------



## PontiacGTX

Quote:


> Originally Posted by *Zmanster*
> 
> This is according to Techpowerup's website:
> 
> AMD Radeon R9 Nano to Feature a Single PCIe Power Connector
> 
> AMD's Radeon R9 Nano is shaping up to be a more important card for AMD, than even its flaghsip, the R9 Fury X. Some of the first pictures of the Fury X led us to believe that it could stay compact only because it's liquid cooled. AMD disproved that notion, unveiling the Radeon R9 Nano, an extremely compact air-cooled graphics cards, with some stunning chops.
> 
> The Radeon R9 Nano is a feat similar to the NUC by Intel - to engineer a product that's surprisingly powerful for its size. The card is 6-inches long, 2-slot thick, and doesn't lug along any external radiator. AMD CEO Lisa Su, speaking at the company's E3 conference, stated that the R9 Nano will be faster than the Radeon R9 290X. That shouldn't surprise us, since it's a bigger chip; but it's the electrical specs, that make this product exciting - a single 8-pin PCIe power input, with a typical board power rated at 175W (Radeon R9 290X was rated at 275W). The card itself is as compact as some of the "ITX-friendly" custom design boards launched in recent times. It uses a vapor-chamber based air-cooling solution, with a single fan. The Radeon R9 Nano will launch later this Summer. It could compete with the GeForce GTX 970 in both performance and price.


If that's true, it has around the SP count of a 290.


----------



## zeppoli

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nope. Fury X launches in a week. Fury doesn't come out until next month and the Nano has no release date or specifications.
> 
> "The new Fury X flagship is launching on the 24th of June and will cost $650. Fury on the other hand will launch on the 12th of July and will cost $550."
> 
> http://wccftech.com/amd-officially-launches-radeon-fury-x-nano-650/


Interesting.. why do I feel this is because the Fury with a small OC will perform the same as or better than the Fury X?

lol

Possible?
Why release the water-cooled GPU first?


----------



## xer0h0ur

Well, don't look to me for any speculation. These toolsheds will eat me alive if I contradict them in any way, shape or form.


----------



## Agent Smith1984

Anyone notice that the 390X on Newegg is listed as a 208W card, yet it's clocked at 1100/1525!!!

What the hell did they do to Hawaii to pull that off?


----------



## xer0h0ur

Some people believe the 3XX series has massive typos in its specifications. Some believe they actually did change something in the dies. Even though Best Buy has been selling these things for days now, no legit reviewers seem to have gotten their hands on them yet, or if they have, they are not releasing reviews of the cards yet.


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone notice that the 390x on newegg is listed as a 208w card, yet it's clocked at 1100/1525!!!
> 
> What the hell did they do to Hawaii to pull that off?


Someone pointed out the other day that MSI (or some AIB) lists the 290X as 208W on their website. So it may just be a difference in the way they report the number.



http://www.overclock.net/t/1559387/pc-per-amd-r9-390x-confirmed-hawaii-rebrand-via-leaked-box-shot/150_30#post_24030413


----------



## PontiacGTX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Some people believe that the 3XX series has massive typos in its specifications. Some believe they actually did change something to the dies. Even though Best Buy has been selling these things for days now, no legit reviewers seem to have gotten their hands on them yet or even if they have they are not releasing reviews of the cards yet.


It is interesting that the long-heatsink, triple-fan design of the 290X is now dual-fan, even when the NVIDIA model is triple-fan.

GV-R939G1

Btw, the R9 390 is a 290X, not a 290.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672&cm_re=r9_390-_-14-131-672-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125792&cm_re=r9_390-_-14-125-792-_-Product


----------



## hyp36rmax

*+ Updated OP with R9 Radeon FURY X User guide*

*Source:* Link

*First Post:* Link

amd-radeon-r9-fury-x.pdf 1,554k .pdf file


----------



## xer0h0ur

Quote:


> Originally Posted by *PontiacGTX*
> 
> it is interesting that the log heatsink and triple fan design of the 290x now is dual fan,evne when nvidia model is triple fan
> 
> GV-R939G1
> 
> btw the r9 390 is a 290x not 290
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131672&cm_re=r9_390-_-14-131-672-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125792&cm_re=r9_390-_-14-125-792-_-Product


WAT? So what is the difference between the 390 and the 390X then, if they have an equal number of stream processors? Did anyone else notice that the MSI Gaming 390X is already running faster clocks than the 290X Lightning did? 1100MHz

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127872&cm_re=msi_390X-_-14-127-872-_-Product


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone notice that the 390x on newegg is listed as a 208w card, yet it's clocked at 1100/1525!!!
> 
> What the hell did they do to Hawaii to pull that off?


something for sure.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> something for sure.


I mean, if those numbers are correct, and you account for the card having 8GB of VRAM, it's not that bad of a rebranding at all....

Looks like they have Hawaii's power down, its OC headroom increased, and its VRAM not just increased but upgraded to faster chips.....


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I mean, if those numbers are correct, and you account for the card having 8GB of VRAM, it's not that bad of a rebranding at all....
> 
> Looks like they have Hawaii's power down, it's OC headroom increased, and it's VRAM not just increased, but improved to faster chips.....


Added value to the series; can't say I complain about that.
TSMC improved the process for sure.
The Fury will be my card, as I've wanted an upgrade for a long time, something that could last me a couple of years or so for my setup.


----------



## xer0h0ur

That R9 Nano makes it look like a chump though. The only drawback is that 4GB capacity.


----------



## hamzta09

Nano is a 175W card, so 670 levels?


----------



## hamzta09

Quote:


> Originally Posted by *xer0h0ur*
> 
> That R9 Nano makes it look like a chump though. Only drawback being that 4GB capacity.


According to AMD, HBM is not the same as GDDR5, so 4GB isn't a "limitation", due to its high bandwidth.


----------



## zeppoli

Quote:


> Originally Posted by *hyp36rmax*
> 
> *+ Updated OP with R9 Radeon FURY X User guide*
> 
> *Source:* Link
> 
> *First Post:* Link
> 
> amd-radeon-r9-fury-x.pdf 1554k .pdf file


WTH!!!

This is from the manual:
Quote:


> The radiator assembly must be mounted above the graphics card and in a
> location that has minimal impedance to air flow.


I have a Phanteks Enthoo Luxe and planned on mounting the rad on the FLOOR of the case. What is up with this "must be mounted above the GPU" talk? There would be no place for me to do this.. I guess this might be the deal breaker. Plus I need to spend at least 75 bucks on a silly adapter, as I'm sure MOST do as well.


----------



## xer0h0ur

Quote:


> Originally Posted by *hamzta09*
> 
> Nano is a 175W card, so 670 levels?


Nope. They already stated it's a higher-performing card than the 290X. It's a cut-down variant of Fiji, but not cut down to the point of letting the 290X beat it.
Quote:


> Originally Posted by *hamzta09*
> 
> According to AMD, HBM is not the same as GDDR5 so 4GB isnt a "limitation" due ti high bandwidth.


If the leaked benchmark is to be believed, once you run out of VRAM, all Fiji cards with 4GB take a nosedive in performance, crashing to the point of mediocrity. I am skeptical myself until I see legitimate benchmarks on a driver made for Fiji.


----------



## hyp36rmax

Quote:


> Originally Posted by *zeppoli*
> 
> WTH!!!
> 
> this is that manual
> I have a phantek enthoo luxe and planned on mounting the rad on the FLOOR of the case, what is up with this "must be mounted above the GPU" talk? there would be no place for me to do this.. I guess this might be the deal breaker. Plus I need to spend at least 75 bucks on a silly adapter as I'm sure MOST need to as well.


That is just a manufacturer suggestion, as all builds will differ slightly. Go with your build how you intend it to be. As Cooler Master's new slogan goes... "Make it yours"


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> WTH!!!
> 
> this is that manual
> I have a phantek enthoo luxe and planned on mounting the rad on the FLOOR of the case, what is up with this "must be mounted above the GPU" talk? there would be no place for me to do this.. I guess this might be the deal breaker. Plus I need to spend at least 75 bucks on a silly adapter as I'm sure MOST need to as well.


This is nothing different from the instructions they give for mounting the radiator of the 295X2. The reason they say this is the air bubbles in the closed loop: if you don't mount it above the card, you're going to hear gurgling or pump noise as the air bubbles work their way through. On every startup of the system, mind you.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> That R9 Nano makes it look like a chump though. Only drawback being that 4GB capacity.


Nano is the true X factor in all of this, and you can bet AMD knew that going in.

That's why it's the one card we know nothing about, and didn't know squat about to begin with.

This will be a highly anticipated little GPU....

Even with what we have coming now, though, we are looking at:

$200 380 GPU that will best the GTX 960

$329 (I feel like this card should be more like $289) R9 390 GPU that will compete much closer to the GTX 970 with its higher clock speeds, for $30 or so less

$430 (roughly? I think $349 is more reasonable) 390X GPU that will actually compete much closer with the GTX 980 now, for $100 less...

We are looking at a $550 GPU that will compete with the 980 Ti for $100 less

And finally the Fury X for $650, which will match or exceed the performance of the Titan X for A LOT LESS $$!!!

If AMD followed my suggested pricing for the 390 series, then the NANO would squeeze right in between the 390X and Fury at around $459-479.....

We shall see, I guess!!

Regardless, I'm not bashing AMD for the rebrandings at this time, due to the information stated above.....

I mean, if they reduced the power usage of Hawaii and increased performance through higher clock speeds, then they filled the market segment they needed to. Why does it matter whether they used a new die to do it?? The whole thing allowed them to invest in Fury, and still be more than competitive in the mainstream with their other cards...


----------



## Agent Smith1984

I'd have thought that 512GB/s of bandwidth would let the frame buffer move data through the VRAM at a fast enough rate that the memory capacity itself isn't as much of an issue.....

I may have theorized that incorrectly....
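A rough back-of-the-envelope calculation shows why bandwidth alone can't make up for capacity: once the working set spills out of VRAM, the bottleneck becomes the PCIe link, not the HBM. The numbers below are assumed round figures (Fiji's quoted 512 GB/s HBM bandwidth and a ~15.75 GB/s PCIe 3.0 x16 ceiling), not measurements from the thread.

```python
# Assumed figures, for illustration only:
HBM_BW = 512e9        # bytes/s, Fiji's quoted HBM bandwidth
PCIE_BW = 15.75e9     # bytes/s, rough PCIe 3.0 x16 ceiling
VRAM = 4 * 1024**3    # 4 GiB frame buffer

# Time to sweep the entire 4 GiB once from on-card HBM:
t_hbm = VRAM / HBM_BW                     # ~8.4 ms
# Time to pull just 512 MiB of overflow data over PCIe:
t_pcie = (512 * 1024**2) / PCIE_BW        # ~34 ms

print(f"full 4 GiB from HBM : {t_hbm * 1000:.1f} ms")
print(f"512 MiB over PCIe   : {t_pcie * 1000:.1f} ms")
```

So swapping even a small overflow across the bus costs several whole frames at 60fps, which would explain the "nose dive" behavior in the leaked benchmark.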


----------



## MiladEd

And the 390 and 390X, despite being older (and hotter) than the similarly priced Nvidia offerings, can still compete with them perfectly well.


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd of thought that 512GB/s bandwidth would allow the frame buffer to process information through the VRAM at a fast enough rate that the memory limitation itself is not as much of an issue.....
> 
> I may have theorized that incorrectly....


The only thing I am grateful for is that even when gaming at 4K with no AA, there are very few games that will eat up more than 4GB of vRAM with settings cranked. Off the top of my head, only GTA V and Shadow of Mordor. I don't like this console-port trend of using more and more vRAM.

It's the multi-monitor guys that really want the extra vRAM. Not me.


----------



## hamzta09

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nope. They already stated its a higher performing card than the 290X was. Its a cut down variant of Fiji but its not cut down to the point of allowing the 290X to beat it..


I never said performance.

I said watts.

The 670 is a 170W TDP card.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> WAT? So what is the difference between the 390 and 390X then if they have equal amount of stream processors?


That may be a typo. The MSI cards show 2560 shaders.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127874&cm_re=r9_390-_-14-127-874-_-Product

And the Sapphire, and XFX.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148&cm_re=r9_390-_-14-202-148-_-Product

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150729&cm_re=r9_390-_-14-150-729-_-Product

But what is this Gigabyte card with 1792 and 4GB?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125793&cm_re=r9_390-_-14-125-793-_-Product


----------



## xer0h0ur

Quote:


> Originally Posted by *hamzta09*
> 
> I never said performance.
> 
> I said watt.
> 
> 670 is 170W TDP.


Aye, "levels" threw me off.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> That may be a typo. The MSI cards show 2560 shaders.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127874&cm_re=r9_390-_-14-127-874-_-Product
> 
> And the Sapphire, and XFX.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148&cm_re=r9_390-_-14-202-148-_-Product
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150729&cm_re=r9_390-_-14-150-729-_-Product
> 
> But what is this Gigabyte card with 1792 and 4GB?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125793&cm_re=r9_390-_-14-125-793-_-Product


That last one has 380 specs, so I'm assuming a title error on that one, especially considering the picture on the box shows "R9 380".


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> That may be a typo. The MSI cards show 2560 shaders.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127874&cm_re=r9_390-_-14-127-874-_-Product
> 
> And the Sapphire, and XFX.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202148&cm_re=r9_390-_-14-202-148-_-Product
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150729&cm_re=r9_390-_-14-150-729-_-Product
> 
> But what is this Gigabyte card with 1792 and 4GB?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125793&cm_re=r9_390-_-14-125-793-_-Product


I figured that had to be a typo. Now that you mention Gigabyte, by the way: the AIBs are extremely pissed at AMD for dropping the pricing on the Fury line to match or beat Nvidia's current pricing. They're particularly mad about the Fury X, which was apparently supposed to be ~$850. Gigabyte is considering dropping AMD altogether because of it.


----------



## p4inkill3r

Source?


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That last one has 380 specs, so I'm assuming a title error on that one. Especially considering the picture on the box shows "R9 380"


Should have caught that. Top notch Newegg quality control.


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> Source?


Industry insider that posted the information on wccftech.com yesterday. Take it for what it is, information from a source we can't rely on. Although this guy isn't the only source to mention that AMD's Fury X was meant to be priced higher.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Should have caught that. Top notch Newegg quality control.


REALLY curious to see what they are wanting to fetch for the MSI Gaming 390x.....


----------



## zeppoli

Quote:


> Originally Posted by *hyp36rmax*
> 
> That is just a manufacturer suggestion as all builds will deter slightly. Go with your build how you intend it to be. As Cooler Master's new slogan goes... "Make it yours"


I was thinking that, but most times they just suggest which direction the airflow should go.. What reason do they have that the rad/fan (and pump, I assume) needs to be mounted ABOVE the GPU? Maybe gravity somehow plays a role in moving the water?


----------



## edo101

Quote:


> Originally Posted by *gatygun*
> 
> To bad there is really no option for 1440p or even 1080p to get a solid 60fps on a single gpu without having to turn down settings. This is why crossfire / sli will always be a thing.


You know, this is why I'm glad I'm not OCD about the Ultra preset. Most of the time I don't even notice those Ultra settings, besides the drop in framerate. I'm happy I'm not Ultra'd out. It saves me from having to tweak the hell out of my game to get slightly, very slightly better visuals than what the console peasants play. Witcher 3 and GTA V opened my eyes.


----------



## xer0h0ur

I answered that already. It's the air bubbles in the system causing gurgling or pump-grinding sounds. It goes away a moment after startup.


----------



## Agent Smith1984

Any chance the 390's will finally have HDMI 2.0??

That was another reason for selling off my 290's... couldn't do 60hz on a 4K TV unless I found one with DP....


----------



## Awsan

One dual Fiji for my mITX with a side of water cooling please

And make that extra small


----------



## PontiacGTX

Quote:


> Originally Posted by *hamzta09*
> 
> Nano is a 175W card, so 670 levels?


No, the Fury Nano isn't a midrange card sold as high end.


----------



## }SkOrPn--'

Quote:


> Originally Posted by *hyp36rmax*
> 
> With the release of Nvidia's GTX TITAN X and AMD hot on their heels with increasing speculation up towards E3 2015 (06/16/15) for an official announcement of the R9 Radeon Fury X. Things are about to get exciting with innovative technology and API such as HBM, FREESYNC, DIRECT X 12, VULCAN. Let's start the discussion here:


Things are about to get exciting? One question then: where is the much-needed DVI port for my 27" 1440p (2.5K) 120Hz IPS display, hmm? Why would they do something so idiotic after millions of DVI-only 120Hz LG and Samsung Korean displays sold over the last few years? Do us enthusiasts with thousands of dollars in 1440p displays now need to solder on our own freggin ports??? Or do we need to throw away one- and two-year-old displays just because the engineers did not give us the ports? Active adapters that are worth anything cost an extra $100 or so; it's insane.

I'm actually sick to my stomach after waiting all year for this release, just to see my very hard work on my multi-display setup isn't even supported out of the box. Even the Titan X was intelligent enough to have a DVI port.

An AMD fan is about to be lost, I think....


----------



## Peter Nixeus

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> Things are about to get exciting? One question then, where is the much needed DVI port for my 27" 1440p (2.5K) 120HZ IPS display, hmm? Why would they do something so idiotic after millions of DVI only 120hz LG and Samsung Korean displays sold over the last few years? Do us Enthusiasts with thousands of dollars in 1440p displays now need to solder on our own freggin ports??? Or do we need to throw away 1 and 2 year old displays just because the Engineers did not give us the ports? Active adapters that are worth anything cost an extra $100 or so, its insane.
> 
> I'm actually sick to my stomach after waiting all year for this release just to see my very hard work on my multi display setup isn't even supported out of the box. Even the Titan X was intelligent enough to have a DVI port.
> 
> 
> 
> 
> 
> 
> 
> An AMD fan is about to be lost I think....


DisplayPort is the best connection for 2560x1440, 4K, and higher resolutions using a single cable connection. I don't think there is a DVI specification for anything higher than 2560x1600.


----------



## Roaches

I'm glad DVI is being tossed out in favor of DP. It's old, ecksbawx huge, very stiff, and takes up lots of I/O space on the PCI-E bracket and PCB. I'd very much like a card with more display connectivity, like what you see on those FirePro cards. I have no problem shelling out extra cash for DP-to-DVI adapters as needed.


----------



## rv8000

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> Things are about to get exciting? One question then, where is the much needed DVI port for my 27" 1440p (2.5K) 120HZ IPS display, hmm? Why would they do something so idiotic after millions of DVI only 120hz LG and Samsung Korean displays sold over the last few years? Do us Enthusiasts with thousands of dollars in 1440p displays now need to solder on our own freggin ports??? Or do we need to throw away 1 and 2 year old displays just because the Engineers did not give us the ports? Active adapters that are worth anything cost an extra $100 or so, its insane.
> 
> I'm actually sick to my stomach after waiting all year for this release just to see my very hard work on my multi display setup isn't even supported out of the box. Even the Titan X was intelligent enough to have a DVI port.
> 
> 
> 
> 
> 
> 
> 
> 
> An AMD fan is about to be lost I think....


1) I'm sure theyll have a dp/hdmi to dvi adapter in the packaging
2) If not buy one?


----------



## }SkOrPn--'

Quote:


> Originally Posted by *Peter Nixeus*
> 
> DisplayPort is the best connection for 2560x1440, 4K, and higher resolutions using a single cable connection. I don't think there is a DVI specification for anything higher than 2560x1600.


No, there is not, but that is beside the point of my rant. Who needs DP when your gaming display is a 2560x1440 panel with a single DVI input running at 120Hz? I don't need DP on my gaming display, not for a long, long time, not until my beautiful 1440p panel naturally dies on me, or until 120Hz 1440p is no longer decent. My 4K TV needs HDMI, but it only needs a 60Hz input, so no worries there. So, how do I drive a 120Hz DVI display without spending $100 extra? Again, no DVI port is absolutely insane to me. I only need one true DVI port ON THE CARD, because my other two displays work OK with DP-to-DVI and the TV only needs an HDMI cable.

Again, I feel gutted by AMD now. I saved up the money all year for this card and guess what, it looks like they are forcing an AMD fan to buy a competitor's card. I will have to weigh the pros and cons of an adapter and see if it's just better to go with NVIDIA now.
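For what it's worth, the pixel clock those 120Hz DVI Korean panels need is easy to estimate. The blanking figures below are assumed CVT-reduced-blanking-style values, not measured ones, so treat this as a sketch of why cheap adapters top out well below 120Hz:

```python
# 2560x1440 @ 120 Hz with reduced-blanking-style timings (assumed):
H_ACTIVE, V_ACTIVE, HZ = 2560, 1440, 120
H_BLANK, V_BLANK = 160, 41            # typical CVT-RB blanking (assumed)

pclk = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * HZ   # Hz
print(f"required pixel clock: {pclk / 1e6:.0f} MHz")      # ~483 MHz

# Dual-link DVI is specified for 2 x 165 MHz = 330 MHz of TMDS clock,
# so 1440p@120 only works because those scaler-less panels tolerate
# out-of-spec clocks -- something most adapters won't pass through.
print("within DL-DVI spec:", pclk <= 330e6)
```

That ~483 MHz requirement is well past the 330 MHz dual-link spec, which lines up with the complaint that most adapters barely manage 75Hz.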


----------



## HiTechPixel

Quote:


> Originally Posted by *Peter Nixeus*
> 
> DisplayPort is the best connection for 2560x1440, 4K, and higher resolutions using a single cable connection. I don't think there is a DVI specification for anything higher than 2560x1600.


I just want to make it clear that you can't run resolutions higher than 4K over a single DisplayPort 1.2 cable. You have to use MST and two DisplayPort cables just to get 5K. Shame on you, Nixeus, for not mentioning this.


----------



## Forceman

Quote:


> Originally Posted by *rv8000*
> 
> 1) I'm sure theyll have a dp/hdmi to dvi adapter in the packaging
> 2) If not buy one?


Adapters can't pass overclocked refresh rates.
Quote:


> Originally Posted by *}SkOrPn--'*
> 
> Again, I feel gutted by AMD now. Saved up the money all year for this card and guess what, it looks like they are forcing an AMD fan to buy a competitors card. I will have to weigh the pros and cons of an adapter and see if its just better to go with NVIDIA now.


Maybe just wait and see if the Fury non-X (or some AIB version) has a DVI. That's my plan.


----------



## hamzta09

Quote:


> Originally Posted by *Roaches*
> 
> I'm glad DVI is being tossed out in favor of DP. Its old, ecksbawx huge, very stiff, takes lots of I/O space on the PCI-E bracket and PCB. I'd very much like a card with more display connectivity like seen on those FirePro cards. I have no problem shelling out extra cash for DP to DVI adapters as needed.


DP and HDMI suck.
Monitor goes idle - BLUE SCREEN (not a BSOD) saying "NO SIGNAL"

annoyyyyyyyyyyying


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> DP and HDMI sucks.
> Monitor goes idle- BLUE SCREEN (not bsod) "NO SIGNAL"
> 
> annoyyyyyyyyyyying


Never have that issue with my DP or HDMI.


----------



## edo101

I'm sure Sapphire will make a variant with DVI. Sapphire has been good about offering multiple I/O configurations on its cards.


----------



## }SkOrPn--'

Quote:


> Originally Posted by *rv8000*
> 
> 1) I'm sure theyll have a dp/hdmi to dvi adapter in the packaging
> 2) If not buy one?


I wish. I just read the entire AMD PDF on the Fury X, and it requires an active adapter for my setup. Which means my Fury X now costs me $750.

I will contact my old AMD friends (I used to work for them for a long time) and see if they will send me an adapter, because I'm not being forced to buy one when the "other" brand doesn't require it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hamzta09*
> 
> DP and HDMI sucks.
> Monitor goes idle- BLUE SCREEN (not bsod) "NO SIGNAL"
> 
> annoyyyyyyyyyyying
> 
> 
> 
> Never have that issue with my DP or HDMI.

Happens on older monitors. My old Asus 1080p monitor does that, but my (slightly) newer Asus one doesn't; both are connected via HDMI.

Normally you get a "No Signal" blue screen, then the monitor goes idle after about 2-3 seconds


----------



## edo101

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> I wish. I just read the entire AMD PDF on Fury X and it requires an active adapter for my setup. Which means my Fury X now costs me $750.
> 
> I will contact my old AMD friends (used to work for them for a long time) and see if they will send me an adapter, because I'm not being forced to buy one if the "other" brand doesn't require it.


Wait for Sapphire before feeding more green to team green.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Happens on older monitors, my old Asus 1080p monitor does that but my (slightly) newer Asus one doesn't, both connected via HDMI.
> 
> Normally you get a No Signal blue screen then the monitor will go idle after about 2-3 seconds


Thanks for clearing that up.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> I just want to make it clear that you can't use resolutions higher than 4K using a single DisplayPort cable. You have to use MST and two DisplayPort cables just to get 5K. Shame on you Nixeus for not mentioning this.


You would think someone as intelligent as you purport yourself to be would know that DisplayPort 1.2a barely has enough bandwidth for 4K @ 60Hz, making it obvious that a single connection can't handle anything beyond it.
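The link budget is easy to check. The sketch below uses the standard DP 1.2 HBR2 figures (4 lanes x 5.4 Gbit/s raw, 8b/10b coding) and counts only active pixel data, ignoring blanking overhead, so the real margins are even tighter:

```python
# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s raw, 8b/10b coding -> 17.28 Gbit/s payload
DP12_PAYLOAD = 4 * 5.4e9 * 8 / 10

def pixel_rate(w, h, hz, bpp=24):
    """Active pixel data rate in bit/s (blanking overhead ignored)."""
    return w * h * hz * bpp

uhd60 = pixel_rate(3840, 2160, 60)    # 4K @ 60 Hz, ~11.9 Gbit/s
fivek60 = pixel_rate(5120, 2880, 60)  # 5K @ 60 Hz, ~21.2 Gbit/s

print(f"DP 1.2 payload : {DP12_PAYLOAD / 1e9:.2f} Gbit/s")
print(f"4K @ 60 Hz     : {uhd60 / 1e9:.2f} Gbit/s -> fits")
print(f"5K @ 60 Hz     : {fivek60 / 1e9:.2f} Gbit/s -> needs MST or DP 1.3")
```

4K60 squeaks under the 17.28 Gbit/s payload ceiling; 5K60 is over it, which is why a single DP 1.2 cable can't go beyond 4K without MST.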


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Happens on older monitors, my old Asus 1080p monitor does that but my (slightly) newer Asus one doesn't, both connected via HDMI.
> 
> Normally you get a No Signal blue screen then the monitor will go idle after about 2-3 seconds
> 
> 
> 
> Thanks for clearing that up.

No worries


----------



## Peter Nixeus

Quote:


> Originally Posted by *HiTechPixel*
> 
> I just want to make it clear that you can't use resolutions higher than 4K using a single DisplayPort cable. You have to use MST and two DisplayPort cables just to get 5K. Shame on you Nixeus for not mentioning this.


There is a DisplayPort 1.3 specification for cables and inputs/outputs that will take care of that.

The eDP inside 5K panels already supports it, but the current scalers do not.


----------



## Peter Nixeus

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> No there is not but that is off topic with my rant. Who needs DP when you only have a 2560x1440p gaming display with single DVI running at 120hz? I don't need DP on my gaming display, not for a long long time, not until my beautiful 1440p naturally dies on me, or when 120HZ 1440p is no longer decent. My 4K TV needs HDMI, but it only needs a 60HZ input, so no worries there. So, how do I adapt to a 120HZ DVI port without spending $100 extra? Again, no DVI port is absolutely insane to me. I only need one true DVI port ON THE CARD, because my other two displays work ok with dp-to-dvi and the TV only needs a HDMI cable.
> 
> Again, I feel gutted by AMD now. Saved up the money all year for this card and guess what, it looks like they are forcing an AMD fan to buy a competitors card. I will have to weigh the pros and cons of an adapter and see if its just better to go with NVIDIA now.


These are AMD reference cards that they are showing to the press - that does not prevent the AIB partners from including dual-link DVI in their versions.

Also, here is a fun and true fact: during product development of our FreeSync monitor, I had planned on having only a single DisplayPort input. But AMD kept suggesting we include DVI for legacy support, so cards that do not support FreeSync can still benefit from 144Hz.


----------



## jerrolds

Quote:


> Originally Posted by *Peter Nixeus*
> 
> These are AMD reference cards that they are showing to the press - does not completely negate the AIB partners to include Dual Link DVI in their versions.
> 
> Also here is a fun and true fact - during product development of our FreeSync monitor, I had planned on having only a single DisplayPort input for our monitor. But AMD kept suggesting to include DVI for legacy support so cards that do not support FreeSync can still benefit from 144Hz.


This is good to know - I would definitely pick up a Fury X w/ DVI out. I was afraid AMD was limiting AIB partners to a very specific spec (HDMI/DP only).


----------



## Sgt Bilko

Quote:


> Originally Posted by *Peter Nixeus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *}SkOrPn--'*
> 
> No there is not but that is off topic with my rant. Who needs DP when you only have a 2560x1440p gaming display with single DVI running at 120hz? I don't need DP on my gaming display, not for a long long time, not until my beautiful 1440p naturally dies on me, or when 120HZ 1440p is no longer decent. My 4K TV needs HDMI, but it only needs a 60HZ input, so no worries there. So, how do I adapt to a 120HZ DVI port without spending $100 extra? Again, no DVI port is absolutely insane to me. I only need one true DVI port ON THE CARD, because my other two displays work ok with dp-to-dvi and the TV only needs a HDMI cable.
> 
> Again, I feel gutted by AMD now. Saved up the money all year for this card and guess what, it looks like they are forcing an AMD fan to buy a competitors card. I will have to weigh the pros and cons of an adapter and see if its just better to go with NVIDIA now.
> 
> 
> 
> These are AMD reference cards that they are showing to the press - does not completely negate the AIB partners to include Dual Link DVI in their versions.
> 
> Also here is a fun and true fact - during product development of our FreeSync monitor, I had planned on having only a single DisplayPort input for our monitor. But AMD kept suggesting to include DVI for legacy support so cards that do not support FreeSync can still benefit from 144Hz.

Interesting and good to know, thanks


----------



## edo101

Quote:


> Originally Posted by *Peter Nixeus*
> 
> These are AMD reference cards that they are showing to the press - does not completely negate the AIB partners to include Dual Link DVI in their versions.
> 
> Also here is a fun and true fact - during product development of our FreeSync monitor, I had planned on having only a single DisplayPort input for our monitor. But AMD kept suggesting to include DVI for legacy support so cards that do not support FreeSync can still benefit from 144Hz.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Interesting and good to know, thanks


C'mon, man, don't you guys ever buy non-reference cards? Sapphire alone makes so many different I/O combos with their cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *edo101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Peter Nixeus*
> 
> These are AMD reference cards that they are showing to the press - does not completely negate the AIB partners to include Dual Link DVI in their versions.
> 
> Also here is a fun and true fact - during product development of our FreeSync monitor, I had planned on having only a single DisplayPort input for our monitor. But AMD kept suggesting to include DVI for legacy support so cards that do not support FreeSync can still benefit from 144Hz.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Interesting and good to know, thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> C'mon man, don't you guys ever buy non reference cards. Sapphire alone makes so many different I/O combos with their cards

Yeah, I do, and I'm in a similar position, since I have a Qnix monitor and I like that mine does 110Hz. But if the Fury X is powerful enough for 4K gaming, then I'm willing to take a look at other monitors. Personally I want a 3440x1440 IPS, but we'll see.

Besides, my experience with Sapphire has been a little shaky, so I may lean towards another AIB.


----------



## xer0h0ur

Didn't you guys hear, though? Someone posted a source saying that there isn't any air-cooled Fury X and that AMD locked AIBs out of modifying the reference card spec. *rolleyes*


----------



## hamzta09

Quote:


> Originally Posted by *rdr09*
> 
> Never have that issue with my DP or HDMI.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Happens on older monitors, my old Asus 1080p monitor does that but my (slightly) newer Asus one doesn't, both connected via HDMI.
> 
> Normally you get a No Signal blue screen then the monitor will go idle after about 2-3 seconds


DP + PB287Q = Blue Screen with No Signal when Monitor goes idle.


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> Particularly mad about the Fury X which was apparently supposed to be ~$850. Gigabyte is considering dropping AMD altogether because of it.


This is coming from the company that had so many problems with their R9 290/290X cards and gave the lineup a bad reputation.


----------



## Kinaesthetic

Well, this is slightly off-putting for those of us who were thinking about putting the Fury Nano into an HTPC system: http://forums.overclockers.co.uk/showpost.php?p=28185835&postcount=249

I was hoping it was Gibbo that said it... since he can be wrong. But that is straight from AMD_Matt... ughh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Particularly mad about the Fury X which was apparently supposed to be ~$850. Gigabyte is considering dropping AMD altogether because of it.
> 
> 
> 
> This is coming from the company that had so many problems with their R9290/290X cards, and gave the lineup a bad reputation.


Had a Giga 280x.... had unstable clocks at stock. Giga's AMD lineup has been a little shaky lately, so people have been steering clear of them. I hope they do better this time around


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Well, this is slightly off-putting for all of us who were thinking about putting the Fury Nano into a HTPC system: http://forums.overclockers.co.uk/showpost.php?p=28185835&postcount=249
> 
> I was hoping that was Gibbo that said it....since he can be wrong. But that is straight from AMD_Matt......ughh.


Quote:


> Q: Does the AMD Radeon *R9 Fury X* GPU have HDMI 2.0?
> A: No. AMD recommends and uses DisplayPort 1.2a for 4K60 content.


That's for the Fury X; nothing was said about the Nano, but more than likely it doesn't have it either


----------



## xer0h0ur

Yeah I didn't have any idea about the problems Gigabyte had with the 2XX series. I had gone with Diamond.


----------



## edo101

Quote:


> Originally Posted by *Sgt Bilko*
> 
> This is coming from the company that had so many problems with their R9290/290X cards, and gave the lineup a bad reputation
> 
> Had a Giga 280x....had unstable clocks at stock, Giga's AMD lineup has been a little shaky lately so people have been steering clear of them, i hope they do better this time around


True dat. My 290 Windforce OC is testament to that. Abandoning my sweet Sapphire Tech for Gigabyte's warranty was a mistake.


----------



## xer0h0ur

Quote:


> Originally Posted by *Kinaesthetic*
> 
> Well, this is slightly off-putting for all of us who were thinking about putting the Fury Nano into a HTPC system: http://forums.overclockers.co.uk/showpost.php?p=28185835&postcount=249
> 
> I was hoping that was Gibbo that said it....since he can be wrong. But that is straight from AMD_Matt......ughh.


That is just stupid. Is there even a legitimate reason for them avoiding HDMI 2.0 at this point? Hell, I am still waiting for someone to put DP 1.3 into a video card for 4K @ 120Hz. Whoever gets there first is getting my money.


----------



## Ceadderman

*Place holder*!

I sooooooooooo cannot wait to get my Fury on!

~Ceadder


----------



## HiTechPixel

Quote:


> Originally Posted by *Peter Nixeus*
> 
> There is DisplayPort 1.3 specification for DisplayPort cables and inputs/outputs that will take care of that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The eDP inside 5k panels already support it but not the current scalars.


Haha, are you touting DisplayPort 1.3? Care to let us know when it will finally arrive in displays?


----------



## rv8000

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> I wish. I just read the entire AMD PDF on Fury X and it requires an active adapter for my setup. Which means my Fury X now costs me $750.
> 
> I will contact my old AMD friends (used to work for them for a long time) and see if they will send me an adapter, because I'm not being forced to buy one if the "other" brand doesn't require it.


In that 390X YouTube unboxing video there was an HDMI-to-DVI adapter, if I'm not mistaken. I have a strong feeling one will be included...


----------



## }SkOrPn--'

Quote:


> Originally Posted by *rv8000*
> 
> In that 390x youtube unboxing video there was an hdmi to dvi adpater if im not mistake, I have a strong feeling one will be included...


That would be really sweet. But it's hard to believe they would include a $100 active adapter in the box with an already reasonably priced card. And even if they did include it, there's no guarantee it would do 120Hz. Most adapters barely manage over 75Hz max.


----------



## Elmy

Getting 4 of them. Save me a spot!


----------



## zeppoli

Quote:


> Originally Posted by *Peter Nixeus*
> 
> DisplayPort is the best connection for 2560x1440, 4K, and higher resolutions using a single cable connection. I don't think there is a DVI specification for anything higher than 2560x1600.


Lol, reminds me of iPhone owners: "Well, having a big screen is just stupid, and you need to use two hands for it anyway."

At 1440p I highly doubt anyone would notice any difference between DVI and DisplayPort.

Bottom line, it is silly they did this. This thing has to really beat the 980 Ti (which is what AMD designed this card to compete with) for me to go that way, since I have to factor in a $70 adapter. I'm not alone; I bet 80% of the people buying this card will need some sort of adapter. DisplayPort is relatively new - 3-5 years ago it wasn't common on monitors.


----------



## Ironsmack

Quote:


> Originally Posted by *zeppoli*
> 
> Lol reminds me of iPhone owners: "Well, having a big screen is just stupid and you need two hands for it anyway."
> 
> @ 1440p I highly doubt anyone would notice any difference between DVI and DisplayPort.
> 
> Bottom line, it is silly they did this. This thing has to really beat the 980 Ti (which is what AMD designed this card to compete with) for me to go that way, since I have to factor in a $70 adapter cost. I'm not alone; I bet 80% of people buying this card will need some sort of adapter. DisplayPort is relatively new; 3-5 years ago it wasn't common on monitors.


Well, with the new 1440p, 4k monitors coming out left and right - it seems like DP is on the rise. Yes, it doesn't help the end user who still uses a DVI/HDMI port on their current monitors.

But it's just a matter of time when the price of an adapter will go down.


----------



## LA_Kings_Fan

Hopeful to join







... time to upgrade the Sapphire R9 290X







a Fury X sounds like the ticket I need.


----------



## LegacyLG

what do people think about a Radeon R9 Fury X2

I might wait for that one it will have 8gb ram


----------



## KeepWalkinG

Sitting and waiting date


----------



## Synik

Quote:


> Originally Posted by *LegacyLG*
> 
> what do people think about a Radeon R9 Fury X2
> 
> I might wait for that one it will have 8gb ram


It may have 8 GB of RAM, but it will still be 4 GB of usable memory until DX12 games support memory stacking in CrossFire.


----------



## Scotty99

Can anyone clear **** up for me?

Are any of the r9-300 series NEW silicon? Are they all rebrands with different clocks/memory configs? I just got an email from newegg, but looking at shader counts, they all look like rebrands to me.


----------



## bkvamme

Quote:


> Originally Posted by *Scotty99*
> 
> Can anyone clear **** up for me?
> 
> Are any of the r9-300 series NEW silicon? Are they all rebrands with different clocks/memory configs? I just got an email from newegg, but looking at shader counts, they all look like rebrands to me.


All Rx 3xx are rebrands. Some have minor tweaks, which could count as an upgrade, but they are still based on the previous silicon.


----------



## TK421

Fury series available for purchase anywhere in the US?


----------



## p4inkill3r

Quote:


> Originally Posted by *TK421*
> 
> Fury series available for purchase anywhere in the US?


Next week.


----------



## TK421

Quote:


> Originally Posted by *p4inkill3r*
> 
> Next week.


The wait is unbearable!


----------



## p4inkill3r

Quote:


> Originally Posted by *TK421*
> 
> The wait is unbearable!


It'll be here before you know it.


----------



## MiladEd

Some AMD R9 Fury X benchmarks vs. 980 Ti.

4k FPS in a bunch of new games:

http://i.imgur.com/BIqmOXb.png

Source: reddit.com/r/amd

source cited by OP: http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html *[Article Removed]*


----------



## HiTechPixel

One thing I noticed in the AMD Beijing slide is that the Fury X has an incredibly high minimum frame rate compared to 980 Ti / Titan X. Could this be a byproduct of HBM?


----------



## zeppoli

Quote:


> Originally Posted by *MiladEd*
> 
> Some AMD R9 Fury X benchmarks vs. 980 Ti.
> 
> 4k FPS in a bunch of new games:
> 
> http://i.imgur.com/BIqmOXb.png
> 
> Source: reddit.com/r/amd
> 
> source cited by OP: http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html *[Article Removed]*


HOLY CRAP!!! GO AMD!!!

So happy they pulled it off.. Being an old AMD fanboy who switched many years ago, this is like a Rocky story!

That isn't a small margin either, that's pretty big!
woot!!!


----------



## MapRef41N93W

Quote:


> Originally Posted by *zeppoli*
> 
> HOLY CRAP!!! GO AMD!!!
> 
> So happy they pulled it off.. Being an old AMD fanboy who switched many years ago, this is like a Rocky story!
> 
> That isn't a small margin either, *that's pretty big!*
> woot!!!


That's 1-5 FPS in everything but Sleeping Dogs. In an AMD test environment.


----------



## Forceman

Quote:


> Originally Posted by *zeppoli*
> 
> HOLY CRAP!!! GO AMD!!!
> 
> So happy they pulled it off.. Being an old AMD fanboy who switched many years ago, this is like a Rocky story!
> 
> That isn't a small margin either, that's pretty big!
> woot!!!


Those are AMD supplied benchmarks though, so expect them to be best case scenarios.


----------



## Agent Smith1984

How did they miss the mark on HDMI 2.0 though?

If you are looking at the Nano for your 4K living room setup, and you already have the 4k TV (with no DP)....

Say hello to 30FPS......
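The 30 FPS remark follows from HDMI 1.4's bandwidth ceiling. A rough back-of-the-envelope check (a sketch; the ~8.16 Gbit/s effective video rate for HDMI 1.4 and the ~10% blanking overhead are approximations):

```python
# Why HDMI 1.4 caps a 4K TV at 30 Hz: uncompressed 8-bit RGB arithmetic.
# HDMI 1.4 carries roughly 8.16 Gbit/s of video data; HDMI 2.0 raised that
# to roughly 14.4 Gbit/s, which is what 4K60 needs.

HDMI_1_4_GBPS = 8.16  # approximate effective video bandwidth

def video_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.1):
    """Approximate video data rate in Gbit/s, with ~10% blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for hz in (30, 60):
    rate = video_gbps(3840, 2160, hz)
    print(f"4K @ {hz} Hz needs ~{rate:.1f} Gbit/s "
          f"-> fits HDMI 1.4: {rate <= HDMI_1_4_GBPS}")
```

With these approximations 4K30 squeezes through but 4K60 needs well over the 1.4 limit, hence the complaint about living-room TVs without DisplayPort.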


----------



## hamzta09

http://www.inet.se/kampanj/2264/amd-300?Sort=price-a

300 series already in stock in Sweden.

I find the prices ridiculous for 390X. It should've been 1k less imo.


----------



## zeppoli

Quote:


> Originally Posted by *Forceman*
> 
> Those are AMD supplied benchmarks though, so expect them to be best case scenarios.


oh.. Then never mind.. I guess it will be the 980ti after all as the top dog. I'll wait for someone like Linus Tech to review it.

But it looks like if AMD is exaggerating, then the 980 Ti will come out on top with regular testers. If that's the case, well, that sucks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *zeppoli*
> 
> oh.. Then never mind.. I guess it will be the 980ti after all as the top dog. I'll wait for someone like Linus Tech to review it.
> 
> But it looks like if AMD is exaggerating, then the 980 Ti will come out on top with regular testers. If that's the case, well, that sucks.


Nobody said AMD is exaggerating anything....

They just happened to supply numbers for titles that favor their architecture....

There is no doubt in my mind that this card will compete directly with the 980 ti, no problem whatsoever.

The only thing it lacks is the VRAM, but look at what you are getting!!!

HBM
Watercooling
Small form factor....

The list goes on.

This is a big win for AMD in my opinion.

The only thing they missed was HDMI 2.0....


----------



## blue1512

Quote:


> Originally Posted by *MapRef41N93W*
> 
> That's 1-5 FPS in everything but Sleeping Dogs. In an AMD test environment.


Which is "an AMD test environment"?

FYI, the leaked test came from PCWorld lab.


----------



## hamzta09

http://www.overclock3d.net/reviews/gpu_displays/msi_r9_390x_gaming_8g_review/14

980 Ti gets rekt by the 290X and 390X in this one.


----------



## Casey Ryback

Quote:


> Originally Posted by *zeppoli*
> 
> oh.. Then never mind.. I guess it will be the 980ti after all as the top dog.


Ah AMD supplied them so it's obviously false great conclusion.


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.overclock3d.net/reviews/gpu_displays/msi_r9_390x_gaming_8g_review/14
> 
> 980 Ti gets rekt by the 290X and 390X in this one.


Yeah, because most reviews have the 980 beating out the 980 Ti. How does junk like that get published?


----------



## BackwoodsNC

I see a lot of titles on that list that used to perform better on green and are now performing better on red.







Oh wait, it's supplied from AMD so must be false.


----------



## hamzta09

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I see a lot of titles on that list that used to perform better on green and are now performing better on red.
> 
> 
> 
> 
> 
> 
> 
> Oh wait, it's supplied from AMD so must be false.


Plenty of reviews from various sites you know.


----------



## zeppoli

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nobody said AMD is exaggerating anything....
> 
> They just happened to supply numbers for titles that favor their architecture....
> 
> There is no doubt in my mind that this card will compete directly with the 980 ti, no problem whatsoever.
> 
> The only thing it lacks is the VRAM, but look at what you are getting!!!
> 
> HBM
> Watercooling
> Small form factor....
> 
> The list goes on.
> 
> This is a big win for AMD in my opinion.
> 
> The only thing they missed was HDMI 2.0....


Water cooling does not always equal better; many people hate water cooling. Personally, if they could have designed this without water, I'd be more impressed. I also have no clue where I would mount the rad.

HBM, yes! Small form factor? Doesn't matter to me, and I assume it doesn't matter to most here either, if they are fans of AMD lol.

What about the huge negative? The fact that this is the only top-tier card without a single DVI port. Just stupid! I asked 4 friends last night and none had DP on their monitor, so $650 + $75 = $725. It had better perform that much better than the 980 Ti.


----------



## PontiacGTX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Didn't you guys hear though, someone posted a source saying that there isn't any air cooled Fury X and that AMD locked out AIBs from modifying the reference card spec. *rolleyes*


it was watercooled.


----------



## Agent Smith1984

Fury will be air cooled, and $100 cheaper.

It should overclock well, and compete with the 980ti performance in most things....

I don't understand the problem


----------



## PontiacGTX

Quote:


> Originally Posted by *Forceman*
> 
> Those are AMD supplied benchmarks though, so expect them to be best case scenarios.


With the R9 290X, the AMD-supplied benchmarks all turned out to be true, but the performance difference here is so small...


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Didn't you guys hear though, someone posted a source saying that there isn't any air cooled Fury X and that AMD locked out AIBs from modifying the reference card spec. *rolleyes*


Consider the source, but this is a published article that, according to them, includes new details not previously available (including an AMD supplied benchmark comparison).
Quote:


> In case it isn't obvious yet, the Fury X uses a very unique design. So unique, in fact, that AMD's add-in board partners (like Asus, MSI, and Sapphire) won't be able to customize the card with their own cooling solutions. The Fury X will be reference design-only, though AIBs will be able to tinker with the air-cooled Radeon R9 Fury released in July.


http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html


----------



## TK421

Quote:


> Originally Posted by *Forceman*
> 
> Consider the source, but this is a published article that, according to them, includes new details not previously available (including an AMD supplied benchmark comparison).
> http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html


was hoping for a fury x lighting lol


----------



## flopper

Fastest single-GPU card in the world, Fury.
Thanks AMD


----------



## bkvamme

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.inet.se/kampanj/2264/amd-300?Sort=price-a
> 
> 300 series already in stock in Sweden.
> 
> I find the prices ridiculous for 390X. It should've been 1k less imo.


Yeah, those prices were very high. You can get a 4GB 290X for around 3000 NOK here, and the performance difference from the 390X shouldn't be that high. I was able to achieve an overclock of 1180-1190 MHz stable on my 290X, which is around the same as those 390Xes.


----------



## PontiacGTX

For those concerned about 4GB for 4K:
"We optimize our game performance so we don't see any limitation at all on 4K with 4 GB HBM. Even further to that, we know when Mantle came out, no one really had the benchmarks for Mantle because it was a new thing to the industry, right? And then when the draw-call benchmark came out, it was the coolest stuff, right?
Umm, so I think what you are going to see now as the industry moves forward with HBM, people are going to start thinking about memory management a lot differently. Before, your GDDR5 memory was just kind of like its own trash can. It just put stuff there and no one really thought about optimizing; like on the CPU side there's a lot more memory management, you know, back and forth for different workloads. So I think with HBM, with this really wide, ultra-bandwidth bus, it's going to change how people utilize memory resources, and it's going to improve over time with DirectX 12 and Vulkan APIs, and you'll get to see more HBM optimizations in gaming and applications."

Read more: http://wccftech.com/amd-radeon-r9-fury-x-official-specifications-pcb-shots-cooling-design-detailed/#ixzz3dQcO97d9
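For context on the "ultra-bandwidth bus" in that quote, the peak-bandwidth arithmetic for first-generation HBM versus the 290X's GDDR5 works out roughly as follows (a sketch using the commonly cited launch specs, not measured numbers):

```python
# Peak memory bandwidth: first-generation HBM on Fiji vs. GDDR5 on Hawaii.
# Figures are the commonly quoted launch specs, not measurements.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_width_bits * data_rate_gbps / 8

hbm = bandwidth_gbs(4096, 1.0)   # Fiji: 4096-bit bus, 1 Gbps per pin
gddr5 = bandwidth_gbs(512, 5.0)  # R9 290X: 512-bit bus, 5 Gbps per pin

print(f"Fiji HBM:   {hbm:.0f} GB/s")   # 512 GB/s
print(f"290X GDDR5: {gddr5:.0f} GB/s") # 320 GB/s
```

The bus is an order of magnitude wider but clocked far lower per pin, which is what the quote means by memory being used "differently" rather than just faster.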


----------



## Agent Smith1984

Guys...

understand that AMD being first with HBM on their GPU really is a big deal, and a huge triumph for the red team!


----------



## xer0h0ur

R9 Nano: http://wccftech.com/amd-radeon-r9-nano-detailed-features-fiji-gpu-175w-tdp-single-8pin-connector-sff-design-faster-hawaii/
"AMD confirmed to us that the Radeon R9 Nano is about 85-90% of Radeon R9 Fury X (in terms of performance)."

Project Quantum aka Fury X2: http://wccftech.com/wipamd-project-quantum/


----------



## blue1512

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys...
> 
> understand that AMD being first with HBM on their GPU really is a big deal, and a huge triumph for the red team!


Well deserved. They put money and effort into it, unlike NVIDIA, who spent money on gimmicks and PowerPoint slides.


----------



## p4inkill3r

Quote:


> Originally Posted by *PontiacGTX*
> 
> For those concerned about 4GB for 4K:
> "We optimize our game performance so we don't see any limitation at all on 4K with 4 GB HBM. Even further to that, we know when Mantle came out, no one really had the benchmarks for Mantle because it was a new thing to the industry, right? And then when the draw-call benchmark came out, it was the coolest stuff, right?
> Umm, so I think what you are going to see now as the industry moves forward with HBM, people are going to start thinking about memory management a lot differently. Before, your GDDR5 memory was just kind of like its own trash can. It just put stuff there and no one really thought about optimizing; like on the CPU side there's a lot more memory management, you know, back and forth for different workloads. So I think with HBM, with this really wide, ultra-bandwidth bus, it's going to change how people utilize memory resources, and it's going to improve over time with DirectX 12 and Vulkan APIs, and you'll get to see more HBM optimizations in gaming and applications."
> 
> Read more: http://wccftech.com/amd-radeon-r9-fury-x-official-specifications-pcb-shots-cooling-design-detailed/#ixzz3dQcO97d9


It sounds like memory handling is going to undergo a paradigm shift and that those holding on to the 'VRAM limitations' imposed by having 'just 4GB' may not be right.


----------



## xer0h0ur

I am skeptical about these optimizations but sure as hell hope they are right about it.


----------



## PontiacGTX

Quote:


> Originally Posted by *p4inkill3r*
> 
> It sounds like memory handling is going to undergo a paradigm shift and that those holding on to the 'VRAM limitations' imposed by having 'just 4GB' may not be right.


What if AMD managed to optimize the VRAM usage, given their GPUs share the same architecture? Or if the low-level APIs are all based on Mantle?


----------



## Agent Smith1984

Quote:


> Originally Posted by *blue1512*
> 
> Well deserved. They put money and effort into it, unlike NVIDIA, who spent money on gimmicks and PowerPoint slides.


Agreed...

AMD came to the table with a highly competitive new architecture that uses less power and a completely new VRAM architecture, and they are selling it to you at a good price...

I really don't understand any complaints other than the DVI and HDMI 2.0 stuff, which I do feel is justified, but not worth shunning this series over...


----------



## jerrolds

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Agreed...
> 
> AMD came to the table with a highly competitive new architecture that uses less power and a completely new VRAM architecture, and they are selling it to you at a good price...
> 
> I really don't understand any complaints other than the DVI and HDMI 2.0 stuff, which I do feel is justified, but not worth shunning this series over...


Agreed - I REALLY want a Fury X, but I can't justify getting a new monitor to replace my QNIX. Need a cheap adapter, or to wait for an AIB with DVI out







hopefully the latter is soon, and is also using the same watercooling solution.


----------



## Drebinx

Interesting choice of games and settings, not sure what to think....Definitely cannot wait til NDA on reviews lifts.


----------



## flopper

drawcalls


__ https://twitter.com/i/web/status/611579128941129728


----------



## p4inkill3r

Quote:


> Originally Posted by *dick_cheney*
> 
> Interesting choice of games and settings, not sure what to think....Definitely cannot wait til NDA on reviews lifts.


What's the source for that table?


----------



## Forceman

Quote:


> Originally Posted by *p4inkill3r*
> 
> What's the source for that table?


It's from AMD, same as the benchmarks posted earlier.


----------



## xer0h0ur

He probably got it from http://wccftech.com/amd-radeon-fury-official-gaming-benchmarks-fastest-singlegpu-graphics-card-world/


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> He probably got it from http://wccftech.com/amd-radeon-fury-official-gaming-benchmarks-fastest-singlegpu-graphics-card-world/


I haven't been on wccf yet today, thanks.


----------



## }SkOrPn--'

Quote:


> Originally Posted by *zeppoli*
> 
> What about the huge negative? The fact that this is the only top-tier card without a single DVI port. Just stupid! I asked 4 friends last night and none had DP on their monitor, so $650 + $75 = $725. It had better perform that much better than the 980 Ti.


This is why AMD is not getting my money for the Fury X. In order for me to run 120Hz on my 1440p IPS, I must have a native DVI port on my card of choice. AMD is stupid not to include a DVI port for the hundreds of millions of Korean DVI-only IPS monitors sold in the last few years. And adding a $75 to $100 adapter which does not officially support 120Hz is simply adding another layer prone to dying prematurely.

I just hope OEMs such as EVGA, Gigabyte and Asus find a way to include at least one DVI port. Otherwise I have a 27" 120Hz gaming monitor that I can't use for 120Hz gaming, unless of course I go with NVIDIA.
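As rough context for the 120Hz worry above: a quick pixel-clock estimate shows why 1440p at 120 Hz sits outside the dual-link DVI spec, and why the Korean panels only reach it by driving the link out of spec with tight blanking (a sketch; the ~12% blanking figure approximates CVT reduced-blanking timings):

```python
# Approximate pixel clock for 2560x1440 @ 120 Hz vs. the dual-link DVI limit.
# The ~12% blanking overhead approximates CVT reduced-blanking timings.

DUAL_LINK_DVI_MHZ = 330  # maximum effective pixel clock per the DVI spec

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Approximate pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

clk = pixel_clock_mhz(2560, 1440, 120)
print(f"1440p @ 120 Hz needs ~{clk:.0f} MHz pixel clock")
print("Within dual-link DVI spec:", clk <= DUAL_LINK_DVI_MHZ)
# QNIX-style overclockable monitors run this out of spec, which is
# why in-spec adapters rarely manage much beyond 60-75 Hz.
```

At 60 Hz the same formula lands comfortably under 330 MHz, which is why 1440p60 over dual-link DVI was never controversial.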


----------



## Agent Smith1984

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> This is why AMD is not getting my money for the Fury X. In order for me to run 120Hz on my 1440p IPS, I must have a native DVI port on my card of choice. AMD is stupid not to include a DVI port for the hundreds of millions of Korean DVI-only IPS monitors sold in the last few years. And adding a $75 to $100 adapter which does not officially support 120Hz is simply adding another layer prone to dying prematurely.
> 
> I just hope OEMs such as EVGA, Gigabyte and Asus find a way to include at least one DVI port. Otherwise I have a 27" 120Hz gaming monitor that I can't use for 120Hz gaming, unless of course I go with NVIDIA.


Why don't we wait for the retail package first, and see if it comes with an adapter, before we pass total judgement?


----------



## MiladEd

EVGA doesn't make AMD VGAs.


----------



## pdasterly

Damned if I do and damned if I don't.
I love the competition in the GPU arena, but it's still too early for me at least, because I need monitors and a GPU.
Obviously the GPU will dictate which monitors to buy (G-Sync or FreeSync).

I want a FreeSync 4K monitor, but I don't think they exist yet, and I need benchmarks for a dual-monitor 4K setup with either two Furys under water, or I'll get one Fury and wait until the dual-GPU Fury comes out later this year. Meanwhile my dismantled machine continues to collect dust.


----------



## HiTechPixel

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> This is why AMD is not getting my money for the Fury X. In order for me to run 120Hz on my 1440p IPS, I must have a native DVI port on my card of choice. AMD is stupid not to include a DVI port for the hundreds of millions of Korean DVI-only IPS monitors sold in the last few years. And adding a $75 to $100 adapter which does not officially support 120Hz is simply adding another layer prone to dying prematurely.
> 
> I just hope OEMs such as EVGA, Gigabyte and Asus find a way to include at least one DVI port. Otherwise I have a 27" 120Hz gaming monitor that I can't use for 120Hz gaming, unless of course I go with NVIDIA.


Millions of people bought cheap knockoff Korean monitors? Nope. More like a couple of thousand, like 3000. You guys are an extremely niche market stuck in the past. You're holding back technology.


----------



## Drebinx

Quote:


> Originally Posted by *xer0h0ur*
> 
> He probably got it from http://wccftech.com/amd-radeon-fury-official-gaming-benchmarks-fastest-singlegpu-graphics-card-world/


This...not that I trust WCCF that much but it seemed legitimate.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> Millions of people bought cheap knockoff Korean monitors? Nope. More like a couple of thousand, like 3000. You guys are an extremely niche market stuck in the past. You're holding back technology.


Couldn't agree more. Buy cheap, get cheap. DVI was bound to go the way of the dinosaur, like VGA. Far too bandwidth-limited to be a relevant connection at this point.


----------



## jerrolds

Quote:


> Originally Posted by *HiTechPixel*
> 
> Millions of people bought cheap knockoff Korean monitors? Nope. More like a couple of thousand, like 3000. You guys are an extremely niche market stuck in the past. You're holding back technology.


Well, maybe thousands on OCN, but probably tens or even hundreds of thousands for those that actually live in South Korea, where QNIX/XStar/Catleap/etc. are sold at retail.

I can see why AMD chose not to include it: can't daisy-chain, stiff cables, more components adding cost... Still hoping for a third-party AIB card to include DVI, since no monitor is worth upgrading to over the QNIX for me atm. If only the Acer Predator ultrawide were 100Hz+...


----------



## pdasterly

Quote:


> Originally Posted by *dick_cheney*
> 
> This...not that I trust WCCF that much but it seemed legitimate.


17.2 TFLOPS


----------



## boredmug

Quote:


> Originally Posted by *jerrolds*
> 
> Well, maybe thousands on OCN, but probably tens or even hundreds of thousands for those that actually live in South Korea, where QNIX/XStar/Catleap/etc. are sold at retail.
> 
> I can see why AMD chose not to include it: can't daisy-chain, stiff cables, more components adding cost... Still hoping for a third-party AIB card to include DVI, since no monitor is worth upgrading to over the QNIX for me atm. If only the Acer Predator ultrawide were 100Hz+...


http://www.overclock.net/t/1537403/tftcentral-acer-predator-xr341ck-34-curved-gaming-screen-with-g-sync According to this it is???


----------



## Drebinx

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> This is why AMD is not getting my money for the Fury X. In order for me to run 120Hz on my 1440p IPS, I must have a native DVI port on my card of choice. AMD is stupid not to include a DVI port for the hundreds of millions of Korean DVI-only IPS monitors sold in the last few years. And adding a $75 to $100 adapter which does not officially support 120Hz is simply adding another layer prone to dying prematurely.
> 
> I just hope OEMs such as EVGA, Gigabyte and Asus find a way to include at least one DVI port. Otherwise I have a 27" 120Hz gaming monitor that I can't use for 120Hz gaming, unless of course I go with NVIDIA.


While it sucks, I think people are blowing the lack of DVI out of proportion; most people have HDMI and/or DP. The DVI standard is 15 years old, and in the age of DP and HDMI 2.0 there has to be a time when manufacturers phase out legacy connections and somewhat-legacy standards. AMD announced way back in 2010 (along with Intel and others) that they would be phasing out DVI/VGA/LVDS by 2014/2015 anyway, so it should come as no surprise.

I'd be more disappointed in the lack of HDMI 2.0 for such a "next-gen" card, but I suppose each card has its strong suits... can't have it all.


----------



## flopper

Quote:


> Originally Posted by *dick_cheney*
> 
> While it sucks, I think people are blowing the lack of DVI out of proportion; most people have HDMI and/or DP. The DVI standard is 15 years old, and in the age of DP and HDMI 2.0 there has to be a time when manufacturers phase out legacy connections and somewhat-legacy standards. AMD announced way back in 2010 (along with Intel and others) that they would be phasing out DVI/VGA/LVDS by 2014/2015 anyway, so it should come as no surprise.
> 
> I'd be more disappointed in the lack of HDMI 2.0 for such a "next-gen" card, but I suppose each card has its strong suits... can't have it all.


I have 8-year-old 120Hz screens that have DVI/VGA, so for me it's time to upgrade to more modern DP ones for sure.
Never been a fan of adapters.


----------



## jerrolds

Quote:


> Originally Posted by *boredmug*
> 
> http://www.overclock.net/t/1537403/tftcentral-acer-predator-xr341ck-34-curved-gaming-screen-with-g-sync According to this it is???


Yes, but... I follow that thread, and I could have sworn a rep "confirmed" that it was 100Hz... but then it was quickly denied by another rep, who said 75Hz.

I'll have to go back again lol


----------



## boredmug

Quote:


> Originally Posted by *jerrolds*
> 
> Yes, but... I follow that thread, and I could have sworn a rep "confirmed" that it was 100Hz... but then it was quickly denied by another rep, who said 75Hz.
> 
> I'll have to go back again lol


Ah.. You are probably right then. I just hopped over there to check out the monitor as I'm interested in replacing my eyefinity setup with a super wide at some point.


----------



## }SkOrPn--'

Quote:


> Originally Posted by *dick_cheney*
> 
> While it sucks, I think people are blowing the lack of DVI out of proportion; most people have HDMI and/or DP. The DVI standard is 15 years old, and in the age of DP and HDMI 2.0 there has to be a time when manufacturers phase out legacy connections and somewhat-legacy standards. AMD announced way back in 2010 (along with Intel and others) that they would be phasing out DVI/VGA/LVDS by 2014/2015 anyway, so it should come as no surprise.
> 
> I'd be more disappointed in the lack of HDMI 2.0 for such a "next-gen" card, but I suppose each card has its strong suits... can't have it all.


Yeah, agreed, the lack of HDMI 2.0 is also a killer for me. However, I am not getting another new monitor; my 1440p IPS at 120Hz is plenty damn wonderful as is. When it eventually dies, then I will look for another monitor. If you don't have a DVI port, then you're not getting my money this year, period. In 2020 or so, when I get a 4K or 8K monitor "that is actually worthy of my money", it will be DP- or Thunderbolt-based, or whatever the best tech is by then. lol

When I was researching building my own brand-name gaming monitors last year, I spoke with most of these suppliers, and one of them told me they couldn't even keep up with the demand they already had in 2013 and 2014, because they were selling roughly 1 million a month. So the claim of 3,000 in total is utter BS. It was more like 3,000 every hour, 365 days a year, all through 2012, 2013 and maybe even 2014. Gamers (not stupid rich kids with money to blow) want fast gaming displays that look good at an affordable price, and NONE of the name brands came through, not for less than $1,000 or so; it's just now starting to happen in 2015. The only real option for the normal casual gamer who wasn't wealthy was the Korean (LG- and Samsung-panel-based) 1440p displays with a DVI-only port, QNIX or Yamakasi etc...

I can understand why AMD did not include DVI on their new Fury X, but with 99% or more of displays still having this connection, the only real loser is AMD. I know this for a fact, because they already lost my money and that of everyone I know. I don't know a single person with a DP-based monitor yet.

Point being, I will never buy a new monitor just to enjoy a newly released video card; that's not ever happening here. I will buy a new monitor when the old one dies on me, and *only if I can't repair it myself*, lol.


----------



## Drebinx

Quote:


> Originally Posted by *}SkOrPn--'*
> 
> Yeah, agreed, the lack of HDMI 2.0 is also a killer for me. However, I am not getting another new monitor; my 1440p IPS at 120Hz is plenty damn wonderful as is. When it eventually dies, then I will look for another monitor. If you don't have a DVI port, then you're not getting my money this year, period. In 2020 or so, when I get a 4K or 8K monitor "that is actually worthy of my money", it will be DP- or Thunderbolt-based, or whatever the best tech is by then. lol
> 
> When I was researching building my own brand-name gaming monitors last year, I spoke with most of these suppliers, and one of them told me they couldn't even keep up with the demand they already had in 2013 and 2014, because they were selling roughly 1 million a month. So the claim of 3,000 in total is utter BS. It was more like 3,000 every hour, 365 days a year, all through 2012, 2013 and maybe even 2014. Gamers (not stupid rich kids with money to blow) want fast gaming displays that look good at an affordable price, and NONE of the name brands came through, not for less than $1,000 or so; it's just now starting to happen in 2015. The only real option for the normal casual gamer who wasn't wealthy was the Korean (LG- and Samsung-panel-based) 1440p displays with a DVI-only port, QNIX or Yamakasi etc...
> 
> I can understand why AMD did not include DVI on their new Fury X, but with 99% or more of displays still having this connection, the only real loser is AMD. I know this for a fact, because they already lost my money and that of everyone I know. I don't know a single person with a DP-based monitor yet.
> 
> Point being, I will never buy a new monitor just to enjoy a newly released video card; that's not ever happening here. I will buy a new monitor when the old one dies on me, and *only if I can't repair it myself*, lol.


1 million a month in sales maybe, but definitely not monitors. Only 99.6 million monitors were sold last year (source: IDC), so I honestly doubt niche Korean IPS panels account for 12% of the market yet somehow didn't even make it into the top 5 (number 5 being LG, with an average of 3M sold per quarter). Hopefully your monitor lasts you a few more years; I also await a good, fast 4K/5K IPS monitor.


----------



## Ceadderman

Quote:


> Originally Posted by *bkvamme*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> Can anyone clear **** up for me?
> 
> Are any of the r9-300 series NEW silicon? Are they all rebrands with different clocks/memory configs? I just got an email from newegg, but looking at shader counts, they all look like rebrands to me.
> 
> 
> 
> All Rx 3xx are rebrands. Some have some minor tweaks, which could count as a upgrade, but it is still based on the previous silicon.
Click to expand...

So the Fury, Nano and X2 cards are rebrands? GTK.









I am sick of hearing about "rebrands". If you have a better idea, then submit it to the developer of your choice, because even NVIDIA puts out mostly rebrands to get 1-3 new products out there in a timely fashion.









~Ceadder


----------



## HiTechPixel

Quote:


> Originally Posted by *Ceadderman*
> 
> So the Fury, Nano and X2 cards are rebrands? GTK.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am sick of hearing about "rebrands". If you have a better idea, then submit it to the developer of your choice, because even NVIDIA puts out mostly rebrands to get 1-3 new products out there in a timely fashion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


The Fury, Fury X, Nano and Fury X2 are not 3xx cards. They're their own line of cards, like the Nvidia Titan lineup. So yes, every single R9 3xx card is a rebrand.


----------



## xer0h0ur

Yeah, but if the specifications these manufacturers keep listing are accurate, then they made quite a few changes to get power consumption and heat output down.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah but if the specifications these manufacturers keep listing is accurate then they made quite a bit of changes to get power consumption and heat output down.


Oh yeah, no doubt about that. I think AMD made some changes to the GPUs, but they're still by and large the same GPUs we've seen before, just with new names and some cool new improvements/features.


----------



## rv8000

Why are people still on about this HDMI/DP stuff? The exclusion of HDMI 2.0 is based on a forum post, albeit from a reputable person. Just wait for the thing to be released and the reviews to be published (which will show package contents as well)...


----------



## HiTechPixel

Quote:


> Originally Posted by *rv8000*
> 
> Why are people still on about this HDMI/dP stuff, the exclusion of 2.0 is based off a forum post, albeit from a reputable person, just wait for the thing to be released and the reviews to be published (to show package contents as well)...


Although I exclusively use DisplayPort, the HDMI complaint is valid. Very much so. There should be no reason for there not to be HDMI 2.0 on the card. The Nano, for example, is a short but powerful card aimed at the ITX and HTPC market. And if the people in the HTPC market can't use their shiny 4K TVs, then why should they buy the Nano?


----------



## Forceman

Quote:


> Originally Posted by *rv8000*
> 
> Why are people still on about this HDMI/dP stuff, the exclusion of 2.0 is based off a forum post, albeit from a reputable person, just wait for the thing to be released and the reviews to be published (to show package contents as well)...


It's also in that PCWorld article, which is supposed to be sourced from AMD directly.


----------



## rv8000

Quote:


> Originally Posted by *HiTechPixel*
> 
> Although I exclusively use DisplayPort, the HDMI complaint is valid. Very much so. There should be no reason for there not to be HDMI 2.0 on the card. The Nano, for example, is a short but powerful card aimed at the ITX and HTPC market. And if the people in the HTPC market can't use their shiny 4K TVs, then why should they buy the Nano?


And we know almost nothing about its actual specs (the Nano, that is). There are also things we still don't know about the Fury X and Fury, and we know nothing about packaging and adapters either.


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Why are people still on about this HDMI/dP stuff, the exclusion of 2.0 is based off a forum post, albeit from a reputable person, just wait for the thing to be released and the reviews to be published (to show package contents as well)...


About as reliable as a chimpanzee...? Either he made a mistake, or he didn't. OcUK deleted almost the whole thread, so I'm guessing the post was more wrong than right. This is the same reputable guy who stole my how-to thread some time ago.


----------



## hyp36rmax

So much speculation about HDMI 2.0. I went ahead and asked Roy @ AMD via Twitter; hopefully we'll get an answer.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> Although I exclusively use DisplayPort, the HDMI complaint is valid. Very much so. There should be no reason for there not to be HDMI 2.0 on the card. The Nano, for example, is a short but powerful card aimed at the ITX and HTPC market. And if the people in the HTPC market can't use their shiny 4K TVs, then why should they buy the Nano?


Yeah, I am able to look past chopping off the DVI port, but there isn't any reason they shouldn't have included HDMI 2.0, considering damn near no TVs have DP.
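To put some numbers on the HDMI 2.0 complaint, here's a rough sketch. It assumes the standard CTA-861 timing for 3840x2160 (a 4400x2250 total raster including blanking) and the commonly cited TMDS clock ceilings for HDMI 1.4 and 2.0; treat the exact figures as ballpark, not gospel:

```python
# Back-of-the-envelope check: pixel clock needed for UHD vs. HDMI limits.
# Assumes the CTA-861 4K timing (4400 x 2250 total raster incl. blanking).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total raster size and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

HDMI_1_4_MAX_MHZ = 340  # commonly cited TMDS clock ceiling, HDMI 1.4
HDMI_2_0_MAX_MHZ = 600  # commonly cited TMDS clock ceiling, HDMI 2.0

uhd_60 = pixel_clock_mhz(4400, 2250, 60)  # 594.0 MHz
uhd_30 = pixel_clock_mhz(4400, 2250, 30)  # 297.0 MHz

print(f"4K@60Hz: {uhd_60:.0f} MHz, fits HDMI 1.4: {uhd_60 <= HDMI_1_4_MAX_MHZ}")
print(f"4K@30Hz: {uhd_30:.0f} MHz, fits HDMI 1.4: {uhd_30 <= HDMI_1_4_MAX_MHZ}")
print(f"4K@60Hz fits HDMI 2.0: {uhd_60 <= HDMI_2_0_MAX_MHZ}")
```

That 594 MHz vs. 340 MHz gap is why a card whose HDMI port is limited to 1.4 tops out at 4K@30Hz on a TV, and why people here keep harping on HDMI 2.0.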


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am able to look past chopping off the DVI port but there isn't any reason they shouldn't have included HDMI 2.0 considering damn near no TVs have DP.


That I agree with; it should have had HDMI 2.0 for sure..... especially the Nano.


----------



## zeppoli

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Agreed...
> 
> I really don't understand any complaints other than the DVI and HDMI 2.0 stuff, which I do feel is a justified, but not worth shunning this series...


Because for very few people the card will cost $650.00; for most it will cost $675.00-$700.00 with a cheap adapter, and for many others $725.00 or even higher with an active adapter.

It's not like there's no room for a DVI port. It's a huge fail; they must realize that most people do NOT have DisplayPort monitors.


----------



## xer0h0ur

Get it through your head already. AMD had been planning to phase out DVI ports since 2010. Buy a new monitor or get bent and buy a green team card that should last you like a year and change before they start gimping it for Pascal performance.


----------



## rv8000

Quote:


> Originally Posted by *tsm106*
> 
> About as reliable as a chimpanzee...? Either he made a mistake, or he didn't. Ocuk deleted almost the whole thread so I'm guessing the post was more wrong then right. This is the same reputable guy who stole my how to thread sometime ago.


Shame on him then


----------



## bobbavet

TBH I feel AMD has wasted a whole lot of time, money, and effort on this 300 series.

It would have been a better investment to release Fury as is and launch straight into the next series based on Fury tech:

HBM1/2 and getting on with 14/16nm processes.


----------



## rv8000

Quote:


> Originally Posted by *bobbavet*
> 
> TBH I feel AMD have wasted a whole lot of time, money and effort on this 300 series.
> 
> Would have been better investment to have Fury as is and launch into you next series based on Fury tech.
> 
> HBM1/2 and getting on with 14/16nm processors.


What waste of time?

Board partners were likely responsible for all the PCB updates; it's not like the 390X, 390, 380, and 370 are anything new. As for the Fury X, Fury, and Nano, those are most likely all geared towards the node shrink in 2016.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Get it through your head already. AMD had been planning to phase out DVI ports since 2010. Buy a new monitor or get bent and buy a green team card that should last you like a year and change before they start gimping it for Pascal performance.


Or just simply buy an HDMI adapter (you can get them for under $5) and connect it that way. What in Sam Hades is so difficult about that?









I stopped using DVI the moment I had HDMI capability.









~Ceadder


----------



## xer0h0ur

I don't know if you guys realize that the upcoming node shrink is actually not true 16 nanometer. True high performance 16 nanometer won't be ready for quite a while. This is going to be a hybrid process with 20 nanometer interconnects using 16 nanometer transistors if memory serves me right.


----------



## Forceman

Quote:


> Originally Posted by *Ceadderman*
> 
> Or just simply buy an HDMI adapter(you can get them cheaper than $5) and connect it that way. What in Sam hades is so difficult about that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I stopped using DVI the moment I had HDMI capability.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


The people that care do so because they have overclockable monitors and you can't overclock with an adapter. So it isn't as easy as "use an adapter". It isn't a show stopper, but if you already have a perfectly good 1440p monitor, you'd kind of like to be able to continue using it, and not have to shell out $500 for a replacement. Hopefully some versions of the Fury non-X have DVI.


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> About as reliable as a chimpanzee...? Either he made a mistake, or he didn't. Ocuk deleted almost the whole thread so I'm guessing the post was more wrong then right. This is the same reputable guy who stole my how to thread sometime ago.
> 
> 
> 
> Shame on him them

Concur. What's more sad is that they made him a rep lol.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know if you guys realize that the upcoming node shrink is actually not true 16 nanometer. True high performance 16 nanometer won't be ready for quite a while. This is going to be a hybrid process with 20 nanometer interconnects using 16 nanometer transistors if memory serves me right.


16nm FinFET is what it's called, no? I might be horribly mistaken.


----------



## glenn37216

So benchmarks are appearing with PhysX and HairWorks turned off. Guess all that power from AMD still runs like crap in GameWorks titles. So it looks like the real dilemma is: do you want nicer PhysX effects and beautiful hair, or do you want an extra 5-10% FPS boost instead?


----------



## Ceadderman

Quote:


> Originally Posted by *glenn37216*
> 
> So benchmarks are appearing with physx and hairworks turned off. Guess all that power by amd still runs like crap with gameworks titles. so looks like the real delima is... Do you want nicer physx models and beautiful hair or do you want an extra 5-10% fps boost instead?












~Ceadder


----------



## jerrolds

Quote:


> Originally Posted by *glenn37216*
> 
> So benchmarks are appearing with physx and hairworks turned off. Guess all that power by amd still runs like crap with gameworks titles. so looks like the real delima is... Do you want nicer physx models and beautiful hair or do you want an extra 5-10% fps boost instead?


PhysX and HairWorks are Nvidia technology, right? Why not complain that the Fury cards don't work with G-Sync?

Assuming you're right, I suppose you can get better PhysX and HairWorks performance in the handful of games that use them... looks like 3, haha: http://physxinfo.com/wiki/HairWorks#Games_and_game_engine_integrations

And about the same number that support PhysX: http://physxinfo.com/wiki/Upcoming_GPU_PhysX_games

Versus 5-10% faster FPS across the board...

According to this, you can run HairWorks on AMD cards without much of a performance hit: http://wccftech.com/witcher-3-run-hairworks-amd-gpus-crippling-performance/

Not much of a dilemma to me


----------



## zeppoli

Quote:


> Originally Posted by *xer0h0ur*
> 
> Get it through your head already. AMD had been planning to phase out DVI ports since 2010. Buy a new monitor or get bent and buy a green team card that should last you like a year and change before they start gimping it for Pascal performance.


LMAO! Are you kidding? You do realize they JUST released the 300 series? Refresh or not, video cards should still have DVI ports. Actually, there is NOTHING wrong with a DVI-D port, which is why almost all monitors have one. Even among people buying this card or a 980 Ti, I bet less than 5% have 4K displays, and without a 4K display, DVI port or DP, it's all the same.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah I am able to look past chopping off the DVI port but there isn't any reason they shouldn't have included HDMI 2.0 considering damn near no TVs have DP.


Not long ago few monitors, if any, had DisplayPort; now a lot of them do.
DP is the better tech.

I am upgrading my monitors to DP and 120Hz+;
they are 8 years old, so it's about time I did that.


----------



## glenn37216

Quote:


> Originally Posted by *jerrolds*
> 
> PhysX and Hairworks are Nvidia technology right? Why not complain that the Fury cards dont work with Gsync.
> 
> Assuming youre right i suppose you can get better physx and hairworks performance on the handful of games ...looks like 3 haha http://physxinfo.com/wiki/HairWorks#Games_and_game_engine_integrations
> 
> And about the same number that support PhysX http://physxinfo.com/wiki/Upcoming_GPU_PhysX_games
> 
> Vs 5-10% faster fps across the board...
> 
> According to this you can run Hairworks on AMD cards without much of a performance hit http://wccftech.com/witcher-3-run-hairworks-amd-gpus-crippling-performance/
> 
> Not much of a dilemma to me

Appreciate the sarcasm, but that still doesn't settle the debate. Newer upcoming titles like Batman, for example, use GameWorks tech to enhance physics, not just HairWorks. From what I just saw in leaked benchmarks, the Fury in all its FPS glory still can't produce the quality a 680 can with PhysX and GameWorks enhancements turned on. More FPS means nothing if the GPU can't emulate or work around Nvidia's tech the way Nvidia does in AMD-enhanced games, e.g. DiRT Rally and Star Wars Battlefront.


----------



## Mega Man

Your argument is invalid. AMD lets Nvidia optimize for their games; Nvidia refuses to do the same. Most of the time they don't even let the game developers look at the code, and in the rare instances they do, they force an NDA so the devs can't discuss it with AMD at all.

Simple solution: don't support GameWorks games and it will go away. Like all Nvidia tech, it is not good for the industry; it is only good for Nvidia's pockets.


----------



## flopper

Quote:


> Originally Posted by *Mega Man*
> 
> your argument is invalid, AMD lets nvidia optimize their games, nvidia refuses to do so, most of the time they dont even let the game developers look at it, but some rare instances they do, then they force a nda so that they cant talk with it to amd at all.
> 
> simple solution dont support gameworks games and it will go away, like all nvidia tech, it is not good for the industry it is only good for nvidia pockets


It's not good for us PC gamers either.
Thanks to AMD, I finally feel a revival coming in that arena.


----------



## djsatane

Quote:


> The AMD Radeon R9 Fury X will officially be available on-shelves on the 24th of June.


Sure... but what are the chances of actually being able to buy one?


----------



## gatygun

New GPU, people: the Fiji Titanic.




Spoiler: Warning: Spoiler!



Made up with my pro paint skills, make it happen amd


----------



## Sgt Bilko

Quote:


> Originally Posted by *gatygun*
> 
> New gpu people fuji titanic.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Made up with my pro paint skills, make it happen amd


Haha, looks great!

In all seriousness though, this would be possible, wouldn't it?

Just another PLX chip and boom..... single-slot Tri-Fire


----------



## HiTechPixel

Quote:


> Originally Posted by *djsatane*
> 
> - sure.... what are the chances of one actually being able to buy it


Retailers have already had them in stock for weeks; they're just waiting for the GO! from AMD.


----------



## joeh4384

I am looking forward to the reviews next week.


----------



## jerrolds

Really hoping AMD finally solved the CFX microstutter problem that affected everything including the 290 series (even though it was slightly better last time around).

Or has it been fixed recently? I know during the first few months after the 290's release it wasn't perfected yet.


----------



## rdr09

Quote:


> Originally Posted by *jerrolds*
> 
> Really hoping AMD finally solved the CFX microstutter problem that affected everything including the 290 series (even tho it was slightly better the last time around)


i have zero stutter with my 2 290s. my i7 SB handles them beautifully. i dont play the latest games, though, especially gameworks. zero.


----------



## boredmug

Quote:


> Originally Posted by *rdr09*
> 
> i have zero stutter with my 2 290s. my i7 SB handles them beautifully. i dont play the latest games, though, especially gameworks. zero.


I don't notice stutter with my 290x's either.


----------



## zeppoli

Quote:


> Originally Posted by *rdr09*
> 
> i have zero stutter with my 2 290s. my i7 SB handles them beautifully. i dont play the latest games, though, especially gameworks. zero.


Yes, you do... or you just don't know what they are.

Every single CF setup has them. What older games do you play? I'd like to try.

Thanks


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> yes you do.. Or you just don't know what they are..
> 
> Every single CF setup has them. what older games do you play? I'd like to try.
> 
> Thanks


I've had a 7900 series briefly, before frame pacing came out, so I know. Zero stutter in BF4, C3, C2, Skyrim, Grid, PS2, etc.

They run like one card. I know your issue . . . temps.


----------



## BackwoodsNC

They fixed that stutter with Hawaii; notice there are no crossfire fingers, because they used a hardware fix (XDMA over PCIe). The lower tiers that still have crossfire fingers still have an issue; they never really fixed it on those even with the software fix. It helped a bit, but it isn't fixed.


----------



## tsm106

http://www.overclock.net/t/1559813/155fps-to-63fps-back-to-155fps-r9-290-cf/0_40#post_24022785

lol trollbait


----------



## Casey Ryback

GTAV frametimes.

http://www.guru3d.com/articles-pages/gta-v-pc-graphics-performance-review,8.html

Couple of tiny spikes but looks pretty good.


----------



## jerrolds

Quote:


> Originally Posted by *Casey Ryback*
> 
> GTAV frametimes.
> 
> http://www.guru3d.com/articles-pages/gta-v-pc-graphics-performance-review,8.html
> 
> Couple of tiny spikes but looks pretty good.


Yup, I stand corrected; it looks like AMD largely fixed microstutter, at least on the 290 series.

http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,32.html

Frametimes are pretty tight on the 295x2.
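For anyone wanting to sanity-check their own setup: microstutter shows up as frame-to-frame variance in a frametime log, not as low average FPS. A minimal sketch of that idea, using made-up sample numbers (not from any review) in the style of a FRAPS/PresentMon per-frame log:

```python
import statistics

def frametime_report(frametimes_ms):
    """Summarize a list of per-frame render times (milliseconds)."""
    avg = statistics.mean(frametimes_ms)
    p99 = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99) - 1]
    # Frame-to-frame deltas: big jumps here are what you perceive as
    # stutter, even when the average FPS looks perfectly fine.
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return {
        "avg_fps": 1000 / avg,
        "99th_percentile_ms": p99,
        "max_delta_ms": max(deltas),
    }

# Hypothetical logs: nearly identical average FPS, very different smoothness.
smooth   = [16.7] * 99 + [17.0]
stuttery = [8.0, 25.4] * 50   # alternating fast/slow frames = microstutter

print(frametime_report(smooth))
print(frametime_report(stuttery))
```

Both logs average roughly 60 FPS, but the second one swings ~17 ms between consecutive frames, which is exactly the pattern frame pacing was meant to smooth out.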


----------



## zeppoli

Quote:


> Originally Posted by *rdr09*
> 
> i 've had 7900 series briefly before frame pacing came out, so i know. zero, BF4, C3, C2, Skyrim, Grid, PS2, etc.
> 
> They are like one car. i know your issue . . . temp.


Naa, my temps are flawless: 80C on the top card, 67C on the bottom card, and 70-85C on the VRMs of both cards at load.

I will agree that BF4 and C3 work great with CF, but GTA V and many other AAA games have stutters.


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> naa, my temps are flawless, 80c for top card, 67c bottom card, 70-85 on VRM both cards at load.
> 
> I will agree, BF4, C3 do work great with CF, but GTA V and many other AAA games have them


Choose to disregard like before . . .






edit: btw, i agree with tsm.


----------



## Agent Smith1984

I saw micro stutter in two situations with my setup....

The first, was when I had accidently applied a much higher clock to the primary card, and no overclock to second.

There seemed to be these instances where the load would drop on the primary, and get a slight skip.....

Once I got the clocks locked, I never saw it happen again.

The only other case I saw CF issues, was with Skyrim.... it was getting 150+ FPS, but the character would barely move, and there was skipping, much like when you have a horrible ping on a BF4 map.

THAT'S IT THOUGH.....

Crossfire has performed beautifully for me, and I am looking forward to doing it again.


----------



## the9quad

When Crossfire works it's glorious; when it doesn't, it's depressing. More and more newer games have issues. Crossfire is why I no longer buy any games day one; my level of anger has gone down to zero since I made that decision. It's why I don't play Dying Light and Far Cry 4 even though I own them, and why I didn't buy The Witcher 3. Frostbite seems to work great, and I wouldn't hesitate to buy a Frostbite-engine game day one, though. Not knocking CFX, but I'm being realistic: if a game supports it I'll buy it; if it doesn't, I'll wait for a sale.


----------



## zeppoli

Quote:


> Originally Posted by *rdr09*
> 
> Choose to disregard like before . . .
> 
> 
> 
> 
> 
> 
> edit: btw, i agree with tsm.


Maybe Eyefinity has something to do with it, but a simple Google search and, well...

https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=crossfire%20gta%20v


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> Maybe the Eyefinity has something to do with it, but a simple google search and
> 
> well
> 
> https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=crossfire%20gta%20v


The triple-monitor setup made things more complicated.


----------



## flopper

http://forums.overclockers.co.uk/showpost.php?p=28199976&postcount=706
Quote:


> Just to confirm.
> 
> The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.
> 
> In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.


----------



## tsm106

It doesn't make sense to devote all that bandwidth to decoding x265 and not include HDMI 2.0. Who the hell do they think x265 is aimed at but 4K TV users? Yet those 4K TV users will not be able to connect their TVs directly to their GPUs, which sort of defeats the point in the first place.


----------



## boredmug

Quote:


> Originally Posted by *the9quad*
> 
> When crossfire works its glorious, when it doesnt its depressing. More and more newer games have issues. Crossfire is why i no longer buy any games day one. My level of anger has gone down to zero since i made that decision. Its why i dont play dying light and farcry 4 even though i own them. Its why i didnt buy the witcher 3. Frostbite seems to work great and i wouldnt heaitate to buy a frostbite engine game day one though. not knocking cfx but im being realistic. If a game supports it ill buy it, if it doesnt ill wait for a sale.


What issues do you have with Dying Light? It runs great for me, though there are some strobing issues at the main screen or when your character gets splattered with blood.


----------



## Kane2207

Quote:


> Originally Posted by *flopper*
> 
> http://forums.overclockers.co.uk/showpost.php?p=28199976&postcount=706


Yay dongles!

Are AMD Apple now?


----------



## Agent Smith1984

They can throw in one of those multi purpose crazy looking I/O dongles that used to come with the Radeon All-In-Wonders back in the day!!! HAHAHA


----------



## Hazardz

Not sure if this was posted, but here's a reviewer's guide to the Fury X.

http://videocardz.com/56728/amd-radeon-r9-fury-x-reviewers-guide


----------



## zeppoli

Quote:


> Originally Posted by *Hazardz*
> 
> Not sure if they was posted but here's a reviewer's guide to the Fury X.
> 
> http://videocardz.com/56728/amd-radeon-r9-fury-x-reviewers-guide


Thanks; looks like the stock benchmarks plus some more information.
I'm gonna wait for the video showing the side-by-sides, and a few reviews.


----------



## Gregster

Quote:


> I have being playing today, but I either have a driver issue or the rig I was using had NVIDIA sabotage going on as the performance was not as per expectation, should of being quicker.
> 
> Now got the Fury home with me and shall play with it over weekend comparing to my 290X and 980Ti G1. But from the press pack I have the Fury X is a 980Ti competitor, not a beater and not a loser.
> 
> Also NVIDIA are not going to move 980Ti price, simply no way as they know MSRP on Fury X is around £549 inc. VAT, same price as their 980Ti which they will argue has more features and more VRAM.
> 
> Also NVIDIA will not move price when AMD has such limited stocks and all resellers are price gouging and sell way over £600 due to such limited stocks.
> 
> This sub £500 980Ti is a truly genuine deal, a very silly one at that and yes OcUK does lose nearly £3 a card, this is purely a marketing stunt. Take one of the best 980Ti on the market with best warranty and then move sub £500 and make a load of noise!! Marketing!!!!




Interesting. So other than having some teething issues, Gibbo at OcUK has said it competes with the 980 Ti, winning some and losing some.


----------



## zeppoli

Why in the world would they have "limited stock"? It's not an iPhone; there will not be thousands lined up outside a retailer, lol. Hundreds, maybe a few thousand worldwide at best.

And I assume AMD knows this. Especially if reviews say it's not beating the 980 Ti, or even if it only matches it, all those with DVI-only monitors and no brand loyalty will choose the obvious: Nvidia.


----------



## Kuivamaa

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I saw micro stutter in two situations with my setup....
> 
> The first, was when I had accidently applied a much higher clock to the primary card, and no overclock to second.
> 
> There seemed to be these instances where the load would drop on the primary, and get a slight skip.....
> 
> Once I got the clocks locked, I never saw it happen again.
> 
> The only other case I saw CF issues, was with Skyrim.... it was getting 150+ FPS, but the character would barely move, and there was skipping, much like when you have a horrible ping on a BF4 map.
> 
> THAT'S IT THOUGH.....
> 
> Crossfire has performed beautifully for me, and I am looking forward to doing it again.


Skyrim at 150fps, the horror. The engine goes screwy when you remove vsync; this was probably your issue. And then people wonder why we get upset at the idea that Fallout 4 will be based on a Gamebryo engine derivative.


----------



## hamzta09

Quote:


> Originally Posted by *Kuivamaa*
> 
> Skyrim and 150fps, the horror. Engine goes screwy when you remove vsync, this was probably your issue . And then people wonder why we get upset on the idea that fallout 4 will be based on a gamebryo engine derivative.


Not capping fps to 60 was the issue.


----------



## flopper

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> Interesting, so other than having some teething issues, Gibbo at OcUK has said it competes with the 980Ti and wins some and loses some.


Without much thinking, I assume minimum FPS will be unprecedented with the Fury; that's the main metric if you game at wide, high resolutions.


----------



## Gregster

Quote:


> Originally Posted by *flopper*
> 
> I assume without much thinking Minfps will be unprecedented with the Fury the main metric if you use wider big resolution gaming.


Well going by the guy who works for AMD, you are massively mistaken.
Quote:


> Originally Posted by AMDMatt;27458664
> I'm still going to do it when i get time, just been very busy. I will use games that use more than 4gb to showcase the advantage 8gb offers. *The biggest difference comes in minimum fps as games that use more than 4gb typically suffer from frame drops and hitching, which ruin the experience. This is not an issue on the 8gb versions.*
> 
> As for pairing the cards up against each other, this is made difficult as the 8gb versions use different memory to my 4gb versions and the clock speeds are different. As Kaap suggested, if less than 4gb is used then the 4gb versions will be slightly faster for the reasons stated.
> 
> Regarding unannounced products, i cannot comment on those.
> 
> http://forums.overclockers.co.uk/newreply.php?do=newreply&p=27458664


That pretty much sums it up for me.


----------



## rdr09

Quote:


> Originally Posted by *Gregster*
> 
> Well going by the guy who works for AMD, you are massively mistaken.
> That pretty much sums it up for me.


We all look up to that guy here in OCN's AMD threads.


----------



## flopper

Quote:


> Originally Posted by *Gregster*
> 
> Well going by the guy who works for AMD, you are massively mistaken.
> That pretty much sums it up for me.


Fury uses different memory tech.
If you compare straight across without any source to back such a statement up, you're trolling and spreading FUD.
Oh yes, you just did.


----------



## Sgt Bilko

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gregster*
> 
> Well going by the guy who works for AMD, you are massively mistaken.
> That pretty much sums it up for me.
> 
> 
> 
> Fury use different memory tech.
> if you compare straight over without any source to back such statement up, your trolling and spreading fud.
> oh yes you just did.

^ That. A few more days till launch and people are still jumping ship because of something that hasn't been proven?


----------



## Gregster

Quote:


> Originally Posted by *rdr09*
> 
> We all look up to that guy here in OCN's AMD threads.


Yeah, I am getting that impression here









Quote:


> Originally Posted by *flopper*
> 
> Fury use different memory tech.
> if you compare straight over without any source to back such statement up, your trolling and spreading fud.
> oh yes you just did.


I hope you are right; I was just quoting the AMD rep and what he had to say about high resolutions and 4GB. Maybe you should be saying that to AMDMatt over at OcUK, as I was quoting him directly.


----------



## Nickyvida

Hi, just a quick question for the members in here:

What are the chances we see custom third-party Fury X cards from AIB partners?

I'm interested in getting the Fury X, but I'm not looking for the water-cooled version; I want an air-cooled card.


----------



## szeged

Higher bandwidth doesn't mean less memory used, though. Overclocking GDDR5 doesn't make games use less VRAM, and I doubt HBM will be different on that front: 4GB will still be 4GB, and 8GB will still be 8GB.

AMD's marketing team sure is doing a good job of confusing the uninformed.


----------



## MiladEd

We have all been under the impression that the Fury X will only come in the reference design, but the Fury X will have different versions, and the Fury is only an air-cooled version of the Fury X, not a cut-down version. Only time will tell, though.


----------



## Casey Ryback

Quote:


> Originally Posted by *szeged*
> 
> Higher bandwidth doesn't mean less memory used though. Overclocking gddr5 doesn't make games use less vram. I doubt hbm will be different on that front. 4gb will still be 4gb 8gb will still be 8gb.
> 
> Amds marketing team sure is doing a good job of confusing the uninformed.


AMD stated that compression and more advanced memory allocation would help the 4GB card perform well at high res.

They stated that they'd never done such things with GDDR5, for various reasons.

They said this will be done through drivers and will not rely on devs doing anything in their games.

We will see in the benchmarks, I guess, whether it's marketing or truth.


----------



## szeged

I hope it's true, but for some reason I can't help thinking it's their marketing team trying to make the 4GB HBM limitation less of a concern to people who only care about VRAM. We will see next week, though. I'll grab a Fury X ASAP to test for myself at 4K, since I don't trust any reviewers' results anyway: they usually say "at ultra settings" or something, but a lot of games' ultra presets don't include any or all AA, and since my monitor is big, even 4K needs some AA for me.


----------



## DNMock

Quote:


> Originally Posted by *Casey Ryback*
> 
> AMD stated that compression, and more advanced memory allocation would help the 4GB perform well at high res.
> 
> They stated that they'd never done such things with DDR5 for various reasons.
> 
> They said this will be done through drivers and will not be relying on devs to do anything for the games.
> 
> We will see in the benchmarks I guess, if it's marketing or truth.


So basically the Fury will handle a game that uses 4.5GB of VRAM about as well as a 970 handles a game that uses 4GB of VRAM. Awesome...

Fortunately, most games can be run with pretty much no notable difference by dropping a couple of settings to get below 4GB of VRAM usage (at 4K, of course).

Still hugely disheartened by this. I really, really wanted these to be world-beater cards...


----------



## Casey Ryback

Quote:


> Originally Posted by *DNMock*
> 
> So basically the Fury will handle a game that uses 4.5 GB of Vram about as well as a 970 handles a game that uses 4 GB of Vram. Awesome...


No I don't think that's the case, but we will see.


----------



## dir_d

Quote:


> Originally Posted by *DNMock*
> 
> So basically the Fury will handle a game that uses 4.5 GB of Vram about as well as a 970 handles a game that uses 4 GB of Vram. Awesome...
> 
> Fortunately most games can be run with pretty much no notable difference by dropping a couple settings down to put it below 4GB of Vram usage. (at 4k of course)
> 
> Still hugely disheartened by this. Really, really wanted these to be world beater cards too...


You are going to take anything anyone says on Fury X as the absolute truth when the NDA is still up and the cards aren't released, that makes no sense. Wait for 3rd party reviews first.


----------



## Forceman

Quote:


> Originally Posted by *Nickyvida*
> 
> Hi. Just a quick question for members in here
> 
> What are the chances we see a custom 3rd party Fury X cards from AIB partners?
> 
> I'm interested in getting the Fury X but i'm not looking to get the W/C version but an air cooled card.


The current information is that there will only be reference Fury X cards, and you'll have to wait for the non-X for custom designs.


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> The current information is that there will only be reference Fury X cards, and you'll have to wait for the non-X for custom designs.


I'm hoping they go straight to custom designs for the Fury. Considering it's still a lot of heat to handle, and AMD seems to want to step far away from all that jazz after the 290/290X reference release, I could see them moving in that direction.


----------



## pdasterly

when should we see waterblocks for fury such as ekwb?


----------



## Nickyvida

Quote:


> Originally Posted by *Forceman*
> 
> The current information is that there will only be reference Fury X cards, and you'll have to wait for the non-X for custom designs.


Oh hell, I was afraid of that, given that the Fury has cut-down shaders (less performance?) and I can't get the watercooled AIO, as my PC has no space to mount a rad in it.

Going down the Titan route will cost them customers. I can't understand why they can't release an air-cooled Fury X.

Guess I'll wait and see. If there's no definite air-cooled Fury X I may as well keep my money and wait for Greenland.


----------



## Forceman

Quote:


> Originally Posted by *Nickyvida*
> 
> Oh hell, i was afraid of that given that Fury has cut down shaders(less performance?) and i can't get the watercooled AIO as my PC has no space to mount a rad in it.
> 
> Going down the Titan route will cost them customers.I can't understand why they can't release an air cooled Fury X.
> 
> Guess i'll wait and see. If there's no definite air cooled Fury X i may as well keep my money and wait for Greenland.


If history is any guide it won't be that far off the performance of the X, and probably be the price/performance card to get anyway. Sucks we have to wait a month to find out.


----------



## xer0h0ur

Quote:


> Originally Posted by *szeged*
> 
> Higher bandwidth doesn't mean less memory used though. Overclocking gddr5 doesn't make games use less vram. I doubt hbm will be different on that front. 4gb will still be 4gb 8gb will still be 8gb.
> 
> Amds marketing team sure is doing a good job of confusing the uninformed.


I know we have no idea how much VRAM usage this game has, but it's encouraging to see things like this pop up: http://wccftech.com/amd-fury-x-tested-12k-60fps/


----------



## Nickyvida

Quote:


> Originally Posted by *Forceman*
> 
> If history is any guide it won't be that far off the performance of the X, and probably be the price/performance card to get anyway. Sucks we have to wait a month to find out.


Still, I would have liked, and would pay extra for, a full-4096-shader air-cooled Fury X. But there are sources floating around suggesting that air-cooled non-reference Fury X cards may be coming... unless I missed something at the live AMD E3 unveil. Did Lisa explicitly confirm that the Fury X would be limited to reference designs?

If a Fury air cooled version is possible, why not an air cooled fury X?

June 24th is but a few days away though, maybe we might find out more then. I hope AIB partners are already working on non reference designs.


----------



## Kane2207

Quote:


> Originally Posted by *Nickyvida*
> 
> Still, i would have liked, and would pay extra for full 4096 shader air cooled Fury X. But there are sources floating around that there are air cooled non reference Fury X cards may be coming.. unless i missed something at the live AMD E3 unveil. Did Lisa explicitly confirm that Fury X would only be limited to reference designs?
> 
> If a Fury air cooled version is possible, why not an air cooled fury X?
> 
> June 24th is but a few days away though, maybe we might find out more then. I hope AIB partners are already working on non reference designs.


It may be that Fury (non X) has lower clocks and it'll almost certainly be cut down at least a little bit.

AMD may have found an air cooled X couldn't hit the performance figures required without an AIO.


----------



## Forceman

Quote:


> Originally Posted by *Nickyvida*
> 
> Still, i would have liked, and would pay extra for full 4096 shader air cooled Fury X. But there are sources floating around that there are air cooled non reference Fury X cards may be coming.. unless i missed something at the live AMD E3 unveil. Did Lisa explicitly confirm that Fury X would only be limited to reference designs?
> 
> If a Fury air cooled version is possible, why not an air cooled fury X?
> 
> June 24th is but a few days away though, maybe we might find out more then. I hope AIB partners are already working on non reference designs.


She said the Fury X was the liquid-cooled version, but she didn't rule out custom designs at E3. Some of the follow-up articles that came out recently have said it will be reference only though.

For example:
Quote:


> In case it isn't obvious yet, the Fury X uses a very unique design. So unique, in fact, that AMD's add-in board partners (like Asus, MSI, and Sapphire) won't be able to customize the card with their own cooling solutions. The Fury X will be reference design-only, though AIBs will be able to tinker with the air-cooled Radeon R9 Fury released in July.


http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html


----------



## DNMock

Quote:


> Originally Posted by *dir_d*
> 
> You are going to take anything anyone says on Fury X as the absolute truth when the NDA is still up and the cards aren't released, that makes no sense. Wait for 3rd party reviews first.


Even being able to handle an extra 0.5GB through compression and other improvements would be a stretch. Constantly compressing and decompressing data like that takes processing power; where do you think that processing power is going to come from?

Last time I checked AMD didn't have a staff of wizards on hand, so unfortunately they are chained to the laws of physics on what silicon can do just like everyone else.

Keep in mind this is only really an issue for the top 1% or less of folks who run multiple GPUs on either multiple 4K monitors or on 4K with everything cranked up to 11. I just happen to be one of those people, and for us, we have a fever and the only cure is more cowbell VRAM.


----------



## Nickyvida

Quote:


> Originally Posted by *Kane2207*
> 
> It may be that Fury (non X) has lower clocks and it'll almost certainly be cut down at least a little bit.
> 
> AMD may have found an air cooled X couldn't hit the performance figures required without an AIO.


Ah. Well i guess if that's the case it's Greenland or bust. i was hoping to snag Fury X but i'm not going to deal with AIO or water cooling anytime soon( fear of leaks and the lack of space) and i don't want to compromise on any performance hit for the price quoted, given that it's still on 28nm process which i am currently on with my ageing 780.
Quote:


> Originally Posted by *Forceman*
> 
> She said the Fury X was the liquid-cooled version, but she didn't rule out custom designs at E3. Some of the follow-up articles that came out recently have said it will be reference only though.
> 
> For example:
> http://www.pcworld.com/article/2937335/behold-the-beast-full-amd-radeon-r9-fury-x-tech-specs-and-design-details-revealed.html


Well if she didn't rule out custom designs that would be super. Did Nvidia do the same(as in rule out custom non reference versions of the Titan when it first came out?)

Hopefully what is in the article is a rumor.


----------



## xer0h0ur

Well, for what it's worth, Nvidia locked AIBs out of modifying the reference design of the Titan X, but relatively recently they went back on that stance, and there was information floating around about a water-cooled version of the Titan X along with a backplate, which it sorely needs. Likely Nvidia's ace in the hole to take down the Fury X if AMD's card beats it at stock.


----------



## DNMock

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well for what its worth Nvidia locked AIBs from modifying the reference design of the Titan X but relatively recently they went back on that stance and there was information floating around about a water cooled version of the Titan X along with a backplate which it sorely needs. Likely, Nvidia's ace in the hole to take down Fury X if AMD's card beats it at stock.


Whoa, wait a tick, can you provide a link on that for me please? So you are saying there will be non voltage locked Titan-X's floating around soon?


----------



## xer0h0ur

No no no. I don't know where you got the voltage unlocked part from what I said. I am merely talking about a water cooled Titan X with a backplate on it. I will try to find the article for you.


----------



## xer0h0ur

Here it is: http://wccftech.com/nvidia-geforce-gtx-titan-ultra-liquid-cooled-edition-works/

And for what its worth I found this which I had not seen before: http://wccftech.com/colorful-unveils-igame-gtx-980-ti-kudan-igame-gtx-titanx-kudan-igame-z170-lineup/


----------



## kcuestag

Can't wait to see what the prices are for a Fury X here in Spain. If they're decent, might as well sell my 2x R9 290X and grab one Fury X and for once live through a generation with just one GPU.









Right now my 2x 290X are overkill to me at 1440p.


----------



## DNMock

Quote:


> Originally Posted by *xer0h0ur*
> 
> No no no. I don't know where you got the voltage unlocked part from what I said. I am merely talking about a water cooled Titan X with a backplate on it. I will try to find the article for you.


Oh, you were just referencing the Hydro copper EVGA cards, they had those on the original Titan as well. I thought you were talking about full on non-reference Titan-X cards like having a Titan-X Strix or Kingpin edition Titan-X which would mean non voltage locked cards. My bad.


----------



## hamzta09

When will real benches hit for the Fury X? Don't wanna wait too long and miss out on the 980 Ti batches, only to find out the Fury isn't that good.


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> When will real benches hit for Fury X? Dont wanna wait too long and miss out on the 980 Ti batches only to find out the Fury isnt that good


June 24, at some as yet unknown time


----------



## Evil Penguin

What time does newegg usually release GPUs at launch?
Also, does it show up in their search results right away?

I'd hate to miss the order-window.


----------



## p4inkill3r

Quote:


> Originally Posted by *Evil Penguin*
> 
> What time does newegg usually release GPUs at launch?
> Also, does it show up in their search results right away?
> 
> I'd hate to miss the order-window.


Better get dat f5 finger ready.


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> Better get dat f5 finger ready.


My body is ready.


----------



## DividebyZERO

So here is a thought I had on VRAM with the Fury X. GTA V auto-detects VRAM and by default only allows a certain amount of settings based on that. So with Fury, is GTA V going to lock you to 4GB worth of settings unless you override it? Even if the driver handles how VRAM is allocated, the card will still report 4GB, and as such games will limit settings based on that; at least the ones that check it?


----------



## weinstein888

Quote:


> Originally Posted by *DividebyZERO*
> 
> So here is a thought on VRAM i had with Fury X. GTA5 auto detects VRAM and by default allows only certain amount of settings based on that. So with fury, is GTA5 going to lock you at 4GB settings unless you override? Even if the driver handles how vram is handled at the driver level it will still report 4GB and as such games will limit based on that? Ones that check it anyways?


Yeah, override it. The GTA VRAM limiting feature is basically just a loosely accurate suggestion.


----------



## djsatane

What *actual* prices do you guys think we will see these cards at on June 24, if there are any left in stock after like 5 minutes... I suspect prices will get inflated extremely fast due to shortages.


----------



## Ceadderman

Just saw the prices of the R9 390X on Newegg. $650 for the Fury X is about right, given the 390X is ~$475.

~Ceadder


----------



## Gregster

Gibbo over at OcUK said his Fury X is faster than his 290X @ 1100/6000, so that is good news.


----------



## DividebyZERO

Quote:


> Originally Posted by *Gregster*
> 
> Gibbo over at OcUK said his Fury X is faster than his 290X @ 1100/6000, so that is good news.


.................


----------



## Shatun-Bear

Quote:


> Originally Posted by *Gregster*
> 
> Gibbo over at OcUK said his Fury X is faster than his 290X @ 1100/6000, so that is good news.


I don't get it? Of course it is.

Design of the card is the best I have seen AMD come up with by some distance. Only the 295X2 comes close IMO.

http://www.techpowerup.com/forums/threads/enjoy-fury.213660/


----------



## rdr09

Quote:


> Originally Posted by *Shatun-Bear*
> 
> I don't get it? Of course it is.
> 
> Design of the card is the best I have seen AMD come up with by some distance. Only the 295X2 comes close IMO.
> http://www.techpowerup.com/forums/threads/enjoy-fury.213660/


i don't get it either. any one of my 290s can be faster than a 290X @ 1100.


----------



## ZARuslan

Hi! I have an Aerocool Xpredator case with a Noctua NH-D14 (cooling an FX-8120). It leaves only a small space to mount the radiator on the rear fan mount behind it, where it would be fed hot air from the CPU cooler. I think that's very bad for cooling efficiency. I wanted to place the radiator on the bottom of the case, because I thought the water loop was filled 100%. But now I've seen the Fury X manual, and it suggests placing the radiator above the GPU. I'm sure that's for the pump: it should always be filled with water or it will break very quickly. What are my options? Place the radiator on top (if there's enough space), or behind the front 5.25" panel (I'd have to figure out how to secure it there)?
Thanks for tips and sorry for my bad English.


----------



## FreeElectron

Are there any reviews up?


----------



## Forceman

Quote:


> Originally Posted by *FreeElectron*
> 
> Are there any reviews up?


June 24th.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gregster*
> 
> Gibbo over at OcUK said his Fury X is faster than his 290X @ 1100/6000, so that is good news.


Thanks Captain Obvious.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Just saw the prices of R9 390x on Newegg. $650 is about right. 390x is ~$475.
> 
> ~Ceadder


There should be cheaper 8GB 390Xs; I have seen them at $429.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> There should be cheaper 8GB 390X's. I have seen them at $429


They are. Sapphire seems to have the only real custom 390X; with 3 DisplayPorts it is actually tempting for me to pick up a couple just to play with, or to hold out until dual Fiji comes along (fall?).


----------



## DSgamer64

Depending on the price point and performance, I will either be going with the Fury X or Fury X2. Game optimization is obviously still a concern for me right now, so I don't know if really picking another AMD card makes a lot of sense until they get that issue sorted out.


----------



## Ceadderman

I'm waiting to find out what the Fury X2 is gonna sell for. Cause to be honest, after seeing the pics above of the Fury X, while it looks smexy







it looks short for an ATX board. I'm not worried about size constraints in my FT case, so hopefully the X2 will be what I get. The Fury X will work too, but yeah, it's ubersmall.










~Ceadder


----------



## hyp36rmax

Quote:


> Originally Posted by *DSgamer64*
> 
> Depending on the price point and performance, I will either be going with the Fury X or Fury X2. Game optimization is obviously still a concern for me right now, so I don't know if really picking another AMD card makes a lot of sense until they get that issue sorted out.


I feel that I'll be holding out for a couple of Fury X2s if the price is right. I'm wishfully thinking the magic $999, however it'll probably be around $1,499.99. But who knows; as long as AMD is touting the Fury X2 as the fastest single-PCB dual GPU, then I'm down. I've got no-limit 4K aspirations.


----------



## Sleazybigfoot

I'm not entirely sure but I think you've got the Fury and Fury X launch dates switched around.

Fury X is 24th of June and the Fury 14th of July right?

(In the specifications part of the first post)


----------



## xer0h0ur

Yes. Fury X launches before Fury and R9 Nano doesn't have a specified launch date nor does the Fury X2.


----------



## hamzta09

Benches on 24th, is that confirmed or speculation?


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> Benches on 24th, is that confirmed or speculation?


That's confirmed (by AMD) to be the day they go on sale, so it stands to reason there'd be reviews out then also.


----------



## dade_kash_xD

That lack of HDMI 2.0 support is a deal breaker for me. Too bad. I really wanted these fiji cards.


----------



## hyp36rmax

Quote:


> Originally Posted by *Sleazybigfoot*
> 
> I'm not entirely sure but I think you've got the Fury and Fury X launch dates switched around.
> 
> Fury X is 24th of June and the Fury 14th of July right?
> 
> (In the specifications part of the first post)


Thank you for catching that. My mistake; I was probably really excited about the announcement and put it in the wrong spot.


----------



## bulldogger

Quote:


> Originally Posted by *dade_kash_xD*
> 
> That lack of HDMI 2.0 support is a deal breaker for me. Too bad. I really wanted these fiji cards.


And no DisplayPort 1.3 either, which means no HDCP 2.2, so even an adapter that gets you to HDMI 2.0 specs still will not work if you want to hook up to one of the new 4K HDMI 2.0 television sets, which also require HDCP 2.2. Deal breaker!


----------



## Balsagna

So.... can my QNIX 2560x1440p @ 120hz not use this card?

And to think, I was going to buy one. Looks like, Nvidia might take more of my money


----------



## Kane2207

Quote:


> Originally Posted by *Balsagna*
> 
> So.... can my QNIX 2560x1440p @ 120hz not use this card?
> 
> And to think, I was going to buy one. Looks like, Nvidia might take more of my money


Not presently, I'm not aware of any active adaptors that'll run 1440p @120 from DP to DVI.

You'd have to buy a new monitor from what I understand - I appear to be in a similar position.


----------



## jerrolds

Quote:


> Originally Posted by *Kane2207*
> 
> Not presently, I'm not aware of any active adaptors that'll run 1440p @120 from DP to DVI.
> 
> You'd have to buy a new monitor from what I understand - I appear to be in a similar position.


Correct - my active adapter tops out below 1440p @ 120Hz, altho someone claims they're able to hit 104Hz.. I can't find the link. A vendor a few pages back says that AMD may be pushing AIBs to still support DVI, maybe DP 1.3/HDMI 2.0, so we might be in luck sometime in Q3.

My 290X can tide me over, I think, till then.


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> Correct - my active adapter tops out below 1440p @ 120Hz, altho someone claims they're able to hit 104Hz.. I can't find the link.


I think it might have been in the Qnix owners thread. He fiddled the timings, if I remember right.

Edit: here it is.
Quote:


> Originally Posted by *fullban*
> 
> this 1 off amazon it has to be that exact model number
> 
> http://www.amazon.co.uk/gp/product/B00856WJH8?psc=1&redirect=true&ref_=oh_aui_detailpage_o08_s00
> 
> and use these tighter timings in cru .as u can see I think this adapter wont go above 400 pixel clock (I could only squeeze 104hz which is great)
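
For context on why the ceiling lands around 104Hz: the pixel clock a link must carry is just total pixels per frame (active plus blanking) times the refresh rate. A rough sanity check in Python, using approximate CVT reduced-blanking totals for 2560x1440 (not this exact adapter's timings):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (including blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440 with CVT reduced-blanking totals of roughly 2720 x 1481:
print(pixel_clock_mhz(2720, 1481, 120))  # ~483 MHz - well past a ~400 MHz adapter cap
print(pixel_clock_mhz(2720, 1481, 104))  # ~419 MHz - still over with stock blanking;
                                         # the tighter CRU timings shave the blanking
                                         # enough to squeeze under ~400 MHz
```

So 120Hz is simply out of reach of a ~400 MHz adapter at this resolution, and 104Hz only fits once the blanking intervals are trimmed.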


----------



## Kane2207

Quote:


> Originally Posted by *jerrolds*
> 
> Correct - my active adapter tops out below 1440p @ 120Hz, altho someone claims they're able to hit 104Hz.. I can't find the link. A vendor a few pages back says that AMD may be pushing AIBs to still support DVI, maybe DP 1.3/HDMI 2.0, so we might be in luck sometime in Q3.
> 
> My 290X can tide me over i think till then.


AMD seem certain reliable adapters will be available this year. All these threads are still only speculative; they could even include one in the box for all we know.


----------



## TK421

Can we expect the Fury X to be sold in newegg, amazon etc by the next 2 week? Or will they be available earlier than 2 weeks from now?


----------



## Forceman

Quote:


> Originally Posted by *TK421*
> 
> Can we expect the Fury X to be sold in newegg, amazon etc by the next 2 week? Or will they be available earlier than 2 weeks from now?


They should go on sale Wednesday, although Amazon doesn't always have day-one stock. Newegg should be good though.


----------



## TK421

Quote:


> Originally Posted by *Forceman*
> 
> They should go on sale Wednesday, although Amazon doesn't always have day-one stock. Newegg should be good though.


problem is newegg gouges a lot more than amazon (not the 3rd party sellers)


----------



## Forceman

Quote:


> Originally Posted by *TK421*
> 
> problem is newegg gouges a lot more than amazon (not the 3rd party sellers)


I think day one stock is usually priced at MSRP at Newegg, it's after that they sometimes jack the price. I guess it depends on their initial supply though.


----------



## TK421

Quote:


> Originally Posted by *Forceman*
> 
> I think day one stock is usually priced at MSRP at Newegg, it's after that they sometimes jack the price. I guess it depends on their initial supply though.


hm, I hope that's the case


----------



## jerrolds

Quote:


> Originally Posted by *Forceman*
> 
> I think day one stock is usually priced at MSRP at Newegg, it's after that they sometimes jack the price. I guess it depends on their initial supply though.


Yup day one 290X prices were at MSRP - i remember getting mine for $649 w/ BF4.


----------



## Kokin

Quote:


> Originally Posted by *Balsagna*
> 
> So.... can my QNIX 2560x1440p @ 120hz not use this card?
> 
> And to think, I was going to buy one. Looks like, Nvidia might take more of my money


We'll have to see if the non-reference cards will have DVI.


----------



## TK421

Quote:


> Originally Posted by *Kokin*
> 
> We'll have to see if the non-reference cards will have DVI.


Lightning Fury X? :3


----------



## Kane2207

Quote:


> Originally Posted by *Kokin*
> 
> We'll have to see if the non-reference cards will have DVI.


AMD have stated the X will be reference (DP + HDMI) only.


----------



## zeppoli

Just a couple days away, then the "truth" shall be revealed. For myself and many others that need the adapter, it's $700+ total for this GPU, so it had better beat the 980 Ti by at least 5% overall, and I'm talking the G1 980 Ti.

I was very excited but the closer we come to the actual numbers the more I believe this card will be losing most benchmark reviews.
Hope I'm wrong, but knowing AMD, I'm probably not


----------



## LegacyLG

Any guess on what fury x 2 would cost?


----------



## BackwoodsNC

I just want to know the mounting hole spacing. I don't think normal gpu only blocks will work. Anyone know?


----------



## magicc8ball

Quote:


> Originally Posted by *DividebyZERO*
> 
> They are, sapphire seems to have the only real custom 390x, with 3 DP it is actually tempting for me to pick up a couple just to play with or to hold out until dual Fiji comes along(fall?)


Sapphire is not the only one with a true custom card; look at the Gigabyte G1 390X. Asus, MSI, and PowerColor all have the same port layout but are still custom cards.

I may have misread what you were referring to; if I did, then please clear the air.
Quote:


> Originally Posted by *TK421*
> 
> Lightning Fury X? :3


This is what I am drooling over, if they make one.

What do you think: a Lightning Fury X, Fury, and Nano, or just the X?


----------



## Sleazybigfoot

Quote:


> Originally Posted by *hyp36rmax*
> 
> Thank you for catching that. My mistake as i was probably really excited about the announcement and put it in the wrong spot.


No worries haha, pretty stoked myself; re-reading everything and checking local stores for a price.








Can't wait to read reviews and benchmarks.


----------



## DividebyZERO

Quote:


> Originally Posted by *magicc8ball*
> 
> Sapphire is not the only one with a true custom card. look that the Gigabyte G1 390x. Asus, MSI, and Powercolor all of the say port layout but still are custom cards.
> 
> I may have mis-read what you were referring, if I did then please clear the air.
> This is what I am drooling over, if they make one.
> 
> What do you think a lightning Fury X, Fury, and Nano, or just the X?


I missed that one, thanks for the link. I have been trying to find bare PCB shots to see if my reference 290X waterblocks will work. So far I don't feel very confident, since my 290X EK blocks are not rev 2. This will only be a concern if I decide against Fury.


----------



## magicc8ball

Quote:


> Originally Posted by *DividebyZERO*
> 
> I missed that one, thanks for the link. I have been trying to find bare pcb shots to see if my ref 290x waterblocks will work. So far i dont feel very confident since i have 290x ek blocks and they are not rev2. This will only be a concern if i decide against fury.


No problem. I just took a look myself, and I think you're out of luck with the EK blocks fitting the 390X, but it would be epic if they did fit.

It would be hard for me to decide against the Fury, as I want to try out HBM for myself. Plus it is a huge upgrade for me, coming from an MSI 7970 Lightning. I will also say, and it is kinda sad, but I will be sad when I take out my 7970. Keeping it around as a backup is a no-brainer, and I might even build something to display it in. That should tell you how much I have enjoyed this card. Too bad the EK waterblocks were shorting out the cards, or it would have been watercooled.


----------



## Shatun-Bear

Quote:


> Originally Posted by *zeppoli*
> 
> Just a couple days away, then the "truth" shall be revealed. For myself and many others that need the adapter, *700+ total dollars for this GPU it better beat the 980 ti by at least 5% overall, and I'm talking the G1 980ti.*
> 
> I was very excited but the closer we come to the actual numbers the more I believe this card will be losing most benchmark reviews.
> Hope I'm wrong, but knowing AMD, I'm probably not


Hardly a fair comparison, as that is significantly faster than the reference 980 Ti and costs $690.

I think leaked benchmarks and general word of mouth suggest the Fury X _at least_ offers around the same performance as a 980 Ti. And that is with immature drivers from AMD. No doubt future drivers will help this powerhouse pull further past reference 980 Tis, whilst Nvidia's drivers were pretty great from day 1, so there isn't as much scope for more performance down the line.


----------



## hyp36rmax

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Hardly a fair comparison as that is significantly faster than the reference 980 Ti and costs $690.
> 
> I think leaked benchmarks and general word of mouth suggests the Fury X at least offers around the same performance as a 980 Ti. And to think that is with immature drivers from AMD. No doubt future drivers will help this powerhouse continue to accelerate past reference 980 Ti's whilst the drivers with Nvidia's GPU were pretty great from day 1 so there isn't as much scope for more performance down the line.


Agreed, and think how well AMD GPUs mature with each driver and generation compared to Nvidia. Does anyone even care about a GTX 680 or 780 these days? The 7970/280X and 290X/390X, as well as the Fury series, will probably be much more of an investment in the long run.


----------



## joeh4384

Quote:


> Originally Posted by *hyp36rmax*
> 
> Agreed, and think how well AMD GPUs mature with each driver and generation compared to Nvidia. Does anyone even care about a GTX 680 or 780 these days? The 7970/280X and 290X/390X, as well as the Fury series, will probably be much more of an investment in the long run.


AMD cards age like fine wine. That is the one good thing about all the re-brands they do.


----------



## Agent Smith1984

R9 390 delivering today....

Can't wait to overclock









Every review is showing 390/390x as a direct competitor to the 980 now, and beating it in many cases....

With 390 competing with GTX 980, Fury likely competing with 980 Ti, and Fury X competing with Titan X, I'd say the momentum has swung back in the red team's favor!!!

At least from a performance/dollar standpoint... AMD will never have the market share NVIDIA does, but it sure feels good to have purchased an AMD card.
I always feel good when I get this kind of value (compared to other hardware on the market, spending $300 on ANYTHING hurts, but.... lol)

This exact statement from Hardware Canucks totally applied to me:

"Regardless of whether you want to call this a rebrand or refresh (I'm firmly on the refresh side), the R9 390X is an undeniably appealing card for anyone who can't justify spending over $450 for a GPU. It is truly amazing to see that a Hawaii-based derivative can be so competitive this far into its life. I'm just not sure if that represents a ringing endorsement for the versatility of AMD's GCN 1.1 architecture or an honest critique about how the graphics performance yardsticks haven't moved all that much in almost two years. Maybe it's both."


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> R9 390 delivering today....
> 
> Can't wait to overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Every review is showing 390/390x as a direct competitor to the 980 now, and beating it in many cases....
> 
> With 390 competing with GTX 980, Fury likely competing with 980 Ti, and Fury X competing with Titan X, I'd say the momentum has swung back in the red team's favor!!!
> 
> At least from a performance/dollar standpoint... AMD will never have the market share NVIDIA does, but it sure feels good to have purchased an AMD card.
> I always feel good when I get this kind of value (compared to other hardware on the market, spending $300 on ANYTHING hurts, but.... lol)


I was looking at reviews and was surprised how close the 390X got to the 980 Ti in some titles, mostly 4K I think it was.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Agent Smith1984*
> 
> R9 390 delivering today....
> 
> Can't wait to overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Every review is showing 390/390x as a direct competitor to the 980 now, and beating it in many cases....
> 
> With 390 competing with GTX 980, Fury likely competing with 980 Ti, and Fury X competing with Titan X, I'd say the momentum has swung back in the red team's favor!!!
> 
> At least from a performance/dollar standpoint... AMD will never have the market share NVIDIA does, but it sure feels good to have purchased an AMD card.
> I always feel good when I get this kind of value (compared to other hardware on the market, spending $300 on ANYTHING hurts, but.... lol)
> 
> This exact statement from Hardware Canucks totally applied to me:
> 
> "Regardless of whether you want to call this a rebrand or refresh (I'm firmly on the refresh side), the R9 390X is an undeniably appealing card for anyone who can't justify spending over $450 for a GPU. It is truly amazing to see that a Hawaii-based derivative can be so competitive this far into its life. I'm just not sure if that represents a ringing endorsement for the versatility of AMD's GCN 1.1 architecture or an honest critique about how the graphics performance yardsticks haven't moved all that much in almost two years. Maybe it's both."


390 is the 300-series GPU I was eyeing as it offers some serious bang-for-buck, and the 8GB is really nice.

I agree with your thoughts about AMD being more competitive this cycle. The fact that they have 4 Fiji cards coming out this year (Fury X, Fury, Nano, and Fury X2/Maxx) means they are going to offer some serious competition to Nvidia. The Fury and Nano, especially, look like great GPUs if the rumours are to be believed, with the Fury offering Fury X performance for $550 and the Nano 10% less for most probably $450-500.


----------



## psychok9

Quote:


> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hyp36rmax*
> 
> Agreed, and to think how well AMD GPUs mature with each driver and generation compared to Nvidia. Does anyone even care about a GTX 680 or 780 these days? The 7970 / 280X and 290X/390X, as well as the FURY series, will probably be much more of an investment in the long run.
> 
> 
> 
> AMD cards age like fine wine. That is the one good thing about all the re-brands they do.

Yeah, I think it's good when you don't want to change cards every year, like me.
I'm worried about Maxwell's long-run "performance" support, and about the size of the Fury X's video RAM. Pros/cons for both.

Sent from my GT-I9305 using Tapatalk


----------



## flopper

Quote:


> Originally Posted by *psychok9*
> 
> Yeah, I think it's good when you don't want to change cards every year, like me.
> I'm worried about Maxwell's long-run "performance" support, and about the size of the Fury X's video RAM. Pros/cons for both.
> 
> Sent from my GT-I9305 using Tapatalk


AMD simply has the better stuff over time as well.
Mantle, now used as a world standard and developed with their GCN tech in mind, makes it a no-brainer to go AMD with Windows 10.


----------



## zeppoli

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Hardly a fair comparison as that is significantly faster than the reference 980 Ti and costs $690.
> 
> I think leaked benchmarks and general word of mouth suggests the Fury X _at least_ offers around the same performance as a 980 Ti. And to think that is with immature drivers from AMD. No doubt future drivers will help this powerhouse continue to accelerate past reference 980 Ti's whilst the drivers with Nvidia's GPU were pretty great from day 1 so there isn't as much scope for more performance down the line.


Why is that not a fair comparison? It will cost ME (and others) more money to run a Fury X in our systems than it would the G1 980 Ti.

I'd say it's a perfect comparison. I mean, this is supposed to be a hot new design; you're telling me they design this state-of-the-art video card, new type of memory, new processor, and all it can do is keep up with the old Maxwell 2 980 Ti?
When this was first announced we saw it compared to the Titan X; actually, I think one of those benchmarks had the Titan X beat. Now we're back to the 980 Ti, and I'm just afraid that when the actual benchmarks come out we'll see it lagging behind in most tests.

Again, I hope I'm wrong.


----------



## Shatun-Bear

Quote:


> Originally Posted by *zeppoli*
> 
> Why is that not a fair comparison? It will cost ME (and others) more money to run a Fury X in our systems than it would the G1 980 Ti.
> 
> I'd say it's a perfect comparison. I mean, this is supposed to be a hot new design; you're telling me they design this state-of-the-art video card, new type of memory, new processor, and all it can do is keep up with the old Maxwell 2 980 Ti?
> When this was first announced we saw it compared to the Titan X; actually, I think one of those benchmarks had the Titan X beat. Now we're back to the 980 Ti, and I'm just afraid that when the actual benchmarks come out we'll see it lagging behind in most tests.
> 
> Again, I hope I'm wrong.


You do realize the Titan X is only a few percentage points faster than the 980 Ti, and in actual gaming, is more or less the same as the cheaper card? Also, the G1 980 Ti you originally mentioned is faster than a Titan X and practically makes that card obsolete. This was said as much in its review on TechPowerUp:
Quote:


> Originally Posted by *TechPowerUp*
> As we predicted in our GTX 980 Ti reference review, the first custom GTX 980 Ti variant we tested today comes with large performance improvements over the GTX 980 Ti, even beating the much more expensive GTX Titan X, which makes it obsolete. Compared to the GTX 980 Ti, we see a 10-15% performance increase depending on the resolution. The Titan X is defeated as it is up to 11% slower than Gigabyte's GTX 980 Ti.


http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/35.html

So that the Fury X is trading blows with the 980 Ti practically means it is doing the same with the Titan X. Nothing has changed performance wise. You are getting confused with the idea that the Titan X is noticeably faster than the 980 Ti because, I dunno, it's got 12GB of VRAM and it's $900. But like I said, it's made obsolete to most users as it's slower than non-reference 980 Ti cards.


----------



## zeppoli

Quote:


> Originally Posted by *Shatun-Bear*
> 
> You do realize the Titan X is only a few percentage points faster than the 980 Ti, and in actual gaming, is more or less the same as the cheaper card? Also, the G1 980 Ti you originally mentioned is faster than a Titan X and practically makes that card obsolete. This was said as much in its review on TechPowerUp:
> http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/35.html
> 
> So that the Fury X is trading blows with the 980 Ti practically means it is doing the same with the Titan X. Nothing has changed performance wise. You are getting confused with the idea that the Titan X is noticeably faster than the 980 Ti because, I dunno, it's got 12GB of VRAM and it's $900. But like I said, it's made obsolete to most users as it's slower than non-reference 980 Ti cards.


Yes, I know in some cases some of the 980 Ti's show it being faster, and it was a huge blow to the Titan X owners, but overall it's still crowned as the king of all single GPUs.

I won't say any more, I'll just wait


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> I was looking at reviews and was surprised how close the 390X got to the 980 Ti in some titles, mostly 4K I think it was.


Yeah, and in my case, I never planned to drop $650 on the flagship to begin with.

I never spend more than $350 for a single GPU, just never, period..... Hell, it's usually no more than $250









So the plan for me, was to get this new 390, turn around a month or two from now, and grab a second one.

Spread that spending out over a few months (mainly for financial reasons) and still end up with 8GB of usable VRAM on a 1024-bit bus at 1700+ MHz with OC, and some really potent core performance (5120 shaders at 1150+ MHz depending on my final clock).

The pair of cards will run right around that $650 mark either way, but the crossfire performance of the two 390's should put a pretty good whoopin on Fury....


----------



## ChronoBodi

And this is why I don't buy the Titan-type cards anymore.

Ohhh, it looked like a good idea in Feb 2013 for me, considering it had a then-unheard-of 6GB of VRAM and was 100% faster than the Fermi big-die that preceded it.

So yes, I had SLI Titans, but back then we didn't know that there would be a cheaper GTX 780 Ti or the 290X that provided the same or better perf for at least $400 less.

So, of course, I sold my two Titans for $600 each and got R9 290X 8GB CrossFire, $420 each.

Yes, I didn't get back my full amount, but rarely do you get slightly better GPUs with more VRAM at a "profit".

$1200 from the two Titans sold, minus $840 for two 290Xs, leaves ~$360 over with slightly better GPUs.

And on another note, the new Titan X and Furies are still stuck on 28nm, an almost four-year-old node. It's very impressive how much performance they can wring out of such an old node, but you can only imagine what the 14nm node can do for GPUs next year.

This year is not a good time to buy any GPU IMO. We're right on the cusp of the transition over to the desperately-needed node shrink.


----------



## hyp36rmax

Quote:


> Originally Posted by *ChronoBodi*
> 
> And this is why I don't buy the Titan-type cards anymore.
> 
> Ohhh, it looked like a good idea in Feb 2013 for me, considering it had a then-unheard-of 6GB of VRAM and was 100% faster than the Fermi big-die that preceded it.
> 
> So yes, I had SLI Titans, but back then we didn't know that there would be a cheaper GTX 780 Ti or the 290X that provided the same or better perf for at least $400 less.
> 
> So, of course, I sold my two Titans for $600 each and got R9 290X 8GB CrossFire, $420 each.
> 
> Yes, I didn't get back my full amount, but rarely do you get slightly better GPUs with more VRAM at a "profit".
> 
> $1200 from the two Titans sold, minus $840 for two 290Xs, leaves ~$360 over with slightly better GPUs.
> 
> *And on another note, the new Titan X and Furies are still stuck on 28nm, an almost four-year-old node. It's very impressive how much performance they can wring out of such an old node, but you can only imagine what the 14nm node can do for GPUs next year.
> 
> This year is not a good time to buy any GPU IMO. We're right on the cusp of the transition over to the desperately-needed node shrink.*


Very good post and insight. Makes you think.


----------



## Shatun-Bear

Quote:


> Originally Posted by *zeppoli*
> 
> Yes I know in some cases some of the 980ti's show it being faster and it was a huge blow the titan x owners, but still overall its still crowned as the kind of all single GPU's.
> 
> I won't say anymore, ill just wait


Fair enough. Two more days, then we will see what AMD has really got hey..


----------



## Agent Smith1984

Titan comes out, costs $1000, 780 Ti comes out 4-6 weeks later and competes with it......

Titan X comes out, then the 980 Ti 4-6 weeks later and competes with it....

Why do people keep falling for the trap???


----------



## Kane2207

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Titan comes out, cost $1000, 780ti comes out 4-6 weeks later and competes with it......
> 
> Titan X comes out, then 980 ti 4-6 weeks later and competes with it....
> 
> Why do people keep falling for the trap???


Titan came out in February, 780 ti came out November...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kane2207*
> 
> Titan came out in February, 780 ti came out November...


Okay, point taken. I didn't follow that series that closely, but with the new series it was pretty close....

Maybe Titan X owners thought it would be a gap again, so they went ahead and bought....


----------



## Kane2207

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, point taken, didn't follow that series that closely, but with the new series it was pretty close. ....
> 
> Maybe Titan X owners thought it would be a gap again, so they went ahead and bought....


Yeah, the Titan X timing vs. the 980 Ti was bad; I wouldn't be too happy. The original Titan wasn't so bad: it was 6-7 months before the 290X/780 Ti came along and beat it, and even then a volt-unlocked original Titan was still quite competitive.

People get screwed as early adopters anyway, but I'm waiting for both camps to play their hands this time before I upgrade.


----------



## Agent Smith1984

Good thing the mining thing is dead (sorry for anyone who lost money on mining, bitcoin, etc.... I am speaking from a PC gaming perspective)

The whole mining thing screwed up the 280/290 pricing SO BAD......

Now we should see a normal and steady pricing decline in the market. Which is the natural order of things...

I'm sure AMD and its suppliers wouldn't mind a return....

Especially Elpida..... they were getting top dollar for their VRAM chips once Hynix hit capacity.


----------



## blue1512

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Titan comes out, costs $1000, 780 Ti comes out 4-6 weeks later and competes with it......
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Okay, point taken, didn't follow that series that closely, but with the new series it was pretty close. ....
> 
> Maybe Titan X owners thought it would be a gap again, so they went ahead and bought....
> 
> 
> 


Your point is valid. It was the 780, released two months after the original Titan, that was cheaper and able to beat the Titan with a good custom design.
And the lambs still keep silent, giving them money...


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The whole mining thing screwed up the 280/290 pricing SO BAD......


Also anyone who had a 7950 sold it for as much as they purchased it for. It was great.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Your point is valid. It was the 780, released two months after the original Titan, that was cheaper and able to beat the Titan with a good custom design.
> And the lambs still keep silent, giving them money...


It's not though, is it? The time frame was greatly exaggerated, and an overclocked Titan still handily beat a 780 at the time, with 3GB of additional VRAM to boot.

You really need to fact-check before you attempt incendiary posts...


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> It's not though, is it? The time frame was greatly exaggerated, and an overclocked Titan still handily beat a 780 at the time, with 3GB of additional VRAM to boot.
> 
> You really need to fact-check before you attempt incendiary posts...


Unless you mod your Titan BIOS, there are a handful of custom 780s that beat the Titan, even when both are overclocked:
http://www.guru3d.com/articles_pages/evga_geforce_gtx_780_sc_acx_review,21.html
http://www.guru3d.com/articles_pages/msi_geforce_gtx_780_lightning_review,21.html


----------



## flopper

Quote:


> Originally Posted by *Kane2207*
> 
> off topic
> .


Quote:


> Originally Posted by *blue1512*
> 
> off topic


Seems nice that the Fury is out tomorrow; then we'll know a lot more about why it's a godlike card.
Can't wait to hold one in my hands, to slowly insert it into my computer and connect it up to the web.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Unless you mod your Titan BIOS, there are a handful of custom 780s that beat the Titan, even when both are overclocked:
> http://www.guru3d.com/articles_pages/evga_geforce_gtx_780_sc_acx_review,21.html
> http://www.guru3d.com/articles_pages/msi_geforce_gtx_780_lightning_review,21.html


1. Flashing the BIOS was child's play.
2. You've picked this as an example?
Quote:


> With 980 MHz, the MSI GTX 780 Lightning runs a large overclock out of the box. All these improvements do cost $770 though, *making the GTX 780 Lightning the most expensive air-cooled GTX 780 available right now.*


Are you planning on a similar complaint if/when a non-reference, overclocked Fury outpaces a Fury X?


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> 1. Flashing the BIOS was child's play.
> 2. You've picked this as an example?
> Are you planning on a similar complaint if/when a non-reference, overclocked Fury outpaces a Fury X?


When I give examples, of course the example must be strong. Note that the price of the MSI Lightning didn't reach the Titan's, and even a reference 780 can surpass the Titan's performance when overclocked. Flashing the BIOS wasn't child's play, by the way.

This time at least the Fury's existence was well publicized, and the Fury X isn't priced at $1k. AMD also gave the Fury X the best cooler available, so I doubt that a custom Fury can compete with the Fury X.


----------



## szeged

Flashing the BIOS on 780s and Titans was literally the click of a button and the tap of the "y" key if you used the auto-flash tool. You basically had to try to screw it up; I don't know why people still try to act like it was harder than sending someone outside the solar system.


----------



## ondoy




----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> When I give examples, of course the example must be strong. Note that the price of the MSI Lightning didn't reach the Titan's, and even a reference 780 can surpass the Titan's performance when overclocked. Flashing the BIOS wasn't child's play, by the way.
> 
> This time at least the Fury's existence was well publicized, and the Fury X isn't priced at $1k. AMD also gave the Fury X the best cooler available, so I doubt that a custom Fury can compete with the Fury X.


Flashing the BIOS was child's play; it required a BIOS file and one entry on a command line. If you're struggling with that, then boy, are you on the wrong site.

Saying a reference 780 can beat a Titan means nothing when you can overclock the Titan to the same extent. You carry on though, labeling people sheep... I'm not stupid; I can see the intentions in your posts...


----------



## Kane2207

Quote:


> Originally Posted by *ondoy*


Is that $999 US dollars????

Also - I can't quite make out what that HDMI sticker says, looks like something is included, either a cable or could they maybe have included a DP>HDMI 2.0 adapter?


----------



## blue1512

Quote:


> Originally Posted by *Kane2207*
> 
> Flashing the BIOS was child's play; it required a BIOS file and one entry on a command line. If you're struggling with that, then boy, are you on the wrong site.
> 
> Saying a reference 780 can beat a Titan means nothing when you can overclock the Titan to the same extent. You carry on though, labeling people sheep... I'm not stupid; I can see the intentions in your posts...


Is it just me, or do you keep ignoring the main point of my reply?

Back to the discussion: 780 vs. Titan is the same as 980 Ti vs. Titan X, that's the whole point here. It's different in the Fury/Fury X case, as I stated.


----------



## Kane2207

Quote:


> Originally Posted by *blue1512*
> 
> Is it just me, or do you keep ignoring the main point of my reply?
> 
> Back to the discussion: 780 vs. Titan is the same as 980 Ti vs. Titan X, that's the whole point here. It's different in the Fury/Fury X case, as I stated.


The whole point of your post was just to call people names and incite another red vs green scenario.

I didn't miss that at all...


----------



## ondoy

Quote:


> Originally Posted by *Kane2207*
> 
> Is that $999 US dollars????
> 
> Also - I can't quite make out what that HDMI sticker says, looks like something is included, either a cable or could they maybe have included a DP>HDMI 2.0 adapter?


should be around 749 USD....
including markup...


----------



## Kane2207

Quote:


> Originally Posted by *ondoy*
> 
> should be around 749 USD....
> including markup...


I'm talking about the sticker on the box that clearly states $999, that's a hefty markup if it is US.


----------



## bkvamme

Quote:


> Originally Posted by *Kane2207*
> 
> Also - I can't quite make out what that HDMI sticker says, looks like something is included, either a cable or could they maybe have included a DP>HDMI 2.0 adapter?


The sticker simply states that a 1.8M HDMI extension cable is included.


----------



## StereoPixel

Quote:


> Originally Posted by *ondoy*


450 GB/s? not 512 GB/s?


----------



## Casey Ryback

Quote:


> Originally Posted by *StereoPixel*
> 
> 450 GB/s? not 512 GB/s?


Hmm what the..........


----------



## Valenz

Quote:


> Originally Posted by *ondoy*


That's crazy markup, smh... 290X all over again.


----------



## Casey Ryback

Quote:


> Originally Posted by *Valenz*
> 
> Thanks crazy markup , smh .. 290x all over again.


?

How do we know it's USD?


----------



## DFroN

Quote:


> Originally Posted by *Casey Ryback*
> 
> ?
> 
> How do we know it's USD?


Quote:


> Originally Posted by *Kane2207*
> 
> Is that $999 US dollars????


Quote:


> Originally Posted by *Valenz*
> 
> Thanks crazy markup , smh .. 290x all over again.


According to the other thread it's Singapore dollars, not US.


----------



## DividebyZERO

Having to wait for benchmark reviews until release day really doesn't help you decide whether it's worth buying; by the time you find out it is, everyone will have already bought what's in stock...

Wish I had an inkling about performance before launch (an independent review, not the company's own benchmarks).


----------



## Casey Ryback




----------



## zealord

are reviews coming tomorrow?


----------



## p4inkill3r

Quote:


> Originally Posted by *Casey Ryback*


That's so clean.


----------



## magicc8ball

So if it is Singapore money, that is roughly $745 here in the States.
Quote:


> Originally Posted by *zealord*
> 
> are reviews coming tomorrow?


What I have heard is that tomorrow morning at 8, or 7 Central time, is when they can start releasing reviews. So we are less than a day away if true.
Quote:


> Originally Posted by *Casey Ryback*


They even put braided cable over the power cable for the pump. You won't ever see that unless you are taking the cover off for some reason, but it just shows how much attention to detail they have put in; very impressed.


----------



## blue1512

Quote:


> Originally Posted by *ondoy*


This is Singapore dollars. 999 SGD ~ 745 USD, GST and shop premium included.
Note that this is the Sapphire version, which comes with a DP-DVI adaptor.
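
The conversion is easy to sanity-check; a quick sketch (the ~0.746 SGD-to-USD rate is an assumed mid-2015 figure, not from the thread):

```python
# Back-of-envelope SGD -> USD check; 0.746 is an assumed mid-2015 exchange rate
SGD_TO_USD = 0.746

sticker_sgd = 999
print(round(sticker_sgd * SGD_TO_USD))  # ~745 USD, before any regional premium
```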


----------



## zealord

Quote:


> Originally Posted by *magicc8ball*
> 
> So if it is Singapore money that is roughly 745 here in the States.
> What I have heard is at tomorrow morning at 8 or 7 central time is when they can start releasing reviews. So we are less than a day away if true.


----------



## Casey Ryback

It almost looks like they could've condensed it even more, i.e. 6.5"-7" instead of 7.5"? Either way, it's good.


----------



## Agent Smith1984

That 450GB/s instead of 512 is strange to me....

Hell, I'm pulling over 400GB/s on my 390 (not to say that the HBM isn't still significantly faster).

The thing to remember is that with a 4096-bit bus width, the slightest bit of overclocking drastically increases the bandwidth.
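
To put rough numbers on that, here's a quick sketch of the peak-bandwidth math (the bus widths and clocks are the published specs; the helper function itself is just illustrative):

```python
def bandwidth_gbs(bus_width_bits, mem_clock_mhz, pumps_per_clock=2):
    # peak bandwidth = bytes per transfer x effective transfer rate
    return bus_width_bits / 8 * mem_clock_mhz * pumps_per_clock / 1000

# Fiji's HBM1: 4096-bit bus at 500 MHz, double data rate
print(bandwidth_gbs(4096, 500))   # 512.0 GB/s at stock
print(bandwidth_gbs(4096, 525))   # 537.6 GB/s -- a mere 25 MHz bump adds ~26 GB/s
# Hawaii (R9 390): 512-bit bus, 1500 MHz GDDR5, quad-pumped
print(bandwidth_gbs(512, 1500, pumps_per_clock=4))   # 384.0 GB/s stock
```

On a narrower GDDR5 bus the same 25 MHz would barely register, which is why small HBM memory overclocks move the bandwidth so much.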


----------



## zealord

Interesting that all the Fury X boxes don't mention 4GB


----------



## gerardfraser

Looks like 4GB. I ordered a Gigabyte AMD Fury X today.

Pre-orders Ncix Canada
http://www.ncix.com/detail/gigabyte-radeon-fury-x-1050mhz-43-110438.htm
http://www.ncix.com/detail/powercolor-r9-fury-x-1050mhz-c0-110441.htm


----------



## hyp36rmax

Quote:


> Originally Posted by *Casey Ryback*


Cooler Master?


----------



## Agent Smith1984

Quote:


> Originally Posted by *zealord*
> 
> interesting that all the Fury X boxes don't mention 4GB


What's really funny is that they actually do, but it's written really tiny in the bottom left where it mentions the HBM....

I can understand them being slightly shy about it, but in reality, anybody who is looking at a card in that range knows that the real story is that AMD invested in R&D and brought us HBM first. For now, 4GB is fine.... and if devs make proper use of VRAM with DX12, it should be fine for the long haul also.


----------



## Casey Ryback

Quote:


> Originally Posted by *hyp36rmax*
> 
> Cooler Master?


That's some random picture I found; I'm not sure which manufacturer they went with on the cooler.

I find it interesting that the Fury X uses two 8-pin connectors, yet only a 500W PSU is recommended.

Whereas the 980 Ti uses an 8-pin and a 6-pin, but Nvidia suggests a 600W unit.


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's really funny, is that they actually do, but it's written really tiny in the bottom left where it mentions the HBM....


It's pretty clear on the powercolor box, and they had to advertise a lot of features.


----------



## xer0h0ur

It has the potential to draw 150W per 8-pin + 75W from the PCI-E slot, for a total potential power draw of 375W while overclocking. That doesn't mean that's how much power it draws, though; it's simply its max power draw limitation. I too am confused by the advertising stating 450GB/s instead of 512.
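
For the curious, the connector arithmetic behind those numbers (these are the PCI Express delivery limits, not measured consumption):

```python
# Power delivery ceilings per the PCI Express spec (limits, not measured draw)
PCIE_SLOT_W = 75     # x16 slot supplies up to 75 W
EIGHT_PIN_W = 150    # each 8-pin PEG connector supplies up to 150 W

fury_x_ceiling = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # Fury X: slot + two 8-pins
print(fury_x_ceiling)  # 375 W of headroom before any connector is out of spec
```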


----------



## ondoy




----------



## p4inkill3r

Quote:


> Originally Posted by *gerardfraser*
> 
> Looks like 4GB.I ordered Gigabyte AMD Fury X today.
> 
> Pre-orders Ncix Canada
> http://www.ncix.com/detail/gigabyte-radeon-fury-x-1050mhz-43-110438.htm
> http://www.ncix.com/detail/powercolor-r9-fury-x-1050mhz-c0-110441.htm


$829 CAD

http://www.ncixus.com/search/?categoryid=0&q=FURY+X

$829 USD


----------



## hyp36rmax

Quote:


> Originally Posted by *Casey Ryback*
> 
> That's some random picture I found, I'm not sure of what manufacturer they went with on the cooler.
> 
> I find it interesting the fury X uses two 8 pin connectors, yet only a 500W PSU is recommended.
> 
> Whereas the 980ti uses an 8 and 6 pin, but nvidia suggests a 600W unit.


Looks like a Cooler Master ODM piece according to the user guide at the end.

Quote:


> *6.3 Liquid Cooling Solution Safety Guidelines*
> 
> The liquid cooling solution for the AMD Radeon™ R9 Fury X graphics card is a third-
> party solution that is provided by *Cooler Master Co.,* Ltd. More information about the
> unit can be obtained through Cooler Master using a Cooler Master part number of
> *DCV-01647-A1-HF*.
> 
> The antifreeze/coolant contained within the liquid cooling solution for your graphics
> card is a hazardous chemical that may cause bodily harm if it is allowed to escape from
> the cooling solution. This section provides details about the coolant and the safety
> measures that should be taken in case of coolant leakage.
> Review this section carefully and always handle the liquid cooling solution and
> graphics card with care. More information can be obtained directly from the coolant
> supplier (100004-Shell (China) Limited) at [email protected] product
> code 001C4856 and the corresponding safety data sheet
> Safety Data Sheet, Shell Long Life-OAT - 45 degrees C, Antifreeze/Coolant, document number 000000011241 MSDS_CN,
> version 1.2
> .


----------



## kuzotronic

*ondoy*, according to this benchmark (seems like the settings are the same)

1440p - 



4K - 




The Fury X is around 5% slower at 1440p and 10% slower at 4K than a stock 980 Ti.

Did you notice any stuttering or FPS fluctuations during the benchmark? The game uses almost 6GB of VRAM; it's interesting how it handles the game with 4GB total. HBM magic?)


----------



## geoxile

Quote:


> Originally Posted by *kuzotronic*
> 
> *ondoy*, according to this benchmark *(seems like the settings are the same)*
> 
> 1440p -
> 
> 
> 
> 4K -
> 
> 
> 
> 
> FuryX is around 5% slower at 1440p and 10% slower at 4K than 980Ti stock.
> 
> Did you notice any stuttering and fps fluctuations during benchmark? The game uses almost 6Gb of vram it's interesting how it handles the game with 4Gb total. HBM magic?)


Settings don't look the same. Textures on ultra and "FXAA & Camera blur".

Also, that 4K video is running on high settings, high textures, not ultra.


----------



## xer0h0ur

Quote:


> Originally Posted by *kuzotronic*
> 
> *ondoy*, according to this benchmark (seems like the settings are the same)
> 
> 1440p -
> 
> 
> 
> 4K -
> 
> 
> 
> 
> FuryX is around 5% slower at 1440p and 10% slower at 4K than 980Ti stock.
> 
> Did you notice any stuttering and fps fluctuations during benchmark? The game uses almost 6Gb of vram it's interesting how it handles the game with 4Gb total. HBM magic?)


Did you lose the ability to read? You linked all-Nvidia benchmarks, where their *Titan X* was underperforming against the 980 Ti.


----------



## kuzotronic

*geoxile* Yep, you're correct) That means we have a clear winner in the Fury X at SoM! What really interests me is whether 4GB causes any stuttering in such high-VRAM-usage games on the Fury X. Batman AK has 5GB+ usage as well, but it needs tons of optimizations right now.

*xer0h0ur*
Is it really such an issue that only Nvidia GPUs are present in the benchmarks? Calm your nerves, man.

If you can, please provide a better link so we can make a proper comparison here, that's it)


----------



## xer0h0ur

If you want to see a user's report of Fury X: http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633.html


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That 450GB/s instead of 512 is strange to me....
> 
> .


typo


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> typo


On a retail package?

That's pretty sad on the manufacturer's part...


----------



## xer0h0ur

Quote:


> Originally Posted by *kuzotronic*
> 
> *geoxile* yep, you're correct) that means we have a clear winner in a face of Fury X at SOM! What really interests me is if 4Gb doesn't cause any stuttering in such high vram usage games on Fury X. Batman AK has 5Gb+ usage as well but needs tons of optimizations right now
> 
> *xer0h0ur*
> is that really such an issue that only Nvidia GPU's are present in the benchmarks? calm you nerves, man
> 
> if you can, please provide a better link so we can make a proper comparison here, that's it)


Just saying. If someone makes a claim and lists benchmarks that have nothing to do with a Fury X as their source, I tend to brush them off as not having a clue what they're talking about. I already listed a thread where someone has a Fury X and is running tests on it.


----------



## hamzta09

Quote:


> Originally Posted by *flopper*
> 
> typo


It's 256 GB/s in GPU-Z, apparently.


----------



## kuzotronic

*xer0h0ur* gotcha, tnx for the link)


----------



## Forceman

Quote:


> Originally Posted by *hamzta09*
> 
> Its 256 GB/s in GPU-Z apparently.


Yeah GPU-Z needs updating, the ROP count is wrong also.


----------



## anotheraznguy

http://tieba.baidu.com/p/3845714245

Based on this opened-up picture, is it just me or is there no active cooling for the VRM? It's hard to tell where that copper heat pipe is going, though.


----------



## xer0h0ur

Afterburner and GPU-Z both need to get updates for the new cards. I believe voltage isn't unlocked yet.


----------



## xer0h0ur

Quote:


> Originally Posted by *anotheraznguy*
> 
> http://tieba.baidu.com/p/3845714245
> 
> Based on this opened-up picture, is it just me or is there no active cooling for the VRM? It's hard to tell where that copper heat pipe is going, though.


The picture showing the opened unit on that site is poor. You can barely see the copper piping below the braided wiring, which is what is directly making contact with the VRMs. Thankfully this is not a half-measure cooling solution like the 295X2's cooler, which only cooled the GPUs; this one is in fact cooling the GPU, HBM, and VRMs.


----------



## Agent Smith1984

For anyone interested in the 3 series....

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club#post_24077911

It's still a baby, but I will be adding more information as I gather it....


----------



## DMatthewStewart

What is the official release date? I've put aside some cash and stopped building my open-bench test rig (that's been delayed a few times already and will probably never be finished).


----------



## dir_d

Quote:


> Originally Posted by *DMatthewStewart*
> 
> What is the official release date? I've put aside some cash and stopped building my open-bench test rig (that's been delayed a few times already and will probably never be finished).


Tomorrow for Fury X


----------



## zeppoli

Quote:


> Originally Posted by *kuzotronic*
> 
> *ondoy*, according to this benchmark (seems like the settings are the same)
> 
> 1440p -
> 
> 
> 
> 4K -
> 
> 
> 
> 
> The Fury X is around 5% slower at 1440p and 10% slower at 4K than a stock 980 Ti.
> 
> Did you notice any stuttering or FPS fluctuations during the benchmark? The game uses almost 6GB of VRAM; it's interesting how it handles the game with 4GB total. HBM magic?)


And it costs more money! ***** who would be so silly to buy one then? I mean, unless you're a hardcore fanboy, why would you buy a weaker/slower product for MORE MONEY?


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> And it costs more money! ***** who would be so silly to buy one then? I mean, unless you're a hardcore fanboy, why would you buy a weaker/slower product for MORE MONEY?


Except it's not weaker than a 980 Ti... try looking at actual Fury X benchmarking, like the owner's thread I posted right after the post you quoted.


----------



## rv8000

Quote:


> Originally Posted by *anotheraznguy*
> 
> http://tieba.baidu.com/p/3845714245
> 
> Based on this opened up picture. Is it just me or is there no active cooling for the VRM. Its hard to tell where that copper heat pipe is going though.


The copper pipe is in contact with the VRM section that requires cooling.


----------



## snow cakes

So is there any intention of making an R9 395X2, or a dual-GPU card in general? Or a competitor for the Titan X?


----------



## Kane2207

Quote:


> Originally Posted by *snow cakes*
> 
> So is there any intention of making an R9 395X2, or a dual-GPU card in general? Or a competitor for the Titan X?


A dual-Fiji card was shown; no idea on the release date or price yet, but it's coming.


----------



## bkvamme

Quote:


> Originally Posted by *zeppoli*
> 
> And it costs more money! ***** who would be so silly to buy one then? I mean, unless you're a hardcore fanboy, why would you buy a weaker/slower product for MORE MONEY?


The price was in Singapore dollars, not USD.

According to the thread, it's around 749 USD including sales tax and some other stuff.


----------



## DMatthewStewart

Quote:


> Originally Posted by *dir_d*
> 
> Tomorrow for Fury X


Whoa! I hope I put aside enough $$


----------



## Phantasia

695€ in Spain...

http://www.pccomponentes.com/sapphire_r9_fury_x_4gb_hbm.html


----------



## magicc8ball

Quote:


> Originally Posted by *Phantasia*
> 
> 695€ in spain...
> http://www.pccomponentes.com/sapphire_r9_fury_x_4gb_hbm.html


Dang, that's 776 in USD... I hope they stay at 650 here in the States.


----------



## zealord

Quote:


> Originally Posted by *magicc8ball*
> 
> Dang 776 in USD... I hope they stay at 650 here in the States.


695€ is an amazing price for Spain, I'd assume. It should be $649 in the US.

The cheapest reference 980 Ti in all of Europe is around 700€, if I am not mistaken.

No worries


----------



## xer0h0ur

Every single European price you find should be higher than the US price.


----------



## Mad Pistol

So the general consensus so far is that the Fury X is faster than the 980 Ti and neck-and-neck with the Titan X. If true, that's a pretty epic job on AMD's part.

This may force a price drop on the 980 Ti, but knowing Nvidia, they're going to ride the "6GB > 4GB" angle all the way to the bank.


----------



## magicc8ball

Quote:


> Originally Posted by *xer0h0ur*
> 
> Every single european price you find should be higher than a US price.


Well, I was thinking that without VAT it would be around what we would pay in the US (650 USD ≈ 582 EUR). I know that in every country some sites will mark it up; I just wasn't expecting it to be that high.


----------



## Gumbi

Quote:


> Originally Posted by *magicc8ball*
> 
> Dang 776 in USD... I hope they stay at 650 here in the States.


Converting currency to currency directly is such a fail, lol.

You have to take taxes into account: the EU includes VAT (value-added tax at 19-23%) in every sale price. Murica doesn't.
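For anyone who wants to sanity-check these price comparisons themselves, here's a rough sketch. The 21% VAT rate (Spain's standard rate at the time) and the EUR→USD exchange rate below are illustrative assumptions, not live figures:

```python
# Strip VAT from an EU sticker price before converting to USD, since
# US prices are quoted before sales tax. The VAT rate and exchange
# rate defaults are assumptions for illustration only.

def us_equivalent(eur_price_incl_vat, vat_rate=0.21, eur_to_usd=1.12):
    """Approximate pre-tax USD equivalent of a VAT-inclusive EU price."""
    pre_tax_eur = eur_price_incl_vat / (1 + vat_rate)
    return pre_tax_eur * eur_to_usd

print(round(us_equivalent(695), 2))  # the Spanish Fury X listing
```

Which lands much closer to the expected 649 USD MSRP than a naive currency-for-currency conversion does.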


----------



## magicc8ball

Quote:


> Originally Posted by *Gumbi*
> 
> Converting currency to currency directly is such a fail, lol.
> 
> You have to take taxes into account: the EU includes VAT (value-added tax at 19-23%) in every sale price. Murica doesn't.


Ah well thanks!
Quote:


> Originally Posted by *tajoh111*
> 
> http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/
> 
> "One word of warning should you buy a Fiji and molest it in various ways that overclockers and enthusiasts normally do, be careful. If you look at the above picture you can see the pretty patterns on the interposer, they look good but don't taste good. If you want to clean off the thermal paste and replace it with your own cooling solution, be really careful of these areas. Why? Because the interposer, basically a chip, is mounted face up, it is not a traditional flip chip part with the transistors and metal layers protected by the wafer, they fragile bits are on top this time.
> 
> How fragile? Don't touch them, don't wipe them off, and otherwise don't do anything that could break a far sub-micron metal trace. It is really fragile and you will destroy your very expensive GPU if you do this, don't say we didn't warn you. This is a tech transition that hasn't been seen since the days when flip chips replaced wire bonding so think back to the bad old days before you mod. Really, be careful or you will end up with an expensive 4GB, water-cooled doorstop."
> 
> Some parts of fiji, particularly stuff on the interposer are ridiculous sensitive to damage it appears. Considering the sensitivity, I can imagine, that partners are not going to warranty the card after the heatsink has been removed since cleaning off the chip and putting your own thermal paste has the potential to screw up the cards.
> 
> This sounds like, people that put on water cooling are probably best just leaving the stock solution on since the risk is pretty high.


For those that missed it in the R9 Introduction Thread: tajoh111 posted this, talking about how delicate the interposer is.


----------



## xer0h0ur

Another reason to skip this generation. I will just sit back and let them iron out the kinks for Arctic Islands.


----------



## coelacanth

Quote:


> Originally Posted by *xer0h0ur*
> 
> It has the potential to draw 150W per 8-pin + 75W at the PCI-E slot for a total potential power draw of 375W while overclocking. That doesn't mean its how much power it draws though. Simply its max power draw limitation. I too am confused at the advertising stating 450GBps instead of 512.


Those are the specs, but in reality a video card can draw much more than that from 2x 8-pin plus the PCI-E slot. 375W is not the total potential power draw; the real limit is much higher than the spec.


----------



## xer0h0ur

Quote:


> Originally Posted by *coelacanth*
> 
> Those are the specs, but in reality a video card can draw much more than that from 2x 8-pin plus the PCI-E slot. 375W is not the total potential power draw; the real limit is much higher than the spec.


Well, in reality it's not too much higher. For instance, the 295X2 goes past those limits and draws a total of a shade under 450W from the same setup of two 8-pins and the PCI-E slot. I don't know what the physical limit would be before you're pushing your luck and begin to overheat/melt the PCI-E power cables. I don't even know if there is any other video card out there that pushes the limits on power draw as much as the 295X2 did.


----------



## rv8000

Does anyone know which timezone the NDA applies to? I've read all sorts of rumors, PST, EST, UTC...


----------



## hamzta09

http://itsalmo.st/#furyxndalifted


----------



## $ilent

Guess I'm in the "keeping my current setup" group with my 970s in SLI: if the Fury X performs the same as or slightly better than a Titan X, then my 970s in SLI are already on par with it or slightly better. Still, it would be nice for everyone if the Fury X really does compete with the Titan X. Guess that's another major impulse buy I can prevent myself from making.


----------



## magicc8ball

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well in reality its not too much higher. For instance the 295X2 goes past the limitations and draws a total of a shade under 450W from the same setup of two 8-pins and the PCI-E slot. I don't know what the physical limitation would be before you're testing your luck and begin to overheat/melt the PCI-E power cables. I don't even know if there is any other video card out there that pushes the limits on power draw as much as the 295X2 did.


If I did my calculations right, then a single 16-gauge wire (which is what I believe it to be, after some research) can carry 20 amps at 12 volts, which equates to 240 watts. Since an 8-pin PCI-E connector has three of those 12V wires, we're talking 720 watts per 8-pin connector, for a total of 1515 watts for the card (720 + 720 + 75)... so no worries about the cables melting anytime soon, haha.

I could be wrong, but I think that is correct. I second-guess it because those are ridiculous numbers...
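The arithmetic above works out; here it is spelled out, with the caveat that 20 A is an optimistic rating for 16 AWG (common chassis-wiring tables rate it lower), and the PCI-E spec itself caps an 8-pin connector at 150 W regardless of what the copper could physically carry:

```python
# Back-of-the-envelope check on the figures above. The 20 A per-conductor
# rating for 16 AWG is the (optimistic) assumption from the post, not a
# spec value; the PCI-E spec limits are far lower than the copper limit.

VOLTS = 12.0
AMPS_PER_WIRE = 20.0        # assumed 16 AWG rating
WIRES_PER_8PIN = 3          # three 12 V conductors in an 8-pin PCI-E plug
SLOT_WATTS = 75.0           # PCI-E slot contribution per spec

watts_per_wire = VOLTS * AMPS_PER_WIRE            # 240 W per conductor
watts_per_8pin = watts_per_wire * WIRES_PER_8PIN  # 720 W per connector
total = 2 * watts_per_8pin + SLOT_WATTS           # 1515 W theoretical

print(watts_per_wire, watts_per_8pin, total)
```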


----------



## zeppoli

Quote:


> Originally Posted by *Mad Pistol*
> 
> So the general consensus so far is that the Fury X is faster than the 980 Ti and neck-and-neck with the Titan X. If true, that's a pretty epic job on AMD's part.
> 
> This may force a price drop on the 980 Ti, but knowing Nvidia, they're going to ride the "6GB > 4GB" angle all the way to the bank.


It won't be better; there are hints out there that the 980 Ti is faster.


----------



## xer0h0ur

The 295X2 pushes 50A between the two 8-pins, and AMD requires each 8-pin to have access to 30A apiece. I can attest to those cables getting hot. I vaguely remember some site testing it and showing the thermal output using a FLIR cam; it made me realize just how hot those cables get.


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> It won't be better; there are hints out there that the 980 Ti is faster.


Hints, huh? Apparently you missed the hint I gave you of the guy who already has his Fury X in hand and is testing it. He has already confirmed it beats the 980 Ti, but either way you will find out in the reviews soon.


----------



## zealord

I hope reviewers will go into detail about 4K benchmarking; that is where we are heading. Even though I am still at 1080p, I am more interested in the 4K benchmarks, frame pacing, VRAM usage and overclockability of the Fury X.


----------



## magicc8ball

Quote:


> Originally Posted by *zealord*
> 
> I hope reviewers will go into detail about 4K benchmarking. That is where we are heading. Even if I am still at 1080p I am more interested in 4K benchmarks, frame pacing, VRAM Usage and overclockability of the Fury X.


That is why I love watching TTL over at Overclock3D; he is to the point and typically pushes cards as hard as he can. Anyone know of another reviewer that does the same?


----------



## xer0h0ur

At this point my curiosity lies in how overclockable the Fury X is, whether its voltage is unlocked, and, probably most importantly of all, what those vague memory techniques are that AMD was talking about with respect to the usage of the 4GB of HBM.


----------



## royfrosty

Hi guys,

It's time to step up from being, most of the time, a lurker.

Reporting all the way from Singapore! Just a brief intro: I'm the one who got the Sapphire Fury X in Singapore, and I noticed the huge spike in viewers of my thread over at http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633.html

I have been reading posts from some OCN community members, and there are some questions that need to be answered.

Bear with me for the next couple of days; I'll finish up the review over the weekend, as I have some personal commitments at home.

So far, from what I have tested: yes, it is either on par with the stock 980 Ti or slightly faster in most synthetic and gaming benchmarks. I have yet to post those results, but I surely will in the upcoming days.

I will also be trying to overclock this card.

For now I only have results for Shadow of Mordor. Overall the HBM is working great at 4K; it did not surpass 3.8GB of VRAM at 4K ultra settings.



Next is the wrong bandwidth printed on the box, confirmed by GPU-Z.


----------



## zeppoli

Quote:


> Originally Posted by *zealord*
> 
> I hope reviewers will go into detail about 4K benchmarking. That is where we are heading. Even if I am still at 1080p I am more interested in 4K benchmarks, frame pacing, VRAM Usage and overclockability of the Fury X.


I agree, but I'm still willing to bet the majority of 980 Ti / Fury X owners will be using 1080p or 1440p.


----------



## hamzta09

Quote:


> Originally Posted by *magicc8ball*
> 
> That is why I love watching TTL over at overclock3d, he is to the point and typically pushes it as much as he can. Anyone know of another reviewer that does the same?


TTL is fun, and so are his videos; shame he doesn't do more fun stuff anymore.


----------



## p4inkill3r

Quick Q&A with Huddy about HDMI 2.0, CFX, and other stuff.


----------



## magicc8ball

Quote:


> Originally Posted by *hamzta09*
> 
> TTL is fun, and so are his videos; shame he doesn't do more fun stuff anymore.


It is a shame that he doesn't; I hope he picks it back up again. Does it seem to you that his community is dying a tad?


----------



## szeged

Are reviews going to be popping up at 12:00 tonight, you guys think? I'm getting more and more excited for the NDA lift, lol.


----------



## $ilent

Quote:


> Originally Posted by *szeged*
> 
> are reviews going to be popping up at 12:00 tonight you guys think? im getting more and more excited for the nda lift lol.


Every time I see your name I think of Sergei, the little meerkat from the Compare the Meerkat advert, lol. Sorry. I take it you'll be buying three Fury Xs at release?


----------



## Forceman

Quote:


> Originally Posted by *szeged*
> 
> are reviews going to be popping up at 12:00 tonight you guys think? im getting more and more excited for the nda lift lol.


I don't think anyone knows for sure, but the prevailing opinion seems to be leaning toward morning.


----------



## ChronoBodi

Quote:


> Originally Posted by *$ilent*
> 
> Every time I see your name I think sergy the little meerkat from the compare the meerkat advert lol sorry. I take it you'll be buying three fury x at release?


If he does get three Fury Xs, where do you put the three radiators? That's what I don't like so much about AIO cooling: that extra step of screwing in the radiators and somehow finding the space for them.


----------



## $ilent

Quote:


> Originally Posted by *ChronoBodi*
> 
> If he does get three Fury Xs, where you put the three radiators? That's what I don't like so much about AIO cooling, that extra step of screwing in the radiators and somehow having the space for it.


Yeah, it would be tricky. Seems AMD haven't wagered much on their customers picking up more than one Fury X.


----------



## Greyson Travis

Quote:


> Originally Posted by *$ilent*
> 
> Yeah it would be tricky. Seems AMD haven't wagered much on their customers picking up more than one fury x


Gonna stack those like Legos soon, lol. Just a side note: it will be an awful sight seeing all those tubes running throughout the rig.
Custom water blocks will also arrive much later, since the card has a unique form factor and manufacturers have to start from scratch.


----------



## ChronoBodi

Quote:


> Originally Posted by *$ilent*
> 
> Yeah it would be tricky. Seems AMD haven't wagered much on their customers picking up more than one fury x


Or rather, the custom air-cooled tri-fan designs from Sapphire will do, on their non-X Furies.

I forget: are the air-cooled Furies lower core count, or just lower clock speed?


----------



## blue1512

Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys,
> 
> Its time to step up from being most of the time, a lurker.
> 
> Reporting all the way from Singapore! Just a brief intro, i'm the one whom got the Sapphire Fury X in Singapore and i noticed the huge spike in viewers of my thread based in http://forums.hardwarezone.com.sg/hardware-clinic-2/%5Bgpu-review%5D-sapphire-amd-r9-fury-x-rise-5087633.html
> 
> I have been reading some of the ocn community members and there are some questions that needed to be answered.
> 
> Bear with me for the next couple of days during the weekends to finish up the reviews, as i have some personal commitments at home.
> 
> So far, what i have tested, yes it is either on par with the stock 980ti OR slightly faster than the stock 980ti in most benchmarks and gaming benchmarks. I have yet to post those results but surely in the next upcoming days.
> 
> Also i will be trying to overclock this card.
> 
> Overall i only have results for Shadow of Mordor. Overall the HBM is working great on 4k. It did not surpass 3.8gb VRAM for 4k ultra settings.
> 
> 
> 
> Next is the wrong bandwidth printing on the box. Confirmed by GPUz.


Thanks for your effort! The internet has been all about your numbers since yesterday.


----------



## G227

Thought this might be interesting for some. Based on royfrosty's benchmarks (saw them on the Singapore forum; all credit for those goes to him!), I have recreated the benchmarks on my setup with a Titan X. Obviously this is a *rough comparison*, with many things that might not be accurate (driver optimization, further overclocking experience with the Fury X, a better BIOS, silicon lottery, etc.). It's just to provide a comparison.

Here goes:

*Preliminary GPU comparison between Titan X and Fury X. Take this with a grain of salt.*

Note: I have adjusted the final Heaven scores for the CPU difference, as the person with the Fury X was running a 4770K and I run a 5820K.



Full size picture here: http://oi58.tinypic.com/2mo3153.jpg
EDIT3: Not sure about the clocks for the OC version; royfrosty will probably clarify.

NOTE: As some members here pointed out, Unigine Heaven might not run as well on AMD drivers, so this might not be representative of actual in-game performance! No tests aside from the SoM benches were run on the Fury X.

*Here are the Shadow of Mordor 1440p comparisons:*



Link here: http://oi59.tinypic.com/2efq1ef.jpg

*& here @4K:*



Link here: http://oi59.tinypic.com/n6bbiw.jpg
Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys,
> 
> Its time to step up from being most of the time, a lurker.
> 
> ...


Do correct me if I misrepresented your benchmarks. And thanks for the effort!


----------



## blue1512

Quote:


> Originally Posted by *ChronoBodi*
> 
> Or rather, custom aircooled tri-fans from Sapphire will do on their non-X furies.
> 
> I forget, are the air-cooled Furies lower core count or just lower clock speed?


There will be two reference versions of the non-X, water-cooled and air-cooled, so logic suggests that the Fury will have a lower core count. AIBs are free to add their own custom designs.


----------



## Forceman

A pre-release review popped up in China. Here's the thread about it:

http://www.overclock.net/t/1561804/vmod-amd-radeon-r9-fury-x-4gb-hbm-4096-bit-review


----------



## hyp36rmax

*+Updated Members Registration Form*

It's live! Click *Here* or this shortcut!



Spoiler: Members Registration



Bring it! Welcome to the club!


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Every time I see your name I think sergy the little meerkat from the compare the meerkat advert lol sorry. I take it you'll be buying three fury x at release?


I'm gonna try to get just one so I can compare it to my Titan Xs.

Hopefully I can get one on launch day.

Just got done playing Lords of the Fallen at 4K, absolute maximum settings possible; VRAM used was 7800-7900 on average, and FPS was hard capped at 60 =\ Gonna try another game. Shadow of Mordor seems to be popular for GPU gaming benches these days, so I may buy a copy off eBay and test that.


----------



## billyboy8888

Quote:


> Originally Posted by *szeged*
> 
> are reviews going to be popping up at 12:00 tonight you guys think? im getting more and more excited for the nda lift lol.


I've always wanted to ask you: is your profile pic yourself, or just a random pic you thought was funny?


----------



## szeged

Quote:


> Originally Posted by *billyboy8888*
> 
> I've always wanted to ask you, is your profile pic yourself or just a random pic you thought was funny.


A random pic of myself that I found on the internet that has nothing to do with me that I thought was funny.

The world may never know


----------



## Agent Smith1984

http://www.kitguru.net/site-news/announcements/zardon/amd-withdraw-kitguru-fury-x-sample-over-negative-content/

Yikes!

Shots fired!


----------



## HoZy

Quote:


> Originally Posted by *Mad Pistol*
> 
> So the general consensus so far is that the Fury X is faster than the 980 Ti and neck-and-neck with the Titan X. If true, that's a pretty epic job on AMD's part.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This may force a price drop on the 980 Ti, but knowing Nvidia, they're going to ride it to the bank that 6GB > 4GB.


If only the AMD driver support were there. I'm a very hurt red-team owner with my launch 290X Crossfire pair.


----------



## royfrosty

Quote:


> Originally Posted by *G227*
> 
> Though this might be interesting for some - based on royfrosty's benchmarks (saw them on the singapore forum - all credits for those go to him!), I have recreated the benchmarks on my setup with Titan X. Obviously this is *rough comparison* with many things that might not be accurate (driver optimization, further overclocking experience with Fury X, better BIOS, silicon lottery etc. etc.). It's just to provide a comparison.
> 
> Here goes:
> 
> *Preliminary GPU comparison between Titan X and Fury X Take this with grain of salt.*
> 
> Note: I have adjusted the final scores for Heaven the CPU difference as the person with Fury X was running 4770K and I run 5820K.
> 
> 
> 
> Full size picture here: http://oi58.tinypic.com/2mo3153.jpg
> EDIT3: Not sure about the clocks really for the OC version - royfrosty will probably clarify
> 
> NOTE: As some members here pointed out, Unigine Heaven might not run as well on AMD drivers so this might not be representative of the actual in-game performance! No other tests aside from SoM benches were run on the Fury X.
> 
> *Here are the Shadow of Mordor 1440p comparisons:*
> 
> 
> 
> Link here: http://oi59.tinypic.com/2efq1ef.jpg
> 
> *& here @4K:*
> 
> 
> 
> Link here: http://oi59.tinypic.com/n6bbiw.jpg
> Do correct me if I misrepresented your benchmarks. And thanks for the effort!


Hi,

Here is my bench test setup:

MSI Z97S SLI Plus
Intel i7-4770K (stock clocks)
Avexir Core 1600MHz 2x4GB kit
Crucial M550 512GB
Sapphire Fury X
Super Flower Leadex 1000W Platinum

Hope that helps.


----------



## Origondoo

So,
just looking at those first results, one can conclude that the Fury X is on the same performance level as the 980 Ti and Titan X if the comparison is done at the same core clock.

And
the next conclusion would be that Nvidia is able to outperform the Fury X by being a better overclocker.

So if AMD can refine their production process, a custom PCB + unlocked BIOS could bring the Fury X to the same level of overclockability...
Let's see what the custom releases bring (Lightning?)


----------



## Phantasia

I really don't think that the overclockability of the Fury will improve much on these batches; these are the beginnings of HBM.

What I see so far: if the pricing line is this one, 695€ in Spain with VAT already included (which will be way less in the US, as with most parts), then being at the level of a Ti and a Titan is a great bet.

Really dying to see official benchmarks and reviews of this thing.


----------



## Gumbi

Quote:


> Originally Posted by *Origondoo*
> 
> So,
> just looking for those first results it can conclude that Fury X is on the same performance level as 980 ti and Titan X, if the comparisson is done at the same core clock.
> 
> And
> the next conslusion would be that Nvidia is able to outperform Fury X by being a better overclocker.
> 
> So if AMD can refine their production process + custom PCB + unlocked BIOS could bring Fury X on the same level of the overclockability...
> Let see for custom releases (Lightning
> 
> 
> 
> 
> 
> 
> 
> )


Why in God's name are you comparing speeds clock for clock across different architectures?

Hawaii is faster than Maxwell clock for clock, but not many Hawaiis do more than 1150 on air, whereas Maxwell does ~1400 on air handily enough.


----------



## Origondoo

Quote:


> Originally Posted by *Gumbi*
> 
> Why in God's name are you comparing speeds clock for clock across different architectures?
> 
> Hawaii is faster than Maxwell clock for clock, but not many Hawaiis do more than 1150 on air, whereas Maxwell does ~1400 on air handily enough.


The core of my comment was: at default clock speeds, AMD and Nvidia are on the same level (even though they have different architectures). But there have always been people saying: BUT I can overclock...
It's the same as how the GTX 970 performs on the same level as the 290X, but only if you overclock it like hell...

Another suggestion:
is there a possibility to have an FPS-per-watt comparison?
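An FPS-per-watt comparison is easy enough to compute once reviewers publish average FPS and measured board power. A minimal sketch; the card names and figures below are invented placeholders purely to show the arithmetic, not benchmark results:

```python
# FPS-per-watt from (average FPS, measured board power).
# The entries here are made-up placeholders, NOT real measurements.
cards = {
    "Card A": {"fps": 60.0, "watts": 275.0},
    "Card B": {"fps": 62.0, "watts": 250.0},
}

for name, d in sorted(cards.items()):
    efficiency = d["fps"] / d["watts"]  # frames per second per watt
    print(f"{name}: {efficiency:.3f} FPS/W")
```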


----------



## Gumbi

Quote:


> Originally Posted by *Origondoo*
> 
> The core of my comment was: in default clock speed AMD and NVIDIA are on the same level (also they have different structures). But there have been always people saying: BUT I can overclock....
> It's the same like GTX 970 perform on the same level as 290X, but only if you overclock it like hell...
> 
> Then other suggeston:
> Is there a possibility to have FPS per Watt comparisson?


Nvidia would crush it; they are more efficient per watt. Plus they win by large margins in CPU-bound games (up to 1.4× the performance).


----------



## flopper

Quote:


> Originally Posted by *Gumbi*
> 
> Nvidia would crush it; they are more efficient per watt. Plus they win by large margins in CPU-bound games (up to 1.4× the performance).


The Nano then gets added into the equation,
for those that like the performance-per-watt stuff.


----------



## Gumbi

Quote:


> Originally Posted by *flopper*
> 
> Nano is then added into the equation.
> for those that like power watt stuff


True dat, true dat. I hope it lives up to expectations; beating a 290X at a sub-200W TDP would be very impressive. I don't see it happening if the preliminary Fury X benches are accurate, tbh, but one can hope!


----------



## flopper

Quote:


> Originally Posted by *Gumbi*
> 
> True dat, true dat. I hope it lives up to expectations. Beating a 290X for sub 200w tdp is very impressive. I don't see it happening if the preliminary Fury X benches are accurate tbh, but one can hope!


Nah, for the Fury X you use the frame rate control thingy.
Nobody buys enthusiast-level cards with power as the main criterion.

If they did, nobody would buy the 980 Ti, as it's hot, noisy and a really bad buy for its power envelope.


----------



## szeged

Which is why non-reference coolers exist.


----------



## $ilent

Just saw a post on Reddit from someone in NZ who picked up a Fury X. He couldn't even manage +50MHz on the core for benchmarking. A bit disappointing, but then again I saw a similar thing when the 290X was released.


----------



## Casey Ryback

Quote:


> Originally Posted by *$ilent*
> 
> Just saw a post on reddit from someone in NZ who picked up a fury x. Couldn't even manage +50mhz on the core for benchmarking. Bit disappointing that but then again I saw a similar thing when the 290x was released.


Could've been a shill.

Nothing would surprise me at this point.


----------



## freezer2k

Quote:


> Originally Posted by *$ilent*
> 
> Just saw a post on reddit from someone in NZ who picked up a fury x. Couldn't even manage +50mhz on the core for benchmarking. Bit disappointing that but then again I saw a similar thing when the 290x was released.


Didn't AMD advertise the Fury X as a great overclocker?


----------



## flopper

Quote:


> Originally Posted by *freezer2k*
> 
> Didn't AMD advertise the Fury X as a great overclocker?


Need software for voltage adjustments first.


----------



## Sgt Bilko

http://www.eteknix.com/first-amd-r9-fury-x-benchmarks-are-here

Looks pretty good


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://www.eteknix.com/first-amd-r9-fury-x-benchmarks-are-here
> 
> Looks pretty good


A bit old, internet-time-wise.


----------



## ChronoBodi

Where are the regular Fury benches?

Actually, I don't even know if it has a lower core count than the Fury X, and nobody is talking about the regular Fury.


----------



## Casey Ryback

Quote:


> Originally Posted by *ChronoBodi*
> 
> Where are the regular Fury benches?
> 
> Actually, I don't even know if it has a lower core count than the Fury X, and nobody is talking about the regular Fury.


Fury isn't getting released today.


----------



## devilhead




----------



## $ilent

The NDA is lifted in a mere 8 minutes.


----------



## Kane2207

Quote:


> Originally Posted by *$ilent*
> 
> The NDA is lifted in a mere 8 minutes.


Smart move retiring from staff.

I won't envy a mod's job in about 7 minutes' time...


----------



## Sgt Bilko

Quote:


> Originally Posted by *flopper*
> 
> > Originally Posted by *Sgt Bilko*
> > 
> > http://www.eteknix.com/first-amd-r9-fury-x-benchmarks-are-here
> > 
> > Looks pretty good
> 
> bit old internet time wise.


Bah

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives.html

http://www.overclock3d.net/reviews/gpu_displays/amd_r9_fury_x_review/1

There's some more.

And some more: http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-Fury-X-4GB-Review-Fiji-Finally-Tested

http://www.forbes.com/sites/jasonevangelho/2015/06/24/amd-radeon-r9-fury-x-review-amd-at-their-best/


----------



## rv8000

Yahoo, ordered and on the way.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202155


----------



## rdr09

Quote:


> Originally Posted by *rv8000*
> 
> Yahoo, ordered and on the way.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202155


I wonder if I can cut those lines and include it in my loop? Of course I can, but . . .


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Why in God's name are you comparing speeds clock for clock across different architectures?
> 
> Hawaii is faster than Maxwell clock for clock, but *not many Hawaiis do more than 1150 on air*, whereas Maxwell does ~1400 on air handily enough.


DAT JUST CHANGED THO


----------



## Agent Smith1984

Guys...

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,1.html

I'm a little disappointed

It looks to be right smack dab in the middle of the 390X and 980 Ti... not a direct competitor to the 980 Ti...

Not to mention it looks like it doesn't OC well at all (voltage is locked, and the HBM speed won't budge).

I know it's early, but this kinda makes me glad I went 390...


----------



## Sgt Bilko

Quote:


> Originally Posted by *rv8000*
> 
> Yahoo, ordered and on the way.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202155


And they are out of stock


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And they are out of stock


You have to wonder why with those benchmark/overclocking results....

Man, talk about holding my breath a little too long.... smh


----------



## p4inkill3r

I went ahead and ordered the Sapphire Fury X from Amazon even though it shows out of stock. Free Prime shipping!
Quote:


> Thank you, your order has been placed.
> An email confirmation has been sent to you.
> New! Get shipment notifications on your mobile device with the free Amazon app.
> 
> Order Number: 109-3199785-8158603
> Sapphire Radeon R9 Fury X 4GB HBM HDMI / TRIPLE DP PCI-Express Graphics Card 21246-00-40G will be shipped to Wade ******* by Amazon.com.
> Estimated delivery: not yet available


----------



## hyp36rmax

Quote:


> Originally Posted by *p4inkill3r*
> 
> I went ahead and ordered the Sapphire Fury X from Amazon even though it shows out of stock. Free Prime shipping!


Welcome to the club!


----------



## New green

Does anyone know if the reviews out are of an overclocked Fury X or not? From what I've read, the Fury X only slightly pulls ahead of the 980 Ti @ 4K, and the 980 Ti pulls ahead @ 1440. Is it just me, or does factoring in Nvidia's PhysX and HairWorks make the 980 Ti the better card until next year's 14nm HBM2 releases? Or will new drivers and better overclocking make the Fury X better?


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys...
> 
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,1.html
> 
> I'm a little disappointed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It looks to be right smack dab in the middle of 390x and 980 ti..... and not a direct competitor to 980 ti....
> 
> Not to mention it looks like it doesn't OC well at all (locked voltage and HBM speed won't budge).
> 
> I know it's early, but this kinda makes me glad I went 390...


Same thing was said about Hawaii at launch. My 290 can OC to 1300. Original BIOS.


----------



## criminal

Up for sale: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&N=8000&Order=BESTMATCH&Description=PPSSPYMKCFJSSG&icid=319818


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> Same thing was said about Hawaii at launch. My 290 can OC to 1300. Original BIOS.


If you got a 290 that can pull 1300 on the core, you'd better keep it!









Run that hacked 15.15 driver with it, and you can probably compete with Fury









But seriously, I see what you are saying....

The card is new, the driver is new, etc...

It is sad to see locked voltage, locked RAM clock, and such poor overclocks in the initial reviews though. Most people are hitting between 1125 and 1150 from what I've seen.








You'd think they would gladly allow voltage increases on such a well cooled card!


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> DAT JUST CHANGED THO


My numbers were conservative. A good Maxwell can do 1500, and a good Hawaii 1200 (maybe a touch more).

As for the locked voltage, it may just be like the 290 release, where everyone used the Asus BIOS, which had a full range of voltage options.

Also, you can't allow water in a loop to get to 85/90°C, AFAIK.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> Originally Posted by *rdr09*
> 
> Same thing was said about Hawaii at launch. My 290 can OC to 1300. Original BIOS.
> 
> If you got a 290 that can pull 1300 on the core, you'd better keep it!
> 
> Run that hacked 15.15 driver with it, and you can probably compete with Fury
> 
> But seriously, I see what you are saying....
> 
> The card is new, the driver is new, etc...
> 
> It is sad to see locked voltage, locked RAM clock, and such poor overclocks in the initial reviews though. Most people are hitting between 1125 and 1150 from what I've seen.
> 
> You'd think they would gladly allow voltage increases on such a well cooled card!

Voltage is locked because Afterburner etc. need to be updated to support it..... same with every new card out, AFAIK.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you got a 290 that can pull 1300 on the core, you'd better keep it!
> 
> Run that hacked 15.15 driver with it, and you can probably compete with Fury
> 
> But seriously, I see what you are saying....
> 
> The card is new, the driver is new, etc...
> 
> It is sad to see locked voltage, locked RAM clock, and such poor overclocks in the initial reviews though. Most people are hitting between 1125 and 1150 from what I've seen.
> 
> You'd think they would gladly allow voltage increases on such a well cooled card!


My second 290 is holding back my first . . .

http://www.3dmark.com/3dm/4644282?


----------



## New green

Quote:


> Originally Posted by *rdr09*
> 
> same thing was said about hawaii at launch. my 290 can oc to 1300. Original BIOS.


From what I've read on the guru3d article the OC utilities don't support the card yet to control voltage and memory.


----------



## rdr09

Quote:


> Originally Posted by *New green*
> 
> From what I've read on the guru3d article the OC utilities don't support the card yet to control voltage and memory.


See post # 1131.


----------



## New green

Quote:


> Originally Posted by *rdr09*
> 
> See post # 1131.


Sorry, I was following the thread but posted a little late, jumping back and forth between web pages.


----------



## bastian

Anyone who buys a Fury product is clearly just a fanboy for AMD. How anyone can find anything redeemable about it is beyond me. What a total disappointment in every aspect by AMD.


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> Anyone who buys a Fury product is clearly just a fanboy for AMD. How anyone can find anything redeemable about it is beyond me. What a total disappointment in every aspect by AMD.


I raise my hand. lol

But . . . I don't like going back to an old driver.


----------



## p4inkill3r

Quote:


> Originally Posted by *bastian*
> 
> Anyone who buys a Fury product is clearly just a fanboy for AMD. How anyone can find anything redeemable about it is beyond me. What a total disappointment in every aspect by AMD.


Look, a troll!


----------



## Agent Smith1984

That makes sense....

Ready to see what she'll do with some cranked voltage and clock speeds in the 1200/700 area..... That should wake it up quite a bit


----------



## bastian

Quote:


> Originally Posted by *p4inkill3r*
> 
> Look, a troll!


Yeah, so what is great about it again?

Its amazing lead over the competition... nope. Its awesome performance at 4K... nope. How about 1440p, surely... nope. Its great overclocking... nope. Then it's got to be the power savings, right... nope. Wait, wait, then the large framebuffer... nope. Well, at least it's $50 cheaper, if you don't mind failing hard at everything else. Enjoy.


----------



## p4inkill3r

Quote:


> Originally Posted by *bastian*
> 
> Yeah, so what is great about it again?
> 
> Its amazing lead over the competition... nope. Its awesome performance at 4k.. nope. How about 1440p surely... nope. Its great overclocking... nope. Then its got to be the power savings right... nope. Wait wait, then the large framebuffer.... nope. Well, at least its $50 cheaper if you don't mind failing hard at everything else. Enjoy.


So you're just here in the Owner's Club to squawk? Troll confirmation acquired.

There are plenty of other threads to bash AMD in.


----------



## Casey Ryback

Quote:


> Originally Posted by *bastian*
> 
> Anyone who buys a Fury product is clearly just a fanboy for AMD. How anyone can find anything redeemable about it is beyond me. What a total disappointment in every aspect by AMD.


Fair enough to have your opinion, but just remember that those buyers are helping you too, by giving NVIDIA's only competitor some cash to stay in operation.

If it was priced a bit better it would sell just fine. (It probably will anyway, somehow.)

They should've just whacked massive AIB air coolers on them and charged $50-100 less, honestly.


----------



## New green

Quote:


> Originally Posted by *bastian*
> 
> Anyone who buys a Fury product is clearly just a fanboy for AMD. How anyone can find anything redeemable about it is beyond me. What a total disappointment in every aspect by AMD.


Too early to say, since drivers and OC utilities are holding it back. In the end, just be glad no one company has complete control of the market.

I am disappointed though, considering HBM is more or less a placeholder until 14nm HBM2 releases next year. I'm considering just holding off on upgrading till then.


----------



## Gumbi

Quote:


> Originally Posted by *bastian*
> 
> Yeah, so what is great about it again?
> 
> Its amazing lead over the competition... nope. Its awesome performance at 4k.. nope. How about 1440p surely... nope. Its great overclocking... nope. Then its got to be the power savings right... nope. Wait wait, then the large framebuffer.... nope. Well, at least its $50 cheaper if you don't mind failing hard at everything else. Enjoy.


It's cheaper and performs almost as well. Great cooling. It doesn't overclock well because there isn't full voltage control yet. Just like when the 290X released, people were struggling to get over 1100; now people hit 1300 under water.

It will come in time. The small form factor is a boon to some people too. If you get a 980 Ti you are getting slightly better performance for slightly more money. It's not like it's a win all round for it.


----------



## Casey Ryback

Quote:


> Originally Posted by *bastian*
> 
> Yeah, so what is great about it again?
> 
> Its amazing lead over the competition... nope. Its awesome performance at 4k.. nope. How about 1440p surely... nope. Its great overclocking... nope. Then its got to be the power savings right... nope. Wait wait, then the large framebuffer.... nope. Well, at least its $50 cheaper if you don't mind failing hard at everything else. Enjoy.


You have a nice graphics card now go troll some kids on your favourite game


----------



## bastian

Quote:


> Originally Posted by *Casey Ryback*
> 
> Fair enough to have your opinion but just remember that they are aiding yourself, by giving nvidia's only competitor some cash to stay in operation.
> 
> If it was priced a bit better it would sell just fine. (It probably will anyway somehow)
> 
> They should've just wacked massive AIB air coolers on them and charged 50-100 less honestly.


Don't confuse my post. It came off rather harsh out of total disappointment. I would have loved for Fury to be great. Then at least it could offer up some competition, forcing nVidia to get a handle on this crazy pricing we have been in for many years.


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> Yeah, so what is great about it again?
> 
> Its amazing lead over the competition... nope. Its awesome performance at 4k.. nope. How about 1440p surely... nope. Its great overclocking... nope. Then its got to be the power savings right... nope. Wait wait, then the large framebuffer.... nope. Well, at least its $50 cheaper if you don't mind failing hard at everything else. Enjoy.


It won't come with a Troll.


----------



## flopper

Beats the 980 Ti in the games I play at my resolution.
Now waiting for the 14th of July to decide which Fury card it will be.


----------



## p4inkill3r

Quote:


> Originally Posted by *bastian*
> 
> Don't confuse my post. It has come off rather harsh because of total disappointment. I would have loved Fury to have been great. Then at least it could offer up some competition for nVidia to get a hold of this crazy pricing we have been in for many years.


This is lame IMO. I see so many people with pure nvidia GPU lineage hoping AMD does well so their prices come down but would never buy an AMD card.


----------



## bastian

Quote:


> Originally Posted by *p4inkill3r*
> 
> This is lame IMO. I see so many people with pure nvidia GPU lineage hoping AMD does well so their prices come down but would never buy an AMD card.


The only thing that is lame is AMD. They needed it to be $100 less than the 980 Ti to make sense.


----------



## p4inkill3r

$100 less for on-par performance?


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> The only thing that is lame is AMD. They needed it to be $100 less than the 980 Ti to make sense.


Why don't you troll at the 980 club?


----------



## edo101

Quote:


> Originally Posted by *bastian*
> 
> The only thing that is lame is AMD. They needed it to be $100 less than the 980 Ti to make sense.


Gonna have to go back to your cave and work on better troll bait. Even Dory from Nemo wouldn't fall for that pathetic bait. Try harder next time; the future is still bright for you







I see potential


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> It won't come with a Troll.


----------



## Casey Ryback

Quote:


> Originally Posted by *bastian*
> 
> Don't confuse my post.


I didn't confuse anything; you posted foolishly.

1. "Anyone that buys it is a fanboy" - an assumption about people's motives

2. "Why anybody could find anything redeemable" - a lack of understanding of some of the card's pros

3. "A disappointment in every aspect" - same again

Not everyone is obsessed with winning every benchmark out there; there are plenty of scenarios where the card does just fine.


----------



## zeppoli

In every review I've read, this card is NOT a 980 Ti killer; in those reviews it barely matches the 980 Ti.

So *** is AMD thinking? They must figure there are a bunch of fools out there that will simply buy AMD based on fanboyism.

Why would anyone buy a slower card that is less efficient (based on the Forbes review) for the same cost or more money?

Oh well, I think many were willing to try AMD, but even their mighty new technology still cannot beat Nvidia's old tech.


----------



## freezer2k

Interesting:

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/29.html

and

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-7.html

Looks like even without additional voltage the card can draw up to 450W!

So much for 275W TDP....


----------



## Casey Ryback

Quote:


> Originally Posted by *freezer2k*
> 
> Looks like even without additional voltage the card can draw up to 450W!
> 
> So much for 275W TDP....


Ok now, breathe......calm down.

Stop exaggerating







It's 430W, and that's in a stupid program called Furmark.... get me?

The PEAK in gaming is 280W.
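For perspective, the rough math on those two figures (a quick sketch; the 430 W Furmark peak and ~280 W gaming peak are the numbers from the reviews linked above, set against AMD's official 275 W TDP, not my own measurements):

```python
# Compare the reported power peaks against AMD's 275 W TDP figure.
# The wattages below are the peaks quoted in the TechPowerUp / Tom's
# Hardware reviews linked above, not my own measurements.

TDP_W = 275

peaks_w = {
    "Furmark stress": 430,  # worst-case synthetic load
    "gaming": 280,          # typical gaming peak
}

for scenario, watts in peaks_w.items():
    over_pct = 100 * (watts - TDP_W) / TDP_W
    print(f"{scenario}: {watts} W ({over_pct:+.0f}% vs. TDP)")
```

In other words, the scary number only shows up under a synthetic power virus; the gaming peak lands within a couple of percent of the rated figure.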


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> So I think every review I've read, this card is NOT a 980 TI killer, actually it those reviews it barely matches the 980 ti.
> 
> so *** is AMD thinking ? they must have a bunch of fools out there that will simply buy an AMD based on fanboy.
> 
> Why would anyone buy a slower card that is less efficient (based on the forbes review) for the same cost/more money?
> 
> Oh well, I think many were willing to try AMD, even their mighty new technology still cannot even beat the old tech of Nvidia.


King of Trolls.lol


----------



## edo101

Quote:


> Originally Posted by *zeppoli*
> 
> So I think every review I've read, this card is NOT a 980 TI killer, actually it those reviews it barely matches the 980 ti.
> 
> so *** is AMD thinking ? they must have a bunch of fools out there that will simply buy an AMD based on fanboy.
> 
> Why would anyone buy a slower card that is less efficient (based on the forbes review) for the same cost/more money?
> 
> Oh well, I think many were willing to try AMD, even their mighty new technology still cannot even beat the old tech of Nvidia.


I'm thinking I'm gonna wait for better drivers, Windows 10, and overclocking tools. Oh, and the continued support once the next-gen cards release. Not sure I can get that with Nvidia's crash-ready drivers. See, two can do it too.

Ahh, not gonna respond to trolls anymore. Don't wanna bring this war over here. Steady yourselves boys, it's gonna be a long day.







Shields Up


----------



## Casey Ryback

Quote:


> Originally Posted by *edo101*
> 
> Gonna have to go back to your cave and work on a better bait troll. Even Dory wouldn't fall for that pathetic bait. Try harder next time, the future is still bright for you
> 
> 
> 
> 
> 
> 
> 
> I see potential


Yes, they do show amazing promise; might be the leader of a powerful horde some day


----------



## edo101

Quote:


> Originally Posted by *rdr09*
> 
> King of Trolls.lol


He does show potential. Wonder how high his midichlorian count is.


----------



## p4inkill3r

Quote:


> Originally Posted by *zeppoli*
> 
> So I think every review I've read, this card is NOT a 980 TI killer, actually it those reviews it barely matches the 980 ti.
> 
> so *** is AMD thinking ? they must have a bunch of fools out there that will simply buy an AMD based on fanboy.
> 
> Why would anyone buy a slower card that is less efficient (based on the forbes review) for the same cost/more money?
> 
> Oh well, I think many were willing to try AMD, even their mighty new technology still cannot even beat the old tech of Nvidia.


Wait, you aren't buying one?


----------



## rdr09

Quote:


> Originally Posted by *edo101*
> 
> he does show potential. Wonder how high his midichlorian count is.


The Trolls want us to pay more and undergo . . .

http://www.overclock.net/t/1561503/nvidia-geforce-353-30-game-ready-driver-for-batman-arkham-knight/110#post_24082507

Edit: this post may very well ban me from OCN. lol


----------



## zeppoli

Quote:


> Originally Posted by *edo101*
> 
> I'm thinking I'm gonna wait for better drivers, Windows 10 and overclocking tools. Oh and the continued support once the next gen card releases. Not sure if I can get that with Nvidia's crash ready drivers. See two can do it too.
> 
> Ahh not gonna respond to trolls anymore. Don't wanna bring this war over here. Steady yourselves boys, its gonna be a long day.
> 
> 
> 
> 
> 
> 
> 
> Shields Up


While it might improve with Win 10/DirectX 12 or newer, more optimized drivers... guess what, the 980 Ti will too.

I wanted to so bad; I love the Fury and wanted to buy one today, but again, why?? The R9 290 was much cheaper than its competitor, we all knew that, and that's why many went that route. But where is the selling point for the Fury X vs. its cheaper (yes, right now the 980 Ti is cheaper) competitor?

It's slower in almost every benchmark.
It doesn't overclock well.
Uses more power = more heat.
Costs the same or more.
Doesn't have DVI out.

And let's not forget, many games today are made with Nvidia in mind. I'm not a fan of this, like many, but I cannot deny that it's true.

Sucks, but I know I'll be ok with the 980 Ti.


----------



## bastian

Quote:


> Originally Posted by *p4inkill3r*
> 
> $100 less for on-par performance?


On par?







Okay.... then.


----------



## edo101

Quote:


> Originally Posted by *rdr09*
> 
> The Trolls want us to pay more and undergo . . .
> 
> http://www.overclock.net/t/1561503/nvidia-geforce-353-30-game-ready-driver-for-batman-arkham-knight/110#post_24082507
> 
> Edit: this post may very well ban me from OCN. lol


Don't forget dropped support for any cards that are not current cards.


----------



## rdr09

Quote:


> Originally Posted by *zeppoli*
> 
> While it might improve with Win 10/Direct x 12 or newer more optimized drivers.. guess what, the 980 ti will too.
> 
> I wanted to so bad, love the Fury, wanted to buy one today, but again why?? the R9 290 is much cheaper than its competitor and we all knew that and its why many went that route, but where is the selling point for the fury X vs its cheaper (yes right now the 980 ti is cheaper) competitor ?
> 
> its slow in almost every benchmark,
> It doesnt overclock well.
> Use more power=/more heat
> cost the same or more expensive
> doesn't have DVI out
> 
> and lets not forget, many games today are made with Nvidia in mind, I 'm not a fan of this like many but cannot deny that its true.
> 
> Sucks, but I know i'll be ok with the 980 ti.


Here's a tip: don't go all the way back to the 320 driver, and stay away from the latest.


----------



## New green

Tomshardware reported during stress tests that,

"Almost 90 °C at the motherboard slot indicates that the VRM pins have passed 100 °C. This certainly isn't a great way to run the card long-term, but then again, stress tests aren't an everyday usage scenario. Still, it would have been nice to see some reserves for overclocking."

I'm thinking by the time better drivers and OC utilities are out, non-reference Fury Xs should also be around the corner, and of course next year's 14nm HBM2. My personal opinion is that Nvidia's PhysX, Flex and HairWorks tip the 980 Ti in its favor over the Fury X. Still, I'd rather buy AMD considering their track record of updating drivers for older cards, whereas Nvidia has gimped old cards to push sales of new ones.


----------



## edo101

Quote:


> Originally Posted by *zeppoli*
> 
> While it might improve with Win 10/Direct x 12 or newer more optimized drivers.. guess what, the 980 ti will too.
> 
> I wanted to so bad, love the Fury, wanted to buy one today, but again why?? the R9 290 is much cheaper than its competitor and we all knew that and its why many went that route, but where is the selling point for the fury X vs its cheaper (yes right now the 980 ti is cheaper) competitor ?
> 
> its slow in almost every benchmark,
> It doesnt overclock well.
> Use more power=/more heat
> cost the same or more expensive
> doesn't have DVI out
> 
> and lets not forget, many games today are made with Nvidia in mind, I 'm not a fan of this like many but cannot deny that its true.
> 
> Sucks, but I know i'll be ok with the 980 ti.


Well, sorry it didn't work out. The rest of us are gonna wait for better drivers and actual overclocking tools. Not to mention, AMD is typically good about adjusting prices.

Oh, and the DVI thing: I game at 1440p60, so I can just get a 15-dollar adapter to play on it. Sucks, but there are ways around it. I just can't be bothered to even consider team green knowing what they are doing to PC gaming, plus their recent driver strain of Ebola.


----------



## Casey Ryback

Quote:


> Originally Posted by *bastian*
> 
> On par?
> 
> 
> 
> 
> 
> Okay.... then.


Really? You're going to cherry-pick results now?

Please tell us why you didn't pick Shadow of Mordor or Far Cry 4?

You can see the Fury beating the 980 Ti at 1440p.................

Your troll-fu has really failed you. Back to troll academy.


----------



## edo101

Quote:


> Originally Posted by *Casey Ryback*
> 
> Really? you're going to cherry pick results like a tech infant?
> 
> please tell us why you didn't pick shadows of mordor or farcry 4?


Better yet, tell him to download the 353 drivers and try to open Microsoft Office or Google Chrome.
I've heard those apps are demanding for Nvidia cards as of late


----------



## bastian

Quote:


> Originally Posted by *Casey Ryback*
> 
> Really? you're going to cherry pick results like a tech infant?
> 
> please tell us why you didn't pick shadows of mordor or farcry 4?


I'm quite certain the two games I picked are the 2 most popular PC games at the moment.
Quote:


> Originally Posted by *edo101*
> 
> better yet, tell him to download 353 drivers and try to open Microsoft Office or google chrome.


No issues here.


----------



## Agent Smith1984

I think that's what most people are missing....

Windows 10 is set to launch soon, and with it comes DX12....

We have already seen that AMD looks to fare better in DX12 draw calls, and though we don't know how that directly impacts framerates in titles to come, we can clearly see that there is some advantage....

Those early to hate and troll should be well aware that AMD has been testing this card against NVIDIA cards in Windows 10 DX12 for some time now....

Look what a simple OC and driver/BIOS mods just did for a two-year-old Hawaii chip......
All of a sudden the $500 GTX 980 is not so attractive against the $430 390X, is it???

IS IT???

I'm banking on there being a lot more in store for Fury X than what these initial tests are going to reveal...

Hawaii dropped and killed Titan and 780 Ti performance-per-dollar, and that was with inflated crypto-mining pricing.

Now AMD turned around and stuck it to the next generation 970 and 980 WITH THE SAME GPU/architecture.... 2 YEARS LATER!!!

You don't think they deserve some major props for that accomplishment?

NVIDIA must-havers went and dropped $600 at launch on GTX 980's only to find out Hawaii had a little left in the tank.

So don't count the Fiji out just yet!


----------



## edo101

Quote:


> Originally Posted by *bastian*
> 
> I'm quite certain the two games I picked are the 2 most popular PC games at the moment.
> No issues here.


I wish your friends were as lucky. Really, I would hate not being able to work on my computer because my card couldn't handle Microsoft Word.


----------



## p4inkill3r

Quote:


> Originally Posted by *bastian*
> 
> On par?
> 
> 
> 
> 
> 
> Okay.... then.


A 5-8% difference? That's hair-splitting, and should be matchable/beatable once we get Trixx/Afterburner voltage unlocks.
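To put that 5-8% in clock terms, here's a quick sketch, assuming (optimistically) that performance scales linearly with core clock and taking the Fury X's 1050 MHz stock clock as the baseline:

```python
# Back-of-the-envelope: what core clock would close a 5-8% performance gap
# if performance scaled 1:1 with clock speed? Real games scale less than
# linearly with core clock, so treat these as a best case.

STOCK_MHZ = 1050  # Fury X reference core clock

for gap_pct in (5, 8):
    needed_mhz = STOCK_MHZ * (1 + gap_pct / 100)
    print(f"closing a {gap_pct}% gap needs roughly {needed_mhz:.0f} MHz")
```

On paper, the 1125-1150 MHz overclocks reviewers are already hitting would cover that whole range even before voltage unlocks.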


----------



## G227

Quote:


> Originally Posted by *bastian*
> 
> Yeah, so what is great about it again?
> 
> Its amazing lead over the competition... nope. Its awesome performance at 4k.. nope. How about 1440p surely... nope. Its great overclocking... nope. Then its got to be the power savings right... nope. Wait wait, then the large framebuffer.... nope. Well, at least its $50 cheaper if you don't mind failing hard at everything else. Enjoy.


What's great about it is that *it creates competition, and we all - as in consumers - win*. Just take the 980 Ti - not in the sense that the 980 Ti is faster, but that it costs $650. And it costs $650 instead of $700-800 because of the Fury X. It's beyond obvious that NVIDIA priced it that way to compete with the Fury X, which they knew was going to be priced there. NVIDIA created a lot of bashing for itself, namely from Titan X owners who felt they were ripped off (though not really - the TX is still THE card). NVIDIA is not stupid, and the only reason to price the 980 Ti this low was the Fury X (that BS explanation that they wanted to make 4K affordable... hahahaha haha ha.. ha... NVIDIA making something affordable).

So in summary - the Fury X is a great card: innovative, small, beautiful and quite powerful, albeit not as much as the 980 Ti or TX (partly due to current limitations such as OC software compatibility, poorer driver optimization, etc.), and it does not seem to overclock all that well before we hit that sweet custom-BIOS field. But it's SO very important that NVIDIA has competition - so that we consumers don't have to pay $2K per GPU in a couple of years (or so - just making stuff up now). It's basic economics.

Now I'm not hating on NVIDIA either - they make stellar GPUs (but like to price them high). *I myself own a Titan X and love it. But we need AMD, and the Fury X delivers* - at least somewhat.


----------



## Horsemama1956

Personally I don't care about the top end as it does nothing for me knowing which company has the fastest GPU. I really just want to see more about the NANO.

I also do not understand why reviews happen before the software sites use is updated for the product.


----------



## edo101

Quote:


> Originally Posted by *p4inkill3r*
> 
> 5-8% variance difference? That's hairsplittingly fine and should be matchable/beatable once we get Trixx/Afterburner's voltage unlocks.


Stop that, all of you. The Troll is a simple creature. It does not understand logic.


----------



## Forceman

Can we leave all this driver and AMD/Nvidia bickering in the news and reviews threads where it belongs. That kind of crap doesn't need to be in an owners thread.


----------



## bastian

Quote:


> Originally Posted by *p4inkill3r*
> 
> 5-8% variance difference? That's hairsplittingly fine


Clearly you weren't around here when people were thinking it was going to destroy the 980 Ti/Titan X.


----------



## edo101

Quote:


> Originally Posted by *Forceman*
> 
> Can we leave all this driver and AMD/Nvidia bickering in the news and reviews threads where it belongs. That kind of crap doesn't need to be in an owners thread.


That's like asking if we can have a graphics card thread without any trolls. It won't happen. For the sake of not derailing this thing any further, I'm withdrawing from combat. It's been a good fight. WITNESS ME BROTHERS, VALHALLA, I GO RIDING ETERNAL AND CHROME

But it won't stop after this.

OT: When can we get any news on the Nano?


----------



## omega53

Quote:


> Originally Posted by *Mad Pistol*
> 
> So the general consensus so far is that the Fury X is faster than the 980 Ti and neck-and-neck with the Titan X. If true, that's a pretty epic job on AMD's part.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This may force a price drop on the 980 Ti, but knowing Nvidia, they're going to ride it to the bank that 6GB > 4GB.


Look who was wrong


----------



## p4inkill3r

Quote:


> Originally Posted by *bastian*
> 
> Clearly you weren't around here when people were thinking it was going to destroy the 980 Ti/Titan X.


Haha. Ok, mate.


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> I'm quite certain the two games I picked are the 2 most popular PC games at the moment.
> No issues here.


How can you have issues?


----------



## Casey Ryback

Quote:


> Originally Posted by *G227*
> 
> Now I'm not hating on NVIDIA either - they make stellar GPUs (but like to price them high
> 
> 
> 
> 
> 
> 
> 
> ). *I myself own Titan X and love it. But we need AMD and Fury X delivers* - at least some.


Noooooooo you will not talk so logically! It will fall on deaf ears!

As painkiller states the difference in games linked (W3 GTAV)are so small anyway.

People get so uptight about the smallest of figures it's incredible.


----------



## p4inkill3r

Quote:


> Originally Posted by *rdr09*
> 
> How can you have issues?


Ouch.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> How can you have issues?


oh wow.... that's classic!

Anyways....

Back to the legit conversation!

I've seen one person with card in hand on here, and one order placed....

Anybody else able to secure one?


----------



## bastian

Quote:


> Originally Posted by *rdr09*
> 
> How can you have issues?


How does this relate to anything? Keep trying to reach, my friend. It doesn't change the fact that the 980 Ti is the better card.

I have a defective 980 Ti G1 Gaming card. It's getting replaced. That happens with any make of card. My 980 G1 Gaming is perfect. Are you trying to say Fury won't have any defects? Good luck with that. I'm waiting to hear about leaking water coolers.


----------



## Casey Ryback

Quote:


> Originally Posted by *rdr09*
> 
> How can you have issues?


Eww, bad quality control on those Nvidia cards, eh bastian?


----------



## edo101

Quote:


> Originally Posted by *Agent Smith1984*
> 
> oh wow.... that's classic!
> 
> Anyways....
> 
> Back to the legit conversation!
> 
> I've seen one person with card in hand on here, and one order placed....
> 
> Anybody else able to secure one?


No, you gotta refresh your browser faster than a 144Hz monitor to get one.


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> How does this relate to anything? Keep trying to reach my friend. Doesn't change the fact the 980 Ti is the better card.
> 
> I have a defective 980 Ti G1 Gaming card. It's getting replaced. It happens with any make of card. My 980 G1 Gaming is perfect.


You need kleenex?


----------



## Casey Ryback

Quote:


> Originally Posted by *bastian*
> 
> Clearly you weren't around here when people were thinking it was going to destroy the 980 Ti/Titan X.


Way to deflect the point about small percentages.

All these pointless posts will be deleted because it's the owners club and I feel bad about that.

I'm done here, feeding time is over bastian.

May you find a comfy bridge to rest your head under this evening.


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> oh wow.... that's classic!
> 
> Anyways....
> 
> Back to the legit conversation!
> 
> I've seen one person with card in hand on here, and one order placed....
> 
> Anybody else able to secure one?


Quote:


> Originally Posted by *MunneY*
> 
> They aren't in stock yet at Amazon but they are coming today, I confirmed with a rep.
> 
> Sapphire - $649 - http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-21246-00-40G/dp/B01012TLSS
> 
> Gigabyte - $669 - http://www.amazon.com/Gigabyte-FURY-4096-Graphics-GV-R9FURYX-4GD-B/dp/B0106B8UAY/
> 
> XFX - $679 - http://www.amazon.com/XFX-RADEON-Graphics-Cards-R9-FURY-4QFA/dp/B0106IJXX0/
> 
> VisionTek - $699 - http://www.amazon.com/VisionTek-Radeon-Express-Graphics-900814/dp/B010A7V6KA/


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> Not as much kleenex as a Fury X owner will need when their GPU leaks.


Do not go all the way back to the 320 driver.

I find you more of a whiner than your 980 Ti.

Forceman is getting mad. lol


----------



## Agent Smith1984

What's the term these days?

Oh yeah....

"You mad, bro?"


----------



## New green

Does anyone know if the ASRock OC Formula mobo, with its liquid-proof coating, will protect the board from the antifreeze solution used in the Fury X? From what I've been reading, with most AIO leaks the solution usually vaporizes before it hits the board unless the leak is catastrophic.


----------



## eXe.Lilith

Just ordered 2 Fury X from Sapphire, should arrive in about a week (hopefully I'll be among the first to get mine here as I literally grabbed them the minute they went up on my e-tailer's site).
Now I'm just waiting for EKWB to release some blocks for them.


----------



## jerrolds

Well, I'm disappointed that so far the Fury X lags behind the 980 Ti at 1440p and only trades blows at 4K, but also glad, since I don't have DP/HDMI on my monitor.

Looks like a superclocked 980 Ti is next on my list. Damn, I really don't want to go green and support GameWorks.


----------



## p4inkill3r




----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*


Nice vid!

Clearly the card is built for 4K, and that is what AMD focused on.

We may be starting to see 1080p GPU comparisons become a lot less meaningful than they used to be....


----------



## skkane

Nice results in CF.

I wonder how the setup sounded, though. I heard the one card in Hardware Canucks' video (idling at the desktop) and the noise was unbearable. No way you're browsing the internet late at night in silence.


----------



## pdasterly

AMD needs to pull one more trick out of their sleeve if they want my money.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice vid!
> 
> Clearly the card is built for 4K, and that is what AMD focused on.
> 
> We may be starting to see 1080p GPU comparisons become a lot less meaningful than they used to be....


I myself am waiting for July 14, and the Windows 10 launch later on.
How the cards perform on Windows 10 is what I am interested in.


----------



## sage101

I thought this was the owners thread, so I don't understand how this thread could possibly attract this many trolls. If you have no intention of buying any of the Fiji cards, then why post your anti-AMD/Fury remarks here? I for one am eyeing the Nano; can't wait for more info on it, like price and performance. Seems like a sweet card.


----------



## THUMPer1

Looking forward to seeing how the air-cooled ones perform at 1440p. That is the next resolution I am going to.


----------



## Agent Smith1984

I have to be honest though....

Aside from that video, which shows the card competing against the Titan X, every single other review I have found has put the Fury X right between the 980 and the 980 Ti, which is a bad look for AMD when they want $650 for their flagship.... The "conclusions" from every one of these reviews tell the same story, and it points to overall disappointment.

I like the concept, I like the design, I like how the card looks on paper, and I certainly like AMD overall as a company....
But unless the drivers and/or Windows 10 (DX12) push Fury's performance ahead a bit, then I'm just going to stick with my plan of running 390s in Crossfire (same price and more HP).

I only purchased one 390 (previously running 2x 290 in crossfire) because I needed a card immediately after selling off my trixxies, and I wanted to see how the summer plays out for both AMD and NVIDIA (performance, series, and prices)...

I won't drop the gavel on Fiji just yet, but I have to admit, I was expecting a little more from this card.


----------



## pdasterly

And when you're disappointed that you waited, then what?
Quote:


> Originally Posted by *sage101*
> 
> I thought this was the owners thread, so I don't understand how this thread could possibly attract this many trolls. If you have no intention of buying any of the Fiji cards, then why post your anti-AMD/Fury remarks here? I for one am eyeing the Nano; can't wait for more info on it, like price and performance. Seems like a sweet card.


----------



## skkane

Here's a better "noise" video. It does not sound so bad here, but it would still get pretty annoying at night when it's really quiet. And that's with an open case.


----------



## jerrolds

Quote:


> Originally Posted by *sage101*
> 
> I thought this was the owners thread, so I don't understand how this thread could possibly attract this many trolls. If you have no intention of buying any of the Fiji cards, then why post your anti-AMD/Fury remarks here? I for one am eyeing the Nano; can't wait for more info on it, like price and performance. Seems like a sweet card.


Pre 290X launch it was exactly the same - the thread reached over 100 pages before release. It eventually spawned another official owners thread.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have to be honest though....
> 
> Aside from that video, which shows the card competing against the Titan X, every single other review I have found has put the Fury X right between the 980 and the 980 Ti, which is a bad look for AMD when they want $650 for their flagship.... The "conclusions" from every one of these reviews tell the same story, and it points to overall disappointment.
>
> I like the concept, I like the design, I like how the card looks on paper, and I certainly like AMD overall as a company....
> But unless the drivers and/or Windows 10 (DX12) push Fury's performance ahead a bit, then I'm just going to stick with my plan of running 390s in Crossfire (same price and more HP).
> 
> I only purchased one 390 (previously running 2x 290 in crossfire) because I needed a card immediately after selling off my trixxies, and I wanted to see how the summer plays out for both AMD and NVIDIA (performance, series, and prices)...
> 
> I won't drop the gavel on Fiji just yet, but I have to admit, I was expecting a little more from this card.


They could have paced the hype better and said the drivers were a work in progress, to temper the preconceived notions a bit.
July 14 next, then July 29 for Win 10.


----------



## Casey Ryback

Quote:


> Originally Posted by *sage101*
> 
> I thought this was the owners thread, so I don't understand how this thread could possibly attract this many trolls.


Trolls don't care for logic or sanity.

They feed off the complete opposite


----------



## sage101

Quote:


> Originally Posted by *pdasterly*
> 
> And when you're disappointed that you waited, then what?


Well, I don't think I'll be disappointed, since I expect it to beat the GTX 980, which is the card I'm looking at for an upgrade. I only game at 1080p and only have a 500 W PSU, so my options are kinda limited; it's between the Nano and the GTX 980, and price would be the deciding factor for me.
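A back-of-the-envelope power-budget check along those lines (all wattage figures below are ballpark assumptions for illustration, not measured values; the board-power numbers are approximate public specs):

```python
# Rough PSU headroom check for a 500 W build.
# All figures are ballpark assumptions, not measurements.
psu_watts = 500
psu_safety_margin = 0.80          # keep sustained load under ~80% of rating

cpu_tdp = 95                      # typical quad-core CPU of the era (assumed)
rest_of_system = 75               # board, drives, fans, RAM (rough guess)

available_for_gpu = psu_watts * psu_safety_margin - cpu_tdp - rest_of_system
print(f"GPU budget: ~{available_for_gpu:.0f} W")

# Approximate public board-power figures:
cards = {"GTX 980": 165, "R9 Nano": 175, "R9 Fury X": 275}
for name, power in cards.items():
    verdict = "fits" if power <= available_for_gpu else "too much"
    print(f"{name}: {power} W -> {verdict}")
```

With rough numbers like these, the ~165-175 W cards fit a 500 W build comfortably, while a 275 W flagship would be pushing it.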


----------



## xer0h0ur

Quote:


> Originally Posted by *eXe.Lilith*
> 
> Just ordered 2 Fury X from Sapphire, should arrive in about a week (hopefully I'll be among the first to get mine here as I literally grabbed them the minute they went up on my e-tailer's site).
> Now I'm just waiting for EKWB to release some blocks for them.


Be extremely careful with the interposer. I can't stress that enough. Don't touch it, wipe it, or swab it. In fact, don't make contact with anything other than the TIM you're cleaning off the four HBM stacks or the GPU die. Apparently those interposers are fragile.


----------



## pdasterly

Quote:


> Originally Posted by *sage101*
> 
> I thought this was the owners thread, so I don't understand how this thread could possibly attract this many trolls. If you have no intention of buying any of the Fiji cards, then why post your anti-AMD/Fury remarks here? I for one am eyeing the Nano; can't wait for more info on it, like price and performance. Seems like a sweet card.


Quote:


> Originally Posted by *sage101*
> 
> Well, I don't think I'll be disappointed, since I expect it to beat the GTX 980, which is the card I'm looking at for an upgrade. I only game at 1080p and only have a 500 W PSU, so my options are kinda limited; it's between the Nano and the GTX 980, and price would be the deciding factor for me.


My expectations are higher than yours: I want to game at 4K with, at minimum, dual monitors, then grab some VR when it comes to market. Guess I will have to wait again for the next gen of cards to come out, or at least a dual-GPU Fury. I'm limited to two PCI slots.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Be extremely careful with the interposer. I can't stress that enough. Don't touch it, wipe it, or swab it. In fact, don't make contact with anything other than the TIM you're cleaning off the four HBM stacks or the GPU die. Apparently those interposers are fragile.


Probably why they implemented their own water cooling solution themselves, instead of leaving it up to the buyers to purchase blocks....

I wonder if the air cooled models will also have this fragile interposer?


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> Be extremely careful with the interposer. I can't stress that enough. Don't touch it, wipe it, or swab it. In fact, don't make contact with anything other than the TIM you're cleaning off the four HBM stacks or the GPU die. Apparently those interposers are fragile.


Is there any proof of this? I am just curious how people know this without having the card. Looking at close-up pictures of the die, the interposer looks like it is covered by a clear material.


----------



## Agent Smith1984

Quote:


> Originally Posted by *pdasterly*
> 
> My expectations are higher than yours, I want to game at 4k with at min dual monitors. Guess i will have to wait again for next gen of cards to come out.


Based on the TechPowerUp review that includes the Fury X and the 390X, I don't think Fury is even a good upgrade for Hawaii users....

I think you'd have to be springing ahead from the 7900/280 series to be breaking into "next-level" performance.

Again, I'm not trying to troll or shed hate on Fiji, because we still know very little from a potential standpoint, but I do call 'em like I see 'em.

Of course, anyone can see that I bought a 390 and say I'm just advocating for my purchase







Okay, fair enough, but everyone who drops dough on GPUs does the same, whether it be trolling 980 Ti owners or people who just purchased the Fury X.


----------



## xer0h0ur

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is there any proof of this? I am just curious how people know this without having the card. Looking at close-up pictures of the die, the interposer looks like it is covered by a clear material.


This
Quote:


> Originally Posted by *tajoh111*
> 
> http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/
> 
> "One word of warning should you buy a Fiji and molest it in the various ways that overclockers and enthusiasts normally do: be careful. If you look at the above picture you can see the pretty patterns on the interposer; they look good but don't taste good. If you want to clean off the thermal paste and replace it with your own cooling solution, be really careful of these areas. Why? Because the interposer, basically a chip, is mounted face up. It is not a traditional flip-chip part with the transistors and metal layers protected by the wafer; the fragile bits are on top this time.
>
> How fragile? Don't touch them, don't wipe them off, and otherwise don't do anything that could break a far sub-micron metal trace. It is really fragile and you will destroy your very expensive GPU if you do this, don't say we didn't warn you. This is a tech transition that hasn't been seen since the days when flip chips replaced wire bonding, so think back to the bad old days before you mod. Really, be careful or you will end up with an expensive 4GB, water-cooled doorstop."
>
> Some parts of Fiji, particularly the stuff on the interposer, appear ridiculously sensitive to damage. Considering the sensitivity, I can imagine that partners are not going to warranty the card after the heatsink has been removed, since cleaning off the chip and putting on your own thermal paste has the potential to screw up the card.
>
> This sounds like people who put on water cooling are probably best just leaving the stock solution on, since the risk is pretty high.


----------



## pdasterly

That sucks, because I sold my GPU and monitors back in Feb hoping for greener pastures. From a consumer standpoint the card needs more memory and a $100-150 price break. I would still have to buy a water block, which adds to the price; the 980 Ti Hydro Copper looks appetizing. I'm an AMD fanboi too... well, maybe not, but with NVIDIA it's love the product, hate the price.


----------



## DividebyZERO

Quote:


> Originally Posted by *xer0h0ur*
> 
> This


Are they saying AMD told them this? Because in that article it looks like it's just their own claim, not AMD's.


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Based on the techpowerup review that includes the Fury X and the 390x, I don't think Fury is even a good upgrade for Hawaii users....
> 
> I think you'd have to be springing ahead from 7900/280 series to be breaking into "next-level" performance.
> 
> Again, I'm not trying to troll, or shed hate on the Fiji, cause we still know very little from a potential standpoint, but I do call 'em like I see 'em.
> 
> Of course, anyone can see that I bought a 390 and just say I'm advocating my purchase
> 
> 
> 
> 
> 
> 
> 
> Okay, fair enough, but everyone who drops dough on GPU's does the same, whether it be trolling 980 ti owners, or people who just purchased the Fury X.


I support AMD, yet I am so far underwhelmed by this release. I mean, it's fairly stronger than Hawaii so far, but not by enough. On raw power it should be performing better simply by virtue of its increased SP count, yet it's not performing up to its estimates. Clearly the drivers aren't remotely good enough yet. I half expected AMD to have an ace-in-the-hole, Omega-esque driver waiting for Fiji's release, but that didn't happen. Something doesn't add up here.

Either way, I am not upgrading to Maxwell or Fiji, so I am a bystander in all of these shenanigans for the time being. The only card that might even tempt me is the Fury X2, and I can only hope that by then they have their crap together driver-wise and the cards are overclocking better. I still need to see DX12 performance. This release is literally round 1, and the round goes to the 980 Ti / Titan X.


----------



## xer0h0ur

Quote:


> Originally Posted by *DividebyZERO*
> 
> Are they saying that AMD told him this because in that article it looks like they are just saying it and not AMD.


Now that I have no insight into. They speak as if they had the card in hand and know this firsthand; however, I can't confirm that.


----------



## szeged

Hmm... might not be grabbing a Fury X based on all the reviews.

Voltage locked (for now at least), no custom AIBs allowed (unless everyone is mistaken),

only just comparable to the Ti (for now; obviously drivers will improve it).

I just don't see a point in buying it to have a little bench-off comparison when the overclocking on it isn't looking that good, meanwhile all 3 of my Titans do over 1500+ pretty easily.

Time to set my sights on the Fury non-X; now that could be interesting.


----------



## New green

Quote:


> Originally Posted by *pdasterly*
> 
> that sucks cause I sold my gpu and monitors back in feb, hoping for greener pastures. From consumer stand point the card needs more memory and 100-150 price break. I would still have to buy water block which adds to price, 980ti hydro copper looks appetizing. Im a amd fanboi too, well maybe not but nvidia its love the product, hate the price


From what I've been reading, the 4GB of VRAM isn't an issue when you look at actual memory used rather than allocated memory in modern titles. It may also be too early to want to replace the stock AIO, since drivers and OC utilities are not optimized yet and we don't really know how far the stock AIO can push the card when overclocked. If you're hoping for greener pastures, next year's 14nm parts with HBM2 are the true successor to GDDR5. I'm still not sure if I am going to buy the Fury X, though; seeing as it's out of stock on most sites, I will be waiting anyway.

The good thing about AMD versus NVIDIA is the track record for driver support on older cards. It is going to be very interesting to see how much NVIDIA is willing to support its old GDDR5 cards if they release low- to high-end HBM2 cards next year.


----------



## pdasterly

No room for an AIO, let alone two. I would have to plumb into existing radiators and fans.
Maybe if they can offer a card with a water block on it, like the Hydro Copper or Poseidon.


----------



## hyp36rmax

*+ Updated Reviews Post*

*Reviews:* Link

*Here's a run down of the Reviews so far:*

*AMD R9 Radeon Fury X Reviews*


| *Site Name* | *Article Name* | *Date* | *Link* |
|---|---|---|---|
| TechPowerUp | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Tom's Hardware | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Hardware Canucks | AMD Radeon R9 Fury X 4 GB | 06/24/15 | Link |
| Guru3D | The Radeon R9 Fury X Analyzed: AMD Unleashes Fury | 06/24/15 | Link |
| Bit-Tech | AMD Radeon R9 Fury X Review | 06/24/15 | Link |
| OverClock3D.net | AMD R9 Fury X Review | 06/24/15 | Link |
| HardOCP | AMD Radeon R9 Fury X Video Card Review | 06/24/15 | Link |
| Hexus | AMD Radeon R9 Fury X 4GB | 06/24/15 | Link |
| VMOD Tech | AMD RADEON™ R9 FURY X 4GB HBM 4096-bit Review | 06/24/15 | Link |
| TechFrag | AMD Radeon R9 Fury X Official Benchmark Results Released | 06/24/15 | Link |
| Hardware.FR | AMD Radeon R9 Fury X: the Fiji GPU and its HBM memory tested | 06/24/15 | Link |
| SweClockers | AMD Radeon R9 Fury X | 06/24/15 | Link |
| Hardware.INFO | AMD Radeon R9 Fury X review: AMD's new flagship graphics card | 06/24/15 | Link |
| Jagat Review | Review Radeon R9 Fury X: AMD Gaming VGA Best When It! | 06/24/15 | Link |
| HispaZone | AMD Radeon R9 Fury X Series | 06/24/15 | Link |
| Forbes | Radeon R9 Fury X Review: This Is AMD At Their Best | 06/24/15 | Link |
| PC Gamer | AMD Radeon R9 Fury X tested: not quite a 980 Ti killer | 06/24/15 | Link |
| TechReport | AMD's Radeon R9 Fury X graphics card reviewed: The red team vents its Fury | 06/24/15 | Link |
| PC World | AMD Radeon R9 Fury X graphics card review: AMD's long-awaited 4K powerhouse | 06/24/15 | Link |


----------



## DividebyZERO

Has anyone seen 5k reviews yet? I am looking for 5k and Crossfire reviews.


----------



## hyp36rmax

Quote:


> Originally Posted by *DividebyZERO*
> 
> Has anyone seen 5k reviews yet? I am looking for 5k and Crossfire reviews.


I have yet to see these with the one exception of Crossfire from Digital Storm. I'm sure within a week we will see more Crossfire results along with some re-reviews once AMD drops another driver update and Windows 10 next month.


----------



## xer0h0ur

Quote:


> Originally Posted by *New green*
> 
> From what I've been reading, the 4GB of VRAM isn't an issue when you look at actual memory used rather than allocated memory in modern titles. It may also be too early to want to replace the stock AIO, since drivers and OC utilities are not optimized yet and we don't really know how far the stock AIO can push the card when overclocked. If you're hoping for greener pastures, next year's 14nm parts with HBM2 are the true successor to GDDR5. I'm still not sure if I am going to buy the Fury X, though; seeing as it's out of stock on most sites, I will be waiting anyway.
>
> The good thing about AMD versus NVIDIA is the track record for driver support on older cards. It is going to be very interesting to see how much NVIDIA is willing to support its old GDDR5 cards if they release low- to high-end HBM2 cards next year.


Actually the Pascal die that Huang was showing off is TSMC's 16nm FinFET+


----------



## HiTechPixel

Quote:


> Originally Posted by *DividebyZERO*
> 
> Has anyone seen 5k reviews yet? I am looking for 5k and Crossfire reviews.


I am looking for 5K Crossfire reviews too. The results will decide if I go Titan X SLI or Fury X Crossfire since I have a 5K monitor.


----------



## doza

Just watched a Fury video review, and the first thing that came to mind was VRM temps.

A closed case with basically no way you could put a fan blowing over the card for safer temps (e.g. on the VRMs).
Fury is a monster, and pushing those VRMs with no active cooling is gonna do nothing good for the longevity of the card :S


----------



## New green

Quote:


> Originally Posted by *xer0h0ur*
> 
> Actually the Pascal die that Huang was showing off is TSMC's 16nm FinFET+


Ahh yeah, I goofed; AMD is using Samsung's 14nm while NVIDIA is with TSMC's 16nm.


----------



## xer0h0ur

Quote:


> Originally Posted by *doza*
> 
> Just watched a Fury video review, and the first thing that came to mind was VRM temps.
>
> A closed case with basically no way you could put a fan blowing over the card for safer temps (e.g. on the VRMs).
> Fury is a monster, and pushing those VRMs with no active cooling is gonna do nothing good for the longevity of the card :S


The copper pipe is making direct contact with the VRMs. I don't see how a fan could do any better than water cooling.


----------



## bastian

Quote:


> Originally Posted by *rdr09*
> 
> Do not go all the way back to the 320 driver.
>
> I find you more of a whiner than your 980 Ti.
>
> Forceman is getting mad. lol


LinusTechTips got a defective Fury X for their review:




But nice try, friend.


----------



## rdr09

Quote:


> Originally Posted by *bastian*
> 
> LinusTechTips got a defective Fury X for their review:
> 
> 
> 
> 
> But nice try, friend.


I thought you were done whining. Sorry for the folks who live with you. Alone, right?

edit: you can't find the 980 Ti club?


----------



## Allanitomwesh

Numbers on the benchmarks look wrong to me. Like the drivers are ****. Is that just me?


----------



## ondoy


oh the internet is full of....


----------



## skkane

Quote:


> Originally Posted by *bastian*
> 
> LinusTechTips got a defective Fury X for their review:
> .


Apparently that Linus guy dropped it when taking it out of his car, or so they are saying. That might have killed it... sensible either way.


----------



## szeged

Quote:


> Originally Posted by *skkane*
> 
> Apparently that Linus guy dropped it when taking it out of his car, or so they are saying. That might have killed it... sensible either way.


Who's saying that? lol


----------



## Agent Smith1984

I dunno, but I think Fury X broke OCN, lol

Hmmm, this is interesting....

Fury X is slightly ahead of Titan X in all of these charts.....
http://www.techpowerup.com/reviews/AMD/R9_Fury_X/33.html


----------



## zeppoli

Quote:


> Originally Posted by *Allanitomwesh*
> 
> Numbers on the benchmarks look wrong to me. Like the drivers are ****. Is that just me?


LOL, no. It's just: what else is left to blame it on? Fanboys can't fathom that the new card just isn't all that great, so we like to hope that it's the drivers.

Nothing wrong with the drivers; the card performs at a $500 price point, that's the only issue. Once they get to that price point they will have a strong card. They have no business at $650; that's the powerhouse category, like the 980 Ti.

In AMD's defense, maybe they didn't know about the Ti and thought they were only competing with the 980.
So they should have either lowered the price to compete with the 980, or built another card to compete with the 980 Ti.


----------



## Agent Smith1984

Quote:


> Originally Posted by *zeppoli*
> 
> LOL, no. It's just: what else is left to blame it on? Fanboys can't fathom that the new card just isn't all that great, so we like to hope that it's the drivers.
>
> Nothing wrong with the drivers; the card performs at a $500 price point, that's the only issue. Once they get to that price point they will have a strong card. They have no business at $650; that's the powerhouse category, like the 980 Ti.
>
> In AMD's defense, maybe they didn't know about the Ti and thought they were only competing with the 980.
> So they should have either lowered the price to compete with the 980, or built another card to compete with the 980 Ti.


You don't know any of what you said as fact though....

It could definitely be drivers....

It could also definitely be a shortcoming.

I say give it a few weeks.

Anybody in here wanting one is probably going to have to wait that long to get their hands on one anyways.


----------



## skkane

Quote:


> Originally Posted by *szeged*
> 
> whos saying that lol.


Multiple comments about it in that youtube vid.


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> LOL, no. It's just: what else is left to blame it on? Fanboys can't fathom that the new card just isn't all that great, so we like to hope that it's the drivers.
>
> Nothing wrong with the drivers; the card performs at a $500 price point, that's the only issue. Once they get to that price point they will have a strong card. They have no business at $650; that's the powerhouse category, like the 980 Ti.
>
> In AMD's defense, maybe they didn't know about the Ti and thought they were only competing with the 980.
> So they should have either lowered the price to compete with the 980, or built another card to compete with the 980 Ti.


I am going to report you for trolling if you keep this up. Get a life and go talk about your own tech.


----------



## pdasterly

Fury broke Overclock.net temporarily, lol


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You don't know any of what you said as fact though....
> 
> It could definitely be drivers....
> 
> It could also definitely be a shortcoming.
> 
> I say give it a few weeks.
> 
> Anybody in here wanting one is probably going to have to wait that long to get their hands on one anyways.


Even if it is just drivers though, is that really a good answer? They've been working on this card for ages, and you'd certainly think they'd have good drivers for the launch of their new flagship card. If you aren't going to pull out all the stops for this launch, then when are you?


----------



## semitope

If there wasn't a new non-beta driver for this card, then it very well could be drivers. Even if there was one released recently, it could still be. If it doesn't improve with DX12 and Windows 10, I'd start to worry.


----------



## Gumbi

Quote:


> Originally Posted by *zeppoli*
> 
> LOL, no. It's just: what else is left to blame it on? Fanboys can't fathom that the new card just isn't all that great, so we like to hope that it's the drivers.
>
> Nothing wrong with the drivers; the card performs at a $500 price point, that's the only issue. Once they get to that price point they will have a strong card. They have no business at $650; that's the powerhouse category, like the 980 Ti.
>
> In AMD's defense, maybe they didn't know about the Ti and thought they were only competing with the 980.
> So they should have either lowered the price to compete with the 980, or built another card to compete with the 980 Ti.


390/390x are competing with 980s now (very close to 980 in basically all cases). And for much less!!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Even if it is just drivers though, is that really a good answer? They've been working on this card for ages, and you'd certainly think they'd have good drivers for the launch of their new flagship card. If you aren't going to pull out all the stops for this launch, then when are you?


I agree with that 100%, and with that sentiment, I am accepting that this card could very well be a shortcoming for AMD.

Then again, with the 10% boost Hawaii just got from a slight OC and some major driver improvements, it could just as easily be bad drivers....

I wonder if AMD spent so much time worrying themselves with 4K performance on this card, that they failed to see the lackluster 1080/1440 performance right under their noses....


----------



## criminal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I agree with that 100%, and with that sentiment, I am accepting that this card could very well be a shortcoming for AMD.
> 
> Then again, with the 10% boost Hawaii just got from a slight OC and some major driver improvements, it could just as easily be bad drivers....
> 
> I wonder if AMD spent so much time worrying themselves with 4K performance on this card, t*hat they failed to see the lackluster 1080/1440 performance right under their noses*....


Which to me points to a driver issue.

I know it is not a good excuse for AMD, but I strongly believe it is the drivers.


----------



## TK421

Out of stock on Amazon....

in less than 12h


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> Even if it is just drivers though, is that really a good answer? They've been working on this card for ages, and you'd certainly think they'd have good drivers for the launch of their new flagship card. If you aren't going to pull out all the stops for this launch, then when are you?


Not from you? Even Maxwell is having driver issues. It happens with every new offering from both brands.

edit: zeppoli, yeah, you'd expect that kind of thinking. The guy is built more for the Xbone.


----------



## Hazardz

Having only read the [H] and TPU reviews so far, I'm disappointed in the Fury X. Sub-4K performance is abysmal for the price, and it should really have been priced sub-$550 to give NVIDIA some real competition.

At 4K, it's more or less similar to or better than the 980 Ti if you don't use AA, something I would do if I were playing at that resolution. I guess they justify the $650 MSRP for people looking to play only at 4K without AA and with DP 1.2, a very narrow market. Are they really planning to only sell a handful?


----------



## frunction

Quote:


> Originally Posted by *TK421*
> 
> out of stock in amazon....
> 
> less than 12h


I don't think they were ever in stock on Amazon.


----------



## Agent Smith1984

Quote:


> Originally Posted by *criminal*
> 
> Which to me points to a driver issue.
> 
> I know it is not a good excuse for AMD, but I strongly believe it is the drivers.


Could be the driver, or HBM finally showing up against NVIDIA's 384-bit bus at hi-res......

These cards are so powerful that at lower resolutions it's pure compute power.
With that being the case, NVIDIA is winning.... now whether it's an architecture thing or a driver overhead issue remains to be seen.

If you are running a 290/390 series card already, this card serves no purpose at 1080p;
Hawaii is good enough at that res, and this card is only marginally better.

At 1440p the 980 Ti is better for the price.

At 4K it's worth considering, but no single card is GREAT at 4K yet anyway, only good....

Also, if you are running Hawaii already, then adding a second GPU, or selling yours and adding cash for a 295X2 at its current $589-629 going rate, may be your best bet (unless you are completely against CF), because in all of these tests it is clearly still the top dog single-card solution (until we see a Titan X2 (or whatever) and a Fury X2).

The more I read, the more I am having trouble putting my finger on exactly what part of the market this card is supposed to represent.

Again, I can't totally pass judgement until we see if anything new develops from a driver/performance standpoint.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Could be the driver, or HBM finally showing up against NVIDIA's 384-bit bus at hi-res......
> 
> These cards are so powerful that at lower resolutions it's pure compute power.
> With that being the case, NVIDIA is winning.... now whether it's an architecture thing or a driver overhead issue remains to be seen.
> 
> If you are running a 290/390 series card already, this card serves no purpose at 1080p;
> Hawaii is good enough at that res, and this card is only marginally better.
> 
> At 1440p the 980 Ti is better for the price.
> 
> At 4K it's worth considering, but no single card is GREAT at 4K yet anyway, only good....
> 
> Also, if you are running Hawaii already, then adding a second GPU, or selling yours and adding cash for a 295X2 at its current $589-629 going rate, may be your best bet (unless you are completely against CF), because in all of these tests it is clearly still the top dog single-card solution (until we see a Titan X2 (or whatever) and a Fury X2).
> 
> The more I read, the more I am having trouble putting my finger on exactly what part of the market this card is supposed to represent.
> 
> Again, I can't totally pass judgement until we see if anything new develops from a driver/performance standpoint.


launch drivers . . .

http://www.3dmark.com/compare/fs/2814236/fs/1392805

http://www.3dmark.com/compare/3dm11/5059839/3dm11/4519473


----------



## magic8192

Anyone seen any reviews/benchmarks of the FURY X with compute/FAH/Boinc?


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> launch drivers . . .
> 
> http://www.3dmark.com/compare/fs/2814236/fs/1392805
> 
> http://www.3dmark.com/compare/3dm11/5059839/3dm11/4519473


Well, you certainly proved your point!









And I agree, it could be/probably totally is the drivers....

Just looking at it objectively, since I was expecting a lot more from this release.
I mean, how did AMD not have the driver right... they knew what they were up against. It's like they comfortably fell on their own sword.

And what's worse, is that if it is drivers, and they release some soon that drastically improve performance of the card, then that just leaves another bullet in the chamber for everyone who is constantly criticizing AMD about their driver support.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, you certainly proved your point!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I agree, it could be/probably totally is the drivers....
> 
> Just looking at it objectively, since I was expecting a lot more from this release.
> I mean, how did AMD not have the driver right... they knew what they were up against. It's like they comfortably fell on their own sword.
> 
> And what's worse, is that if it is drivers, and they release some soon that drastically improve performance of the card, then that just leaves another bullet in the chamber for everyone who is constantly criticizing AMD about their driver support.


you were not around here when the original Titan came out, huh?


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> you were not around here when the original Titan came out, huh?


Nah, but I read some of the bits about it.

I know this kind of thing happens.... it has been happening since I got my first graphics card....

It's why you never use the driver disc that comes in the box right? lol


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nah, but I read some of the bits about it.
> 
> I know this kind of thing happens.... it has been happening since I got my first graphics card....
> 
> *It's why you never use the driver disc that comes in the box right?* lol


zep will do that. lol


----------



## xer0h0ur

Did you forget about how close the 980 and 290X were when Maxwell launched? They pulled ahead through driver development just as I expect the Fiji cards to get better performance over time.


----------



## Alastair

I don't think the Fury clocks well at the moment because it's on locked voltage. I imagine that, to get as much power efficiency as possible, AMD probably used as low a voltage as they could plus a tiny bit more to maintain stability. I doubt AMD would call them an overclocker's dream if their in-house testing didn't show it. So I just think the fact that they are voltage-locked for now is what's holding them back.


----------



## billyboy8888

So what's the verdict on Fury X vs 980 Ti? Would the Fury X's OC capability improve later with driver and firmware updates?
I'm just a simple guy that wants a high-end GPU that will last me 1.5 - 2 years. I play most games and do some light video editing.
Should I buy the Fury X for its sheer GPU power and potential future improvements from drivers/firmware? Or should I just play it safe and get the 980 Ti?

Do you guys think future driver/firmware updates will make the Fury X much more powerful than the 980 Ti, just a tiny bit better, or only enough to catch up?


----------



## TK421

Quote:


> Originally Posted by *frunction*
> 
> I don't think they were ever in stock on Amazon.


Ah really? I saw Sapphire and XFX listed tho.


----------



## zeppoli

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am going to report you for trolling if you keep this up. Get a life and go talk about your own tech.


ummm??? please don't ? lol

honestly, you're not serious, after all this is a TECH forum, and we're talking about the latest tech.
the GPU competing with the GTX 980 Nvidia makes.

Y U SO UPSET


----------



## xer0h0ur

Quote:


> Originally Posted by *zeppoli*
> 
> they won't pull ahead though, this was way too big of a launch, AMD was working their tails off to make the perfect drivers.. this is what they got.
> Why would you be upset though? there is nothing wrong with the card, it performs fine and the drivers are fine, they just need to adjust pricing now and everything will be great


I am upset at people like you trolling. I have nothing invested into this launch nor will I. I had no intention of buying Maxwell or Fiji. I didn't say they would pull ahead. I am not clairvoyant. I have no idea how much subsequent drivers will help Fiji's performance.


----------



## Newbie2009

so has anyone actually pre ordered/bought one?


----------



## New green

Quote:


> Originally Posted by *billyboy8888*
> 
> So what's the verdict on Fury X vs 980 Ti? Would the Fury X's OC capability improve later with driver and firmware updates?
> I'm just a simple guy that wants a high-end GPU that will last me 1.5 - 2 years. I play most games and do some light video editing.
> Should I buy the Fury X for its sheer GPU power and potential future improvements from drivers/firmware? Or should I just play it safe and get the 980 Ti?
> 
> Do you guys think future driver/firmware updates will make the Fury X much more powerful than the 980 Ti, just a tiny bit better, or only enough to catch up?


I think the Fury X will definitely improve over time. One thing to keep in mind is both companies' approaches to supporting their older cards. With HBM2 next year, I can definitely see Nvidia gimping their GDDR5 line of cards to promote their HBM2 cards down the road, compared to how AMD has been known to support their older cards, especially if HBM2 completely takes over the entry-to-enthusiast markets. It will be interesting to see how long the transition from GDDR5 to HBM2 takes at the entry level.


----------



## rv8000

Quote:


> Originally Posted by *Newbie2009*
> 
> so has anyone actually pre ordered/bought one?


Yea, mine says "packaging" under order status on the Egg. Chances are it came from Cali though, and I cheaped out on shipping, so it probably won't come in until Tuesday unless they magically stocked the NJ warehouse with release-day products for once.


----------



## Agent Smith1984

Quote:


> Originally Posted by *zeppoli*
> 
> they won't pull ahead though, this was way too big of a launch, AMD was working their tails off to make the perfect drivers.. this is what they got.
> Why would you be upset though? there is nothing wrong with the card, it performs fine and the drivers are fine, they just need to adjust pricing now and everything will be great


Everything is not fine....

Moving the Fury X to the price point you are suggesting throws off the whole market!!

The vanilla Fury was meant to fill the price gap you are suggesting the X occupy... and it skews things down the line.

This card needs to at least compete within 2-4% of the 980 Ti and win sales through its better cooling and size.

That's why it is absolutely dire that AMD get the drivers dialed in. And I am sure they will. Right around the same time retailers get more cards in stock.









Edit:

Let me add.... I couldn't care less personally, but AMD needs that market share (it's not "fine" for them), as I'm not purchasing Maxwell or Fiji...

I decided to stick with the refreshed, and long refined Hawaii cards...








Yes, money was also a factor... but so was logic


----------



## xer0h0ur

Only thing is that AMD shot themselves in the foot on the high end. Damn near no one has a case that can accommodate more than two of these radiators. So nearly no one will be able to run three or four Fury X's. They are leaving that for people who put 3rd party full coverage waterblocks on their cards. I can't be any more disappointed by their decision to lock out AIBs from their own cooling solutions on Fury X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Only thing is that AMD shot themselves in the foot on the high end. Damn near no one has a case that can accommodate more than two of these radiators. So nearly no one will be able to run three or four Fury X's. They are leaving that for people who put 3rd party full coverage waterblocks on their cards. I can't be any more disappointed by their decision to lock out AIBs from their own cooling solutions on Fury X.


100% Agreed!

Hopefully the Fury is 3% slower and $100 cheaper, with plenty of 3rd-party air coolers and water blocks.

The problem with that, though, is that if they don't get this performance closer to the 980 Ti/TX, and they end up with $450-550 cards as their highest-tier products.... then their GPU division is going to look a lot like their CPU division...









Competing in the mid-to-high-end segment, offering good value, but at higher power usage... etc.... so on and so forth. You can see the similarities.


----------



## anotheraznguy

http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block#post_24085692

Luckily EK is introducing a single-slot waterblock for the Fury X, so those with the money to get a few GPUs won't have any issues.


----------



## p4inkill3r

Quote:


> Originally Posted by *Newbie2009*
> 
> so has anyone actually pre ordered/bought one?


----------



## Danio

Quote:


> Originally Posted by *p4inkill3r*


Wow, did you pay over $50 for shipping?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Danio*
> 
> Wow, did you pay over $50 for shipping?


When one drops 650 on the flagship GPU the day it is released, one does not simply get "FREE 4-7 day shipping"

I paid $27 to overnight my 390 from Jersey... was it a waste? Probably, but I wanted that joint son!!









BTW... Amazon is taxin mofo's now too...


----------



## szeged

probably tax + shipping.


----------



## p4inkill3r

Quote:


> Originally Posted by *Danio*
> 
> Wow, did you pay over $50 for shipping?


Free shipping, 8.25% tax.
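For context, a quick sketch of the arithmetic (assuming the $649.99 launch price discussed elsewhere in this thread) shows how an 8.25% sales tax alone accounts for the roughly $50 difference Danio noticed:

```python
# Hypothetical order math: Fury X at its $649.99 launch price
# with the 8.25% sales tax quoted above.
msrp = 649.99
tax_rate = 0.0825

tax = round(msrp * tax_rate, 2)    # sales tax in dollars
total = round(msrp + tax, 2)       # amount charged at checkout

print(tax, total)  # 53.62 703.61
```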


----------



## New green

Quote:


> Originally Posted by *anotheraznguy*
> 
> http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block#post_24085692
> 
> Luckily EK is introducing a single-slot waterblock for the Fury X, so those with the money to get a few GPUs won't have any issues.


Never buy reference cards. Will that single-slot waterblock be on all four Furys as well? I might consider the Nano once there's more info. It's going to be really interesting to see how far these can be pushed once OC utilities fully unlock them.


----------



## Ceadderman

Say what?!?









Reference cards are *always* preferable, especially if you wish to put an aftermarket block on your GPU.









Never buy Reference. What a riot!









~Ceadder


----------



## New green

Really? Oops, I'm new. I thought aftermarket Fury Xs would have better water blocks already attached.


----------



## Ceadderman

Nope. They have CLC loops attached. EK Waterblocks will have something to replace them in the not too distant future. It's always easier to find a block for a Reference Card over most of the manufacturer reworked cards.









~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Danio*
> 
> Wow, did you pay over $50 for shipping?
> 
> 
> 
> Free shipping, 8.25% tax.

Works out to be $80-$100 cheaper for Aussies to buy from Amazon, if the leaked pricing for here is to be believed.

Sigh.... guess I better start looking for a 4K monitor now.


----------



## hyp36rmax

Single Slot FURY X EK waterblock Sexiness!


----------



## boredmug

Considering this is new tech, I would assume it will get better with driver optimizations. I never jump on the newest GPUs when they come out, and I'm not one to upgrade every time something better comes out, so I'm going to pass it up and see what the node shrink and HBM2 bring. I'm running CF 290Xs and happy with their performance thus far. It's almost funny to me that the trolls come in here talking smack when really the figures aren't far off from the 980 Ti.

I WILL laugh when Win 10 and dx12 drops and it's a game changer...


----------



## TK421

Quote:


> Originally Posted by *New green*
> 
> Really? Oops I'm new. I thought aftermarket fury x's would have better water blocks already attached.


Just a question on the Fury X: what is the purpose of a metal coldplate like the ones on older-gen cards (GDDR5, separate VRM heatsink) when all of the major heat-generating components of the Fury are already covered by the water cooler?


----------



## szeged

Quote:


> Originally Posted by *boredmug*
> 
> Considering this is new tech i would assume it will get better with driver optimizations. I never jump on the newest gpu's when they come out and i'm not one to upgrade everytime something better comes out so i'm going to pass it up and see what the node shrink and hbm2 brings. I'm running CF 290x's and happy with their performance thus far. It's almost funny to me that the trolls come in here talking smack when really the figures aren't far off from 980ti.
> 
> I WILL laugh when Win 10 and dx12 drops and it's a game changer...


10 and 12 aren't only going to affect AMD performance, you do know that right?


----------



## DividebyZERO

Quote:


> Originally Posted by *hyp36rmax*
> 
> 
> 
> 
> Single Slot FURY X EK waterblock Sexiness!


I have to admit that is freaking awesome


----------



## Ceadderman

Sure he does. But I am reasonably sure that Fury X is somewhat limited at this time based on lack of support.









~Ceadder


----------



## rdr09

Quote:


> Originally Posted by *hyp36rmax*
> 
> 
> 
> 
> Single Slot FURY X EK waterblock Sexiness!


Nice.

Quote:


> Originally Posted by *boredmug*
> 
> Considering this is new tech i would assume it will get better with driver optimizations. I never jump on the newest gpu's when they come out and i'm not one to upgrade everytime something better comes out so i'm going to pass it up and see what the node shrink and hbm2 brings. I'm running CF 290x's and happy with their performance thus far. It's almost funny to me that the trolls come in here talking smack when really the figures aren't far off from 980ti.
> 
> I WILL laugh when Win 10 and dx12 drops and it's a game changer...


they want us to stop supporting amd and go . . .




----------



## Devnant

Purchasing a Fury X right now is certainly a gamble: a gamble that Win 10, DX12 and new drivers will change everything.

History doesn't always repeat itself, though.

Remember when the 290X launched and it outperformed the TITAN for half the price? The Fury X costs much more than half the TITAN X's price and hardly competes with NVIDIA's second most powerful GPU right now.

It certainly seems overpriced, especially considering all the extra features the 980 Ti brings to the table: 2GB more VRAM, HDMI 2.0 and DX12 feature level 12_1 support, just to name a few.


----------



## boredmug

Quote:


> Originally Posted by *szeged*
> 
> 10 and 12 aren't only going to affect AMD performance, you do know that right?


No, I understand that, but so far it looks like AMD has an advantage with the draw calls. It really doesn't matter to me, to be honest. If it fits my budget and plays my games at a good framerate, I really don't care if it benches at the top of the list. I usually pick up the uber-technophiles' cast-offs when the next best thing drops anyway. The price WILL drop on these and then they will become more attractive, whether or not it's king of the bench. Logically, I expect Nvidia to edge out AMD in most cases: huge R&D and market share, it makes sense. It's all about price point to me, and I expect the price on these cards to be adjusted.


----------



## rv8000

Quote:


> Originally Posted by *Devnant*
> 
> Purchasing a Fury X right now is certainly a gamble. A gamble Win 10, DX12 and new drivers will change everything.
> 
> History doesn´t always repeat itself, though.
> 
> Remember when the 290X launched and it outperformed the TITAN for half the price? Fury X is much more than half the TITAN X price and *hardly competes* with NVIDIA´s second most powerful GPU right now.
> 
> It certainly seems overpriced, specially considering all the extra features the 980 ti brings to the table: 2 more VRAM, 2.0 HDMI and Dx 12_1 support just to name a few.


Have parrots learned to type in recent years or something?

Several reviews show it matching the 980 Ti in some cases, in a few beating the 980 Ti and TX, 40-50% of the time being around 5% slower than the 980 Ti, and then maybe 10% of the time having a very odd performance gap of 25%, performing closer to the 980/390X. 4GB is plenty for 99% of users. It'll be a long time before we see a full DX12 game that utilizes everything it has to offer, and neither camp offers full DX12 support, so it's kind of pointless to argue that. Your best argument lies with the HDMI 2.0 issue, and unless you only plan on using a 4K TV, or have a 120Hz+ monitor without DP, it's not a HUGE deal imo.


----------



## gatygun

The card has a terrible name and is overpriced to hell.

4GB of VRAM isn't going to sell to the masses that want to invest $500+ in a premium video card, no matter what the speed is; the 290 series should have already showcased this.

AMD is interesting to most people as a solid alternative to Nvidia that pushes high performance at cheaper prices. The fact that the Fury X isn't pulling away from the 980 Ti, and that the 980 Ti is faster, is absolutely not going to sell this card. It makes it non-existent for people.

In my opinion, anybody with a 290 or 7970 is better off waiting, or simply Crossfiring their card to get the same or better results far more cheaply (if they can deal with Crossfire), and sitting this generation out entirely until both companies move to HBM2 next year, plus a die shrink, etc.

It also doesn't help AMD that, with a lackluster flagship, their normal 300 series looks completely useless and non-existent, because it's simply a rebrand and we already had those cards. If you wanted a 390X or 390 you would have gotten a 290/290X already and OC'ed it to get the same results. So who exactly are those cards for, especially with their premium price on top of it? The 390X and 390 are way too slow to make use of the 8GB of VRAM anyway.

In other words, the whole 300 series + Fury lineup seems to be far from what people wanted from AMD, and in my view it's almost non-existent because of this move. It also doesn't help that they are extremely late to the party; the 980 is already old news, and therefore the entire 300 series lineup is old news.

What they should have done:

1) Cut HBM from the Fury X / Fury, put GDDR5 on it and throw 8GB on it.
2) Sell it for $499 (Fury X) / $399 (Fury).
3) Rename the Fury X and Fury to 390X and 390, so that people actually feel the 300 series is not useless or non-existent; people know from the 200 series what to expect.
4) Release the 395X2 with water cooling and 2x8GB chips.

Then just sit this year out and release next year:

1) An HBM2 card with 8GB of VRAM or even more, and move into the 400 series.

In my opinion, as someone with a 290, I see no interest in this entire 300/Fury lineup as an upgrade, unless you upgrade every year, but even then the 980 Ti seems far more interesting to go for.


----------



## xer0h0ur

Quote:


> Originally Posted by *szeged*
> 
> 10 and 12 aren't only going to affect AMD performance, you do know that right?


You do realize that Maxwell 2 is limited to 31 asynchronous compute commands, versus Hawaii and Tonga being able to process 64 asynchronous compute commands simultaneously alongside the graphics queue? We don't even know yet if Fiji was packed with more ACEs. Hawaii and Tonga had 8 ACEs apiece, each of which runs up to 8 asynchronous compute commands. If AMD went whole hog and packed in more ACEs, then you might just eat your own words. DX12 favors parallel computing, and if there is one thing GCN is, it's highly parallel. Maxwell, not so much.

http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading
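A back-of-the-envelope sketch of the queue counts described above; the ACE and queue figures are the ones stated in this post (and the linked AnandTech article), not independently verified here:

```python
# Each GCN Asynchronous Compute Engine (ACE) manages several compute
# queues; the total async compute commands in flight is the product.
def total_compute_queues(aces: int, queues_per_ace: int) -> int:
    return aces * queues_per_ace

# Hawaii/Tonga, per the post: 8 ACEs x 8 queues apiece
gcn_queues = total_compute_queues(8, 8)

# Maxwell 2, per the post: 31 compute queues alongside the graphics queue
maxwell2_queues = 31

print(gcn_queues, maxwell2_queues)  # 64 31
```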


----------



## szeged

That's true, but really, is the answer for Fury to wait possibly another year for DX12 to get into full swing?

Hey guys, we delayed our cards for months, now just wait another few months for DX12 so the card is finally the Titan killer we wanted it to be.

By that time Pascal is gonna be starting, and Arctic Islands will hopefully be popping up soon with it, and Fury vs the Ti won't matter anymore.

"Just wait longer" shouldn't always be the answer from AMD.


----------



## djsatane

Quote:


> Originally Posted by *gatygun*
> 
> The card has a terrible name and is overpriced to hell.
> 
> 4GB of VRAM isn't going to sell to the masses that want to invest $500+ in a premium video card, no matter what the speed is; the 290 series should have already showcased this.
> 
> AMD is interesting to most people as a solid alternative to Nvidia that pushes high performance at cheaper prices. The fact that the Fury X isn't pulling away from the 980 Ti, and that the 980 Ti is faster, is absolutely not going to sell this card. It makes it non-existent for people.
> 
> In my opinion, anybody with a 290 or 7970 is better off waiting, or simply Crossfiring their card to get the same or better results far more cheaply (if they can deal with Crossfire), and sitting this generation out entirely until both companies move to HBM2 next year, plus a die shrink, etc.
> 
> It also doesn't help AMD that, with a lackluster flagship, their normal 300 series looks completely useless and non-existent, because it's simply a rebrand and we already had those cards. If you wanted a 390X or 390 you would have gotten a 290/290X already and OC'ed it to get the same results. So who exactly are those cards for, especially with their premium price on top of it? The 390X and 390 are way too slow to make use of the 8GB of VRAM anyway.
> 
> In other words, the whole 300 series + Fury lineup seems to be far from what people wanted from AMD, and in my view it's almost non-existent because of this move. It also doesn't help that they are extremely late to the party; the 980 is already old news, and therefore the entire 300 series lineup is old news.
> 
> What they should have done:
> 
> 1) Cut HBM from the Fury X / Fury, put GDDR5 on it and throw 8GB on it.
> 2) Sell it for $499 (Fury X) / $399 (Fury).
> 3) Rename the Fury X and Fury to 390X and 390, so that people actually feel the 300 series is not useless or non-existent; people know from the 200 series what to expect.
> 4) Release the 395X2 with water cooling and 2x8GB chips.
> 
> Then just sit this year out and release next year:
> 
> 1) An HBM2 card with 8GB of VRAM or even more, and move into the 400 series.
> 
> In my opinion, as someone with a 290, I see no interest in this entire 300/Fury lineup as an upgrade, unless you upgrade every year, but even then the 980 Ti seems far more interesting to go for.


Great post!


----------



## xer0h0ur

Quote:


> Originally Posted by *szeged*
> 
> That's true but really is the answer for fury to wait another year possibly for dx12 to get into full swing?
> 
> Hey guys we delayed our cards for months, now just wait another few months for dx12 so the card is finally the Titan killer we wanted it to be.
> 
> By that time pascal is gonna be starting and Arctic islands will be hopefully popping up soon with it and fury vs the ti won't matter anymore.
> 
> "just wait longer" shouldn't always be the answer from amd.


It's not their answer. In fact, it's the answer you're giving... I am skeptical anyway that Nvidia can do a complete 180 and go from the compute-gimped Maxwell to a compute-heavy Pascal while maintaining a lead. Only time will tell.

I do find it hilarious, though, that as soon as I point out Nvidia's distinct architectural disadvantage versus GCN with respect to DX12, people immediately say "well, it's not like it matters, DX12 is still not out yet and full support won't arrive till blah blah blah, and by then these cards will be useless because blah blah blah is out."


----------



## szeged

It's the answer I'm giving? Maybe you should re-read the past few pages. Fury isn't the Titan killer everyone wanted it to be, and people respond with "well, just wait till DX12 is out in a year and Fury will be so good." Yeah, because waiting a year is what we want. Anyway, I'm done with this little back-and-forth because it's really off topic; I'll read your response to this so you don't write it for nothing, but I won't reply, as I don't want to get the thread locked or cleaned.


----------



## szeged

Is it confirmed that full DX12 will launch with Windows 10? Also, can current AMD and Nvidia cards take full advantage of DX12?

Also, I have no bias; I was 100 percent ready to get a Fury today, but the lack of overclocking has turned me away for right now. I'll wait to see if the voltage gets unlocked, and if it does I'll get one. Also, tell my 2x 290X Lightning cards, 270X, HD 7870, HD 7970 and the 290X Matrix that's on the way that I'm biased. I do have 3 TXs in my most recent rig that I just built and I'm not hiding that, but that doesn't mean anything. Just because someone has a different view than you doesn't mean they are biased or a shill or a fanboy. You appear to have a strong liking for AMD and only want to see the good from AMD, but I'm not gonna call you a biased fanboy because of it. Save that for the trolls.


----------



## Kane2207

Quote:


> Originally Posted by *xer0h0ur*
> 
> We are like a month and change away from Windows 10 and DX12. There are DX12 games that are launching in months time. You're exaggerating so hard that your bias is showing. Try covering that up a little better bud.


C'mon, I want to see AMD do well, but the results point to an OK card that's below everyone's expectations given the hype. I see no real reason to purchase it over the competition, which is disappointing because I wanted a top-end card that wasn't Nvidia.

I am not willing to wait to see if, and it's a big _if_, drivers improve things, whether Windows 10 makes a difference, or if DX12 is a 'game changer'. Some of those things are so far on the horizon it would be stupid to shell out £500+ for a card on that basis. Speculative improvements in all of those elements are such an unknown quantity I may as well ask my cat what he thinks the performance will be like. It could be 5%, it could be 50%: proper finger-in-the-wind guesswork.

That's not how I'm going to decide how I spend my money.


----------



## xer0h0ur

You're welcome to look back at the various posts where I was quite critical of Fury X. I even said that if this was a boxing match that Nvidia's Titan X / 980 Ti won the first round. Hell I can list off all the stupid crap AMD has done here so you don't have to find the posts. #1 (In my opinion at least) Locking AIBs to water cooled reference design, #2 No HDMI 2.0 and to a lesser extent chopping off the dual link DVI (although this doesn't affect me and I am all for dropping DVI), #3 claiming to have the fastest card in the world and releasing benchmarks showing as much only to have it not even keep up with 980 Ti in a lot of scenarios, #4 launching Fury X with seemingly unpolished drivers giving questionable performance. I am sure I am forgetting other things.

The point is I am highly critical of them despite desperately wanting them to get a win. I don't want to see Nvidia's garbage practices become the industry norm. I would likely cease to be a PC gamer if Nvidia one day put ATI/AMD out of business in the dGPU market, because there is no way I would support them with my dollars. The constant lies and attempts to fracture the industry are things I can't look past. I have a deep-seated hate for them ever since they bought 3dfx and shelved its technology for years, and all of their shadiness in the recent past doesn't inspire any change in my stance either. It is what it is.

Note: I am not foolish enough to discredit that which I hate. I give credit where its due, even if I don't like it.


----------



## rv8000

Quote:


> Originally Posted by *Kane2207*
> 
> *cough* at stock *cough*


Regardless, its not like it's a slaughter fest. $599 would be a better price imo, but I doubt AMD can actually price it much lower as the 980ti came in at much lower than expected with basically Titan X performance. Judging from Hawaii I figured there wouldn't be much oc headroom, and I have a bad feeling it won't be much more than 1200-1250. I don't really care which card is ultimately the fastest but you really have to wonder whats up with drivers when there are those cases where it matches or beats a ti/tx, and then all of the sudden on another test it's 30% behind; I don't expect driver miracles but some games show it's clearly capable of matching the 980ti and TX.

To the other person who quoted me: we could play the pick-a-benchmark-in-your-argument's-favor game all day; it is clear in several reviews that Fury X can match or slightly outpace the Ti/TX and then fall far behind in others.


----------



## semitope

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're welcome to look back at the various posts where I was quite critical of Fury X. I even said that if this were a boxing match, Nvidia's Titan X / 980 Ti won the first round. Hell, I can list off all the stupid crap AMD has done here so you don't have to find the posts. #1 (in my opinion at least) locking AIBs to the water-cooled reference design, #2 no HDMI 2.0 and, to a lesser extent, chopping off the dual-link DVI (although this doesn't affect me and I am all for dropping DVI), #3 claiming to have the fastest card in the world and releasing benchmarks showing as much, only to have it not even keep up with the 980 Ti in a lot of scenarios, #4 launching Fury X with seemingly unpolished drivers giving questionable performance. I am sure I am forgetting other things.
> 
> The point is, I am highly critical of them despite desperately wanting them to get a win. I don't want to see Nvidia's garbage practices become the industry norm. I would likely cease to be a PC gamer if Nvidia one day put ATI/AMD out of business in the dGPU market, because there is no way I would support them with my dollars. The constant lies and attempts to fracture the industry are things I can't look past. I have had a deep-seated hatred for them since they bought 3DFX and shelved its technology for years, and all of their shadiness in the recent past doesn't inspire any change in my stance either. It is what it is.
> 
> Note: I am not foolish enough to discredit that which I hate. I give credit where it's due, even if I don't like it.


Similar, but I don't so much hate Nvidia as I don't want them to use my money to damage PC gaming. It makes no sense to support a company that's holding the industry back with its practices. I also don't trust them after my 970 turned out to be a lie. Even now I wonder if they aren't messing with pixels/color in their cards to improve performance. No trust, man.


----------



## xer0h0ur

Quote:


> Originally Posted by *rv8000*
> 
> Regardless, it's not like it's a slaughter fest. $599 would be a better price imo, but I doubt AMD can actually price it much lower, as the 980 Ti came in much lower than expected with basically Titan X performance. Judging from Hawaii I figured there wouldn't be much OC headroom, and I have a bad feeling it won't be much more than 1200-1250. I don't really care which card is ultimately the fastest, but you really have to wonder what's up with the drivers when there are cases where it matches or beats a Ti/TX and then all of a sudden on another test it's 30% behind; I don't expect driver miracles, but some games show it's clearly capable of matching the 980 Ti and TX.
> 
> To the other person who quoted me: we could play the pick-a-benchmark-in-your-argument's-favor game all day; it is clear in several reviews that Fury X can match or slightly outpace the Ti/TX and then fall far behind in others.


I actually shook my head when an AMD guy said that it's necessary to optimize for HBM with the Fiji line. As if AMD needs another thing they have to "optimize" their drivers for. I was under the impression that any memory technology upgrade was an automatic thing that never required driver shenanigans to operate at its full potential. So that leads me to believe one of two things: either the optimizations they are referring to are for the sake of making the 4GB limitation work for them, or HBM in and of itself needs driver help to operate properly no matter the capacity. If it's the latter, then oh freakin' boy. People are in for headaches.


----------



## Kane2207

Quote:


> Originally Posted by *semitope*
> 
> Similar, but I don't so much hate Nvidia as I don't want them to use my money to damage PC gaming. It makes no sense to support a company that's holding the industry back with its practices. I also don't trust them after my 970 turned out to be a lie. Even now I wonder if they aren't messing with pixels in their cards to improve performance. No trust, man.


Devs and publishers are ruining the industry all by themselves; I'm not sure they need Nvidia's help.


----------



## Forceman

Quote:


> Originally Posted by *rv8000*
> 
> Regardless, it's not like it's a slaughter fest. $599 would be a better price imo, but I doubt AMD can actually price it much lower, as the 980 Ti came in much lower than expected with basically Titan X performance. Judging from Hawaii I figured there wouldn't be much OC headroom, and I have a bad feeling it won't be much more than 1200-1250. I don't really care which card is ultimately the fastest, but you really have to wonder what's up with the drivers when there are cases where it matches or beats a Ti/TX and then all of a sudden on another test it's 30% behind; I don't expect driver miracles, but some games show it's clearly capable of matching the 980 Ti and TX.
> 
> To the other person who quoted me: we could play the pick-a-benchmark-in-your-argument's-favor game all day; it is clear in several reviews that Fury X can match or slightly outpace the Ti/TX and then fall far behind in others.


What if, in those cases where it's lagging far behind and close to the 390X, it isn't drivers but a ROP limit instead? Fury has basically the same pixel fill rate as Hawaii, which could be a big limitation in certain games. After all, that was one of the reasons given for why the 290X was better than Kepler in 4K benches.
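The fill-rate parity is easy to check with back-of-the-envelope arithmetic (peak pixel fill rate is just ROPs × core clock). A rough sketch, using approximate reference clocks rather than measured boost behavior:

```python
# Back-of-the-envelope pixel fill rate: ROPs x core clock.
# The spec figures below are approximate reference values, not measurements.
def pixel_fill_gpps(rops, clock_mhz):
    """Peak pixel fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

cards = {
    "Fury X (Fiji)":  (64, 1050),
    "290X (Hawaii)":  (64, 1000),
    "980 Ti (GM200)": (96, 1000),  # ~96 ROPs; boost clocks run higher
}

for name, (rops, mhz) in cards.items():
    print(f"{name}: {pixel_fill_gpps(rops, mhz):.1f} GP/s")
# Fiji lands within ~5% of Hawaii despite having ~45% more shaders.
```

So on paper Fiji's ROP throughput barely moved relative to Hawaii, which is what makes the ROP-limit theory plausible in fill-rate-heavy titles.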


----------



## xer0h0ur

There we go, I knew I had forgotten something in my complaints. They increased the texture mapping units quite a bit but left the ROP count the same as Hawaii? That literally made no sense to me. Didn't they also leave die space unused, too? As in, the die could have been bigger?


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> I actually shook my head when an AMD guy said that it's necessary to optimize for HBM with the Fiji line. As if AMD needs another thing they have to "optimize" their drivers for. I was under the impression that any memory technology upgrade was an automatic thing that never required driver shenanigans to operate at its full potential. So that leads me to believe one of two things: either the optimizations they are referring to are for the sake of making the 4GB limitation work for them, or HBM in and of itself needs driver help to operate properly no matter the capacity. If it's the latter, then oh freakin' boy. People are in for headaches.


Well, you have to consider that it's not like going from GDDR3 to GDDR5, where speed, timings, and power usage are the changes while the memory interface design stays fairly similar. This is a major architectural change to how GCN interfaces with its memory system, so optimization for HBM isn't an idea I'd throw out the window, but I'm not going to cling to predicting miracles from it either.

OT comment to people in general: I really have to question some people's critical thinking skills; we can't look at one review and dismiss all the others just because it favors our view. We'll never get the whole picture that way.
Quote:


> Originally Posted by *Forceman*
> 
> What if, in those cases where it's lagging far behind and close to the 390X, it isn't drivers but a ROP limit instead? Fury has basically the same pixel fill rate as Hawaii, which could be a big limitation in certain games. After all, that was one of the reasons given for why the 290X was better than Kepler in 4K benches.


True, I forgot to think about this. I'll have to look at some benchmarks between the 980 and 390X more closely, then.


----------



## Forceman

They said the die was as big as it could be, within about 4 mm² I think, but maybe they could have ditched the TrueAudio DSP? I don't know how much space that takes up, though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> What if, in those cases where it's lagging far behind and close to the 390X, it isn't drivers but a ROP limit instead? Fury has basically the same pixel fill rate as Hawaii, which could be a big limitation in certain games. After all, that was one of the reasons given for why the 290X was better than Kepler in 4K benches.


This is a huge part of it, buddy! Good call!

Look at Nvidia's ROP counts.


----------



## bobbavet

Well a bit ordinary imo.

If there was to be a 300 series, I really believe that Fury X should have been the new 390/390x, priced and hyped accordingly.

Move all the cards down the line and beat Nvidia at every price point. Headlines would read "Well played AMD, well played indeed."

So much time, effort, money and hype on what can only be described as a "non-event" for the 300 series and "close but no cigar" at the enthusiast level.

Still keen to see Xfire and the Fury X2. No single GPU is tickling my 4K fancy.

Found some prelim Xfire Fury.

AMD Fury X CrossFire Gaming Benchmarks vs SLI Titan X

No doubt AMD will price itself out of sales. Delusional about "Titan Killer" rather than the 295X2 price range it deserves.


----------



## xer0h0ur

Quote:


> Originally Posted by *rv8000*
> 
> Well, you have to consider that it's not like going from GDDR3 to GDDR5, where speed, timings, and power usage are the changes while the memory interface design stays fairly similar. This is a major architectural change to how GCN interfaces with its memory system, so optimization for HBM isn't an idea I'd throw out the window, but I'm not going to cling to predicting miracles from it either.
> 
> OT comment to people in general: I really have to question some people's critical thinking skills; we can't look at one review and dismiss all the others just because it favors our view. We'll never get the whole picture that way.
> True, I forgot to think about this. I'll have to look at some benchmarks between the 980 and 390X more closely, then.


Right, I just have little faith in AMD's driver team as it is, and now that every game needs individual HBM optimization, things are not looking so good to me anymore.


----------



## Kane2207

Quote:


> Originally Posted by *xer0h0ur*
> 
> Right, I just have little faith in AMD's driver team as it is, and now that every game needs individual HBM optimization, things are not looking so good to me anymore.


Ah, you're damned if you do and damned if you don't. I'm still waiting for Nvidia's driver team to get their heads around hardware-accelerated web browsing.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> Right, I just have little faith in AMD's driver team as it is, and now that every game needs individual HBM optimization, things are not looking so good to me anymore.


*Shrugs*

Neither camp has been all that great with drivers lately. It doesn't help that the majority of games are ports outsourced to another studio, either. It really isn't a good time to be a PC gamer, but has it ever been? It seems as a community we're always super disappointed about something or other.


----------



## bobbavet

So HBM needs optimizations and updates per game. More wait time. *sigh*

I feel like "Padme" on the landing pad of Mustafar.

"AMD you are breakin my heart, you are going down a path I cannot follow" lols

Well, I probably will go down that path later but implemented better elsewhere.

Finding an "unloved" 295x2 looks good to me.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bobbavet*
> 
> Well a bit ordinary imo.
> 
> If there was to be a 300 series, I really believe that Fury X should have been the new 390/390x, priced and hyped accordingly.
> 
> Move all the cards down the line and beat Nvidia at every price point. Headlines would read "Well played AMD, well played indeed."
> 
> So much time, effort, money and hype on what can only be described as a "non event" on the 300 series and "close but no cigar" at enthusiast.
> 
> Still keen to see Xfire and Furyx2. No single GPU is tickling my 4k fancy.
> 
> Found some prelim Xfire Fury.
> 
> AMD Fury X CrossFire Gaming Benchmarks vs SLI Titan X
> 
> No doubt AMD will price itself out of sales. Delusional on "Titan Killer" rather than the 295x2 price range it deserves.


To be honest....
300 series has competed where it was supposed to so far....

Just sayin'


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> What if, in those cases where it's lagging far behind and close to the 390X, it isn't drivers but a ROP limit instead? Fury has basically the same pixel fill rate as Hawaii, which could be a big limitation in certain games. After all, that was one of the reasons given for why the 290X was better than Kepler in 4K benches.


Thinking that over, wouldn't the opposite be the case, with the Fury and 290/390X having declining performance as the number of pixels increases with resolution?


----------



## bobbavet

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To be honest....
> 300 series has competed where it was supposed to so far....
> 
> Just sayin'


Yeah, but why kick your opponent in the shins when you could punch them in the nuts?

Just sayin. lols


----------



## xer0h0ur

Yup, that is what they said. Like I mentioned though, I have no idea if it's due to the 4GB limitation or if it's simply a thing both Nvidia and AMD are going to have to deal with going forward due to the nature of how HBM operates.


----------



## th3illusiveman

Quote:


> Originally Posted by *bobbavet*
> 
> Well a bit ordinary imo.
> 
> If there was to be a 300 series, I really believe that Fury X should have been the new 390/390x, priced and hyped accordingly.
> 
> Move all the cards down the line and beat Nvidia at every price point. Headlines would read "Well played AMD, well played indeed."
> 
> So much time, effort, money and hype on what can only be described as a "non event" on the 300 series and "close but no cigar" at enthusiast.
> 
> Still keen to see Xfire and Furyx2. No single GPU is tickling my 4k fancy.
> 
> Found some prelim Xfire Fury.
> 
> AMD Fury X CrossFire Gaming Benchmarks vs SLI Titan X
> 
> No doubt AMD will price itself out of sales. Delusional on "Titan Killer" rather than the 295x2 price range it deserves.


I agree with you 100%. This card should never have been called Fury X or whatever, but rather 390X, and it should have been priced at either $550 or $600; it would have gotten decent reviews. The Fury non-X should have been the 390 and, depending on performance, could have been between $450 and $500, and it too would have been well received. The 290X should have been moved down to 380X while being priced the same as the 970, etc.

They have solid hardware, as they usually do on the GPU side; it's the idiotic management AGAIN that screwed up, and it AGAIN will cost them market share and sales. They need some massive corporate restructuring, because the idiots running it right now are running them into the ground and betraying the work their excellent engineers create.


----------



## Ceadderman

Say what?

Why would *anybody* name it 390?

We know the 390 is a rebranded 290, and the 385 will likely be the 285.

It's a completely new architecture, so AMD appropriately named it. It's just that there is no current support for it outside of AMD and the upcoming DX12.

One question however...

Who from AMD, and when, marketed the card as a 980/Titan killer? I haven't seen anything of the kind out there. It just might be, but we as a community need to learn to temper such expectations before AMD launches anything, and wait for support for the new architecture to get where it needs to be before we'll know for sure.

~Ceadder


----------



## Agent Smith1984

Hmmmmm

More shaders but no increase in architecture performance per shader....

Just adding more shaders.... Where else has AMD used this tactic?

Guys... I'm a huge AMD supporter... And I'm more than happy with my 390 purchase and my fx-8300, but I'm starting to see a pattern here


----------



## szeged

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmmmmm
> 
> More shaders but no increase in architecture performance per shader....
> 
> Just adding more shaders.... Where else has AMD used this tactic?
> 
> Guys... I'm a huge AMD supporter... And I'm more than happy with my 390 purchase and my fx-8300, but I'm starting to see a pattern here


AMD's GPU team is borrowing ideas from their CPU team.

Keep adding more coressssssssssssss


----------



## bobbavet

Quote:


> Originally Posted by *Ceadderman*
> 
> Say what?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why would *anybody* name it 390?
> 
> We know the 390 is a rebranded 290, and the 385 will likely be the 285.
> 
> It's a completely new architecture, so AMD appropriately named it. It's just that there is no current support for it outside of AMD and the upcoming DX12.
> 
> ~Ceadder


So where is the new arch / new series name "rule book"? Who wrote it?

AMD were thinking out of the "wrong square" imo.

When I said "Titan Killer" I was referring to the impending Fury X2; read again.


----------



## xer0h0ur

Yeah, it's a beefed-up card, but it's by no means a new architecture. It remains GCN. If it were a complete departure from GCN, then you would have a case to say that. But it's not. It remains part of the GCN family.


----------



## bobbavet

How is Fiji with a different memory interface a complete arch change?


----------



## Ceadderman

It *is* new architecture when the HBM is on the same package as the GPU. Is everything else new? No. But it's hair-splitting to suggest that it's anything but new architecture, since there's nothing else like it on the market from nVidia. Right?

Embrace the dark side.

~Ceadder


----------



## semitope

Quote:


> Originally Posted by *xer0h0ur*
> 
> I actually shook my head when an AMD guy said that it's necessary to optimize for HBM with the Fiji line. As if AMD needs another thing they have to "optimize" their drivers for. I was under the impression that any memory technology upgrade was an automatic thing that never required driver shenanigans to operate at its full potential. So that leads me to believe one of two things: either the optimizations they are referring to are for the sake of making the 4GB limitation work for them, or HBM in and of itself needs driver help to operate properly no matter the capacity. If it's the latter, then oh freakin' boy. People are in for headaches.


It's said that the driver manages VRAM usage. In that case, yeah, it would likely need new drivers.


----------



## freezer2k

No,

It's really just very old GCN with HBM hacked into it.

This is why we see 450W TDP at stock voltage!
Yes this is with Furmark, but it really shows that they basically just added more of the same old GCN cores.


----------



## TK421

Quote:


> Originally Posted by *freezer2k*
> 
> No,
> 
> It's really just very old GCN with HBM hacked into it.
> 
> This is why we see 450W TDP at stock voltage!
> Yes this is with Furmark, but it really shows that they basically just added more of the same old GCN cores.


It's unwise to say that Furmark demonstrates true TDP; it's just a power virus.


----------



## Ceadderman

8.6 TFLOPS? That's a lot of data flow.
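For what it's worth, that 8.6 figure is just the usual peak single-precision math (shaders × 2 FLOPs per clock for a fused multiply-add × clock speed); a quick sketch using the reference spec numbers:

```python
# Peak FP32 throughput in TFLOPS: shaders x 2 FLOPs/clock (FMA) x clock.
# Shader counts and clocks are reference specs, not sustained real-world rates.
def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

print(peak_tflops(4096, 1050))  # Fury X: 4096 shaders @ 1050 MHz -> ~8.6
print(peak_tflops(2816, 1000))  # 290X for comparison -> ~5.6
```

It's a theoretical ceiling, of course; actual game throughput depends on keeping all those shaders fed.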









~Ceadder


----------



## DividebyZERO

Quote:


> Originally Posted by *HiTechPixel*
> 
> I am looking for 5K Crossfire reviews too. The results will decide if I go Titan X SLI or Fury X Crossfire since I have a 5K monitor.


The only thing stopping me right now is the 4GB VRAM and unknowns of overclocking/crossfire. Anyone interested in memory though might want to look at *this*


----------



## semitope

Quote:


> Originally Posted by *freezer2k*
> 
> No,
> 
> It's really just very old GCN with HBM hacked into it.
> 
> This is why we see 450W TDP at stock voltage!
> Yes this is with Furmark, but it really shows that they basically just added more of the same old GCN cores.


I think both AMD and Nvidia limit the effect of Furmark; it just happens that AMD limits it less?

Anyway, it was 350 to 370 W on Furmark. The same kind of thing happens with Maxwell on Furmark, and an overclocked 980 Ti can also get close to 300 W in gaming.


----------



## gatygun

Quote:


> Originally Posted by *DividebyZERO*
> 
> The only thing stopping me right now is the 4GB VRAM and unknowns of overclocking/crossfire. Anyone interested in memory though might want to look at *this*


I wonder how they recorded the VRAM usage, because most of the leftover RAM gets used as cache anyway and mostly isn't actually needed in games.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DividebyZERO*
> 
> The only thing stopping me right now is the 4GB VRAM and unknowns of overclocking/crossfire. Anyone interested in memory though might want to look at *this*


Where'd you find this?
Edit: nm, found link, thanks!

Shadow is an AMD favored title, but very VRAM intensive...

Also, any word on skipping yet? VRAM max doesn't always show up in fps, but rather comes as a stutter...


----------



## gatygun

Quote:


> Originally Posted by *th3illusiveman*
> 
> I agree with you 100%. This card should never have been called Fury X or whatever, but rather 390X, and it should have been priced at either $550 or $600; it would have gotten decent reviews. The Fury non-X should have been the 390 and, depending on performance, could have been between $450 and $500, and it too would have been well received. The 290X should have been moved down to 380X while being priced the same as the 970, etc.
> 
> They have solid hardware, as they usually do on the GPU side; it's the idiotic management AGAIN that screwed up, and it AGAIN will cost them market share and sales. They need some massive corporate restructuring, because the idiots running it right now are running them into the ground and betraying the work their excellent engineers create.


The main reason they did this is to inflate their prices right out of the gate.

If they had asked 650 euros for a 390X, people would just lol at it and say, "I'll pick it up a year later for half the price."

This is also why Nvidia releases its first 900 series card as a weaker card and holds back its real top-end card for a later revision on a cheaper solution. The real 980 was basically the Titan X, but since there is no competition they can release a cut-down version for cheaper and call it a 980 Ti.


----------



## Casey Ryback

Quote:


> Originally Posted by *gatygun*
> 
> If they had asked 650 euros for a 390X, people would just lol at it and say, "I'll pick it up a year later for half the price."


I think this is actually half the problem with the market at the moment. Companies can just gouge customers.

Nobody boycotts anything anymore or protests paying huge amounts for products; they just lap it up. It's about being the coolest cat on the block or something.

Here's one of the biggest online retailers in Australia:

980ti's all but one or two sold out.

http://www.pccasegear.com/index.php?main_page=index&cPath=193_1766

AMD fury X.....sold out instantly.

http://www.pccasegear.com/index.php?main_page=index&cPath=193_1774

It just makes you wonder, really: can we actually blame the companies for the high prices?

If you can set a higher price and still sell 98% as many units... it's a win for them, right?


----------



## xer0h0ur

Quote:


> Originally Posted by *freezer2k*
> 
> No,
> 
> It's really just very old GCN with HBM hacked into it.
> 
> This is why we see 450W TDP at stock voltage!
> Yes this is with Furmark, but it really shows that they basically just added more of the same old GCN cores.


Except GCN has been updated throughout its life cycle. Calling GCN old is like calling a Navy SEAL old: it doesn't change the fact he can still kick butt. As for Furmark, it's a meaningless tool used by some to push a card to its limits. It's not representative of performance under gaming loads at all.


----------



## bobbavet

Maybe AMD are trolling. It does have a dual-BIOS switch.
Who knows what shenanigans can be gotten up to.


----------



## weinstein888

I'm so conflicted on two of these vs. two 980 Tis. I run 5760x1080 at the moment and I think they'd be about neck and neck for the majority of games. I play GTA V though, and it looks like, unless AMD releases some beastly driver updates, the green team is taking that one. I just don't want to support Nvidia at the moment... decisions, decisions. What's stopping me is 1. overclocking, 2. the stubby PCB that'll look silly in my TJ11, 3. 4GB. Anyone know when more of these will be available, anyway?


----------



## Casey Ryback

Quote:


> Originally Posted by *weinstein888*
> 
> I'm so conflicted on two of these vs. two 980ti. I run 5760x1080 at the moment and I think they'd be about neck and neck for the majority of games. I play GTA V though and it looks like, unless AMD releases some beastly driver updates, the green team is taking that one. I just don't want to support Nvidia at the moment...decisions, decisions. What's stopping me is 1. Overclocking, 2. Stubby PCB that'll look silly in my TJ11, 3. 4GB. Anyone know when more of these will be available anyway?


If you're considering a Fury card, then maybe wait for the release of the air-cooled Fury coming soon.

This should also give some time to see any driver updates and unlocking of voltage and OC potential.


----------



## gatygun

Quote:


> Originally Posted by *Casey Ryback*
> 
> I think this is actually half the problem with the market at the moment. Companies can just gouge customers.
> 
> Nobody boycotts anything anymore or protests to paying huge amounts for products, they just lap it up. it's about being the coolest cat on the block or something.
> 
> here's one of the biggest online retailers in australia...........
> 
> 980ti's all but one or two sold out.
> 
> http://www.pccasegear.com/index.php?main_page=index&cPath=193_1766
> 
> AMD fury X.....sold out instantly.
> 
> http://www.pccasegear.com/index.php?main_page=index&cPath=193_1774
> 
> It just makes you wonder really, can we blame the actual companies for the high prices.


Well, the 980 Ti is basically the only card at the moment that really pushes numbers forward for a "somewhat affordable price". The Fury X also probably has a limited supply currently.

But then, there are a ton of people who have no clue what they are doing and just buy whatever is newest.

I saw a crap ton of people buy a 7990 at a shop where I worked for a while who had no clue it was a dual-GPU card and just wanted to put it in a tri-fire / 4-way setup.

I know people who have been playing with 3-4 GPUs in borderless windowed mode on custom loop cooling since launch day and have no clue that the entire time they've only used one GPU, because multi-GPU doesn't work in anything other than full screen. These people also tend to never update their drivers and miss a ton of CF profiles or any optimization at all.

I know review sites that, to this date, still test games (like a Fury X / Titan X / 980 Ti / 295X2 benchmark) with the 14.12 driver, pick The Witcher 3, max every setting, leave the tessellation at 64, and then talk about how bad AMD's performance in the game is.

Enough clueless people in the world, that's for sure. My mind was blown multiple times by people who bought expensive stuff and never ended up using it, lol.

In the end, I could see people who buy expensive FreeSync or G-Sync monitors upgrading earlier with the same GPU manufacturer they currently have, as they would otherwise also have to change their display. Especially if you've just invested a few hundred euros/dollars (more like 500+) in a new monitor.


----------



## Casey Ryback

Quote:


> Originally Posted by *gatygun*
> 
> I know people who have been playing with 3-4 GPUs in borderless windowed mode on custom loop cooling since launch day and have no clue that the entire time they've only used one GPU, because multi-GPU doesn't work in anything other than full screen. These people also tend to never update their drivers and miss a ton of CF profiles or any optimization at all.
> 
> I know review sites that, to this date, still test games (like a Fury X / Titan X / 980 Ti / 295X2 benchmark) with the 14.12 driver, pick The Witcher 3, max every setting, leave the tessellation at 64, and then talk about how bad AMD's performance in the game is.


----------



## weinstein888

Quote:


> Originally Posted by *Casey Ryback*
> 
> If you're considering a Fury card, then maybe wait for the release of the air-cooled Fury coming soon.
> 
> This should also give some time to see any driver updates and unlocking of voltage and OC potential.


Isn't the air-cooled Fury supposed to have fewer stream processors and a lower clock speed out of the box? Either way it's all going on water, so I don't really care what the cooler looks like. Is it possible that the Fury non-X will perform better than the X, kind of like Ti vs TX? I'm just gonna wait for driver updates in the coming weeks and hope that it changes. I really don't want to buy Nvidia this go-around, so I'm hoping with all my heart that the numbers go up.


----------



## Casey Ryback

Quote:


> Originally Posted by *weinstein888*
> 
> Isn't the air-cooled Fury supposed to have fewer stream processors and a lower clock speed out of the box? Is it possible that the Fury non-X will perform better than the X, kind of like Ti vs TX?


I think it's possible that it will be very close to Fury X performance, just as with the 7950/7970, the 290/290X and, as you say, the Ti and TX.

Often the average OC potential seems to be slightly higher on the cut-down GPUs, right?


----------



## Agent Smith1984

6/24/15
The day AMD almost broke the internet!


----------



## svenge

Quote:


> Originally Posted by *Ceadderman*
> 
> We know that 390 is rebranded 290. And 385 will likely be 285.


Wasn't the 285 "Tonga" rebranded as the 380 "Antigua" already?


----------



## Forceman

Quote:


> Originally Posted by *rv8000*
> 
> Thinking that over, wouldn't the opposite be the case, with the fury and 290/390x have declining performance as the amount of pixels increase with resolution?


My theory is that at lower resolutions the Fury is running up against the ROP limit (because it has ample shaders), while the Hawaii cards are more evenly balanced. Then at higher resolutions the Fury is able to keep close to the ROP limit while the Hawaii cards start to fall behind (because of fewer shaders). So the Fury is ROP limited at lower resolutions, while Hawaii is shader limited at higher ones. The 980 Ti has more ROPs so it is more likely shader limited at all resolutions, and so shows better scaling. But I'm not real strong on the pipeline stuff, so that may be totally off-base.
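One hedged way to sanity-check that theory is a toy bottleneck model: achievable pixel throughput is the lesser of what the ROPs can write and what the shaders can compute. The fill rates and TFLOPS below are rough reference specs, and `overdraw` and `flops_per_pixel` are made-up workload knobs, so treat this as an illustration of the mechanism rather than a prediction:

```python
# Toy bottleneck model: throughput = min(ROP write rate, shader compute rate).
# 'overdraw' (writes per screen pixel) and 'flops_per_pixel' are assumed knobs.
def pixel_throughput_gpps(fill_gpps, tflops, flops_per_pixel, overdraw=4.0):
    rop_cap = fill_gpps / overdraw                  # GP/s the ROPs can sustain
    shader_cap = tflops * 1000.0 / flops_per_pixel  # GP/s the shaders can feed
    return min(rop_cap, shader_cap)

fury_x = (67.2, 8.6)   # (fill rate GP/s, FP32 TFLOPS), reference specs
hawaii = (64.0, 5.6)

for flops_px in (100, 400, 1000):   # light vs heavy per-pixel shading
    f = pixel_throughput_gpps(*fury_x, flops_px)
    h = pixel_throughput_gpps(*hawaii, flops_px)
    print(f"{flops_px:4d} FLOPs/px: Fury X {f:5.1f} GP/s vs Hawaii {h:5.1f} GP/s")
```

With light shading both cards hit their (nearly identical) ROP caps and finish neck and neck; only heavy per-pixel shading lets Fiji's extra shaders pull ahead, which matches the "ROP-limited in some games, way ahead in others" pattern.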


----------



## Allanitomwesh

This card is brute forcing. That's my opinion of it.


----------



## $k1||z_r0k

Quote:


> Originally Posted by *HiTechPixel*
> 
> I am looking for 5K Crossfire reviews too. The results will decide if I go Titan X SLI or Fury X Crossfire since I have a 5K monitor.


5K is a waste for gaming imo... you get better results just running your Dell/Apple 5K monitor at 4K resolution and adding minor anti-aliasing. In reality you will not notice the difference at all with 5K unless you have a 150"+ screen, and you need double the power needed for 4K to run games in 5K, which is not worth it.


----------



## DividebyZERO

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> 5K is a waste for gaming imo... you get better results just running your Dell/Apple 5K monitor at 4K resolution and adding minor anti-aliasing. In reality you will not notice the difference at all with 5K unless you have a 150"+ screen, and you need double the power needed for 4K to run games in 5K, which is not worth it.


I wholeheartedly disagree after seeing my measly 6400x3600 downsampled to my 4K monitor. While it may take insane amounts of power to run, I can easily see the difference from it down to 4K. Only in games with poor graphics is it harder to tell. The other half of the battle is more subjective, but even if it requires turning some settings down from max, for me it's a night-and-day difference.


----------



## F4ze0ne

I'm underwhelmed with Fury X performance. It's good, but not good enough to compete. AMD needs to go back to the drawing board.


----------



## eXe.Lilith

Quote:


> Originally Posted by *F4ze0ne*
> 
> I'm underwhelmed with Fury X performance. It's good, but not good enough to compete. AMD need to go back to the drawing board.


I disagree, it's got everything it needs to compete. People need to get off their high horses, is all. Everybody who was expecting a card that would crush all nVidia has to offer was being naive. AMD, with few means, managed to deliver a new architecture that, even though it's based on the same fab process as Hawaii, improves slightly in TDP and, more importantly, a lot in performance. Sure, everybody's mad that the Fury X falls short of besting the 980 Ti, but in truth you're comparing a brand-new architecture with nascent drivers to one that's been around for quite some time, with quite mature drivers.


----------



## Casey Ryback

Quote:


> Originally Posted by *eXe.Lilith*
> 
> I disagree, it's got everything it needs to compete. People need to get off their high horses is all. Everybody that was expecting a card that was crushing all nVidia has to offer were being naive.


Agree 100%


----------



## F4ze0ne

I understand your position. But AMD needs a hit to put them back in the game; they have fallen behind in the enthusiast market. It's OK to admit things need to change. Those changes will be made and the results will come out better. I'm hoping this was a stopgap to match or chase the 980 Ti while better things are cooking.


----------



## pengs

Quote:


> Originally Posted by *Allanitomwesh*
> 
> This card is brute forcing. That's my opinion of it.


Brute-forcing through the GameWorks games, that's for sure...


----------



## en9dmp

Having read all the reviews, I'm really not impressed at all. At 4K, Metro: Last Light (a game I'd probably replay at 4K) is roughly 40-50% slower than on the 980 Ti without AA... This is staggeringly bad, as it's barely an improvement on the 290X! In one of the reviews the new 390X was actually performing pretty close, so a couple of those in CrossFire could be a feasible 4K solution.

It's worrying that there are such drastic performance variations between titles, and also such large variances between average and minimum FPS across the board; the nVidia cards are much tighter in this respect. Lastly, I expected this card to be an overclocking beast, and clearly that is also not the case... Still, I'm praying that driver updates can fix these issues, otherwise I can't really understand why anyone would buy this card...

Having said that, given that I've been wanting this for some time, I have ordered 2 to crossfire which should arrive on Sunday. Actually got them at a pretty good price from Scan in the UK - £519 each for the Sapphire ones. I will be testing them out in the next month and hoping things improve with driver updates, otherwise I'll either try and return them (unlikely) or take a small hit selling on ebay and get some 980Tis to put under water.


----------



## Kokin

Quote:


> Originally Posted by *F4ze0ne*
> 
> I understand your position. But AMD needs a hit to put them back in the game. They have fallen behind in the enthusiast market. It's ok to admit things need to change. Those changes will be made and results come out better. I'm hoping this was a stop gap to match the 980Ti or chase it while better things are cooking.


The Fury is their answer to changing for the better. HBM is the first of its kind and Nvidia will follow suit with Pascal by using stacked memory. I'm disappointed that it didn't surpass the 980Ti/TitanX, but I disagree that it's not good enough to compete.

It takes time, but we have yet to see what slightly matured drivers have to offer. There's also the question of how well the overclocking performance changes once the voltage is unlocked.


----------



## Clockster

Quote:


> Originally Posted by *en9dmp*
> 
> Having read all the reviews I'm really not impressed at all. At 4k, Metro Last Light (a game I probably would replay through at 4k) is roughly 40-50% slower than the 980 Ti without AA... This is staggeringly bad, as it's barely an improvement on the 290x!... In one of the reviews, the new 390x was actually performing pretty close, so a couple of these crossfired could be a feasible 4k solution.
> 
> It's worrying that there are such drastic performance variations between titles, and also such large variances between average and minimum fps across the board. The nVidia cards are much tighter in this aspect. Lastly I expected this card to be an overclocking beast, and clearly that is also not the case... Still, I'm praying that driver updates can fix these issues, otherwise I can't really understand why anyone would buy this card...
> 
> Having said that, given that I've been wanting this for some time, I have ordered 2 to crossfire which should arrive on Sunday. Actually got them at a pretty good price from Scan in the UK - £519 each for the Sapphire ones. I will be testing them out in the next month and hoping things improve with driver updates, otherwise I'll either try and return them (unlikely) or take a small hit selling on ebay and get some 980Tis to put under water.


I'm sorry what?

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,26.html


----------



## Ashura

Quote:


> Originally Posted by *Kokin*
> 
> The Fury is their answer to changing for the better. HBM is the first of its kind and Nvidia will follow suit with Pascal by using stacked memory. I'm disappointed that it didn't surpass the 980Ti/TitanX, but I disagree that it's not good enough to compete.
> 
> It takes time, but we have yet to see what slightly matured drivers have to offer. There's also the question of how well the overclocking performance changes once the voltage is unlocked.


Yes, I'd also like to see how it performs in Windows 10 and DX12.
I don't really have huge expectations though.


----------



## en9dmp

Quote:


> Originally Posted by *Clockster*
> 
> I'm sorry what?
> 
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,26.html


Hmmmm... the evidence seems to vary wildly from source to source; check out the IGN review:
http://uk.ign.com/articles/2015/06/24/amd-radeon-r9-fury-x-review

There was another source from yesterday showing a graph across multiple games with min and avg FPS which really highlighted the point, but I have to say I can't seem to find it today...

I hope I'm wrong, hell, I just ordered 2 of them, but there does seem to be a lot of variation across the reviews, so it's hard to know for sure... CrossFire is also an unknown at the moment; who knows what kind of performance increases can be expected.


----------



## Noufel

Has anyone tested the Fury X on Win 10 with the 3DMark CPU overhead test?


----------



## F4ze0ne

Quote:


> Originally Posted by *Kokin*
> 
> It takes time, but we have yet to see what slightly matured drivers have to offer.


It needs to be ready where it's priced now. The drivers need to perform today, when cards are purchased from retailers.


----------



## 1337LutZ

Quote:


> Originally Posted by *Noufel*
> 
> Any one tested the FuryX on win 10 with 3dmark cpu overhead test ???


@Noufel here's a review that has the overhead test.

http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html?start=22


----------



## Noufel

Quote:


> Originally Posted by *1337LutZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> Any one tested the FuryX on win 10 with 3dmark cpu overhead test ???
> 
> 
> 
> @Noufel
> heres a review that has the overhead test.
> 
> http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html?start=22

Thanks for the link. Very good review, btw.


----------



## Gumbi

Quote:


> Originally Posted by *en9dmp*
> 
> Hmmmm.... evidence seems to vary wildly from source to source, check out the IGN review:
> http://uk.ign.com/articles/2015/06/24/amd-radeon-r9-fury-x-review
> 
> 
> 
> There was another source from yesterday showing a graph across multiple games with min and avg fps which really highlighted the point, but I have to say I can't seem to find that today...
> 
> I hope I'm wrong, hell I just ordered 2 of them, but there does seem to be a lot of variation across the reviews so hard to know for sure... Also crossfiring them is unknown at the moment what kind of performance increases can be expected.


CPU limited. nVidia performs much better in CPU-limited games. As you can see, even the 780 Ti is beating the Fury X in that game.


----------



## Kokin

Quote:


> Originally Posted by *F4ze0ne*
> 
> It needs to be ready where it's priced now. The drivers need to perform today, when cards are purchased from retailers.


If you are looking to buy it ASAP, then yes, it does fall short of Nvidia's offerings if one is only looking at benchmarks. The DX12 overhead test shows some promise, but without actual DX12 games to benchmark, AMD is a big question mark for buyers in the market.

However, I'm not one for being an early adopter, so I don't have that mindset. I like to buy a product once it's been stable for a while and is a bit cheaper.


----------



## th3illusiveman

The Fury non-X will be a hit. I doubt the lower shader count will have a drastic impact on its performance compared to the Fury X, and since a 390X is already nipping at the heels of a 980, a more powerful card from AMD that sits in the $500-$650 gap between the 980 and the Ti has the potential to do well. It has to come with aftermarket coolers though, otherwise we'll have another 290X reference-blower mess.

A Fury non-X at $500 with a custom cooler would sell very well. I would buy one.


----------



## Casey Ryback

Quote:


> Originally Posted by *th3illusiveman*
> 
> It has to come with aftermarket coolers though otherwise we will have another 290X ref blower mess.
> 
> Fury non X at $500 with a custom cooler would sell very well. I would buy one.


Various sources have stated that the Fury will be using custom coolers.


----------



## en9dmp

Quote:


> Originally Posted by *Gumbi*
> 
> CPU limited. nVidia perform much better in cpu limited games. As you can see even the 780ti is beating the Fury x in that game.


An i7-4790K is limiting performance? I can't find anything online to suggest an i7-4790K would be a bottleneck, especially at 4K...

Looking at the test system in the Hardwareluxx review: http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html?start=6

The CPU used is only an i7-3960X, and it shows stronger performance at 4K in Metro: LL (38.4fps, vs 37.4fps for the Ti). Based on this and the other reviews, I'm inclined to believe the IGN table I posted above is incorrect...


----------



## tx12

Fury X ROM spotted (that's an incomplete readout made by GPU-Z):
http://s000.tinyupload.com/index.php?file_id=36312630247630455724


----------



## Gumbi

Quote:


> Originally Posted by *en9dmp*
> 
> An i7-4790k is limiting performance? I can't find any info online to suggest an i7-4790k would be a bottleneck, especially at 4k...
> 
> Looking at the test system in the Hardwareluxx review: http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35798-amd-radeon-r9-fury-x-im-test.html?start=6
> 
> The cpu used is only an i7-3960X and it's showing stronger performance at 4k in Metro LL (38.4fps, with the Ti showing 37.4fps). Based on this and the other reviews, I'm inclined to believe the IGN table I posted above is incorrect...


Possibly. I know some games (WoW being the only significant example I can think of off the top of my head) use only 2 cores, and nVidia cards crush AMD cards in them, to the point that 770s match or beat 290Xs IIRC.


----------



## The Stilt

Quote:


> Originally Posted by *tx12*
> 
> Fury X ROM spotted (that's an incomplete readout made by GPU-Z):
> http://s000.tinyupload.com/index.php?file_id=36312630247630455724


It's complete.
Thanks


----------



## tx12

Quote:


> Originally Posted by *The Stilt*
> 
> It´s complete.
> Thanks


The flash chip is a 25P20, 256KB. The readout is 128KB, the PC BIOS zone only (UEFI may be in the 2nd 128KB zone).

I suppose an atiflash build with Fiji support should already be somewhere on the internet?


----------



## The Stilt

Quote:


> Originally Posted by *tx12*
> 
> Flash chip is 25P20, 256KB. Readout is 128KB, PC BIOS zone only (UEFI maybe in 2nd 128KB zone).


My ASUS 290X cards have 2Mbit (256KB) chips too, while the actual flash size used is 64KB.
The ROM size is EE00h and there is no UEFI image (it's the first and the last ROM at the same time).
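For anyone poking at such a dump: the EE00h figure can be read straight out of the legacy PCI option-ROM header (per the PCI spec, bytes 0-1 are the 55AAh signature and byte 2 is the image size in 512-byte blocks). A minimal sketch; the filename in the comment is hypothetical:

```python
# Read the image size from a legacy PCI option-ROM (VBIOS) dump.
def rom_image_size(data: bytes) -> int:
    """Size in bytes of the first option-ROM image in the dump."""
    if data[0:2] != b"\x55\xAA":
        raise ValueError("not a PCI option ROM")
    return data[2] * 512  # byte 2 = image length in 512-byte blocks

# e.g. with open("FuryX.rom", "rb") as f: print(hex(rom_image_size(f.read())))
# an EE00h-byte image reports 0xEE00 // 512 = 119 blocks
```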


----------



## Valenz

Quote:


> Originally Posted by *Gumbi*
> 
> Possibly. I know in some games (WoW being the only significant example I can think of off the top of my head), use only 2 cores and nVidia cards crush AMD cards in it. To the point that 770s match or beat 290xs IIRC.


My 290X Lightning was horrible at 4K: with all settings maxed and AA off it hit about 45-55 fps max in empty areas and below 19 in crowded areas, whereas my 780 Classified would get 65 in crowded areas at the same settings. I just put in my 980 Ti Gaming last night and I'm getting 100+ with AA turned on and max settings.


----------



## Clockster

http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/

Fury X CF performance.


----------



## en9dmp

Quote:


> Originally Posted by *Clockster*
> 
> http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/
> 
> Fury X CF performance.


Shame they only benched 3 games, but the scaling looks pretty impressive... Would have been nice to see FC4 and GTA V. But by the looks of things, CF should be able to maintain a stable 60fps at 4K.


----------



## Kuivamaa

The more I read about this card the more I am liking it.

http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html

I mean, I don't understand the commotion. The performance is excellent. The 980 Ti surely has more VRAM, but I can find the Fury X 70 euros cheaper than the cheapest Ti over here, and HBM is bound to come in handy in the future.


----------



## flopper

Quote:


> Originally Posted by *Kuivamaa*
> 
> The more I read about this card the more I am liking it.
> 
> http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html
> 
> I mean I don't understand the commotion. The performance is excellent. The 980Ti surely has more VRAM but I can find the Fury X 70 euros cheaper than the cheapest Ti over here and HBM is bound to get handy in the future.


They failed to manage expectations; that's why they got a beating. The HardOCP review is out of line IMO.

Anyway, looking forward to the 14th.


----------



## p4inkill3r

Quote:


> Originally Posted by *Kuivamaa*
> 
> The more I read about this card the more I am liking it.
> 
> http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html
> 
> I mean I don't understand the commotion. The performance is excellent. The 980Ti surely has more VRAM but I can find the Fury X 70 euros cheaper than the cheapest Ti over here and HBM is bound to get handy in the future.


The commotion, at least on this site, is primarily from people who had no intention of buying one to begin with.


----------



## escksu

Btw, just want to share some benchmarks.





IMHO, looks pretty decent.....


----------



## Ha-Nocri

I always expected the Fury X to be slower than the 980 Ti, so I'm not disappointed at all. Actually, I'm quite surprised that it beats the 980 Ti at 1440p and 4K in older games. It might actually be AMD's drivers holding the card back in newer titles, as several reviewers pointed out. I mean, the driver version is 15.15. Does that mean they're from January this year?!


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> they failed to manage expectations.
> its why they got a beating.
> the hardocp is out of line imo.
> 
> anyway look forward the 14.


Yeah, HardOCP was rough on the card. A lot harder than KitGuru has ever been on AMD, and AMD yanked KitGuru's access to a sample...


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah Hard OCP was rough on the card. A lot harder that Kit Guru has ever been on AMD, and AMD yanked their access to a sample.....


I disagree; they had some strong opinions on the card after reviewing it, which is far different from the KitGuru video that was based on speculation.

Not that I agree with HardOCP and their totally negative stance on the card itself.

But if that's their opinion, so be it; they can't see any positives, same as many people, honestly.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah Hard OCP was rough on the card. A lot harder that Kit Guru has ever been on AMD, and AMD yanked their access to a sample.....


I expect they won't get a Fury to review on the 14th.


----------



## escksu

Btw, just want to comment: I don't think the Fury X is disappointing... Look at my 3DMark results... That's 2x Fury X in CF. Looks pretty decent and comparable with Titan X SLI.


----------



## Ha-Nocri

Quote:


> Originally Posted by *escksu*
> 
> Btw, just want to comment. I don't think Fury X is disappointing.... Look at my 3Dmark results.... They are 2 x Fury X in CF. Looks pretty decent and comparable with Titan X SLI.


Of course not. It's a bloody powerful card. How are your VRM temps, btw?


----------



## escksu

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Ofc not. It's bloody powerful card. How r ur VRM temps btw?


Unfortunately, I don't have anything to measure the VRM with.

Right now I'm just waiting for the EK waterblock. The stock water cooler gets rather warm to the touch, so it's not that great.


----------



## Newbie2009

More benchmarks please. How is the OC on stock volts? I read that setting +50 PowerTune hurts performance when overclocking, as it throttles.


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> I expect they dont get a fury the 14 to review


I agreed with their opinion of the card......

I think the card itself is really cool... but it just didn't give me a reason to buy one.

The 290 gave you a reason.... hell, that card was ridiculously hot, used a ton of power, and people still bought it!

Then AMD turned around and used Hawaii again to compete with the next-gen NVIDIA cards (referring to the 970 and 980), after it had already whooped up on the 7-series cards.

That's two generations of competition for the Hawaii GPU, and it's a hot, power-hungry beast, but the damn thing delivered knockouts like Tyson in his prime!

Now we have this Fury card, and unless AMD is counting on better drivers or DX12 dominance, this architecture will be short-lived....

Can you at all picture Fiji going through two generations and still being relevant? I can't....

I really do love the card, I just wish its selling points weren't the cooler, the size, and the reduced TDP, because as we've all seen, no one cares. If it were a power hog that ran 90C but kicked the 980 Ti around, it'd be a bigger success from a sales standpoint, though I think reviewers would bash it even harder...

We'd be reading "Fury X delivers a powerful punch to NVIDIA's 980 Ti and Titan X, but does so at the cost of high power usage and ridiculous temperatures."

Instead the story reads "Fury has a better TDP than Hawaii, a more advanced architecture with HBM, and the card runs extremely cool with its water-cooling solution, but at $650 it fails to perform at its price level."


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I agreed with their opinion of the card......
> 
> I think the card itself is really cool... but it just didn't give me a reason to buy one.
> 
> 290 gave you a reason.... hell, that card was ridiculously hot, used a ton of power, and people still bought it!
> 
> Then AMD turned around and used Hawaii again to compete with the next gen NVIDIA cards (referring to 970 and 980), after it had already whooped up on the 7 series cards.
> 
> Thats two generations of competition for the Hawaii GPU, and it's a hot, power hungry beast, but the damn thing delivered knock outs like Tyson in his prime!
> 
> Now we have this Fury card, and unless AMD is counting on better drivers, or DX12 dominance, this architecture will be short lived....
> 
> Can you at all picture, Fiji going through two generations and still being relevant? I can't....
> 
> I really do love the card, I just wish it's selling points weren't the cooler, and the size, and the reduced TDP, because as we all have seen, no one cares. If it were a power hog that ran 90c, but kicked 980ti around, it'd be a bigger success from a sales standpoint, though I think reviewers would bash it even harder...
> 
> We'd be reading "Fury X delivers a powerful punch to NVIDIA's 980 ti and Titan X, but does so at the cost of high power usage, and ridiculous temperatures"
> 
> Now the story reads "Fury has better TDP than Hawaii, a morte advanced architecture with HBM, and the card runs extremely cool with it's water cooling solution, but at $650 it fails to perform at it's pricing level"


Now and then I think AMD is a gang of enthusiast amateurs.
I love that, but not when it comes to managing launches; the Fury drivers weren't ready for the reviews.
I would have waited for Windows 10 myself.


----------



## HiTechPixel

It's a shame that the Fury X is a dud after several months of AMD hyping it up to be *THE* Titan killer. The way I see it, it has 2 flaws:

1. It can't overclock even if its life depended on it. Thing is, both the 980 Ti and the Titan X can overclock like freaking champions, so they're going to pull ahead *MAJORLY*.

2. It's limited to 4GB HBM1. I don't care what AMD says it can or can't do. I don't care what reviewers says it can or can't do. 4GB WILL NOT WORK AT 4K WHEN YOU FACTOR IN ANTI-ALIASING, WHICH IS REQUIRED AT 4K TO PRODUCE A GOOD IMAGE. 4GB WILL ALSO NOT WORK AT 5K BECAUSE 4GB SIMPLY ISN'T ENOUGH.

If the Fury X didn't have these two problems then I'd be all over the card. I'd even go Tri-fire or heck, possibly even Quad-fire.
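Some arithmetic for context on the 4GB debate: the render targets themselves are small even with MSAA (a rough sketch, RGBA8 color targets only; real engines add depth buffers, G-buffers and, above all, texture data, which is where the real VRAM pressure comes from):

```python
# Rough size of a single color render target at a given resolution.
def target_mib(width, height, bytes_per_px=4, samples=1):
    """Size in MiB of one color target (RGBA8 by default)."""
    return width * height * bytes_per_px * samples / 2**20

print(f"4K color target:     {target_mib(3840, 2160):.1f} MiB")             # ~31.6
print(f"4K color w/ 4x MSAA: {target_mib(3840, 2160, samples=4):.1f} MiB")  # ~126.6
print(f"5K color target:     {target_mib(5120, 2880):.1f} MiB")             # ~56.2
```

So whether 4GB holds up at 4K/5K comes down to texture and asset budgets, not the framebuffers themselves.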


----------



## magic8192

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Can you at all picture, Fiji going through two generations and still being relevant? I can't....


I don't get this. We have the introduction of HBM, which, if nothing else, seems to hold lots of promise. Seems like the future to me.


----------



## p4inkill3r

Quote:


> Originally Posted by *HiTechPixel*
> 
> It's a shame that the Fury X is a dud after several months of AMD hyping it up to be *THE* Titan killer. The way I see it, it has 2 flaws:
> 
> 1. It can't overclock even if its life depended on it. Thing is, both the 980 Ti and the Titan X can overclock like freaking champions, so they're going to pull ahead *MAJORLY*.
> 
> 2. It's limited to 4GB HBM1. I don't care what AMD says it can or can't do. I don't care what reviewers says it can or can't do. 4GB WILL NOT WORK AT 4K WHEN YOU FACTOR IN ANTI-ALIASING, WHICH IS REQUIRED AT 4K TO PRODUCE A GOOD IMAGE. 4GB WILL ALSO NOT WORK AT 5K BECAUSE 4GB SIMPLY ISN'T ENOUGH.
> 
> If the Fury X didn't have these two problems then I'd be all over the card. I'd even go Tri-fire or heck, possibly even Quad-fire.


First, the words "Titan killer" were never uttered by anyone at AMD, so you can take that back to St. Elsewhere.
Secondly, and for the thousandth time, no reviewer OC'd a card with unlocked voltage; 100MHz OCs on stock volts were all we saw.
Lastly, you're wrong about 4GB, as evidenced by Fury's performance in 4K benchmarks.


----------



## escksu

OK, I tried overclocking this thing, but for some reason it can't OC at all... I only made it to 1.1GHz... It's kind of disappointing, esp. when it's paired with water cooling. HBM is cool, but I'm not sure if we really need it. But then, I'm used to ATI/AMD cards, so I guess I can only stick with it.

Hopefully new drivers/BIOS etc. will make OC better. I tried sliding the power bar but it makes no difference.


----------



## escksu

Sorry, I don't really have game benchmarks to run because I don't have games!! lol... I hardly game (currently playing only World of Tanks and Star Trek Online).


----------



## Agent Smith1984

Quote:


> Originally Posted by *magic8192*
> 
> I don't get this. We have the introduction of HBM that if nothing else seems to hold lots of promise. Seems like the future to me.


HBM 2.0 within a year, and much larger frame buffers ahead....

This card was a bridge in my opinion.


----------



## en9dmp

I think 2 of these in CrossFire will last at least 2 generations, especially once games start to take advantage of DX12 with SFR and memory pooling. We should see much better performance as well as the removal of the 4GB limitation, which may actually start to become an issue with late-2015/early-2016 titles at 4K...


----------



## DStealth

Quote:


> Originally Posted by *escksu*
> 
> Btw, just want to share some benchmarks.
> 
> IMHO, looks pretty decent.....


Share some single-card results, and with max OC also, so we can compare them.


----------



## escksu

Quote:


> Originally Posted by *HiTechPixel*
> 
> It's a shame that the Fury X is a dud after several months of AMD hyping it up to be *THE* Titan killer. The way I see it, it has 2 flaws:
> 
> 1. It can't overclock even if its life depended on it. Thing is, both the 980 Ti and the Titan X can overclock like freaking champions, so they're going to pull ahead *MAJORLY*.
> 
> 2. It's limited to 4GB HBM1. I don't care what AMD says it can or can't do. I don't care what reviewers says it can or can't do. 4GB WILL NOT WORK AT 4K WHEN YOU FACTOR IN ANTI-ALIASING, WHICH IS REQUIRED AT 4K TO PRODUCE A GOOD IMAGE. 4GB WILL ALSO NOT WORK AT 5K BECAUSE 4GB SIMPLY ISN'T ENOUGH.
> 
> If the Fury X didn't have these two problems then I'd be all over the card. I'd even go Tri-fire or heck, possibly even Quad-fire.


Actually, it's not that bad. You can OC like a champion, but it's going to shorten the lifespan of the card. Lots of people see artifacts/dead cards after extreme overclocking (including myself). Those clocks are only good for benchmarks, not for everyday use; that's why I don't OC graphics cards these days.

4GB is actually enough. I tried gaming at 4K (using VSR on a 1080p monitor). No issues. Tried BF4 and Crysis 3, no problems at max settings. Runs very well actually. This shows 4GB is fine.

Tri-fire and quad-fire are NOT recommended. I tried quad R9 290... it ran poorly. The drivers just aren't optimised for it: bad stuttering, and it's not any faster than 2 cards. If they change this with the Fury X, then I will get a pair of Fury X2s when they're out.


----------



## escksu

Quote:


> Originally Posted by *DStealth*
> 
> Share some single card results and with max OC also, to compare them


It can't overclock much. I tried 1150MHz and artifacts appeared, so I'll just stick to the stock speed.


----------



## flopper

Quote:


> Originally Posted by *escksu*
> 
> OK, I tried overclocking this thing but for some reason it can't oc at all.... I only made it to 1.1GHz.......its kind of disappointing esp. when its paired with water cooling. HBM is cool but I am not sure if we really need it. But then, I am used to ATI/AMD cards so I guess I can only stick with it.
> 
> Hopefully, new drivers/bIOS etc will make OC better. I tried sliding the power bar but it makes no difference.


Around 1100MHz seems to be the case atm.
Voltage control is coming, but it will take time to get here.
AMD Matt's cards did 1125MHz on average, and he has 4 cards.


----------



## magic8192

Quote:


> Originally Posted by *HiTechPixel*
> 
> 2. It's limited to 4GB HBM1. I don't care what AMD says it can or can't do. I don't care what reviewers says it can or can't do. 4GB WILL NOT WORK AT 4K WHEN YOU FACTOR IN ANTI-ALIASING, WHICH IS REQUIRED AT 4K TO PRODUCE A GOOD IMAGE. 4GB WILL ALSO NOT WORK AT 5K BECAUSE 4GB SIMPLY ISN'T ENOUGH.


Maybe you should go look at some of the benchmarks a little closer. In benchmark after benchmark, it's at 4K with AA on that the Fury X shines. It's certainly not what you would expect, but that is the reality. There are problems here, but the fact is that AMD has proven it can be done.


----------



## th3illusiveman

This is one of the slowest GPU owners clubs I've ever seen.

Guess the card isn't flying off the shelves. I fully expect that to change when the Fury non-X launches. I have a good feeling about that card. Shame we have to wait AGAIN.


----------



## Sgt Bilko

Quote:


> Originally Posted by *th3illusiveman*
> 
> This is one of the slowest GPU owners clubs i've ever seen
> 
> Guess the card isn't flying off the shelves. I fully expect that to change when the Fury non X launches. I have a good feeling about that card. Shame we have to wait AGAIN


It's sold out in Aus AFAIK, and Amazon still hasn't got stock in.
Takes a day or so for delivery... it's only been launched for 24hrs now :/


----------



## Agent Smith1984

Quote:


> Originally Posted by *th3illusiveman*
> 
> This is one of the slowest GPU owners clubs i've ever seen
> 
> Guess the card isn't flying off the shelves. I fully expect that to change when the Fury non X launches. I have a good feeling about that card. Shame we have to wait AGAIN


You do know the card released yesterday, right? So the soonest anyone would likely report as an owner would be today..... lol


----------



## Talon720

Does anyone even know which voltage controller is used? The same thing happened at the 290X release: people were quick to jump in and say there'd be no voltage control. It just took some time before it was supported by software and hex-modded BIOSes. Also, maybe this is revisionist history, but didn't this same memory argument happen with GDDR3 vs GDDR5? People were suggesting going with 1GB of GDDR5 over 2GB of GDDR3. Isn't that the same thing here with GDDR5 vs HBM? 4GB of GDDR5 doesn't equal 4GB of HBM, and HBM seems to have a lot of advantages. I went back and read the GDDR3-vs-GDDR5 threads and they seriously sounded exactly the same. Any benchmark hiccups could be driver optimization; look at the 290X now compared to when it first came out. Even if the Fury X doesn't beat the Titan or only matches the 980 Ti, the size and performance-per-price factor will make it the only choice for some.
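The raw bandwidth numbers behind that comparison (a sketch using assumed public launch specs: a 4096-bit HBM1 interface at 1Gb/s per pin on the Fury X, vs the 290X's 512-bit GDDR5 interface at 5Gb/s effective per pin):

```python
# Peak theoretical memory bandwidth: bus width (bits) x data rate per pin.
def peak_gbs(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

hbm_fury_x = peak_gbs(4096, 1.0)  # -> 512.0 GB/s
gddr5_290x = peak_gbs(512, 5.0)   # -> 320.0 GB/s
print(f"Fury X (HBM1): {hbm_fury_x:.0f} GB/s, 290X (GDDR5): {gddr5_290x:.0f} GB/s")
```

Same 4GB capacity, but a very different interface; much like the GDDR3-to-GDDR5 jump, the gigabytes alone don't tell the story.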


----------



## th3illusiveman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You do know the card released yesterday right? So the soonest anyone would most likely report as an owner would be today..... lol


When the 7970, 680, 780, 290/X, 780 Ti, 970/980 and 980 Ti launched, those clubs had significantly more traffic than this thread has had. Most posts right after reviews are people scrambling to find cards at different sites, and the threads are VERY active. This one is nowhere close. Just an observation.


----------



## hyp36rmax

Quote:


> Originally Posted by *th3illusiveman*
> 
> When the 7970, 680, 780, 290/X, 780 Ti, 970/80 and 980 Ti launched those clubs had significantly more traffic then this thread has had. Most posts right after reviews are of people scrambling to find cards at different sites and the threads are VERY active. This one is not anywhere close. Observation.


People are probably still in shock over the results. There's also a good handful of FURY threads that popped up in the news section. It could also be the limited availability of the cards. I'll give it a couple of weeks.


----------



## freezer2k

http://www.mindfactory.de/product_info.php/4096MB-Sapphire-Radeon-R9-FURY-X-Hybrid-PCIe-3-0-x16--Full-Retail-_1007173.html

€699 incl. tax, in stock (Germany)


----------



## joeh4384

Anyone here have one yet? Not living up to the hype aside, it still looks like a pretty beastly card. I would definitely consider a Fury or Fury X2 in the future depending on how drivers etc. are in a couple of months.


----------



## p4inkill3r

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone here have one yet? Not living up to the hype aside, it still looks like a pretty beastly card. I would definitely consider a Fury or Fury X2 in the future depending on how drivers etc. are in a couple of months.


I'm waiting for order fulfillment from Amazon.


----------



## HiTechPixel

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone here have one yet? Not living up to the hype aside, it still looks like a pretty beastly card. I would definitely consider a Fury or Fury X2 in the future depending on how drivers etc. are in a couple of months.


I was asleep when it released but it has either not come into stock yet or has sold out in Sweden.


----------



## Clockster

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone here have one yet? Not living up to the hype aside, it still looks like a pretty beastly card. I would definitely consider a Fury or Fury X2 in the future depending on how drivers etc. are in a couple of months.


2x Asus R9 Fury X arriving Monday next week.


----------



## hyp36rmax

Quote:


> Originally Posted by *Clockster*
> 
> 2x Asus R9 Fury X arriving Monday next week.












I look forward to your results!


----------



## blue1512

Did anyone notice the inconsistent numbers between different reviews?
Plot twist

https://www.reddit.com/r/3b21fp/fury_x_possibly_reviewed_with_wrong_drivers/


----------



## Casey Ryback

Quote:


> Originally Posted by *joeh4384*
> 
> Anyone here have one yet? Not living up to the hype aside, it still looks like a pretty beastly card. I would definitely consider a Fury or Fury X2 in the future depending on how drivers etc. are in a couple of months.


Same here









Waiting for a cheaper air cooled version and we'll see if they have any updates for drivers in that time frame to make it a deal for me.


----------



## tpi2007

Locked for cleaning.

Edit: cleaned, 48 heated argument posts deleted.

Guys, I know that this is a hot topic, and that the card has just been released and is out of stock in many places and not many people have it and drivers and pricing and the competition, and whatever else there is to discuss, but please don't let the heat of the moment get the best of you.

Be civil, even if you strongly disagree with someone else's opinion, everybody is entitled to one, voice you disagreement without insulting others.

Also keep on topic as much as possible, otherwise it's more fluff people have to go through to get to the good stuff.

If this gets out of hand again warnings and infractions will be handed out where appropriate.


----------



## escksu

Btw, this is what the card looks like:



















For price, it costs $999 in my country; that's around 664 EUR according to xe.com.

The cooler is a bit of a letdown, I have to say... it looks somewhat shabby. But the card itself is really high quality. Just look at the components used.


----------



## hyp36rmax

Quote:


> Originally Posted by *tpi2007*
> 
> Locked for cleaning.
> 
> Edit: cleaned, 45 heated argument posts deleted.
> 
> Guys, I know that this is a hot topic, and that the card has just been released and is out of stock in many places and not many people have it and drivers and pricing and the competition, and whatever else there is to discuss, but please don't let the heat of the moment get the best of you.
> 
> Be civil, even if you strongly disagree with someone else's opinion, everybody is entitled to one, voice you disagreement without insulting others.
> 
> Also keep on topic as much as possible, otherwise it's more fluff people have to go through to get to the good stuff.
> 
> If this gets out of hand again warnings and infractions will be handed out where appropriate.


Thank you!


----------



## escksu

Overclocking: I tried to bench at 1150MHz but it crashed on the last benchmark. For some reason the power limit doesn't seem to do anything. The highest possible is around 1130MHz.


----------



## Casey Ryback

Quote:


> Originally Posted by *escksu*
> 
> Btw, this is what the card looks like:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For price, it costs $999 in my country; that's around 664 EUR according to xe.com.
> 
> The cooler is a bit of a letdown, I have to say... it looks somewhat shabby. But the card itself is really high quality. Just look at the components used.


It's pretty amazing what they crammed into that PCB!


----------



## rdr09

Quote:


> Originally Posted by *escksu*
> 
> Btw, just want to share some benchmarks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IMHO, looks pretty decent.....


That's about 40% faster than my two 290s at stock. I can close the gap to 22% by OC'ing my 290s to 1280 core. But these are 290s that OC above average, and there's no way to run them at those clocks 24/7.
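(For anyone wanting to redo the math: the "X% faster" figures here come straight from the ratio of graphics scores. A quick sketch in Python; the scores below are placeholders for illustration, not anyone's actual results.)

```python
# How much faster one 3DMark result is than another, in percent.
# Placeholder scores, chosen only to reproduce the 40% / 22% gaps above.
def percent_faster(new_score, old_score):
    """Relative speedup of new_score over old_score, in percent."""
    return (new_score / old_score - 1) * 100

fury_x_pair = 14000   # hypothetical 2x Fury X graphics score
stock_290s  = 10000   # hypothetical 2x 290 (stock) graphics score
oc_290s     = 12200   # hypothetical 2x 290 @ 1280 core

print(round(percent_faster(fury_x_pair, stock_290s)))  # -> 40
print(round(percent_faster(fury_x_pair, oc_290s) + 18))  # gap shrinks once OC'd
```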


----------



## Agent Smith1984

It's very possible that this card will overclock very similarly to Hawaii... maybe slightly better with the water cooling, and also depending on the voltage allowed.

I am curious, though, whether they will unlock memory clocks.
I am beginning to think that they will leave it locked. It's likely that even if they did unlock it, it wouldn't clock very well, which may leave a worse taste in people's mouths than just leaving it locked and chalking it up to the HBM design.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> That's about 40% faster than my two 290s at stock. I can close the gap to 22% by OC'ing my 290s to 1280 core. But these are 290s that OC above average, and there's no way to run them at those clocks 24/7.


Wait.... you are saying it's 40% faster than (2) 290's????


----------



## xer0h0ur

LOL I score 7437 with three moderately overclocked Hawaii XT's: http://www.3dmark.com/fs/5055862

That is impressive for two cards.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL I score 7437 with three moderately overclocked Hawaii XT's: http://www.3dmark.com/fs/5055862
> 
> That is impressive for two cards.


Ahhh, two cards.... didn't catch that part.

Now that makes sense.


----------



## escksu

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wait.... you are saying it's 40% faster than (2) 290's????


Yes, he is right. I'm running a pair of Fury Xs, and 2x Fury X is around 40% faster than 2x 290.


----------



## tpi2007

Quote:


> Originally Posted by *escksu*
> 
> Btw, this is what the card looks like:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For price, it costs $999 in my country; that's around 664 EUR according to xe.com.
> 
> The cooler is a bit of a letdown, I have to say... it looks somewhat shabby. But the card itself is really high quality. Just look at the components used.


Be very careful when cleaning the thermal paste around the HBM: according to SemiAccurate, the exposed parts of the interposer are supposedly very fragile. I'm betting there is some degree of exaggeration in his remarks (otherwise we wouldn't have two waterblock makers with products already announced, and you've got to clean the original thermal paste somehow), but in any case, a word of caution is always a good thing:

http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/
Quote:


> One word of warning should you buy a Fiji and molest it in various ways that overclockers and enthusiasts normally do: be careful. If you look at the above picture you can see the pretty patterns on the interposer; they look good but don't taste good. If you want to clean off the thermal paste and replace it with your own cooling solution, be really careful of these areas. Why? Because the interposer, basically a chip, is mounted face up; it is not a traditional flip-chip part with the transistors and metal layers protected by the wafer. The fragile bits are on top this time.
> 
> How fragile? Don't touch them, don't wipe them off, and otherwise don't do anything that could break a far sub-micron metal trace. It is really fragile and you will destroy your very expensive GPU if you do this, don't say we didn't warn you. This is a tech transition that hasn't been seen since the days when flip chips replaced wire bonding so think back to the bad old days before you mod. Really, be careful or you will end up with an expensive 4GB, water-cooled doorstop.


Quote:


> Originally Posted by *hyp36rmax*
> 
> Thank you!


You're welcome. Just deleted three more from even further back; 48 now.


----------



## blue1512

Different charts say different things.


And please check the driver carefully:
Quote:


> Just putting this out there. Here's a review that seems quite different from the other ones: https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http%3A%2F%2Fwww.ixbt.com%2Fvideo3%2Ffiji-part3.shtml This one actually shows that Fury beats 980 ti in a lot of tests and even titan x in some tests. Of course in lower resolutions it seems that NVidia still wins a lot of times, but it's not as bad as it is with some other reviewers. Now here's a possible reason why. Driver version used in this review is 15.15-180612a-18565BE which the reviewer was sent by AMD on June 18th. The press driver on AMD's FTP server is 15.15-150611a-185358E. I think this is probably the reason of this inconsistency.


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL I score 7437 with three moderately overclocked Hawaii XT's: http://www.3dmark.com/fs/5055862
> 
> That is impressive for two cards.


Yea! I only get 5166 with two VAPOR-X R9 290X 8GB cards, haha. *Link*. Can't wait to get a couple of these.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wait.... you are saying it's 40% faster than (2) 290's????


Compute it again; I'm not very good at that...

stock 290s

http://www.3dmark.com/3dm/4398658?

290s @ 1280 . . .

http://www.3dmark.com/3dm/4644611?

my second 290 is holding back my launch card a bit.


----------



## xer0h0ur

Quote:


> Originally Posted by *tpi2007*
> 
> Be very careful when cleaning the thermal paste around the HBM: according to SemiAccurate, the exposed parts of the interposer are supposedly very fragile. I'm betting there is some degree of exaggeration in his remarks (otherwise we wouldn't have two waterblock makers with products already announced, and you've got to clean the original thermal paste somehow), but in any case, a word of caution is always a good thing:
> 
> http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/
> 
> You're welcome. Just deleted three more even more way back, 48 then.


In the EKWB thread their rep says some dolt took a screwdriver to it to clean off the TIM? Who does that? Anyway, he says it's not nearly as fragile as SemiAccurate would make it seem.


----------



## escksu

Quote:


> Originally Posted by *xer0h0ur*
> 
> In the EKWB thread their rep says some dolt took a screwdriver to it to clean off the TIM? Who does that? Anyway, he says it's not nearly as fragile as SemiAccurate would make it seem.


Lol... screw driver!!!

Btw, I was too lazy to clean off the TIM, so I just put the heatsink back on... apparently temps weren't affected. I will use the standard method of tissue paper and alcohol; that shouldn't damage the GPU. The only thing now is to wait for the EK waterblock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> compute it again. *not very good at that . ..*
> 
> stock 290s
> 
> http://www.3dmark.com/3dm/4398658?
> 
> 290s @ 1280 . . .
> 
> http://www.3dmark.com/3dm/4644611?
> 
> my second 290 is holding back my launch card a bit.


You talking about me?









Confused...

Anyways, I'm still amazed at how well that 290 you have clocks.
Wish we all had samples like that.

Have you tried the 390 BIOS or the 15.15 hack yet?

I find it strange that 15.15 brings so much to the table for Hawaii but seems to be a bum driver for Fury.


----------



## flopper

Quote:


> Originally Posted by *escksu*
> 
> Yes, he is right. I'm running a pair of Fury Xs, and 2x Fury X is around 40% faster than 2x 290.


and they will only get better over time.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I find it strange that 15.15 brings so much to the table for Hawaii but seems to be a bum driver for Fury.


Whole different technology.
Give it a few driver revisions and you'll find the Fury unleashed.


----------



## xer0h0ur

Quote:


> Originally Posted by *escksu*
> 
> Lol... screw driver!!!
> 
> Btw, I was too lazy to clean off the TIM, so I just put the heatsink back on... apparently temps weren't affected. I will use the standard method of tissue paper and alcohol; that shouldn't damage the GPU. The only thing now is to wait for the EK waterblock.


I have used alcohol and q-tips with precision for ages. Ever since I modded my Voodoo 5 5500 waaaaaaaaaaaaaaaay back in the day.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You talking about me?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Confused...
> 
> Anyways, I'm still amazed at how well that 290 you have clocks.
> Wish we all had samples like that.
> 
> Have you tried the 390 BIOS or the 15.15 hack yet?
> 
> I find it strange that 15.15 brings so much to the table for Hawaii but seems to be a bum driver for Fury.


I am such a wuss about flashing BIOSes after I borked one of my 7950s. I've benched my first 290 up to 1330 in 3DMark11. Bought it at Provantage.

That's where I am going to buy my Fury, if ever.


----------



## escksu

Talking about the 390, I'm skeptical that a 390 BIOS will work on a 290, because the amount of RAM is different. I have a 290 lying around though... maybe I can try... lol


----------



## HiTechPixel

Quote:


> Originally Posted by *escksu*


This might very well be the sexiest reference PCB in history. Just look at that GPU die. And look at how much they stuffed on the backside. Oh my god. It's beautiful. I'm almost crying now.

If it turns out that the voltage can be unlocked, and that the core can be overclocked as far as the 980 Ti/Titan X, I might consider getting one or two. But a proper waterblock seems like a necessity, because the VRMs overheat like crazy with the stock CLC.


----------



## tpi2007

Quote:


> Originally Posted by *xer0h0ur*
> 
> In the EKWB thread their rep says some dolt took a screwdriver to it to clean off the TIM? Who does that? Anyway, he says it's not nearly as fragile as SemiAccurate would make it seem.


Yes, I think from the pictures one can make out some kind of transparent resin coating protecting the exposed parts of the interposer, so it probably does have some resistance. But a screwdriver is indeed not the same as a cotton swab, lol (though cotton swabs can be dangerous if the plastic rod pokes through the cotton).


----------



## Gumbi

Quote:


> Originally Posted by *HiTechPixel*
> 
> This might very well be the sexiest reference PCB in history. Just look at that GPU die. And look at how much they stuffed on the backside. Oh my god. It's beautiful. I'm almost crying now.
> 
> If it turns out that the voltage can be unlocked and if it turns out that the core can be overclocked to the same amount as the 980 Ti/Titan X I might consider getting one or two. But a proper waterblock seems like a necessity because the VRMs with the stock CLC overheat like crazy.


How does the stock setup cool the VRMs? How hot do they get at stock?


----------



## HiTechPixel

Quote:


> Originally Posted by *Gumbi*
> 
> How does the stock setup cool the VRMs? How hot do they get at stock?


They are cooled by a single water-bearing copper pipe, which frankly doesn't seem to do the job. If early thermal imaging is any indication, the VRMs on the Fury X run as hot as the VRMs on the Titan X; in other words, over 100 °C.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gumbi*
> 
> How does the stock setup cool the VRMs? How hot do they get at stock?


Find a picture of the Fury X with the shroud cover off and you can see the copper pipe sitting on top of the VRMs. There really is no reason they should be overheating.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> They are cooled by a single water-bearing copper pipe which frankly doesn't seem to do the job. If early thermal imaging is any indication then the VRMs on the Fury X run as hot as the VRMs on the Titan X. In other words, over 100 °C.


It really makes no sense to me that they would be overheating. The only logical conclusion I can arrive at is that they either used some low-quality thermal pads to transfer heat from the VRMs to the copper pipe, or they went full potato and just sat the pipe directly on top of the VRMs with nothing aiding the thermal transfer in between. Even slathering some TIM on it would be better than just sitting a pipe on top with nothing else.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> It really makes no sense to me that they would be overheating. I mean the only logical conclusion I can arrive at is that they either used some low quality thermal pads to transfer the heat from the VRMs to the copper pipe or they went full potato and just sat the pipe directly on top of the VRMs with nothing aiding the thermal transfer in between it. I mean even slathering some TIM on it would be better than just sitting a pipe on top of it with nothing else.


I honestly have no idea why the VRMs are heating up like they are. As you say, it makes no sense. All I know is that if I end up getting the Fury X I'm getting a proper waterblock.


----------



## xer0h0ur

I can't help but think that using some Fujipoly Ultra Extreme like I used on my EK waterblocks would do wonders for these VRMs.


----------



## pengs

Quote:


> Originally Posted by *HiTechPixel*
> 
> They are cooled by a single water-bearing copper pipe which frankly doesn't seem to do the job. If early thermal imaging is any indication then the VRMs on the Fury X run as hot as the VRMs on the Titan X. In other words, over 100 °C.


The two sources I've seen put a Furmark load on it, which increases power consumption by about 40%. I'm not convinced 100°C is a 24/7 figure. I'm sure AMD used appropriately rated VRMs to withstand that temperature; the 290/X was capable of 130°C, though obviously you're not going to want to hold that temperature. If 100°C is put on appropriately rated VRMs, there should be no problem.

The 120mm fan is also not spinning up, and the AIO closed loop seems to be overwhelmed at that point in these stress tests. I'm willing to bet that with a fan profile or a fan replacement you could drop those temperatures. Probably not that big of a deal, and the fact that this is pulling slightly beyond the 375W limit makes it the absolute extreme: no enhanced fan profile, plus Furmark. If that's the absolute (unrealistic) extreme, it's actually not all that bad.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can't help but think that using some Fujipoly Ultra Extreme like I used on my EK waterblocks would do wonders for these VRMs.


Surely there's some thermal pad between the VRMs and the pipe, yes?
Can the dude who provided the bare PCB shots confirm?


----------



## HiTechPixel

Quote:


> Originally Posted by *p4inkill3r*
> 
> Surely there's some thermal pad between the VRMs and the pipe, yes?
> Can the dude that provided the bare PCB shots confirm?


I could only find this picture of the disassembled cooler assembly. It seems there's nothing that really transfers heat from the VRMs to the copper pipe aside from a piece of the metal cooler assembly itself.


----------



## DNMock

Not sure if this has been posted here yet, but with an EK block the Fury X becomes a true single-slot card:



Now that it's a short, single-slot card, the possibilities that opens up for builds are pretty fantastic.

Don't know if anyone makes a micro-ATX board with four single-slot PCIe 3.0 x16 slots, but that build would be godly.


----------



## xer0h0ur

As far as I know, there isn't a single consumer CPU out there capable of running four PCI-E slots at x16, unless there is something out there with more than 40 PCI-E lanes.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Far as I know there isn't a single consumer CPU out there capable of running 4 PCI-E slots at 16X unless there is something out there with more than 40 PCI-E lanes.


You are correct, PLX chips are needed for this.

However, I'd be very, very interested in an mATX board with four PCI-E x16 slots wired up with the help of a PLX chip. That'd be perfect.


----------



## DNMock

Quote:


> Originally Posted by *xer0h0ur*
> 
> Far as I know there isn't a single consumer CPU out there capable of running 4 PCI-E slots at 16X unless there is something out there with more than 40 PCI-E lanes.


I meant four PCIe 3.0 slots capable of up to x16 (the basic big ones that your GPU plugs into). Obviously they would be running at x16/x8/x8/x8, but I was more referring to the full-length slots themselves.
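(To put numbers on the lane discussion: four electrical x16 slots would need 64 lanes, which is why a 40-lane CPU runs them as x16/x8/x8/x8 unless a PLX switch multiplies the lanes. A tiny illustrative sketch:)

```python
# Why four true x16 slots exceed a 40-lane CPU and need a PLX switch,
# and why x16/x8/x8/x8 is the usual native split. Illustrative numbers only.
cpu_lanes = 40            # a 40-lane enthusiast CPU of the era
full_x16 = 4 * 16         # lanes needed for four electrical x16 slots
print(full_x16)           # -> 64: more than the CPU provides natively

split = [16, 8, 8, 8]     # the common 4-way allocation that fits
print(sum(split))         # -> 40: exactly the native lane budget
```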


----------



## xer0h0ur

A couple of reads in case people haven't come across them yet:

http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/

"AMD is pleased to bring you the new R9 390 series which has been in development for a little over a year now. To clarify, the new R9 390 comes standard with 8GB of GDDR5 memory and outpaces the 290X. Some of the areas AMD focused on are as follows:
1) Manufacturing process optimizations allowing AMD to increase the engine clock by 50MHz on both 390 and 390X while maintaining the same power envelope
2) New high density memory devices allow the memory interface to be re-tuned for faster performance and more bandwidth
· Memory clock increased from 1250MHz to 1500MHz on both 390 and 390X
· Memory bandwidth increased from 320GB/s to 384GB/s
· 8GB frame buffer is standard on ALL cards, not just the OC versions
3) Complete re-write of the GPUs power management micro-architecture
· Under "worst case" power virus applications, the 390 and 390X have a similar power envelope to 290X
· Under "typical" gaming loads, power is expected to be lower than 290X while performance is increased"
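(Side note: the quoted memory figures are internally consistent. GDDR5 is quad-pumped, so bandwidth = bus width in bytes × 4 × memory clock; a quick Python sanity check, assuming Hawaii/Grenada's 512-bit bus:)

```python
# GDDR5 bandwidth = bus width (bytes) x effective per-pin data rate.
# GDDR5 is quad-pumped: effective rate = 4 x memory clock.
def gddr5_bandwidth_gbs(bus_bits, mem_clock_mhz):
    effective_gbps = mem_clock_mhz * 4 / 1000   # effective Gbps per pin
    return bus_bits / 8 * effective_gbps        # GB/s

print(gddr5_bandwidth_gbs(512, 1250))  # 290X at 1250 MHz -> 320.0 GB/s
print(gddr5_bandwidth_gbs(512, 1500))  # 390X at 1500 MHz -> 384.0 GB/s
```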

http://wccftech.com/amd-project-quantum-amd-processors/

"We have Quantum designs that feature both AMD and Intel processors, so we can fully address the entire market. I'm sure you've heard AMD leaders speak before about how we're driving growth in the company and our key businesses, and that one of the key strategies we have for doing that is listening to customers."


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> A couple of reads in case people haven't come across them yet:
> 
> http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/
> 
> "AMD is pleased to bring you the new R9 390 series which has been in development for a little over a year now. To clarify, the new R9 390 comes standard with 8GB of GDDR5 memory and outpaces the 290X. Some of the areas AMD focused on are as follows:
> 1) Manufacturing process optimizations allowing AMD to increase the engine clock by 50MHz on both 390 and 390X while maintaining the same power envelope
> 2) New high density memory devices allow the memory interface to be re-tuned for faster performance and more bandwidth
> · Memory clock increased from 1250MHz to 1500MHz on both 390 and 390X
> · Memory bandwidth increased from 320GB/s to 384GB/s
> · 8GB frame buffer is standard on ALL cards, not just the OC versions
> 3) Complete re-write of the GPUs power management micro-architecture
> · Under "worst case" power virus applications, the 390 and 390X have a similar power envelope to 290X
> · Under "typical" gaming loads, power is expected to be lower than 290X while performance is increased"
> 
> http://wccftech.com/amd-project-quantum-amd-processors/
> 
> "We have Quantum designs that feature both AMD and Intel processors, so we can fully address the entire market. I'm sure you've heard AMD leaders speak before about how we're driving growth in the company and our key businesses, and that one of the key strategies we have for doing that is listening to customers."


Everything said regarding the 300 series has been true for me...

Adding information to this as I go:
http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club

Still learning the card, and it is definitely a different animal than the 290.


----------



## taem

Quote:


> Originally Posted by *Clockster*
> 
> http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/
> 
> Fury X CF performance.


Thanks for that link. My initial take on Fury was "pass" because it's not really a 4k capable card, and my 290 is fine for 1440p. But on those benches the 4k perf with crossfire seems solid. Surprising, with only 4gb vram. I think I'm still going to wait for more vram but this does make me think about it a bit more, especially if the Fiji cards with more vram are not coming anytime soon. Still -- waiting seems like the better bet though, for cheaper 4k displays and hdmi 2.0 and dp 1.3 on screen and card.

Overall, I'm underwhelmed by Fury.

Quote:


> Originally Posted by *DNMock*
> 
> Not sure if this has been posted here yet, but with an EK block, the Fury X becomes a true single slot card:


Awesome. Thank god we're ditching dvi. And I say that as someone who has a 1440p Korean monitor that only accepts dvi that I'll have to toss or buy a dp to dvi adapter for. (And those are either wonky for the cheap ones and super expensive for the ones that work.)

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can't help but think that using some Fujipoly Ultra Extreme like I used on my EK waterblocks would do wonders for these VRMs.


FUE did nothing for my Heatkiller 290, temps did not change at all. Maybe Watercool provides higher quality stock pads.


----------



## Agent Smith1984

Quote:


> Originally Posted by *taem*
> 
> Thanks for that link. My initial take on Fury was "pass" because it's not really a 4k capable card, and my 290 is fine for 1440p. But on those benches the 4k perf with crossfire seems solid. Surprising, with only 4gb vram. I think I'm still going to wait for more vram but this does make me think about it a bit more, especially if the Fiji cards with more vram are not coming anytime soon. Still -- waiting seems like the better bet though, for cheaper 4k displays and hdmi 2.0 and dp 1.3 on screen and card.
> 
> Overall, I'm underwhelmed by Fury.
> Awesome. Thank god we're ditching dvi. And I say that as someone who has a 1440p Korean monitor that only accepts dvi that I'll have to toss or buy a dp to dvi adapter for. (And those are either wonky for the cheap ones and super expensive for the ones that work.)
> FUE did nothing for my Heatkiller 290, temps did not change at all. Maybe Watercool provides higher quality stock pads.


Doesn't this card come with a DP-to-dual-DVI adapter?
Could've sworn I saw one retail package including one.....


----------



## Clockster

For the guys interested, it looks like Asus will have the first air-cooled Fury cards, mainly their Strix or DCIII setup.


----------



## flopper

Quote:


> Originally Posted by *Clockster*
> 
> For the guys interested it looks like Asus will have the 1st air cooled fury cards, mainly their Strix or DCIII setup.


link?


----------



## Clockster

Quote:


> Originally Posted by *flopper*
> 
> link?


Just something I heard from someone









Edit: I should also mention that the Fury X launch clocks were lower than what I saw a few weeks ago; not sure why AMD dropped the speed, though.


----------



## flopper

Quote:


> Originally Posted by *Clockster*
> 
> Just something I heard from someone


Ok wink wink I know what you mean


----------



## xer0h0ur

I don't know what you did, but there isn't a chance in hell stock pads performed identically to FUE. I can't measure the difference it made on my 295X2, since there aren't any temperature readouts on that card's VRMs, but I can certainly say it made a difference in the temps shown for my 290X's VRMs. Note that I waterblocked my cards, though; I didn't just change the pads on the stock blower.


----------



## HiTechPixel

Quote:


> Originally Posted by *taem*
> 
> Awesome. Thank god we're ditching dvi. And I say that as someone who has a 1440p Korean monitor that only accepts dvi that I'll have to toss or buy a dp to dvi adapter for. (And those are either wonky for the cheap ones and super expensive for the ones that work.)


Indeed, aesthetically speaking DVI ports are ugly as hell and take up a lot of space.


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know what you did but there isn't a chance in hell stock pads performed identically to FUE. I can't measure the difference it made on my 295X2 as there aren't any temperature readouts on the VRMs of that card but I can certainly say it made a difference on the temps shown for my 290X's VRMs. Note I waterblocked my cards though. I didn't just change the pads on the stock blower.


It's a Heatkiller waterblock, and I'm telling you temps did not change after upgrading the stock pads to FUE. This isn't something you can screw up; it's not like paste. Maybe temps are lower by 1-2C, nothing like the 15-20C drops most folks report. I should note, though, that my temps were already good with the stock pads (maxed in the low 60s on a warm day with the 290 OC'd to 1200 core), so getting the FUE was probably pointless in the first place: with the stock pads I was already getting what a lot of folks get with the FUE. I had a thread about this where I posted numbers.

That's why I'm speculating Watercool might provide better-quality stock pads. I wouldn't be surprised; everything they do is straight out of the top drawer.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *escksu*
> 
> Lol... screw driver!!!
> 
> Btw, I was too lazy to clean off the TIM, but put back the heatsink.... apparently temps not affected. But I will use the standard method of tissue paper and alcohol. Should not damage the GPU. Only thing is to wait for the EK waterblock
> 
> 
> 
> I have used alcohol and q-tips with precision for ages. Ever since I modded my Voodoo 5 5500 waaaaaaaaaaaaaaaay back in the day.

I use blue towels and isopropyl. If it's difficult to clean using isopropyl, then I use Arctic Cleaner and Conditioner. I do use Q-tips, but only for VRMs.









Blue towels are the shiznit.









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *taem*
> 
> It's a Heatkiller waterblock, and I'm telling you temps did not change after upgrading the stock pads to FUE (Fujipoly Ultra Extreme). This isn't something you can screw up; it's not like paste. Maybe temps are lower by 1-2C, nothing like the 15-20C drops most folks report. I should note, though, that my temps were already good with the stock pads (maxed in the low 60s on a warm day with the 290 OC'd to 1200 core), so getting the FUE was probably pointless in the first place: with the stock pads I was already getting what a lot of folks get with the FUE. I had a thread about this where I posted numbers.
> 
> It's why I'm speculating that Watercool might provide better-quality pads. I wouldn't be surprised; everything they do is straight out of the top drawer.


Okay, that was a reading fail on my part then. I read it as the stock pads used on the original cooler being as efficient as FUE, which was nearly causing an eye twitch because I knew there wasn't a chance in hell that was right.


----------



## Final8ty

http://forums.overclockers.co.uk/showthread.php?p=28229521#post28229521


----------



## Agent Smith1984

Quote:


> Originally Posted by *Final8ty*
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?p=28229521#post28229521




















Here is the firestrike for that setup:

http://www.3dmark.com/3dm/7500097


----------



## flopper

Quote:


> Originally Posted by *Final8ty*
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?p=28229521#post28229521


Yea nice tidy chassis.
Mine is like a earthquake passed by.

red red red radeon


----------



## xer0h0ur

Note he ignored his own company's instruction to keep the radiator above the card's height. This is the only damn way anyone can put that many of those radiators into a moderately sized case.


----------



## joeh4384

Quote:


> Originally Posted by *Final8ty*
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?p=28229521#post28229521


Wow.


----------



## HiTechPixel

Those four AIOs look like a mess.


----------



## xer0h0ur

Well no matter how hard you try you're not going to make that look elegant. That is why people end up going with custom open looks and waterblocks.


----------



## Mtom

Quote:


> Originally Posted by *HiTechPixel*
> 
> They are cooled by a single water-bearing copper pipe which frankly doesn't seem to do the job. If early thermal imaging is any indication then the VRMs on the Fury X run as hot as the VRMs on the Titan X. In other words, over 100 °C.


About the Fury VRM heat: the thermal images showing 100°C were taken under FurMark load with 380 W power consumption. Tom's Hardware monitored it in gaming as well, and the VRMs only hit the 60s.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well no matter how hard you try you're not going to make that look elegant. That is why people end up going with custom open looks and waterblocks.


And then you have people wondering why you'd spend several thousand on water cooling when stock cooling *JUST WERKS*. Same thing with people wondering why I color-match my hardware. I seriously can't live with a computer that doesn't look good, even if it uses air cooling.


----------



## Kane2207

Quote:


> Originally Posted by *HiTechPixel*
> 
> And then you have people wondering why you'd spend several thousand on water cooling when stock cooling *JUST WERKS*. Same thing with people wondering why I color-match my hardware. I seriously can't live with a computer that doesn't look good, even if it uses air cooling.


Haha, can't live without colour-matched components in his PC. That's the most first-world problem I've ever heard on OCN.

+rep


----------



## HiTechPixel

Quote:


> Originally Posted by *Kane2207*
> 
> Haha, can't live without colour-matched components in his PC. That's the most first-world problem I've ever heard on OCN.
> 
> +rep


I know, I know, but I can't help myself. I'm dumb like that: BLUE RAM IN A MONOTONE BUILD?! CALL THE POLICE! AN UNSLEEVED FAN CABLE THAT IS BARELY VISIBLE?! CALL THE GOVERNMENT AND NUKE THE COMPUTER!


----------



## xer0h0ur

LOL, you're telling me. If you take a look at my rig's current setup, you'll see I went through all that trouble when my case doesn't even have a window to begin with. Doesn't matter to me though, since I know exactly what it's like in there. I do it for my personal satisfaction, not to show it off.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, you're telling me. If you take a look at my rig's current setup, you'll see I went through all that trouble when my case doesn't even have a window to begin with. Doesn't matter to me though, since I know exactly what it's like in there. I do it for my personal satisfaction, not to show it off.


Exactly my point! Nobody but you really knows what the inside of your computer looks like, but still, I have to make it look good! And then a couple of months later I've forgotten how it looks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *HiTechPixel*
> 
> Exactly my point! Nobody but you really knows what the inside of your computer looks like, but still, I have to make it look good! And then a couple of months later I've forgotten how it looks.


I'm with you man.

I sold off my 290 Tri-X's and migrated to 390 Gamings just for the color scheme. See the old setup in my sig.

I actually painted the outside edges of the 290s with red nail polish just to cover that ugly yellow.


----------



## DNMock

Quote:


> Originally Posted by *HiTechPixel*
> 
> And then you have people wondering why you'd spend several thousand on water cooling when stock cooling *JUST WERKS*. Same thing with people wondering why I color-match my hardware. I seriously can't live with a computer that doesn't look good, even if it uses air cooling.


I've got your solution right here:

Non heat sinks:



Heat Sinks:



Now you can get whatever components you want without having to worry about breaking your color scheme.


----------



## xer0h0ur

If somebody manages to find out how many ACEs (asynchronous compute engines) Fiji was given, please inform us. I can't seem to get this question answered at all, and it's a very important factor going forward with DX12.


----------



## RaduZ

Lol, those guys on the overclockers.co.uk thread are nuts. They were like "meh, I ordered one, but when I saw how good they look in your system I ordered a couple more...", like they're talking about pizza or something.

Also, owners seem to be very happy with the cards.


----------



## Kane2207

Quote:


> Originally Posted by *RaduZ*
> 
> Lol, those guys on the overclockers.co.uk thread are nuts. They were like "meh, I ordered one, but when I saw how good they look in your system I ordered a couple more...", like they're talking about pizza or something.
> 
> Also, owners seem to be very happy with the cards.


They banned users and deleted the posts of anyone even slightly critical of AMD before launch, IIRC, so take from that forum what you will.


----------



## xer0h0ur

Quote:


> Originally Posted by *xer0h0ur*
> 
> If somebody manages to find out how many ACEs (asynchronous compute engines) Fiji was given, please inform us. I can't seem to get this question answered at all, and it's a very important factor going forward with DX12.


Never mind, I saw a die diagram in TechPowerUp's review. It still has 8 ACEs, like the 285/290/290X. A bit confusing again that they stood pat on an architectural lead. I guess they felt being able to run 64 async commands to Maxwell 2's 31 was enough of a lead. I still think they should have packed in more ROPs and ACEs.


----------



## Final8ty

Quote:


> Originally Posted by *Kane2207*
> 
> They banned users and deleted the posts of anyone even slightly critical of AMD before launch, IIRC, so take from that forum what you will.


False. No one gets banned or has posts deleted for being critical of AMD or NV; users get banned or have posts deleted for how they behave toward other forum members and for blatant troublemaking. There are ways to go about being critical and there are ways not to.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Never mind, I saw a die diagram in TechPowerUp's review. It still has 8 ACEs, like the 285/290/290X. A bit confusing again that they stood pat on an architectural lead. I guess they felt being able to run 64 async commands to Maxwell 2's 31 was enough of a lead. I still think they should have packed in more ROPs and ACEs.


I too think they should have done that. But maybe it wasn't feasible for them, for whatever reason? A hardware or physical constraint, perhaps? Or maybe an economic one?


----------



## Kane2207

Quote:


> Originally Posted by *Final8ty*
> 
> False. No one gets banned or has posts deleted for being critical of AMD or NV; users get banned or have posts deleted for how they behave toward other forum members and for blatant troublemaking. There are ways to go about being critical and there are ways not to.


That's not how I remember things happening when people were linking the controversial (and rightly so) Kitguru video.

Bans and edits were handed out sharply.


----------



## Final8ty

Quote:


> Originally Posted by *Kane2207*
> 
> That's not how I remember things happening when people were linking the controversial (and rightly so) Kitguru video.
> 
> Bans and edits were handed out sharply.


As I have said, it's all about how forum members go about making their comments.


----------



## Kane2207

Quote:


> Originally Posted by *Final8ty*
> 
> As I have said, it's all about how forum members go about making their comments.


Fair comment


----------



## RaduZ

Quote:


> Originally Posted by *Kane2207*
> 
> That's not how I remember things happening when people were linking the controversial (and rightly so) Kitguru video.
> 
> Bans and edits were handed out sharply.


Well I got banned for that when I linked it in a comment on the Kitguru article with the Fury.


----------



## Final8ty

Quote:


> Originally Posted by *RaduZ*
> 
> Well I got banned for that when I linked it in a comment on the Kitguru article with the Fury.


You would only get banned for a link if it contained inappropriate language, or if the thread had been closed because of silly bickering and you linked the article again in another thread after the fact.


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> I too think they should have done that. But maybe it wasn't feasible for them, for whatever reason? A hardware or physical constraint, perhaps? Or maybe an economic one?


Yeah man your guess is as good as mine.


----------



## RaduZ

Quote:


> Originally Posted by *Final8ty*
> 
> You would only get banned for a link if it contained inappropriate language, or if the thread had been closed because of silly bickering and you linked the article again in another thread after the fact.


I posted a link to a kitguru clip on a kitguru article and that's the whole story.


----------



## Final8ty

Quote:


> Originally Posted by *RaduZ*
> 
> I posted a link to a kitguru clip on a kitguru article and that's the whole story.


Sorry, but it's never as simple as that.


----------



## ban25

Just placed an order for a Sapphire Fury X with Tiger Direct after striking out with my Amazon pre-order (never fulfilled). Has anyone who ordered from them received a shipping notification yet?


----------



## Tivan

Quote:


> Originally Posted by *Final8ty*
> 
> Sorry, but it's never as simple as that.


Kitguru were not interested in having the video that led to them losing their Fury X review sample anywhere near the article where they tried to spin the story in a very different way.

The comments are run using Disqus; I don't think Kitguru hold themselves to the standards one would hold oneself to when running an actual community.

That's about the full story I'd imagine.


----------



## DividebyZERO

Quote:


> Originally Posted by *ban25*
> 
> Just placed an order for a Sapphire Fury X with Tiger Direct after striking out with my Amazon pre-order (never fulfilled). Has anyone who ordered from them received a shipping notification yet?


AMDMatt did; he's got all 4 that were shipped out.


----------



## RaduZ

Quote:


> Originally Posted by *Final8ty*
> 
> Sorry, but it's never as simple as that.


But it is, and 2 fellow OCN users had the same fate when they tried.


----------



## MlNDSTORM

Hoping to pick this up in a month or 2, upgrading from my 780 Ti.


----------



## Final8ty

Quote:


> Originally Posted by *RaduZ*
> 
> But it is, and 2 fellow OCN users had the same fate when they tried.


There was bickering; posts got deleted and suspensions were handed out. Another thread was made and it got locked. Warnings are given, and if you post the link again after that you will get suspended.


----------



## Tivan

Quote:


> Originally Posted by *Final8ty*
> 
> There was bickering; posts got deleted and suspensions were handed out. Another thread was made and it got locked. Warnings are given, and if you post the link again after that you will get suspended.


Are you trying to imply that posting a link to a video highly relevant to the topic at hand leads to a suspension merely because other users were bickering while posting the same video?


----------



## Final8ty

Quote:


> Originally Posted by *Tivan*
> 
> Are you trying to imply that posting a link to a video highly relevant to the topic at hand leads to a suspension merely because other users were bickering while posting the same video?


No, there would have been warnings first.


----------



## methadon36

Quote:


> Originally Posted by *Final8ty*
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?p=28229521#post28229521


Couldn't get a single card and this guy gets 4 lol...


----------



## Final8ty

Quote:


> Originally Posted by *methadon36*
> 
> Couldn't get a single card and this guy gets 4 lol...


There is another guy in the same thread who also got 4.


----------



## Boomstick727

Liking the Fury X, back to AMD until Pascal arrives


----------



## Final8ty

@ Boomstick727 you took your time getting here.


----------



## Boomstick727

Quote:


> Originally Posted by *Final8ty*
> 
> @ Boomstick727 you took your time getting here.


Ha, been a busy day


----------



## methadon36

Quote:


> Originally Posted by *Final8ty*
> 
> There is another guy in the same thread who also got 4.


Are they distributors, AMD employees, or just very lucky people with spare cash to spend?


----------



## KnightWolf654

Got mine today and I may have got a dud. Once I got the drivers installed I started getting a BSOD saying the device driver got stuck in an infinite loop. I can't get into Windows for more than 30 seconds without it coming up. I'm going to try reinstalling Windows and see if that helps; anyone got any other ideas?


----------



## Final8ty

Quote:


> Originally Posted by *methadon36*
> 
> Are they distributors, AMD employees, or just very lucky people with spare cash to spend?


The pics are from an AMD employee, but he had to buy from retail with his own money. The other guy, Kaapstad, is just a user like me and you; found him: http://www.overclock.net/u/187871/kaapstad.


----------



## bonami2

Quote:


> Originally Posted by *methadon36*
> 
> Are they distributors, AMD employees, or just very lucky people with spare cash to spend?


Well, if you're an employee at a main shop I'm sure you can grab them.


----------



## Final8ty

Quote:


> Originally Posted by *bonami2*
> 
> Well, if you're an employee at a main shop I'm sure you can grab them.


Nope he had to order online like everyone else.


----------



## methadon36

Quote:


> Originally Posted by *Final8ty*
> 
> The pics are from an AMD employee, but he had to buy from retail with his own money. The other guy, Kaapstad, is just a user like me and you; found him: http://www.overclock.net/u/187871/kaapstad.


Alrighty then. If you'll excuse me, I'm gonna go rub all this jelly on some toast.


----------



## Ceadderman

So bummed. Furies are getting snapped up left and right, and Sapphire will likely have no more when I am ready.









Guess I will wait for x2 to launch.









~Ceadder


----------



## HiTechPixel

Quote:


> Originally Posted by *KnightWolf654*
> 
> Got mine today and I may have got a dud. Once I got the drivers installed I started getting a BSOD saying the device driver got stuck in an infinite loop. I can't get into Windows for more than 30 seconds without it coming up. I'm going to try reinstalling Windows and see if that helps; anyone got any other ideas?


Never had that problem before. But don't rack your brain trying to solve it if there's no solution; just return it if possible.


----------



## p4inkill3r

Quote:


> Originally Posted by *KnightWolf654*
> 
> Got mine today and I may have got a dud. Once I got the drivers installed I started getting a BSOD saying the device driver got stuck in an infinite loop. I can't get into Windows for more than 30 seconds without it coming up. I'm going to try reinstalling Windows and see if that helps; anyone got any other ideas?


DDU in safe mode or Windows reinstall would be my first action.


----------



## bonami2

Quote:


> Originally Posted by *Final8ty*
> 
> Nope he had to order online like everyone else.


Canada Computers already shows them on their website; I don't know if that means they had stock.

It's not only online sellers that have priority.


----------



## p4inkill3r

I got upgraded to one day shipping on Amazon Prime but their rep had no ETA for shipment. ;/


----------



## KnightWolf654

I tried that a few times with no luck. Since I haven't reinstalled Windows in almost 4 years, I'm hoping that will resolve it. I guess if not I will contact Newegg for a replacement.


----------



## p4inkill3r

Quote:


> Originally Posted by *KnightWolf654*
> 
> I tried that a few times with no luck. Since I haven't reinstalled Windows in almost 4 years, I'm hoping that will resolve it. I guess if not I will contact Newegg for a replacement.


What GPU are you switching from?


----------



## KnightWolf654

Quote:


> Originally Posted by *p4inkill3r*
> 
> What GPU are you switching from?


HD7970


----------



## p4inkill3r

The infinite loop bug has been going on for years. Which drivers are you installing?


----------



## hyp36rmax

Quote:


> Originally Posted by *Boomstick727*
> 
> Liking the Fury X, back to AMD until Pascal arrives
About time! Welcome to the club #1!


----------



## escksu

Nice card and nice comp!!


----------



## hyp36rmax

*Now that some of you have gotten your FURY X GPU's be sure to register!*

It's Live! Click *Here* or this short cut!











Spoiler: Warning: Spoiler! Members Registration











Bring it! Welcome to the club!


----------



## Boomstick727

Quote:


> Originally Posted by *hyp36rmax*
> 
> 
> About time! Welcome to the club #1!


Quote:


> Originally Posted by *escksu*
> 
> Nice card and nice comp!!


Cheers guys









Wasn't sure if that second comment was aimed at me but I'll take it anyway lol.


----------



## xer0h0ur

For what it's worth, the only time I ever had a video card BSOD every time Windows booted was with my cousin's MSI Gaming 290X that came in a brand-new rig.


----------



## 12Cores

Has anyone here upgraded from a 7970/280X CrossFire setup? If so, is performance comparable?


----------



## ban25

Quote:


> Originally Posted by *Boomstick727*
> 
> Liking the Fury X, back to AMD until Pascal arrives


Congrats!


----------



## Maximization

EK said blocks will be ready next week. I don't see any Fury X's available anywhere.


----------



## Casey Ryback

Quote:


> Originally Posted by *12Cores*
> 
> Has anyone here upgraded from a 7970/280X CrossFire setup? If so, is performance comparable?


The Fury X is around double the performance (of a single 7970), with no CrossFire software issues, and lower heat and noise of course.

You're going to get much lower frame-time variance on a single card, so it will bring a much smoother experience overall.

edited to avoid confusion.


----------



## Clockster

Monday can't come soon enough. Really looking forward to trying 2 Fury x in cf


----------



## Orthello

Quote:


> Originally Posted by *Clockster*
> 
> Monday can't come soon enough. Really looking forward to trying 2 Fury x in cf


From the benches I've seen, CF Fury Xs are doing really well. It'll be interesting to see if some sites do follow-up CF reviews in the coming weeks.


----------



## Bludge

Quote:


> Originally Posted by *Casey Ryback*
> 
> The fury X is around double the performance (of the single card), no crossfire software issues, lower heat and noise of course.


No CrossFire issues? That would be a first; the dual-GPU cards I've owned have had exactly the same issues in unsupported games as my dual-card setups.


----------



## Clockster

Quote:


> Originally Posted by *Orthello*
> 
> From the benches I've seen, CF Fury Xs are doing really well. It'll be interesting to see if some sites do follow-up CF reviews in the coming weeks.


Yeah all the cf benches I've come across show fantastic scaling.
Quote:


> Originally Posted by *Bludge*
> 
> No CrossFire issues? That would be a first; the dual-GPU cards I've owned have had exactly the same issues in unsupported games as my dual-card setups.


I don't think he mentioned anything about issues; he was specifically talking about performance.
That said, a single Fury X is still enough for 4K, so if I run into issues I'll just disable the 2nd card until a fix or update is released.
Dual-card systems are almost never hassle-free. This goes for both AMD and Nvidia; I've had enough CF and SLI rigs to know this.


----------



## Elmy

I am getting mine on the next batch. Quad Fury X's ... Club3D brand.


----------



## Arizonian

Quote:


> Originally Posted by *hyp36rmax*
> 
> *Now that some of you have gotten your FURY X GPU's be sure to register!*
> 
> It's Live! Click *Here* or this short cut!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler! Members Registration
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bring it! Welcome to the club!


And with these cards starting to come in, with a great OP (thank you, hyp36rmax), our GPU editor Staryoshi has stopped by, I see, and given the club the *[Official]* tag. Congrats!









With that, I'd like to remind members that this is now officially a club with owners. Please remember, for you non-owners, there is a discussion thread in the news section to voice your opinions on these cards. This is now officially a place for owners to discuss their GPU experience, overclocking, issues, etc. ENJOY these cool beasts!


----------



## Casey Ryback

Quote:


> Originally Posted by *Bludge*
> 
> No CrossFire issues? That would be a first; the dual-GPU cards I've owned have had exactly the same issues in unsupported games as my dual-card setups.


The Fury X isn't a dual-GPU card though... hope that's on topic for the now-official owners club.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Elmy*
> 
> I am getting mine on the next batch. Quad Fury X's ... Club3D brand.


I'm jelly... Always wanted to give Club3D a go, but they're never sold here.


----------



## KnightWolf654

Quote:


> Originally Posted by *p4inkill3r*
> 
> The infinite loop bug has been going on for years. Which drivers are you installing?


I'm installing the 15.15 drivers from AMD; even tried the one Gigabyte had. Also tried the 15.6 drivers for the other cards, but it wouldn't even let me. Reinstalling Windows now.


----------



## Nickyvida

Hi guys.

Just a quick question: have tech sites confirmed that the Fury X will not come in an air-cooled version, now that the embargo has lifted?

I've been reading conflicting reports that an air-cooled Fury X will not be for sale.


----------



## eurostyle360

Quote:


> Originally Posted by *Nickyvida*
> 
> Hi guys.
> Just a quick question: have tech sites confirmed that the Fury X will not come in an air-cooled version, now that the embargo has lifted?
> I've been reading conflicting reports that an air-cooled Fury X will not be for sale.


I'm fairly certain the Fury X will remain a water-cooled reference design only, and the Fury (non-X) will come out soon in air-cooled form and can be customized by manufacturers.

So does anyone know where I can buy one that's in stock? I tried TigerDirect, but I don't think I can wait until the estimated July 7th delivery date.


----------



## hyp36rmax

Quote:


> Originally Posted by *Arizonian*
> 
> And with these cards starting to come in, with a great OP (thank you, hyp36rmax), our GPU editor Staryoshi has stopped by, I see, and given the club the *[Official]* tag. Congrats!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With that, I'd like to remind members that this is now officially a club with owners. Please remember, for you non-owners, there is a discussion thread in the news section to voice your opinions on these cards. This is now officially a place for owners to discuss their GPU experience, overclocking, issues, etc. ENJOY these cool beasts!


Thank you Arizonian and Staryoshi!


----------



## p4inkill3r

Quote:


> Originally Posted by *eurostyle360*
> 
> I'm fairly certain the Fury X will remain a water-cooled reference design only, and the Fury (non-X) will come out soon in air-cooled form and can be customized by manufacturers.
> 
> So does anyone know where I can buy one that's in stock? I tried TigerDirect, but I don't think I can wait until the estimated July 7th delivery date.


The Fury X is going to be limited to the reference design, much like Nvidia's Titan. The non-X Fury will be the one you see with Sapphire's Tri-X, MSI's Twin Frozr, etc. when it is released next month.

I don't see any in stock anywhere; my order, placed with Amazon the minute they were listed, is as of this morning still unfulfilled. Some people have received theirs from Newegg and Tiger, ordered when the initial stock was listed, but since then Newegg has yet to release any more.


----------



## Himo5

How long are those water pipes on the Fury X? Are they long enough to reach the back fan vent from Slot 5 instead of Slot 2?


----------



## kayan

Just a question to all those who have gotten their X's: Has anyone come from [email protected] or a 295x2? If so, what are your impressions between the [email protected] vs 1 (or 2) Fury X? Or even 980ti vs Fury X?

I'm contemplating ditching my 295x2 for a Fury X. I saw the benchies, but curious about user experiences.


----------



## Clockster

Quote:


> Originally Posted by *kayan*
> 
> Just a question to all those who have gotten their X's: Has anyone come from [email protected] or a 295x2? If so, what are your impressions between the [email protected] vs 1 (or 2) Fury X? Or even 980ti vs Fury X?
> 
> I'm contemplating ditching my 295x2 for a Fury X. I saw the benchies, but curious about user experiences.


I'll be able to tell you on Monday next week.

(I used to run 290X CrossFire before my 980s.)


----------



## Ceadderman

Quote:


> Originally Posted by *Himo5*
> 
> How long are those water pipes on the Fury X? Are they long enough to reach the back fan vent from Slot 5 instead of Slot 2?


Tiny Tom says they're massively long. He had a bit to play with in his giant case; he stated that if you were to cram that into an mITX case you'd have to route and secure them to keep them out of the way. Forget which case he has, but it's *big*.









~Ceadder


----------



## Digitalwolf

I have mine installed now... for some reason FedEx said it missed their cut-off on Wednesday, so it didn't show until today. First impression was that the box it came in was huge. I put it in my mATX build, which uses a Thermaltake V21, and the box just, well, looks huge in comparison to my case. My card is an XFX, because that's all I saw in stock before they completely vanished.

After installing the card and powering on, I have a very faint high-pitched noise. That noise never changes regardless of what I'm doing, so I'm thinking it's probably the internal pump. To be clear, this is more like a background noise, but it's a pitch I can notice; it's just not really bothersome at all. During gameplay and Heaven runs GPU-Z shows my Fury's fan at 4800+ rpm, and noise-wise it sounds the same as at idle. The fan only drops down to around 4000 rpm at idle, which, for a noise comparison, is about the same as my EVGA SSC GTX 970 at 0 rpm. During benches and some games my 970 gets fairly loud.

So far I am liking the card (aka in the entire couple of hours I've had it installed).

Oh, as for the question about the length of the cooling tubes: they are pretty long and flexible except close to the card/rad, where they're more rigid. I didn't take the time to measure them, but in my smallish case I have quite a bit of excess.


----------



## frunction

Picked one up yesterday at Microcenter.

For me it was a very simple decision: $1,700 for a 980 Ti Hybrid and a 1440p G-Sync monitor, or $1,100 for a Fury X and a FreeSync monitor. For that price difference I don't care about 5 fps at release.

Looks very smooth in the games I've tried so far, and it's surprisingly quieter than the beefy custom water-cooled loop I had previously.

My last cards were (3) 780 Ti, (2) Titan, (2) 680, and before that some 6950s. So no feelings/fanboyism involved; sync is just so much less expensive with AMD. Plus the monitor has multiple inputs.


----------



## DNMock

Quote:


> Originally Posted by *frunction*
> 
> Picked one up yesterday at Microcenter.
> 
> For me it was a very simple decision: $1,700 for a 980 Ti Hybrid and a 1440p G-Sync monitor, or $1,100 for a Fury X and a FreeSync monitor. For that price difference I don't care about 5 fps at release.
> 
> Looks very smooth in the games I've tried so far, and it's surprisingly quieter than the beefy custom water-cooled loop I had previously.
> 
> My last cards were (3) 780 Ti, (2) Titan, (2) 680, and before that some 6950s. So no feelings/fanboyism involved; sync is just so much less expensive with AMD. Plus the monitor has multiple inputs.


Can you increase the voltage on it? Still want to see what kind of OC these cards can hit.

How well they OC, and how well CrossFire scales vs. the SLI scaling of the 980 Ti / Titan X: I'm very curious to see how that goes.

Oh, if anyone has a link to a TriFire benchmark of the Fury X, please share. Wanna see if it beats out dual 1500+ clocked Titan X cards or not.


----------



## Kuivamaa

Quote:


> Originally Posted by *p4inkill3r*
> 
> The Fury X is going to be limited to the reference design, much like Nvidia's Titan. The non-X Fury will be the one you see with Sapphire's Tri-X, MSI's Twin Frozr, etc. when it is released next month.
> 
> I don't see any in stock anywhere; my order, placed with Amazon the minute they were listed, is as of this morning still unfulfilled. Some people have received theirs from Newegg and Tiger, ordered when the initial stock was listed, but since then Newegg has yet to release any more.


Price isn't confirmed here either, and I won't be placing an order before it is. I'm afraid that at this pace I will have to wait till September, because I don't want it delivered to my local post office while I am abroad on vacation. Oh well, not that my 290X isn't running nicely.


----------



## Ceadderman

Think I'm gonna wait for the Fury X2. Not that I don't want a Fury, but TTLogan made a reasonable point about how short the Fury looks in a full-tower system. The X2 should be at least double the length and likely double the price.









~Ceadder


----------



## ozyo

So are we going to get an X Lightning?


----------



## xer0h0ur

Quote:


> Originally Posted by *DNMock*
> 
> Can you increase the voltage on it? Still want to see what kind of OC these cards can hit.
> 
> How well they OC, and how well CrossFire scales vs. the SLI scaling of the 980 Ti / Titan X: I'm very curious to see how that goes.
> 
> Oh, if anyone has a link to a TriFire benchmark of the Fury X, please share. Wanna see if it beats out dual 1500+ clocked Titan X cards or not.


We're still waiting on an Afterburner update allowing voltage changes. Even the people overclocking felt as if increasing the power limit was counterproductive with the current Afterburner, and I wouldn't be surprised if that isn't working right either.


----------



## Himo5

The thing to do is to get into as many competitions with Fury X prizes as you can. Have you tried reinstalling GPU-Z recently?


----------



## xer0h0ur

Quote:


> Originally Posted by *ozyo*
> 
> so are we going to get x lightning ?


Nope. AMD locked out AIBs from modifying the reference design of Fury X.


----------



## fewness

I guess I used up all my luck to get a FuryX...so I can't install the driver..



What should I do? I used the driver uninstaller from Guru3D to clean up Nvidia things. I installed all VC++ and DX packages from MS. I'm on Windows 8....please help:wheee:


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nope. AMD locked out AIBs from modifying the reference design of Fury X.


You serious? At this point I won't be getting a Fury X card to play with now. Seems like Pascal might be my next upgrade path.


----------



## Neon Lights

Quote:


> Originally Posted by *fewness*
> 
> I guess I used up all my luck to get a FuryX...so I can't install the driver..
> 
> 
> 
> What should I do? I used the driver uninstaller from Guru3D to clean up Nvidia things. I installed all VC++ and DX packages from MS. I'm on Windows 8....please help:wheee:


Try using the AMD Cleanup Utility and plugging your graphics card into a different PCIe slot.


----------



## p4inkill3r

Quote:


> Originally Posted by *fewness*
> 
> I guess I used up all my luck to get a FuryX...so I can't install the driver..
> 
> 
> 
> What should I do? I used the driver uninstaller from Guru3D to clean up Nvidia things. I installed all VC++ and DX packages from MS. I'm on Windows 8....please help:wheee:


You ran DDU in safe mode?


----------



## xer0h0ur

Quote:


> Originally Posted by *Roaches*
> 
> You serious? At this point I won't be getting a Fury X card to play with now. Seems like Pascal might be my next upgrade path.


Yup. Pages back there is a question-and-answer session video where AMD guys say the Fury X is locked and the regular Fury is getting the custom AIB treatment. AMD pulled the same crap Nvidia did with the Titan X by locking AIBs out of changing the cooler, PCB or backplate.


----------



## Digitalwolf

Quote:


> Originally Posted by *fewness*
> 
> I guess I used up all my luck to get a FuryX...so I can't install the driver..
> 
> 
> 
> What should I do? I used the driver uninstaller from Guru3D to clean up Nvidia things. I installed all VC++ and DX packages from MS. I'm on Windows 8....please help:wheee:


Are you installing from an included driver CD or did you download a driver?

To me this is what you'd see if you tried to install 15.5 or 15.6 as opposed to 15.15...

If you choose custom and only see two options (Catalyst Install Manager and one other thing), it's the wrong driver and will fail like that.


----------



## fewness

Quote:


> Originally Posted by *Digitalwolf*
> 
> Are you installing from an included driver CD or did you download a driver?
> 
> To me this is what you'd see if you tried to install 15.5 or 15.6 as opposed to 15.15...
> 
> If you choose custom and only see two options (Catalyst Install Manager and one other thing.. its the wrong driver and will fail like that.)


Man, you saved my day! I went to Guru3D and followed their link, which says 15.15 but linked to 15.6... what the...

All good and up and running now. Thank you!


----------



## jase78

hey guys, here's a good video that really breaks the Fury X down with no bias


----------



## Roaches

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yup. Pages back there is a question and answer session video where AMD guys say Fury X is locked and regular Fury is getting the custom AIB treatment. AMD pulled the same crap Nvidia did with Titan X by locking out AIBs from changing the cooler, PCB or backplate.


Considering they've skimped even on the I/O interfaces (no HDMI 2.0, leaving out the potential HTPC market), I almost feel they've pulled a GPU Bulldozer here. HBM is really nice on paper; I wanted to give one a try, but oh well. Way to go losing even more market share, AMD.









I'll wait to see what the dual Fury X and the non-X version hold. I'd love to see a Devil 13 successor, but chances are slim.


----------



## xer0h0ur

Quote:


> Originally Posted by *Roaches*
> 
> Considering they've skimped even on the I/O interfaces, (no HDMI 2.0; leaving out the potential HTPC market) I almost feel they've nearly pulled a GPU Bulldozer here. HBM is really nice on paper; wanted to give one a try but oh well. Way to go to lose even more marketshare AMD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll wait for what Dual Fury X and non X card version holds, I'd love to see a Devil 13 successor, but chances are slim.


That would be an entirely different animal, though. I don't believe AMD would hold AIBs back from creating custom Fury X2s. They didn't with the 295X2, and this is no different.


----------



## taem

Quote:


> Originally Posted by *Boomstick727*
> 
> Liking the Fury X, back to AMD until Pascal arrives


What are those cables?

AgentSmith said earlier that Fury X might come with a DP to DVI adapter cable. But looking at shop descriptions they don't mention one being included.


----------



## xer0h0ur

Bottom cable is clearly full size DP to DVI. The top one I can't make out but I would suspect HDMI.


----------



## Ceadderman

Try removing all drivers and starting fresh.









As far as AMD not allowing vendors to modify, well I seriously doubt that. TTLogan suggested that vendors will likely modify Fury Pro to get the most out of the upcoming card. So I don't put much stock in the rumor mill swirling round Fury atm. He could be wrong and they might have been forbidden to mess round with the PCB, but AMD generally gives their blessings to the vendors. I've never seen AMD putting their foot down against such things.









~Ceadder


----------



## DFroN

Do we think there's a reasonable chance that aftermarket Furys will beat the reference Fury X, a la 980 Ti/Titan, or is not enough known at this point?

Tbh it would seem odd that AMD would allow non-reference Furys to beat the X so soon after the X's release.


----------



## Ha-Nocri

Can you see VRM temps on the Fury X? How is the pipe connected to the metal beneath it? Has anyone tried putting TIM in between?


----------



## blue1512

It's really hard for the Fury Pro to beat the Fury X, given that the cooler on the Fury X is extremely good. HardOCP is running a Fury X stable at 1140MHz at *37C* (100% fan via AB). Anyone here tried it?


----------



## Ceadderman

Quote:


> Originally Posted by *DFroN*
> 
> Do we think there's a reasonable chance that aftermarket Fury's will beat the reference Fury X ala 980Ti/Titan or is not enough known at this point?
> 
> Tbh it would seem odd that AMD would allow non reference Fury's to beat the X so soon after the X's release.


Anything is possible, given that the 980 Ti gives the newer Titans fits.









Tbh we really won't know til we have something else from AMD to compare to.









~Ceadder


----------



## KnightWolf654

I think I figured out my problem. I put the card into another PC and it worked fine: not a single BSOD, and I ran 3DMark several times to put a load on it and it's all good. Looks like my motherboard may be the problem. Hopefully a BIOS update will fix it; I'd rather not buy a new one as the card was expensive enough.


----------



## p4inkill3r

Quote:


> Sapphire Radeon R9 Fury X 4GB HBM HDMI / TRIPLE DP PCI-Express Graphics Card 21246-00-40G
> by Sapphire Technology
> $649.99
> *Usually ships in 1 to 3 weeks*


Updated this morning.


----------



## Sgt Bilko

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Sapphire Radeon R9 Fury X 4GB HBM HDMI / TRIPLE DP PCI-Express Graphics Card 21246-00-40G
> by Sapphire Technology
> $649.99
> *Usually ships in 1 to 3 weeks*
> 
> 
> 
> Updated this morning.

Well....that kinda sucks


----------



## G227

Quote:


> Originally Posted by *DFroN*
> 
> Do we think there's a reasonable chance that aftermarket Fury's will beat the reference Fury X ala 980Ti/Titan or is not enough known at this point?
> 
> Tbh it would seem odd that AMD would allow non reference Fury's to beat the X so soon after the X's release.


Pending possible driver updates, the key is going to be sorting out the overclocking. The gap is just too wide right now between the Fury X and the 980 Ti/Titan X when overclocked. At stock speeds the cards are comparable at 4K (1440p and 1080p go to the green team, though), but when overclocked the difference is anywhere between 5-40%. And let's be honest here: if we are buying an enthusiast-grade GPU like the Fury X, we are not going to be content with running it at stock.

JayZ did a good video review here comparing the overclocked cards, I recommend you guys watch it: 



I mean, even the KINGPIN 980 trades blows with the Fury X (and that card was running 1580, which you can get from every second good 980, not only the KINGPIN).

So yeah: overclocking, via voltage unlocks or at least some voltage boost. These cards need to get up to 1350-1400 to trade blows with the 980 Ti/Titan X, and above 1400 to beat them. And there is a higher chance the Fury Pro will do that if it enables more voltage regulation etc. Of course, that's assuming they don't cut it down, or cut very little.
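
To put rough numbers on that: taking the Fury X's published 1050MHz stock core clock against the 1350-1400 targets above, the required overclock works out like this (quick back-of-the-envelope Python sketch, nothing more):

```python
# Percent overclock needed from the Fury X's 1050 MHz stock core clock
# to reach the 1350-1400 MHz range mentioned above.
STOCK_MHZ = 1050

for target in (1350, 1400):
    gain = (target - STOCK_MHZ) / STOCK_MHZ
    print(f"{target} MHz is +{gain:.0%} over stock")
```

That's a roughly 29-33% overclock, which puts in perspective why a 90-100MHz bump feels underwhelming.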
Quote:


> Originally Posted by *p4inkill3r*
> 
> Updated this morning.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well....that kinda sucks


Linus was just talking about it yesterday on the WAN Show, saying that there was basically zero stock at launch and maybe around 1000 units available soon after for the US. Which is ridiculously low.


----------



## Minotaurtoo

I haven't read the whole thread... and I am planning on getting a fury x when I get back from vacation... I need to know if anyone has tried using 3 dp to vga adapters on one yet and if it worked... I have some monitors here that for some reason only seem to work right when on vga... go figure...

oh and subbed..


----------



## p4inkill3r

Quote:


> Hello **************,
> We have good news! We now have delivery date(s) for your item(s) listed below. We'll send a confirmation when your items ship. If you would like to view the status of your entire order or make any changes to it, please visit Your Orders on Amazon.com.
> 
> Delivery Estimate Details
> Order #*******************************
> Placed on Wednesday, June 24, 2015
> Your new estimated delivery date is: Monday, July 13, 2015 - Wednesday, July 15, 2015
> 
> Your shipping speed:
> One-Day Shipping


----------



## rv8000

Quote:


> Originally Posted by *p4inkill3r*


That's unfortunate, and here I felt bad about having to wait till Thursday for mine to arrive.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Try removing all drivers and starting fresh.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As far as AMD not allowing vendors to modify, well I seriously doubt that. TTLogan suggested that vendors will likely modify Fury Pro to get the most out of the upcoming card. So I don't put much stock in the rumor mill swirling round Fury atm. He could be wrong and they might have been forbidden to mess round with the PCB, but AMD generally gives their blessings to the vendors. I've never seen AMD putting their foot down against such things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Like I said already, AMD guys have confirmed that the Fury X will not get the custom AIB treatment. It's locked down just like Nvidia locked AIBs out of modifying the Titan X reference design. They are allowing AIBs to go nuts on the standard Fury, though (Fiji Pro). No word on the R9 Nano or Fury X2, although if I had to guess I would say they won't lock those either: I have already seen more than one cooler type on the R9 Nano, and they didn't lock AIBs out of custom 295X2s, so I figure the Fury X2 won't be locked either.


----------



## DividebyZERO

Quote:


> Originally Posted by *G227*
> 
> Pending possible driver updates - the key is going to be sorting out the overclocking. The gap is just too wide right now even between the Fury X and the 980Ti/Titan when overclocked. At stock speeds, the cards are comparable at 4K (1440p, 1080p goes to the green team though). But when overclocked the difference is anywhere between 5-40%. And let's be honest here - if we are buying enthusiast grade GPU like Fury X, we are not going to be content with running it at stock.
> 
> JayZ did a good video review here comparing the overclocked cards, I recommend you guys watch it:
> 
> 
> 
> I mean even KINGPIN 980 trades blows with Fury X (and this card was running 1580 which you can get from every second good 980 not only kingpin).
> 
> So yeah, overclocking - voltage unlocks or at least some voltage boost. These cards need to get up to 1350-1400 to trade blows with 980Ti/Titan X and above 1400 to beat it. And there is a higher chance that Fury Pro will do that enabling more voltage regulation etc. Of course - thats assuming they are not gonna cut - or cut very little.
> 
> Linus was just talking about it yesterday in the WAN show - saying that basically there were 0 stock at launch and maybe around 1000 is available soon after for US. Which is ridiculously low.


Is Fury X voltage adjustable yet? Not saying the results will be much different, but you're comparing apples to oranges like so many people are doing. Once voltage is adjustable, then compare overclocks. Why compare stock vs. overclocked now?


----------



## xer0h0ur

Well, for what it's worth, Maxwell still overclocks better than Fiji without adding voltage. Either way he's still right about needing to get voltage control over the card. We already know it's capable of drawing a good chunk of extra power over its stock power draw, so it's got access to the wattage for sure. I can't imagine AMD calling this card an overclocker's dream over a 90-100MHz overclock. Something isn't adding up quite right.


----------



## Minotaurtoo

yeah, so far the old Tahiti cards proved to be better overclockers than Fiji... I got 1200MHz out of mine with only 1.25V, and considering that they were originally 800MHz cards and Giga OC'd them to a 900MHz base and then I pushed to a 1200 base... that seems like more of a dream, since it's 50% over the initial release clock of the 7950s

edit... I feel a little stupid abandoning my cards for Fiji, but it performs at stock near to what my two cards do with my custom OC, and a lot cooler/quieter at that... not to mention power draw will be a lot less...
what do you think... I've been told to wait till the Fury non-X release, but I really don't want to go non-X again... I really regretted that after I didn't get the 7970... of course I did mod it over to a 280, but it could have been a 280X lol
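
The 50% figure above checks out against the quoted clocks (800MHz reference 7950, 900MHz Gigabyte factory OC, 1200MHz manual). A quick sketch:

```python
# Overclock headroom on the Tahiti cards described above:
# 800 MHz reference 7950, 900 MHz factory OC, 1200 MHz manual OC.
ref_mhz, factory_mhz, manual_mhz = 800, 900, 1200

print(f"vs reference clock: +{(manual_mhz - ref_mhz) / ref_mhz:.0%}")          # +50%
print(f"vs factory OC:      +{(manual_mhz - factory_mhz) / factory_mhz:.0%}")  # +33%
```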


----------



## eucalyptus

Is it possible to use the "pump" as a watercooling block only, and plug a real pump, like a D5, into the loop?

Does anyone know if the Fury Pro will have less performance than the Fury X? Thinking about getting a Fury Pro and just buying the EK waterblock.


----------



## Gdourado

Since the Fury X is way smaller than any other flagship card, how will that help prevent the card from sagging?
Is the card heavy?


----------



## rv8000

Quote:


> Originally Posted by *Gdourado*
> 
> Since the Fury X is way smaller than any other flagship card, how will it help to prevent the sagging of the card?
> IS the card heavy?


Less weight past the end of the pcie slot where the card has the least support from any direction.


----------



## Gdourado

Quote:


> Originally Posted by *rv8000*
> 
> Less weight past the end of the pcie slot where the card has the least support from any direction.


So in theory it will be more sag resistant than a longer card like a 980 ti G1 and the likes?

Cheers!


----------



## Forceman

Quote:


> Originally Posted by *eucalyptus*
> 
> Is it possible to use the "pump" as a watercooling block only, and plug in a real pump, like D5 in the loop?
> 
> Do anyone know if the Fury Pro will have less performance than Fury X? Thinking about to get a Fury Pro and just buy the EK waterblock.


The pump is built in to the block, so I don't think you'd want to try to use it in a custom loop as is. You'd also have to cut the tubing to connect it, which is less than ideal. I think either use it as is, or getting a full cover block are your only real options.

It seems pretty likely that the Fury will be a cut-down Fury X, but no idea how much. Going by previous cards, probably about 5% slower or so. Depends what they cut though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rv8000*
> 
> Less weight past the end of the pcie slot where the card has the least support from any direction.
> 
> 
> 
> So in theory it will be more sag resistant than a longer card like a 980 ti G1 and the likes?
> 
> Cheers!

Yes, it will not sag as much as a longer GPU would


----------



## Gdourado

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yes, it will not sag as much as a longer GPU would


I ask because I really hate sagging graphics cards.
To solve that, I bought a HAF XB to have the board horizontal so the GPU wouldn't strain under its own weight.

But for the Fury X, I wouldn't think the HAF XB would be ideal, as the radiator would have to be laid on its side and would not sit above the GPU as AMD recommends.

Any thoughts on this?

Cheers!


----------



## eucalyptus

Quote:


> Originally Posted by *Forceman*
> 
> The pump is built in to the block, so I don't think you'd want to try to use it in a custom loop as is. You'd also have to cut the tubing to connect it, which is less than ideal. I think either use it as is, or getting a full cover block are your only real options.
> 
> It seems pretty likely that the Fury will be a cut-down Fury X, but no idea how much. Going by previous cards, probably about 5% slower or so. Depends what they cut though.


If they cut it down, which they will, I guess it will be by compute unit count and not in other components?

Then I guess it won't be any trouble to overclock it with a fullcover block to reach the top.


----------



## rv8000

Quote:


> Originally Posted by *Gdourado*
> 
> I ask because I really hate sagging graphics cards.
> To solve that, I bought a HAF XB to have the board horizontal so the GPU wouldn't strain with the weight.
> 
> But for the fury x, I wouldn't think the HAF XB would be ideal as the radiator would have to be laid on its side and would not sit above the GPU as AMD recommends.
> 
> Any thoughts on this?
> 
> Cheers!


While it's rumored that having the rad below the pump makes the card noisier and can change other aspects of the CLC, I'm not sure I've seen a thorough or specific test done by anyone on the topic yet. Don't know all that much about WC or CLCs.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gdourado*
> 
> I ask because I really hate sagging graphics cards.
> To solve that, I bought a HAF XB to have the board horizontal so the GPU wouldn't strain with the weight.
> 
> But for the fury x, I wouldn't think the HAF XB would be ideal as the radiator would have to be laid on its side and would not sit above the GPU as AMD recommends.
> 
> Any thoughts on this?
> 
> Cheers!


Note that AMDMatt's rig with 4 Fury X's ignored this instruction. The reason they say this is that most people would otherwise complain or worry about the noise from air bubbles working their way through the closed loop. You would hear gurgling or pump noise momentarily on every startup. It's not a deal-breaker.


----------



## Thoth420

Quote:


> Originally Posted by *Gdourado*
> 
> I ask because I really hate sagging graphics cards.
> To solve that, I bought a HAF XB to have the board horizontal so the GPU wouldn't strain with the weight.
> 
> But for the fury x, I wouldn't think the HAF XB would be ideal as the radiator would have to be laid on its side and would not sit above the GPU as AMD recommends.
> 
> Any thoughts on this?
> 
> Cheers!


My plan was the same, then I saw the issue you mentioned. I opted for a Fractal R5 instead, especially after hearing complaints of pump noise... I am mounting it to spec, above the card, in the rear exhaust of the case.

Sucks, I really wanted to use the HAF XB...

Ohhh, also just kinda jumped in here at page 160. Ordered my Fury X; it should arrive Monday, and I'm building Tuesday. I will join once I have it up and running, and I'll report back about noise (loud pumps and coil whine/resonance) since I am quite sensitive to it and have defeated it in the past.


----------



## semitope

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well for what its worth, Maxwell still overclocks better than Fiji *without adding voltage*. Either way he's still right about needing to get voltage control over the card. We already know its capable of drawing a good chunk of extra power over its stock power draw so its got access to the wattage for sure. I can't imagine AMD calling this card an overclocker's dream over a 90-100MHz overclock. Something isn't adding up quite right.


I heard the voltage does change automatically when it's overclocked. Probably within a range.


----------



## rv8000

Quote:


> Originally Posted by *semitope*
> 
> I heard the voltage does change automatically when its overclocked. Probably within a range


How are we sure of this? Does any software support voltage readings yet? Or does anyone know where the measure points for vcore would be?


----------



## semitope

Quote:


> Originally Posted by *rv8000*
> 
> How are we sure of this, does any software support voltage readings yet? Or does anyone know where the measure points would be for vcore?


I was talking about Maxwell raising voltage automatically, in response to the claim that Maxwell doesn't change voltage to OC. But it's probably a range Nvidia or the partners set.


----------



## xer0h0ur

I didn't explicitly claim that. In fact I don't own any Maxwell cards to know if its vcore changes automatically upon overclocking. All I know is that currently Maxwell overclocks better and right now we don't have voltage control on Fiji yet.


----------



## provost

One inbound folks , excited to take her for a spin










Spoiler: Warning: Spoiler!



http://s1364.photobucket.com/user/provostelite/media/Provost Furyx order_zpslu9imphy.jpg.html



----------



## lagittaja

Quote:


> Originally Posted by *Forceman*
> 
> It seems pretty likely that the Fury will be a cut-down Fury X, but no idea how much. Going by previous cards, probably about 5% slower or so. Depends what they cut though.


Quote:


> Originally Posted by *eucalyptus*
> 
> If they cut it down, which they will, I guess it will be by computer numbers and not in components?
> 
> Then I guess it won't be any trouble to overclock it with a fullcover block to reach the top.


Well, some rumors have said the Fury will be 56 CU vs the Fury X's 64 CU. That's a 12.5% drop in stream processors.
Or SP/TMU/ROP core config:
4096:256:64
3584:224:64
Whereas the 290 was 40 CU and the 290X was 44 CU. That's a little over 9%.
Or SP/TMU/ROP core config:
2816:176:64
2560:160:64

Fiji at 64 CU and 56 CU would be in line with the 7970/7950 separation, and pretty close to the 5870/5850. Whereas Hawaii is pretty close to the 6970/6950.

I myself am gonna vote for a 56 CU Fury. 58-60 CU might be a tad too close, rendering the Fury X moot with the smaller chip being able to boost higher (read: 980 Ti/Titan X).
The 7950/70 diff was a decent 14%, and the 5850/70 diff was 17%. The 6950/70 difference back in the day was something like 7%? I don't think AMD wants to repeat that again.
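
Those cut percentages are easy to sanity-check with a quick Python sketch (bearing in mind the 56 CU Fury config is a rumor, not a confirmed spec):

```python
# Stream-processor cut between the full and cut-down chips quoted above.
# The Fiji 56 CU config is rumored, not confirmed.
def cut_percent(full_sp: int, cut_sp: int) -> float:
    """Percentage drop going from the full chip to the cut-down chip."""
    return (full_sp - cut_sp) / full_sp * 100

print(f"Fiji   64 CU -> 56 CU: {cut_percent(4096, 3584):.1f}% fewer SPs")  # 12.5%
print(f"Hawaii 44 CU -> 40 CU: {cut_percent(2816, 2560):.1f}% fewer SPs")  # 9.1%
```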


----------



## rv8000

Quote:


> Originally Posted by *semitope*
> 
> I was talking about maxwell raising voltage automatically, in response to the claim that maxwell doesn't change voltage to OC. But its probably a range nvidia or the partners set.










That's what I get for reading things out of context.

I really should have paid for express shipping; I'd be messing with my card as we speak if I had.


----------



## Minotaurtoo

ok.. which one... I don't know if it matters what brand or not... any preferences?

http://www.tigerdirect.com/applications/SearchTools/search.asp?keywords=fury+x


----------



## xer0h0ur

Unless you're picky about what gets included in the bundle like cables or adapters then really it doesn't matter which one you get. There is no difference from a hardware perspective.


----------



## Minotaurtoo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unless you're picky about what gets included in the bundle like cables or adapters then really it doesn't matter which one you get. There is no difference from a hardware perspective.


all I needed to know.. thanks


----------



## Minotaurtoo

I will be having to use DisplayPort-to-VGA adapters; wondering if anyone can confirm that this is a viable solution... here are my current connections: 3 VGA and one HDMI... I will have to buy new adapters... wish I could get away from VGA, but can't afford the monitors I want just yet... and the ones I have are great, but they only seem to work well on VGA for some reason.


----------



## xer0h0ur

Breh, what resolution are you even running?


----------



## Minotaurtoo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Breh, what resolution are you even running?


it's an odd Eyefinity setup... 4700x1080 total... it was a hybrid setup to fit my space... I couldn't go any bigger, literally touching the walls of my desk area. It's composed of two monitors at 12xx by 1080 and one at 1920x1080


----------



## xer0h0ur

That is certainly unique


----------



## Minotaurtoo

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is certainly unique


it was the only way I could get eyefinity on my small desk area lol... you do what you have to : )


----------



## Sgt Bilko

Well then.....
Quote:


> Your new estimated delivery date is: Friday, August 7, 2015 - Friday, August 21, 2015


Can't say I'm too pleased about that to be honest


----------



## xer0h0ur

Oh I can't blame you. You work with what you've got.


----------



## Minotaurtoo

I do have intentions of attempting to bios mod mine when I get it in... assuming I get it... looking at bilko's post I'm a little concerned... but just in case I'm luckier than him I don't want it arriving while I'm not at home.

edit.. just look at this lol.. http://www.amazon.com/ATI-Technologies-Inc-32MB-Rage/dp/B000056Q2W/ref=sr_1_4?s=pc&ie=UTF8&qid=1435443026&sr=1-4&keywords=fury+x

here is where I'm buying my card: http://www.tigerdirect.com/applications/SearchTools/search.asp?keywords=fury+x


----------



## Elmy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well then.....
> Can't say I'm too pleased about that to be honest


I have to put my patience hat on for my Fury X's.

Going to get three Asus MG279Q 2560x1440 144Hz IPS FreeSync monitors next week to give me something to play with until I get them.


----------



## Neon Lights

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I do have intentions of attempting to bios mod mine when I get it in... assuming I get it... looking at bilko's post I'm a little concerned... but just in case I'm luckier than him I don't want it arriving while I'm not at home.


BIOS for download: http://forums.guru3d.com/showthread.php?t=400302&page=3


----------



## xer0h0ur

You're sponsored by Club3D aren't you?


----------



## Ceadderman

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> That is certainly unique
> 
> 
> 
> it was the only way I could get eyefinity on my small desk area lol... you do what you have to : )

Why not just get a decent-sized LED flatscreen and be done with it? Seems like a lot of wasted real estate for 3 monitors of different sizes all running 1080p.









So which Fury X are you looking at?









The Amazon search engine is simply goofy. An ATi Fury card. And for $62. What a steal!







... j/k.









~Ceadder


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ceadderman*
> 
> Why not just get an LED flatscreen of size and be done with it? Seems like a lot of wasted real estate for 3 monitors of different sizes all running 1080p.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So which Fury X are you looking at?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazon search enfine is simply goofy. An ATi Fury card. An for $62. What a steal!
> 
> 
> 
> 
> 
> 
> 
> ... j/k.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


edit.. I read the post wrong first time lol... I only got 3 different monitors because one was free








As to which one I'm getting, well I don't know but I'm leaning toward the xfx, but I don't know...

I get the impression that right now the only difference is the stickers.


----------



## Ceadderman

As I have owned both brands, I would suggest either XFX or Sapphire. Leaning more toward Sapphire though.

Since I checked your search link I've forgotten the price of the XFX unit. So whichever is cheaper. If they're the same price go with the Sapphire brand as they're exclusively AMD and tend to build better GPUs because they're not likely to throttle the units down either intentionally or accidentally.

I first purchased XFX back with the reference 5770, and that thing could overclock beautifully. Still own it; it's sitting in my mother's system, still puttering away.

I have 2 Sapphire 6870s and the only reason I'm gonna ditch them is cause I want DX12. 6870s won't run that. So it's time for the upgrade. Have had them since their date of launch.









~Ceadder


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ceadderman*
> 
> As I have owned both brands, I would suggest either XFX or Sapphire. Leaning more toward Sapphire though.
> 
> Since I checked your search link I've forgotten the price of the XFX unit. So whichever is cheaper. If they're the same price go with the Sapphire brand as they're exclusively AMD and tend to build better GPUs because they're not likely to throttle the units down either intentionally or accidentally.
> 
> I first purchased XFX back with the 5770 Reference and that thing could over clock beautifully. Still own it and it's sitting in my Mother's system still putting away.
> 
> I have 2 Sapphire 6870s and the only reason I'm gonna ditch them is cause I want DX12. 6870s won't run that. So it's time for the upgrade. Have had them since their date of launch.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


might do that then... I've had Sapphire cards before... never had an issue with them... PowerColor, XFX and Sapphire are all the same price at tigerdirect.com... well, at least for now they are.


----------



## Ceadderman

~Ceadder


----------



## kayan

Quote:


> Originally Posted by *Minotaurtoo*
> 
> edit.. I read the post wrong first time lol... I only got 3 different monitors because one was free
> 
> 
> 
> 
> 
> 
> 
> 
> As to which one I'm getting, well I don't know but I'm leaning toward the xfx, but I don't know...
> 
> I get the impression that right now the only difference is the stickers.


I agree with the guy above: get whichever is cheaper. Sapphire tends to use higher-quality parts (such as Hynix vs. Elpida memory in the 290X). I've used both and I like both. I've had one Sapphire die on me; it was a 4000-series card. I love XFX personally, as they were the only AMD brand that offered lifetime transferable warranties. Sadly that is a thing of the past. I've never had an issue with XFX, and they are cool with slapping on aftermarket coolers, as long as you reinstall the stock cooler before warranty service.

Funny too, but my wife has an old 5830 we got near launch, and I just plugged it into a PC I built for a friend to hold him over until he has money for a GPU, and it still works.

I've also used Visiontek, and the one card I had from them was pretty good quality.

All 3 of those brands manufacture Radeons exclusively (in addition to Powercolor, which I've never personally used).


----------



## Ceadderman

I believe that XFX also makes nVidia cards. At least they did a year or so back. Sapphire is strictly AMD as far as I know.









~Ceadder


----------



## kayan

Quote:


> Originally Posted by *Ceadderman*
> 
> I believe that XFX also makes nVidia cards. At least they did a year or so back. Sapphire is strictly AMD as far as I know.
> 
> ~Ceadder


XFX hasn't made NVidia cards since around the time of AMD's 5000 series. They switched at about the same time BFG closed up shop.


----------



## Ceadderman

Shows how long it's been since I looked at buying an nVidia card.







lol

+Rep for knowledge.









~Ceadder


----------



## G227

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is Fury X voltage adjustable yet? Not saying results will be much different, but you're comparing apples to oranges like so many people are doing. Once voltage is adjustable, then compare overclocks. Why compare stock vs. overclocked now?


I'm not familiar with whether JayZ maxes volts or not in his overclocking - probably yes, but to my knowledge he doesn't use a custom BIOS. So the only advantage is the voltage slider, as ALL cards - including the Fury X - were overclocked. As I stated in my post, the overclocking will likely change once and if we get to regulate the voltage. It will go up for sure - well, not for sure, but hopefully. I was commenting that in general AMD cards ship clocked closer to their limit than team green (think 290/390X vs 980), so until I see what we get even with the higher volts, I will remain skeptical.

Now, as pointed out by someone else, the 980Ti/Titan X overclocks better at stock volts than the FX as is. People get those running at 1400 (with some margin - it's a silicon lottery after all) at stock volts, while maxing voltage gets you about 100-150 higher. For that you usually also use a custom BIOS. That is still a lot more than the FX is doing at stock volts. The only thing that could be causing the issue is the overclocking software, but I don't think that's the case.

Thus my case is: we can only hope that we will be able to clock the FX higher when the voltage slider is made available - if it ever is - and, second, that the Fury Pro might do better there, because it may be voltage-unlocked, giving it better overclocks and potentially better performance depending on how much of a cut it receives versus the Fury X.








Quote:


> Originally Posted by *semitope*
> 
> I was talking about Maxwell raising voltage automatically, in response to the claim that Maxwell doesn't change voltage to OC. But it's probably a range nVidia or the partners set.


It does change with different power states and temperatures. But on top of that, the Titan X has up to a +112mV "extra" slider allowing you to go from 1.168V to 1.237V, and with a custom BIOS you can push it to 1.274V, which is the hardware limit of the card. The 980Ti is a similar story but only has +80mV, so it might run higher volts by default - which would make sense since you can afford it - and can also go up to 1.274V. So the difference is that move from 1.168V to 1.237V or 1.274V respectively. This is what the Fury X is now lacking and should hopefully come with software updates. But we will never know if and/or how much that extra is worth until then.


----------



## Casey Ryback

Quote:


> Originally Posted by *G227*
> 
> People get those running at 1400 (with some margin - it's a silicon lottery after all) at stock volts, while maxing voltage gets you about 100-150 higher. For that you usually also use a custom BIOS. That is still a lot more than the FX is doing at stock volts.


For example, the reference cards and the EVGA ACX cards get really hot, and if people get them running at 1400MHz then that's still only about a 17% OC (on 1200MHz boost cards).

The VRMs and core temps get quite hot on those models; I'm not sure abusing them is a good idea.

Let's say they get a Gigabyte G1, MSI Gaming, etc. - these are heavily OC'd out of the box, and you might get about a 10% OC on top of the estimated boost clock.

Same goes for the 970/980; sure, the overclocking ability is better than the Fury X (locked), but if the Fury X gets to 1150 then that's a 10% OC too, on stock voltage.

The Fury X VRMs get hot too, so even if voltage were unlocked I don't know whether it would even be worth it to hit 1200MHz.

Being clocked so high as standard, these cards aren't really gaining as much performance as people think.

It's not like a while ago, when you could take a 900MHz 7950 to 1200MHz - a large 33% OC.

Edit: the reference cards are clocked even lower than the EVGA ACX, so you might gain a larger % there.
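
The headroom figures being thrown around here are easy to sanity-check: OC gain is just oc_clock / base_clock minus one. A quick sketch (the function name is mine) using the clocks mentioned in this discussion - note the exact ratios come out a touch higher than some of the ballpark percentages quoted:

```python
# Sanity-check OC headroom: percentage gain from base clock to OC clock.
def oc_gain_pct(base_mhz, oc_mhz):
    return (oc_mhz / base_mhz - 1) * 100

print(round(oc_gain_pct(1050, 1150), 1))  # Fury X 1050 -> 1150 MHz: 9.5
print(round(oc_gain_pct(1200, 1400), 1))  # 1200 boost -> 1400 MHz: 16.7
print(round(oc_gain_pct(900, 1200), 1))   # 7950 900 -> 1200 MHz: 33.3
```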


----------



## G227

Quote:


> Originally Posted by *Casey Ryback*
> 
> For example, the reference cards and the EVGA ACX cards get really hot, and if people get them running at 1400MHz then that's still only about a 17% OC (on 1200MHz boost cards).
> 
> The VRMs and core temps get quite hot on those models; I'm not sure abusing them is a good idea.
> 
> Let's say they get a Gigabyte G1, MSI Gaming, etc. - these are heavily OC'd out of the box, and you might get about a 10% OC on top of the estimated boost clock.
> 
> Same goes for the 970/980; sure, the overclocking ability is better than the Fury X (locked), but if the Fury X gets to 1150 then that's a 10% OC too, on stock voltage.
> 
> The Fury X VRMs get hot too, so even if voltage were unlocked I don't know whether it would even be worth it to hit 1200MHz.
> 
> Being clocked so high as standard, these cards aren't really gaining as much performance as people think.
> 
> It's not like a while ago, when you could take a 900MHz 7950 to 1200MHz - a large 33% OC.
> 
> Edit: the reference cards are clocked even lower than the EVGA ACX, so you might gain a larger % there.


*Abusing VRMs is not a good idea - I wholeheartedly agree - you need proper cooling beyond what's provided at stock if you wanna be safe*. These suckers and the VRAM get so hot on NVIDIA cards - to the point that core temp is the least of your issues. And you have no idea how hot they actually run unless you open the card up and get a thermal gun or put in a sensor. I suspect a good number of people will blow up their cards six months from now, and it's gonna be those things - VRM and VRAM. I have a Titan X, and if you leave the backplate on as is, it gets burning hot. Now I'm running a backplate with extra Fujipoly pads and 4 large and 24 small copper heatsinks, and they all get really hot too. It's insane - granted, I run 1.261V and a heavy overclock on the memory, but I see people running these volts and memory clocks with the stock backplate or even without one, and that to me is suicide - those temps must go well beyond 90C on both of those components. I really see this as a major flaw in NVIDIA's design, and I think the cooling on the Fury X is absolutely superb and miles ahead of it.

Where I will disagree with you is the performance boost you get from overclocking. *Having your card overclocked makes a sizable difference even for the 980Ti/TX*. (It's really the same for a pre-overclocked G1 or a normal stock card - what counts in the end is the final clocks compared to the other cards.) When the Fury X benchmarks came out, I replicated them on my Titan X so I could see how they stack up. Picture below:





You can see that while the OC Fury X gains 8% from its overclock, the Titan X gets over 21% at both 1440p & 4K, and that is excluding the 5% that it's faster by default. This is also due to the fact that in addition to the core, you can overclock the memory, which gives additional performance. I also have a pretty bad overclocker, really, and don't run the highest-voltage BIOS because I don't have a full custom loop yet.

We can argue about whether that is worth the higher volts - that's subjective - but it's there for the taking, whereas it's not there yet for the Fury X. *I wanna see how far it can go when it gets the extra 112mV, or however much it ends up being.*
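
To put a number on that comparison: the stock lead and the two OC gains compound. Using the 5%, 21% and 8% figures above (which are this post's estimates, not lab data):

```python
# Compound the figures above: ~5% stock lead, ~21% Titan X OC gain,
# ~8% Fury X OC gain. The resulting lead once both cards are overclocked:
stock_lead = 1.05
titan_oc_gain = 1.21
fury_oc_gain = 1.08

oc_lead_pct = (stock_lead * titan_oc_gain / fury_oc_gain - 1) * 100
print(round(oc_lead_pct, 1))  # ~17.6 (% lead once both are OC'd)
```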


----------



## blue1512

People seem to forget that the VRM chips on AMD cards are much better than the ones on nVidia's.


----------



## CrazyElf

Quote:


> Originally Posted by *G227*
> 
> *Abusing VRMs is not a good idea - I wholeheartedly agree - you need proper cooling beyond what's provided at stock if you wanna be safe*. These suckers and the VRAM get so hot on NVIDIA cards - to the point that core temp is the least of your issues. And you have no idea how hot they actually run unless you open the card up and get a thermal gun or put in a sensor. I suspect a good number of people will blow up their cards six months from now, and it's gonna be those things - VRM and VRAM. I have a Titan X, and if you leave the backplate on as is, it gets burning hot. Now I'm running a backplate with extra Fujipoly pads and 4 large and 24 small copper heatsinks, and they all get really hot too. It's insane - granted, I run 1.261V and a heavy overclock on the memory, but I see people running these volts and memory clocks with the stock backplate or even without one, and that to me is suicide - those temps must go well beyond 90C on both of those components. I really see this as a major flaw in NVIDIA's design, and I think the cooling on the Fury X is absolutely superb and miles ahead of it.
> 
> Where I will disagree with you is the performance boost you get from overclocking. *Having your card overclocked makes a sizable difference even for the 980Ti/TX*. (It's really the same for a pre-overclocked G1 or a normal stock card - what counts in the end is the final clocks compared to the other cards.) When the Fury X benchmarks came out, I replicated them on my Titan X so I could see how they stack up. Picture below:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> You can see that while the OC Fury X gains 8% from its overclock, the Titan X gets over 21% at both 1440p & 4K, and that is excluding the 5% that it's faster by default. This is also due to the fact that in addition to the core, you can overclock the memory, which gives additional performance. I also have a pretty bad overclocker, really, and don't run the highest-voltage BIOS because I don't have a full custom loop yet.
> 
> We can argue about whether that is worth the higher volts - that's subjective - but it's there for the taking, whereas it's not there yet for the Fury X. *I wanna see how far it can go when it gets the extra 112mV, or however much it ends up being.*


This. I think there is a risk that, over time, higher VRM temperatures can be a problem. For heavy overclocks I would recommend getting a full waterblock that covers the VRMs, or putting a heatsink on the rear (take off the backplate) that covers the VRM area.

I'm worried that more than the voltage unlock, it's the VRM temperatures that will restrict OCing.

I think a custom 980Ti might be better in this regard - tons of VRM overkill and, ideally, no VRAM on the rear of the PCB. With the extra OC headroom, it will probably overtake a Titan X, save when 12GB is needed.

Something like this:


Spoiler: Warning: Spoiler!







Tons of overkill for the VRM (so it will stay cool even with the volts) and a cooler that covers all of the VRAM.

Quote:


> Originally Posted by *blue1512*
> 
> People seem to forget that the VRM chips on AMD cards are much better than the ones on nVidia's.


You are correct that these PCBs have 6x IR6811 / IR6894, but even then, for the longevity of the cards, it's best not to strain them.

Quote:


> Originally Posted by *Ceadderman*
> 
> As I have owned both brands, I would suggest either XFX or Sapphire. Leaning more toward Sapphire though.
> 
> Since I checked your search link I've forgotten the price of the XFX unit. So whichever is cheaper. If they're the same price go with the Sapphire brand as they're exclusively AMD and tend to build better GPUs because they're not likely to throttle the units down either intentionally or accidentally.
> 
> I first purchased XFX back with the 5770 Reference and that thing could overclock beautifully. Still own it and it's sitting in my Mother's system, still puttering away.
> 
> I have 2 Sapphire 6870s and the only reason I'm gonna ditch them is cause I want DX12. 6870s won't run that. So it's time for the upgrade. Have had them since their date of launch.
> 
> ~Ceadder


I'd recommend against Sapphire. They won't let you unscrew your GPU to do something like reseat TIM without voiding the warranty.

http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty/0_100

I'd choose a vendor that does allow modification. MSI does have warranty stickers, but in practice, if you RMA, they will accept it anyway.

I generally go with MSI or Gigabyte for that reason. Some people say Visiontek is good for lifetime warranty, but I don't know how well they honor their warranties.

Quote:


> Originally Posted by *ozyo*
> 
> so are we going to get x lightning ?


Regrettably, we might never get it because AMD has not permitted their AIB partners to make custom PCBs.


----------



## cbarros82

I'm waiting on the Fury and the Nano before I make my upgrade decision.


----------



## Ceadderman

The big-name companies will generally accept an RMA so long as you reinstall the stock cooler, whether or not you put TIM under there. This *includes* Sapphire. So I'm not sure how you came by the misinformation, CrazyElf, but it's simply wrong.

Now what you *may* have heard/read/experienced was a removal of the stock cooler where someone decided to chuck it in favor of a block, or worse still broke something and tried to slip it past the RMA process.

I just recently (some time over the last few months) read a Q&A in Computer Power User magazine with one of Sapphire's bigwigs, and he stated that they fully expect that power users will remove their stock coolers. Doubtful *that guy* is going to put something like that in print without a lawyer whispering anti-litigation sweet nothings in his ear. This is, after all, a lawsuit-happy world we live in.









~Ceadder


----------



## bonami2

Quote:


> Originally Posted by *Ceadderman*
> 
> The big-name companies will generally accept an RMA so long as you reinstall the stock cooler, whether or not you put TIM under there. This *includes* Sapphire. So I'm not sure how you came by the misinformation, CrazyElf, but it's simply wrong.
> 
> Now what you *may* have heard/read/experienced was a removal of the stock cooler where someone decided to chuck it in favor of a block, or worse still broke something and tried to slip it past the RMA process.
> 
> I just recently (some time over the last few months) read a Q&A in Computer Power User magazine with one of Sapphire's bigwigs, and he stated that they fully expect that power users will remove their stock coolers. Doubtful *that guy* is going to put something like that in print without a lawyer whispering anti-litigation sweet nothings in his ear. This is, after all, a lawsuit-happy world we live in.
> 
> ~Ceadder


An official Sapphire rep said himself that removing it voids the warranty, and with that knowledge I'm gonna stay away from that brand. There's a reason they're the cheapest one: they cut costs.

Found out why my PC reboots: the GPU can't handle [email protected] even with temps checked under an infrared thermometer. This thing is the most stupid GPU I have ever seen.


----------



## Ukkooh

Quote:


> Originally Posted by *Ceadderman*
> 
> The big-name companies will generally accept an RMA so long as you reinstall the stock cooler, whether or not you put TIM under there. This *includes* Sapphire. So I'm not sure how you came by the misinformation, CrazyElf, but it's simply wrong.
> 
> Now what you *may* have heard/read/experienced was a removal of the stock cooler where someone decided to chuck it in favor of a block, or worse still broke something and tried to slip it past the RMA process.
> 
> I just recently (some time over the last few months) read a Q&A in Computer Power User magazine with one of Sapphire's bigwigs, and he stated that they fully expect that power users will remove their stock coolers. Doubtful *that guy* is going to put something like that in print without a lawyer whispering anti-litigation sweet nothings in his ear. This is, after all, a lawsuit-happy world we live in.
> 
> ~Ceadder


It seems to be region specific. In Europe only a select few brands allow you to change the cooler or replace the TIM.

Edit: Has anyone managed to unlock the voltage on the Furys yet? I might buy one if they OC to around 1200-1300 under water.


----------



## Ceadderman

See, now *that* makes sense.









~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Minotaurtoo*
> 
> edit.. I read the post wrong first time lol... I only got 3 different monitors because one was free
> 
> As to which one I'm getting, well I don't know, but I'm leaning toward the XFX...
> 
> I get the impression that right now the only difference is the stickers.
> 
> 
> 
> I agree with the guy below: get whichever is cheaper. Sapphire tends to use higher-quality parts (such as Hynix vs. Elpida memory in the 290X). I've used both brands and I like both. I've had one Sapphire die on me; it was a 4000-series card. I love XFX personally, as they were the only AMD brand that offered lifetime transferable warranties. Sadly, that is a thing of the past. I've never had an issue with XFX, and they are cool with slapping on aftermarket coolers, as long as you reinstall the stock cooler before warranty service.
> 
> Funny too, but my wife has an old 5830 that she got near launch, and I just plugged it into a PC I built for a friend to hold him over until he has money for a GPU, and it still works.
> 
> I've also used Visiontek, and the one card I had from them was pretty good quality.
> 
> All 3 of those brands manufacture Radeons exclusively (in addition to Powercolor, which I've never personally used).
Click to expand...

I've only used XFX this past gen and I'm very happy with them; half the reason I went with them was just to see if all the rumours were true, and nope..... XFX lifted their game and have been better off since.









BTW, my current 290X uses Hynix memory and can't do more than 1375MHz; my Elpida 290s did 1550MHz.....

I've had 2 out of 3 Sapphire cards die on me now, an HD 6970 and a 290X; yet to have an XFX one go bad.

PowerColor is good AFAIK, VisionTek is another good brand, I haven't used HIS for quite some time, and Diamond is good.


----------



## en9dmp

Bring it on....



Gonna take a while to connect up though, as I need to un-plumb my 290X and faff about removing the existing rad setup... I have a ridiculously small custom case, so no other option.


----------



## ZARuslan

Hi en9dmp! Sorry for my bad English.
Nice shopping! Could you please post later how much power your system consumes? That Corsair PSU should show power consumption with 1-sec sampling.
Thanks!


----------



## Alastair

Any news on unlocked voltages yet?


----------



## Slink3Slyde

Subbing to see what happens when someone puts one under a custom loop once Afterburner/whatever gets updated with the voltage unlock.

I'm waiting for the Fury non-X in a month myself, which just happens to be released on my birthday. I'm taking it as a message from the powers above that it's meant for me; not even my wife can argue with that logic.


----------



## tx12

MSI Afterburner supports a 3rd-party database, so any Fury owner should be able to scan their card and create an AB config.
A detailed how-to is on the AB forum:
http://forums.guru3d.com/showthread.php?t=399542


----------



## HiTechPixel

Since it has come to light that the Fury X has damn near perfect Crossfire scaling, as evidenced by 3DMark scores, I'm now obviously a lot more interested in the card, despite the 4GB VRAM limit. So I'd like to ask any and all Fury X owners: could you benchmark your cards (Crossfire) at 4K with SSAA, at 1440p with AMD VSR, or with an actual 5K screen (Dell UP2715K)?


----------



## sugarhell

Can anyone press the "i" button in MSI AB and send me a pic? Maybe I can unlock the voltages.


----------



## sugarhell

So from further research, the Fury uses an IR3567B controller, the same as the 290X.
http://www.techpowerup.com/reviews/AMD/R9_Fury_X/5.html
http://www.techpowerup.com/reviews/AMD/R9_290X/5.html

So:

The Afterburner command to give more voltage on the 290X is this:

/wi6,30,8d,10 for +100mV
30 is the i2c bus
8d is the i2c device

You probably also need to add something to the AB config file so it detects the controller. Brb, checking more. WIP
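
For what it's worth, that 290X command is consistent with the offset register moving in 6.25mV steps (0x10 = 16 steps = 100mV). A small sketch of that arithmetic - the step size and the command layout are assumptions inferred from the post above, not documented IR3567B facts:

```python
# Assumed: the voltage-offset register moves in 6.25 mV steps, so
# +100 mV -> 16 steps -> 0x10, matching the /wi6,30,8d,10 command above.
# The bus (30) and device (8d) values are taken verbatim from the post.
STEP_MV = 6.25

def offset_command(mv, prefix="/wi6,30,8d"):
    """Build the Afterburner i2c-write command for a +mv core offset."""
    steps = round(mv / STEP_MV)
    return f"{prefix},{steps:x}"  # register value in hex, as AB expects

print(offset_command(100))  # -> /wi6,30,8d,10
```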


----------



## fewness

Quote:


> Originally Posted by *sugarhell*
> 
> Can anyone press the "i" button in MSI AB and send me a pic? Maybe I can unlock the voltages.


These?









CPU
________________________________________________________________________________
Model : Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
Logical cores : 16
CPUID : GenuineIntel family 6, model 3F, stepping 2
Tjmax : 105°C

GPU1
________________________________________________________________________________
Display device : AMD Radeon (TM) R9 Series
BIOS : 015.048.000.063
GUID : VEN_1002&DEV_7300&SUBSYS_0B361002&REV_C8&BUS_1&DEV_0&FN_0
Registry key : \Registry\Machine\System\CurrentControlSet\Control\Video\{77B07053-8E48-444F-B546-0CD9BB0965F0}\0000

On-Screen Display server
________________________________________________________________________________
Path : C:\Program Files (x86)\RivaTuner Statistics Server
Version : v6.3.0
Active client(s) : framerate monitoring module
: On-Screen Display module
Active 3D process : not detected


----------



## sugarhell

Quote:


> Originally Posted by *fewness*
> 
> These?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU
> ________________________________________________________________________________
> Model : Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz
> Logical cores : 16
> CPUID : GenuineIntel family 6, model 3F, stepping 2
> Tjmax : 105°C
> 
> GPU1
> ________________________________________________________________________________
> Display device : AMD Radeon (TM) R9 Series
> BIOS : 015.048.000.063
> GUID : VEN_1002&DEV_7300&SUBSYS_0B361002&REV_C8&BUS_1&DEV_0&FN_0
> Registry key : \Registry\Machine\System\CurrentControlSet\Control\Video\{77B07053-8E48-444F-B546-0CD9BB0965F0}\0000
> 
> On-Screen Display server
> ________________________________________________________________________________
> Path : C:\Program Files (x86)\RivaTuner Statistics Server
> Version : v6.3.0
> Active client(s) : framerate monitoring module
> : On-Screen Display module
> Active 3D process : not detected


Thanks!

Can you give me an i2c dump too?

Right-click on the MSI AB shortcut and add this to the target: /i2cd, then post the result here.

Like this: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /i2cd

Then start MSI AB again and it will create a file.


----------



## CrazyElf

Quote:


> Originally Posted by *Ceadderman*
> 
> The big-name companies will generally accept an RMA so long as you reinstall the stock cooler, whether or not you put TIM under there. This *includes* Sapphire. So I'm not sure how you came by the misinformation, CrazyElf, but it's simply wrong.
> 
> Now what you *may* have heard/read/experienced was a removal of the stock cooler where someone decided to chuck it in favor of a block, or worse still broke something and tried to slip it past the RMA process.
> 
> I just recently (some time over the last few months) read a Q&A in Computer Power User magazine with one of Sapphire's bigwigs, and he stated that they fully expect that power users will remove their stock coolers. Doubtful *that guy* is going to put something like that in print without a lawyer whispering anti-litigation sweet nothings in his ear. This is, after all, a lawsuit-happy world we live in.
> 
> ~Ceadder


It's not misinformation. Look at the link I posted.

http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty/0_100

A hardware representative from Sapphire says they have the right to void the warranty of anyone who removes the stock cooler!

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys, I am seeing this asked a lot on various threads and thought it was time to make one official post to answer this. The question is, if a user buys a Sapphire card and then replaces the cooler that came with the card is the warranty still in effect? The simple answer to this is no, any modification of the card voids the warranty.


Until there's an official source from Sapphire saying otherwise, I'm going to continue to assume that they will deny people the warranty.

Anecdotally, by the way, I have heard of people who have been denied warranty service (at least on their 7970s and 7950s).

Quote:


> Originally Posted by *Ukkooh*
> 
> It seems to be region specific. In Europe only a select few brands allow you to change the cooler or replace the TIM.
> 
> Edit: Has someone managed to unlock the voltage on the furys yet? I might buy one if they OC to around 1200-1300 under water.


If that's the case, I'd recommend only buying the brands that do allow for removal of the cooler. I'd also recommend discouraging everyone around you from buying the brands that won't honor their warranty. They clearly do not want your business.


----------



## Blameless

Most AIBs will refuse an RMA if they detect obvious tampering, which can certainly include removal of the stock cooler if it's reassembled less than perfectly.

Of course, I could probably break all the components off the PCB, solder them back on backwards, replace the HBM stacks with Chiclets and the thermal paste with mustard, and still sneak it past some RMA departments that would refuse a card for a completely cosmetic scratch on the backplate... but that's another matter.


----------



## fewness

Quote:


> Originally Posted by *sugarhell*
> 
> Thanks!
> 
> Can you give me an i2c dump too?
> 
> Right-click on the MSI AB shortcut and add this to the target: /i2cd, then post the result here.
> 
> Like this: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /i2cd
> 
> Then start MSI AB again and it will create a file.












My computer went black after this... now the 1st BIOS on the Fury X won't boot, and there are no more LED lights on the two 8-pin connectors. The "Radeon" light is still on...

Thank god the card has 2 BIOSes... the 2nd one boots fine...









How do I fix this?









..........

I saw 4 files: list, get, info and create.

list is empty

get:
AdapterIndex 0
DisplayIndex 4
Width 2560
Height 1440
ColorDepth 32
RefreshRate 60.00
XPos 0
YPos 0
Orientation 0
ModeFlag 0
ModeMask 0x00FF
ModeValue 0x0046

info:
AdapterIndex 0
DisplayIndex 4
TimingStandard 49
PossibleStandard 10
RefreshRate 0
Width 0
Height 0
TimingFlags 4104
HTotal 2720
HDisplay 2560
HSyncStart 2608
HSyncWidth 32
VTotal 1481
VDisplay 1440
VSyncStart 1443
VSyncWidth 5
PixelClock 24150
HOverscanRight 0
HOverscanLeft 0
VOverscanBottom 0
VOverscanTop 0

create:
AdapterIndex 0
DisplayIndex 4
TimingStandard 8
PossibleStandard 10
RefreshRate 30
Width 3840
Height 2160
TimingFlags 4104
HTotal 2720
HDisplay 2560
HSyncStart 2608
HSyncWidth 32
VTotal 1481
VDisplay 1440
VSyncStart 1443
VSyncWidth 5
PixelClock 24150
HOverscanRight 0
HOverscanLeft 0
VOverscanBottom 0
VOverscanTop 0
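
As a sanity check on those dumps: refresh rate is pixel clock divided by total pixels per frame. Assuming PixelClock is reported in 10kHz units (an assumption on my part, consistent with AMD's ADL display API), the "info" timings reproduce the ~60Hz shown in the "get" file:

```python
# Refresh rate = pixel clock / (HTotal * VTotal), using the "info" dump
# above and assuming PixelClock (24150) is in units of 10 kHz.
h_total, v_total = 2720, 1481
pixel_clock_hz = 24150 * 10_000   # 241.5 MHz under the 10 kHz assumption

refresh_hz = pixel_clock_hz / (h_total * v_total)
print(round(refresh_hz, 2))  # ~59.95, matching the 60.00 in "get"
```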


----------



## sugarhell

What? Really strange...

OK, another way: go to Program Files -> MSI Afterburner -> Profiles and send me both files here if you can.


----------



## Nizzen

Quote:


> Originally Posted by *sugarhell*
> 
> What? Really strange...
> 
> OK, another way: go to Program Files -> MSI Afterburner -> Profiles and send me both files here if you can.


Hoping the right answers for getting OC on the Fury X end up in the first post of this thread.









Great job trying to get it to work!

I get my Fury X next week


----------



## fewness

Quote:


> Originally Posted by *sugarhell*
> 
> What? Really strange...
> 
> OK, another way: go to Program Files -> MSI Afterburner -> Profiles and send me both files here if you can.


MSIAfterburner


Spoiler: Warning: Spoiler!



[Settings]
Views=
LastUpdateCheck=558E2C54h
UpdateCheckingPeriod=3
LowLevelInterface=1
MMIOUserMode=1
HAL=1
Driver=1
Skin=MSICyborgRed.usf
StartWithWindows=0
StartMinimized=0
HwPollPeriod=1000
LockProfiles=0
ShowHints=1
ShowTooltips=1
LCDFont=font4x6.dat
RememberSettings=0
FirstRun=0
FirstUserDefineClick=1
FirstServerRun=0
CurrentGpu=0
Sync=1
Link=1
LinkThermal=1
ShowOSDTime=0
CaptureOSD=1
Profile1Hotkey=00000000h
Profile2Hotkey=00000000h
Profile3Hotkey=00000000h
Profile4Hotkey=00000000h
Profile5Hotkey=00000000h
OSDToggleHotkey=00000000h
OSDOnHotkey=00000000h
OSDOffHotkey=00000000h
OSDServerBlockHotkey=00000000h
ScreenCaptureHotkey=00000000h
VideoCaptureHotkey=00000000h
VideoPrerecordHotkey=00000000h
PTTHotkey=00000000h
PTT2Hotkey=00000000h
ScreenCaptureFormat=bmp
ScreenCaptureFolder=
ScreenCaptureQuality=100
VideoCaptureFolder=
VideoCaptureFormat=MJPG
VideoCaptureQuality=85
VideoCaptureFramerate=30
VideoCaptureFramesize=00000002h
VideoCaptureThreads=FFFFFFFFh
AudioCaptureFlags=00000003h
VideoCaptureFlagsEx=00000000h
AudioCaptureFlags2=00000000h
VideoCaptureContainer=avi
VideoPrerecordSizeLimit=256
VideoPrerecordTimeLimit=600
AutoPrerecord=0
WindowX=171
WindowY=171
Profile2D=-1
Profile3D=-1
SwAutoFanControl=0
SwAutoFanControlFlags=00000000h
SwAutoFanControlPeriod=5000
SwAutoFanControlCurve=0000010004000000000000000000F0410000204200004842000048420000A0420000A0420000B4420000C8420000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
RestoreAfterSuspendedMode=1
PauseMonitoring=0
ShowPerformanceProfilerStatus=0
AttachMonitoringWindow=1
MonitoringWindowOnTop=1
LogPath=C:\Program Files (x86)\MSI Afterburner\HardwareMonitoring.hml
EnableLog=0
RecreateLog=0
LogLimit=10
UnlockVoltageControl=1
UnlockVoltageMonitoring=1
OEM=-1
ForceConstantVoltage=0
SingleTrayIconMode=0
Fahrenheit=0
Time24=1
LCDGraph=0
Sources=+GPU temperature,+GPU usage,+Fan speed,+Fan tachometer,+Core clock,+Memory clock,+Memory usage,+Framerate,+Frametime,+CPU1 temperature,+CPU2 temperature,+CPU3 temperature,+CPU4 temperature,+CPU5 temperature,+CPU6 temperature,+CPU7 temperature,+CPU8 temperature,+CPU9 temperature,+CPU10 temperature,+CPU11 temperature,+CPU12 temperature,+CPU13 temperature,+CPU14 temperature,+CPU15 temperature,+CPU16 temperature,+CPU temperature,+CPU1 usage,+CPU2 usage,+CPU3 usage,+CPU4 usage,+CPU5 usage,+CPU6 usage,+CPU7 usage,+CPU8 usage,+CPU9 usage,+CPU10 usage,+CPU11 usage,+CPU12 usage,+CPU13 usage,+CPU14 usage,+CPU15 usage,+CPU16 usage,+CPU usage,+RAM usage,+Pagefile usage
MonitoringWindowX=1002
MonitoringWindowY=476
MonitoringWindowW=1113
MonitoringWindowH=1402
[ATIADLHAL]
UnofficialOverclockingMode=0
UnofficialOverclockingDrvReset=1
UnifiedActivityMonitoring=0
[Source GPU temperature]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source GPU usage]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Fan speed]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Fan tachometer]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=10000
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Core clock]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=1500
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Memory clock]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=5000
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Memory usage]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=3072
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Framerate]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=200
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Frametime]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100.000
MinLimit=0.000
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU1 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU2 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU3 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU4 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU5 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU6 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU7 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU8 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU9 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU10 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU11 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU12 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU13 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU14 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU15 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU16 temperature]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU temperature]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU1 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU2 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU3 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU4 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU5 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU6 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU7 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU8 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU9 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU10 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU11 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU12 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU13 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU14 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU15 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU16 usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source CPU usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=100
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source RAM usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=8192
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=
[Source Pagefile usage]
ShowInOSD=0
ShowInLCD=0
ShowInTray=0
EnableDataFiltering=0
MaxLimit=8192
MinLimit=0
Group=
Name=
TrayTextColor=FF0000h
TrayIconType=0
GraphColor=00FF00h
Formula=



Is this the file you're interested in? It has almost nothing... maybe because my card blacked out after I ran the command line.








VEN_1002&DEV_7300&SUBSYS_0B361002&REV_C8&BUS_1&DEV_0&FN_0


Spoiler: Warning: Spoiler!



[Startup]
Format=2
PowerLimit=
CoreClk=
MemClk=
FanMode=
FanSpeed=


----------



## sugarhell

Hmm, MSI AB can't detect the I2C bus of the Fury. Strange...

Thank you for your contribution, I'll see what I can do.


----------



## blue1512

It seems that the HBM on the Fury X is only locked by the driver. It DID get overclocked to 600 MHz.


----------



## pdasterly

Sapphire says the warranty will be void, but I had no issue with them; I RMA'd 290Xs three times, all with waterblocks on them. You have to reinstall the original cooler. Athlon Micro had a quick turnaround time too. Me personally, I clean the GPU die and reapply thermal paste, then get some blue Loctite for the screws - makes it look all original.


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> It seems that the HBM on the Fury X is only locked by the driver. It DID get overclocked to 600 MHz.


Did it have any performance impact?


----------



## p4inkill3r

Quote:


> Originally Posted by *Forceman*
> 
> Did it have any performance impact?


It definitely does: http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/fs/P/1038/500000?minScore=0&gpuName=AMD Radeon R9 Fury X

The core clock @ 1145 MHz:
14,345 - Fire Strike (1.1) by SovereignZuul, June 26 2015
Intel Core i7-5930K (3,500 MHz)
AMD Radeon R9 Fury X (1,145 MHz)


----------



## Forceman

Quote:


> Originally Posted by *p4inkill3r*
> 
> It definitely does: http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/fs/P/1038/500000?minScore=0&gpuName=AMD Radeon R9 Fury X
> 
> The core clock @ 1145 MHz:
> 14,345 - Fire Strike (1.1) by SovereignZuul, June 26 2015
> Intel Core i7-5930K (3,500 MHz)
> AMD Radeon R9 Fury X (1,145 MHz)


Wait, but what about all those people saying we didn't need memory overclocking because HBM was already so fast it didn't need to be overclocked?

That being said, I find it difficult to believe that the memory overclock alone accounted for that kind of increase. That score wasn't validated, and doesn't show on Futuremark's site. Tessellation may have been off for all we know.


----------



## blue1512

Source:
http://uk.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overclocking-results

I believe tessellation was not touched, since they were doing a review. Believe it or not, 600 MHz is a 20% OC on the memory, and it DOES have an impact on the result. To be honest, the shader-to-bandwidth ratio of Fury is not much higher than the 290X's, so I'm not surprised by the impact of the memory OC here.
So far the Fury X has been showing a lot of OC potential
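For what it's worth, the 20% figure checks out. A minimal sketch of the arithmetic, assuming the Fury X's stock 500 MHz HBM clock on a 4096-bit bus with two transfers per clock:

```python
# Sanity-check the HBM OC numbers quoted above.
def hbm_bandwidth_gbs(clock_mhz, bus_bits=4096, transfers_per_clock=2):
    """Theoretical memory bandwidth in GB/s."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

stock = hbm_bandwidth_gbs(500)  # stock Fury X HBM
oc = hbm_bandwidth_gbs(600)     # the overclock from the review

print(f"OC is {(600 - 500) / 500:.0%}")   # OC is 20%
print(f"{stock:.1f} -> {oc:.1f} GB/s")    # 512.0 -> 614.4 GB/s
```

So the jump is from 512 GB/s to roughly 614 GB/s of theoretical bandwidth, which is why a memory OC can move benchmark scores at all.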


----------



## G227

Quote:


> Originally Posted by *CrazyElf*
> 
> This. I think that there is a risk that over time, higher VRM temperatures can be a problem. I would recommend for heavy overclocks getting a full waterblock that covers the VRMs or putting a heatsink on the rear (take off the backplate) that covers the VRM areas.
> 
> I'm worried that more than the voltage unlock, it's the VRM temperatures that will restrict OCing.
> 
> I think that a custom 980Ti might be better in this regard - tons of VRM overkill and ideally, no VRAM on the rear of the PCB. With the extra OC headroom, it will probably overtake a Titan X, save when 12GB is needed.


Agreed. I would add, though, that I like what AMD has done with VRM cooling here - the actively cooled copper heatsink is a great idea, and I would wager it does a much better job than the 980 Ti/Titan X without a full water block. On my Titan X I use the hybrid AIO and just installed a "Manhattan" of copper heatsinks (image) on the backplate, and they all get incredibly hot - especially those above the VRMs. Frankly, I think the stock small heatsink that cools the VRAM/VRMs is just plain insufficient, and the lack of monitoring is something NVIDIA needs to work on.

We will probably have to see a 980 Ti KINGPIN or the new HOF LN2 edition to really take on the Titan X. The G1 Gaming clocked at 1529 MHz still fell short of a Titan X clocked at 1400. So with people getting up to 1500-1550 on their Titans, you will need ~1700 to push past it, and that will be a tough cookie even for 1.4 V cards.


----------



## Ceadderman

Quote:


> Originally Posted by *pdasterly*
> 
> Sapphire says warranty will be void but i had no issue with them, rma'd 290x three times, all had waterblocks on them. You have to reinstall original cooler. Athlon micro had quick turn around time too. Me personally I clean the gpu die and reapply thermal paste then get some blue loc-tite for the screws, makes it look all original.


Same thing I was saying. They really don't care so long as they get the whole unit back. Even if they took the time to separate the cooler from the card, how are they going to know anything has been done to it? They'd be breaking their own warranty terms, and it's not like the TIM is going to give them that knowledge. All manufacturers have that rider in their warranty structure. Now whether they invoke the rider or not is another story entirely. I suspect it's there purely to address multiple warranty submissions from people who consistently abuse the RMA system - such as warrantying purely to replace a card in order to resell it, so the end user can sell it as an unopened unit to recoup funds for whatever purpose.

I wonder how many times they've had perfectly good cards returned as defective.









~Ceadder


----------



## pdasterly

I had reference cards; they replaced one with a Tri-X, which performed well, and tried to replace the other with a Vapor-X, but I had to decline because the waterblock was different. It all worked out well.

Wouldn't hesitate to buy Sapphire again


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> Source:
> http://uk.hardware.info/reviews/6158/19/amd-radeon-r9-fury-x-review-amds-new-flag-ship-graphics-card-overclocking-results
> 
> I believe the tessellation was not touch coz they are doing review. Believe it or not 600MHz is 20%OC on Memory, and it DOES have impact on result. To be honest shader/bandwidth rate of Fury is not much more than 290x so I'm not surpise with the impact of Mem OC here.
> So far FuryX has been showing a lot of OC potential


So they have a version of CCC that allows memory overclocking? Because that's what the review says they are using.


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> So they have a version of CCC that allows memory overclocking? Because that's what the review says they are using.


I have 15.15 on my 290s, and CCC has VRAM OC'ing.


----------



## xer0h0ur

I remember them saying it was a bug allowing them to modify the mem clock slider on every other restart.


----------



## Forceman

Quote:


> Originally Posted by *rdr09*
> 
> i have 15.15 on my 290s and CCC has vram oc'ing.


We're talking about Fury - of course you can overclock the memory on a 290, but CCC wasn't supposed to allow Fury memory overclocks.


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> We are talking about Fury, of course you can overclock the memory on a 290, but CCC was supposed to not allow Fury memory overclocks.


My bad. I assumed their driver would have the same features. It would be nice to be able to OC the HBM, at least a little. But core is king, so that has to OC.


----------



## blue1512

Has anyone here tried GPU Tweak with the Asus Fury X? I still remember it was the first OC software to deal with the R9 290X at release, and I wouldn't be surprised if it's the first to unlock the Fury X.


----------



## xer0h0ur

Can anyone with FuryX and SoM confirm if the game is stuttering @ 4K?


----------



## Gdourado

Hello,
I saw JayzTwoCents' review of the Fury X, and he says the pump is quite noisy and there is also significant coil whine...
With that in mind...
Which would be quieter?
The pump and single 120 mm fan of the Fury X?
Or the triple-fan Windforce cooler on the Gigabyte 980 Ti G1?

Cheers!


----------



## en9dmp

Just got them up and running in Crossfire, but not running GTA V smoothly in 4K yet... Also, where the hell is this new target frame rate control? There's nothing in my Catalyst Control Centre for it...


----------



## xer0h0ur

You're using the 15.15?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> I Saw Jays2cents review on the fury x and he says the pump is quite noisy and there is also significant coil whine...
> With that in mind...
> Which would be quieter?
> The pump and single 120mm fan of the fury x?
> Or the triple fan Windforce cooler on the gigabyte 980TI G1?
> 
> Cheers!


The air cooler would be quieter at idle, the Fury at load


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Just got it up and running in crossfire but not running GTA V smooth in 4k yet... Also, where the hell is this new target frame rate control? There's nothing in my catalyst control centre for it...


What Catalyst version?


----------



## lagittaja

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> I Saw Jays2cents review on the fury x and he says the pump is quite noisy and there is also significant coil whine...
> With that in mind...
> Which would be quieter?
> The pump and single 120mm fan of the fury x?
> Or the triple fan Windforce cooler on the gigabyte 980TI G1?
> 
> Cheers!


Regarding sound pressure levels?
Like Ha-Nocri said
G1 Gaming would be quieter @ idle.
Fury X would be quieter @ load.

You can check that in TPU's review. The 980 Ti G1 Gaming doesn't spin its fans at idle, hence 0 dBA, while load is 40 dBA.
The Fury X pump and fan spin constantly, hence 31 dBA idle / 32 dBA load. This is measured 1 m away from the card.
(E: And the "high" idle SPL is because the pump spins at a constant speed no matter the load. I can't see why AMD couldn't have implemented a lower pump speed when the core is below a certain temperature.)

But it's not the SPL, it's the character of the sound!

The high-pitched whine of the pump sounds irritating in the videos I've seen. I wouldn't want that in my system.
I'd take a reference 980 Ti/Titan X any day over the whine of that pump.
The swoosh of the blower is a heck of a lot more pleasant to listen to than a high-pitched whine, even though its SPL is higher than the pump's.

AMD said the issue was fixed for retail, yet more and more videos showcasing the pump noise have been popping up - clearly retail cards. And as far as I know, Jayz's card is retail; it came from VisionTek and was packaged for retail.
Now how big of an issue is it? Is it just the first batch? Or is it a "feature" some cards will have no matter the batch? I guess time will tell.


----------



## Tivan

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> I Saw Jays2cents review on the fury x and he says the pump is quite noisy and there is also significant coil whine...
> With that in mind...
> Which would be quieter?
> The pump and single 120mm fan of the fury x?
> Or the triple fan Windforce cooler on the gigabyte 980TI G1?
> 
> Cheers!


I would not worry about it and RMA if it's perceived noisy at idle. It's probably a simple bios fix for pump speed, if you even get one that still has the issue, considering they're all sold out and AMD supposedly took steps to fix the issue.

Maybe getting one from a vendor that expects to restock em soon is a good idea. Could be fine either way, what do I know!


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> What Catalyst version?


Using the latest 15.15 from the AMD site:




It's weird there's nothing in there... should I try uninstalling and reinstalling the driver suite?


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Using the latest 15.15 from the AMD site:
> 
> 
> 
> 
> It's weird there's nothing in there... should I try uninstalling and reinstalling the driver suite?


Has anyone anywhere said that they have already used FRTC? It probably is not included in Catalyst yet. Any other program can do that already, though.


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> Has anyone anywhere said that they have already used FRTC? It probably is not included in Catalyst yet. Any other program can do that already, though.


I saw it in the reviews, some of them showed data of the power savings when activated. I'm pretty sure it's supposed to be in there. Can anyone else confirm if they have it in their CCC?


----------



## bonami2

this seem to be laggy
Quote:


> Originally Posted by *pdasterly*
> 
> Sapphire says warranty will be void but i had no issue with them, rma'd 290x three times, all had waterblocks on them. You have to reinstall original cooler. Athlon micro had quick turn around time too. Me personally I clean the gpu die and reapply thermal paste then get some blue loc-tite for the screws, makes it look all original.


Three times!

Oh, another bad story for my already too long list.

Was it a BIOS issue too? And all they say is "we have tested the GPU and it is working fine"...

MSI at least - I'm like "I think I killed my mobo" and they go "it's fine, we're going to RMA it".


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> I saw it in the reviews, some of them showed data of the power savings when activated. I'm pretty sure it's supposed to be in there. Can anyone else confirm if they have it in their CCC?


Yeah, I found one: http://www.hardwareluxx.com/index.php/reviews/hardware/vgacards/35798-reviewed-amd-r9-fury-x-4gb.html?start=20. It should be under "Performance".

It does not say what driver they used, though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Neon Lights*
> 
> Has anyone anywhere said that they have already used FRTC? It probably is not included in Catalyst yet. Any other program can do that already, though.


FRTC is working on my 390; I'll check my CCC later and see where it's located. I believe it's under Performance?


----------



## xer0h0ur

For some reason tabs go missing in the CCC. I haven't seen the Overdrive tab in the CCC since three drivers ago, and no amount of DDU-ing and/or driver re-installs seems to get it back. Coincidentally, ever since then my tri-fire has gone back to crashing some games at startup. Now I am getting kernel crashes at random too, but we're so close to Windows 10 that I don't even feel like re-installing Windows 7 right now. I have a feeling one of my multitude of virus/rootkit/malware/spyware scanners deleted something Windows-related.


----------



## Digitalwolf

Quote:


> Originally Posted by *en9dmp*
> 
> I saw it in the reviews, some of them showed data of the power savings when activated. I'm pretty sure it's supposed to be in there. Can anyone else confirm if they have it in their CCC?


It might be due to crossfire...

I have a single Fury and under performance my two tabs are... Overdrive and Frame Rate. When I click on Frame Rate there is a box to check and it says Maximum Frame Rate. Then there is a slider with what appears to be a minimum value of 55.

As opposed to the screenshot that, under the same tab, simply shows Overdrive and Crossfire. So my guess is that's why it's not showing. I have no idea if it's supposed to be supported with Xfire or if this is just a bug.


----------



## xer0h0ur

LOL, that actually sounds entirely possible given how poor AMD's support is on new features. We're still waiting on expanded VSR support, FreeSync Crossfire support, and if this is correct, now Crossfire MFR support. I really wish they would just stick to finishing a feature before releasing some other half-baked crap.


----------



## en9dmp

Quote:


> Originally Posted by *Digitalwolf*
> 
> It might be due to crossfire...
> 
> I have a single Fury and under performance my two tabs are... Overdrive and Frame Rate. When I click on Frame Rate there is a box to check and it says Maximum Frame Rate. Then there is a slider with what appears to be a minimum value of 55.
> 
> As opposed to the screen shot that under the same tab simply shows Overdrive and Crossfire. So my guess is that is why its not showing. I have no idea if its supposed to be supported with Xfire or is just a bug.


Good spot. I noticed that I have Crossfire showing twice, once under performance and once under 3d settings, so that also seems kind of strange. It's almost like one of them is meant to be for FRTC but is showing the wrong tab... I can't imagine they wouldn't plan on supporting FRTC with crossfire setups...

If that's the case though they would surely have spotted it during testing. Is anyone else running these in crossfire and getting the same thing?

Also, just to confirm some findings: I also have no OC headroom on these cards. Seems to run OK at a 5% OC, but at 6% Firestrike crashed. Far Cry 4 is butter smooth at 4K ultra settings, which is great; in GTA V I still need to back off some settings to hit 60 fps. Still get the feeling they could really improve performance with the drivers - I hope that's what they're spending all their time on right now.


----------



## xer0h0ur

The CCC has always double listed the Crossfire tab for me.


----------



## Bludge

Hurrah my first Sapphire card shipped, due on Wednesday, now to source a second


----------



## pdasterly

Quote:


> Originally Posted by *bonami2*
> 
> this seem to be laggy
> 3 time
> 
> Oh another bad story to my already too long list
> 
> Was it a bios issue too? And all they says is we have tested the gpu and it working fine.............
> 
> Msi at least im like i think i killed my mobo... It good we are going to rma it


no questions asked


----------



## blue1512

Hot tip: "How to overclock HBM on the Fury X"
So after hardware.info's finding, people are overclocking HBM on the Fury X. Here is the trick:
Quote:


> Hilarious bug in drivers - every other few reboots you get memory clock slider enabled in catalyst control center.
> When such reboot happens, OC the memory and never turn the system off ... let it sleep at night and make the uptime last for months.


Hilarious, but it works.
http://www.techpowerup.com/gpuz/details.php?id=9pdzh


----------



## Neon Lights

Quote:


> Originally Posted by *blue1512*
> 
> Hot tip: "How to overclock HBM on the Fury X"
> So after hardware.info's finding, people are overclocking HBM on the Fury X. Here is the trick:
> Hilarious, but it works.
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


Is that your screenshot?


----------



## blue1512

Quote:


> Originally Posted by *Neon Lights*
> 
> Is that your screenshot?


Nope - it's a *GPU-Z validation link*, by the way.
http://www.techpowerup.com/gpuz/details.php?id=9pdzh
The level of skepticism whenever there is good news for AMD is incredible


----------



## rdr09

Quote:


> Originally Posted by *xer0h0ur*
> 
> For some reason tabs go missing in the CCC. I haven't seen the overdrive tab in the CCC since three drivers ago and no amount of DDUing and/or driver re-installs seems to get it back. Coincidentally ever since then my tri-fire has gone back to crashing some games at startup. Now I am getting kernel crashing at random too but were so close to Windows 10 that I don't even feel like re-installing Windows 7 right now. I have a feeling one of my multitude of virus/rootkit/malware/spyware scanners deleted something windows related.


I always have Overdrive under the Performance tab in CCC - in every driver I use (I never use it, though). Currently on 15.15 for my Crossfire 290s.


----------



## Ceadderman

But do you have the same with Fury X?

R9 290 isn't the ToC here unless it's comparative.









~Ceadder


----------



## rdr09

Quote:


> Originally Posted by *Ceadderman*
> 
> But do you have the same with Fury X?
> 
> R9 290 isn't the ToC here unless it's comparative.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


lol. i am so used to seeing xer0h0ur in hawaii threads. my bad.

i have plans on getting the Fury, though. not sure why 'cause i'm leaving the US soon.


----------



## fewness

Quote:


> Originally Posted by *blue1512*
> 
> Hilarious, but it works.
> http://www.techpowerup.com/gpuz/details.php?id=9pdzh


I want to see a screen cap of this mysterious HBM slider showing up in CCC as proof... before I go crazy rebooting my computer 1000 times...


----------



## Minotaurtoo

Quote:


> Originally Posted by *fewness*
> 
> I want to see a screen cap with this mysterious HBM slider shown up in ccc as the proof...before I go crazy to reboot my computer 1000 times....


I'd just use GPU-Z to snatch the BIOS out, then try to edit it to get the figures I want... and use ATI Winflash to load the modded BIOS onto the card. If it doesn't work, you have a backup via the dual-BIOS switch; if it does, congrats, you have the memory OC'd.
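For anyone considering that route, here is a rough dry-run sketch of the flow. The `atiflash` command names, flags, and file names are assumptions based on the classic CLI syntax - double-check them against your actual flash tool before running anything for real:

```python
# Hedged sketch of the BIOS backup/edit/flash flow described above, as a dry run.
import subprocess

DRY_RUN = True  # flip to False only after verifying every command yourself


def run(*cmd):
    """Print the command in dry-run mode; execute it otherwise."""
    if DRY_RUN:
        print("would run:", " ".join(cmd))
    else:
        subprocess.run(cmd, check=True)


run("atiflash", "-i")                         # list adapters, note the index (0 below)
run("atiflash", "-s", "0", "fury_stock.rom")  # step 1: save the current BIOS (GPU-Z can also dump it)
# step 2: edit fury_stock.rom into fury_mod.rom with a BIOS editor (offline)
run("atiflash", "-p", "0", "fury_mod.rom")    # step 3: program the modded BIOS
# step 4: reboot; if the card won't POST, flip the dual-BIOS switch to recover
```

The dual-BIOS switch is the safety net the post mentions: a bad flash on one BIOS leaves the other untouched.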


----------



## bonami2

this seem to be laggy
Quote:


> Originally Posted by *pdasterly*
> 
> no questions asked


That's not what I was saying. They can take anything, depending on the guy on the other side.

In the first place, what did you need an RMA for? Three times?

I RMA'd the same model of GPU twice expecting to get something that worked, and ended up with the same buggy BIOS.

But anyway, I'm running it until it blows up or I upgrade


----------



## pdasterly

Two cards: one booted to a black screen, the other's performance was way too low, and one had blue vertical lines on the display.
Athlon Micro hooked me up


----------



## bonami2

Quote:


> Originally Posted by *pdasterly*
> 
> two cards, on booted to blackscreen, the other performance was way to low and one had blue vertical lines on the display.
> Athlon micro hooked me up


Well, that's exactly the problem I've heard all around me with Sapphire.

Seems their quality control is pretty low. Sure, their products work when they work, but it seems they have a lot of problems - from killing VRMs with too-high voltage, to bad BIOSes, to rattling fans, and the list goes on.

At least when mine works well I'm happy. I seem to have fixed most of the problems by using the first BIOS without HWiNFO64, and I'm using manual fan speed with the buggy TriXX software.

I just miss my old Asus GTS 250 - that thing was just plug in, put on Folding@home for a month running at like 80°C, and it's still going strong and never crashed







The fan is going out, but it's still going


----------



## pdasterly

Maybe so, but I blame myself for at least one due to user error - first time using water, a few spills here and there


----------



## bonami2

Quote:


> Originally Posted by *pdasterly*
> 
> maybe so but i blame myself for at least one due to user error, first time using water, a few spills here and there


Oh yeah, they did not like the swimming part, haha









Anyway, like I said, when they work, they work (if the VRMs are at good temps) - everything is fine


----------



## Blackops_2

Quote:


> Originally Posted by *Alastair*
> 
> Any news on unlocked voltages yet?


Wondering the same - it's nearly been a week and we don't have any serious overclocking results, which ultimately determine this card's position and value, especially after the inevitable price drop that I suspect will come.


----------



## Ceadderman

I'm gonna put waterwings on mine. They should be right happy for that.









~Ceadder


----------



## ozyo

Quote:


> Originally Posted by *fewness*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my computer went black after this... now the 1st BIOS on the Fury X won't boot, and there is no LED light on the two 8-pin connectors anymore. The "Radeon" light is still on...
> 
> Thank God the card has two BIOSes... and the 2nd one boots fine...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How do I fix this?


1. Back up the working BIOS
2. Switch to the broken BIOS without restarting/powering off the PC
3. Flash it
4. Restart


----------



## Silent Scone

Quote:


> Originally Posted by *Blackops_2*
> 
> Wondering the same it's nearly been a week and we don't have any serious overclocking results. Which ultimately determines this cards position and value, especially after the inevitable price drop that i suspect will come.


Overvoltage control should arrive in the coming weeks. Apparently it's taking longer than usual due to the complexity of the chip.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Silent Scone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blackops_2*
> 
> Wondering the same it's nearly been a week and we don't have any serious overclocking results. Which ultimately determines this cards position and value, especially after the inevitable price drop that i suspect will come.
> 
> 
> 
> Overvolt control should be coming in the coming weeks. Apparently it's taking longer than usual due to the complexity of the chip

That doesn't surprise me; there are a lot of new things about this GPU. Kinda glad mine is going to take so long getting here... sort of.


----------



## blue1512

Has anyone here tried the "restarting trick" yet?
http://www.overclock.net/t/1562593/hardware-info-hbm-on-furyx-can-be-overclocked-after-all
A few numbers from you guys could silence a lot of green trolls in that thread.


----------



## Clockster

Well my 1st card just arrived, need to go collect it.
Just not sure when the 2nd one will be available. These cards seem to be selling really well. As soon as they're in stock they are gone.


----------



## Ganf

Searched 500 posts in this thread and couldn't find it.

What material is the backplate made out of?....


----------



## en9dmp

Quote:


> Originally Posted by *Ganf*
> 
> Searched 500 posts in this thread and couldn't find it.
> 
> What material is the backplate made out of?....


It's hard to tell, it has a matte finish and is slightly rubbery to the touch. It clearly doesn't have any thermally conductive properties... An EK block and backplate would do wonders for this card


----------



## CrazyElf

Quote:


> Originally Posted by *Ganf*
> 
> Searched 500 posts in this thread and couldn't find it.
> 
> What material is the backplate made out of?....


I'm not 100% sure, but I believe it is made of aluminum with some sort of soft-touch coating, the same material that covers the front heatsink. The front is definitely aluminum. The back, I think, is also aluminum coated with the soft-touch material.

They seem to be using some sort of nickel mirror gloss.

I think the front is removable. It might be interesting to see if a custom 3D-printed shroud could replace the stock one, although it'd be purely for looks.

Quote:


> Originally Posted by *G227*
> 
> Agreed
> 
> 
> 
> 
> 
> 
> 
> . I would add though that I like what AMD has done with VRM cooling here - the copper heatsink that is actively cooled is a great idea and I would wager that it does much better job than 980Ti/Titan X without full water block. On my Titan X I use the hybrid AIO and just installed a "Manhattan" of copper heatsinks (
> 
> 
> 
> 
> 
> 
> 
> ) on the backplate and they all get incredibly hot - especially those above the VRMs. Frankly I think that the stock small heatsink that cools the VRAM/VRMs is just plain insufficient and the lack of monitoring is something that NVIDIA needs to work on.
> 
> We will probably have to see 980Ti KINGPIN or the new HOF LN2 edition to really take on the Titan X. The G1 Gaming clocked at 1529MHz still fell short of Titan X clocked at 1400. So seeing people get up to 1500-1550 on their titans, you will need ~1700 to push over it and that will be a tough cookie even for 1.4V cards.


One of the reasons I like the idea of custom PCBs is that they tend to have overkill VRMs.

Stock PCB:



Example of custom PCB:



Those extra VRMs should help overclocking headroom.


----------



## Neon Lights

Alright, I received my graphics cards today at around 1pm. I already installed them. Any questions?

To everyone who has one (or multiple): I am getting stuttering in desktop mode; whenever I move something around it stutters a bit, the mouse cursor too. Does that happen to you too? It happens with and without Crossfire enabled. I have the drivers installed that came on the DVD (the 15.15 beta drivers). This was not the case with my previous graphics cards (260X, prior to that 7970).
I also cannot find FRTC in Catalyst.


----------



## Ganf

Quote:


> Originally Posted by *CrazyElf*
> 
> I'm not 100% sure I believe that it is made of aluminum with some sort of soft-touch material coating it, which is the same material that covers the front heatsink. For sure the front is made of aluminum though. The back I think is also aluminum coating with the soft-touch material.
> 
> They seem to be using some sort of nickel mirror gloss.
> 
> I think that the front is removable. It might be interesting to see if a custom modded 3D printed shroud could be able to replace the stock, although it'd be purely for the looks.
> One of the reasons why I like the idea of custom PCBs is because they tend to have overkill VRMs.
> 
> Stock PCB:
> 
> Example of custom PCB:
> 
> Those extra VRMs should help overclocking headroom.


As an owner of that second PCB I'll attest that it makes overclocking easier and keeps temps down, but doesn't really give you any more headroom unless you're going for LN2 runs.


----------



## rt123

Quote:


> Originally Posted by *Ganf*
> 
> As an owner of that second PCB I'll attest that it makes overclocking easier and keeps temps down, but doesn't really give you any more headroom unless you're going for LN2 runs.


It does and is supposed to give you more headroom, but it's crippled by the black screen bug.


----------



## pdasterly

Quote:


> Originally Posted by *rt123*
> 
> Its does & is supposed to give you more headroom, but its crippled by the BlackScreen bug.


I agree with that. With my 290X, the VRM temps were the only thing holding the card back until I went full waterblock. I modded cards with the NZXT bracket, Gelid heatsinks, Heatkiller backplate, fans, and Fujipoly Ultra Extreme pads, basically the cream of the crop, and VRM temps were always an issue once you got into a heavy overclock on top of a weak card.

http://www.3dmark.com/fs/3346716


----------



## Gumbi

Quote:


> Originally Posted by *Neon Lights*
> 
> Alright, I received my graphic cards today at around 1pm. I already installed them. Any questions?
> 
> To everyone who has one (or multiple ones): I am getting stuttering in desktop mode, whenever I move something around it stutters a bit, the mouse cursor too. Does that happen to you too? It happens with and without Crossfire being enabled. I have the drivers installed that came on the DVD (which are the 15.15 beta drivers). This was not the case with my other graphic card (260X, prior to that 7970).
> I also can not find FRTC in Catalyst.


There might be newer drivers available on AMD's site, check it out and dl/install them. Might help.


----------



## Final8ty

Quote:


> Originally Posted by *Ganf*
> 
> Searched 500 posts in this thread and couldn't find it.
> 
> What material is the backplate made out of?....


Aluminum.


----------



## Ganf

Quote:


> Originally Posted by *Final8ty*
> 
> Aluminum.


Aluminum with a rubberized coating?

If so a quick dip in some acetone will make it a real backplate again.









After that you'd just need some thermal pads on the back of the VRM's and thermal issues are a thing of the past.


----------



## bonami2

Quote:


> Originally Posted by *Ganf*
> 
> Aluminum with a rubberized coating?
> 
> If so a quick dip in some acetone will make it a real backplate again.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After that you'd just need some thermal pads on the back of the VRM's and thermal issues are a thing of the past.


Ain't sure a thermal pad will change anything when you're pumping 300W+ through a weak 8-phase setup, or the 6-phase on the current Fury X.

The 7970 had the Matrix with a 20-phase setup, and the R9 290X had the Lightning with like 14 phases to help.


----------



## DFroN

Couple of Fury X's in stock in the UK:

XFX @ Scan http://www.scan.co.uk/products/4gb-xfx-radeon-r9-fury-x-28nm-pcie-30-(x16)-1000mhz-hbm-1050mhz-gpu-4096-streams-aio-water-cooled-3x £546.62

Asus @ OCUK https://www.overclockers.co.uk/showproduct.php?prodid=GX-375-AS&groupid=701&catid=56&subcat=3068 £649.99


----------



## blue1512

Quote:


> Originally Posted by *bonami2*
> 
> Aint sure thermal pad will change anything when your pumping 300w + from a weak 8 phase setup and or 6 phase on the current fury x
> 
> 7970 had matrix 20 phase and r9 290x had lighthing with like 14 phase to help


The number of phases doesn't dictate the temps; it just helps keep the voltage fed to the core more stable. And compared to the phases on the Fury X, those crappy ones on the Titan X look like a joke on a $999 card.


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> Aint sure thermal pad will change anything when your pumping 300w + from a weak 8 phase setup and or 6 phase on the current fury x
> 
> 7970 had matrix 20 phase and r9 290x had lighthing with like 14 phase to help


Help? No.

It's pure overkill. VRM temps on my 290x Lightning are 10c lower than core temps at all times. It serves no purpose unless you're doing extreme benchmarking. Thermal pads putting the hottest part of the PCB in contact with a 6" aluminum plate will help plenty.


----------



## bonami2

Quote:


> Originally Posted by *Ganf*
> 
> Help? No.
> 
> It's pure overkill. VRM temps on my 290x Lightning are 10c lower than core temps at all times. It serves no purpose unless you're doing extreme benchmarking. Thermal pads putting the hottest part of the PCB in contact with a 6" aluminum plate will help plenty.


Well, say what you want, but more phases = more surface area in contact = lower temps.

Pads aren't magical; people report at best a 20°C drop, and that's just nothing.

Anyways, lower VRM temps = lower capacitor temps, which is required for folding and/or mining overclocked to death.


----------



## DividebyZERO

Quote:


> Originally Posted by *bonami2*
> 
> well says what you want but more phase = more surface area in contact = lower temp.
> 
> pad arent magical people report at best 20c drop that just nothing
> 
> anyways lower vrm = lower capacitor temp for folding and or mining overclocked to death it required


----------



## NBrock

If any of you guys get one and are into Folding@home, please post PPD information.


----------



## bonami2

Quote:


> Originally Posted by *DividebyZERO*


Quote:


> Originally Posted by *bonami2*
> 
> well says what you want but more phase = more surface area in contact = lower temp.
> 
> pad arent magical people report at best 20c drop that just nothing


Did I miss something? A bit hard to read and quote stuff on my Nexus 5, ahah.


----------



## bonami2

Quote:


> Originally Posted by *NBrock*
> 
> If any of you guys get one and are into Folding@home please post PPD information.


Waiting until after August to buy.







Maybe the 8GB version.


----------



## Neon Lights

I also had the memory bug.



In the "Furry and Tessy" test (1920x1080, 4xMSAA) in MSI Kombustor 2.5.0, a 600MHz memory clock (and standard core clock) gives me 57FPS instead of 49FPS.
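For context, here's the arithmetic on those numbers as a quick sketch (assuming the stock HBM clock is 500 MHz, per the spec sheet):

```python
mem_stock, mem_oc = 500, 600   # HBM clock in MHz (500 MHz stock assumed from the spec)
fps_stock, fps_oc = 49, 57     # Kombustor "Furry and Tessy" results quoted above

# Percent gain from the memory overclock vs. percent gain in FPS
mem_gain = (mem_oc - mem_stock) / mem_stock * 100
fps_gain = (fps_oc - fps_stock) / fps_stock * 100
print(round(mem_gain, 1), round(fps_gain, 1))  # 20.0 16.3
```

A roughly 16% FPS gain from a 20% memory overclock suggests this particular test is almost entirely bandwidth-bound.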


----------



## bonami2

Quote:


> Originally Posted by *Neon Lights*
> 
> I also had the memory bug.
> 
> 
> 
> In the "Furry and Tessy" Test (1920x1080, 4xMSAA) in MSI Kombustor 2.5.0 600MHz memory clock (and standard core clock) gives me 57FPS instead of 49FPS.


Impressive, and with a core clock bump you could do even better.


----------



## Neon Lights

Quote:


> Originally Posted by *bonami2*
> 
> impressive and with core clock you could do even better


Nope, not in that test. 105MHz more on the core only gave me 1FPS more. This test scales almost entirely with memory performance.


----------



## Clockster

https://community.amd.com/docs/DOC-1281
Quote:


> The AMD Radeon™ R9 Fury X graphics card industrial design was created with the goal of embodying a professional, elegant and simple design. Using multiple pieces of aluminum die cast finished in black nickel and a soft touch black, the full metal construction makes the graphics card feels as good as it looks. We are extremely pleased with the outcome of the design but also understand there are always different design ideas out there.
> 
> During the process of creating the industrial design for the AMD Radeon™ R9 Fury X graphics card we encountered a variety of unique perspectives within AMD on how it should look. These differentiating opinions made us think, what if we could enable our customers to implement their own creativity on our design? To do this we incorporated a removable front plate on the AMD Radeon™ R9 Fury X graphics card to allow for customer creativity. Below you will find a link to download the 3D model for the face plate to help get you started on designing your own 3D printed or CNC front plate.
> 
> Please ensure to take all necessary precautions prior to removing the front plate from the graphics card; these include but are not limited to:
> 
> Do not remove the front plate while the graphics card is installed inside a system
> Do not remove the front plate while the graphics card is powered or operational
> Ensure your workspace is clear of debris and appropriate electrostatic discharge (ESD) protection is taken
> The front plate can be removed by removing the four hex screws from the front of the graphics card as illustrated below
> Do not remove any other screws or modify any other components on the graphics card
> Use a proper hex key or screwdriver to remove the screws from the front plate to avoid damaging the screws
> When reinstalling the screws do not to overtighten the screws


Pretty cool if you're into modding your rig.


----------



## bonami2

Quote:


> Originally Posted by *Neon Lights*
> 
> Nope, not in that test. 105MHz more only gave me 1FPS more. This test almost only scales with memory performance.


Oh yeah, I did think the increase was extreme, ahah.

well


----------



## Elmy

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're sponsored by Club3D aren't you?


Yes


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> did i miss something a bit hard to read and quote stuff on my nexus 5 ahah


Yeah, you say pads aren't magical and then point out they drop temps by 20c putting the VRM temps on par with your average aftermarket 290x.

You can't compare a reference card to something like a Lightning, it's apples and oranges. There is a reason the 290x Lightnings were $700 on launch.


----------



## Forceman

Quote:


> Originally Posted by *Neon Lights*
> 
> I also had the memory bug.
> 
> 
> 
> In the "Furry and Tessy" Test (1920x1080, 4xMSAA) in MSI Kombustor 2.5.0 600MHz memory clock (and standard core clock) gives me 57FPS instead of 49FPS.


Did you get a chance to run some side-by-side comparisons of any games or benchmarks? Even Firestrike, Heaven, or Valley stock/overclocked HBM results would be useful.


----------



## p4inkill3r

Quote:


> Originally Posted by *Forceman*
> 
> Did you get a chance to run some side-by-side comparisons of any games or benchmarks? Even Firestrike, Heaven, or Valley stock/overclocked HBM results would be useful.


Yes, please!


----------



## Agent Smith1984

Any Fury owners have a Heaven run at these settings?


----------



## bonami2

Quote:


> Originally Posted by *Ganf*
> 
> Yeah, you say pads aren't magical and then point out they drop temps by 20c putting the VRM temps on par with your average aftermarket 290x.
> 
> You can't compare a reference card to something like a Lightning, it's apples and oranges. There is a reason the 290x Lightnings were $700 on launch.


90°C - 20 = 70°C; add a bit of voltage and I'm back at 100°C.


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> 90c - 20 = 70 ad a bit of voltage im back at 100c


Sounds like an AMD card to me.









My 6870 and 7970 Sapphire VRM's were around 95c OC'd also.


----------



## bonami2

Quote:


> Originally Posted by *Ganf*
> 
> Sounds like an AMD card to me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 6870 and 7970 Sapphire VRM's were around 95c OC'd also.


Exactly, on the 7950; I'm using an infrared thermometer.


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> exactly 7950 im using an infrared thermometer


That's right, you're the guy with the 7950 that was imported from a carpet merchant in Istanbul that's turned yellow from folding.

Your card is not normal, and should probably be caged and muzzled.


----------



## rt123

Quote:


> Originally Posted by *Ganf*
> 
> That's right, you're the guy with the *7950 that was imported from a carpet merchant in Istanbul that's turned yellow from folding*.
> 
> Your card is not normal, and should probably be caged and muzzled.












That sounds like some urban legend.


----------



## bonami2

Quote:


> Originally Posted by *Ganf*
> 
> That's right, you're the guy with the 7950 that was imported from a carpet merchant in Istanbul that's turned yellow from folding.
> 
> Your card is not normal, and should probably be caged and muzzled.


Lol, laugh as much as you want; the GTS 250 changed color. She has over 10k hours of gaming plus about a month of folding. Those old friends were the kind of no-lifers that burned through two GPUs over years of playing WoW back in the day, and they explicitly said that the Sapphire GPUs are crap. Gonna trust them and my own knowledge.

PS: I may add the GTS 250 is bent from its own weight, to the point that we think the solder is going to fail. Need to ask my cousin for a picture.


----------



## bonami2

I have 6000 hours since 2011 on Raptr. That GPU had 3 owners, and I used it for like 4000 hours.


----------



## StereoPixel

Is there a Fury X + Cooler Master Elite 110 case owner here?


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> i have 6000 hour since 2011 on raptr. that gpu had 3 owner and i used it for like 4000 hour


Dem 7 month winters.


----------



## bonami2

Hmm, I remember.
Quote:


> Originally Posted by *Ganf*
> 
> Dem 7 month winters.


Yeah, those winters are long, so gaming time.









Well, the other day my friend was like, "Hey man, I've never cleaned the GTS 250 yet..." I looked at the GPU and I'm like, dafuq, it's full of dust... Took my infrared thermometer and checked the temps in-game; the damn thing was at like 120°C on the VRMs...

Tried FurMark, hit 140°C on the VRMs, and stopped looking at that World War 2 GPU.

Got the compressor out and cleaned it. Tested again: 70°C under load.









That's the model: https://www.asus.com/Graphics_Cards/ENGTS250_DKDI1GD3WW/ I even played BF3 on that thing.


----------



## CM Felinni

Quote:


> Originally Posted by *StereoPixel*
> 
> There is Fury X + Cooler Master Elite 110 Case owner?


I'd like to see this also







#beastly


----------



## Kane2207

Quote:


> Originally Posted by *Neon Lights*
> 
> I also had the memory bug.
> 
> 
> 
> In the "Furry and Tessy" Test (1920x1080, 4xMSAA) in MSI Kombustor 2.5.0 600MHz memory clock (and standard core clock) gives me 57FPS instead of 49FPS.


Yay, a conclusive test! Thanks

+rep


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> I also had the memory bug.
> 
> 
> 
> In the "Furry and Tessy" Test (1920x1080, 4xMSAA) in MSI Kombustor 2.5.0 600MHz memory clock (and standard core clock) gives me 57FPS instead of 49FPS.


Dude, that's a huge fps increase, much more than overclocking core gives...

Can you post a screenshot of CCC with the memory slider active? How many restarts were required to unlock it? Can you post any gaming or 3D Mark scores with the memory @ 600?


----------



## hyp36rmax

Quote:


> Originally Posted by *en9dmp*
> 
> Dude, that's a huge fps increase, much more than overclocking core gives...
> 
> Can you post a screenshot of CCC with the memory slider active? How many restarts were required to unlock it? *Can you post any gaming or 3D Mark scores with the memory @ 600?*


This is what I want to see as more members have achieved the OC without showing the CCC screen.


----------



## Agent Smith1984

Quote:


> Originally Posted by *en9dmp*
> 
> Dude, that's a huge fps increase, much more than overclocking core gives...
> 
> Can you post a screenshot of CCC with the memory slider active? How many restarts were required to unlock it? Can you post any gaming or 3D Mark scores with the memory @ 600?


That is quite a jump, but you gotta figure.... you overclock HBM 100MHz on these cards and that's a gigantic OC (20%) for the VRAM
Hopefully we can see some 20% core overclocks once somebody gets those voltage sliders moving (not holding my breath for more than 12-15% on these cards though.)

Mainly interested in results that may translate to the possibilities for Fury vanilla...

I'll be at a crossroads by the time Fury launches: either add another 390 (whopping performance boost, but more power and higher temps to follow), or maybe move this 390 to the wife's project box and go with a single Fury.


----------



## Gdourado

Any big issues with mounting the Fury X radiator like this:









With the tubing on the side and the radiator at the same level as the card.


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Dude, that's a huge fps increase, much more than overclocking core gives...
> 
> Can you post a screenshot of CCC with the memory slider active? How many restarts were required to unlock it? Can you post any gaming or 3D Mark scores with the memory @ 600?




I had two restarts with only one card active and both times the memory slider was in Catalyst. I only own 3DMark 11, not Fire Strike. What games?
Try booting with one card yourself, I bet you are also going to see the memory slider in Catalyst.


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> 
> 
> I had two restarts with only one card active and both times the memory slider was in Catalyst. I only own 3DMark 11, not Fire Strike. What games?
> Try booting with one card yourself, I bet you are also going to see the memory slider in Catalyst.


Nice; kinda defeats the point a little for me if this can't be done in Crossfire, as obviously adding a card has more impact than overclocking the memory... Seems like they're punishing people for buying 2 cards! The fact that the memory "bug" and FRTC are disabled when 2 cards are running...

Out of interest, it would be interesting to see some stock vs. memory-OC tests for GTA V, as it's a memory-intensive game... I think you can get basic Fire Strike (not Advanced or Ultra) for free on Steam. I could try it, but I'm working late tonight and won't be able to until tomorrow...


----------



## DNMock

Can someone help me out a bit here?

The difference between a 980ti and a Fury X is larger at lower resolutions because the Nvidia cards are not bandwidth bottlenecked at those resolutions. The Fury X closes the gap at 4k due to the 980ti/Titan X facing bandwidth bottlenecks at 4k.

Is this correct?

If so, I don't understand how overclocking the memory on the Fury X would give it a boost at lower resolutions like 1080p and 1440p. If the 980 Ti's ~336 GB/s is more than enough, I don't see how pushing a card that already has 500+ GB/s of bandwidth even higher would make any notable change at all. Now at 4K, or 8K or whatever, I could definitely see this having a positive impact, but not at lower resolutions.
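For what it's worth, the raw bandwidth numbers back up that framing. A quick back-of-the-envelope sketch (bus widths and per-pin data rates taken from the published spec sheets, so treat them as assumptions rather than measurements):

```python
def bandwidth_gbps(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width in bytes * effective data rate per pin."""
    return bus_width_bits / 8 * gbps_per_pin

# Fury X HBM: 4096-bit bus, 500 MHz DDR -> 1.0 Gbps per pin
fury_x_stock = bandwidth_gbps(4096, 1.0)   # 512.0 GB/s
# Same bus with the 600 MHz HBM overclock people are reporting
fury_x_oc = bandwidth_gbps(4096, 1.2)      # 614.4 GB/s
# 980 Ti GDDR5: 384-bit bus at 7 Gbps per pin
gtx_980_ti = bandwidth_gbps(384, 7.0)      # 336.0 GB/s

print(fury_x_stock, fury_x_oc, gtx_980_ti)
```

So a stock Fury X already has roughly 50% more raw bandwidth than a 980 Ti, which is exactly why a memory OC helping at 1080p is surprising unless something else (driver overhead, latency) is in play.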


----------



## lagittaja

Yea, just get the 3DMark Basic Edition which on Steam is just listed as "demo". You can run the normal Firestrike.


----------



## hyp36rmax

Quote:


> Originally Posted by *Gdourado*
> 
> Any big issues by assembling the Fury X radiator like this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the tubbing on the side and the radiator on the same level as the card.


OMG makes me want *two* R9 FURY X2's when they are released! LOL


----------



## taem

Quote:


> Originally Posted by *CrazyElf*
> 
> Those extra VRMs should help overclocking headroom.


Has there ever been a study looking into whether custom PCBs with beefed-up VRMs OC better than reference cards? Maybe the high-end cards like Classifieds and Lightnings do. But as far as I can tell, just aggregating in my mind the stuff I've read over the years, it sure doesn't seem like the non-high-end custom PCBs OC any better than reference.


----------



## en9dmp

Quote:


> Originally Posted by *DFroN*
> 
> Couple of Fury X's in stock in the UK:
> 
> XFX @ Scan http://www.scan.co.uk/products/4gb-xfx-radeon-r9-fury-x-28nm-pcie-30-(x16)-1000mhz-hbm-1050mhz-gpu-4096-streams-aio-water-cooled-3x £546.62
> 
> Asus @ OCUK https://www.overclockers.co.uk/showproduct.php?prodid=GX-375-AS&groupid=701&catid=56&subcat=3068 £649.99


Big +1 for Scan from me. I got lucky and bought both of mine for £519 on the 24th... By the time I checked out they were all gone! Even with the low stock they're not trying to screw people like Overclockers... Interestingly, Overclockers are also flogging a few brands of 980 Tis for £509 this week only. Big incentive there for anyone on the fence... Are they in cahoots with NVIDIA or something?


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Nice, kinda defeats the point a little for me if this can't be done in crossfire as obviously the addition of a card has more impact than overclocking the memory... Seems like they're punishing people for buying 2 cards! The fact the the memory "bug" and frtc are disabled when 2 cards are running...
> 
> Out of interest would be interesting to see some tests for GTA V as it's a memory intensive game with stock vs mem oc... I think you can get the basic fire strike (not advanced or ultra) for free on steam. I could try it but working late tonight and won't be able to until tomorrow...


Oops, I was wrong:



Apparently I just did not see it the last time I had checked if the memory slider was in Catalyst when I had Crossfire enabled. So yes, it works with Crossfire too.

*Edit:* While it works with Crossfire enabled, the memory slider is only present for the first card. Since we do not have SFR yet (except for in one game I think), the slower card always limits the faster one, so practically it has no use using this memory overclocking in Crossfire.


----------



## en9dmp

Quote:


> Originally Posted by *hyp36rmax*
> 
> OMG makes me want *two* R9 FURY X2's when they are released! LOL


Lol, so much nicer than my ghetto htpc setup for now... 

Need to build a bigger case and get those EK blocks when they come out...


----------



## DFroN

Quote:


> Originally Posted by *en9dmp*
> 
> Big +1 for Scan from me. I got lucky and bought both mine for £519 on the 24th... By the time I checked out they were all gone! Even with the low stock they're not trying to screw people like overclockers... Interestingly overclockers are also flogging a few brands of 980 Ti's for £509 this week only. Big incentive there for anyone on the fence... They in cahoots with nVidia or something?


I saw the Scan Furys for £520, but by the time I had read the reviews they had sold out.







OCUK had really inflated 980Ti prices on release too. My local shop has Fury X's listed for £503 but I haven't seen them have any stock.


----------



## en9dmp

Quote:


> Originally Posted by *Gdourado*
> 
> Any big issues by assembling the Fury X radiator like this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the tubbing on the side and the radiator on the same level as the card.


Nice braided power cables... Are they the corsair aftermarket ones? I really need to get me those if they are. The stock ones are ridiculously stiff. Not much good in a tiny tiny case...


----------



## Agent Smith1984

Isn't it funny how AMD is in a kind of "damned if you do, damned if you don't" scenario right now?

They put out a card 2 years ago that beat out NVIDIA's best for less money, but the gripe was that it was too hot and used too much power...

Now they put out a card that is slightly slower but is a big leap forward in temp/power usage, and everyone says it's slower than NVIDIA....

If they'd have put this card out with 1200/600 clocks, it would be beating the 980 Ti, and everyone would be calling it a power hog all over again, saying that even though the water cooling is nice, it still runs warm (because of the voltage likely needed to reach those clocks with stability).

This card isn't for me (not the X anyways), but I am really looking forward to seeing the progression of this series as the amount of people with cards in hand increase.


----------



## Slink3Slyde

Quote:


> Originally Posted by *DNMock*
> 
> Can someone help me out a bit here?
> 
> The difference between a 980ti and a Fury X is larger at lower resolutions because the Nvidia cards are not bandwidth bottlenecked at those resolutions. The Fury X closes the gap at 4k due to the 980ti/Titan X facing bandwidth bottlenecks at 4k.
> 
> Is this correct?
> 
> If so I don't understand how overclocking the memory on the Fury X would give it a boost at lower res like 1080p and 1440p. If 396 GB/S is more than enough, I don't see how pushing a card that already has 500+ GB/S worth of bandwidth even higher would have any notable change at all. Now at 4k, or 8k or whatever, I could definitely see this having a positive impact, but not at lower resolutions.


Someone mentioned somewhere here that AMD doesn't do so well at lower resolutions because of the high DX11 CPU overhead in the drivers.

Don't know how true that is; I always thought it was memory bandwidth, same as you.


----------



## flopper

Quote:


> Originally Posted by *Slink3Slyde*
> 
> Someone mentioned somewhere here that AMD doesnt Do so well at the lower resolutions because of the bad Dx11 CPU overhead on the drivers.
> 
> Dont know how true that is, I always thought it was memory bandwidth same as you.


It's a bit more complex than that.
They made design choices aimed more at 4K resolutions than 1080p.
I expect it to shine there once the drivers mature.


----------



## Minotaurtoo

Well, I placed my order for my Fury X... expected mid next month. Now I have to find the boxes for my old cards so I can pack them up and sell them on eBay, or here if anyone wants them. I'll be selling them and a few other items to make up the $$; can't really afford not to, lol. Oh well, this will be the end-of-the-line upgrade for my current PC, since it's a dead-end socket now that AMD is moving on from AM3... about time, lol. Anyway, I don't think my CPU could feed much more than the Fury X anyway; even now with the twins I see it hitting 70% CPU usage at over 5GHz. But maybe this rig will last a few years and give me time to save for my next one... sounds like I gave it a death sentence, don't it?


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> Oops, I was wrong:
> 
> 
> 
> Apparently I just did not see it the last time I had checked if the memory slider was in Catalyst when I had Crossfire enabled. So yes, it works with Crossfire too.


Nice... I just restarted 5 times and the memory slider hasn't appeared.

Can't be a$$ed to keep trying. Did you apply the overclock in single card mode, then power off and seat the second card?


----------



## en9dmp

Quote:


> Originally Posted by *Minotaurtoo*
> 
> well I placed my order for my fury x.... expected mid next month... now I have to find my boxes for my old cards so I can pack them up and sell them on ebay or here if anyone wants them... I'll be selling them and a few other items to make up the $$... can't really afford not to lol... oh well.. this will be the end of the line upgrade for my current pc... since its a dead end socket now that amd is moving on from am3 socket...bout time lol... anyway, I don't think my cpu could feed much more than the fury x anyway... even now with the twins I see it hitting 70% cpu usage at over 5ghz... but maybe this rig will last a few years to give me time to save for my next one... sounds like I gave it a death sentence don't it.




Got a lovely 290x with red backplate for sale of anyone wants it!


----------



## josephimports

Spoiler: Warning: Spoiler!







Sapphire Fury X

I'll post OC and temp results later. Waiting for the Egg to restock so I can order #2.


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Nice... I just restarted 5 times and the memory slider hasn't appeared.
> 
> Can't be a$$ed to keep trying. Did you apply the overclock in single card mode, then power off and seat the second card?


Yes, it did not work.
I can overclock the first card's memory, but if I switch to the second one in Catalyst there is no memory slider (see my edit).

Have you tried to enable Overdrive before rebooting? That may be "helping".


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> Yes, it did not work.
> I can overclock the first card's memory, but if I switch to the second one in Catalyst there is no memory slider (see my edit).
> 
> Have you tried to enable Overdrive before rebooting? That may be "helping".


yeah I've had overdrive enabled constantly so I can make full use of my epic 4% core OC









Think I'll wait until afterburner supports voltage and memory unlock for now... Might as well stick to far cry 4 for the time being. Runs like a dream


----------



## StereoPixel

Fury X @ 1150 and Fury X @1174 MHz
http://forums.guru3d.com/showpost.php?p=5109233&postcount=64


----------



## Ganf

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127890&ignorebbr=1&cm_re=PPSSPYMKCFJSSG-_-14-127-890-_-Product&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=

In stock. It's at a $30 premium ($679.99) but it's in stock.


----------



## p4inkill3r

I'm waiting, but someone better jump on it!


----------



## ban25

Quote:


> Originally Posted by *Ganf*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127890&ignorebbr=1&cm_re=PPSSPYMKCFJSSG-_-14-127-890-_-Product&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=
> 
> In stock. It's at a $30 premium ($679.99) but it's in stock.


Looks like it's already sold out.


----------



## Cool Mike

No longer in stock.


----------



## Minotaurtoo

well that was quick


----------



## Cool Mike

Does anyone know if the Sapphire version actually has the Sapphire/Fury silk screening on the body of the card as shown in many pictures?


----------



## Ganf

I give up. I found plenty of places that listed 3584 cores but nothing with a good source. Fury core count is still up for speculation but I'll eat my boots if it's 4096.


----------



## RaduZ

Quote:


> Originally Posted by *Ganf*
> 
> I give up. I found plenty of places that listed 3584 cores but nothing with a good source. Fury core count is still up for speculation but I'll eat my boots if it's 4096.


Edit: Nope don't mind me


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ganf*
> 
> I give up. I found plenty of places that listed 3584 cores but nothing with a good source. Fury core count is still up for speculation but I'll eat my boots if it's 4096.


do you want fries with that?







just joking... I'll be there with you eating my boots if they keep it the same.


----------



## PontiacGTX

BTW, did anyone test which DirectX 12 feature level the Fury X supports? Since it's a slightly improved architecture, there might be something new.


----------



## hyp36rmax

Quote:


> Originally Posted by *en9dmp*
> 
> Lol, so much nicer than my ghetto htpc setup for now...
> 
> Need to build a bigger case and get those EK blocks when they come out...


Welcome to the club!


----------



## hyp36rmax

Quote:


> Originally Posted by *Neon Lights*
> 
> Oops, I was wrong:
> 
> 
> 
> Apparently I just did not see it the last time I had checked if the memory slider was in Catalyst when I had Crossfire enabled. So yes, it works with Crossfire too.
> 
> *Edit:* While it works with Crossfire enabled, the memory slider is only present for the first card. Since we do not have SFR yet (except for in one game I think), the slower card always limits the faster one, so practically it has no use using this memory overclocking in Crossfire.


Welcome to the club!


----------



## lagittaja

Quote:


> Originally Posted by *Ganf*
> 
> I give up. I found plenty of places that listed 3584 cores but nothing with a good source. *Fury core count is still up for speculation but I'll eat my boots if it's 4096.*


Me too. No way. Who would buy Fury X then?

Personally, the 56 CU Fury with 3584 cores seems logical (#1601). Anything more and the gap could be too small; anything less and the gap could be too big, especially considering their price points. Wasn't it stated that the Fury MSRP is $549 vs. the $649 of the Fury X?


----------



## m0n4rch

Quote:


> Originally Posted by *DNMock*
> 
> Can someone help me out a bit here?
> 
> The difference between a 980ti and a Fury X is larger at lower resolutions because the Nvidia cards are not bandwidth bottlenecked at those resolutions. The Fury X closes the gap at 4k due to the 980ti/Titan X facing bandwidth bottlenecks at 4k.
> 
> Is this correct?
> 
> If so I don't understand how overclocking the memory on the Fury X would give it a boost at lower res like 1080p and 1440p. If 396 GB/S is more than enough, I don't see how pushing a card that already has 500+ GB/S worth of bandwidth even higher would have any notable change at all. Now at 4k, or 8k or whatever, I could definitely see this having a positive impact, but not at lower resolutions.


Quote:


> Originally Posted by *Slink3Slyde*
> 
> Someone mentioned somewhere here that AMD doesnt Do so well at the lower resolutions because of the bad Dx11 CPU overhead on the drivers.
> 
> Dont know how true that is, I always thought it was memory bandwidth same as you.


I did the API overhead test with 290X and 970 in the same system.





And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding on this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.
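A quick back-of-envelope on the bandwidth question quoted above. This is only a sketch with assumed constants (4 bytes/pixel, ~4x overdraw — both made up for illustration), but the point stands: raw framebuffer writes are a tiny slice of the Fury X's 512GB/s, so the low-res gap is much more plausibly driver overhead than memory bandwidth:

```python
# Rough GB/s spent just writing the colour buffer each second,
# assuming 4 bytes per pixel and ~4x overdraw (both illustrative).
def framebuffer_gbps(width, height, fps, bytes_per_pixel=4, overdraw=4):
    return width * height * bytes_per_pixel * overdraw * fps / 1e9

for (w, h), fps in [((1920, 1080), 120), ((3840, 2160), 60)]:
    print(f"{w}x{h} @ {fps}fps: ~{framebuffer_gbps(w, h, fps):.1f} GB/s of 512 GB/s")
```

Texture and geometry traffic dominates real workloads, but that scales with scene complexity more than with resolution, which also fits the DX11-overhead explanation.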


----------



## Blackops_2

Quote:


> Originally Posted by *m0n4rch*
> 
> I did the API overhead test with 290X and 970 in the same system.
> 
> 
> 
> 
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.


Well considering they're moving to DX12 in the future idk that you'll have to worry about it. I agree on the issue though.


----------



## m0n4rch

Quote:


> Originally Posted by *Blackops_2*
> 
> Well considering they're moving to DX12 in the future idk that you'll have to worry about it. I agree on the issue though.


Most games we play now are DX11, and there'll still be a lot of DX11 games in the future.


----------



## DNMock

Someone needs to just gut it out and bust out the multi-meter and pencil-mod to put that extra voltage on them to see what kind of overclock they can swing lol


----------



## rv8000

Interesting find in the EK thread for the Fury X block: if GPU-Z is reading correctly, vcore under load is sitting @ 1.131v, which is considerably lower than Hawaii's. I feel a bit more optimistic about unlocked voltage and overclocking, as long as voltages such as 1.2-1.3v don't have adverse effects on the memory controller/HBM.

http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block/60#post_24106463


----------



## Minotaurtoo

Quote:


> Originally Posted by *m0n4rch*
> 
> I did the API overhead test with 290X and 970 in the same system.
> 
> 
> 
> 
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.


AMD Mantle seems to even the issue out... oh wait, it seems to run plumb away with it. I don't really know what Mantle's about, but the numbers there are huge.


----------



## josephimports

Quote:


> Originally Posted by *rv8000*
> 
> Interesting find in the EK thread for the block for Fury X, if GPUZ is reading correctly, vcore under load is sitting @ 1.131v which is considerably low compared to Hawaii. I feel a bit more optimistic about unlocked voltage and overclocking as long as voltages such as 1.2-1.3 don't have adverse affects on the memory controller/hbm.
> 
> http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block/60#post_24106463





Spoiler: Warning: Spoiler!







Mine loads @ 1.181v with a max of 1.22v. No pump noise, minor coil whine, and artifacts above 1130MHz.


----------



## Ultracarpet

Quote:


> Originally Posted by *lagittaja*
> 
> *Me too. No way. Who would buy Fury X then?*
> 
> Personally the 56 CU Fury or 3584 cores seems logical (#1601). Anything more and the gap could be too small. Anything less and the gap could be too big. Especially considering their price points. Wasn't it stated that Fury MSRP is 549$ vs the 649$ of Fury X?


The same people who bought FX 9590's... Lol.


----------



## Slink3Slyde

Quote:


> Originally Posted by *m0n4rch*
> 
> I did the API overhead test with 290X and 970 in the same system.
> 
> 
> 
> 
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.


It's a well-known issue - thanks for posting, though. But it doesn't explain why overclocking the HBM would increase performance at lower resolutions, I don't think.

Unless it's something to do with HBM being a bit less effective the lower the res. That would explain all the "built for 4K" marketing, I suppose.


----------



## Ganf

Meh, he just needs to fix his card.

http://www.3dmark.com/aot/15644

That's disregarding the fact that the API test is borked and has been since day one - right now, half the time it tells me that my card doesn't support DX11.

People need better complaints.


----------



## Ceadderman

Quote:


> Originally Posted by *m0n4rch*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DNMock*
> 
> Can someone help me out a bit here?
> 
> The difference between a 980ti and a Fury X is larger at lower resolutions because the Nvidia cards are not bandwidth bottlenecked at those resolutions. The Fury X closes the gap at 4k due to the 980ti/Titan X facing bandwidth bottlenecks at 4k.
> 
> Is this correct?
> 
> If so I don't understand how overclocking the memory on the Fury X would give it a boost at lower res like 1080p and 1440p. If 396 GB/S is more than enough, I don't see how pushing a card that already has 500+ GB/S worth of bandwidth even higher would have any notable change at all. Now at 4k, or 8k or whatever, I could definitely see this having a positive impact, but not at lower resolutions.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slink3Slyde*
> 
> Someone mentioned somewhere here that AMD doesnt Do so well at the lower resolutions because of the bad Dx11 CPU overhead on the drivers.
> 
> Dont know how true that is, I always thought it was memory bandwidth same as you.
> 
> 
> I did the API overhead test with 290X and 970 in the same system.
> 
> 
> 
> 
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.

I don't seem to recall an issue with either my 5770 or my 6870 and DX11.









~Ceadder


----------



## blue1512

Quote:


> Originally Posted by *m0n4rch*
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.


I think it would be the opposite. Even with that DX11 handicap they traded blows with the green team. Given that DX12 is knocking and going to remove that disadvantage, imagine how awesome the red team will become in the near future.


----------



## Blackops_2

Quote:


> Originally Posted by *DNMock*
> 
> Someone needs to just gut it out and bust out the multi-meter and pencil-mod to put that extra voltage on them to see what kind of overclock they can swing lol


Hear hear!


----------



## fewness

I think I have restarted over 50 times now...since I saw the bug yesterday...never worked even once....









Is there a special version/language/whatever of CCC for this bug?


----------



## fewness

Quote:


> Originally Posted by *ozyo*
> 
> backup working bios
> switch to broken bios without restarting/powering off pc
> flash it
> restart


Thank you! What program should I use for this? I'm quite familiar with Nvidia bios flash but not so much with AMD. I'm assuming the program should work in windows, right?...based on the process you described.


----------



## Mtom

Haven't seen it posted yet: it seems that if you tick "unofficial overclocking mode - enabled" in MSI Afterburner, it lets you set the memory clock.
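For anyone hunting for the setting: from what I remember it lives in `MSIAfterburner.cfg` next to the Afterburner executable. Treat this as a sketch - the exact section and key names may vary by version, and the EULA sentence has to match what your version expects:

```ini
[ATIADLHAL]
; Type the disclaimer sentence exactly as Afterburner's docs state it;
; this is the gist from memory:
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = disabled, 1 = unofficial mode with PowerPlay, 2 = without PowerPlay
UnofficialOverclockingMode = 1
```

Restart Afterburner after editing and the extended sliders should appear if the driver cooperates.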


----------



## Mtom

Quote:


> Originally Posted by *m0n4rch*
> 
> I did the API overhead test with 290X and 970 in the same system.
> 
> 
> 
> 
> 
> And I'm not buying another AMD GPU until they fix their DX11 driver. I'm sorry for intruding this thread despite the fact that I don't own a Fury, but the issue with DX11 is very real.


Well, you won't have to wait too long, as they're completely reworking the drivers right now. 15.200 is the first leak of it, 15.300 will be the next, and on W10 they're already gaining hugely.


----------



## ECPowers

Well after some fiddling with MSI Afterburner I did manage to get the memory overclocked


----------



## HoZy

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Any Fury owners have a Heaven run at these settings?


Curious as well.

Would love some results,
Cheers.


----------



## ECPowers

Quote:


> Originally Posted by *HoZy*
> 
> Curious as well.
> 
> Would love some results,
> Cheers.


Here you go, Fury overclocked to 1090/540.



Fury overclocked to 1092/570



System:

Intel Core I7 4790k @ 4.7GHz
Asus Maximus Impact VII
Noctua NH-D15
Kingston Fury Red 2x8GB 1866MHz
MSI Radeon R9 Fury X (15.200.1040 modded)
Samsung XP941 128GB
Samsung 850EVO 1TB
Cooltek W1 (Jonsbo W1)
XFX Pro1000W Limited Black Edition
Windows 10 10130 Preview


----------



## HoZy

Interesting results, Fury X is not as impressive as I first thought.

Wasn't sure how to overlay my GPU-Z window on the results without Photoshop, so the clocks are visible in the top right.

This is with a single *290x* @ 1200/1500 (Disabled my 2nd one)



Cheers
Mat

PC:

i7 4790k @ 4.7ghz
MSI Z97 Gaming 7
G.Skill Trident 2400mhz
XFX 290x with STILT bios & Strapped memory timings @ 1200/1500
Samsung pro 850 128gb
Custom watercooling loop
Coolermaster Silent pro 850w PSU
Windows 8.1


----------



## ECPowers

Maybe we need someone who can test it with the 15.15 release driver, as I'm using the modded Catalyst 15.2 Windows 10 preview driver. Seeing your max FPS, I'm guessing there's still much improvement left for the Fury X driver-wise.


----------



## Clockster

Quote:


> Originally Posted by *HoZy*
> 
> Interesting results, Fury X is not as impressive as I first thought.
> 
> Wasn't sure how to overlay my gpuz & results without photoshop so clocks are visible in top right.
> 
> This is with a single *290x* @ 1200/1500 (Disabled my 2nd one)
> 
> 
> 
> Cheers
> Mat


Yeah, because synthetic benchmarks are the be-all and end-all, lol.

Fury X is meant for 4K gaming.


----------



## Orthello

Quote:


> Originally Posted by *HoZy*
> 
> Interesting results, Fury X is not as impressive as I first thought.
> 
> Wasn't sure how to overlay my gpuz & results without photoshop so clocks are visible in top right.
> 
> This is with a single *290x* @ 1200/1500 (Disabled my 2nd one)
> 
> 
> 
> Cheers
> Mat


Fury is meant to be a lot faster in tessellation than the 290X, so that is interesting - this bench is heavy on tessellation, and I would have expected more gain for Fury in Heaven 4.0. Still, that's a 25% gap at 1080p; considering the 10% clock lead you have there, you're still about 15% behind if the other factors can be considered the same/similar.

It's also 1080p, so I think as you step up the res you will see the Fury X pull away... would be interesting to see the same benches at 4K.

Interesting point others have made about max FPS... definitely some driver catch-up required there.
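The clock-for-clock adjustment above can be made explicit. A tiny sketch - the fps and clock figures here are illustrative assumptions, not measured numbers from this thread:

```python
# Compare two cards per MHz to strip out a clock-speed advantage.
def clock_for_clock_gap(fps_a, clk_a_mhz, fps_b, clk_b_mhz):
    """How much faster card A is than card B once both are normalized
    to the same core clock (fraction, e.g. 0.15 = 15%)."""
    return (fps_a / clk_a_mhz) / (fps_b / clk_b_mhz) - 1

# Hypothetical example: Fury X at 68 fps / 1050MHz vs an OC'd 290X
# at 54 fps / 1200MHz.
gap = clock_for_clock_gap(68, 1050, 54, 1200)
print(f"~{gap:.0%} faster clock-for-clock")
```

Dividing by clock assumes performance scales linearly with core frequency, which is only roughly true, so treat the result as a ballpark.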


----------



## Thoth420

Yay! Too bad it's going in a fresh build and I await other hardware...


----------



## blue1512

Quote:


> Originally Posted by *ECPowers*
> 
> Maybe we need someone who can test it with the 15.15 release driver? As i'm using the modded Catalyst 15.2 Windows 10 preview driver. Seeing your max FPS i'm guessing there's still much improvement left for Fury X driver-wise.


The Win10 driver is not for the Fury X, I'm afraid. Fury X Heaven 1080p is around 78fps - just google it.


----------



## Minotaurtoo

I'm sure it will be soon enough.. same stuff happens with a lot of new releases... it takes time for the software to catch up.. I mean amd's 8 core cpus were absolutely useless in gaming until recently and now the tech is so old and behind it's not funny... even when they were new, they didn't keep up.. at least these keep up... (btw I'm not picking on amd's 8 cores... I actually own one... love it, but its just not on par with intel core for core and clock for clock)


----------



## Gumbi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I'm sure it will be soon enough.. same stuff happens with a lot of new releases... it takes time for the software to catch up.. I mean amd's 8 core cpus were absolutely useless in gaming until recently and now the tech is so old and behind it's not funny... even when they were new, they didn't keep up.. at least these keep up... (btw I'm not picking on amd's 8 cores... I actually own one... love it, but its just not on par with intel core for core and clock for clock)


AMD's Cpus have pretty much never been good for gaming, only in very rare cases are they any good.


----------



## Blackops_2

Quote:


> Originally Posted by *Gumbi*
> 
> AMD's Cpus have pretty much never been good for gaming, only in very rare cases are they any good.


You mean Bulldozer, and if they're clocked high enough they're fine - they just have to be at 4.5GHz+, or the game has to be heavily threaded.

A64 in its time frame was great for gaming.


----------



## josephimports

Quote:


> Originally Posted by *ECPowers*
> 
> Well after some fiddling with MSI Afterburner I did manage to get the memory overclocked
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Thanks.







Memory slider is now unlocked up to 650MHz.


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *josephimports*
> 
> 
> Thanks.
> 
> 
> 
> 
> 
> 
> 
> Memory slider is now unlocked up to 650MHz.
> 
> 
> Spoiler: Warning: Spoiler!


any coil whine and annoying sound from the pump?

Edit: BTW, my two 290s at stock get 4500 in graphics with that bench.


----------



## Gumbi

Quote:


> Originally Posted by *Blackops_2*
> 
> You mean bulldozer and if they're clocked high enough they're fine they just have to be at 4.5+ or the game has to be heavily threaded.
> 
> A64 in its time frame was great for gaming.


For sure, I'm just talking about the more recent chips. Had a Phenom II myself and it served me well. Just their current offerings are power whores and get spanked all day clock for clock by Intel.


----------



## josephimports

Quote:


> Originally Posted by *rdr09*
> 
> any coil whine and annoying sound from the pump?
> 
> ediT: BTW, my 2 290s at stock get 4500 at stock in graphics with that bench.


See my previous post here.


----------



## rdr09

Quote:


> Originally Posted by *josephimports*
> 
> See my previous post here.


Thanks.


----------



## ECPowers

Quote:


> Originally Posted by *rdr09*
> 
> any coil whine and annoying sound from the pump?
> 
> ediT: BTW, my 2 290s at stock get 4500 at stock in graphics with that bench.


My MSI Fury X has both the annoying pump noise and light coil whine, although the pump noise actually seems to go away once the card is under load.


----------



## Noviets

How much performance do you think is lacking from not having its own optimised driver?

I was waiting to see what the performance numbers were to decide between the 980ti or Fury X (same price here about $1050 AUD)

Willing to wait if the drivers are going to make a huge difference, but will they?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Noviets*
> 
> How much performance do you think is lacking from not having its own optimised driver?
> 
> I was waiting to see what the performance numbers were to decide between the 980ti or Fury X (same price here about $1050 AUD)
> 
> Willing to wait if the drivers are going to make a huge difference, but will they?


Hopefully I'll have mine in a couple of weeks' time, so I'll let you know; we've got pretty similar rigs, so that'll give you a realistic idea as well









(i ordered mine from Amazon, worked out to be over $100 cheaper than PCCG)


----------



## Casey Ryback

Quote:


> Originally Posted by *Noviets*
> 
> 1.How much performance do you think is lacking from not having its own optimised driver?
> 
> 2. I was waiting to see what the performance numbers were to decide between the 980ti or Fury X (same price here about $1050 AUD)
> 
> 3. Willing to wait if the drivers are going to make a huge difference, but will they?


1. Exactly 12.65%









2. The performance numbers are out.

3. Let me check my crystal ball...................

Seriously, all your questions can only be answered with speculation and guessing; basing your purchase decision on them would be ridiculously naive.

I see you have a 144hz 1080p display so the 980ti is by far the purchase for you at this point.


----------



## Digitalwolf

Quote:


> Originally Posted by *HoZy*
> 
> Interesting results, Fury X is not as impressive as I first thought.
> 
> Wasn't sure how to overlay my gpuz & results without photoshop so clocks are visible in top right.
> 
> This is with a single *290x* @ 1200/1500 (Disabled my 2nd one)
> 
> 
> 
> Cheers
> Mat
> 
> PC:
> 
> i7 4790k @ 4.7ghz
> MSI Z97 Gaming 7
> G.Skill Trident 2400mhz
> XFX 290x with STILT bios & Strapped memory timings @ 1200/1500
> Samsung pro 850 128gb
> Custom watercooling loop
> Coolermaster Silent pro 850w PSU
> Windows 8.1


I run Heaven with those settings on all my current cards: 290X, 970 and Fury X. When I look at that screenshot, all I see is that max FPS, and I wonder... what is up with that?


----------



## blue1512

Quote:


> Originally Posted by *Digitalwolf*
> 
> I run heaven with those settings on all my current cards. 290x, 970 and Fury X.. when I look at that screen shot all I see is that Max FPS and wonder... what is up with that.


Google "Stilt BIOS" and you will get the idea. Every GPU miner knows about his optimized BIOSes for Tahiti and Hawaii.


----------



## Blackops_2

Quote:


> Originally Posted by *Gumbi*
> 
> For sure, I'm just talking about the more recent chips. Had a Phenom II myself and it served me well. Just their current offerings are power whores and get spanked all day clock for clock by Intel.


Agreed, I probably should've inferred you were talking about the most recent ones.


----------



## ozyo

Quote:


> Originally Posted by *fewness*
> 
> Thank you! What program should I use for this? I'm quite familiar with Nvidia bios flash but not so much with AMD. I'm assuming the program should work in windows, right?...based on the process you described.


DOS/Windows, or even from the BIOS itself if you're trying to fix a motherboard BIOS.
As for how to flash a Fury? I'm not sure.
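For what it's worth, AMD cards are usually flashed with ATIFlash (atiflash.exe under Windows, atiflash under DOS). Whether it supports Fiji yet is another question, so treat this as a sketch of the usual Hawaii-era procedure rather than a confirmed Fury recipe - and always save a backup first:

```
atiflash -i                 # list detected adapters and their indices
atiflash -s 0 backup.rom    # save the current BIOS of adapter 0 first
atiflash -p 0 new.rom       # program new.rom to adapter 0
atiflash -f -p 0 new.rom    # -f forces past device-ID/SSID mismatch checks
```

Only use -f if you're sure the ROM is for your exact board; a bad flash on a card with no dual-BIOS switch is a brick.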


----------



## Slink3Slyde

Quote:


> Originally Posted by *Ganf*
> 
> Meh, he just needs to fix his card.
> 
> http://www.3dmark.com/aot/15644
> 
> Disregarding the fact that the API test is borked and has been since day one. Right now half the time it tells me that my card doesn't support DX11.
> 
> People need better complaints.


Don't believe I had one. I was actually fishing for someone to confirm the reason GCN cards seem to do better at higher resolutions in general - I always thought it was the memory. But the 280X catches up with the 780 more at 1440p and 4K, for example, and both are 384-bit, 3GB cards, so that would suggest overhead to me. Never really considered it before; I always thought it was memory bandwidth.

Sorry for the OT in the owners club - I'm subscribed to so many Fury threads I forgot which this was


----------



## Agent Smith1984

Quote:


> Originally Posted by *ECPowers*
> 
> Maybe we need someone who can test it with the 15.15 release driver? As i'm using the modded Catalyst 15.2 Windows 10 preview driver. Seeing your max FPS i'm guessing there's still much improvement left for Fury X driver-wise.


Thanks for posting....

I am getting around 1550 points in Heaven myself, with avg of 61.8.

Don't put too much stock in the min/max on Heaven.

It has a few spots that dip sometimes, and a few places where it shoots up, though I have not seen any FPS as high as what HoZy posted.

Mine tops at around 128.

So far, the math on every benchmark I have seen points to the Fury X being about 10-15% faster than Hawaii at 1080p (depending on clock speeds).

It's at 4k where the gap widens to about 20%....

Still a bit more than I can chew right now at 50% more $$$ than my 390, but I still think there is a bigger picture for the Fiji as new drivers and Win10 draw near.


----------



## blue1512

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for posting....
> 
> I am getting around 1550 points in Heaven myself, with avg of 61.8.
> 
> Don't put too much stock in the min/max on Heaven.
> 
> It has a few spots that dip sometimes, and a few places where it shoots up, though I have no seen any FPS as high as what HoZy posted.
> 
> Mine tops at around 128.
> 
> So far all the math on everything benchmark I have seen at 1080P points to the Fury X being about 10-15% faster than Hawaii (depending on clock speeds) at 1080P
> 
> It's at 4k where the gap widens to about 20%....
> 
> Still a bit more than I can chew right now at 50% more $$$ than my 390, but I still think there is a bigger picture for the Fiji as new drivers and Win10 draw near.


1. HoZy used a modded BIOS, hence that strange behaviour.
2. Fury X avg in Heaven should be around 78fps with its *launch driver, 15.15*


----------



## bonami2

Any idea, guys, if the 4GB of HBM is lacking in some games? I know the bandwidth and all reduces the need to swap data out, but games are coded like crap, so I'm afraid I may need to wait for an 8GB GPU.

5760x1080 here - I need at least 2x the power of my 7950, and if I could get it, 4x with a CrossFire option in some games


----------



## Agent Smith1984

Quote:


> Originally Posted by *blue1512*
> 
> 1. HoZy used a modded BIOS, hence that strange behaviour.
> 2. FuryX avg in Heaven should be around 78fps with its *launched driver, 15.15*


The BIOS mod explaining the max FPS makes sense, but it doesn't seem to help the overall score, as I am getting a little higher.

The Fury X does get around 77-78 FPS on 15.15 in Heaven, but not with the custom settings I had requested.
If I use the standard 1080P setting used by most bench sites, I get around 68FPS, so the gap is still about the same at 1080P.

I'm surprised the 295X2's aren't flying off the shelves right now.

Besides the power, it's a cheaper, single card water cooled solution.
Again, I am really curious to see what's in store.

Good to see Fury X able to get the HBM overclocked. Looking forward to seeing if Fiji has some OC headroom.


----------



## HoZy

The Fury X definitely pulls away for 4K gaming. Been playing all evening with my 290X's and BIOS tweaks.

Best I can push from 1x Card on *Firestrike Ultra* is 2928 & on crossfire 5207.

This is with a 4790k @ 4.6GHz right this second; my "temporary" 850W PSU HATES life if I try to run everything at desired clocks.

Unsure how much it'd change if it was back at 5ghz as over 4.8 I don't notice significant differences.

Cheers
Mat


----------



## sugarhell

Can anyone try this?

http://forums.guru3d.com/showpost.php?p=5111179&postcount=11


----------



## josephimports

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Any Fury owners have a Heaven run at these settings?
> 
> 
> Spoiler: Warning: Spoiler!


Stock Sapphire Fury X. Heaven custom settings.


Spoiler: Warning: Spoiler!







Fury OC 1100/550


Spoiler: Warning: Spoiler!


----------



## blue1512

Quote:


> Originally Posted by *josephimports*
> 
> Stock Sapphire Fury X. Heaven custom settings.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Fury OC 1100/550
> 
> 
> Spoiler: Warning: Spoiler!


The improved score mostly comes from OC'ing the core, it seems. Going by the rumours of 625MHz HBM before launch, the memory should be OC'd into the 600-650MHz zone to see the benefit.
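The bandwidth stakes of that memory slider are easy to quantify. A quick sketch using the Fury X's published figures (4096-bit bus, double data rate, 500MHz stock clock):

```python
# Peak HBM bandwidth as a function of the memory clock.
def hbm_bandwidth_gbs(mem_mhz, bus_width_bits=4096, data_rate=2):
    """Bus width in bytes x effective transfer rate, in GB/s."""
    return (bus_width_bits / 8) * (mem_mhz * 1e6 * data_rate) / 1e9

for mhz in (500, 550, 650):
    print(f"{mhz}MHz -> {hbm_bandwidth_gbs(mhz):.0f} GB/s")
```

So the stock 500MHz already gives the quoted 512GB/s, and a 650MHz OC would be roughly 666GB/s - a 30% bump, if the HBM actually scales that far.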


----------



## xer0h0ur

http://wccftech.com/amd-radeon-r9-fury-front-plate-custom-3d-model/

AMD apparently wants to encourage customization of your Fury X. This makes perfect sense along with locking out AIBs from modifying your reference design....perfect sense.....NOT


----------



## pdasterly

Quote:


> Originally Posted by *en9dmp*
> 
> 
> 
> Got a lovely 290x with red backplate for sale of anyone wants it!


Did you buy that on ebay earlier this year?


----------



## Naennon

Quote:


> Originally Posted by *ECPowers*
> 
> Fury overclocked to 1092/570
> 
> System:
> 
> Intel Core I7 4790k @ 4.7GHz
> Asus Maximus Impact VII
> Noctua NH-D15
> Kingston Fury Red 2x8GB 1866MHz
> MSI Radeon R9 Fury X (15.200.1040 modded)
> Samsung XP941 128GB
> Samsung 850EVO 1TB
> Cooltek W1 (Jonsbo W1)
> XFX Pro1000W Limited Black Edition
> Windows 10 10130 Preview


overclockers dream? like no tomorrow? ***..


----------



## Agent Smith1984

Quote:


> Originally Posted by *blue1512*
> 
> The improved score mostly come from OC'ing the core it seems. From the rumours of 625MHz HBM before launch, the mem should be OC to 600-650 MHz zone would see the benefit.


Looks like Fury X puts the "power to the ground" when you overclock.

Nice to see!


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like Fury X puts the "power to the ground" when you overclock.
> 
> Nice to see!


Uh, your daddy gamer build - did you do any testing to see the VRM temps under those voltages? I'm sure they were throttling like mad. Just asking because, well, I'm afraid for the lifespan.


----------



## xer0h0ur

Quote:


> Originally Posted by *Naennon*
> 
> overclockers dream? like no tomorrow? ***..


We don't have control of the voltage yet?


----------



## Agent Smith1984

Quote:


> Originally Posted by *bonami2*
> 
> Uh your daddy gamer build did you have any test done to see the vrm temp under those voltage im sure they are throttling like mad ? Just asking because well im affraid of the lifespan


About 85C core / 88C VRM for the top card, and 77C core / 81C VRM for the bottom.

Cards ran great and never throttled... even ran them at 1100/1500 for a bit with +50mV and still never throttled, though the top card would get up to around 88C core / 92C VRM.

Those 2x 290's I paid about $200 a piece for were slapping graphics around like pimped hoes!!

Only went 390 for the aesthetics and additional VRAM.

Adding a second in a few weeks.


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> About 85C core/88c VRM for the top card, and 77C core/81c VRM for the bottom.
> 
> Cards ran great and never throttled.... even ran them at 1100/1500 for a bit with 50mv+ and still never throttled, though the top card would get up to around 88C/92C VRM
> 
> Those 2x 290's I paid about $200 a piece for were slapping graphics around like pimped hoes!!
> 
> Only went 390 for the aesthetics and additional VRAM.
> 
> Adding a second in a few weeks.


Oh, I'm talking about the CPU.

Great temps. I'm at 90°C too on my 7950 at 60% fan speed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bonami2*
> 
> Oh, I'm talking about the CPU.
> 
> Great temps. I'm at 90°C too on my 7950 at 60% fan speed.


Nah, no throttling on the CPU...

240mm AIO water cooling, with an 80mm VRM fan and a 120mm socket fan on the back... lol


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Na, no throttling on CPU....
> 
> 240mm AIO water cooling, with 80mm VRM and 120mm socket fan on the back..... lol


Ahh, that explains it. Without fans, those things would melt.


----------



## snow cakes

Cool, you can download Fury drivers already.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> http://wccftech.com/amd-radeon-r9-fury-front-plate-custom-3d-model/
> 
> AMD apparently wants to encourage customization of your Fury X. This makes perfect sense along with locking out AIBs from modifying your reference design....perfect sense.....NOT


I saw that and was thinking the same... also great avatar by the way









In other news... really hoping I don't have one with a loud pump. I plan on following the installation specifications to the letter, and if it still makes a sound that annoys me in a Fractal R5 then it's definitely going in for RMA. Guess I'll be running a crap card from Best Buy until that happens and then returning it... I love abusing Best Buy and their return policy.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I saw that and was thinking the same... also great avatar by the way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other news...really hoping I don't have one with a loud pump. I plan on following the specifications for installation to the letter and if it still makes sound that annoys me in a fractal r5 then it's def going in for RMA. Guess I will be running a crap card from best buy til that happens and then returning it....I love abusing Best Buy and their return policy.


For what it's worth, Cooler Master claims they have resolved the issue and that the pump noise only affects the first batch of cards. They claim new cards will not have this anymore.

"AMD's Antal Tungler has confirmed that the problem exists in early production units. However a fix (for the pump whine) has been applied by Cooler Master USA and it is hoped that the problem has been resolved for future R9 Fury X units."

http://wccftech.com/amd-radeon-fury-x-reportedly-suffering-buzzing-coil-whine/

As for the avatar, I am a huge GoT fan. I have a feeling that sneaky SOB is going to wind up being one of the most powerful men in the end. He certainly has the long game in full effect right now.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> For what its worth, Cooler Master claims that they have resolved the issue and the pump noise is only affecting the first batch of cards. They claim new cards will not have this anymore.
> 
> "AMD's Antal Tungler has confirmed that the problem exists in early production units. However a fix (for the pump whine) has been applied by Cooler Master USA and it is hoped that the problem has been resolved for future R9 Fury X units."
> 
> http://wccftech.com/amd-radeon-fury-x-reportedly-suffering-buzzing-coil-whine/
> 
> As for the avatar, I am a huge GoT fan. I have a feeling that sneaky SOB is going to wind up being one of the most powerful men in the end. He certainly has the long game in full effect right now.


I got my card on US release so it is first batch AFAIK









Plan on building this weekend at the latest, and I don't have the stuff to test her out now, even though she is sitting here.


Edit: Also, in regard to your comment on GoT... I made the same remark to my girlfriend (just got her into the show and she is hooked) last night. He is one sly fox and appears to have a major "long con" endgame, as opposed to just the typical desire for consolidated control like most of the more... "intelligent" characters in the series. I have not read the books; figured I should point that out.


----------



## Yorkston

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202155

Sapphire card in stock at the egg. Have at it gents.


----------



## xer0h0ur

For people experiencing coil whine, I suggest leaving your rig on over a long period of time in the menu of a game that can get you insanely high fps when you uncap it. I just used Counter-Strike: Global Offensive: opened the console, set FPS_MAX 0, and the framerate was in the thousands. Left it overnight for a couple of nights and boom, the coil whine on my 295X2 was gone.


----------



## Clockster

So I just finished my 1st card install. 2nd card coming in the next few weeks.

http://www.techpowerup.com/gpuz/details.php?id=dwpq


----------



## hyp36rmax

Quote:


> Originally Posted by *Thoth420*
> 
> I got my card on US release so it is first batch AFAIK
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Plan on building this weekend at latest and don't have stuff to test her out now even though she is sitting here.
> 
> 
> Edit: Also in regard to your comment on GoT....I made the same remark to my girlfrien(just got her into the show and she is hooked) last night. He is one sly fox and appears to have a major "long con" endgame....opposed to just the typical desire for consolidated control like most of the more...."intelligent" characters in the series. I have not read the books figure I should point that out.


Welcome to the club!









Quote:


> Originally Posted by *Yorkston*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202155
> 
> Sapphire card in stock at the egg. Have at it gents.


So tempted to pull that trigger. If only I wasn't upgrading the suspension on my car. Custom Öhlins or FURY X!!! Hahaha

Quote:


> Originally Posted by *xer0h0ur*
> 
> For people experiencing coil whine I suggest leaving your rig on over a long period of time in the menu for a game that can get you insane high fps when you uncap it. I just used Counter Strike Global Offensive, opened console, FPS_MAX 0 and the framerate was in the thousands. Left it overnight for a couple of nights and boom coil whine on my 295X2 was gone.


This is basically the way to minimize coil whine, and it has worked for many! Great suggestion!


----------



## hyp36rmax

Quote:


> Originally Posted by *Clockster*
> 
> So I just finished my 1st card install. 2nd card coming in the next few weeks.
> 
> http://www.techpowerup.com/gpuz/details.php?id=dwpq


Welcome to the club!


----------



## hyp36rmax

*It's Live! Click Here for registration or this shortcut!*

Bring it! Welcome to the club!


----------



## en9dmp

Quote:


> Originally Posted by *pdasterly*
> 
> Did you buy that on ebay earlier this year?


Nah, bought it late last year from ebuyer in the UK and put the block and plate on...


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> For people experiencing coil whine I suggest leaving your rig on over a long period of time in the menu for a game that can get you insane high fps when you uncap it. I just used Counter Strike Global Offensive, opened console, FPS_MAX 0 and the framerate was in the thousands. Left it overnight for a couple of nights and boom coil whine on my 295X2 was gone.


Great advice; it will weed out whether it's your PSU or dirty power from the wall that's the root of the cause, a lot of the time. I used the Hitman: Absolution start menu with V-Sync off, refresh maxed for the display and in game (it has no fps cap), and motion constantly going... not that burn-in is an issue for most people. This produces insane coil whine.

If it doesn't subside, then you probably need a better PSU and/or an AVR. I use a CyberPower 1500VA UPS that has an AVR because my area suffers terrible brownouts and my house wiring is a bit old and badly implemented.


----------



## xer0h0ur

Its actually sad that coil whine has become so commonplace in these expensive enthusiast class video cards. I don't understand why they wouldn't just go with better chokes to avoid the bad PR coil whine draws.


----------



## Minotaurtoo

Crap crap crap crap... now Amazon is showing that they don't even know if the Sapphire Fury X is going to be back in stock, and mine that I ordered hasn't shipped yet... crap crap crap crap...









http://www.amazon.com/gp/product/B01012TLSS?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00


----------



## p4inkill3r

My status hasn't changed:

Arriving Mon, Jul 13 - Wed, Jul 15
Not yet shipped
Track package

Sapphire Radeon R9 Fury X 4GB HBM HDMI / TRIPLE DP PCI-Express Graphics Card 21246-00-40G
Sold by: Amazon.com LLC
$649.99


----------



## ozyo

any update on pro release date ?


----------



## bonami2

Quote:


> Originally Posted by *Thoth420*
> 
> Great advice; it will weed out whether it's your PSU or dirty power from the wall that's the root of the cause, a lot of the time. I used the Hitman: Absolution start menu with V-Sync off, refresh maxed for the display and in game (it has no fps cap), and motion constantly going... not that burn-in is an issue for most people. This produces insane coil whine.
> 
> If it doesn't subside, then you probably need a better PSU and/or an AVR. I use a CyberPower 1500VA UPS that has an AVR because my area suffers terrible brownouts and my house wiring is a bit old and badly implemented.


Well, I don't think it has anything to do with the PSU or wall power. My 7950 changed homes three times, in different cities, from a new building to an old one, and it still whines if I'm close enough.

I just think it's about high current through undersized parts. Never heard of coil whine from Lightning or Matrix users; well, maybe they have it too, there just aren't a lot of buyers, haha.


----------



## xer0h0ur

Quote:


> Originally Posted by *ozyo*
> 
> any update on pro release date ?


Far as I know it hasn't changed. Still July 14th.

The R9 Nano is the one we have absolutely no date for.


----------



## Gdourado

Quote:


> Originally Posted by *xer0h0ur*
> 
> Far as I know it hasn't changed. Still July 14th.
> 
> The R9 Nano is the one we have absolutely no date for.


Any specs released?
How far from the S will it be?


----------



## xer0h0ur

I honestly can't say with any certainty whether AMD has confirmed the specs of the card, but it's commonly believed to be a cut-down Fury X with 3584 SPs at about 1GHz, likely still packing 64 ROPs but sporting a lower TMU count. We really don't have any specs on the R9 Nano, though, other than it being touted as having fantastic performance (beats the 390X) at low power consumption, in a pipsqueak package with only a puny fan needed to cool it.


----------



## Minotaurtoo

Quote:


> Originally Posted by *p4inkill3r*
> 
> My status hasn't changed:
> 
> Arriving Mon, Jul 13 - Wed, Jul 15
> Not yet shipped
> Track package
> 
> Sapphire Radeon R9 Fury X 4GB HBM HDMI / TRIPLE DP PCI-Express Graphics Card 21246-00-40G
> Sold by: Amazon.com LLC
> $649.99


Yeah, my status is the same... it's just scary that the page for the card says they don't know when they will get any more in... I mean, holy crap, since mine hasn't shipped, does that mean I won't be getting it for months?


----------



## rv8000

Quote:


> Originally Posted by *Minotaurtoo*
> 
> yeah, my status is the same... its just scary that the page for the card says they don't know when they will get any more in.... I mean holy crap since mine hasn't shipped does that mean I won't be getting it for months?


This is why I would never order a GPU on release day from Amazon, all sorts of shenanigans going on there.


----------



## DividebyZERO

Is it me, or is newegg.com not working? I've tried on Comcast and Verizon. The main page loads, but I cannot go anywhere else.

Edit: NM, it's working now and everything is sold out, Fury-wise.


----------



## Yorkston

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121969

Asus one is in stock as of writing


----------



## bonami2

Wait, it's a Cooler Master pump?

Damn, why not Asetek? They're the king of CPU pumps.

http://www.gamersnexus.net/industry/1809-asetek-vs-coolit-liquid-cooling-market-shrinks Lawsuits against all of those off-brand makers: Cooler Master, Swiftech, etc.


----------



## ban25

Quote:


> Originally Posted by *Yorkston*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121969
> 
> Asus one is in stock as of writing


Bought, thanks!


----------



## Thoth420

Quote:


> Originally Posted by *bonami2*
> 
> Well, I don't think it has anything to do with the PSU or wall power. My 7950 changed homes three times, in different cities, from a new building to an old one, and it still whines if I'm close enough.
> 
> I just think it's about high current through undersized parts. Never heard of coil whine from Lightning or Matrix users; well, maybe they have it too, there just aren't a lot of buyers, haha.


I will give you there are times when there is no solution or no easily perceivable one. I did however solve mine with past cards using an AVR and high quality PSUs.


----------



## nubleh

Quote:


> Originally Posted by *rv8000*
> 
> This is why I would never order a GPU on release day from Amazon, all sorts of shenanigans going on there.


I don't think that's a big issue with Amazon, though; your credit card isn't charged until the item is shipped, so you're free to cancel if another model shows up in stock.

That's what I did: I had ordered Sapphire's, but I managed to snag VisionTek's model when it came in stock. I cancelled Sapphire's right after.

VisionTek's Fury X has shipped and will arrive in a couple of days.


----------



## ozyo

Sapphire in stock:
http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-21246-00-40G/dp/B01012TLSS/ref=sr_1_1?s=electronics&ie=UTF8&qid=1435731445&sr=1-1&keywords=Sapphire+Radeon+R9+Fury+X+4GB+HBM+HDMI+%2F+TRIPLE+DP+PCI-Express+Graphics+Card+21246-00-40G
No international shipping


----------



## ban25

Quote:


> Originally Posted by *ozyo*
> 
> sapphire in stock
> http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-21246-00-40G/dp/B01012TLSS/ref=sr_1_1?s=electronics&ie=UTF8&qid=1435731445&sr=1-1&keywords=Sapphire+Radeon+R9+Fury+X+4GB+HBM+HDMI+%2F+TRIPLE+DP+PCI-Express+Graphics+Card+21246-00-40G
> no internal shipping


"Shipped and Sold by RackGo"? Hmm...


----------



## en9dmp

Can anyone find any info about the EK blocks' availability? I did some searches but can't find anything recent... EK's page touts availability as end of June, yet it's now July and I can't see any updates.


----------



## xer0h0ur

Quote:


> Originally Posted by *en9dmp*
> 
> Can anyone find any info about the EK blocks availability? I did some searches but can find anything recent... Ek's page touts availability as end of June, yet it's now July and can't see any updates


http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


----------



## blue1512

Quote:


> Originally Posted by *bonami2*
> 
> What it a coolermaster pump?
> 
> Damn why not asetek they are the king of cpu pump
> 
> http://www.gamersnexus.net/industry/1809-asetek-vs-coolit-liquid-cooling-market-shrinks Lawsuit against all of those offbrand coolermaster Swiftech etc


Only CM's square block can fit Fiji and the HBM, I'm afraid.


----------



## en9dmp

Quote:


> Originally Posted by *xer0h0ur*
> 
> http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block


Nice one, thanks; don't know how I missed that... It is a thing of beauty and looks like it does a significantly better job on temps... I saw infrared shots of the stock cooler in the Tom's Hardware review, and the temps by the PCI slot were approaching 90°C! Not that surprising, really, when the whole thing is encased in a rubber brick...


----------



## blue1512

It's not rubber by the way


----------



## snow cakes

so the x2 FIji is being released sometime in September?


----------



## HoZy

Quote:


> Originally Posted by *snow cakes*
> 
> so the x2 FIji is being released sometime in September?


If that's the case, Guess my b'day just got a lot more expensive.


----------



## xer0h0ur

Quote:


> Originally Posted by *snow cakes*
> 
> so the x2 FIji is being released sometime in September?


There is no stated month, or date for that matter, for either the R9 Nano or the Fury X2. As a matter of fact, the only thing they even mention the Fury X2 in is that Project Quantum thing, and I'm not too sure whether they're going to go forward with it or not.


----------



## royfrosty

Hi guys,

Anyone managed to pull slightly more than +100MHz on the core clock?


----------



## xer0h0ur

We are still limited by no control over the voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> We are still limited by no control over the voltage.


Isn't the biggest worry.... will there ever be voltage control??









I would guess yes... but no one is certain as to whether or not it's hard-locked yet, correct?

If the core clocked the way AMD touted it would before launch... I'm not so sure anyone would care too much about the voltage...

If it's anything like Hawaii, even with voltage unlocked, the scaling of core frequency versus required voltage will drastically fall off after 1150 anyway.

I'd really like to see Fury X with some 1200-1250 core clocks. I'm guessing that at those clocks, even with a gimped driver, it could be slapping the 980 Ti around, even at lower resolutions.


----------



## xer0h0ur

Yup, I have heard no direct confirmation of whether voltage is locked on Fury X or not. If AMD locked it after calling this card an overclocker's dream then they really doubled down on ******ed after locking out AIBs from modifying the reference design.


----------



## royfrosty

Well, I hope they come out with something that allows us to increase the voltages. If they locked it, shame on them.

We don't seem to be limited by temps. Most of the failed OCs, from what I have tested, were just driver crashes.

Hope they do come out with something to allow us to OC more!


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yup, I have heard no direct confirmation of whether voltage is locked on Fury X or not. If AMD locked it after calling this card an overclocker's dream then they really doubled down on ******ed after locking out AIBs from modifying the reference design.


Yeah, I am still lost as to why they touted this as "an overclocker's dream".

Maxwell is an overclocker's dream... locked voltage, yet 250+ MHz overclocks readily...

I really want this Fury X to be successful, but I'm counting on its criticisms more than anything, so that the next series is more refined.
Locking voltage is a HUGE no-no for AMD...

NVIDIA gets by with it because their chips clock just fine on their own.

If Fiji's voltage is truly locked (and I am very skeptical of this), there are no overclocking dreams to be had at all with this series.
Not to mention, the old "yeah, but when overclocked it beats, or competes with..." conversation can't even be had.

Has anyone seen reviews of Fury X running in the ~1150/550 range against the 980 Ti??
Very curious to see how it impacts the gaming benchmarks, because so far, mild overclocks are greatly impacting synthetics.


----------



## Clockster

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I am still lost as to why they touted this as "an overclocker's dream"
> 
> Maxwell is an overclocker's dream.... locked voltage, 250+ MHz overclocks readily...
> 
> I really want this Fury X to be successful, but am counting on it's criticisms more than anything, so that the next series is more refined.
> Locking voltage is a HUGE no-no for AMD....
> 
> NVIDIA gets by with it because their chips clock just fine on their own.
> 
> If Fiji's voltage is truly locked (and I am very skeptical of this), there is no overclocking dreams to be had at all with this series.
> Not to mention, the old, "yeah, but when overclocked it beats, or competes with....." conversation can't even be had.
> 
> Has anyone seen reviews on Fury X running in the ~1150/550 range against the 980ti??
> Very curious to see how it impacts the gaming benchmarks, because so far, mild overclocks are greatly impacting synthetics.


The funny thing is, I saw legit benchmarks for the Fury X two months or so before launch, and it was clocked higher than the launch clocks.
I'm not quite sure why AMD dropped to lower clocks, though. The card scales incredibly well with overclocking, so I really do hope the voltage gets unlocked.


----------



## hyp36rmax

Quote:


> Originally Posted by *Clockster*
> 
> The funny thing is I saw legit benchmarks for the Fury X 2 months or so before launch and it was clocked higher than the launch clocks.
> Not quite sure why AMD dropped to lower clocks though. The card scales incredibly well with overclocking so I really do hope the voltage gets unlocked.


How's crossfire performance?


----------



## looniam

i am going to leave this here about voltage control . .

http://forums.guru3d.com/showthread.php?t=400333

cheers!


----------



## hyp36rmax

Quote:


> Originally Posted by *looniam*
> 
> i am going to leave this here about voltage control . .
> 
> http://forums.guru3d.com/showthread.php?t=400333
> 
> cheers!


I appreciate the effort...

*Here you go guys. Unwinders thoughts on the issue.*

Quote:


> It is not a question of "strange decision by AMD" at all. It is a question of very limited AMD ADL API. NVIDIA cards simply have unified GPIO/VID based voltage control functions inside NVAPI, so it is very easy and fast to support voltage control on new cards (within driver allowed voltage control range of course) or even provide voltage control for future cards without even seeing them.
> 
> It doesn't apply to AMD. To support voltage control on new cards, developers first need to implement low-level I2C access support for each new GPU family (which can be troublesome for a new GPU architecture), then provide support for each new voltage controller model. That's not a task that can be done without hardware.


Quote:


> There is nothing you can do to speed up development of low-level I2C driver, sorry. However, *I've got hint from one source that AMD could leave driver-level I2C access unlocked for Fiji, which means that even current AB 4.1.1 can easily access VRM if it is really the case.*
> 
> So you could verify it by scanning I2C bus with the following command line switches and posting the results here:
> 
> MSIAfterburner.exe /i2cd


----------



## Agent Smith1984

Quote:


> Originally Posted by *Clockster*
> 
> The funny thing is I saw legit benchmarks for the Fury X 2 months or so before launch and it was clocked higher than the launch clocks.
> Not quite sure why AMD dropped to lower clocks though. The card scales incredibly well with overclocking so I really do hope the voltage gets unlocked.


You gotta wonder if those leaked benchies were at 1100 or whatever, and AMD reduced to 1050 just to leave some overclocking headroom.









I mean, one thing is for certain... overclocking GPUs is very commonplace now.

There are still people on OCN who won't touch their CPU's clock speed because of all the stability issues that can go along with it, but almost everybody will give OC'ing a go on their video cards, even if it's just a tad bit with stock voltage.


----------



## Clockster

Quote:


> Originally Posted by *hyp36rmax*
> 
> How's crossfire performance?


Still waiting for the 2nd card, most likely only getting it by the 14th








Quote:


> Originally Posted by *Agent Smith1984*
> 
> You gotta wonder if those leaked benchies were at 1100 or whatever, and AMD reduced to 1050 just to leave some overclocking headroom
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mean, one thing is for certain... overclocking GPU's is very common place now.
> 
> There are still people on OCN who won't touch their CPU's clock speed because of all the stability issues than can go along with it, but almost everybody will give OC'ing a go on their video cards, even if it's just a tad bit with stock voltage.


I saw legit benchmarks from a friend of mine who works with/for AMD, and the cards were clocked higher.

I will say, though: I haven't changed any settings in any of my games from when my Titan X was in the rig, and running the Fury I haven't had any hiccups or slowdowns, which is great.


----------



## xer0h0ur

People keep claiming that Fury X has terrible frametime issues and stuttering/hitching in games like SoM. Can anyone confirm or refute?


----------



## hyp36rmax

*+OP Added Personalizing your AMD Radeon™ R9 Fury X graphics card*

*Source:* Link



Spoiler: Personalizing your AMD Radeon R9 Fury X graphics card





The AMD Radeon™ R9 Fury X graphics card industrial design was created with the goal of embodying a professional, elegant and simple design. Using multiple pieces of die-cast aluminum finished in black nickel and a soft-touch black, the full metal construction makes the graphics card feel as good as it looks. We are extremely pleased with the outcome of the design but also understand there are always different design ideas out there.

During the process of creating the industrial design for the AMD Radeon™ R9 Fury X graphics card we encountered a variety of unique perspectives within AMD on how it should look. These differentiating opinions made us think, what if we could enable our customers to implement their own creativity on our design? To do this we incorporated a removable front plate on the AMD Radeon™ R9 Fury X graphics card to allow for customer creativity. Below you will find a link to download the 3D model for the face plate to help get you started on designing your own 3D printed or CNC front plate.

Please ensure to take all necessary precautions prior to removing the front plate from the graphics card; these include but are not limited to:


Do not remove the front plate while the graphics card is installed inside a system
Do not remove the front plate while the graphics card is powered or operational
Ensure your workspace is clear of debris and appropriate electrostatic discharge (ESD) protection is taken
The front plate can be removed by removing the four hex screws from the front of the graphics card as illustrated below
Do not remove any other screws or modify any other components on the graphics card

Use a proper hex key or screwdriver to remove the screws from the front plate to avoid damaging the screws
When reinstalling the screws, do not over-tighten them




Be sure to share with us your creations and we'll highlight some of our favorites.



*Download the front plate 3D model HERE*

IMPORTANT: AMD's product warranty does not cover damage to your graphics card or system caused in whole or in part by removing, modifying or reinstalling the AMD Radeon Fury X faceplate, which activities you agree to carry out at your own risk. AMD will not provide replacement faceplates for any faceplates lost or damaged, nor will AMD be liable for any damages to the graphics card or your system caused during the removal, modification or reinstallation of the faceplate.

RadeonR9FuryXfrontplate.zip 295k .zip file


*Source:* Link


----------



## hyp36rmax

Quote:


> Originally Posted by *Clockster*
> 
> Still waiting for the 2nd card, most likely only getting it by the 14th


I look forward to your results. Beats my wait time of this fall hahaha.


----------



## bonami2

Hey, just saying:

The 7950/7970 had all sorts of issues and low clocks at release.

Some time later they became pure sickness with high overclocks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Clockster*
> 
> Still waiting for the 2nd card, most likely only getting it by the 14th
> 
> 
> 
> 
> 
> 
> 
> 
> I saw legit benchmarks from a friend of mine that works with/for AMD and the cards were clocked higher.
> 
> I will say though, haven't changed any settings on any of my games from when my Titan X was in the rig , and running the Fury I haven't had any hick ups or slow downs, which is great.


That's good to hear.

I think people need to remember that this card is in a class of performance where it will meet the same demands as the NVIDIA cards, even though they are faster.

If the speed limit was 100MPH, and it took a minimum of 100HP to reach that speed, and the Fury X has 105HP while the Titan X has 120HP, it doesn't matter which of the two you have if you only need to reach 100MPH.

I know, I broke the cardinal rule and used a car analogy again...


----------



## xer0h0ur

I love AMD and hate Nvidia but I won't make any excuses. As of now the Titan X and 980 Ti are still better than Fury X. It is what it is. The advantage you have with an AMD card right now is that the GCN architecture has been around for a while and is still sticking around so support for your card is likely to keep going for quite a while.


----------



## hyp36rmax

*R9 Fury X 4K Crossfire results from Eteknix. Wow, nice! Now if only they compared them to GTX 980 Ti and Titan X SLI.*

*Source:* Link


----------



## Agent Smith1984

Very nice!

Damn, is that 390X making a good showing or what??









Fury X looks great too.

Looks like AMD is targeting 4K pretty hard.


----------



## xer0h0ur

Well there was no point to testing crossfire in Arkham Knight since crossfire is disabled in that game.


----------



## blue1512

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well there was no point to testing crossfire in Arkham Knight since crossfire is disabled in that game.


They needed a bench to gimp the final score of it, didn't they? "Controlled" media


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well there was no point to testing crossfire in Arkham Knight since crossfire is disabled in that game.


Correct, unless they created a Catalyst profile with Crossfire AFR, which does work, albeit unoptimized and with flickering textures, hahaha.


----------



## Ha-Nocri

Did someone finally test BF4 with Mantle on AMD cards?

Can't find that info. NV is usually faster under DX11 in BF4.


----------



## xer0h0ur

It straight-up crashes the game for me when forcing AFR Friendly or Optimize 1x1 in AK or Dying Light. Although I believe that is an issue affecting me in several games, which I suspect has more to do with a Win 7 installation that is crapping out on me.


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> Straight crashes the game for me when forcing AFR Friendly or Optimize 1x1 in AK or Dying Light. Although I believe that is an issue affecting me in several games which I suspect has more to do with a Win 7 installation that is crapping out on me.


It could be an isolated issue. Batman: Arkham Knight will work most of the time but has crashed on several occasions on load with my R9 290X VAPOR-X Crossfire in Windows 8.1 with the latest 15.6 drivers, and Dying Light works in Crossfire for me as well, at 100% load, without having to create a custom profile. Maybe things will get better once Windows 10 drops at the end of the month?


----------



## xer0h0ur

The thing is that tri-fire would work fine in Dying Light for me without having to force anything. Then I switched to a newer driver, Techland patched the game, and suddenly I can't even load the game anymore without an instant BSOD. I am forced to make an app profile to disable Crossfire so the game can even load now, and I refuse to play the game with the framerate I get @ 4K on a single card. I have been avoiding a reinstall of Win 7 like the plague. Just waiting on Windows 10.


----------



## tallshortguy

Any word on when the air-cooled Fury embargo lifts? Release date?


----------



## Agent Smith1984

Quote:


> Originally Posted by *tallshortguy*
> 
> Any word on when the air-cooled Fury embargo lifts? Release date?


July 14th release date...

Availability after everyone and their brothers buy them all out... who knows?


----------



## bonami2

I'm afraid of the 4GB of HBM, so I think I'm passing on this GPU generation. 5760x1080 needs VRAM.


----------



## bonami2

Hey, btw, the R9 390X has 384GB/s of bandwidth, so the Fury X's 512-614GB/s isn't all that magical in the end, aside from the big power consumption drop.


----------



## G227

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Has anyone seen reviews on Fury X running in the ~1150/550 range against the 980ti??
> Very curious to see how it impacts the gaming benchmarks, because so far, mild overclocks are greatly impacting synthetics.


I did it against a Titan X, which is a tad bit faster than the 980 Ti. I compared my results with one of the Fury X owners, as I don't have one yet. He did just a core OC, so I followed that too. This is 4K Ultra (max) Shadow of Mordor (I have bolded the core-vs-core overclock, i.e. apples to apples):

Fury X @1050/500 - stock BIOS, stock volts: 48.95fps = 100.0% / 0.0% OC
*Fury X @1140/500 - stock BIOS, stock volts: 54.02fps = 110.4% / 10.4% OC*
Titan X @1177/3505 - stock BIOS, stock volts: 50.65fps = 103.5% / 0.0% OC (this is the stock setting no OC - just GPU boost 2.0)
*Titan X @1427/3505 - stock BIOS, stock volts: 56.45fps = 115.3% / 11.5% OC (just core overclock)*
Titan X @1422/3969 - stock BIOS, stock volts: 60.19fps = 123.0% / 18.8% OC (core + memory overclock)

Now I can also adjust the voltage slider and/or flash custom BIOS on the Titan to get even more performance (1550/4000), but this is for the sake of fairness - i.e. NO voltage was adjusted in any way on the Titan.

Also keep in mind this is 4K; the difference gets progressively larger at 1440p/1080p.
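If anyone wants to sanity-check those figures, here is a quick Python sketch of the convention used in the list (my own helper, nothing official): each result is fps as a percentage of the stock Fury X run, and the OC gain is measured against that card's own stock run. Under this convention the bolded Fury X line works out to 110.4% / +10.4%.

```python
# Sketch of the normalization in the list above: "relative" = fps as a
# percent of the stock Fury X run (48.95 fps), "oc_gain" = fps uplift
# over the same card's own stock run. Figures are the SoM 4K runs above.

FURY_X_STOCK_FPS = 48.95  # Fury X @1050/500

def scaling(fps, own_stock_fps):
    relative = fps / FURY_X_STOCK_FPS * 100.0
    oc_gain = (fps / own_stock_fps - 1.0) * 100.0
    return round(relative, 1), round(oc_gain, 1)

print(scaling(54.02, 48.95))  # Fury X @1140 core   -> (110.4, 10.4)
print(scaling(56.45, 50.65))  # Titan X @1427 core  -> (115.3, 11.5)
print(scaling(60.19, 50.65))  # Titan X core + mem  -> (123.0, 18.8)
```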


----------



## xer0h0ur

Quote:


> Originally Posted by *bonami2*
> 
> Hey, btw, the R9 390X has 384GB/s of bandwidth, so the Fury X's 512-614GB/s isn't all that magical in the end, aside from the big power consumption drop.


Except people are overclocking HBM 20% and we really still don't know how well AMD is going to optimize for HBM. vRAM usage is not the same on GDDR5 versus HBM so far.


----------



## Gdourado

Any news on the noise and coil whine issues?
AMD claims it was a pre-production issue and that it has been fixed.
Is this accurate?
How's the noise on the newly bought cards?

Cheers!


----------



## bonami2

Quote:


> Originally Posted by *G227*
> 
> I did it against a Titan X, which is a tad bit faster than the 980 Ti. I compared my results with one of the Fury X owners, as I don't have one yet. He did just a core OC, so I followed that too. This is 4K Ultra (max) Shadow of Mordor (I have bolded the core-vs-core overclock, i.e. apples to apples):
> 
> Fury X @1050/500 - stock BIOS, stock volts: 48.95fps = 100.0% / 0.0% OC
> *Fury X @1140/500 - stock BIOS, stock volts: 54.02fps = 110.4% / 10.4% OC*
> Titan X @1177/3505 - stock BIOS, stock volts: 50.65fps = 103.5% / 0.0% OC (this is the stock setting no OC - just GPU boost 2.0)
> *Titan X @1427/3505 - stock BIOS, stock volts: 56.45fps = 115.3% / 11.5% OC (just core overclock)*
> Titan X @1422/3969 - stock BIOS, stock volts: 60.19fps = 123.0% / 18.8% OC (core + memory overclock)
> 
> Now I can also adjust the voltage slider and/or flash custom BIOS on the Titan to get even more performance (1550/4000), but this is for the sake of fairness - i.e. NO voltage was adjusted in any way on the Titan.
> 
> Also keep in mind this is 4K; the difference gets progressively larger at 1440p/1080p.


I heard of a mod on older Nvidia GPUs that removes all the TDP throttling, so you may want to look into that too to increase performance. I have no idea if the Titan is throttling?

http://overclocking.guide/increase-the-nvidia-power-limit-all-cards/

Quote:


> Originally Posted by *xer0h0ur*
> 
> Except people are overclocking HBM 20% and we really still don't know how well AMD is going to optimize for HBM. vRAM usage is not the same on GDDR5 versus HBM so far.


Yeah, that's what I heard, but it's still 4GB. I don't know how it works, but if the engine has 8GB of data it needs to put in VRAM, it isn't going to go better on 4GB.

Maybe it will go in system RAM and the important stuff in VRAM.

Or DirectX 12 will maybe allow sharing VRAM in Crossfire configs, and that's why AMD just released a 4GB one.

I have no idea, but a 980 Ti is just 2GB more, so not even much better. The 8GB 390X and the Fury are the only choices I have, but the 390X is not powerful enough; I want a 100% increase ahah, and the watercooler would be nice for F@H without noise.


----------



## HellBoundgr

Quote:


> Originally Posted by *Gdourado*
> 
> Any news on the noise and coil whine issues?
> AMD claims it was a pre-production issue and that it has been fixed.
> Is this accurate?
> How's the noise on the newly bought cards?
> 
> Cheers!


Hi

I read on other forums that the Cooler Master logo is now etched into the plastic instead of being a sticker like on the first samples, and that they have no problems with the card now after they RMA'd the one with noise.


----------



## hyp36rmax

Quote:


> Originally Posted by *HellBoundgr*
> 
> Hi
> 
> I read on other forums that the Cooler Master logo is now etched into the plastic instead of being a sticker like on the first samples, and that they have no problems with the card now after they RMA'd the one with noise.


Good to hear! Can you provide links to those forum post and pictures?


----------



## HellBoundgr

pictures at the end







http://forums.anandtech.com/showthread.php?t=2437161&page=2


----------



## G227

Quote:


> Originally Posted by *bonami2*
> 
> I heard of a mod on older Nvidia GPUs that removes all the TDP throttling, so you may want to look into that too to increase performance. I have no idea if the Titan is throttling?


Thanks for the hint! But no need for that with this Titan. At these settings, at stock volts, the highest the card goes is 104% TDP, and you have a slider up to 110%. Now this is dependent on your particular silicon lottery, but generally if you don't add voltage, you're good.

If you do add voltage, you can just flash a modded BIOS with a 350W base and 425W max TDP (some BIOSes go up to 450W - the normal ones, not the crazy ones







). This gets rid of any TDP-related throttling or issues. The highest I have seen my card pull was 366W @1520/4000 at the maximum hardware-allowed voltage of 1.274V.









Naturally, you can also just use Maxwell BIOS Tweaker to tweak your own BIOS to your liking, if you know how.
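For reference, the TDP slider is just a percentage of the BIOS base power target, so the numbers above translate to watts roughly like this (a sketch; the 250W Titan X base TDP is an assumption from the reference spec, and the 350W base is the modded BIOS mentioned above):

```python
# The TDP slider is a percentage of the BIOS base power target.
# 250 W = reference Titan X base TDP (assumption, not from this post);
# 350 W = base of the modded BIOS mentioned above.

def tdp_watts(base_watts, slider_percent):
    return base_watts * slider_percent / 100.0

print(tdp_watts(250, 104))  # -> 260.0 W, the highest seen at stock volts
print(tdp_watts(250, 110))  # -> 275.0 W, the stock slider ceiling
print(tdp_watts(350, 121))  # -> 423.5 W, near the modded BIOS 425 W max
```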


----------



## magicc8ball

Quote:


> Originally Posted by *HellBoundgr*
> 
> pictures at the end
> 
> 
> 
> 
> 
> 
> 
> http://forums.anandtech.com/showthread.php?t=2437161&page=2


Thanks! This looks better than a sticker


----------



## hyp36rmax

Quote:


> Originally Posted by *HellBoundgr*
> 
> pictures at the end
> 
> 
> 
> 
> 
> 
> 
> http://forums.anandtech.com/showthread.php?t=2437161&page=2


Thank you for your effort!

*Here is a picture of the revised pump. Can anyone else confirm theirs is the same, or do they have the Cooler Master sticker on it?*



*Source:* Link

*Instruction on how to remove top cover*: Link


----------



## HellBoundgr

Quote:


> Originally Posted by *hyp36rmax*
> 
> Thank you for your effort!
> 
> *Here is a picture of the revised pump. Can anyone else confirm theirs is the same, or do they have the Cooler Master sticker on it?*
> 
> 
> 
> 
> *Source:* Link
> 
> *Instruction on how to remove top cover*: Link


No problem







I'm gonna pick up 2x XFX on Friday; I really hope I get the new one..


----------



## lagittaja

Quote:


> Originally Posted by *xer0h0ur*
> 
> People keep claiming that Fury X has terrible frametime issues and stuttering/hitching in games like SoM. Can anyone confirm or refute?


Does this answer your question?
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,31.html
16ms or below with 150% super sampling. Guru3D did FCAT for 9 different games. I can't see anything wrong in those results, pretty damn nice if you ask me.
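For context on that 16ms figure, frametime and fps are just reciprocals; a tiny sketch of the conversion (my own helper):

```python
# Frametime (ms spent on one frame) and fps are reciprocals: fps = 1000 / ms.
# A steady 16 ms frametime is therefore comfortably above 60 fps; spikes
# toward 33 ms and beyond are what FCAT shows as stutter.

def frametime_to_fps(ms_per_frame):
    return 1000.0 / ms_per_frame

print(frametime_to_fps(16.0))   # -> 62.5 fps
print(frametime_to_fps(16.67))  # ~60 fps, the 60 Hz refresh budget
print(frametime_to_fps(33.33))  # ~30 fps, a visible hitch at 60 Hz
```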


----------



## bonami2

Quote:


> Originally Posted by *G227*
> 
> Thanks for the hint! But no need for that with this Titan. At this settings - at stock volts, the highest the card goes is 104% TDP and you have slider for 110%. Now this is dependent on your particular lottery, but generally if you don't add voltage, you're good.
> 
> If you do add voltage, you can just flash modded BIOS with 350W base and 425W max TDP (some BIOSes go up to 450W - the normal ones, not the crazy ones
> 
> 
> 
> 
> 
> 
> 
> ). This gets rid of any TDP related throttling or issues. The highest I have seen my card was 366W @1520/4000 at maxed out voltage that is hardware allowed at 1.274V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Naturally you can also just use Maxwell BIOS tweaker and tweak your own BIOS to do that to your liking if you know how


Oh yeah, well ahah.

Question: I hate my Sapphire GPU for its many problems, but it may be an AMD problem too.

Any idea if, for example, doing F@H, pausing it, gaming, starting it again, pausing it, may bug out the GPU? I often find myself stuck at 850MHz, maybe because of Flash, and sometimes at 925MHz, maybe for the same reason. And I do suspect a BIOS issue; both of the BIOSes have multiple other problems, and the second just ends up at 100% usage doing nothing.

I'm looking at the Fury X for my 5760x1080 setup ahah, and may go Crossfire later (planning to purchase in August).


----------



## xer0h0ur

Just go into Afterburner and, under unofficial overclocking mode, select "without PowerPlay support". That will not allow your clocks to downclock at all; basically it forces your 3D clocks non-stop. You can of course just re-enable normal power management whenever you're not folding or gaming.
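For anyone who prefers editing the config directly, the same switch lives in MSIAfterburner.cfg. The key names below are from memory of the Afterburner 4.x config and may differ by version, so treat this as a sketch and check your own file:

```ini
; MSIAfterburner.cfg, [ATIADLHAL] section (AMD cards).
; UnofficialOverclockingMode: 0 = disabled (normal PowerPlay),
; 2 = unofficial overclocking without PowerPlay support, which pins
; the 3D clocks so the card never downclocks.
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 2
```

Flip the mode back to 0 when you're done folding or gaming.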


----------



## xer0h0ur

Quote:


> Originally Posted by *lagittaja*
> 
> Does this answer your question?
> http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,31.html
> 16ms or below with 150% super sampling. Guru3D did FCAT for 9 different games. I can't see anything wrong in those results, pretty damn nice if you ask me.


If I wanted to read more reviews, I would have taken their word for it instead of asking owners of the card on a forum. In this day and age, when Nvidia pays websites to shill for them, I don't trust reviews anymore.

This being the reason I am asking:


----------



## kayan

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I wanted to read more reviews, I would have taken their word for it instead of asking owners of the card on a forum. In this day and age, when Nvidia pays websites to shill for them, I don't trust reviews anymore.
> 
> This being the reason I am asking:


I just watched that, and then his Witcher 3 video; what a scam. He didn't even test the different cards with the same settings. How is that helpful at all?


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just go into Afterburner and, under unofficial overclocking mode, select "without PowerPlay support". That will not allow your clocks to downclock at all; basically it forces your 3D clocks non-stop. You can of course just re-enable normal power management whenever you're not folding or gaming.


Uh, well, if it does that I'm going to rep you 1000+.

But that GPU is ******ed: voltage only works in TriXX, and TriXX crashes on the latest version, so the other day, with a custom fan profile, I ended up with 0% fan and 80C on the GPU ahah.

Brb, gonna try that.


----------



## Gregster

Thought I would test the Fury X against the Titan X in my system, both using the same settings and the same CPU, and the results are not great for the Fury X, in truth. Mantle was even slower than DX11, which a few people have attested to.


----------



## xer0h0ur

Quote:


> Originally Posted by *kayan*
> 
> I just watched that, and then his Witcher 3 video; what a scam. He didn't even test the different cards with the same settings. How is that helpful at all?


Yeah, it's hard to tell anymore who is legit and who is intentionally trying to make something fail.


----------



## bonami2

Well, it's working! :DDDDD

Only thing is I have TriXX with the settings set + voltage, and MSI with the same settings.

So gonna try to reboot and pray it won't blow up.

If the Fury X can do that, I'm sold.

Just need SOME F@H PPD RESULTS. Anyone? Stop LOOKING AT THAT GREAT GPU AT IDLE AND MAKE IT WORK HARD.


----------



## hyp36rmax

Quote:


> Originally Posted by *Gregster*
> 
> Thought I would test the Fury X against the Titan X in my system, both using the same settings and the same CPU, and the results are not great for the Fury X, in truth. Mantle was even slower than DX11, which a few people have attested to.


Thanks for the comparison. What resolution is this? Based on your test, the Titan X is doing very well for maximum FPS at your tested resolution compared to the FURY X. It can be argued "drivers" are the culprit, though a game like BF4 should by now be very well optimized by both AMD and Nvidia. What about 4K results (if possible, VSR and DSR)? Overall, it looks as if you would be fine with either one, as they both deliver over 60FPS, more so the Titan X for those wanting to push 120Hz-144Hz screens.


----------



## rdr09

Quote:


> Originally Posted by *Gregster*
> 
> Thought I would test the Fury X against the Titan X in my system, both using the same settings and the same CPU, and the results are not great for the Fury X, in truth. Mantle was even slower than DX11, which a few people have attested to.
> 
> 
> Spoiler: Warning: Spoiler!


Is the T X using that new tech called color compression? 'Cause the color on the T X seems dull, and the detail looks less. Not sure; might be 'cause with the F X the person is closer to the curb.


----------



## bonami2

Quote:


> Originally Posted by *rdr09*
> 
> Is the T X using that new tech called color compression? 'Cause the color on the T X seems dull, and the detail looks less. Not sure; might be 'cause with the F X the person is closer to the curb.


blur vs no blur maybe


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> Is the T X using that new tech called color compression? 'Cause the color on the T X seems dull, and the detail looks less. Not sure; might be 'cause with the F X the person is closer to the curb.


Quote:


> Originally Posted by *bonami2*
> 
> blur vs no blur maybe


Can we really tell from a Youtube video? All that compression.


----------



## rdr09

Quote:


> Originally Posted by *bonami2*
> 
> blur vs no blur maybe


Quote:


> Originally Posted by *hyp36rmax*
> 
> Can we really tell from a Youtube video? All that compression.


Maybe a different monitor. But I've read AMD's colors look more vibrant.

Check out the curb. See the area where the bricks are?


----------



## bonami2

Quote:


> Originally Posted by *hyp36rmax*
> 
> Can we really tell from a Youtube video? All that compression.


+1

The compression seems to get worse with time. My old videos look better than my new ones; I had a Core 2 Duo back then and recorded at 720p... Now at 1080p I get such ugly video that I decided not to upload it ahah.


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> Maybe a different monitor. But I've read AMD's colors look more vibrant.
> 
> Check out the curb. See the area where the bricks are?


I noticed that also. Thanks for pointing that out.

Quote:


> Originally Posted by *bonami2*
> 
> +1
> 
> The compression seems to get worse with time. My old videos look better than my new ones; I had a Core 2 Duo back then and recorded at 720p... Now at 1080p I get such ugly video that I decided not to upload it ahah.


Lol. I know exactly what you're talking about.


----------



## xer0h0ur

Yeah, why is there more detail in the Fury X footage than in the Titan X footage? Just look at the ground in the first frame, before even playing the video.


----------



## Gregster

Both runs were at 1080p, both used the exact same settings, and both were captured with AVerMedia LGP Lite (which only does up to 1080p). As for the colour differences, I wouldn't read too much into that personally, as I haven't set the Nvidia colours back up.

Edit:

This was just a real quick test to see what the difference was between my Fury X and my Titan X. The next videos I do will be set up correctly.


----------



## rdr09

Quote:


> Originally Posted by *Gregster*
> 
> Both runs were at 1080p, both used the exact same settings, and both were captured with AVerMedia LGP Lite (which only does up to 1080p). As for the colour differences, I wouldn't read too much into that personally, as I haven't set the Nvidia colours back up.


My HD 7950 looks better than that T X in detail. Is it a feature nVidia uses to increase fps?


----------



## Nunzi

I have to say the FURY X looks better !


----------



## rv8000

So the card finally came in today. The pump noise is pretty bad. The card tops out at 1150/525, and temps hit 65C after 15 mins of Witcher 3 @ 1100/525 +50% power (though I have it set up as single pull atm and haven't adjusted the fan speed yet). Ran a few Firestrike tests, but I don't know how much more of this pump noise I can take tonight.

Installing Metro LL and a few other games for some more testing atm. Anyone have any specifics they want me to run (with the exception of GTA and BF4)?


----------



## Gdourado

Quote:


> Originally Posted by *rv8000*
> 
> So the card finally came in today. The pump noise is pretty bad. The card tops out at 1150/525, and temps hit 65C after 15 mins of Witcher 3 @ 1100/525 +50% power (though I have it set up as single pull atm and haven't adjusted the fan speed yet). Ran a few Firestrike tests, but I don't know how much more of this pump noise I can take tonight.
> 
> Installing Metro LL and a few other games for some more testing atm. Anyone have any specifics they want me to run (with the exception of GTA and BF4)?


Is the pump the new version with the engraving? Or the older with the sticker?


----------



## rv8000

Quote:


> Originally Posted by *Gdourado*
> 
> Is the pump the new version with the engraving? Or the older with the sticker?


I'll check in about 10 mins.


----------



## josephimports

GTAV 4K benchmark results. GTAV graphics settings at default; OC settings: 1100/550, 50% PL.

Stock Fury X

Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 15.795262, 61.236046, 37.305099
Pass 1, 19.187397, 60.238831, 30.399248
Pass 2, 19.752962, 59.992737, 30.320618
Pass 3, 20.606808, 86.938797, 44.523190
Pass 4, 18.584539, 103.520615, 51.154911


Spoiler: Warning: Spoiler!



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 15.795262, 61.236046, 37.305099
Pass 1, 19.187397, 60.238831, 30.399248
Pass 2, 19.752962, 59.992737, 30.320618
Pass 3, 20.606808, 86.938797, 44.523190
Pass 4, 18.584539, 103.520615, 51.154911

Time in milliseconds(ms). (Lower is better). Min, Max, Avg
Pass 0, 16.330252, 63.310123, 26.805986
Pass 1, 16.600588, 52.117542, 32.895550
Pass 2, 16.668684, 50.625320, 32.980858
Pass 3, 11.502344, 48.527653, 22.460205
Pass 4, 9.659912, 53.808167, 19.548466

=== SYSTEM ===
Windows 8.1 Pro 64-bit (6.2, Build 9200)
DX Feature Level: 11.0
Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz (8 CPUs), ~4.0GHz
8192MB RAM
AMD Radeon (TM) R9 Series, 4268MB, Driver Version 15.150.1004.0
Graphics Card Vendor Id 0x1002 with Device ID 0x7300

=== SETTINGS ===
Display: 3840x2160 (FullScreen) @ 60Hz VSync ON
Tessellation: 2
LodScale: 1.000000
PedLodBias: 0.200000
VehicleLodBias: 0.000000
ShadowQuality: 2
ReflectionQuality: 2
ReflectionMSAA: 0
SSAO: 2
AnisotropicFiltering: 16
MSAA: 0
MSAAFragments: 0
MSAAQuality: 0
SamplingMode: 0
TextureQuality: 2
ParticleQuality: 1
WaterQuality: 1
GrassQuality: 0
ShaderQuality: 1
Shadow_SoftShadows: 1
UltraShadows_Enabled: false
Shadow_ParticleShadows: true
Shadow_Distance: 1.000000
Shadow_LongShadows: false
Shadow_SplitZStart: 0.930000
Shadow_SplitZEnd: 0.890000
Shadow_aircraftExpWeight: 0.990000
Shadow_DisableScreenSizeCheck: false
Reflection_MipBlur: true
FXAA_Enabled: true
TXAA_Enabled: false
Lighting_FogVolumes: true
Shader_SSA: true
DX_Version: 2
CityDensity: 1.000000
PedVarietyMultiplier: 1.000000
VehicleVarietyMultiplier: 1.000000
PostFX: 2
DoF: true
HdStreamingInFlight: false
MaxLodScale: 0.000000
MotionBlurStrength: 0.000000



Fury X OC

Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 16.276979, 63.496597, 48.411743
Pass 1, 20.072182, 61.181374, 43.558453
Pass 2, 20.044783, 60.819393, 31.836372
Pass 3, 20.080952, 95.497925, 50.024746
Pass 4, 18.807522, 139.071747, 53.400936


Spoiler: Warning: Spoiler!



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 16.276979, 63.496597, 48.411743
Pass 1, 20.072182, 61.181374, 43.558453
Pass 2, 20.044783, 60.819393, 31.836372
Pass 3, 20.080952, 95.497925, 50.024746
Pass 4, 18.807522, 139.071747, 53.400936

Time in milliseconds(ms). (Lower is better). Min, Max, Avg
Pass 0, 15.748876, 61.436459, 20.656145
Pass 1, 16.344845, 49.820194, 22.957657
Pass 2, 16.442123, 49.888294, 31.410614
Pass 3, 10.471432, 49.798435, 19.990107
Pass 4, 7.190533, 53.170216, 18.726263

=== SYSTEM ===
Windows 8.1 Pro 64-bit (6.2, Build 9200)
DX Feature Level: 11.0
Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz (8 CPUs), ~4.0GHz
8192MB RAM
AMD Radeon (TM) R9 Series, 4268MB, Driver Version 15.150.1004.0
Graphics Card Vendor Id 0x1002 with Device ID 0x7300

=== SETTINGS ===
Display: 3840x2160 (FullScreen) @ 60Hz VSync ON
Tessellation: 2
LodScale: 1.000000
PedLodBias: 0.200000
VehicleLodBias: 0.000000
ShadowQuality: 2
ReflectionQuality: 2
ReflectionMSAA: 0
SSAO: 2
AnisotropicFiltering: 16
MSAA: 0
MSAAFragments: 0
MSAAQuality: 0
SamplingMode: 0
TextureQuality: 2
ParticleQuality: 1
WaterQuality: 1
GrassQuality: 0
ShaderQuality: 1
Shadow_SoftShadows: 1
UltraShadows_Enabled: false
Shadow_ParticleShadows: true
Shadow_Distance: 1.000000
Shadow_LongShadows: false
Shadow_SplitZStart: 0.930000
Shadow_SplitZEnd: 0.890000
Shadow_aircraftExpWeight: 0.990000
Shadow_DisableScreenSizeCheck: false
Reflection_MipBlur: true
FXAA_Enabled: true
TXAA_Enabled: false
Lighting_FogVolumes: true
Shader_SSA: true
DX_Version: 2
CityDensity: 1.000000
PedVarietyMultiplier: 1.000000
VehicleVarietyMultiplier: 1.000000
PostFX: 2
DoF: true
HdStreamingInFlight: false
MaxLodScale: 0.000000
MotionBlurStrength: 0.000000
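Since both runs dump the same five passes, here is a quick sketch (my own parsing, not part of the GTAV tool) that pulls the per-pass average fps, the last column of each "Pass" line, and computes the overall uplift from the 1100/550 overclock:

```python
# Parse the per-pass averages (last column of each "Pass N, min, max, avg"
# line) from the two benchmark dumps above and compare stock vs. OC.

stock = """Pass 0, 15.795262, 61.236046, 37.305099
Pass 1, 19.187397, 60.238831, 30.399248
Pass 2, 19.752962, 59.992737, 30.320618
Pass 3, 20.606808, 86.938797, 44.523190
Pass 4, 18.584539, 103.520615, 51.154911"""

oc = """Pass 0, 16.276979, 63.496597, 48.411743
Pass 1, 20.072182, 61.181374, 43.558453
Pass 2, 20.044783, 60.819393, 31.836372
Pass 3, 20.080952, 95.497925, 50.024746
Pass 4, 18.807522, 139.071747, 53.400936"""

def mean_avg_fps(dump):
    avgs = [float(line.rsplit(",", 1)[1]) for line in dump.splitlines()]
    return sum(avgs) / len(avgs)

uplift = (mean_avg_fps(oc) / mean_avg_fps(stock) - 1.0) * 100.0
print(f"{mean_avg_fps(stock):.1f} -> {mean_avg_fps(oc):.1f} fps avg "
      f"(+{uplift:.1f}% from the 1100/550 OC)")
# -> 38.7 -> 45.4 fps avg (+17.3% from the 1100/550 OC)
```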


----------



## rv8000

Quote:


> Originally Posted by *Gdourado*
> 
> Is the pump the new version with the engraving? Or the older with the sticker?


It is apparently one of the early samples with the sticker.


----------



## hyp36rmax

Quote:


> Originally Posted by *rv8000*
> 
> It is apparently one of the early samples with the sticker.


Thanks for verifying there's an actual difference.


----------



## xer0h0ur

I wouldn't particularly expect all of the new pump units without the noise issue to have the same logo unless they specifically said all new units will have that plastic cover with the larger logo.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> I wouldn't particularly expect all of the new pump units without the noise issue to have the same logo unless they specifically said all new units will have that plastic cover with the larger logo.


It is clearly indicative of a revision to the pump/aio unit though.


----------



## xer0h0ur

Corporations don't throw away perfectly good stock of parts. They would still use every cover with the small sticker anyways.


----------



## gamervivek

Quote:


> Originally Posted by *rdr09*
> 
> My HD 7950 looks better than that T X in detail. Is it a feature nVidia uses to increase fps?


AMD has better colors out of the box; they seem more vivid.

In that video, besides the color differences, it could be a different blur effect, the Titan X being farther from the buildings on the right side, or different default LoD settings from AMD/Nvidia.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> Corporations don't throw away perfectly good stock of parts. They would still use every cover with the small sticker anyways.


If you ordered a card on release day and have a pump with a sticker, my guess is it's most likely from the initial batch, and will thus whine if the revision is not in place. We don't know when they altered the cover design, how many they've recalled or received from replacement orders, or really how many cards were in the initial batch. The trend is: sticker -> whine, engraving -> fine. I doubt we will see many cards with the sticker from this point on, if the initial batch was small anyways.


----------



## royfrosty

Hi guys, not sure if any of you have seen my cards that were in hwz.

I managed to get the faceplate removed, and mine in SG has an etched CM logo on it. Did some testing and found that my card does not have any pump whine, but rather a small GPU coil whine instead.

But overall it isn't that audible from a distance.

Not sure if it helps, but here are some pics of the pump and also videos of how it sounds at 30cm and 1m.


Spoiler: Warning: Spoiler!







@less than 30cm





@ 1m





EDIT: Note that I removed the Zalman AIO cooler and let it run a while, as it is noisier than my Fury X lol.


----------



## hyp36rmax

Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys, not sure if any of you have seen my cards that were in hwz.
> 
> I managed to get the faceplate removed, and mine in SG is a etched CM logo on it. Did some testing and found that my card does not have any pump whine, but rather a small gpu coil whine instead.
> 
> But overall it isn't that audible from a distance.
> 
> Not sure if it helps, but here are some of my pics of the pump and also the video of how it sounds like at 30cm and 1m.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @less than 30cm
> 
> 
> 
> 
> 
> @ 1m
> 
> 
> 
> 
> 
> EDIT: Note that i removed the Zalman AIO cooler to let it run awhile as it is noisier than my Fury X lol.


Thank you for this!

As far as the logo is concerned, the ones with the sticker use our old Cooler Master logo, with the teal and the rounded "R's", while the revised pump with the etched Cooler Master logo uses the latest straight "R's" from our new "Make it yours" marketing campaign. I know it's subtle. I honestly can't verify which pump is being used, or any revisions, as AMD holds those specifications; this is an ODM part from another division of the company. But I suspect it's the Seidon pump, retrofitted.

How can I say such blasphemy? I'm also:



Spoiler: Warning: Spoiler!



CM Felinni


----------



## Sgt Bilko

Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys, not sure if any of you have seen my cards that were in hwz.
> 
> I managed to get the faceplate removed, and mine in SG is a etched CM logo on it. Did some testing and found that my card does not have any pump whine, but rather a small gpu coil whine instead.
> 
> But overall it isn't that audible from a distance.
> 
> Not sure if it helps, but here are some of my pics of the pump and also the video of how it sounds like at 30cm and 1m.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @less than 30cm
> 
> 
> 
> 
> 
> @ 1m
> 
> 
> 
> 
> 
> EDIT: Note that i removed the Zalman AIO cooler to let it run awhile as it is noisier than my Fury X lol.


Thanks for doing this; makes me very hopeful for mine when it gets here.


----------



## Ceadderman

Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys, not sure if any of you have seen my cards that were in hwz.
> 
> I managed to get the faceplate removed, and mine in SG is a etched CM logo on it. Did some testing and found that my card does not have any pump whine, but rather a small gpu coil whine instead.
> 
> But overall it isn't that audible from a distance.
> 
> Not sure if it helps, but here are some of my pics of the pump and also the video of how it sounds like at 30cm and 1m.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @less than 30cm
> 
> 
> 
> 
> 
> @ 1m
> 
> 
> 
> 
> 
> EDIT: Note that i removed the Zalman AIO cooler to let it run awhile as it is noisier than my Fury X lol.


I am reasonably convinced that your coil whine isn't coil whine at all. Look how close that copper tube is to the 8-pin. Betcha dollars to donuts that if you apply some liquid electrical tape to that section of bare copper, your whine diminishes or disappears completely.

It *could be* coil whine, but from what I know of electricity, it favors the path of least resistance, and looking at your pic there is only one other place this applies to, and that's the MOSFETs above the copper. Might give that section a coat as well.









~Ceadder


----------



## blue1512

Quote:


> Originally Posted by *josephimports*
> 
> GTAV 4K benchmark results. GTAV graphic settings set at default. OC settings 1100/550 50%PL.
> 
> Stock Fury X
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 15.795262, 61.236046, 37.305099
> Pass 1, 19.187397, 60.238831, 30.399248
> Pass 2, 19.752962, 59.992737, 30.320618
> Pass 3, 20.606808, 86.938797, 44.523190
> Pass 4, 18.584539, 103.520615, 51.154911
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 15.795262, 61.236046, 37.305099
> Pass 1, 19.187397, 60.238831, 30.399248
> Pass 2, 19.752962, 59.992737, 30.320618
> Pass 3, 20.606808, 86.938797, 44.523190
> Pass 4, 18.584539, 103.520615, 51.154911
> 
> Time in milliseconds(ms). (Lower is better). Min, Max, Avg
> Pass 0, 16.330252, 63.310123, 26.805986
> Pass 1, 16.600588, 52.117542, 32.895550
> Pass 2, 16.668684, 50.625320, 32.980858
> Pass 3, 11.502344, 48.527653, 22.460205
> Pass 4, 9.659912, 53.808167, 19.548466
> 
> === SYSTEM ===
> Windows 8.1 Pro 64-bit (6.2, Build 9200)
> DX Feature Level: 11.0
> Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz (8 CPUs), ~4.0GHz
> 8192MB RAM
> AMD Radeon (TM) R9 Series, 4268MB, Driver Version 15.150.1004.0
> Graphics Card Vendor Id 0x1002 with Device ID 0x7300
> 
> === SETTINGS ===
> Display: 3840x2160 (FullScreen) @ 60Hz VSync ON
> Tessellation: 2
> LodScale: 1.000000
> PedLodBias: 0.200000
> VehicleLodBias: 0.000000
> ShadowQuality: 2
> ReflectionQuality: 2
> ReflectionMSAA: 0
> SSAO: 2
> AnisotropicFiltering: 16
> MSAA: 0
> MSAAFragments: 0
> MSAAQuality: 0
> SamplingMode: 0
> TextureQuality: 2
> ParticleQuality: 1
> WaterQuality: 1
> GrassQuality: 0
> ShaderQuality: 1
> Shadow_SoftShadows: 1
> UltraShadows_Enabled: false
> Shadow_ParticleShadows: true
> Shadow_Distance: 1.000000
> Shadow_LongShadows: false
> Shadow_SplitZStart: 0.930000
> Shadow_SplitZEnd: 0.890000
> Shadow_aircraftExpWeight: 0.990000
> Shadow_DisableScreenSizeCheck: false
> Reflection_MipBlur: true
> FXAA_Enabled: true
> TXAA_Enabled: false
> Lighting_FogVolumes: true
> Shader_SSA: true
> DX_Version: 2
> CityDensity: 1.000000
> PedVarietyMultiplier: 1.000000
> VehicleVarietyMultiplier: 1.000000
> PostFX: 2
> DoF: true
> HdStreamingInFlight: false
> MaxLodScale: 0.000000
> MotionBlurStrength: 0.000000
> 
> 
> 
> Fury X OC
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 16.276979, 63.496597, 48.411743
> Pass 1, 20.072182, 61.181374, 43.558453
> Pass 2, 20.044783, 60.819393, 31.836372
> Pass 3, 20.080952, 95.497925, 50.024746
> Pass 4, 18.807522, 139.071747, 53.400936
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 16.276979, 63.496597, 48.411743
> Pass 1, 20.072182, 61.181374, 43.558453
> Pass 2, 20.044783, 60.819393, 31.836372
> Pass 3, 20.080952, 95.497925, 50.024746
> Pass 4, 18.807522, 139.071747, 53.400936
> 
> Time in milliseconds(ms). (Lower is better). Min, Max, Avg
> Pass 0, 15.748876, 61.436459, 20.656145
> Pass 1, 16.344845, 49.820194, 22.957657
> Pass 2, 16.442123, 49.888294, 31.410614
> Pass 3, 10.471432, 49.798435, 19.990107
> Pass 4, 7.190533, 53.170216, 18.726263
> 
> === SYSTEM ===
> Windows 8.1 Pro 64-bit (6.2, Build 9200)
> DX Feature Level: 11.0
> Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz (8 CPUs), ~4.0GHz
> 8192MB RAM
> AMD Radeon (TM) R9 Series, 4268MB, Driver Version 15.150.1004.0
> Graphics Card Vendor Id 0x1002 with Device ID 0x7300
> 
> === SETTINGS ===
> Display: 3840x2160 (FullScreen) @ 60Hz VSync ON
> Tessellation: 2
> LodScale: 1.000000
> PedLodBias: 0.200000
> VehicleLodBias: 0.000000
> ShadowQuality: 2
> ReflectionQuality: 2
> ReflectionMSAA: 0
> SSAO: 2
> AnisotropicFiltering: 16
> MSAA: 0
> MSAAFragments: 0
> MSAAQuality: 0
> SamplingMode: 0
> TextureQuality: 2
> ParticleQuality: 1
> WaterQuality: 1
> GrassQuality: 0
> ShaderQuality: 1
> Shadow_SoftShadows: 1
> UltraShadows_Enabled: false
> Shadow_ParticleShadows: true
> Shadow_Distance: 1.000000
> Shadow_LongShadows: false
> Shadow_SplitZStart: 0.930000
> Shadow_SplitZEnd: 0.890000
> Shadow_aircraftExpWeight: 0.990000
> Shadow_DisableScreenSizeCheck: false
> Reflection_MipBlur: true
> FXAA_Enabled: true
> TXAA_Enabled: false
> Lighting_FogVolumes: true
> Shader_SSA: true
> DX_Version: 2
> CityDensity: 1.000000
> PedVarietyMultiplier: 1.000000
> VehicleVarietyMultiplier: 1.000000
> PostFX: 2
> DoF: true
> HdStreamingInFlight: false
> MaxLodScale: 0.000000
> MotionBlurStrength: 0.000000


Very impressive OC performance without voltage control.


----------



## hyp36rmax

Quote:


> Originally Posted by *Ceadderman*
> 
> I am reasonably convinced that your coil whine isn't coil whine at all. Look how close that copper tube is next to the 8pin. Betcha dollars to donuts that if you apply some liquid electrical tape on that section of bare copper, your whine diminishes or disappears completely.
> 
> It *could be* but what I know of electricity, it favors the path of least resistance and looking at your pic there is only one other place this applies to and that's the Mosfets above the copper. Might give that section a coat as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Someone try this!


----------



## Ceadderman

I would definitely try it if I had one on hand. I still have to wait to get one.

Maybe Sapphire should send me one for "good knowledge"?









I think that Cooler Master mucked this up. That tube should've been run the other direction to keep that uninsulated copper away from the main power source. So in theory what we have here is interference sent through the tube and colliding with the GPU and HBM frequencies. Gee, wonder where that whine is coming from?









~Ceadder


----------



## fewness

2nd one arrived today.










Spoiler: Warning: Spoiler!


----------



## Bludge

Nice, my second one is at the shop, and I'm wondering what the X2 card will look like....tri-fire FTW!

Or probably by this time tomorrow I'll have gone and got it


----------



## dir_d

Quote:


> Originally Posted by *fewness*
> 
> 2nd one arrived today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


So pretty, I'm jelly and I don't even play games anymore.


----------



## Thoth420

While I love this card, the sound issues concern me. The whole point of my build was for it to be a quiet, powerful single-GPU system without having to do a custom loop.
I got my order in on day 1 of the US release and have a strong feeling it will make noise. When I build I am going to give it a try and hope for the best; if my fears are realized, I have a backup 390 on the way, as I am not in the mood to wait for a revision. I can RMA it, but since they are so limited, if someone does want it and thinks they can solve the problem with some tinkering, maybe one of you guys wants it.

I am not trying to pawn off something crappy on anyone; if the card makes whine, be it coil, pump, etc., I will disclose it 100%. I will update when I get my build up and running (most likely this weekend).
Just know they are sold out everywhere; otherwise I would have ordered another.

I will take a small shipping loss to send it in for a refund, but this is a community, and sometimes one man's junk another man with more knowledge can turn into gold... also, I am too much of a noob to try some DIY fix.


----------



## Silent Scone

My lounge build, i5 4690K @ 4.4 1.15v, 16GB TG 1600C11 at 2133C10 1.6v

74GB Intel M.2, Samsung 850 Evo 250GB



Not had a chance to play with the Fury yet, and yes, I can hear the pump. The biggest issue IMO is how AMD have tied their customers into this problem by not letting AIB partners release their own variants.


----------



## Clockster

Well, it seems I got lucky with my card. I'm able to push 1153 core + 600 mem with +50% power and stock volts.
The only frustrating thing is I have no time to play with it. From the little I've done, I noticed that with the overclock the min FPS jumps up a lot in Heaven; it increased by 10, which is great as far as I'm concerned.

@Silent Scone I love that build lol

Mine is so vast and the card looks a little lost in there lol


----------



## Gumbi

Quote:


> Originally Posted by *Clockster*
> 
> Well, it seems I got lucky with my card. I'm able to push 1153 core + 600 mem with +50% power and stock volts.
> The only frustrating thing is I have no time to play with it. From the little I've done, I noticed that with the overclock the min FPS jumps up a lot in Heaven; it increased by 10, which is great as far as I'm concerned.


How are temps? Core/VRM? That's a decent overclock tbh.


----------



## Clockster

Quote:


> Originally Posted by *Gumbi*
> 
> How are temps? Core/VRM? That's a decent overclock tbh.


I didn't actually check, but I'll run it again a little later today/tonight and post some screens of temps etc.


----------



## royfrosty

Quote:


> Originally Posted by *Clockster*
> 
> Well, it seems I got lucky with my card. I'm able to push 1153 core + 600 mem with +50% power and stock volts.
> The only frustrating thing is I have no time to play with it. From the little I've done, I noticed that with the overclock the min FPS jumps up a lot in Heaven; it increased by 10, which is great as far as I'm concerned.
> 
> @Silent Scone I love that build lol
> 
> Mine is so vast and the card looks a little lost in there lol


Sorry, how did you manage to overclock the VRAM on the Fury X?


----------



## ECPowers

Quote:


> Originally Posted by *royfrosty*
> 
> Sorry, how did you manage to overclock the VRAM on the Fury X?


Install MSI Afterburner and tick "Extend official overclock limits". This will unlock memory overclocking.

New Heaven score:


----------



## Clockster

Quote:


> Originally Posted by *royfrosty*
> 
> Sorry, how did you manage to overclock the VRAM on the Fury X?


Via the OverDrive function in CCC.


----------



## blue1512

And a number of restarts


----------



## Gregster

I realised late that the Titan X was only recording at 12 Mbps instead of the 30 Mbps I used for the Fury X, doh!!!

Anyways, I did it again.


----------



## Gregster

And I recorded PCars


----------



## rdr09

Quote:


> Originally Posted by *Gregster*
> 
> I realised late that the Titan X was only recording at 12 Mbps instead of the 30 Mbps I used for the Fury X, doh!!!
> 
> Anyways, I did it again.


Same . . .



Must be view distance. See how the white border line fades with the Titan X?


----------



## Ganf

People say that AMD makes their colors brighter, but it's the Titan X that looks saturated to me.


----------



## DFroN

Quote:


> Originally Posted by *Gregster*
> 
> And I recorded PCars


The FPS difference is shocking







PCars is one of the games I'm playing the most of at the minute and AMD's performance in it is really off-putting. I'm flip flopping daily between getting a Fury X or not because my favourite monitor is FreeSync.


----------



## rdr09

Quote:


> Originally Posted by *Ganf*
> 
> People say that AMD makes their colors brighter, but it's the Titan X that looks saturated to me.


These cards are for 4K and should be used at 4K. But Greg is just testing, I understand . . .



Quote:


> Originally Posted by *DFroN*
> 
> The FPS difference is shocking
> 
> 
> 
> 
> 
> 
> 
> PCars is one of the games I'm playing the most of at the minute and AMD's performance in it is really off-putting. I'm flip flopping daily between getting a Fury X or not because my favourite monitor is FreeSync.


If you'll play Nvidia-optimized games . . . you should get an Nvidia card.


----------



## royfrosty

Alright, today is rather an interesting day.

I managed to unlock HBM overclocking. The slider could go past 550 MHz, but it went unstable at 580 MHz. Nevertheless, at just a 50 MHz difference, the bandwidth went from 512 GB/s to a whopping 563.2 GB/s.

I also bumped the core clock to a modest 1110 MHz.

Took some benchmarks for Firestrike and SoM. Results were rather interesting.

Firestrike 1.1
Stock - 12854
OC - 13604



Firestrike Extreme 1.1
Stock - 7048
OC - 7459



Firestrike Ultra
Stock - 3896
OC - 4121



SoM all maxed out settings.

Stock
Min: 56.42
Max: 135.91
Avg: 84.94

OC
Min: 60.04
Max: 154.69
Avg: 86.13



If my math is correct, that's about a 5% improvement in performance!
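For anyone double-checking the arithmetic, a quick sanity pass on the scores posted above (the helper function is just for illustration, not from any benchmark tool):

```python
# Percentage improvement of the OC runs over stock, using the
# Firestrike scores posted above.
def pct_gain(stock, oc):
    """Return the percentage improvement of `oc` over `stock`."""
    return (oc - stock) / stock * 100

results = {
    "Firestrike 1.1":     (12854, 13604),
    "Firestrike Extreme": (7048, 7459),
    "Firestrike Ultra":   (3896, 4121),
}

for name, (stock, oc) in results.items():
    print(f"{name}: +{pct_gain(stock, oc):.1f}%")
```

All three runs actually land closer to +5.8% than 5%, which lines up nicely with the roughly 6% core bump from the stock 1050 MHz to 1110 MHz.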


----------



## Gregster

I got fed up of reviews showing benchmarks all over the shop, so I decided to do my own and record the footage to actually show what is what. I have no idea if anything is going on with BF4 (possibly); all I am doing is recording what I get. I made a massive booboo on the first recording by only having the capture card set to 12 Mbps instead of 30 Mbps, so at least now it shows exactly how I see it.

More to come and I will be doing a full review of the Fury X.


----------



## Alastair

Quote:


> Originally Posted by *Gregster*
> 
> I got fed up of reviews showing benchmarks all over the shop, so I decided to do my own and record the footage to actually show what is what. I have no idea if anything is going on with BF4 (possibly); all I am doing is recording what I get. I made a massive booboo on the first recording by only having the capture card set to 12 Mbps instead of 30 Mbps, so at least now it shows exactly how I see it.
> 
> More to come and I will be doing a full review of the Fury X.


We appreciate your efforts; I have also been put off by all these reviews being all over the place. The jury for me, however, is out until we get some solid overclocking numbers once the voltage sliders get unlocked for us. (Maybe some memory OC'ing as well; the results for that are looking amazing.)


----------



## Gregster

Quote:


> Originally Posted by *Alastair*
> 
> We appreciate your efforts; I have also been put off by all these reviews being all over the place. The jury for me, however, is out until we get some solid overclocking numbers once the voltage sliders get unlocked for us. (Maybe some memory OC'ing as well; the results for that are looking amazing.)


Agreed, and I have overclocked my Fury X to 1128 MHz; anything over that was a no, sadly. The cards are staying with me till 16/14nm though, so any testing will also be revisited when voltage is sorted.

For reference, my TX runs at 1428 MHz 24/7 on stock volts, so it will be interesting to see how the max OCs for both compare as well.


----------



## kayan

Quote:


> Originally Posted by *rv8000*
> 
> So card finally came in today. The pump noise is pretty bad, card tops out at 1150/525, temps hit 65c after 15 mins of w3 @ 1100/525 +50% power (though I have it set up as single pull atm and haven't adjusted the fan speed yet). Ran a few Firestrike tests but I don't know how much more of this pump noise I can take tonight
> 
> 
> 
> 
> 
> 
> 
> . Installing Metro LL and a few other games for some more testing atm. Anyone have any specifics they want me to run (with the exclusion of GTA and BF4
> 
> 
> 
> 
> 
> 
> 
> )?


What resolution do you play at? If it's 1440p, 1440p ultrawide, or 4k I'd be interested in FPS numbers for Witcher 3 & Metro LL specifically.


----------



## Clockster

All I can say is the Fury X's numbers at anything below 4K aren't impressive, whereas at 4K I can match a 980 Ti. Very interesting.


----------



## Minotaurtoo

I don't run 4K monitors... but has anyone out there with 3x 1080p Eyefinity got any numbers from in-game FPS benchmarks? Particularly interested in BioShock Infinite, The Crew, DiRT Rally, and ETS2 or ATS.


----------



## rv8000

Quote:


> Originally Posted by *kayan*
> 
> What resolution do you play at? If it's 1440p, 1440p ultrawide, or 4k I'd be interested in FPS numbers for Witcher 3 & Metro LL specifically.


1440p. Off the top of my head, Metro LL was 78 avg, 169 max, and I can't remember the min. Max settings, tessellation at normal, no blur, no SSAA. W3 was sitting around 55 FPS avg at ultra with background characters at high, no blur, sharpening low, no CA, no vignetting; all of this in the first town and the wooded area beyond it.


----------



## blue1512

So far we have 2 lucky guys with 600MHz HBM stable, but none of them showed any bench yet


----------



## Ha-Nocri

Quote:


> Originally Posted by *blue1512*
> 
> So far we have 2 lucky guys with 600MHz HBM stable, but none of them showed any bench yet


Doesn't help much, 1-3 frames, as expected


----------



## kayan

Quote:


> Originally Posted by *rv8000*
> 
> 1440p. Off the top of my head, Metro LL was 78 avg, 169 max, and I can't remember the min. Max settings, tessellation at normal, no blur, no SSAA. W3 was sitting around 55 FPS avg at ultra with background characters at high, no blur, sharpening low, no CA, no vignetting; all of this in the first town and the wooded area beyond it.


Any chance you can turn everything all the way up on Witcher 3, except hairworks and AA and check again?


----------



## blue1512

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Doesn't help much, 1-3 frames, as expected


How do you know that? You're certainly not one of those two. Again, could anyone bench with 600 MHz HBM?


----------



## p4inkill3r

I canceled my order with Amazon this morning; too many kinks: not in stock, a new pump revision, etc.

I'll wait until they're in stock and then get one.


----------



## royfrosty

I can't get past 580 MHz stable.

My 3DMark kept crashing at 580 MHz,

so I just fell back to 550 MHz.


----------



## bonami2

A Cooler Master pump... so much for the reliability; they are known to blow up doing nothing.....
Quote:


> Originally Posted by *Ceadderman*
> 
> I would definitely do it had I had one on hand. I still have to wait to get one.
> 
> Maybe Sapphire should send me one for "good knowledge"?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think that Cooler Master mucked this up. That tube should've been set the other direction to keep that uninsulated copper away from the main power source. So in theory what we have here is interference sent through the tube and colliding with the GPU and HBM frequencies. Gee wonder where that whine is coming from?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Heatpipes don't vibrate, so why would a heatpipe with water vibrate? Water is an insulator of noise, I think.

Now I'm waiting to see about voltage.

I'm sure AMD is laughing or something and it's set at the lowest voltage.


----------



## flopper

Quote:


> Originally Posted by *blue1512*
> 
> How do you know that? You're certainly not one of those two. Again, could anyone bench with 600 MHz HBM?


UPDATE : We've confirmed with Robert Hallock, technical PR lead at AMD, that while the GPU-Z tool is reporting an increase in memory frequency in reality the frequency did not change. As HBM's frequency in the Radeon R9 Fury X is determined in hardware and cannot be changed through software

Read more: http://wccftech.com/amd-radeon-r9-fury-memory-oveclocked-20/#ixzz3ekTW0zkM


----------



## sugarhell

Quote:


> Originally Posted by *flopper*
> 
> UPDATE : We've confirmed with Robert Hallock, technical PR lead at AMD, that while the GPU-Z tool is reporting an increase in memory frequency in reality the frequency did not change. As HBM's frequency in the Radeon R9 Fury X is determined in hardware and cannot be changed through software
> 
> Read more: http://wccftech.com/amd-radeon-r9-fury-memory-oveclocked-20/#ixzz3ekTW0zkM


PR damage control, nothing else.


----------



## Clockster

Quote:


> Originally Posted by *sugarhell*
> 
> PR Damage control nothing else


Yip

Mine def increased and so did my scores, especially min and max fps.


----------



## blue1512

Quote:


> Originally Posted by *Clockster*
> 
> Yip
> 
> Mine def increased and so did my scores, especially min and max fps.


How did your 600 MHz HBM run? I believe HBM at 600 MHz gives a linear boost in performance (20%), unlike lower clocks. It seems that 500-599 MHz uses only the 500 MHz timings.
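The bandwidth figures being thrown around in this thread follow directly from Fiji's 4096-bit bus at double data rate; a quick sketch of the math (illustrative only):

```python
# Peak HBM bandwidth for Fiji: 4096-bit bus, double data rate.
def hbm_bandwidth_gbs(clock_mhz, bus_bits=4096):
    """Peak bandwidth in GB/s for a DDR bus at `clock_mhz`."""
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

for mhz in (500, 550, 600):
    # 500 -> 512.0, 550 -> 563.2, 600 -> 614.4
    print(f"{mhz} MHz -> {hbm_bandwidth_gbs(mhz):.1f} GB/s")
```

600 MHz over the stock 500 MHz is exactly the 20% linear figure mentioned above; whether the card actually applies the clock it reports is the open question in this thread.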


----------



## Gdourado

There is much talk about Nvidia optimized games and gameworks...
But how about AMD optimized games?
What are the big AAA games recently that are AMD optimized?

Cheers!


----------



## Agent Smith1984

Man, this thing is getting messy...

You have AMD saying HBM will not OC at the software level, when it obviously does through a "glitch".

You have locked voltage, and no one has answered the call from a third-party software standpoint yet.

You have poor 1080p/1440p performance being blamed on drivers, and no driver has been released to correct it.

Then there is this jazz about pump and coil noise....

All on top of not being in stock anywhere anyway....










I hope those who actually own the cards are having good experiences.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gdourado*
> 
> There is much talk about Nvidia optimized games and gameworks...
> But how about AMD optimized games?
> What are the big AAA games recently that are AMD optimized?
> 
> Cheers!


All with Mantle I guess. But for some unknown reasons ([cough]fanboys[/cough]) reviewers usually do not use it


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gdourado*
> 
> There is much talk about Nvidia optimized games and gameworks...
> But how about AMD optimized games?
> What are the big AAA games recently that are AMD optimized?
> 
> Cheers!


Shadow of Mordor and Far Cry 4 seem very AMD-happy.


----------



## Neon Lights

Quote:


> Originally Posted by *blue1512*
> 
> So far we have 2 lucky guys with 600MHz HBM stable, but none of them showed any bench yet


What are you saying? I gave benchmark results.


----------



## bonami2

Well this gpu look like


----------



## Agent Smith1984

Quote:


> Originally Posted by *Neon Lights*
> 
> What are you saying? I gave Benchmark results.


Can you share them again, please?

Looked through your post and just saw the Kombuster improvement.

I'd really like to see some Firestrike or Heaven at 500 then 600 if you don't mind.

Thanks much for sharing so far!!!


----------



## Ceadderman

Quote:


> Originally Posted by *bonami2*
> 
> Coolermaster pump so much for the reliability they are know to blow up doing nothing.....
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> I would definitely do it had I had one on hand. I still have to wait to get one.
> 
> Maybe Sapphire should send me one for "good knowledge"?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think that Cooler Master mucked this up. That tube should've been set the other direction to keep that uninsulated copper away from the main power source. So in theory what we have here is interference sent through the tube and colliding with the GPU and HBM frequencies. Gee wonder where that whine is coming from?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heatpipe dont vibrate why do heatpipe with water would vibrate water is an insulator of noise i think.
> 
> Now im waiting to see for voltage
> 
> Im sure amd is laughing or something and it set at the lowest voltage

What? I don't understand what you're saying. I think you're leaving some wording out.









~Ceadder


----------



## Gdourado

Quote:


> Originally Posted by *Ha-Nocri*
> 
> All with Mantle I guess. But for some unknown reasons ([cough]fanboys[/cough]) reviewers usually do not use it


That's right...
I don't really hear about AMD optimized games...
Are they really the minority? Or just not talked about?


----------



## HellBoundgr

I'm so ready for 4K =) Got my CrossFire cards from XFX today and a new monitor/PSU. Sadly I got the cards with the sticker, but I can't hear any coil noise, so I'm happy with them anyway. But yeah, I need a new tower or need to do something.. I have the NZXT 4400 now and have the radiators at the front, so now there's no room for the hard drives. Right now I have run a SATA cable/power to the outside so I can connect my Steam HDD and play hehe.. I could put one radiator at the top, but I really want to have some system fans too. Time to try out some games; I've never tried CrossFire before and have only used ATI/AMD since 1997 =p

Cheers.

Pictures:


----------



## bonami2

Quote:


> Originally Posted by *Ceadderman*
> 
> What? I don't understand what you're saying. I think you're leaving some wording out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I said I think it's impossible for the copper pipe to conduct noise.

And AMD may have a surprise for us; we don't know how the voltage options are going to be, whether they're set high by AMD or kept low for power consumption.


----------



## hyp36rmax

Quote:


> Originally Posted by *HellBoundgr*
> 
> Im so ready for 4k =) Got my crossfire from XFX today and new monitor/psu. But sadly I got the cards with sticker, but I cant hear any coil noise so im happy with it anyway. But yeah, need a new tower or need to do something.. Have the NZXT 4400 now,I have taken the radiators at the front. So now Its no room for the harddrives. So right now I have taken a sata cabel/power to the outside so I can connect my steam hdd so I can play hehe.. I could take one radiator at the top, but I realy want to have some systemfans to. Time to try out some games, never tried crossfire before and have only used ati/amd since 1997 =p
> 
> Cheers.
> 
> Pictures:


Welcome to the Club!


----------



## Ceadderman

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> What? I don't understand what you're saying. I think you're leaving some wording out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i said i think it impossible that the copper pipe conduct noise.
> 
> and that amd may have a surprise for us we dont know how the voltage option are going to be and if they are high be amd or if they are low for comsumption

How's that? Electricity is noise. How does an uninsulated tube not conduct electricity, and therefore noise? I'm not seeing it.









~Ceadder


----------



## bonami2

Quote:


> Originally Posted by *Ceadderman*
> 
> How's that? Electricity is noise. How does an uninsulated tube not conduct electricity, and therefore noise? I'm not seeing it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I'm not that good with electricity, but water may act as a sound dampener; I think that's the word, I'm French.

Well, pumps from Corsair and Cooler Master are known to rattle sometimes; even my fish pump makes vibrations sometimes.


----------



## Ceadderman

Vibration, maybe, but water is conductive as well, so I'm reasonably sure it wouldn't dampen the noise.









~Ceadder


----------



## Ganf

Quote:


> Originally Posted by *bonami2*
> 
> im not that good with electricity. but water may act as a sound dampener i think it the word im french
> 
> well pump from corsair and coolermaster are know to rattle sometime even my fish pump do make vibration sometime


It's not likely to be the pipe making the noise. That piece of copper is firmly seated, but any EM interference between the coils and the pipe that could cause vibrations is just as likely to cause it in the coils as the pipe. Whichever object is easier for the interference to move physically will be the object that gets moved, and thus create noise.


----------



## Agent Smith1984

Quote:


> Originally Posted by *HellBoundgr*
> 
> Im so ready for 4k =) Got my crossfire from XFX today and new monitor/psu. But sadly I got the cards with sticker, but I cant hear any coil noise so im happy with it anyway. But yeah, need a new tower or need to do something.. Have the NZXT 4400 now,I have taken the radiators at the front. So now Its no room for the harddrives. So right now I have taken a sata cabel/power to the outside so I can connect my steam hdd so I can play hehe.. I could take one radiator at the top, but I realy want to have some systemfans to. Time to try out some games, never tried crossfire before and have only used ati/amd since 1997 =p
> 
> Cheers.
> 
> Pictures:


Very nice!!!

Does that 4k TV have a DP input??

Seems like 2 Fury X cards are overkill for 30 FPS 4K gaming.

Not sure if you know or not, but you will be capped without HDMI 2.0.
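The 30 FPS cap comes straight from the link bandwidth. A rough back-of-envelope, assuming 24-bit RGB and ignoring blanking intervals (the real TMDS overhead makes it slightly worse):

```python
# Raw pixel data rate for a given display mode, in Gbit/s.
def uncompressed_gbps(width, height, hz, bpp=24):
    """Uncompressed video data rate, ignoring blanking intervals."""
    return width * height * hz * bpp / 1e9

# Usable video bandwidth: HDMI 1.4 ~8.16 Gbit/s, HDMI 2.0 ~14.4,
# DisplayPort 1.2 ~17.28 (all after 8b/10b encoding).
print(uncompressed_gbps(3840, 2160, 60))  # ~11.9: too much for HDMI 1.4
print(uncompressed_gbps(3840, 2160, 30))  # ~6.0: fits in HDMI 1.4
```

So over HDMI 1.4 the card tops out at 3840x2160 at 30 Hz, while DisplayPort 1.2 (which the Fury X does have) or HDMI 2.0 can carry 60 Hz.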


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Very nice!!!
> 
> Does that 4k TV have a DP input??
> 
> Seems like 2 Fury X cards is overkill for 30 FPS 4k gaming.
> 
> Not sure if you know or not, but you will be capped without HDMI 2.0.


That's a Philips BDM4065UC PC monitor with DP1.2


----------



## Agent Smith1984

Quote:


> Originally Posted by *hyp36rmax*
> 
> That's a Philips BDM4065UC PC monitor with DP1.2


OMG!!

I saw 40" and assumed TV.....

Wow, this guy just dumped some cash on his fun machine didn't he? lol

Nice go!!


----------



## Ganf

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OMG!!
> 
> I saw 40" and assumed TV.....
> 
> Wow, this guy just dumped some cash on his fun machine didn't he? lol
> 
> Nice go!!


Wait how has this monitor not been on my radar?

4k/60hz 40" VA panel monitor for $900?

Sweet Jeebus....

http://www.amazon.com/Philips-BDM4065UC-Resolution-Speakers-DisplayPort/dp/B00SCX78JS


----------



## Thoth420

Ermagherd that's a 4k *monitor*?









No matter how hard I try....will always be a peasant.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Ganf*
> 
> Wait how has this monitor not been on my radar?
> 
> 4k/60hz 40" VA panel monitor for $900?
> 
> Sweet Jeebus....
> 
> http://www.amazon.com/Philips-BDM4065UC-Resolution-Speakers-DisplayPort/dp/B00SCX78JS












But I'm still wanting those ultra-wides..


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ganf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> OMG!!
> 
> I saw 40" and assumed TV.....
> 
> Wow, this guy just dumped some cash on his fun machine didn't he? lol
> 
> Nice go!!
> 
> 
> 
> Wait how has this monitor not been on my radar?
> 
> 4k/60hz 40" VA panel monitor for $900?
> 
> Sweet Jeebus....
> 
> http://www.amazon.com/Philips-BDM4065UC-Resolution-Speakers-DisplayPort/dp/B00SCX78JS

That's been on sale here for a while..... the $1100 price tag keeps throwing me off though, and I keep looking forward to the LG IPS 4K FreeSync monitor launching at the end of the month...... smaller, yes, but cheaper and also IPS; FreeSync is just a bonus.


----------



## Agent Smith1984

I am curious as to how bad 30 FPS would be at 4K, because I've got a 4K 120Hz TV I am thinking about picking up....

If anything I could play at 1440p 60 for now right? And then figure out an adapter later?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's been on sale here for a while.....the $1100 price tag keeps throwing me off though and i keep looking forward to the LG IPS 4k Freesync monitor launching end of the month......smaller yes but cheaper and also IPS, Freesync is just a bonus


4K FreeSync... up to 60 Hz? Can it go any higher at 4K? What connector do you need for that?


----------



## Thoth420

I would love to be able to afford hardware to drive 4k but am also addicted to 120hz+ so I guess I have to be satisfied with my 27 inch 1440 reso for now to get that.


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OMG!!
> 
> I saw 40" and assumed TV.....
> 
> Wow, this guy just dumped some cash on his fun machine didn't he? lol
> 
> Nice go!!


LOL! Yea I saw the prices for it also. Not bad! Been wanting to go larger from my ASUS PB287Q 28" 4K to a 40".


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That's been on sale here for a while.....the $1100 price tag keeps throwing me off though and i keep looking forward to the LG IPS 4k Freesync monitor launching end of the month......smaller yes but cheaper and also IPS, Freesync is just a bonus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4k freeSync... up to 60Hz? Can it go any higher @4k? What connector do you need for that?

Final specs haven't been released yet, but hopefully we'll see them soon









http://www.lg.com/au/it-monitors/lg-27MU67

27" is fine by me tbh. What I also want to know is whether it can still do 1440p FreeSync, and at what refresh rate......


----------



## HellBoundgr

The scaling is perfect on this screen. I came from a PB278Q 1440p; the color was great on that screen.. I just need to find some calibration settings for this 4K and I think it's on the same level







Only tried The Witcher so far, looks so good! =) Too bad about that flicker in CF







But I see some settings I can try out around the net maybe.


----------



## Thoth420

Yeah, 40 inches for 4K is awesome PPI IMO. Hey, maybe in a couple of years I could run 4K... For you guys with deeper wallets now... I am happy the bigger 4K monitors are rolling out.


----------



## KnightWolf654

Quote:


> Originally Posted by *bonami2*
> 
> I'm afraid of 4 GB HBM, so I think I'm passing on this GPU gen. 5760x1080 needs RAM.


That's what I'm using, and this card has handled everything I can throw at it: BF4, Max Payne 3, GTA V, all sitting at 60+ FPS avg with everything turned up.


----------



## bonami2

Quote:


> Originally Posted by *KnightWolf654*
> 
> That's what I'm using, and this card has handled everything I can throw at it: BF4, Max Payne 3, GTA V, all sitting at 60+ FPS avg with everything turned up.


Well, if you say that, it's back on my wishlist.

Just realised that SLI 970s can perform, and even the 7990 can perform, so VRAM isn't that important at 3+ GB. Anyway, I have 4x4 GB 2400 MHz CL10 as a backup for the worst-case scenario, and with DirectX 12 we may have a multi-GPU model with scalability + VRAM sharing.


----------



## xer0h0ur

That is a grossly misquoted feature of DX12 that has been around since day 1 of Mantle, yet no developer has attempted to climb that mountain. Neither Mantle nor DX12 automatically stacks vRAM. Developers have to specifically use split-frame rendering and then figure out a clever way of coding it so that widely ranging amounts of vRAM can stack together. Props to the first developer that takes this on and implements it correctly, but if I had to predict, I would bet heavily on nearly no one bothering to do it, as it's going to complicate any game's development quite a bit.
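As a toy illustration of why split-frame rendering is what makes vRAM "stack": each GPU renders only a slice of the frame, so each only needs the resources for its slice, and mismatched memory budgets force uneven splits. This is a plain-Python sketch, nothing to do with the real D3D12 or Mantle APIs:

```python
# Toy split-frame partitioner: divide scanlines between GPUs in
# proportion to each GPU's VRAM budget. Illustrative only; real SFR
# in DX12/Mantle is explicit, per-resource work for the developer.
def split_frame(height, vram_gb):
    """Return one (start_row, end_row) slice per GPU."""
    total = sum(vram_gb)
    slices, start = [], 0
    for v in vram_gb:
        rows = round(height * v / total)
        end = min(start + rows, height)
        slices.append((start, end))
        start = end
    slices[-1] = (slices[-1][0], height)  # last GPU takes any remainder
    return slices

print(split_frame(2160, [4, 4]))  # two Fury X: equal halves
print(split_frame(2160, [4, 2]))  # mismatched VRAM: uneven split
```

The hard part described above is exactly this bookkeeping generalized to every texture and render target in a real engine, which is why few developers are expected to bother.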


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is a grossly misquoted feature of DX12 that has been around since day 1 of Mantle yet no developer has attempted to climb that mountain. Neither Mantle nor DX12 automatically stack vRAM. They have to specifically use split frame rendering then figure out a clever way of coding it so that widely ranging amounts of vRAM can stack together. Props to the first developer that takes this on and implements it correctly but if I had to predict it, I would bet heavy on nearly no one bothering to do it as its going to complicate any game's development quite a bit.


Samsung already made HMP (heterogeneous multi-processing) for the Galaxy S6, so development should accelerate. If Samsung can make two CPUs from different models work in one OS,

why couldn't GPUs? Ahah.

Sure, I'm not trusting this too much to happen, but I can still dream.


----------



## p4inkill3r

So, I canceled my order with Amazon this morning.
I just got an auto-notify from Newegg that the Sapphire Fury X was in stock with ShopRunner shipping; bought it and didn't have to pay any tax on it, for a savings of $50.








Quote:


> Sales Order Date: 7/2/2015 2:01:47 PM
> Shipping Method: ShopRunner 2Day
> 
> 1 x ($649.99) Sapphire Radeon R9 FURY X 4GB HBM PCI-E HDMI / TRI $649.99
> Subtotal: $649.99
> Tax: $0.00
> Shipping and Handling: $0.00
> Total Amount: $649.99


----------



## rv8000

Quote:


> Originally Posted by *kayan*
> 
> Any chance you can turn everything all the way up on Witcher 3, except hairworks and AA and check again?


Card is already packaged up to be replaced for the pump noise, sorry. All of the other settings I disabled have no bearing on fps (maybe 1 or 2 difference). Hairworks was also disabled, but I did have AA on; I'm not entirely sure how it would relate to the fps difference. Without a benchmark it is also really hard to compare unless you can run a similar path as the person doing the testing.


----------



## kayan

Quote:


> Originally Posted by *rv8000*
> 
> Card is already packaged up to be replaced for the pump noise sorry. All of the other settings I disabled have no bearing on fps (maybe 1 or 2 difference), Hairworks was also disabled but I did have AA on, not entirely sure how it would relate to fps difference. Without a benchmark it is also really hard to compare unless you can run a similar path as the person doing the testing.


Did you RMA to manufacturer or return for refund?


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is a grossly misquoted feature of DX12 that has been around since day 1 of Mantle yet no developer has attempted to climb that mountain. Neither Mantle nor DX12 automatically stack vRAM. They have to specifically use split frame rendering then figure out a clever way of coding it so that widely ranging amounts of vRAM can stack together. Props to the first developer that takes this on and implements it correctly but if I had to predict it, I would bet heavy on nearly no one bothering to do it as its going to complicate any game's development quite a bit.


Well I learned something...and that's why I love these boards. +Rep


----------



## xer0h0ur

Hell, if developers just started using split frame rendering without stacking the vRAM it would still be an improvement as SFR is used sparingly and is quite smooth and efficient for multi-gpu rendering anyways.


----------



## the9quad

Quote:


> Originally Posted by *KnightWolf654*
> 
> that's what i am using and this card has handled everything i can throw at it. bf4, max payne 3, GTA5 all sitting at 60+ FPS avg with everything turned up.


I had no idea one Fury X @ 5760x1080 was faster than 3 290Xs @ 1440p. I can't get a SOLID 60 fps with *everything* turned up in GTA V at 1440p; in fact, reviews have the Fury X _AVERAGING_ 72 fps at 1440p, and that is not with "everything" turned up. How did you get yours running so fast while pushing more pixels?


----------



## rv8000

Quote:


> Originally Posted by *kayan*
> 
> Did you RMA to manufacturer or return for refund?


The Egg. They had some more Sapphires in stock, so I just bought another one and changed the RMA to a refund instead of a replacement. If it ships from the New Jersey warehouse I should have the card tomorrow; if it's the California warehouse, Monday. I will test for you once I have the new card if no one else does in the meantime.


----------



## Thoth420

I'm trying to change my refund from Tiger Direct to a replacement now... I also don't mind which brand, since it's a reference design; I just want a revision model with no pump noise. Their phone customer support is terrible... at least today... and this is why I tend to use the Egg exclusively.


----------



## bonami2

Quote:


> Originally Posted by *the9quad*
> 
> I had no idea one furyx @ 5760x1080 was faster than 3 290x's @1440p. I cant get a SOLID 60 fps with *everything* turned up in GTA V at 1440P, in fact reviews have the furyx _AVERAGING_ 72 fps at 1440p, and that is not with "everything" turned up. How did you get yours to get so fast pushing more pixels?


The Fury X is worth almost two 290Xs:

100% from the first

50-95% scaling on the second

15-60% on the third GPU

That's roughly the scaling I've seen in reviews; the third GPU does almost nothing, and the fourth adds maybe 15% max.

But yeah, maybe not maxed out at 60fps, but I'm sure it's close.

I'm pushing 25-40fps on high with my 7950 at that res, so 2x = 60fps+.
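As a rough sanity check, per-card scaling figures like these can be summed into an "effective GPU count". This is purely illustrative arithmetic; the midpoints are my own reading of the ranges quoted above, not measured data.

```python
# Illustrative arithmetic only: summing per-card scaling fractions into
# an effective GPU count. Midpoint figures are assumptions, not benchmarks.

def effective_gpus(scaling_fractions):
    # Each entry is the fraction of one full GPU that card contributes.
    return sum(scaling_fractions)

# 100% first card, ~75% second, ~40% third (rough midpoints):
print(effective_gpus([1.00, 0.75, 0.40]))  # ~2.15 "effective" GPUs
```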


----------



## the9quad

Quote:


> Originally Posted by *bonami2*
> 
> fury x is almost worth 2 290x
> 
> 100% on first
> 
> 50-95% scaling on second
> 
> 15-60% on third gpu
> 
> That mostly the scaling i seen in review the third gpu do almost nothing and the 4 do like 15% max
> 
> But yea maybe not maxed out at 60fps but im sure it close.
> 
> Im pushing 25-40fps on high with my 7950 at that res so 2x = 60fps +


Fury is consistently behind the 295x2. 295x2 isn't much different than 2 290x's. I am just curious how he maintains everything maxed out in gta v and 60 fps in the countryside at that res.



here is a quote about gtav at 1440p and the furyx:

"Well, this isn't a great start for the AMD Fury X. _It is barely outperforming the Radeon R9 290X_ at 2560x1440 with our image quality settings shown above. Both the GTX 980 Ti and the Titan X are able to hit 60 FPS while the Fury X is only able to average 45 FPS or so. During gameplay as well as in the benchmark there were several instances of stutter and freezes on AMD's newest flagship"


----------



## Agent Smith1984

I would think the $600 always in stock 295x2 with its 5632 shaders and aio cooling package would be flying off the shelf right now.....


----------



## bonami2

Well, some of my games don't support SLI/Crossfire.

Anyway, 1440p is about 3.7M pixels while 5760x1080 is about 6.2M, so there's quite a bit more work being done there, and we see the Fury scale better at high res, so maybe the 295X2 performs relatively worse.
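For reference, the raw pixel counts behind each resolution are easy to verify (a quick sketch of the arithmetic):

```python
# Quick sanity check of the raw pixel counts behind each resolution.

def megapixels(width, height):
    return width * height / 1e6

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "5760x1080": (5760, 1080),
    "4K UHD": (3840, 2160),
}.items():
    print(f"{name}: {megapixels(w, h):.1f} MPix")
# 1440p: 3.7 MPix; 5760x1080: 6.2 MPix -- about 69% more pixels.
```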


----------



## littlestereo

Droppin off these Fury X Firestrike Ultra 4k benches and a quick synopsis of my experience with the card since last Friday (6/26 overnight shipped from Newegg, placed order on 6/24 at 2:25pm EST right after the auto-notify came in) .

http://www.3dmark.com/compare/fs/5276383/fs/5256512

*Benchmarks/ OC*: The card is an XFX Fury X OC'd with MSI Afterburner (Driver = 15.15.1004-150619a-185674E), and one of the lucky few without any coil whine. I was able to overclock the memory to ~650MHz max, but that created heavy artifacting. It runs stable with a +80-100MHz memory overclock and the core pushed +85MHz. Of the scores, the 4225 was with a 2500K pushed to 4.8GHz and the Fury X at 1135MHz core with 600MHz memory; the 3816 was with the CPU at 4.4GHz and the Fury X at stock clocks.

*Gaming:* Playing Tomb Raider, Assassin's Creed: Black Flag, GRID Autosport, Grey Goo, and Lego Pirates of the Caribbean at 45-60FPS with no dips below 35 at 4K (VSR). To get these frame rates my settings are AA off, shadows set to high (no soft shadows, HBAO set to - or low), everything else Max/Ultra. Coming from an EVGA ACX 2.1 SC GTX 970, the biggest difference to me is the change in temps and noise. Even at 100% the card feels like it's not being pushed very hard, based on the sound and heat output. It certainly feels like there's a ton of headroom for overvolting; I just hope the chip can handle it and that the artifacting at higher clocks can be corrected with some more voltage.

*Misc:* Another thing I've noticed is that the color quality when paired with an IPS monitor is incredible. I've been playing around with the settings in CCC as well as monitor calibrations, and scenery has a higher dynamic range, which subtly makes for a more natural look than any of my previous cards. The stock calibrations feel a bit too bright for TN panels, and maybe a bit oversaturated, but that can easily be turned down with a bit of gamma correction.
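For context on what a memory overclock buys on Fiji, HBM1 bandwidth follows directly from the 4096-bit bus. This is my own back-of-the-envelope arithmetic, assuming the usual double-data-rate signalling, not a vendor figure:

```python
# Back-of-the-envelope HBM1 bandwidth on Fiji: 4096-bit bus, double
# data rate. (Illustrative arithmetic, not a vendor figure.)

def hbm_bandwidth_gbs(mem_clock_mhz, bus_width_bits=4096):
    # bytes/s = clock * 2 (DDR) * bus width in bytes; returned in GB/s.
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

print(hbm_bandwidth_gbs(500))  # stock 500 MHz -> 512.0 GB/s
print(hbm_bandwidth_gbs(600))  # 600 MHz OC   -> 614.4 GB/s (+20%)
```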


----------



## Ganf

Quote:


> Originally Posted by *the9quad*
> 
> Fury is consistently behind the 295x2. 295x2 isn't much different than 2 290x's. I am just curious how he maintains everything maxed out in gta v and 60 fps in the countryside at that res.
> 
> 
> 
> here is a quote about gtav at 1440p and the furyx:
> 
> "Well, this isn't a great start for the AMD Fury X. _It is barely outperforming the Radeon R9 290X_ at 2560x1440 with our image quality settings shown above. Both the GTX 980 Ti and the Titan X are able to hit 60 FPS while the Fury X is only able to average 45 FPS or so. During gameplay as well as in the benchmark there were several instances of stutter and freezes on AMD's newest flagship"


Screwed up review. Pretty much everyone else showed they could get a solid 60fps in the game.

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-4.html

http://www.techspot.com/review/1024-and-radeon-r9-fury-x/page7.html

http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,16.html

http://www.bit-tech.net/hardware/graphics/2015/06/24/amd-radeon-r9-fury-x-review/7

And there's more examples, but that isn't to say they didn't screw up their reviews in other places.

I'm so sick of reading reviews that are inconsistent, or that just flat-out publish obviously failed test results and say "Nerp, don't know what happened there, probably drivers or something. Good luck!"


----------



## the9quad

Quote:


> Originally Posted by *Ganf*
> 
> Screwed up review. Pretty much everyone else showed they could get a solid 60fps in the game.
> 
> http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-4.html
> 
> http://www.techspot.com/review/1024-and-radeon-r9-fury-x/page7.html
> 
> http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,16.html
> 
> http://www.bit-tech.net/hardware/graphics/2015/06/24/amd-radeon-r9-fury-x-review/7
> 
> And there's more examples, but that isn't to say they didn't screw up their reviews in other places.
> 
> I'm so sick of reading reviews that're inconsistent or just flat out publishing obviously failed test results and saying "Nerp, Don't know what happened there, probably drivers or something. Good luck!"


Not sure what you are saying; that is what I said. It *averages* around 60 fps at 1440p, and that isn't even maxed.


----------



## Ganf

Quote:


> Originally Posted by *the9quad*
> 
> not sure what you are saying, that is what i said. It *averages* around 60 fps at 1440p, and that isnt even maxed


Actually those reviews show it averaging 70 at stock.


----------



## kayan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I would think the $600 always in stock 295x2 with its 5632 shaders and aio cooling package would be flying off the shelf right now.....


Quote:


> Originally Posted by *bonami2*
> 
> well some of my game dont support sli crossfire.
> 
> Anyways 1440p is 2.7m while 5760x1080 is 4.2m pixel. there a bit more work done here and we see the fury scale better at high res so maybe 295x perform less


I am in the same boat. I currently use a 295X2 and it's great when there are Crossfire profiles; however, for the big game I want to play right now, Witcher 3, the profiles are meh.

I think I'll hold out for the Windows 10 drivers, and if they still don't fix the texture flicker in W3, I'm getting a Fury or Fury X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kayan*
> 
> I am in the same boat. I currently use a 295x2 and it's great, if there are xfire profiles, however the big game that I want to play right now the xfire profiles are meh, Witcher 3.
> 
> I think I'll hold out for Windows 10 drivers, and if they still don't fix the texture flicker in W3 I'm getting a Fury or Fury X.


295X to fury?

Profiles?

Patience.... Please... 5632 shaders vs 4096 shaders.... Think about it!

If you aren't getting 2 furies don't bother?

Sorry, I'm ranting


----------



## Bludge

Anyone have ideas for why AMD Catalyst Control Centre won't launch? June 20 drivers installed after a driver clean; all working fine, just no CCC launching.

Sent from my LG-D802T using Tapatalk


----------



## Casey Ryback

Quote:


> Originally Posted by *Bludge*
> 
> Anyone have ideas for wht amd control centre won't launch? June 20 drivers installed, after a driver clean, all working fine, just no ccc launching
> 
> Sent from my LG-D802T using Tapatalk


So it's not there at all?

(Clean drivers and reinstall - make sure you're restarting the pc between cleaning etc)

or you click the symbol and it won't open?

(Not sure how to fix - probably above method lol)


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 295X to fury?
> 
> Profiles?
> 
> Patience.... Please... 5632 shaders vs 4096 shaders.... Think about it!
> 
> If you aren't getting 2 furies don't bother?
> 
> Sorry, I'm ranting


Yeah, but that thing is from the middle ages, you know. You can't compare an old Tatra 8x8 36-litre motor with 400hp to a brand-new 6-litre diesel.


----------



## kayan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 295X to fury?
> 
> Profiles?
> 
> Patience.... Please... 5632 shaders vs 4096 shaders.... Think about it!
> 
> If you aren't getting 2 furies don't bother?
> 
> Sorry, I'm ranting


Quote:


> Originally Posted by *bonami2*
> 
> Yea but that thing is from middle age you know cant compare a old Tatra 8x8 36liter motor with 400hp with a brand new 6liter diesel


Don't get me wrong, I love my 295X2 for the games where Crossfire is used. But in other games, namely Witcher 3, where it either uses one of the 295X2's GPU cores (basically a 290X) or uses both but has very, very bad texture flickering, it's not really playable. The Fury kinda makes more sense: you don't have to wait for Crossfire profiles to take advantage of the better performance.

So, like I said, I am waiting, but the game has been out for over a month already and it still doesn't work. I'm sure they won't release more drivers for anything besides W10, hence my previous statement about waiting for that before doing anything. Shader cores aren't everything, you know...


----------



## Gregster

I felt it only fair that I give the Fury X a review.




I waffle a bit but I feel it is fair


----------



## Agent Smith1984

Quote:


> Originally Posted by *bonami2*
> 
> Yea but that thing is from middle age you know cant compare a old Tatra 8x8 36liter motor with 400hp with a brand new 6liter diesel


That's hilarious! Lol

But seriously... we all know the architecture is not improved on a per-shader basis. The Fury X is even rocking the same 64 ROPs, while the 295X2 has 128 ROPs, 5632 shaders, and a combined 1024-bit memory bus at 1250MHz+ (with OC)...

Just saying... want a single-card solution that kills everything for $600?

295X2, baby!

But if Crossfire pinches your nerves, obviously the Fury X is a good alternative....
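For a rough sense of the shader-throughput gap being argued here, peak single-precision FLOPS can be sketched from shader count and clock. These are my own ballpark numbers using the cards' stock/boost clocks, and peak FLOPS is only one axis of performance:

```python
# Rough peak single-precision throughput from shader count and clock,
# counting 2 FLOPs (one fused multiply-add) per shader per cycle.
# Clocks are stock/boost figures, so treat these as ballpark numbers.

def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(peak_tflops(4096, 1050))  # Fury X: ~8.6 TFLOPS
print(peak_tflops(5632, 1018))  # 295X2 (both GPUs combined): ~11.5 TFLOPS
```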


----------



## Casey Ryback

Quote:


> Originally Posted by *Gregster*
> 
> I felt it only fair that I give the Fury X a review.
> 
> I waffle a bit but I feel it is fair












Hitting the highest benchmark scores doesn't really matter; if it can play ultra at 1440p smoothly, it's a solid card. (I agree, though, that a slightly lower price would make it a hit.)

No coil whine or loud pump noise with your card?


----------



## Minotaurtoo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's hilarious! Lol
> 
> But seriously.. We all know the architecture is not improved on a per shader basis... It's even rocking the same 64 rop.... 128 rop, 5632 shaders, 1024 bit bus at 1250+ (with oc)...
> 
> Just saying... Want a single card solution that kills everything for $600?
> 
> 295X2 baby!
> 
> But if cf pinches your nerves, obviously furyx is a good alternative....


CF is pinching my nerves... badly. I've got three games that won't even use it, one that just craps itself, and two that have Crossfire stutter or weird graphics issues like misplaced grass floating in the air. Most do OK, but it just happens that one with stutter does OK on one card, only at about 40fps. Well... the Fury X is just a tad stronger than my two cards together, and the 390X is a tad weaker, so I don't want to step back... Fury X it is.


----------



## bonami2

Quote:


> Originally Posted by *Minotaurtoo*
> 
> cf is pinching my nerves... bad I've got 3 games that won't even use it... one that just craps itself... and two that have crossfire stutter or weird graphics issues like misplaced grass floating in the air... most do ok.. but just happens that one that has stutter does ok on one card, but only at about 40fps... well.. fury x is just a tad bit stronger them my two cards together.. 390x is a tad weaker so I don't want to back up... fury x it is.


BeamNG at 30fps; with a Fury X I'll drift like mad, G27 power, at like 60fps.

The game supposedly doesn't support Crossfire, and maybe not SLI either, some users said. So I have no idea, but it's a CPU-bound game like Arma 3, so most people find it doesn't help; the CPU is melting at 100% on one core.

Anyone want to try Crossfire in that game for me at high res?


----------



## Bludge

Quote:


> Originally Posted by *Casey Ryback*
> 
> So it's not there at all?
> 
> (Clean drivers and reinstall - make sure you're restarting the pc between cleaning etc)
> 
> or you click the symbol and it won't open?
> 
> (Not sure how to fix - probably above method lol)


It's all installed into the directories etc., but it doesn't load on startup and just gives me the Windows "thinking about it" circle for a few seconds when double-clicked. Installing over the top did nothing; I'll strip it all out and try again.


----------



## Yorkston

Well I got mine from Newegg today, Sapphire model. Sticker version, bad pump whine, and a poor clocker to boot. This one is gonna get RMA'd.


----------



## rv8000

Quote:


> Originally Posted by *Yorkston*
> 
> Well I got mine from Newegg today, Sapphire model. Sticker version, bad pump whine, and a poor clocker to boot. This one is gonna get RMA'd.


When did you order it?


----------



## Yorkston

Quote:


> Originally Posted by *rv8000*
> 
> When did you order it?


Tuesday


----------



## rv8000

Quote:


> Originally Posted by *Yorkston*
> 
> Tuesday


Hmmm, I truly hope the one I ordered today doesn't have the issue. The other one I'm returning was ordered on release day: sticker on the pump and bad whine.


----------



## Ganf

Quote:


> Originally Posted by *Yorkston*
> 
> Well I got mine from Newegg today, Sapphire model. Sticker version, bad pump whine, and a poor clocker to boot. This one is gonna get RMA'd.


Sapphire probably bought the lion's share of the first batch. It might be worth advising people to avoid them and grab one of the other brands for a while.


----------



## rv8000

In any of the investigative articles, did reviewers ever mention whether the "fix" or "revised pump" was actually a physical change to parts (aside from the AIO unit cover), and not something simple like a BIOS update to adjust the pump speed?


----------



## p4inkill3r

Quote:


> Originally Posted by *Gregster*
> 
> I felt it only fair that I give the Fury X a review.
> 
> 
> 
> 
> I waffle a bit but I feel it is fair


Nice review.


----------



## xer0h0ur

Nope. I also doubt AMD will tell anyone what it was for the sake of not throwing a partner under the bus.


----------



## blue1512

Quote:


> Originally Posted by *littlestereo*
> 
> Droppin off these Fury X Firestrike Ultra 4k benches and a quick synopsis of my experience with the card since last Friday (6/26 overnight shipped from Newegg, placed order on 6/24 at 2:25pm EST right after the auto-notify came in) .
> 
> http://www.3dmark.com/compare/fs/5276383/fs/5256512
> 
> *Benchmarks/ OC*: The card is an XFX Fury X OC'd with MSI Afterburner (Driver = 15.15.1004-150619a-185674E) and one of the lucky few without any coil whine. I was able to overclock the memory to ~650 max but that created heavy artifacting. It runs stable with a +80-100MHz memory overclock and core pushed +85MHz. On the scores the 4225 was with a 2500k pushed to 4.8GHz and the Fury X at 1135 MHz core with 600 MHz memory. The 3816 was with CPU at 4.4 GHZ and Fury X stock clocks.
> 
> *Gaming:* Playing Tomb Raider, Assassins Creed: Black Flag, GRID Autosport, Grey Goo, Lego Pirate's of the Carribean at 45-60FPS with no dips below 35 at 4K (VSR). To get these frame rates my settings are AA off, shadows set high (no soft shadows, HBAO set to - or low) everything else Max/Ultra. Coming from an EVGA ACX 2.1 SC GTX970 the biggest difference to me is the change in temps and noise. The card feels like even at 100% it's not being pushed very hard based on the sound and heat output. Certainly feels like there's a ton of headroom for overvolting I just hope the chip can handle it and the artifacting at higher clocks can be corrected with some more voltage.
> 
> *Misc:* Another thing I've noticed is the color quality when paired with an IPS monitor is incredible. I've been playing around with the settings in CCC as well as monitor calibrations and scenery has higher dynamic range which subtly makes for a more natural look than any of my previous cards. The stock calibrations feel a bight too bright for TN panels and maybe a bit oversaturated but can be easily turned down and a bit of gamma correction.


Another lucky guy with 600MHz.

On another note, the image rendered by Nvidia is too pale IMHO. AMD is the only choice for an IPS screen, at least for me.


----------



## Bludge

Quote:


> Originally Posted by *Ganf*
> 
> Sapphire probably bought the Lion's share of the first batch. It might be worth advising people to avoid them and grab one of the other brands for a while.


Aren't Sapphire the main OEM builder for AMD ref. cards anyway?


----------



## magicc8ball

Quote:


> Originally Posted by *Bludge*
> 
> Aren't Sapphire the main OEM builder for AMD ref. cards anyway?


I believe it is Sapphire.


----------



## Silent Scone

lol - yes, Sapphire are the OEM for AMD, and have been for a fair few years.

Here's mine at 1110/500. I don't believe it's too bad, all things considered, for a 4690K.

http://www.3dmark.com/3dm/7597042


----------



## en9dmp

Who else here is running a Sapphire card? I can't seem to run any overclock at all on either core or memory... even a 2% core OC will crash in Fire Strike or any game.


----------



## looncraz

Quote:


> Originally Posted by *littlestereo*
> 
> http://www.3dmark.com/compare/fs/5276383/fs/5256512


So, why did you use different CPU clocks for the two benchmarks? You have to keep things even for an accurate comparison.


----------



## looncraz

Can someone post up the Fury X's BIOS? I've been digging around in my 290's BIOS quite a bit and would be quite interested in what the Fury X has held within.


----------



## tx12

BIOS already posted several times:

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/1360#post_24088811
http://forums.guru3d.com/showpost.php?p=5108888&postcount=51

BIOS tools:
http://www.overclock.net/t/1562998/amd-fiji-atiflash-tool


----------



## KnightWolf654

Quote:


> Originally Posted by *the9quad*
> 
> I had no idea one furyx @ 5760x1080 was faster than 3 290x's @1440p. I cant get a SOLID 60 fps with *everything* turned up in GTA V at 1440P, in fact reviews have the furyx _AVERAGING_ 72 fps at 1440p, and that is not with "everything" turned up. How did you get yours to get so fast pushing more pixels?


You got me on the GTA5 thing; I must have originally tested it with a few of the settings turned down. The benchmark revealed an avg of 45fps with everything turned up.


----------



## Sgt Bilko

Looks like Asus designed the new Strix cooler for the Fury judging by the heatsink/die contact on the 980Ti version:


----------



## royfrosty

All right, let's start off with gaming benchmarks.

Note that I have bumped the core clock to 1110MHz and the memory clock to 570MHz.

The results are somewhat surprising.

http://s995.photobucket.com/user/royfrosty/media/Far Cry 4_zps1juc9oaf.png.html

http://s995.photobucket.com/user/royfrosty/media/Hitman_zpsyeeq2ckk.png.html

http://s995.photobucket.com/user/royfrosty/media/Thief_zpsqcbp2j4z.png.html

http://s995.photobucket.com/user/royfrosty/media/SoM_zpsjq4rw3is.png.html

http://s995.photobucket.com/user/royfrosty/media/Tomb Raider_zpsvvrexr9i.png.html

http://s995.photobucket.com/user/royfrosty/media/Unigine_zpssaidgclo.png.html

http://s995.photobucket.com/user/royfrosty/media/Firestrike_zps0refmlym.png.html

http://s995.photobucket.com/user/royfrosty/media/Firestrike Extreme_zpswskgeiqw.png.html

http://s995.photobucket.com/user/royfrosty/media/Firestrike Ultra_zps7hgio71m.png.html

http://s995.photobucket.com/user/royfrosty/media/Overlook_zps4wchpyke.png.html

Credit to TomsHardware, Guru3D, AnandTech and OC3D for the comparison scores.

Here is my test bench specs

MSI Z97S SLI Plus + i7-4770k
Avexir Raiden Series 2x4gb 2133mhz Dual Channel Kit
Crucial m550 512gb SSD
Sapphire r9 Fury X
Superflower Leadex 1000w 80+ Platinum Full Modular


----------



## Newbie2009

Not bad at all


----------



## p4inkill3r

Quote:


> Originally Posted by *royfrosty*
> 
> All right lets start of with Gaming Benchmarks.
> 
> Note that i have bumped the core clock to 1110mhz and 570mhz Mem Clock.
> 
> Results are somewhat rather surprising.
> 
> Credit to TomsHardware, Guru3D, Anandtech and OC3D scores.
> 
> Here is my test bench specs
> 
> MSI Z97S SLI Plus + i7-4770k
> Avexir Raiden Series 2x4gb 2133mhz Dual Channel Kit
> Crucial m550 512gb SSD
> Sapphire r9 Fury X
> Superflower Leadex 1000w 80+ Platinum Full Modular


Those are some excellent results!


----------



## Neon Lights

Quote:


> Originally Posted by *blue1512*
> 
> Another lucky guy with 600Mhz


To clarify: As long as one enables the Unofficial Overclocking Mode in MSI Afterburner, it should be possible to overclock the memory.


----------



## Osjur

Aah, got my card as well. Oh boy, am I excited to finally get the PLP Eyefinity support I've been waiting on forever. And it works!


----------



## blue1512

Quote:


> Originally Posted by *Neon Lights*
> 
> To clarify: As long as one enables the Unofficial Overclocking Mode in MSI Afterburner, it should be possible to overclock the memory.


Hey, that's old news already. The 'luck' lies in the 600MHz, because not many cards are able to reach that clock.


----------



## josephimports

Second Sapphire from Newegg just arrived. This one was ordered Tuesday.



Spoiler: Warning: Spoiler!







Fortunately, my launch day Sapphire exhibits no pump noise.


----------



## hyp36rmax

*Source: *Link

Finally some direct Crossfire comparisons with Titan X and GTX 980 Ti SLI! As AMD's focus on 4K is apparent, you can really see the R9 FURY X shine in 4K Crossfire, going neck and neck with Nvidia's offerings.

















I'd also like to see some more Nvidia focused titles such as Project Cars and the Witcher 3.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hyp36rmax*
> 
> *Source: *Link
> 
> Finally some direct crossfire comparisons with a Titan X and GTX 980Ti SLI! AS AMD's focus on 4k is apparent you can really see the R9 FURY X shine in 4K Crossfire going neck and neck with Nvidia's offering.


CFX is a beast once it works.


----------



## Ganf

Lol at the minimum frames on Metro: LL. It takes at least $1300 worth of GPU to get out of the single digits.


----------



## blue1512

Quote:


> Originally Posted by *Ganf*
> 
> Lol at the minimum frames on Metro: LL. It takes at least $1300 worth of GPU to get out of the single digits.


And $2000 worth of GPU for anyone in the green ecosystem. They still have to deal with the IQ, by the way.


----------



## Agent Smith1984

Good Lord, pretty obvious what AMD did with the Fury X now...
They built it for 4K Crossfire!

Seems like GPUs are directly targeted at different resolution markets now...

390 for 1080p, 390X for 1440p, and Fury for 4K

970 for 1080p, 980 for 1440p, 980 Ti for 4K

I think nvidia and amd are both doing great jobs right now.

Especially after seeing those benches.

TPU & TT are both without a doubt my favorite review sites, and they never come across as amd or nvidia haters.


----------



## provost

Quote:


> Originally Posted by *Gregster*
> 
> I realised late that the Titan X was only recording at 12mbs instead of the 30mbs I used for the Fury X doh!!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anyways, I did it again.


I am still liking the "vibrancy" of the Fury X better.


----------



## Gdourado

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good Lord, pretty obvious what amd did with the fury x now...
> They built it for 4k crossfire!
> 
> Seems like gpu's are directly targeted at different resolution markets now...
> 
> 390 for 1080, 390x for 1440, and fury for 4k
> 
> 970 for 1080, 980 for 1440, 980ti for 4k
> 
> I think nvidia and amd are both doing great jobs right now.
> 
> Especially after seeing those benches.
> 
> TPU & TT are both without a doubt my favorite review sites, and they never come across as amd or nvidia haters.


Is the Fury overkill for 1440p?
Should I save $300 and just get a 390X from MSI?


----------



## Blackops_2

Quote:


> Originally Posted by *Gdourado*
> 
> Is the fury overkill for 1440p?
> Should I save 300 and just get a 390X from msi?


There is no such thing as overkill for a constantly changing market. Get what you can afford. There will always be more demanding games.


----------



## Thoth420

Quote:


> Originally Posted by *Gdourado*
> 
> Is the fury overkill for 1440p?
> Should I save 300 and just get a 390X from msi?


1440p 60Hz, or 1440p 120Hz+?

Either way, I'd say it isn't overkill. Definitely not if you plan on playing on a high-refresh 1440p panel.
Also, how long before you plan to upgrade?


----------



## littlestereo

Quote:


> Originally Posted by *looncraz*
> 
> So, why did you use CPU clocks for the later benchmarks? Have to keep things even for an accurate comparison.


Most Firestrike Ultra Fury X benches I've seen are from 5960X systems or 4790Ks, so to get a comparable score with minimal CPU limitation I maxed them both out. The comparison was meant to show how the Fury X responds to overclocking with CPU + GPU combined vs. more conservative clocks on both. I'll post benches with a constant CPU clock while OC'ing the Fury X to different levels later today for a strictly GPU-vs-GPU OC comparison.


----------



## BIGTom

XFX Radeon Fury X delivered this morning. Upgraded from a 7970 GHz Edition. Installation was a cinch. Excellent performance on a 3440x1440 ultrawide monitor. I am amazed at how cool and quiet the card runs.

Thanks to everyone for sharing their experiences; I couldn't be more pleased with the Fury X.


----------



## josephimports

Fury X CFX 3DMark Ultra 4K











Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!


----------



## wiak

Quote:


> Originally Posted by *Gregster*
> 
> Thought I would test the Fury X against the Titan X in my system. Both using the same settings and same CPU and the results are not great for the Fury X in truth. Mantle was even slower than DX11, which a few people have attested to.


PSA: Mantle has been deprecated; use Vulkan/DX12 instead. AMD has not optimized it for the Fury X. Mantle was always a test API.
Quote:


> Originally Posted by *HellBoundgr*
> 
> I'm so ready for 4K =) Got my Crossfire cards from XFX today and a new monitor/PSU. Sadly I got the cards with the sticker, but I can't hear any coil noise, so I'm happy with them anyway. But yeah, I need a new tower or need to do something. I have the NZXT 4400 now and have put the radiators at the front, so now there's no room for the hard drives. Right now I've run a SATA cable/power to the outside so I can connect my Steam HDD and play, hehe. I could put one radiator at the top, but I really want some system fans too. Time to try out some games; I've never tried Crossfire before, and I've only used ATI/AMD since 1997 =p
> 
> Cheers.
> 
> Pictures:


It's FuryFire, mind you

Soundtrack for Fury


----------



## kayan

Quote:


> Originally Posted by *BIGTom*
> 
> XFX Radeon Fury X delivered this morning. Upgraded from a 7970Ghz. Installation was a cinch. Excellent performance on 3440x1440 ultra wide monitor. I am amazed at how cool and quiet the card is running.
> 
> Thanks to everyone for sharing their experiences, I couldn't be more pleased with the Fury X


Nice. Do you have Witcher 3? If so, can you max everything but AA and Hairworks and tell me the FPS @ your native Resolution? Also same but including AA. If so, I'd be super appreciative!


----------



## StereoPixel

Fury X Mantle Numbers



http://www.golem.de/news/grafikkarte-auch-fury-x-rechnet-mit-der-mantle-schnittstelle-flotter-1507-115005.html


----------



## wiak

Quote:


> Originally Posted by *littlestereo*
> 
> Droppin off these Fury X Firestrike Ultra 4k benches and a quick synopsis of my experience with the card since last Friday (6/26 overnight shipped from Newegg, placed order on 6/24 at 2:25pm EST right after the auto-notify came in) .
> 
> http://www.3dmark.com/compare/fs/5276383/fs/5256512
> 
> *Benchmarks/ OC*: The card is an XFX Fury X OC'd with MSI Afterburner (Driver = 15.15.1004-150619a-185674E) and one of the lucky few without any coil whine. I was able to overclock the memory to ~650 max but that created heavy artifacting. It runs stable with a +80-100MHz memory overclock and core pushed +85MHz. On the scores the 4225 was with a 2500k pushed to 4.8GHz and the Fury X at 1135 MHz core with 600 MHz memory. The 3816 was with CPU at 4.4 GHZ and Fury X stock clocks.
> 
> *Gaming:* Playing Tomb Raider, Assassin's Creed: Black Flag, GRID Autosport, Grey Goo, and Lego Pirates of the Caribbean at 45-60 FPS with no dips below 35 at 4K (VSR). To get these frame rates my settings are AA off, shadows set to high (no soft shadows, HBAO set to - or low), everything else Max/Ultra. Coming from an EVGA ACX 2.1 SC GTX 970, the biggest difference to me is the change in temps and noise. The card feels like even at 100% it's not being pushed very hard, based on the sound and heat output. It certainly feels like there's a ton of headroom for overvolting; I just hope the chip can handle it and the artifacting at higher clocks can be corrected with some more voltage.
> 
> *Misc:* Another thing I've noticed is that the color quality when paired with an IPS monitor is incredible. I've been playing around with the settings in CCC as well as monitor calibrations, and scenery has a higher dynamic range, which subtly makes for a more natural look than any of my previous cards. The stock calibrations feel a bit too bright for TN panels and maybe a bit oversaturated, but that can easily be turned down with a bit of gamma correction.


Eyefinity display controllers doing their work, eh? Pretty nice review there, mate.


----------



## Agent Smith1984

Microsoft wrote AMD a big check and said "kill mantle and give us the tech" lol

They said, "okay cool, we can build an hbm card now"


----------



## littlestereo

*Firestrike Ultra 4k - Fury X Overclocking*

These are my results after running Firestrike Ultra a couple dozen times with different clocks (CPU = i5-2500K at stock clock). The first figure is the core overclock (MHz), the second is the memory overclock (MHz). The best performance from my card appears to be +80 core, +100 mem. For max stability I'd recommend something around *+75, +75*.

My total system power draw at load, read from my UPS, is around 450 watts at stock clocks (including monitors). After OC'ing the core to 1125 MHz, power draw was up to 470 watts. Pushing the memory to 600 MHz with the core at 1125 MHz, draw was around 475-480 watts. With the memory OC'd to 600 and the core at stock, there was no noticeable difference in power draw (~450 W) from stock memory clocks.

*stock core, stock mem = 3734*

stock core, +50 mem = 3768
stock core, +75 mem = 3805
stock core, +100 mem = 3804

+50 core, stock mem = 3868
+70 core, stock mem = 3934
+85 core, stock mem = 3964

+50 core, +50 mem = 3908
+75 core, +75 mem = 4013
+80 core, +65 mem = 3983
+85 core, +80 mem = 4035
+75core , +100 mem = 4018
*+80 core, +100 mem = 4130*
+85 core, +100 mem = 4073

http://www.3dmark.com/compare/fs/5292892/fs/5293806/fs/5293842/fs/5293880/fs/5292962/fs/5293658/fs/5293012/fs/5293313/fs/5293427/fs/5293612/fs/5293539/fs/5293390/fs/5293567
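For a quick read on how those numbers scale, here's a small sanity-check script (scores and wattages copied from this post; the helper itself is just an illustration):

```python
# Firestrike Ultra graphics scores and wall-power readings from the post above.
STOCK_SCORE, STOCK_WATTS = 3734, 450
OC_SCORE, OC_WATTS = 4130, 480      # +80 core / +100 mem, worst-case power

def pct(new, old):
    """Percent change from old to new."""
    return 100.0 * (new - old) / old

score_gain = pct(OC_SCORE, STOCK_SCORE)   # ~10.6% more performance
power_cost = pct(OC_WATTS, STOCK_WATTS)   # ~6.7% more wall power

print(f"score: +{score_gain:.1f}%  power: +{power_cost:.1f}%")
print(f"points per watt: {STOCK_SCORE/STOCK_WATTS:.2f} -> {OC_SCORE/OC_WATTS:.2f}")
```

Keep in mind the wall-power figures are whole-system (monitors included), so the points-per-watt numbers are system-level, not GPU-only.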


----------



## neurotix

Do you guys have voltage control yet?

What kind of core clocks are you getting?


----------



## littlestereo

Quote:


> Originally Posted by *neurotix*
> 
> Do you guys have voltage control yet?
> 
> What kind of core clocks are you getting?


Max stable core I've gotten without artifacting is around 1130MHz.

Voltage hasn't been unlocked yet, word on the street is Sapphire TriXX will have it unlocked before the Air Fury launch in the next week or two


----------



## Gregster

Right chaps, the best quality I could get. Recorded at 1080P with 60 fps using Raptr for AMD and ShadowPlay for Nvidia.

Again, nothing has been touched on either (not even the colours on the AMD side of things).

DDU - Installed AMD drivers, recorded with Raptr
DDU - Installed Nvidia drivers, recorded with ShadowPlay
Nothing changed anywhere - recorded at 30 Mbps and 1080P on both, set for 60 fps recording.

I didn't even alter the colour settings this time for AMD and left everything at default. Thoughts?


----------



## Ceadderman

That Fury X...









Can't even tell much of a difference, if at all, in FPS, even though the counter shows the Titan running a 30 fps advantage over the Fury.

Color saturation looks more vivid with the Fury X, and it's not so washed out in the light. The Titan looked reasonable, but there was minor color fade occasionally; there was none with the Fury X.

AMD really outdid themselves with this one, imho.









~Ceadder


----------



## BackwoodsNC

Quote:


> Originally Posted by *littlestereo*
> 
> Max stable core I've gotten without artifacting is around 1130MHz.
> 
> Voltage hasn't been unlocked yet, word on the street is Sapphire TriXX will have it unlocked before the Air Fury launch in the next week or two


Where did you hear that?


----------



## Gregster

Quote:


> Originally Posted by *Ceadderman*
> 
> That Fury X...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't even tell much of a difference if at all in FPS even though the counter shows that Titan is running a 30fps advantage over Fury.
> 
> Color saturation look more vivid with Fury X. Not so washed out in the light with Fury X. I could see that Titan looked reasonable but there was minor color fade occasionally. There was none with Fury X.
> 
> AMD really outdid themselves with this one imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I also did a run with changing the NVCP to "Use my preference emphasising performance" over the default "let the 3D application decide".


----------



## Ceadderman

^I think I just soiled myself.









Looked just as good, and the Fury X even passed Titan for fps in a good part of that run?

I bet team Green HQ is frantically running about trying to get HBM off their drawing boards and onto their next card.


















~Ceadder


----------



## Ganf

Quote:


> Originally Posted by *Gregster*
> 
> Right chaps, the best quality I could get. Recorded at 1080P with 60fps using Raptr for AMD and *ShadowPlay* for Nvidia.


I've seen a lot of long-time YTers say they stopped using ShadowPlay because its compression does not mesh well with YT's compression algorithms and makes a lot of things look like a hot mess.

No official source, just random comments by people in their videos that would take me a week to relocate. TB mentioned it once; that's the only one I can name-drop.


----------



## Gregster

Quote:


> Originally Posted by *Ceadderman*
> 
> ^I think I just soiled myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looked just as good, and the Fury X even passed Titan for fps in a good part of that run?
> 
> I bet team Green HQ is frantically running about trying to get HBM off their drawing boards and onto their next card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Ermmm, that video was both Nvidia runs









One is with the default profile that is set when you just install the drivers and the other is when you change the 3D settings to "prefer max quality"









Quote:


> Originally Posted by *Ganf*
> 
> I've seen a lot of long-time YTers say they stopped using ShadowPlay because its compression does not mesh well with YT's compression algorithms and makes a lot of things look like a hot mess.
> 
> No official source, just random comments by people in their videos that would take me a week to relocate. TB mentioned it once; that's the only one I can name-drop.


The early ShadowPlay was pretty poor, and I have been using it pretty much from the off, but the settings and quality have improved a lot now and you can record 2160P at 60 fps if you want. I hope GVR gets some good options as well, as I would love to record at 1440P, but sadly only 1080P is an option.


----------



## Ganf

Quote:


> Originally Posted by *Gregster*
> 
> Ermmm, that video was both Nvidia runs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One is with the default profile that is set when you just install the drivers and the other is when you change the 3D settings to "prefer max quality"
> 
> 
> 
> 
> 
> 
> 
> 
> The early ShadowPlay was pretty poor and I have been using it pretty much from the off but settings and quality have improved a lot now and you can record 2160P 60fps if you wanted. I hope GVR gets some good options as well, as I would love to record 1440P but sadly only 1080P is an option


Thanks for updating me.


----------



## Ceadderman

Haha that will certainly teach me to assume that it's straight like for like comparisons. I'm sure you can see how that confused me.









~Ceadder


----------



## GorillaSceptre

Quote:


> Originally Posted by *Gregster*
> 
> Ermmm, that video was both Nvidia runs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One is with the default profile that is set when you just install the drivers and the other is when you change the 3D settings to "prefer max quality"



















Nice work, I gave you a sub







That gun's a bit big though


----------



## Gregster

Quote:


> Originally Posted by *GorillaSceptre*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice work, i gave you a sub
> 
> 
> 
> 
> 
> 
> 
> That guns a bit big though


Thanks very much. Many more vids to come









I also expect performance to get closer now that I will be recording at 1440P for both, and I expect drivers will show some nice gains for the Fury X.


----------



## bonami2

People need to remember the Titan X has been out for some time, so the 980 Ti drivers are already polished.

The Fury X is out now with a new design; they need to build a new driver just for that architecture. I'm sure in 6 months it will perform better.


----------



## Gregster

Quote:


> Originally Posted by *bonami2*
> 
> People need to think the titan x is out for some time. = 980ti driver are already perfect
> 
> fury x is out now with new design they need to build a new driver just for that architecture now im sure in 6 month it will perform better


I agree, and performance will get better. Hopefully they will have it nailed within a couple of months. Another plus for the Fury is Windows 10, which I will be switching to on the 28th; performance is up in a lot of games on that OS for AMD, so good things to come.


----------



## Ceadderman

Quote:


> Originally Posted by *Gregster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bonami2*
> 
> People need to think the titan x is out for some time. = 980ti driver are already perfect
> 
> fury x is out now with new design they need to build a new driver just for that architecture now im sure in 6 month it will perform better
> 
> 
> 
> I agree and performance will get better. Hopefully they will have it nailed within a couple of months and another plus for the Fury is Windows 10, which I will be switching to on the 28th and performance is up in a lot of games on that OS for AMD, so good things to come.

Agreed.









~Ceadder


----------



## provost

Quote:


> Originally Posted by *royfrosty*
> 
> All right lets start of with Gaming Benchmarks.
> 
> Note that i have bumped the core clock to 1110mhz and 570mhz Mem Clock.
> 
> Results are somewhat rather surprising.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/royfrosty/media/Far Cry 4_zps1juc9oaf.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Hitman_zpsyeeq2ckk.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Thief_zpsqcbp2j4z.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/SoM_zpsjq4rw3is.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Tomb Raider_zpsvvrexr9i.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Unigine_zpssaidgclo.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Firestrike_zps0refmlym.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Firestrike Extreme_zpswskgeiqw.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Firestrike Ultra_zps7hgio71m.png.html
> 
> http://s995.photobucket.com/user/royfrosty/media/Overlook_zps4wchpyke.png.html
> 
> Credit to TomsHardware, Guru3D, Anandtech and OC3D scores.
> 
> Here is my test bench specs
> 
> MSI Z97S SLI Plus + i7-4770k
> Avexir Raiden Series 2x4gb 2133mhz Dual Channel Kit
> Crucial m550 512gb SSD
> Sapphire r9 Fury X
> Superflower Leadex 1000w 80+ Platinum Full Modular


Nice OC results, and thanks for sharing.

But, with all due respect to the newbie members, do we have any data from the old-timer benchmarkers here, or have they not received their cards yet?


----------



## Mad Pistol

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Right chaps, the best quality I could get. Recorded at 1080P with 60 fps using Raptr for AMD and ShadowPlay for Nvidia.
> 
> Again, nothing has been touched on either (not even the colours on the AMD side of things).
> 
> DDU - Installed AMD drivers, recorded with Raptr
> DDU - Installed Nvidia drivers, recorded with ShadowPlay
> Nothing changed anywhere - recorded at 30 Mbps and 1080P on both, set for 60 fps recording.
> 
> I didn't even alter the colour settings this time for AMD and left everything at default. Thoughts?


I'm in the process of uploading a video of me doing the same thing with the same settings on my GTX 780 and shadowplay.

Standby.

EDIT: Done




Ultra settings, 1080P, 60 FPS. I checked the Nvidia Control Panel, and the setting for texture filtering is "quality." I turned on the on-screen Nvidia FPS counter, but it didn't show up in the video.







It was between 65-100 FPS for this test.

Comparison Screenshot:



I hate to burst your bubble, guys, but I'm pretty sure this is an issue with YouTube's compression of videos recorded with ShadowPlay. The in-game quality is far superior to what you see in my video.

I can post a screenshot of any of those areas to prove my point. It's ShadowPlay, not the Titan X.


----------



## xer0h0ur

http://wccftech.com/unlock-memory-overclocking-amd-r9-fury/

"Moving over to over-volting on the Fury X. The current lack of voltage control doesn't mean that the voltage on Fury X cards is locked. This is simply the case because no 3rd party software has been updated to recognize the voltage regulator of the card so far, which is necessary to enable voltage control on Fury X cards. Fortunately we've heard that an update for Sapphire's Trixx overclocking software is in the works. The update aims to address this exact issue by unlocking voltage control on Fiji based cards, including the Fury X and Fury."

"The overclocking guru aka Unwinder who's responsible for the RivaTuner GPU overclocking software that forms the backbone of almost all 3rd party overclocking tools did not receive a Fury X card at launch. Which is why we've yet to see an update come out for MSI Afterburner as of yet that enables voltage adjustments on AMD Radeon R9 Fury X cards."


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> http://wccftech.com/unlock-memory-overclocking-amd-r9-fury/
> 
> "Moving over to over-volting on the Fury X. The current lack of voltage control doesn't mean that the voltage on Fury X cards is locked. This is simply the case because no 3rd party software has been updated to recognize the voltage regulator of the card so far, which is necessary to enable voltage control on Fury X cards. Fortunately we've heard that an update for Sapphire's Trixx overclocking software is in the works. The update aims to address this exact issue by unlocking voltage control on Fiji based cards, including the Fury X and Fury."
> 
> "The overclocking guru aka Unwinder who's responsible for the RivaTuner GPU overclocking software that forms the backbone of almost all 3rd party overclocking tools did not receive a Fury X card at launch. Which is why we've yet to see an update come out for MSI Afterburner as of yet that enables voltage adjustments on AMD Radeon R9 Fury X cards."


A Trixx update? You are dreaming.

This software is the worst piece of crap I have ever seen. I'm using a version 3 or 4 releases older than the current one because all of the newer ones crash when I press Settings, and they always have, across about 4 Windows 8/8.1 installs.

Gonna stick with MSI Afterburner.

Asus and EVGA are good too, from what I remember.

BTW, I heard from a French review that enabling the advanced overclocking option in MSI Afterburner allows HBM overclocking.

Now we just need an update for voltage; my 7950 is only supported in Trixx because Sapphire used an unknown voltage controller or something.


----------



## josephimports

3DMark Ultra CFX

Stock
Graphics score 7472

OC 1125/570*
Graphics score 8030



Spoiler: Warning: Spoiler!






*Memory OC on GPU1 only
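That works out to roughly a 7.5% graphics-score gain from roughly a 7% core-clock bump, i.e. near-linear scaling. A quick check (assuming the reference Fury X's 1050 MHz stock core clock; the snippet is just for illustration):

```python
# CrossFire Firestrike Ultra graphics scores from the post above.
stock_score, oc_score = 7472, 8030
stock_core, oc_core = 1050, 1125   # MHz; 1050 is the reference Fury X core clock

score_gain = 100.0 * (oc_score - stock_score) / stock_score
clock_gain = 100.0 * (oc_core - stock_core) / stock_core

# Near-linear scaling: the score gain tracks the core-clock gain closely.
print(f"core clock: +{clock_gain:.1f}%  graphics score: +{score_gain:.1f}%")
```

Note the memory OC was only applied on GPU1, so the scaling here is essentially core-clock driven.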


----------



## Osjur

Quote:


> Originally Posted by *Mad Pistol*
> 
> I can post a screen shot of any of those areas to prove my point. It's shadowplay, not the Titan X.


Please do because this shows that there is a difference. These screenshots were taken from the game:


----------



## bonami2

Quote:


> Originally Posted by *Osjur*
> 
> Please do because this shows that there is a difference. These screenshots were taken from the game:


All I can say is the AMD image is clear and sharp on my 1080p 23" IPS, while the Titan is blurry, so maybe he's just running faster or something.


----------



## Mad Pistol

Quote:


> Originally Posted by *Osjur*
> 
> Please do because this shows that there is a difference. These screenshots were taken from the game:




From my 780.

Admittedly, my 780 does seem to show quality closer to the Fury X than the Titan X, IMO.

Maybe it is time to investigate this further.


----------



## Blameless

Quote:


> Originally Posted by *Mad Pistol*
> 
> I hate to burst your bubble guys, but I'm pretty sure this is an issue with Youtube compression of videos recorded in Shadowplay. The in-game quality is far superior to that seen in my video.


Irrelevant, at least with regards to the filtering issue.

Your texture filtering is fine, and this is very clearly so in your video. It would be clear in a 320x240, 256-color, animated .gif.
Quote:


> Originally Posted by *bonami2*
> 
> All i can says is the amd image is clear and sharp on my 1080p 23" ips
> 
> While the titan is blurry so maybe he just running faster or something


The video clearly shows significant differences in texture filtering. Has nothing to do with blur, or running speed, or encode quality.

Either there are settings that are different between the two cards or the drivers aren't applying the correct level of filtering.
Quote:


> Originally Posted by *Mad Pistol*
> 
> 
> 
> From my 780.


That's what it should look like, and what I'd expect from any GPU, 980Ti/Titan X included, unless someone was forcing AF disabled in the drivers or the drivers are limiting AF.


----------



## Ceadderman

@bonami... EVGA is strictly an Nvidia manufacturer; they have nothing to do with AMD. So I get using ASUS and MSI as reliable software sources, but EVGA?









~Ceadder


----------



## Tivan

Quote:


> Originally Posted by *Mad Pistol*
> 
> 
> 
> From my 780.
> 
> Admittedly, my 780 does seem to show quality closer to the Fury X than the Titan X, IMO.
> 
> Maybe it is time to investigate this further.


Maybe the lack of this kind of 'optimization' is what's putting the 700 series behind in some tests vs 900 series. (though there's also some actual feature advantage of course)

Of course just my wild guess on this topic!

But yeah doing some 700 vs 900 testing might be interesting in select games

edit:
Quote:


> Originally Posted by *Osjur*
> 
> Please do because this shows that there is a difference. These screenshots were taken from the game:


With the High Quality setting, the TX looks a lot like the 700 series/FuryX level quality, hmm. (maybe even more defined in the distance vs the 780 shot)


----------



## Mad Pistol

Quote:


> Originally Posted by *Tivan*
> 
> Maybe the lack of this kind of 'optimization' is what's putting the 700 series behind in some tests vs 900 series. (though there's also some actual feature advantage of course)
> 
> Of course just my wild guess on this topic!
> 
> But yeah doing some 700 vs 900 testing might be interesting in select games
> 
> edit:
> With the High Quality setting, the TX looks a lot like the 700 series/FuryX level quality, hmm. (maybe even more defined in the distance vs the 780 shot)


My thoughts exactly. It begs the question of what the 780 would look like at "high quality" in the Texture Filtering mode.

Off-topic, I'm creating a new thread for this over in the Graphics Cards - General section of the forum.

http://www.overclock.net/t/1563386/texture-filtering-quality-thread-amd-gcn-vs-nvidia-maxwell-and-kepler/0_30

Let's see if we can figure out what's really going on.


----------



## looncraz

Quote:


> Originally Posted by *Osjur*
> 
> Please do because this shows that there is a difference. These screenshots were taken from the game:




R9 290, [email protected] (VSR), no AA, Ultra everything else, no adjustments in CCC.. Usually at or above 60fps, but dips down to the 40s in some very heavy scenes. AA is totally unneeded on my 24" screen using VSR









I have a new monitor coming, still 1080p, still 24", XL2411Z. Will be running 120hz so my VSR res limit drops to something like [email protected], so I may need a little AA to keep it looking nice.

To me, at least, it seems the AMD settings are tuned for quality quite a bit more than the Titan X's drivers are.

For more screenshots:

http://files.looncraz.net/bf4/

I have monster internet bandwidth (incl. 50 Mbps upload), so I didn't bother making these smaller... that, and compression causes artifacts.


----------



## Noufel

I tried my best to take screenshots in the same places. This is on ultra settings, no AF applied from the control panel, running on 980 G1 SLI at stock.
I'll let you people judge.
But I've noticed that in the last one the container in the background is better rendered in the Fury screenshot.


----------



## Balsagna

No wonder the Titans and the Ti outdo the Fury... Nvidia cuts the quality down for them lol...

I'm just kidding


----------



## PlugSeven

Quote:


> Originally Posted by *Osjur*


Quote:


> Originally Posted by *Balsagna*
> 
> No wonder Titan's and the TI outdue the Fury... Nvidia cuts the quality down for them lol...
> 
> I'm just kidding


A 9% hit in performance when rendering an image closer in quality to what the Fury does... you might not be kidding as much as you think you are.


----------



## Gregster

A lot closer this time for the Fury X and keeping up well with the factory overclocked Titan X.


----------



## Gregster

GTA V compared between the 2. Guys, should I put these here or do you think I should put them somewhere else? If so, where please?


----------



## provost

That's good work Gregster. How about starting a new thread similar to your other one so that it can be its own separate topic?
just a thought.....

on a separate note, now I am really looking forward to receiving my Furyx. Looks like fun to play around with until the 14/16 nm node shrink with hbm2 and real jump in performance.
What a long run for this 28nm.. lol


----------



## Gregster

Quote:


> Originally Posted by *provost*
> 
> That's good work Gregster. How about starting a new thread similar to your other one so that it can be its own separate topic?
> just a thought.....
> 
> on a separate note, now I am really looking forward to receiving my Furyx. Looks like fun to play around with until the 14/16 nm node shrink with hbm2 and real jump in performance.
> What a long run for this 28nm.. lol


Cheers Provost









I agree with putting these into their own thread, as there will be lots of them.

I also agree that 28nm needs to do one. Great performance from AMD and Nvidia this time but it is now time to move on. HBM 2 as you say will be great and bring it on I say.


----------



## crislevin

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> 
> 
> GTA V compared between the 2. Guys, should I put these here or do you think I should put them somewhere else? If so, where please?


Loved the videos, I didn't realize Titan X always running that hot?


----------



## bonami2

Quote:


> Originally Posted by *Ceadderman*
> 
> @bonami... EVGA is strictly nVidia manufacturer. They have nothing to do with AMD. So I get using ASUS and MSi as reliable software sources, but EVGA?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah, I know, but I was saying that their software is made correctly, while Sapphire's isn't working.


----------



## Mega Man

Quote:


> Originally Posted by *Mad Pistol*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tivan*
> 
> Maybe the lack of this kind of 'optimization' is what's putting the 700 series behind in some tests vs 900 series. (though there's also some actual feature advantage of course)
> 
> Of course just my wild guess on this topic!
> 
> But yeah doing some 700 vs 900 testing might be interesting in select games
> 
> edit:
> With the High Quality setting, the TX looks a lot like the 700 series/FuryX level quality, hmm. (maybe even more defined in the distance vs the 780 shot)
> 
> 
> 
> My thoughts exactly. It begs the question of what the 780 would look like at "high quality" in the Texture Filtering mode.
> 
> Off-topic, I'm creating a new thread for this over in the Graphics Cards - General section of the forum.
> 
> http://www.overclock.net/t/1563386/texture-filtering-quality-thread-amd-gcn-vs-nvidia-maxwell-and-kepler/0_30
> 
> Lets see if we can figure out what's really going on.

You all act like Nvidia hasn't been doing that for years. News flash: they have. "I wanna get my customers to upgrade"......


----------



## Silent Scone

Firestrike 1110/500 & 1110/550


----------



## looncraz

Quote:


> Originally Posted by *Mega Man*
> 
> you all act like nvoidia hasnt been doign that for years, news flash they have. i wanna get my customers to upgrade ......


Yup, this was discovered years ago; Nvidia has always used lower-quality default settings than AMD, except during a brief period with the 10.10 drivers, IIRC, where AMD's default AF settings were screwed up. Nvidia pounced on it.

There are many forum threads devoted to the differences.

Including this one with a poll:

http://www.overclock.net/t/1462291/amd-vs-nvidia-image-quality

The difference is rather small, usually, but you will certainly notice it if you're paying close attention - or are on a very large screen.


----------



## Neon Lights

There appears to be a memory bug present in GTA V when using 2 Fury Xs in Crossfire.


----------



## Silent Scone

That VRAM total never works regardless of card.


----------



## provost

Quote:


> Originally Posted by *Silent Scone*
> 
> Firestrike 1110/500 & 1110/550


OK. This is a member I do recognize from his OG Titan days... lol
Thanks for sharing this.
So, Scone, this score looks a bit (much?) lower than some other scores we saw earlier in this thread. Different benchmarks, different clocks, or tess tweaks?
P.S. I think I'm remembering a 20k score somewhere, if not in this thread.


----------



## PriestOfSin

Thinking of picking a fury x up in a few weeks. Is xfx a good manufacturer?


----------



## Ceadderman

Indeed. Of course, my last experience with them was with a Radeon HD 5770, but that one OC'ed really well for gaming; I really love mine. Then I went to a Sapphire Radeon HD 6870, and that one you cannot clock for whatever reason; at least I wasn't able to before I drained the loop and shelved my system for a bit. Now I'm modding the system, so who knows what surprises are in store for me.









~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *PriestOfSin*
> 
> Thinking of picking a fury x up in a few weeks. Is xfx a good manufacturer?


Yeah, they are good; I haven't had a bad experience with them so far, and I'll be going with XFX for my Fury X as well.


----------



## Thoth420

Mine is XFX; I haven't gotten to try it yet. I chose them because their customer service is good, in case I need an RMA.


----------



## Casey Ryback

Quote:


> Originally Posted by *PriestOfSin*
> 
> Thinking of picking a fury x up in a few weeks. Is xfx a good manufacturer?


Not the greatest; badly designed R9 290/290Xs resulted in extremely hot VRM temps, i.e. 100°C+ at stock.

But it won't matter in this instance, as XFX don't make the Fury X; they just box it up and sell it.


----------



## Ceadderman

~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PriestOfSin*
> 
> Thinking of picking a fury x up in a few weeks. Is xfx a good manufacturer?
> 
> 
> 
> Not the greatest, badly designed R9 290/290X's resulted in extremely hot vrm temps, ie 100C+ stock.
> 
> But it won't matter in this instance as XFX don't make the fury X they just box it up and sell it.

There were a couple that hit high VRM temps at stock, but my 290s and my 290X are fine.

They fixed that for the 300 series anyway:


Spoiler: Warning: Spoiler!







Anyway, as you said, it doesn't matter seeing as it's a reference design, but the customer service is good now compared to what it used to be.


----------



## Casey Ryback

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There were a couple that hit high vrm temps at stock but my 290's and my 290x are fine.


Ah ok maybe they revised the design, not sure.

Anyway looks like they didn't forget heatsinks on the 300 series!


----------



## FLaguy954

Any 290 owners in here? What do you all plan on upgrading to (if you are upgrading at all): the Fury, Fury X, or Fury Nano?

I'm thinking of the Fury myself, depending on how it performs vs. a Fury X. This will also influence my perception of the $550 price tag (which I still think is a little too high).


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> There were a couple that hit high vrm temps at stock but my 290's and my 290x are fine.
> 
> 
> 
> Ah ok maybe they revised the design, not sure.
> 
> Anyway looks like they didn't forget heatsinks on the 300 series!

That they certainly didn't. Half the reason I went XFX with my cards was all the horror stories I'd heard about them over the years, and I can honestly say that while the 290/X didn't have great VRM cooling compared to the others, they made up for it with the 300 series. As before, my experience with their customer service has been top notch; replies take a little longer than you'd like, but they are always accommodating.









Just watching the effort they've put in between the 7900 series and 300 series has bumped them up a few notches on my AIB list for AMD


----------



## Casey Ryback

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just watching the effort they've put in between the 7900 series and 300 series has bumped them up a few notches on my AIB list for AMD


Yeah I think they had some not so great 7000 series cards too.

It's often a few bad products that make a company really focus on that weakness, and then suddenly they have the best coolers the following series.

That 300 series PCB looks pretty good. It'll be interesting to see who does the best fury cooler on the upcoming air cooled version.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Just watching the effort they've put in between the 7900 series and 300 series has bumped them up a few notches on my AIB list for AMD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I think they had some not so great 7000 series cards too.
> 
> It's often some bad products that can then make a company really focus on that weakness, and then suddenly they have the best coolers the following series.
Click to expand...

Exactly. Seeing the 390X DD keep up with the 390X Tri-X makes me very hopeful for a beastly Fury DD.


----------



## hyp36rmax

Quote:


> Originally Posted by *FLaguy954*
> 
> Any 290 owners in here? What do you all plan on upgrading to (if you are even upgrading): the Fury, Fury X, or Fury Nano?
> 
> I'm thinking of the Fury myself depending on how it performs vs a Fury X. This will also influence my perception of the $550 price tag (which I still think is a little too high).


I've got Crossfire R9 290X 8GB cards in my current setup and I'm looking forward to a couple of Fury X2s. Sure enough, I do have the Fury X on the radar.


----------



## flopper

Quote:


> Originally Posted by *FLaguy954*
> 
> Any 290 owners in here? What do you all plan on upgrading to (if you are even upgrading): the Fury, Fury X, or Fury Nano?
> 
> I'm thinking of the Fury myself depending on how it performs vs a Fury X. This will also influence my perception of the $550 price tag (which I still think is a little too high).


The Fury makes the most sense to upgrade to for me.
Normally I don't upgrade at the enthusiast end, but I'd rather buy the Fury, as the FPS difference vs. the Fury X will be small and effectively nil in actual practice.
The Nano is out, as it's power-limited for OC.


----------



## looncraz

Quote:


> Originally Posted by *FLaguy954*
> 
> Any 290 owners in here? What do you all plan on upgrading to (if you are even upgrading): the Fury, Fury X, or Fury Nano?
> 
> I'm thinking of the Fury myself depending on how it performs vs a Fury X. This will also influence my perception of the $550 price tag (which I still think is a little too high).


I own an R9 290 and absolutely love it! For the games I run (BF4, Civ V, Hitman Absolution, Far Cry 3, MS Flight Sim X) it is capable of maximum quality with maximum VSR of 3200x1800, without AA (AA makes things look worse for me in most games, I'd rather increase IQ settings)... all at 60hz or better (usually use VSync).

However, I have a 144Hz 1080p monitor coming; running at 120Hz in-game will drop my VSR limit to 2048x1536 - a pretty big drop I'm sure to notice! At the same time, though, I will gain frames to help fill in the refresh cycles, so it will be a matter of compromises for my current games. The only upcoming game I'm looking at getting is the new Hitman game coming this winter, so I probably won't be in the market for a higher-end card until after I see how that runs. Then again, that darn Nano is sooo cute...


----------



## Silent Scone

Quote:


> Originally Posted by *provost*
> 
> Ok. This is a member I do recognize from his OG Titan days... lol
> Tks for sharing this.
> So, Scone, this score looks a bit (much?) different (lower) than what some other scores we saw earlier in this thread. Different benchmarks, different clocks or Tess tweaks?
> ps. I think I am thinking of 20k score somewhere, if not in this thread


20k graphics was with tess tweaks. The overall is lower than most because it's 1080p Firestrike and this box is a 4690K clocked at 4.4. The graphics score of over 17k is pretty ballpark.









Was just showing the difference with an extra 50mhz on the HBM.


----------



## royfrosty

I'm currently an owner of 2 MSI R9 290 Gaming Edition cards plus a Fury X.

The two R9 290s are cooled in my rig with Aquacomputer waterblocks.

What made up my mind to swap over to a single Fury X was the power consumption, and I was actually looking for a single card that lets me game on my 1440p monitor. The two R9 290s were scaling very badly in most games.

So far I have no regrets changing to a single Fury X. However, after seeing Tweaktown's scaling at 4K, I am still contemplating moving to a 4K monitor and dual Fury X, and waterblocking both into my current Project GeneXis.


----------



## xer0h0ur

295X2 + 290X here. Have no reason to upgrade really. I am just waiting for Arctic Islands mid to late next year for 8GB HBM and the node shrink to 14/16 nanometer. I am taking a pass on Fury while I wait for the dust to settle on Win10/DX12 and AMD sorting out their drivers. If they maintain this fractured driver structure with more attention being paid to adding features to the 3XX/Fury series then I may end up getting a Fury X2.


----------



## yawa

I'm on the fence. Watercooled 290X owner here. I guess I'm waiting to see what specific drivers do for this thing before I pull the trigger. Otherwise I don't mind skipping a generation. HBM and overclocking potential (with voltage bumps) are also what I'm waiting to see.


----------



## xer0h0ur

Not to mention that by the time Pascal and Arctic Islands rolls around AMD will have had a year+ worth of time optimizing for HBM whether it be the manufacturing process or driver optimization.


----------



## blue1512

Quote:


> Originally Posted by *royfrosty*
> 
> Im currently an owner of 2 MSI r9 290 Gaming edition + current Fury X.
> 
> The 2 r9 290 is currently cooled in my rig with Aquacomputer waterblocks.
> 
> What made my mind to swap over to a single Fury X is because of the power consumption and i was actually looking for a single card that allows me to game on my 1440p monitor. It was scaling very badly for the 2 r9 290 for most games.
> 
> So far i have no regrets changing it to a single Fury X. But however after seeing Tweaktown's scaling for 4k, i am still contemplating to change to a 4k monitor and dual Fury X and both waterblock it into my current Project GeneXis.


Waiting for your project, hope that you will share your work in OCN








http://www.overclock.net/f/154/case-mod-work-logs


----------



## FLaguy954

Thanks for the responses everyone.

Who's going to slap a H100i or water block on an air-cooled Fury?

I'm seriously considering the former a couple of months from now. Should be more economical vs getting the Fury X.


----------



## pdasterly

Quote:


> Originally Posted by *FLaguy954*
> 
> Thanks for the responses everyone.
> 
> Who's going to slap a H100i or water block on an air-cooled Fury?
> 
> I'm seriously considering the former a couple of months from now. Should be more economical vs getting the Fury X.


waiting to see price of dual gpu fury x


----------



## Mega Man

Quote:


> Originally Posted by *PriestOfSin*
> 
> Thinking of picking a fury x up in a few weeks. Is xfx a good manufacturer?


They can be, depending on what you want.

For ref cards they rock; once they start gimping cards (as all manufacturers do) they can be bad: voltage locked, removal of the BIOS switch, etc.

But on ref cards they let you take off the cooler in the USA and it doesn't void your warranty, i.e. you can watercool and keep your warranty.

When I get my Fury(s), all 4 will be either XFX or MSI.


----------



## the9quad

Quote:


> Originally Posted by *Mega Man*
> 
> they can be depending on what you want
> 
> for ref cards they rock, once they start gimping cards ( as all manufactures do ) they can be bad, voltage locked, removal of bios switch ect
> 
> but ref they let you take off coolers in USA and it doesnt void your warranty IE you can watercool and keep your warranty
> 
> when i get my fury ( s ) all 4 will either be xfx or msi


You gonna get the nano's?


----------



## Nizzen

AMD doesn't work at 144Hz on the Asus Swift, only 120Hz. I have a Fury X, and the same problem existed with the 290 and 290X too. A 750 Ti/980/Titan X works at 144Hz with the same DP cable.

Is it only an Asus Swift bug, or can't AMD do 144hz?
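For what it's worth, a rough link-budget check (my own back-of-the-envelope numbers; the ~12% blanking overhead is an assumption based on reduced-blanking timings) suggests DP 1.2 has headroom for 1440p at 144Hz, so a hard bandwidth limit on the card seems unlikely:

```python
# Back-of-the-envelope DisplayPort bandwidth check for 2560x1440 @ 144 Hz.
# blanking_overhead=1.12 is an assumed figure for reduced-blanking timings.
def required_gbps(width, height, refresh_hz, bpp=24, blanking_overhead=1.12):
    """Approximate uncompressed video data rate in Gbit/s."""
    return width * height * refresh_hz * bpp * blanking_overhead / 1e9

DP12_EFFECTIVE_GBPS = 17.28  # HBR2: 21.6 Gbit/s raw minus 8b/10b coding overhead

need = required_gbps(2560, 1440, 144)
print(f"needed: {need:.1f} Gbit/s, DP 1.2 budget: {DP12_EFFECTIVE_GBPS} Gbit/s")
print("fits" if need < DP12_EFFECTIVE_GBPS else "does not fit")
```

That comes out around 14 Gbit/s against a 17.28 Gbit/s budget, which points at a monitor/EDID or driver quirk rather than raw bandwidth.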









----------



## Mega Man

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> they can be depending on what you want
> 
> for ref cards they rock, once they start gimping cards ( as all manufactures do ) they can be bad, voltage locked, removal of bios switch ect
> 
> but ref they let you take off coolers in USA and it doesnt void your warranty IE you can watercool and keep your warranty
> 
> when i get my fury ( s ) all 4 will either be xfx or msi
> 
> 
> 
> You gonna get the nano's?
Click to expand...

Eventually. I now have a TX10 with 2 new systems to fill in.

Right at this time my baby is my biggest concern (due 8/8/2015).

After that, later this year, I will get another fully watercooled quadfire system.


----------



## littlestereo

What kind of PSU would you need to run two Fury Xs under water with maximum overvolting (once it's unlocked), plus an OC'd CPU, case fans, etc.? I'm seeing less than 500 watts of draw (at the UPS) under max load with one Fury X currently. Is Fiji as power-hungry as the 290X when overclocked, from an architectural standpoint?


----------



## Gumbi

Quote:


> Originally Posted by *littlestereo*
> 
> What kind of PSU would you need to run two Fury Xs under water with maximum overvolting (once it's unlocked), plus an OC'd CPU, case fans, etc.? I'm seeing less than 500 watts of draw (at the UPS) under max load with one Fury X currently. Is Fiji as power-hungry as the 290X when overclocked, from an architectural standpoint?


Who knows regarding overvolting at this stage. Hawaii would draw a lot more power when overvolted, so if Fiji is the same, I'd recommend a 1000W PSU minimum, preferably a 1200W one if you plan on pushing the chips to the max.
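To put rough numbers on that recommendation: here's a simple headroom sketch. Every wattage below is my own illustrative assumption (nobody has measured an overvolted Fiji yet), not a spec, but it lands in the same ballpark:

```python
# Rough PSU sizing sketch for two overvolted Fury X cards plus an OC'd CPU.
# All wattages are illustrative assumptions, not measured figures.
def recommended_psu_watts(component_watts, headroom=1.25):
    """Sum estimated component draw, then add headroom for transient spikes
    and to keep the PSU in its efficient part of the load curve."""
    return round(sum(component_watts.values()) * headroom)

build = {
    "fury_x_card_1": 375,  # assumed worst case per card when pushed hard
    "fury_x_card_2": 375,
    "cpu_overclocked": 180,
    "board_ram_drives_fans_pump": 80,
}
print(recommended_psu_watts(build))  # lands in the ~1200W+ range
```

Tweak the per-card figure once real overvolted draw numbers show up; the point is just that 1000W is the floor, not the target.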


----------



## Digitalwolf

Quote:


> Originally Posted by *Nizzen*
> 
> AMD does not work with 144hz on Asus Swift. Only 120hz. I have Fury X, and the same problem was with 290 and 290x too. 750ti/980/titan X works with 144hz with the same DP cable.
> 
> Is it only an Asus Swift bug, or can't AMD do 144hz?
> 
> 
> 
> 
> 
> 
> 


Well I have the Asus MG279... and my Fury X's are doing 144hz with no issue. I have no personal experience with the Swift so can't comment on issues with that monitor.


----------



## xer0h0ur

Quote:


> Originally Posted by *Nizzen*
> 
> AMD does not work with 144hz on Asus Swift. Only 120hz. I have Fury X, and the same problem was with 290 and 290x too. 750ti/980/titan X works with 144hz with the same DP cable.
> 
> Is it only an Asus Swift bug, or can't AMD do 144hz?
> 
> 
> 
> 
> 
> 
> 


Considering there are 144Hz Freesync monitors, I doubt there is a 120Hz limitation.


----------



## Maul

So, I was attempting to get one of my older Dell monitors working with the DVI-to-HDMI adapter, and after rebooting, my primary Asus monitor no longer comes up over DP. The monitor works using the adapter I was originally trying for the secondary monitor, but it no longer works with DisplayPort. Any ideas what happened? I can't figure it out.


----------



## xer0h0ur

You want to try again this time making sense? Fury X doesn't have a DVI port...


----------



## Maul

Quote:


> Originally Posted by *xer0h0ur*
> 
> You want to try again this time making sense? Fury X doesn't have a DVI port...


If that was directed at me: my DisplayPorts stopped working for no reason. All I did was try to use the adapter that came with the Fury on a secondary monitor. And I'm not sure where you thought I said DVI port, but "DVI->HDMI adapter" means just that, lol.


----------



## Nizzen

Fury X @ 1120mhz
5930k @ 4600mhz
3200mhz ddr4

Firestrike Extreme
7708 with AMD Radeon R9 Fury X(1x) and Intel Core i7-5930K
Graphics Score 8080
Physics Score 17983
Combined Score 3501

http://www.3dmark.com/fs/5314119

Not as good as my main computer with 5960x and 2x Tx, but good enough for BF4 and 1440p


----------



## xer0h0ur

Yeah except you said DVI to HDMI and then mentioned Displayport. Does not compute. Are you double converting?


----------



## Maul

No, I'm using a straight-up DisplayPort cable on my primary monitor, but the DP ports don't work anymore, ever since I tried to get the secondary monitor working using the adapter.


----------



## xer0h0ur

Whoa...that is fubar. I haven't seen all DP ports just suddenly stop working. Okay so just to clarify you have how many monitors total and how are each of them connected?


----------



## Maul

Quote:


> Originally Posted by *xer0h0ur*
> 
> Whoa...that is fubar. I haven't seen all DP ports just suddenly stop working. Okay so just to clarify you have how many monitors total and how are each of them connected?


I have one Asus VG248QE as my main monitor and one old Dell monitor as my secondary. At the moment I have to use the DVI-to-HDMI adapter to use the Asus monitor, but when I plug it into DisplayPort it won't even show the POST on boot. It feels as though my DisplayPorts were deactivated somehow, idk.


----------



## xer0h0ur

So the ports do work eventually? You're just not seeing the POST?


----------



## Maul

No, I still can't get the ports to work with my Asus monitor; I have to use the adapter just to get into Windows. When I try to use DisplayPort it just tells me "DisplayPort NO SIGNAL", but DisplayPort was working until I tried to get my secondary monitor running with the adapter it came with.


----------



## Digitalwolf

Quote:


> Originally Posted by *Nizzen*
> 
> Fury X @ 1120mhz
> 5930k @ 4600mhz
> 3200mhz ddr4
> 
> Firestrike Extreme
> 7708 with AMD Radeon R9 Fury X(1x) and Intel Core i7-5930K
> Graphics Score 8080
> Physics Score 17983
> Combined Score 3501
> 
> http://www.3dmark.com/fs/5314119
> 
> Not as good as my main computer with 5960x and 2x Tx, but good enough for BF4 and 1440p


Just to give my not-really-comparable numbers from a Crossfire setup.

My setup is 2x Fury X @ stock.

4790k @ 4400
2666 ddr3

Firestrike Extreme: 12593

Graphics Score 15248
Physics Score 12322
Combined Score 5542

http://www.3dmark.com/3dm/7637884

I guess it might give you some relevant idea of Xfire performance.


----------



## Bludge

Quote:


> Originally Posted by *FLaguy954*
> 
> Any 290 owners in here? What do you all plan on upgrading to (if you are even upgrading): the Fury, Fury X, or Fury Nano?
> 
> I'm thinking of the Fury myself depending on how it performs vs a Fury X. This will also influence my perception of the $550 price tag (which I still think is a little too high).


I went from crossfire 290X 8gb Vapor-X to Fury X. Waiting on the second.


----------



## p4inkill3r

My Fury X will be delivered tomorrow afternoon.


----------



## Casey Ryback

Quote:


> Originally Posted by *Bludge*
> 
> I went from crossfire 290X 8gb Vapor-X to Fury X. Waiting on the second.


Weren't 290X 8GB's enough for 4K? damn ultra HD...........
Quote:


> Originally Posted by *p4inkill3r*
> 
> My Fury X will be delivered tomorrow afternoon.


----------



## the9quad

Quote:


> Originally Posted by *Mega Man*
> 
> eventually i now have a tx10 i need 2 new systems to fill in
> 
> right at this time my baby is bigest concern ( due 8/8/2015 )
> 
> after that later this year i will get another fully watercooled quadfire system


grats dude!


----------



## Mega Man

thanks !


----------



## bonami2

I was at Canada Computers and had a Fury X right in front of me, a Sapphire one.

"Well, we can't open it, it's sealed." "Oh, OK, but anyway, I don't have the money." "Oh, OK. Bye."

Everyone else is waiting to get their orders, and I had one in front of me with no budget.

Anyway, I'm not a fan of Sapphire, so I will go with Asus and/or MSI (even if it's the same thing).


----------



## brettjv

Quote:


> Originally Posted by *littlestereo*
> 
> Droppin off these Fury X Firestrike Ultra 4k benches and a quick synopsis of my experience with the card since last Friday (6/26 overnight shipped from Newegg, placed order on 6/24 at 2:25pm EST right after the auto-notify came in) .
> 
> snip


Nice rundown, thanks.

SOOOO... I'm not going to go through the whole thread, but since I've found SOMEone whose Fury X can do a 100MHz OC on the memory... in the event that the matter has not already been settled earlier... can you run some benchmarks and post results with JUST a 100MHz memory OC (nothing on the core, and exactly 600MHz memory)? Just 3 or 4 tests, maybe Firestrike, Unigine Valley, and a couple of more recent gaming benchmarks?

You see, someone mentioned last week on Reddit (and someone said something similar way earlier in this thread) that they were seeing a big perf increase with a 100MHz OC on the memory... but other people have done tests with smaller memory OCs and seen basically NADA in terms of perf gain. Which makes sense given the staggering bandwidth on these cards...

But like I say... there are rumors of a +100MHz boost on the VRAM doing some magic 'stuff', so I'm hoping you could test it and post results? Again, assuming the matter was not totally settled already in this thread (or elsewhere)?

TIA!


----------



## blue1512

Quote:


> Originally Posted by *brettjv*
> 
> Nice rundown, thanks.
> 
> SOOOO ... I'm not going to go thru the whole thread but since I've found SOMEone who's Fury X can do a 100MHz OC on the memory ... in the event that the matter has not already been settled earlier ... can you do some benchmarks and post results with JUST a 100MHz memory OC (nothing on the core, and exactly 600MHz memory)? Just like 3 or 4 tests, maybe FireStrike, Unigine Valley, and a couple more recent gaming benchmarks?
> 
> You see, someone mentioned last week on Reddit (and someone said something similar way earlier in this thread) that they were seeing a big perf increase with a 100MHz OC on teh memory ... but others people have done tests with smaller memory OC's and seen basically NADA in terms of perf gain. Which makes sense given the staggering bandwidth on teh cards ...
> 
> But like I say ... there's rumors of +100MHz boost on vram doing some magic 'stuff', so I'm hoping you could test it and post results? Again, assuming the matter was not totally settled already on this thread (or elsewhere)?
> 
> TIA!


That was based on the rumor of a memory-timing profile in the Fury X BIOS for that clock, as early leaks stated that AMD had planned to clock HBM at 600-625MHz. Just a rumor, by the way.


----------



## Sgt Bilko

http://www.tweaktown.com/tweakipedia/94/amd-radeon-r9-fury-crossfire-triple-4k-eyefinity-11-520x2160/index.html

Some Triple 4k Crossfire Benches.

Looks like these cards perform best in Crossfire + High res


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://www.tweaktown.com/tweakipedia/94/amd-radeon-r9-fury-crossfire-triple-4k-eyefinity-11-520x2160/index.html
> 
> Some Triple 4k Crossfire Benches.
> 
> Looks like these cards perform best in Crossfire + High res


It's neat and all, but is anyone really interested in playing games at the medium preset on a bunch of 4K monitors? I'd rather see them spend their time testing memory overclocks, or Win 10 performance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> http://www.tweaktown.com/tweakipedia/94/amd-radeon-r9-fury-crossfire-triple-4k-eyefinity-11-520x2160/index.html
> 
> Some Triple 4k Crossfire Benches.
> 
> Looks like these cards perform best in Crossfire + High res
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's neat and all, but is anyone really interested in playing games at the medium preset on a bunch of 4K monitors? I'd rather see them spend their time testing memory overclocks, or Win 10 performance.
Click to expand...

I don't think Tweaktown does GPU overclocking at all, let alone HBM, and Win 10 performance would best be judged when it actually releases (not the tech preview).

I think it's pretty cool though, considering one of the big things people were jumping up and down about was the 4GB of VRAM and whether it was even capable of pushing 4K, let alone 4K Eyefinity.


----------



## Noufel

So if it's true that +100MHz on the Fury's memory brings that 10-20% perf increase, AMD hurt themselves by being too conservative with the Fury X's specs at launch.


----------



## blue1512

Quote:


> Originally Posted by *Noufel*
> 
> So if true that 100mhz+ on the memory of the fury bring that 10-20 % perf increase AMD hurted them selfs with being to much conservative on the furyX spec on the launch .


It's a rumor. And as far as I know, not many cards are stable at 600MHz. AMD had to settle on 500MHz for the HBM, after all.
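For context on why modest HBM bumps often show nothing in benchmarks, the raw bandwidth arithmetic is simple (using Fiji's published 4096-bit bus; the double-data-rate factor is the standard assumption for HBM):

```python
# Peak HBM bandwidth on Fiji: 4096-bit bus, double data rate.
def hbm_bandwidth_gbs(clock_mhz, bus_bits=4096, ddr_factor=2):
    """Peak bandwidth in GB/s = clock * DDR factor * bus width / 8 bits per byte."""
    return clock_mhz * 1e6 * ddr_factor * bus_bits / 8 / 1e9

print(hbm_bandwidth_gbs(500))  # stock 500MHz -> 512.0 GB/s
print(hbm_bandwidth_gbs(600))  # rumored 600MHz -> 614.4 GB/s
```

Even at stock, 512 GB/s is already far beyond what most games saturate, which fits the reports of near-zero gains from small memory OCs; any big jump at exactly +100MHz would have to come from something like a timing profile, not bandwidth alone.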


----------



## Bludge

Quote:


> Originally Posted by *Casey Ryback*
> 
> Weren't 290X 8GB's enough for 4K? damn ultra HD...........


Well, one Fury is pretty close to 290X Crossfire, so two Furys are better.







Plus, the Vapor-X cards have a 2.5-slot format; I couldn't fit my sound card or eSATA card in my machine.

The plan was to wait for a Fury X2 (or whatever), but I tend to get impatient. It'd be even better if I move to WC blocks for the Furys; this X9 case is awesome.


----------



## Gdourado

Those of you who have been getting your Furys: how's the pump noise and coil whine on these latest units?


----------



## Silent Scone

Quote:


> Originally Posted by *Gdourado*
> 
> Those of you who have been getting your Furys: how's the pump noise and coil whine on these latest units?


No coil whine. Pump is audible but not enough to annoy me. In a relatively small enclosure too. Noise is subjective.


----------



## Ceadderman

@Mega Man...

Just went through the same thing with my Son. You're gonna be sleep deprived as hades but it's well worth it.









Congratulations bro!









~Ceadder


----------



## escksu

BTW guys, what's the vcore of the Fury X?


----------



## escksu

Quote:


> Originally Posted by *Silent Scone*
> 
> No coil whine. Pump is audible but not enough to annoy me. In a relatively small enclosure too. Noise is subjective.


You are lucky; mine has coil whine. Sent it back for RMA already... I have to "endure" 2 months without a Fury X.


----------



## Silent Scone

Quote:


> Originally Posted by *escksu*
> 
> You are lucky, mine got coil whine. Send back for RMA already.... Have to "endure" 2 months w/o fury x.........


The irony there is that coil whine involves a lot of luck. It's not just associated with the card; it's a natural phenomenon that resonates and occurs at frequencies which can vary depending on other components, namely the PSU, cabling, and sometimes the motherboard.


----------



## escksu

Quote:


> Originally Posted by *Silent Scone*
> 
> The irony there is that coil whine involves a lot of luck. It's not just associated with the card; it's a natural phenomenon that resonates and occurs at frequencies which can vary depending on other components, namely the PSU, cabling, and sometimes the motherboard.


I don't know. My previous card, a Sapphire R9 290, had no noise; it's very quiet. That's why I don't understand why the Fury X is so noisy.


----------



## Gdourado

Any feedback on the PowerColor variant?
I am thinking of buying a Fury X, and PowerColor is the only one in stock.


----------



## Newbie2009

Any voltage unlock yet?


----------



## Jflisk

Place holder good morning


----------



## Zanpakuto

in stock


----------



## thefathef

@Gdourado
Mine was PowerColor; it had an awful buzzing noise from the pump.
RMA'd it today.


----------



## Casey Ryback

Quote:


> Originally Posted by *thefathef*
> 
> @Gdourado
> mine was PoweColor it has awful buzzing noise from pump
> RMA today


That's unfortunate, but just so you know powercolor had nothing to do with it.

The various brands have not touched the PCB or cooler on the fury X.


----------



## thefathef

I know, but it is frustrating. I want a Fury, but which brand should I choose?
Is the revised pump on the market yet?
If vendors told us which batches are affected, it would be much easier!


----------



## royfrosty

Has anyone else felt something strange about the performance of the Fury X at lower resolutions? Or is it just me?

I feel that the Fury X is darn powerful at 4K in Crossfire. I could not believe the scaling, and also the minimum-FPS performance across most games, be it at 4K or in a triple-4K Eyefinity setup.

But something just isn't right: at 1440p it lags behind both the 980 Ti and Titan X in most gaming titles.

Did AMD themselves put out beta drivers just to push a 4K gaming GPU and not bother optimizing 1440p gaming? Or is the GPU just made for 4K only?

I am pretty puzzled by TT's benchmarks, and they've had me going crazy for the past few days. I just want to be sure I'm heading the right way.

It's mostly because my budget is limited, and I do not wish to make a wrong purchase.

What I have in mind is a 4K 32-inch display and dual Fury X (waterblocked). Or should I just be content with my current Samsung S32D850T 1440p monitor for now, stick with a single Fury X, and wait for the drivers to mature at lower resolutions?

Ahhhhh, help. I'm confused and in a big dilemma.


----------



## Ha-Nocri

Might be CPU overhead. Does anyone have the article about Win10 DX11 overhead with AMD GPUs?


----------



## ZealotKi11er

Quote:


> Originally Posted by *royfrosty*
> 
> Has anyone else felt something strange about the performance of the Fury X at lower resolutions? Or is it just me?
> 
> I feel that the Fury X is darn powerful at 4K in Crossfire. I could not believe the scaling, and also the minimum-FPS performance across most games, be it at 4K or in a triple-4K Eyefinity setup.
> 
> But something just isn't right: at 1440p it lags behind both the 980 Ti and Titan X in most gaming titles.
> 
> Did AMD themselves put out beta drivers just to push a 4K gaming GPU and not bother optimizing 1440p gaming? Or is the GPU just made for 4K only?
> 
> I am pretty puzzled by TT's benchmarks, and they've had me going crazy for the past few days. I just want to be sure I'm heading the right way.
> 
> It's mostly because my budget is limited, and I do not wish to make a wrong purchase.
> 
> What I have in mind is a 4K 32-inch display and dual Fury X (waterblocked). Or should I just be content with my current Samsung S32D850T 1440p monitor for now, stick with a single Fury X, and wait for the drivers to mature at lower resolutions?
> 
> Ahhhhh, help. I'm confused and in a big dilemma.


DX11 CPU Overhead.


----------



## Forceman

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Might be CPU overhead. Anyone has the article about win10 dx11 overhead using AMD GPU's?


I tried a couple of games with Win 10 and the new "overhead reduction" (the ones that give the draw call gains) drivers and didn't see any real improvement at 1440p. 290X though, so not directly comparable, but I don't think that Win 10 is going to be any kind of savior for AMD. DX12 maybe, but that's a while still.


----------



## swiftypoison

Question: I have a Corsair 450D case. I can't mount the rad in the back because my Dark Rock Pro 3 cooler is too big. Can I mount it in the front? I ask because I've heard some say the rad can't be mounted in the front.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> I tried a couple of games with Win 10 and the new "overhead reduction" (the ones that give the draw call gains) drivers and didn't see any real improvement at 1440p. 290X though, so not directly comparable, but I don't think that Win 10 is going to be any kind of savior for AMD. DX12 maybe, but that's a while still.


I don't think a single 290X is going to see much gain at 1440p. Maybe at 1080p. The Fury X does hit that 1440p mark.


----------



## ban25

Quote:


> Originally Posted by *Zanpakuto*
> 
> in stock


Thanks for the heads up, I think I'll go for Crossfire


----------



## p4inkill3r

Delivered.


----------



## magicc8ball

Quote:


> Originally Posted by *p4inkill3r*
> 
> Delivered.


Congrats!!!









Let us know if it is the sticker version or the etched pump cover.


----------



## thefathef

@p4inkill3r
Good luck!
And pls share about pump


----------



## p4inkill3r

I will inspect it tonight.


----------



## Gdourado

Can anyone please tell me the thickness of the Fury X radiator with the fan? I need to know if it would fit with my HE01.

Also, is it just me, or is the Fury X impossible to install as intake due to the placement of the pipes and the reservoir on the radiator?


----------



## Cool Mike

Yes, it can be mounted in the front. I tried it and it mounted just fine.


----------



## looncraz

Quote:


> Originally Posted by *royfrosty*
> 
> Has anyone else felt something strange about the performance of the Fury X at lower resolutions? Or is it just me?
> 
> I feel that the Fury X is darn powerful at 4K in Crossfire. I could not believe the scaling, and also the minimum-FPS performance across most games, be it at 4K or in a triple-4K Eyefinity setup.
> 
> But something just isn't right: at 1440p it lags behind both the 980 Ti and Titan X in most gaming titles.
> 
> Did AMD themselves put out beta drivers just to push a 4K gaming GPU and not bother optimizing 1440p gaming? Or is the GPU just made for 4K only?
> 
> I am pretty puzzled by TT's benchmarks, and they've had me going crazy for the past few days. I just want to be sure I'm heading the right way.
> 
> It's mostly because my budget is limited, and I do not wish to make a wrong purchase.
> 
> What I have in mind is a 4K 32-inch display and dual Fury X (waterblocked). Or should I just be content with my current Samsung S32D850T 1440p monitor for now, stick with a single Fury X, and wait for the drivers to mature at lower resolutions?
> 
> Ahhhhh, help. I'm confused and in a big dilemma.


In the real world, stock vs stock, you'll never notice a difference between a Fury X and a Titan X aside from temporary driver-maturity issues... Well, except for AMD's higher default LOD settings, which affect some games positively (higher texture detail at a distance, most noticeable in BF4 it seems) and some negatively (causing shimmer), but always cost more computation (and lower FPS vs nVidia).

Also, if you like to use VSR or DSR on a lower-resolution monitor (like I do), then the 4K results are what matters. I run 3200x1800 VSR on a 1080p monitor, so the Fury X would give me a significant boost. However, I have no games which my R9 290 can't play at over 60fps at that resolution with everything maxed (except AA, which I don't care about at 3200x1800 on a 23.6" screen). The R9 290 will be limiting me to [email protected], though, with my new 144hz 1080p monitor (due in any minute now). It is also 24", since I use multiple monitors and don't like much of a size mismatch. Going from the IPS back to TN will hopefully not be much of an issue. The IPS will remain as my secondary monitor, and my much older Acer TN monitor will be sold (already have a buyer).
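Since the VSR reasoning above is really just pixel arithmetic, here is a quick sketch of it (plain Python; the resolutions are the ones named in the post):

```python
# Pixel-count arithmetic for VSR: rendering at 3200x1800 and downscaling
# to a 1080p panel costs close to 4K-class GPU work.
def pixels(w, h):
    return w * h

native = pixels(1920, 1080)   # 1080p panel
vsr    = pixels(3200, 1800)   # VSR render target from the post
uhd    = pixels(3840, 2160)   # 4K reference

print(f"3200x1800 is {vsr / native:.2f}x the pixels of 1080p")  # 2.78x
print(f"...and {vsr / uhd:.0%} of a full 4K workload")          # 69%
```

That is why, for a 3200x1800 VSR user, 4K review results are the relevant ones rather than native-1080p numbers.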


----------



## looncraz

Quote:


> Originally Posted by *thefathef*
> 
> I know, but it is frustrating. I want a Fury, but which brand to choose?
> Is the revised pump on the market yet?
> If vendors told us which batch is affected, it would be much easier!


The only thing that matters with Fury X brands right now is how good the company is at honoring its warranties. PowerColor should be just fine; the cards are all the same.

However, if by "Fury" you meant the upcoming air-cooled Fury cards, then that remains to be seen, but Sapphire and Gigabyte have been my go-to for quite a while with wonderful cards (but never any need to use their support...).


----------



## Clockster

I'm not supposed to say anything but I'll be getting an Asus R9 Fury Strix in 9 days

But ssshhhh no telling lol


----------



## the9quad

Quote:


> Originally Posted by *looncraz*
> 
> In the real world, stock vs stock, you'll never notice a difference between a Fury X and a Titan X aside from temporary driver-maturity issues... Well, except for AMD's higher default LOD settings, which affect some games positively (higher texture detail at a distance, most noticeable in BF4 it seems) and some negatively (causing shimmer), but always cost more computation (and lower FPS vs nVidia).
> 
> Also, if you like to use VSR or DSR on a lower-resolution monitor (like I do), then the 4K results are what matters. I run 3200x1800 VSR on a 1080p monitor, so the Fury X would give me a significant boost. However, I have no games which my R9 290 can't play at over 60fps at that resolution with everything maxed (except AA, which I don't care about at 3200x1800 on a 23.6" screen). The R9 290 will be limiting me to [email protected], though, with my new 144hz 1080p monitor (due in any minute now). It is also 24", since I use multiple monitors and don't like much of a size mismatch. Going from the IPS back to TN will hopefully not be much of an issue. The IPS will remain as my secondary monitor, and my much older Acer TN monitor will be sold (already have a buyer).


What games are you playing where your 290 plays all games at 60 fps with everything maxed at 3200X1800? Quake?


----------



## Agent Smith1984

Quote:


> Originally Posted by *the9quad*
> 
> What games are you playing where your 290 plays all games at 60 fps with everything maxed at 3200X1800? Quake?


----------



## hyp36rmax

Quote:


> Originally Posted by *Clockster*
> 
> I'm not supposed to say anything but I'll be getting an Asus R9 Fury Strix in 9 days
> 
> But ssshhhh no telling lol


LOL! I see what you did there....


----------



## Gdourado

Quote:


> Originally Posted by *Cool Mike*
> 
> Yes can be mounted in front. I tried it and mounted just fine.


Did you flip the fan to the other side?
If so, how about that lower reservoir extension? Can it be inverted also?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gdourado*
> 
> Did you flip the fan to the other side?
> If so, how about that lower reservoir extension? Can it be inverted also?


Pull performs about the same as push here, so it's fine; the fan on this rad is one of the best.


----------



## ozyo

Quote:


> Originally Posted by *p4inkill3r*
> 
> Delivered.


----------



## ozyo

Quote:


> Originally Posted by *Clockster*
> 
> I'm not supposed to say anything but I'll be getting an Asus R9 Fury Strix in 9 days
> 
> But ssshhhh no telling lol


With the X or without?


----------



## p4inkill3r

Quote:


> Originally Posted by *ozyo*


Yeah, I canceled the one I had on order with Amazon. I placed the order the hour of release and was looking at delivery between July 13-15.
The very next day, Newegg had it in stock with Shoprunner shipping and I pounced.


----------



## Ceadderman

Quote:


> Originally Posted by *Gdourado*
> 
> Can anyone please tell me the thickness of the Fury X radiator with fan? I need to know if it would fit with my HE01.
> 
> Also, Is it me, or is the Fury X impossible to install as intake due to the placement of the pipes and the reservoir on the radiator?


Reasonably sure that you can swap out the fan. So flipping it would make it Intake instead of Exhaust or vice versa.

~Ceadder


----------



## Thoth420

Quote:


> Originally Posted by *looncraz*
> 
> The only thing that matters with Fury X brands, right now, is how the good the company is at honoring its warranties. PowerColor should be just fine, the cards are all the same.
> 
> However, if by "Fury" you meant the upcoming air-cooled Fury cards, then that remains to be seen, but Sapphire and Gigabyte have been my go-to for quite a while with wonderful cards (but never any need to use their support...).


I haven't gotten around to building the hardware in my sig yet. I registered my XFX Fury X (ordered and shipped on release day) today and called to ask, if I end up with a model with a pump issue and/or coil whine, whether they would be willing to swap it, and they said it would be no problem. I didn't ask for a time frame on a replacement for what is, in my case, a theoretical issue. I was curious, but I've always had great customer service with them in the past, so I expected nothing less. I don't see why any company would deny an RMA, since AMD stated they would cover the losses from the bad batch that got out.

The questions on my mind are more:
How large was this bad batch?
When can RMAs be expected to be fulfilled?


----------



## Ceadderman

Why don't you just return it for vendor swap? RMAing it will likely net you a refurb.

~Ceadder


----------



## littlestereo

Quote:


> Originally Posted by *brettjv*
> 
> Nice rundown, thanks.
> 
> SOOOO ... I'm not going to go through the whole thread, but since I've found SOMEone whose Fury X can do a 100MHz OC on the memory ... in the event that the matter has not already been settled earlier ... can you do some benchmarks and post results with JUST a 100MHz memory OC (nothing on the core, and exactly 600MHz memory)? Just like 3 or 4 tests, maybe FireStrike, Unigine Valley, and a couple more recent gaming benchmarks?
> 
> You see, someone mentioned last week on Reddit (and someone said something similar way earlier in this thread) that they were seeing a big perf increase with a 100MHz OC on the memory ... but other people have done tests with smaller memory OCs and seen basically NADA in terms of perf gain. Which makes sense given the staggering bandwidth on the cards ...
> 
> But like I say ... there are rumors of a +100MHz boost on vram doing some magic 'stuff', so I'm hoping you could test it and post results? Again, assuming the matter was not totally settled already in this thread (or elsewhere)?
> 
> TIA!


Pushing the VRAM +100MHz yielded less than a 2% performance boost for me in Firestrike Ultra:

stock core, stock mem = 3734
stock core, +100 mem = 3804

http://www.3dmark.com/compare/fs/5293880/fs/5292892
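A quick sanity check on those two runs (a sketch; the 500 MHz figure for the stock HBM clock is an assumption, inferred from the "exactly 600MHz memory" request above):

```python
# Compare the bandwidth increase from the +100 MHz memory OC with the
# Firestrike Ultra score increase it actually bought.
stock_score, oc_score = 3734, 3804
stock_mem_mhz, oc_mem_mhz = 500, 600  # assumed stock HBM clock

bw_gain   = (oc_mem_mhz / stock_mem_mhz - 1) * 100
perf_gain = (oc_score / stock_score - 1) * 100

print(f"memory bandwidth up {bw_gain:.0f}%, score up {perf_gain:.1f}%")
# -> memory bandwidth up 20%, score up 1.9%
```

A 20% bandwidth bump buying under 2% performance suggests the core, not memory bandwidth, is the bottleneck at these settings, which matches the "staggering bandwidth" point in the quote.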


----------



## rv8000

Miffed with Newegg atm; the card has been sitting in packaging since Thursday, and I paid for 3-day shipping this time around.


----------



## BIGTom

Has anyone seen the VSR option in CCC with their Fury X? It seems to be missing for me.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Why don't you just return it for vendor swap? RMAing it will likely net you a refurb.
> 
> ~Ceadder


There is no such thing. These guys don't touch the cards other than packaging them.


----------



## littlestereo

Quote:


> Originally Posted by *BIGTom*
> 
> Has anyone seen the VSR option in CCC with their Fury X? It seems to be missing for me.


CCC -> LCD Display Panel tab -> on the right side it's a checkbox. I can screenshot it for you when I'm off work in a couple of hours.


----------



## looncraz

Quote:


> Originally Posted by *the9quad*
> 
> What games are you playing where your 290 plays all games at 60 fps with everything maxed at 3200X1800? Quake?


Battlefield 4, Civ V, Far Cry 3, Hitman Absolution (well, that one averages 52fps), everything maxed, but no AA.

I don't own a game I can't play maxed out at 3200x1800.

I recently modified my GPU BIOS from its stock OC clocks of 1040/1250 to 900/1250 so that my idle clock profiles are more effective. Those clocks cause BF4 to drop into the high 40s at times, but I am usually right around 60. With 1040/1250 I very rarely hit below 50fps, and with 1150/1500 I effectively never do.

GPU also maxes at 72C in BF4 with everything maxed (except AA).

Today, though, I upgraded to a 144hz monitor, so I am running [email protected] and I get over 120fps reliably, all ultra, but still no AA.

I7 2600k @ 4.5GHz, 16GB DDR3-2133 RAM, 5x SSDs (no RAID in use), 50mbps symmetric internet (fiber optic), and all defaults in CCC.

Turning AA on can bring things to a crawl, but I'm not sensitive to aliasing in most games, and have long turned it off just from my dislike of it. VSR makes it completely unneeded, in fact, IMHO. (Well, on my 24" monitors...).


----------



## looncraz

Quote:


> Originally Posted by *Ceadderman*
> 
> Why don't you just return it for vendor swap? RMAing it will likely net you a refurb.
> 
> ~Ceadder


I'm not sure that would be a concern for a card that is only a couple weeks old, max.

No matter what, you're getting a nearly new, or brand new, card.


----------



## royfrosty

Quote:


> Originally Posted by *BIGTom*
> 
> Has anyone seen the VSR option in CCC with their Fury X? It seems to be missing for me.


Here you go...


----------



## DividebyZERO

Quote:


> Originally Posted by *royfrosty*
> 
> Here you go...


Are you getting 4k VSR? or 3k?


----------



## Ganf

Quote:


> Originally Posted by *BIGTom*
> 
> Has anyone seen the VSR option in CCC with their Fury X? It seems to be missing for me.


Are you on Windows 10? VSR seems to be disabled in the Windows 10 drivers for the time being.


----------



## Gdourado

Has anyone installed the Fury X radiator as exhaust together with a large CPU air cooler like a Noctua NH-D15 or the like?
Does it all fit? Is there room for the tubes and the rad?
Can you post a pic, please?

Thanks


----------



## royfrosty

Quote:


> Originally Posted by *DividebyZERO*
> 
> Are you getting 4k VSR? or 3k?


4k vsr.


----------



## BIGTom

Quote:


> Originally Posted by *Ganf*
> 
> Are you on Windows 10? VSR seems to be disabled in the Windows 10 drivers for the time being.


Quote:


> Originally Posted by *royfrosty*
> 
> Here you go...


Quote:


> Originally Posted by *littlestereo*
> 
> CCC->LCD Display Panel tab -> on the right side it's a checkbox, I can screenshot it for you when I'm off work in a couple hours


Thanks, everyone. The VSR option is not visible for me in Windows 8.1 CCC. I will try another round of DDU and a 15.15 driver installation to see if that corrects the issue.


----------



## BIGTom

I am starting to think VSR is not available for native 3440x1440 screens (21:9 aspect ratio).


----------



## Sgt Bilko

Quote:


> Originally Posted by *BIGTom*
> 
> I am starting to think VSR is not available for native 3440x1440 screens.


Umm, no it's not... sorry.

Apologies for the screencap.


----------



## Minotaurtoo

well... due to unexpected costs, cut hours, and mostly being tired of waiting, I've canceled my Amazon order for the Fury X... maybe later, when supplies are better and hours pick back up at work, I'll try again...


----------



## p4inkill3r

Ok, got my Fury X installed.

A few impressions, first:


The build quality is first rate; you can tell that it is a premium product just by the weight and solidity.
The pump, fan, and radiator combination works beautifully in tandem. I had an initial gurgle on first power up as the pump primed itself and nothing since.
I'm going to have to redo my entire wiring setup
I need a new hex key set

I don't have a hex key or screwdriver that will fit the screws holding the cover on, so I cannot determine which revision I have; I will rectify this tomorrow with a trip to Harbor Freight.

Here are just a couple quick pics and benchmarks before I eat dinner.

First, Mordor on my 290 Tri-X @ 1125mhz


Fury @ stock


----------



## Sgt Bilko

Just looking at the Graphics score there, and it's pummeling my 290X....

http://www.3dmark.com/fs/5068571

1200/1375 iirc.

Can you do a run of Firestrike Extreme by chance?

Thanks for the results, it gives me a clearer picture of what to expect.


----------



## DividebyZERO

Quote:


> Originally Posted by *Minotaurtoo*
> 
> well.. .due to odd costs, cut hours and mostly tired of waiting.. I've canceled my amazon order for the fury x.... maybe later when supplies are better and hours pick back up at work I'll try again...


Don't be sad, it may end up better to wait. Newegg had some in stock this morning and I could have ordered, but I didn't. They want to milk people right now because the cards are hard to get. I've decided to wait as well; I still have an air-cooled Fury coming. Possibly dual Fiji in August? I may end up waiting until next year myself. If they manage to unlock voltage and overclocking, and get more performance from drivers or whatever, then I will consider it. Right now it's either overpriced or under-performing, and one of those needs fixing, whichever it may be.

390X/390s are pulling 14k or so in Firestrike, so perhaps at 4K it gains a better lead.


----------



## p4inkill3r

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just looking at the Graphics score thete and its pummeling my 290x....
> 
> http://www.3dmark.com/fs/5068571
> 
> 1200/1375 iirc.
> 
> Can you do a run of Firestrike Extreme by chance?
> 
> Thats for the results, gives me a clearer picture of what to expect




@1125mhz


----------



## Sgt Bilko

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Just looking at the Graphics score thete and its pummeling my 290x....
> 
> http://www.3dmark.com/fs/5068571
> 
> 1200/1375 iirc.
> 
> Can you do a run of Firestrike Extreme by chance?
> 
> Thats for the results, gives me a clearer picture of what to expect

Yep.... that's a stomping

http://www.3dmark.com/fs/5078201

1287/1375

Thanks for that +rep


----------



## p4inkill3r

45C during Firestrike Extreme...

This card has legs once we get the voltage unlocked.


----------



## Jflisk

Picked up the PowerColor this morning, so I have 3x R9 290X for sale. Let's see how well the Furys perform. Also have to change my loop to just my CPU.


----------



## Minotaurtoo

Quote:


> Originally Posted by *p4inkill3r*
> 
> Ok, got my Fury X installed.
> 
> A few impressions, first:
> 
> 
> The build quality is first rate; you can tell that it is a premium product just by the weight and solidity.
> The pump, fan, and radiator combination works beautifully in tandem. I had an initial gurgle on first power up as the pump primed itself and nothing since.
> I'm going to have to redo my entire wiring setup
> I need a new hex key set
> 
> I don't have a hex or screwdriver that will fit the screws holding the cover on so I cannot determine which revision I have; I will rectify this tomorrow with a trip to Harbor Freight.
> 
> Here are just a couple quick pics and benchmarks before I eat dinner.
> 
> First, Mordor on my 290 Tri-X @ 1125mhz
> 
> 
> Fury @ stock


This is another reason I canceled my order... look at my graphics score compared to yours... kind of a sidegrade for me. http://www.3dmark.com/3dm/7376026?
Quote:


> Originally Posted by *DividebyZERO*
> 
> Don't be sad, it may end up better waiting. Newegg had some in stock this morning and i could have ordered but i didn't. They want to milk people right because they are hard to get. I've decided to wait as well, still have air cooled fury coming. Possibly maybe dual fiji in august? I may end up waiting until next year myself. If they manage to unlock voltage and overclock and get more performance from drivers or whatever then i will consider it. Right now its either overpriced, or under-performing and one of those need fixing whichever it may be,


yeah... as I stated above, I started seeing what I was seeing with the 290s and 390s... just not enough of a performance increase over what I have to warrant the price. I might get $300 out of my current cards, since they have never been driven very hard (max volts ever seen was 1.25) and never used for mining...


----------



## provost

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep....thats a stomping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/5078201
> 
> 1287/1375
> 
> Thanks for that +rep


Cool. Looking forward to receiving my Fury!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Minotaurtoo*
> 
> This is another reason I canceled my order... look at my graphics score compared to yours... kinda a side grade for me. http://www.3dmark.com/3dm/7376026?
> yeah.. I also as stated above... started seeing what I was seeing with the 290's and 390's... just not enough performance increase over what I have to warrant the price.. I might would get $300 out of my current cards since they have never been driven very hard... (max volts ever seen was 1.25) and never used for mining...
> 390x/390s are pulling 14k or so in firestrike, so perhaps 4k if gains a better lead.


The thing is, in 3DMark the cards scale nearly 100%, so it's not a real comparison. A Fury X will blow 2 x 280s out of the water. I have not tried Windows 8.1 with my max-OC 290X, but I am sure I can hit a 15K GPU score at 1300/1500. If you look at it that way, the Fury X does not look that good.


----------



## Minotaurtoo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The think is in 3DMark the cards will scale 100% so it's not a real comparison. Fury X will destroy 2 x 280s out of the water. I have not tried Windows 8.1 with my MAX OC 290X but i am sure i can hit 15K GPU score with 1300/1500 OC. If you look at it that way then Fury X does not look that good.


Oh, don't get me wrong... I want to get the Fury... but as I stated a few posts back, with some unexpected costs and a cut in hours at work I can't really justify the expense of what is ultimately a sidegrade for me. I've been watching the benches, and the games I have that I've seen benched I have no issues with... though I do have a few games where CrossFire just refuses to work... and at some point I will be getting a more powerful single-GPU card, whether it be Fury X or whatever's next, IDK atm... don't get your panties in a wad or anything, I wasn't attacking the Fury X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Oh don't get me wrong... I want to get the fury... but as I stated a few posts back... with some unexpected costs and cut in hours at work I can't really justify the expense of what ultimately is a side grade for me... I've been watching the benches and the games I have that I've seen benched I have no issues with... I do have a few games that crossfire just refuses to work... and at some point in time I will be getting a more powerful single gpu card whether it be fury x or whats next IDK atm... don't get your panties in a wad or anything I wasn't attacking the fury x.


Just saying, because I upgraded from 2 x HD 7970s to a 290X and it was a good upgrade. A single GPU is so much more reliable these days.


----------



## DividebyZERO

Well, the 390/390X hit 13.5k+ in Firestrike stock and 14k+ with mild OCs. I guess if you're at 1080p it's something to think about, since the Fury X struggles there right now (compared to the other top-end cards). I'm still waiting for 5K benchmarks, because 4GB of VRAM and all. I am hoping that by the time I revisit the idea of getting a Fury it's at full throttle. For now it's a waiting game, and it may be worth it.


----------



## Minotaurtoo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just saying because i upgrade from 2 x HD 7970s to 290X and was a good upgrade. Single GPU is so much reliable these days.


yes it is... I really hate having to wait... but the AC went out on my wife's car... she's a heart patient, so heat really gets to her... and it is a long way to the doctors and town for that matter... had to fix her air quickly... so that set me back... I'm still trying to sell off some stuff to raise the money... maybe in a month or so I'll have it... hopefully by then stock will be easier to find, and maybe there'll even be a price drop


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Why don't you just return it for vendor swap? RMAing it will likely net you a refurb.
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is no such thing. These guys don't touch the cards other than packaging them.

This is the *second* time I have read this. If this were indeed the case and AMD manufactured *every* card, then what's in it for them to allow anybody to package them for resale under another brand name?

Frankly, until I see proof of this, it's not true afaiac. I've never seen any hint of this with past iterations of AMD cards, and I doubt it is truly the case this time. I could be wrong, but the first time someone stated this I asked for proof and got none.

~Ceadder


----------



## xer0h0ur

Don't call me out unless you want to look like a fool.

http://wccftech.com/amd-fiji-gpu-powering-fury-manufactured-samsung/

It is not my responsibility to track down and source everything for you. Do your own legwork.


----------



## Casey Ryback

Quote:


> Originally Posted by *Ceadderman*
> 
> Frankly until I see proof of this, it's not true afaiac.


The reference designs are never modified by the AIB partners.

In the past they have put their name on the blower cooler or something but other than that it's just packaging that is different.

This goes for Nvidia reference and AMD reference cards.

Once the AIB's get involved you will see quality improve or drop from the reference design.


----------



## Ceadderman

I wasn't *calling anyone out*; I was just pointing out that you shouldn't state something without supporting information.

So yeah, you kinda do have to "do the legwork," as you so eloquently put it.

If I turned in papers to my profs stating facts I didn't cite, they'd be right to grade them with big fat F minuses.

I wasn't even picking a fight with you. So please tone it down a bit okay?

Thank you for the link. But all it shows is Hynix mounting the HBM to the GPU die and shipping the completed packages back to AMD. I don't see how your link makes me look like a fool. But everyone sees things how they see 'em, I guess.

~Ceadder


----------



## rv8000

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The think is in 3DMark the cards will scale 100% so it's not a real comparison. Fury X will destroy 2 x 280s out of the water. I have not tried Windows 8.1 with my MAX OC 290X but i am sure i can hit 15K GPU score with 1300/1500 OC. If you look at it that way then Fury X does not look that good.


That goes for a max-OC 980 vs a reference 980 Ti too; the end result in either case is that the 290X/980 cannot beat their successors even when pushed to their limits. How does that make the card look bad?


----------



## wooshna

Anyone here know where they can buy an R9 Fury X? I've checked Amazon, Newegg, TigerDirect, NCIX US, Best Buy, ExtremeGear, AscendTech, Mwave, PriceWatch, and Cutting Edge Gamer.

Were so few manufactured that no one has them other than the eBay people selling them for $900?


----------



## ban25

Here it is:



There will be a second one joining it on Thursday.

I thought I'd post some compute results, here's PrimeGrid's PPS Sieve...

*PPS Sieve:*

Code:

Sieve started: 104200113000000000 <= p < 104200122000000000
Thread 0 starting
Detected 1024 multiprocessors (5120 SPUs) on device 0.
Device 0 is a 'Advanced Micro Devices, Inc.' 'Fiji'.
GCN device detected; use -m1 --vecsize=4 to undo effect

Thread 0 completed
Sieve complete: 104200113000000000 <= p < 104200122000000000
count=229677660,sum=0x205fbfed049bcfe4
Elapsed time: 516.53 sec. (1.03 init + 515.50 sieve) at 17459066 p/sec.
Processor time: 753.30 sec. (1.03 init + 752.27 sieve) at 11964112 p/sec.
Average processor utilization: 1.00 (init), 1.46 (sieve)
22:37:19 (2152): called boinc_finish

Compared to my R9 290, this is only about 2% faster (516s vs 525s). However, the Fury X is running quite a bit cooler.
The card reached 67C in a room at 29C ambient. My R9 290 typically sits at 80C under this load. Here's the hwinfo sensors during the primegrid run:
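The speedup and the logged throughput can be reproduced straight from the numbers quoted above (a sketch; only the figures in the log and the post are used):

```python
# Elapsed sieve times: Fury X from the log above, R9 290 from the post.
fury_x_s, r9_290_s = 516.53, 525.0
speedup = (r9_290_s / fury_x_s - 1) * 100
print(f"Fury X is {speedup:.1f}% faster")  # ~1.6% by elapsed time

# Cross-check the logged rate over the 9e9-wide sieve window.
p_lo = 104200113000000000
p_hi = 104200122000000000
rate = (p_hi - p_lo) / 515.50  # candidates per second of sieve time
print(f"~{rate:,.0f} p/sec")   # close to the logged 17,459,066 p/sec
```

So the elapsed times put the gap at roughly 1.6%, in line with the "about 2%" above; PPS Sieve appears bound by something other than raw shader count here.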


----------



## Ceadderman

Quote:


> Originally Posted by *wooshna*
> 
> Anyone here know of a or where they can buy a r9 fury x? i've checked amazon, newegg, tigerdirect, ncixus, bestbuy, extremegear, ascendtech, mwave, pricewatch, cutting edge gamer.
> 
> Was there only so few that were manufactured that no one other than the ebay people who are selling them for $900 has them?


Seems like the Egg is expecting two vendors' cards at the moment ("*Coming soon*" with no listed prices), so I would go there and sign up for email notification to try to score the one you want, and ignore eBay altogether unless you like paying more or simply can't wait for a fair price.

~Ceadder


----------



## Mega Man

Quote:


> Originally Posted by *rv8000*
> 
> Miffed with newegg atm, card has been sitting in packaging since thursday and I paid for 3 day shipping this time around.


Yeah, another of my orders is sitting there as well. I bought Friday and one got out but the other hasn't (all from the same order) - still in packaging >.> Kinda weird, and my NAS is waiting for these parts.


----------



## Ceadderman

Why Fury X xFire for a NAS box?

Don't get me wrong, you should do whatever you like, but I think a data-backup box sporting top-end GPUs makes overkill look tame.

~Ceadder


----------



## looncraz

Quote:


> Originally Posted by *Ceadderman*
> 
> Wasn't *calling anyone out* was just pointing out that you shouldn't state something without supportive information.
> 
> So yeah, you kinda do have to "do the legwork" as you so eloquently put it.
> 
> If I turn in a paper to my Profs that state facts that I don't cite for they'd be right to grade them with big fat F minuses.
> 
> I wasn't even picking a fight with you. So please tone it down a bit okay?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you for the link. But all it shows is Hynix is mounting the HBM to the GPU die and shipping the completed dies back to AMD. Don't see how your link makes me look like a fool. But everyone sees things how they see em I guess.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Usually when someone asks a question and the same answer is given by multiple people, the person asking accepts it, or does their own legwork, rather than making other people do the work for them.

At least that's what I do.


----------



## Ceadderman

Two people qualify as multiple? I always considered that "a couple."

~Ceadder


----------



## Gdourado

Quote:


> Originally Posted by *p4inkill3r*
> 
> Ok, got my Fury X installed.
> 
> A few impressions, first:
> 
> 
> The build quality is first rate; you can tell that it is a premium product just by the weight and solidity.
> The pump, fan, and radiator combination works beautifully in tandem. I had an initial gurgle on first power up as the pump primed itself and nothing since.
> I'm going to have to redo my entire wiring setup
> I need a new hex key set
> 
> I don't have a hex or screwdriver that will fit the screws holding the cover on so I cannot determine which revision I have; I will rectify this tomorrow with a trip to Harbor Freight.
> 
> Here are just a couple quick pics and benchmarks before I eat dinner.
> 
> First, Mordor on my 290 Tri-X @ 1125mhz
> 
> 
> Fury @ stock


Is that an Air 540?
What is your CPU cooler?
Can you post more pics on the setup please?

Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *p4inkill3r*
> 
> Ok, got my Fury X installed.
> 
> A few impressions, first:
> 
> 
> The build quality is first rate; you can tell that it is a premium product just by the weight and solidity.
> The pump, fan, and radiator combination works beautifully in tandem. I had an initial gurgle on first power up as the pump primed itself and nothing since.
> I'm going to have to redo my entire wiring setup
> I need a new hex key set
> 
> I don't have a hex or screwdriver that will fit the screws holding the cover on so I cannot determine which revision I have; I will rectify this tomorrow with a trip to Harbor Freight.
> 
> Here are just a couple quick pics and benchmarks before I eat dinner.
> 
> First, Mordor on my 290 Tri-X @ 1125mhz
> 
> 
> 
> Fury @ stock
> 
> 
> Is that an Air 540?
> What is your CPU cooler?
> Can you post more pics on the setup please?
> 
> Thanks

Assuming that's his sig rig, it is indeed an Air 540 and the AIO is a Corsair H100i.


----------



## looncraz

Quote:


> Originally Posted by *Ceadderman*
> 
> Two people qualifies as multiple? I always considered it "a couple".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I thought it was more than that (from earlier in the thread); either way, the point remains.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ban25*
> 
> Here it is:
> 
> 
> 
> There will be a second one joining it on Thursday.
> 
> I thought I'd post some compute results, here's PrimeGrid's PPS Sieve...
> 
> *PPS Sieve:*
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> Sieve started: 104200113000000000 <= p < 104200122000000000
> Thread 0 starting
> Detected 1024 multiprocessors (5120 SPUs) on device 0.
> Device 0 is a 'Advanced Micro Devices, Inc.' 'Fiji'.
> GCN device detected; use -m1 --vecsize=4 to undo effect
> 
> Thread 0 completed
> Sieve complete: 104200113000000000 <= p < 104200122000000000
> count=229677660,sum=0x205fbfed049bcfe4
> Elapsed time: 516.53 sec. (1.03 init + 515.50 sieve) at 17459066 p/sec.
> Processor time: 753.30 sec. (1.03 init + 752.27 sieve) at 11964112 p/sec.
> Average processor utilization: 1.00 (init), 1.46 (sieve)
> 22:37:19 (2152): called boinc_finish
> 
> Compared to my R9 290, this is only about 2% faster (516s vs 525s). However, the Fury X is running quite a bit cooler.
> The card reached 67C in a room at 29C ambient. My R9 290 typically sits at 80C under this load. Here's the hwinfo sensors during the primegrid run:


So you are getting 67C with 29C ambient. At a more typical 20-24C the card would be at 58-62C. How are some people getting low 40s? lol. I say this because my non-OCed 290X + 290 hit 50-55C under gaming load with plenty of rad space.
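The ambient adjustment being done here is simple delta-T arithmetic, sketched below under the assumption that fan and pump speed stay fixed, so the card-to-room delta is roughly constant (real fan curves respond to temperature, so treat it as a first-order estimate):

```python
# Normalize a load temperature to a different room temperature by holding
# the GPU-to-ambient delta constant (valid only for fixed fan/pump speed).
measured_ambient, measured_gpu = 29.0, 67.0
delta = measured_gpu - measured_ambient  # 38 C over ambient

for room in (20.0, 24.0):
    print(f"{room:.0f}C room -> ~{room + delta:.0f}C GPU")
# 20C room -> ~58C GPU
# 24C room -> ~62C GPU
```

That reproduces the 58-62C estimate above from the 67C/29C measurement.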


----------



## blue1512

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So you are getting 67C with 29C Ambient. If you take a typical 20-24C the card would be 58-62C. How are some people getting low 40s lol. I say this because my Not OCed 290X + 290 hits 50-55C during gaming load with plenty of RAD space.


The Fury X BIOS has two pump-control profiles, one topping out at 60C and one at 67C. The 67C profile has a less aggressive fan curve, hence higher temperatures in general. BUT, since it lets the card reach 67C, it has a higher thermal limit, meaning more headroom for voltage and overclocking.


----------



## p4inkill3r

Quote:


> Originally Posted by *Gdourado*
> 
> Is that an Air 540?
> What is your CPU cooler?
> Can you post more pics on the setup please?
> 
> Thanks


I will tonight if you like, but yes, it is an Air 540, and the H100's radiator is mounted up top.


----------



## Jflisk

Hopefully EK drops a block for these; just going to do a hybrid loop for now. Also hoping Newegg is processing orders on these - I have one paid for and it's in packaging.


----------



## rt123

Quote:


> Originally Posted by *Jflisk*
> 
> Hopefully EK drops a block for these; just going to do a hybrid loop for now. Also hoping Newegg is processing orders on these - I have one paid for and it's in packaging.


http://www.overclock.net/t/1563820/ek-releases-amd-radeon-r9-fury-x-full-cover-water-block#post_24138145


----------



## eurostyle360

Are there any preorders for the regular Fury anywhere?


----------



## Jflisk

Quote:


> Originally Posted by *eurostyle360*
> 
> Are there any preorders for the regular Fury anywhere?


Amazon is doing preorders - at least they were. Your best bet is:

http://www.nowinstock.net/


----------



## xer0h0ur

LOL, so if the production of Fury X goes AMD to SK Hynix to AMD to AIB, then you conclude that AIBs are the ones handling RMA repairs? Believe whatever you want, but it seems obvious to me that if AIBs have no hand in its production, they aren't qualified to work on RMA'd cards either.


----------



## ban25

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So you are getting 67C with 29C ambient. At a typical 20-24C ambient the card would be 58-62C. How are some people getting low 40s, lol? I say this because my non-OC'd 290X + 290 hit 50-55C under gaming load with plenty of rad space.


I haven't run a gaming test, apart from a quick Heaven benchmark (during which I neglected to check temps), but any game is going to be significantly less demanding than sieving, especially during loading screens and menus. Also, the GPU fan was only running at 1500rpm, so it seems the BIOS was attempting to hold at 67C, as the poster below suggests.

Tonight I'll try setting the fan to 100% and report back the temps I see.
Quote:


> Originally Posted by *blue1512*
> 
> The Fury X BIOS has two pump control profiles, one topping out at 60C and one at 67C. The 67C one has a less aggressive fan profile, hence higher temps in general. BUT, since it lets the card reach 67C it has more thermal headroom, meaning higher voltage and overclocks.


Interesting...is there a way to toggle between the two profiles?

Cheers!


----------



## blue1512

Quote:


> Originally Posted by *ban25*
> 
> I haven't run a gaming test, apart from a quick Heaven benchmark (during which I neglected to check temps), but any game is going to be significantly less demanding than sieving, especially during loading screens and menus. Also, the GPU fan was only running at 1500rpm, so it seems the BIOS was attempting to hold at 67C, as the poster below suggests.
> 
> Tonight I'll try setting the fan to 100% and report back the temps I see.
> Interesting...is there a way to toggle between the two profiles?
> 
> Cheers!


No, the AIB chooses a profile to ship with their BIOS. Sapphire is confirmed 60C and Asus is confirmed 67C. I'm not sure about the other brands, though.


----------



## akumaburn

Them clocks though...


http://www.3dmark.com/3dm11/10006756


----------



## xer0h0ur

I don't give any weight to non-standard 3DMark testing unless you're comparing multiple runs from the same user relative to each other.


----------



## akumaburn

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't give any weight to non-standard 3DMark testing unless you're comparing multiple runs from the same user relative to each other.


I'm remarking on the clocks, not so much the results. Still, even then the results beat two 980s in SLI - no small feat no matter what tessellation setting was used.


----------



## ban25

Quote:


> Originally Posted by *blue1512*
> 
> No, the AIB chooses a profile to ship with their BIOS. Sapphire is confirmed 60C and Asus is confirmed 67C. I'm not sure about the other brands, though.


Interesting. I have an ASUS, but my second card on the way is a Sapphire. I had no qualms about ordering different AIBs because I assumed the cards were exactly the same. Hopefully there's no problem with this arrangement and worst case, one fan runs faster than the other.


----------



## xer0h0ur

Quote:


> Originally Posted by *akumaburn*
> 
> I'm remarking on the clocks, not so much the results. Still, even then the results beat two 980s in SLI - no small feat no matter what tessellation setting was used.


Well, actually that is the entire point, and it makes that win a hollow victory: Maxwell handles tessellation better than Fury X does, so turning tessellation down is a cheap way to create a win.


----------



## akumaburn

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, actually that is the entire point, and it makes that win a hollow victory: Maxwell handles tessellation better than Fury X does, so turning tessellation down is a cheap way to create a win.


No it doesn't. Even with tessellation off, beating two 980s in SLI is not a hollow victory by any measure.

And again, you seem to be ignoring that my main point is the core and memory clocks he's using.


----------



## Agent Smith1984

Anyone heard much of this talk about NVIDIA cheating IQ within their drivers?

Just curious to see if there is much development on that yet.

Back in the day, GPU reviews ALWAYS compared image quality, but I notice these days you don't see that much anymore.


----------



## xer0h0ur

I don't care what you say: Maxwell does in fact handle tessellation better than Fiji does. Sure, you have a point about the clocks - I can't take anything away from that. That's the first time I've even seen someone use anything higher than 600MHz on the HBM. Regardless, that has nothing to do with my point. A hollow victory is hollow.


----------



## Blackops_2

It's weird to me that memory frequency is having such an impact on scores. There shouldn't be any restriction of bandwidth considering the bus width.


----------



## obababoy

Quote:


> Originally Posted by *Ceadderman*
> 
> Two people qualifies as multiple? I always considered it "a couple".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Why would AMD NOT manufacture these cards? The aftermarket brands are just resellers, but they're necessary to carry the product and spread the marketing load - AMD saves on advertising by having these vendors do it. When aftermarket companies make new PCBs they need to change them completely, or pay AMD more to share its patents. They also have to pay to use the proprietary parts that make up the card as a whole.

To add, I almost want to dismiss your question after you were unwilling to accept the answer you got from multiple people. Yes, multiple, and no, I am not going to dig up everyone else's posts agreeing that AMD gave the other brands access to advertise, market, and put a sticker on a fan. There is no more to it.


----------



## Ceadderman

Oh geez. I have read two comments that stated that AMD builds them. Are there more comments of the sort? Likely, but it's obvious I missed them.









I am not intentionally being stubborn here. So maybe we can dial this back a notch.

I only suggested that I wanted proof that this is the case, and thus far the only thing resembling proof at all lacks the straight skinny about who is doing the manufacturing. Hynix (Korea) gets the dies, adds the HBM, tests them for faults and ships them back to AMD. That's not proof; that's a single piece of the manufacturing process.

I'm not being a jerk about this. I'm not trolling, and I take offense when I'm treated as though I am.

LoneWolf15 wrote a guide some time back on OCN (5 Apr 2010) about "How to: Tell a Reference..."

And in it he states that for 1st-generation cards both nVidia and ATi (at the time) have to design the card and then give the design to the manufacturer to use if they wish to stick to it. XFX actually cut one of the 5770 xFire links completely out of their reference model. I have two of them and I know this to be the case.

This tells me that NO, neither chip maker manufactures 1st-gen cards. The term "Reference" implies that in and of itself - as in "for reference".

So find me proof that AMD "manufactures" bulk runs of Fury X, because everything I know or have seen says no such thing is taking place. They do build reference cards, but those are shipped to the vendors, who are better tooled for mass production of parts.









If you have a link stating this exactly in detail then fine that's what they do. If not, then why pick a fight when you can leave it to someone else who does have the information. Shouting me down is not proof and it's bad form.









~Ceadder


----------



## rv8000

Quote:


> Originally Posted by *Blackops_2*
> 
> It's weird to me that memory frequency is having such an impact on scores. There shouldn't be any restriction of bandwidth considering the bus width.


It isn't, the performance increase is coming from the modified tessellation settings.


----------



## Blackops_2

Quote:


> Originally Posted by *rv8000*
> 
> It isn't, the performance increase is coming from the modified tessellation settings.


Oh, I missed that - was on my phone. Any news on voltage control?


----------



## rv8000

Quote:


> Originally Posted by *Blackops_2*
> 
> Oh, I missed that - was on my phone. Any news on voltage control?


The latest news was that Unwinder was waiting on a Fury X card to use personally before updating AB/Trixx/RivaTuner to control voltage on Fury; it will be unlocked, the real question is when.

And for anyone else that thinks overclocking the HBM is going to get you some magical result, 10 mins of testing easily shows it WILL NOT.

290x Lightning @ 1080/1250, default CCC tess settings -> Gpu score of 12347
290x Lightning @ 1200/1500, tess modified to 2x-> Gpu score of 14740

Results in an increase of 19.4% in GPU score

Fury X @ 1050/500, default CCC tess settings -> Gpu score of 16237
Fury X @ 1145/600, tess modified to 2x -> Gpu score of 19321

Results in an increase of 19% in GPU score

If you want benchmark links I can link them, *but overclocking the HBM on Fury/Fury X is not going to result in massive performance increases PERIOD*.
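The two comparisons above can be checked with a few lines; the numbers are the GPU scores quoted in the post:

```python
# Both cards gain ~19% from the same core-clock bump plus the 2x
# tessellation override, so the extra HBM clock on the Fury X run
# isn't adding anything measurable on top.
def pct_gain(before, after):
    return (after / before - 1) * 100

runs = {
    "290X Lightning": (12347, 14740),  # default tess vs OC + 2x tess
    "Fury X":         (16237, 19321),
}
for card, (stock, oc) in runs.items():
    print(f"{card}: +{pct_gain(stock, oc):.1f}% GPU score")
```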


----------



## Blackops_2

Quote:


> Originally Posted by *rv8000*
> 
> The latest news was that Unwinder was waiting on a Fury X card to use personally before updating AB/Trixx/RivaTuner to control voltage on Fury; it will be unlocked, the real question is when.
> 
> And for anyone else that thinks overclocking the HBM is going to get you some magical result, 10 mins of testing easily shows it WILL NOT.
> 
> 290x Lightning @ 1080/1250, default CCC tess settings -> Gpu score of 12347
> 290x Lightning @ 1200/1500, tess modified to 2x-> Gpu score of 14740
> 
> Results in an increase of 19.4% in GPU score
> 
> Fury X @ 1050/500, default CCC tess settings -> Gpu score of 16237
> Fury X @ 1145/600, tess modified to 2x -> Gpu score of 19321
> 
> Results in an increase of 19% in GPU score
> 
> If you want benchmark links I can link them, *but overclocking the HBM on Fury/Fury X is not going to result in massive performance increases PERIOD*.


As it shouldn't - I didn't expect it to. I do, however, take great interest in seeing what Fiji can do with some voltage. I've yet to see one do 1200 on stock volts.


----------



## rv8000

Quote:


> Originally Posted by *Blackops_2*
> 
> As it shouldn't - I didn't expect it to. I do, however, take great interest in seeing what Fiji can do with some voltage. I've yet to see one do 1200 on stock volts.


At this point I'd expect very similar voltage scaling to Hawaii. I'd expect 1200-1300 for 24/7 use with a custom loop.


----------



## p4inkill3r

Quote:


> Originally Posted by *Blackops_2*
> 
> As it shouldn't - I didn't expect it to. I do, however, take great interest in seeing what Fiji can do with some voltage. I've yet to see one do 1200 on stock volts.


I was sitting at 58C last night running Firestrike Extreme @ 1125mhz. 1200 should be easily attainable before temps become an issue.


----------



## ozyo

ASUS STRIX-R9 FURY
http://www.computeruniverse.net/en/products/90610661/asus-strix-r9fury-dc3-4g-gaming.asp

Chipset AMD Radeon R9 FURY X
Cooling Type water cooling
Cooler Width Dual-Slot
Low Profile no

Interesting.


----------



## Gumbi

Quote:


> Originally Posted by *p4inkill3r*
> 
> I was sitting at 58C last night running Firestrike Extreme @ 1125mhz. 1200 should be easily attainable before temps become an issue.


How hot are the VRMs?


----------



## Gdourado

Finally pulled the trigger.
Ordered a Sapphire Fury X today.
Also took the chance and ordered some other parts to make some changes to my build:
A Corsair 450D case to properly mount the Fury X, a Corsair H110i GT to cool the CPU, and a Lamptron fan controller to keep the noise in check.
Also got a nice promo and ordered a pair of Edifier studio monitors.
Hope it all comes together in a nice build.

Now it's the waiting game for the delivery...









Cheers!


----------



## p4inkill3r

Quote:


> Originally Posted by *Gumbi*
> 
> How hot are the VRMs?


I don't know, as I just was able to do a cursory run through last night.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gdourado*
> 
> Finally pulled the trigger.
> Ordered a Sapphire Fury X today.
> Also took the chance and ordered some other parts to make some changes to my build:
> A Corsair 450D case to properly mount the Fury X, a Corsair H110i GT to cool the CPU, and a Lamptron fan controller to keep the noise in check.
> Also got a nice promo and ordered a pair of Edifier studio monitors.
> Hope it all comes together in a nice build.
> 
> Now it's the waiting game for the delivery...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Nice, let us know how it comes together for you.


----------



## th3illusiveman

Quote:


> Originally Posted by *p4inkill3r*
> 
> Ok, got my Fury X installed.
> 
> A few impressions, first:
> 
> 
> The build quality is first rate; you can tell that it is a premium product just by the weight and solidity.
> The pump, fan, and radiator combination works beautifully in tandem. I had an initial gurgle on first power up as the pump primed itself and nothing since.
> I'm going to have to redo my entire wiring setup
> I need a new hex key set
> 
> 
> 
> 
> 
> 
> 
> 
> I don't have a hex or screwdriver that will fit the screws holding the cover on so I cannot determine which revision I have; I will rectify this tomorrow with a trip to Harbor Freight.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here are just a couple quick pics and benchmarks before I eat dinner.
> 
> First, Mordor on my 290 Tri-X @ 1125mhz
> 
> 
> Fury @ stock


That's not good... (no offense, and thanks for the pics) but the difference between 2800 cores and 4000 should result in much higher scores than that. Seems like you only gained around 10 fps.


----------



## rt123

Quote:


> Originally Posted by *th3illusiveman*
> 
> That's not good... (no offense, and thanks for the pics) but the difference between 2800 cores and 4000 should result in much higher scores than that. Seems like you only gained around 10 fps.


A 30% increase in performance for 45.5% more cores is _acceptable_, considering performance doesn't scale linearly with core count.

Not to mention the 4000 cores are clocked lower.
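The 45.5% figure matches the 2816 to 4096 shader jump (290X to Fiji); a quick sketch of the implied scaling efficiency, using the thread's rough 30% number:

```python
# Perf rarely scales 1:1 with shader count, especially when the wider
# chip (Fiji) also runs a lower core clock than an overclocked 290/290X.
cores_290x, cores_fiji = 2816, 4096
extra_cores = cores_fiji / cores_290x - 1   # ~45.5% more shaders
perf_gain = 0.30                            # ~30% observed in this test
efficiency = perf_gain / extra_cores        # fraction of ideal scaling
print(f"{extra_cores:.1%} more cores -> {efficiency:.0%} of ideal scaling")
```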


----------



## xer0h0ur

While the SP count and TMU count went up, the ROP count stayed the same. Tessellation also improved from Hawaii to Tonga, so assuming Fiji is scaled-up Tonga, it should also have better tessellation performance. I still believe the ROP count is a large limiting factor in lower-resolution performance, but I am waiting to see what gains, if any, people get from more mature drivers.


----------



## Blackops_2

Quote:


> Originally Posted by *p4inkill3r*
> 
> I was sitting at 58C last night running Firestrike Extreme @ 1125mhz. 1200 should be easily attainable before temps become an issue.


1125 was the standard a good Tahiti sample would do on stock volts - in my limited run with Tahiti, at least. I only kept one of four 7970s: three were constantly back in RMA with XFX because they were crap, and I sold the remaining XFX. Bought a reference Diamond and held on to it.

No artifacting? 1125mhz is a good OC on stock volts.


----------



## gamervivek

ROPs are usually limiting at higher resolutions. Fiji doesn't scale up the geometry hardware from Tonga, so tessellation performance might be better, but polygon throughput is about the same in TR's review.


----------



## xer0h0ur

Quote:


> Originally Posted by *gamervivek*
> 
> ROPs are usually limiting at higher resolutions. Fiji doesn't scale up the geometry hardware from Tonga, so tessellation performance might be better, but polygon throughput is about the same in TR's review.


Okay, see - plenty of people say that ROP count affects high-resolution performance more than anything else, yet we see Fiji XT doing work at 4K with far fewer ROPs than Titan X. Explain that.


----------



## ban25

Quote:


> Originally Posted by *xer0h0ur*
> 
> Okay, see - plenty of people say that ROP count affects high-resolution performance more than anything else, yet we see Fiji XT doing work at 4K with far fewer ROPs than Titan X. Explain that.


Perhaps Titan X is bandwidth-bound, but Fury X is ROP/fillrate-bound. In other words, they may both be imbalanced, but in different ways.
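A rough numbers check on that theory - spec-sheet values, not from this thread: 64 ROPs @ 1050MHz and 512 GB/s for Fury X, 96 ROPs @ ~1000MHz and 336.5 GB/s for Titan X:

```python
# Bytes of memory bandwidth available per pixel of ROP throughput:
# Fury X has far more bandwidth per unit of fillrate, Titan X the reverse.
cards = {
    #           ROPs, core MHz, bandwidth GB/s
    "Fury X":  (64,  1050, 512.0),
    "Titan X": (96,  1000, 336.5),
}
for name, (rops, mhz, bw_gbs) in cards.items():
    fillrate_gpix = rops * mhz / 1000   # Gpixels/s
    print(f"{name}: {fillrate_gpix:.1f} Gpix/s fill, "
          f"{bw_gbs / fillrate_gpix:.1f} bytes/pixel")
```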


----------



## p4inkill3r

1135mhz is as high as I can get, anything else crashes the driver.


----------



## p4inkill3r

Quote:


> Originally Posted by *Blackops_2*
> 
> 1125 was the standard a good Tahiti sample would do on stock volts. In my limited run with Tahiti at least. I only kept one out of four 7970s. Three between RMAing constantly with XFX because they were crap, sold the remaining XFX. Bought a reference Diamond and held on to it.
> 
> No artifacting? 1125mhz is a good OC on stock volts.


Zero artifacts.

Fury X @ 1135mhz is 13fps higher than 290 Tri-X @ 1125 in my setup in Shadow of Mordor.


----------



## ZealotKi11er

Quote:


> Originally Posted by *p4inkill3r*
> 
> Zero artifacts.
> 
> Fury X @ 1135mhz is 13fps higher than 290 Tri-X @ 1125 in my setup in Shadow of Mordor.


Very small difference. There might be some CPU limitation at play. Plus that's a 290, not even a 290X, and the 290 series doesn't use the same drivers as Fury and the 390s.


----------



## Orthello

Quote:


> Originally Posted by *Blackops_2*
> 
> 1125 was the standard a good Tahiti sample would do on stock volts. In my limited run with Tahiti at least. I only kept one out of four 7970s. Three between RMAing constantly with XFX because they were crap, sold the remaining XFX. Bought a reference Diamond and held on to it.
> 
> No artifacting? 1125mhz is a good OC on stock volts.


I have fond memories of Tahiti CFX Lightnings @ 1400MHz core, chilled liquid with 1.35v - they cleaned up anything NV had for a while.

Looking forward to Fury X being unlocked.


----------



## th3illusiveman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Very small difference. There might be some CPU limitation at play. Plus that's a 290, not even a 290X, and the 290 series doesn't use the same drivers as Fury and the 390s.


I was thinking the same thing. 13 fps... is 13 fps... I'm sure it would do a lot better at 4K, but at 1440p there is clearly some serious bottleneck.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Orthello*
> 
> I have fond memories of Tahiti CFX Lightnings @ 1400MHz core, chilled liquid with 1.35v - they cleaned up anything NV had for a while.
> 
> Looking forward to Fury X being unlocked.


A 1400MHz HD 7970 has no problem beating a 290X, Titan, or GTX 780. Problem is, it was very hard to get anything over 1300MHz 24/7.


----------



## boredmug

Quote:


> Originally Posted by *ZealotKi11er*
> 
> A 1400MHz HD 7970 has no problem beating a 290X, Titan, or GTX 780. Problem is, it was very hard to get anything over 1300MHz 24/7.


Good lord... I was able to hit 1300 with water on the GPU and heatsinks on the VRMs. Can't imagine 1400.


----------



## p4inkill3r

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Very small difference. There might be some CPU limitation at play. Plus that's a 290, not even a 290X, and the 290 series doesn't use the same drivers as Fury and the 390s.


8320 @ 5GHz


----------



## Orthello

Quote:


> Originally Posted by *ZealotKi11er*
> 
> A 1400MHz HD 7970 has no problem beating a 290X, Titan, or GTX 780. Problem is, it was very hard to get anything over 1300MHz 24/7.


Room-temp liquid was 1250-ish core. At -25C with more voltage it was stable 24/7 @ 1400MHz core. I clocked Far Cry 3 at that setting.
Quote:


> Originally Posted by *boredmug*
> 
> Good lord... I was able to hit 1300 with water on the GPU and heatsinks on the VRMs. Can't imagine 1400.


They were fast cards for the time - the fastest in the world, actually. See the post 293 link below for my first attempt at the World Record, then post 300, which was a World Record at the time (HWBot 3DMark 11 Extreme).

http://forums.extremeoverclocking.com/showthread.php?t=363597&page=15

Never quite got to that level with NV cards, but 1600MHz+ Titan Xs are pretty fast.









I'd love to take Fury X into the 1400s, haha.


----------



## BackwoodsNC

Calm down there, @Orthello. I am going Classified or this - depends on which one comes in stock first, and whether they get voltage control (Fury).


----------



## Orthello

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Calm down there, @Orthello. I am going Classified or this - depends on which one comes in stock first, and whether they get voltage control (Fury).


Backs, I'd love to see you clock either under chilled H2O. Realistically, $$-wise, I'm not going to be able to do anything







so it's up to you guys to provide me with the fun results.


----------



## Mega Man

Quote:


> Originally Posted by *Ceadderman*
> 
> Why Fury X xFire for a NAS box?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't get me wrong, you should do whatever you like but I think a data backup sporting top end GPU make overkill look pensive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Nah, was just letting him know he's not the only one waiting. My NAS is a stock NAS build - no video, no sound, just a server NAS.


----------



## Casey Ryback

Quote:


> Originally Posted by *BackwoodsNC*
> 
> and if they get voltage control(fury).


Just waiting on the software updates (afterburner/trixx etc)

The card itself is not voltage locked on a software or hardware level.

"Alexey Nicolaychuk aka Unwinder, who's the creator of RivaTuner which forms the backbone of almost all 3rd party overclocking tools did not receive a Fury X card at launch. Which is why we've yet to see an update come out for MSI Afterburner to enable over-volting of Fury X cards."

http://wccftech.com/unlock-memory-overclocking-amd-r9-fury/

People are hoping voltage becomes adjustable in time for the air-cooled Fury launch.


----------



## bonami2

Quote:


> Originally Posted by *p4inkill3r*
> 
> 8320 @ 5GHz


Well, that score reminds me how poor my 4790K is in multithread - I do 13k at 4.7...

Those FX chips are awesome.


----------



## Casey Ryback

Quote:


> Originally Posted by *bonami2*
> 
> Well, that score reminds me how poor my 4790K is in multithread - I do 13k at 4.7...
> 
> Those FX chips are awesome.


Yep, 13k is pretty poor considering the chip is over 2x the price.


----------



## Thoth420

Quote:


> Originally Posted by *Casey Ryback*
> 
> Just waiting on the software updates (afterburner/trixx etc)
> 
> The card itself is not voltage locked on a software or hardware level.
> 
> "Alexey Nicolaychuk aka Unwinder, who's the creator of RivaTuner which forms the backbone of almost all 3rd party overclocking tools did not receive a Fury X card at launch. Which is why we've yet to see an update come out for MSI Afterburner to enable over-volting of Fury X cards."
> 
> http://wccftech.com/unlock-memory-overclocking-amd-r9-fury/
> 
> People are hoping voltage becomes adjustable in time for the fury air cooled launch.


He can borrow mine....sitting in the box still...


----------



## en9dmp

Quote:


> Originally Posted by *p4inkill3r*
> 
> Zero artifacts.
> 
> Fury X @ 1135mhz is 13fps higher than 290 Tri-X @ 1125 in my setup in Shadow of Mordor.


Am I doing something wrong here? I can't even get an extra 25MHz out of either of my cards in CrossFire without an immediate driver crash, let alone artifacting... Is there some eco BIOS on these things that I'm using by mistake? They do have a BIOS switch, right?


----------



## ozyo

any 3 way crossfire benchmark yet ?


----------



## Gregster

This is how I roll....


----------



## p4inkill3r

Quote:


> Originally Posted by *en9dmp*
> 
> Am I doing something wrong here? I can't even get an extra 25MHz out of either of my cards in CrossFire without an immediate driver crash, let alone artifacting... Is there some eco BIOS on these things that I'm using by mistake? They do have a BIOS switch, right?


What model(s) do you own? What PSU?


----------



## DividebyZERO

Quote:


> Originally Posted by *Gregster*
> 
> This is how I roll....


What the....

Hehe


----------



## p4inkill3r

Quote:


> Originally Posted by *bonami2*
> 
> Well, that score reminds me how poor my 4790K is in multithread - I do 13k at 4.7...
> 
> Those FX chips are awesome.


I could go and spend $1k for a 5960x and own every benchmark if I wanted to; I _choose_ to use AMD.

Here is my old 4770k @ 4.4Ghz with the Tri-X 290 for comparison's sake: http://www.3dmark.com/fs/2546153

8320 with the same card: http://www.3dmark.com/fs/5038904

Not $200 worth of difference IMO.


----------



## en9dmp

Quote:


> Originally Posted by *p4inkill3r*
> 
> What model(s) do you own? What PSU?


A pair of Sapphires, undoubtedly first batch as I got them on launch day. The PSU is a brand-new Corsair AX1200i.


----------



## p4inkill3r

Have you tried OCing them in a single GPU setup?


----------



## bonami2

Quote:


> Originally Posted by *p4inkill3r*
> 
> I could go and spend $1k for a 5960x and own every benchmark if I wanted to; I _choose_ to use AMD.
> 
> Here is my old 4770k @ 4.4Ghz with the Tri-X 290 for comparison's sake: http://www.3dmark.com/fs/2546153
> 
> 8320 with the same card: http://www.3dmark.com/fs/5038904
> 
> Not $200 worth of difference IMO.


Well, for me it was single-thread performance: my games doubled and almost tripled FPS in Arma 3 and BeamNG.


----------



## DividebyZERO

Quote:


> Originally Posted by *bonami2*
> 
> Well, for me it was single-thread performance: my games doubled and almost tripled FPS in Arma 3 and BeamNG.


Nothing like paying more for hardware to run poorly programmed games. It's funny how in 2015 we still have games developed for a single thread. Of course this is more relevant at lower resolutions.


----------



## p4inkill3r

Quote:


> Originally Posted by *bonami2*
> 
> Well, for me it was single-thread performance: my games doubled and almost tripled FPS in Arma 3 and BeamNG.


Yeah, I don't play that game.


----------



## Duality92

Has anyone here tried folding on these cards yet? Have you seen what kind of results they yield with current work units? Quite a few of us are interested in folding numbers for these cards!


----------



## xer0h0ur

I'll just leave this here for you DVI boys to


----------



## Zanpakuto

Quote:


> Originally Posted by *xer0h0ur*
> 
> I'll just leave this here for you DVI boys to


That's a nice looking R9 390X. Source?


----------



## blue1512

An Asus 980 Ti STRIX with an AMD sticker, most likely.


----------



## xer0h0ur

Actually that is a Fury

http://wccftech.com/asus-strix-radeon-r9-fury-graphics-card-pictured-3584-sps-directcu-iii-triple-fan-cooler/


----------



## hamzta09

http://www.sweclockers.com/nyhet/20796-sapphire-radeon-r9-fury-tri-x-i-bilder-och-specifikationer-pa-natet

Fury Tri-X.

That cooler is like twice the size of the PCB.


----------



## blue1512

Quote:


> Originally Posted by *xer0h0ur*
> 
> Actually that is a Fury
> 
> http://wccftech.com/asus-strix-radeon-r9-fury-graphics-card-pictured-3584-sps-directcu-iii-triple-fan-cooler/


Image recycling at its best. It's wccftech, after all.


----------



## xer0h0ur

Whatever, I'm not here to convince some rando.


----------



## bonami2

Quote:


> Originally Posted by *DividebyZERO*
> 
> Nothing like paying more for hardware to run poorly programmed games. It's funny how in 2015 we still have games developed for a single thread. Of course this is more relevant at lower resolutions.


BeamNG is multithreaded.

DirectX causes a bottleneck on the first core.

After that, each car is loaded onto a different core/thread.

But if you want a high-quality car with lots of moving parts, you still need single-thread performance.

Multithreading ain't magic.


----------



## Agent Smith1984

Anyone got 4 of these in a mining rig yet?


----------



## rt123

Mining has been dead for a while now.


----------



## ban25

Quote:


> Originally Posted by *Duality92*
> 
> Has anyone here tried folding on these cards yet? Have you seen what kind of results they yield with current work units? Quite a few of us are interested in folding numbers for these cards!


I tried, but I wasn't able to get a GPU work unit. Assuming FAH is single-precision, it should scale linearly with the number of stream processors. Otherwise, I wouldn't expect any real gain -- my primegrid results earlier in this thread show negligible improvement over a stock 290, roughly 2%.
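To put that 2% in context, here's the ideal shader-bound speedup from spec-sheet numbers (2560 SPs @ 947MHz for a stock 290, 4096 @ 1050MHz for Fury X); the assumption is that the sieve is actually bottlenecked elsewhere:

```python
# If the sieve were purely shader-bound, throughput should scale with
# SPs x clock; the observed ~2% says it is limited by something else.
sp_290, mhz_290 = 2560, 947      # stock R9 290
sp_fury, mhz_fury = 4096, 1050   # Fury X
ideal = (sp_fury * mhz_fury) / (sp_290 * mhz_290)
observed = 525 / 516.53          # runtimes quoted earlier in the thread
print(f"ideal: {ideal:.2f}x, observed: {observed:.2f}x")
```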


----------



## Duality92

Quote:


> Originally Posted by *ban25*
> 
> I tried, but I wasn't able to get a GPU work unit. Assuming FAH is single-precision, it should scale linearly with the number of stream processors. Otherwise, I wouldn't expect any real gain -- my primegrid results earlier in this thread show negligible improvement over a stock 290, roughly 2%.


That's not too promising. I guess NVIDIA will reign over folding for quite a bit longer, then.


----------



## tx12

Fury owners, what's the PCI revision ID of your Fury chip?
You can check it in Device Manager -> display adapter -> Details -> select Hardware Ids. It looks like PCI\VEN.....&REV_XX, where XX is the REV ID.
It should be C8 or CB.
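If you'd rather script it than click through Device Manager, the REV ID can be pulled out of the hardware ID string with a small regex (the `DEV_7300` below is Fiji's device ID; the SUBSYS value is made up for illustration). On Linux it's simply `cat /sys/bus/pci/devices/<slot>/revision`.

```python
import re

# Extract the two-hex-digit REV ID from a Windows PCI hardware ID string.
def pci_rev(hardware_id):
    m = re.search(r"REV_([0-9A-Fa-f]{2})", hardware_id)
    return m.group(1).upper() if m else None

# Hypothetical Fiji hardware ID (DEV_7300 = Fiji):
print(pci_rev(r"PCI\VEN_1002&DEV_7300&SUBSYS_0B361002&REV_CB"))  # CB
```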


----------



## Duality92

Quote:


> Originally Posted by *tx12*
> 
> Fury owners, what's the PCI revision ID of your Fury chip?
> You can check it in Device Manager -> display adapter -> Details -> select Hardware Ids. It looks like PCI\VEN.....&REV_XX, where XX is the REV ID.
> It should be C8 or CB.


Does it make any difference whether it's one or the other?


----------



## josephimports

Quote:


> Originally Posted by *tx12*
> 
> Fury owners, what's PCI Revision ID of your Fury chip?
> You can check it in Device manager -> Adapter -> Details -> select Hardware IDs. PCI\VEN.....&REV_XX, and XX is the REV ID.
> Should be C8 or CB.


C8.


----------



## tx12

Quote:


> Originally Posted by *Duality92*
> 
> Does it make any difference whether it's one or the other?


I really have no idea.
They could possibly distinguish different Fury variants by Rev ID now (like different REVID's for Fury X, Fury, Fury nano, etc), instead of DEV ID (but why?).
Or these could be just ASIC versions, like higher is better.

Interesting thing, Linux driver seem to know about 4 different revisions, while windows currently handles only two.


----------



## tx12

Quote:


> Originally Posted by *josephimports*
> 
> C8.


Thanks. In that case, if all Fury X cards are C8, CB could be the non-X Fury. Maybe.


----------



## hyp36rmax

*+OP AMD Catalyst 15.7 Drivers*

Quote:


> This driver provides full WDDM 2.0 support for Windows 10 Technical Preview and DirectX 12 on all Graphics Core Next (GCN) supported products - AMD Radeon HD 7000 and newer graphics products. Official driver support for AMD products will be available when Microsoft launches Windows 10 on July 29th, 2015.
> 
> NOTE: To ensure stability, users should upgrade to the latest available Windows 10 Technical Preview build provided by Microsoft before installing AMD Catalyst 15.7.
> 
> *Virtual Super Resolution (VSR):*
> VSR provides image quality enhancements to games and Windows desktop users by rendering images at a higher resolution and then down-scaling them.
> 
> *VSR support has now been extended to the following products:*
> 
> 
> AMD Radeon R9 Fury Series
> AMD Radeon R9 390 Series
> AMD Radeon R7 370 Series
> AMD Radeon R7 360 Series
> AMD Radeon R9 295X2 Series
> AMD Radeon R9 290 Series
> AMD Radeon R9 280 Series
> AMD Radeon R9 270 Series
> AMD Radeon R7 260 Series
> AMD Radeon R9 380 Series
> AMD Radeon HD 7900 Series
> AMD Radeon HD 7800 Series
> AMD Radeon HD 7790 Series
> Desktop A-Series 7400K APUs and above
> 
> *Supported resolutions → Supported VSR Modes*
> 
> 
> 1366 X 768 @ 60Hz → 1600 X 900, 1920 X 1080
> 1600 X 900 @ 60Hz → 1920 X 1080
> 1920 X 1080 @ 60Hz → 2560 X 1440, 3200 X 1800, 3840 X 2160 (AMD Radeon R9 285, AMD Radeon R9 Fury Series)
> 1920 X 1200 @ 60Hz → 2048 X 1536, 2560 X 1600, 3840 X 2400 (AMD Radeon R9 285, AMD Radeon R9 Fury Series)
> 2560 X 1440 @ 60Hz → 3200 X 1800
> 1920 X 1080 @ 120Hz → 1920 X 1200 @ 120Hz, 2048 X 1536 @ 120Hz
> 
> *Frame Rate Target Control (FRTC):*
> 
> 
> FRTC allows the user to set a maximum frame rate when playing an application in full screen exclusive mode. This feature provides the following benefits:
> Reduced GPU power consumption
> Reduced system heat
> Lower fan speeds and less noise
> 
> *Supported Graphics Cards*
> 
> 
> AMD Radeon R9 Fury Series
> AMD Radeon R9 390 Series
> AMD Radeon R9 380 Series
> AMD Radeon R7 370 Series
> AMD Radeon R7 360 Series
> AMD Radeon R9 295X2 Series
> AMD Radeon R9 290 Series
> AMD Radeon R9 280 Series
> AMD Radeon R9 270 Series
> AMD Radeon R7 260 Series
> AMD Radeon HD 7900 Series
> AMD Radeon HD 7800 Series
> AMD Radeon HD 7700 Series
> 
> *AMD FreeSync and AMD CrossFire Support:*
> AMD FreeSync and AMD CrossFire can now be used together in applications using DirectX 10 or higher. Please note, this feature is currently not supported on systems configured in AMD Dual Graphics mode.
> 
> *AMD Catalyst 15.7 includes enhancement for the following games since AMD Catalyst Omega:*
> 
> 
> Battlefield: Hardline
> Evolve
> Far Cry 4
> Lords of the Fallen
> Project CARS
> Total War: Attila
> Alien: Isolation
> Assassin's Creed Unity
> Civilization: Beyond Earth
> FIFA 2015
> GRID Autosport
> Ryse: Son of Rome
> Talos Principle
> The Crew
> Grand Theft Auto V
> Dying Light
> The Witcher 3: Wild Hunt
> 
> *Performance Optimizations versus AMD Catalyst Omega Single GPU performance on Windows 8.1 based system:*
> 
> 
> Up to 7% in Far Cry 4 on AMD Radeon R7 and AMD Radeon R9 200 series and up
> Up to 10% in Tomb Raider on AMD Radeon R7 and AMD Radeon R9 200 series and up


*Source:* Link

*Download:* Link

Go get em! Looks like FreeSync support for CrossFire! And a ton of updates! Someone report back with Project CARS and The Witcher 3: Wild Hunt performance.


----------



## tx12

Yep, juicy stuff.
Finally, unified drivers and Linux support for the R9 300 series + Fury.


----------



## tx12

Quote:


> Originally Posted by *josephimports*
> 
> C8.


BTW, is it a retail or an ES (engineering sample) card?


----------



## ozyo




----------



## p4inkill3r

Dat exposed PCB :/


----------



## xer0h0ur

Be prepared to apparently provide a source and prove it since people can't just take information at face value.


----------



## Forceman

Quote:


> Originally Posted by *ozyo*


Looks familiar


----------



## xer0h0ur

Well the Fury Tri-X is rocking the same reference PCB as the Fury X, and if this picture were accurate then that would be a custom PCB for sure.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well the Fury Tri-X is rocking the same reference PCB from Fury X and if this picture were accurate then that would be a custom PCB for sure.


Doesn't really make any sense to put a full-length PCB on a Fury. Why pay the extra expense for what would likely be just a blank PCB? More likely the picture is a fake/re-used 390X picture.


----------



## xer0h0ur

Sure, it's much more likely a re-used photo than a custom PCB.


----------



## Duke976

*Battle Of The Kings Quad Crossfire R9 Fury X vs 4 Way SLI GTX Titan X & 980 Ti*


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> Be prepared to apparently provide a source and prove it since people can't just take information at face value.


Although both sides of that fence can be lazy... it's rather simple to fact-check with a quick Google search or to post a link to the source. Still, it's just good practice to provide a solid source in any debate.


----------



## Ceadderman

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Be prepared to apparently provide a source and prove it since people can't just take information at face value.
> 
> 
> 
> Although both sides of that fence can be lazy... it's rather simple to fact-check with a quick Google search or to post a link to the source. Still, it's just good practice to provide a solid source in any debate.
Click to expand...

Exactly. I am on my phone for the time being, so Googling gets tedious unless I know right where to look.

So with this BS (until otherwise proven) that AMD mass-produced the Fury X and simply distributed it to the manufacturers, if I don't know what to look for, I'm not doing it.

But I can show that the term "Reference" relates to cards they *have built* and distributed "for reference", for partners to build the final product from and to adhere to unless improving upon the "Reference" design. It's the way things are done, even with 1st-generation cards. "Reference" is the advised standard for manufacturing.

Still have yet to see anything from the people banging the "AMD builds them" drum, but whatever.









~Ceadder


----------



## xer0h0ur

Well that is the thing though, you can't prove leaked information / photos. All you can do is pass it along.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well that is the thing though, you can't prove leaked information / photos. All you can do is pass it along.


Agreed. But over the last 5 years I've learned to dismiss a lot of that leaked info and innuendo as rumor, and to ignore it as such unless I have hard evidence to cite for my stance. Hence why I posted the information that LoneWolf15 provided us back when the 5*** Reference cards were the rage. I know it to be factual, otherwise I wouldn't dream of passing it along.









~Ceadder


----------



## yawa

Not to cut off this argument, but is anyone benching a Fury X with the new 15.7 drivers? Wondering if it gives a boost.


----------



## p4inkill3r

del


----------



## dir_d

Dat quad Fury X....


----------



## Agent Smith1984

Anybody tried trixx 5.0 to see if voltage control works?


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody tried trixx 5.0 to see if voltage control works?


Do you have a link for that? The latest I can find on their website or through Google is 4.9.1.


----------



## Casey Ryback

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quick SOM runs.


That looks higher than your previous runs........

Different settings?


----------



## blue1512

Quote:


> Originally Posted by *Casey Ryback*
> 
> That looks higher than your previous runs........
> 
> Different settings?


Different driver, I believe. That's a 15% gain at 1440p with the latest driver.

It seems all the reviews should be redone. The air-cooled Fury will surely be a hit in reviews too.


----------



## Ceadderman

Sapphire Nitro cards are being covered in CPUmag. They state that the Nitro nomenclature covers everything from the R7 360 all the way up to the R9 390.

It's a pretty solid read for those wishing to hold off on the Fury X because of its CLC and go with air instead.









ComputerPowerUser.com









No technical reviews however. It's just a basic review of the Sapphire Nitro lineup overall.

~Ceadder


----------



## ban25

My second Fury X arrived, but unfortunately my 850W PSU can't run both cards...it shuts down on boot...so now the question is Corsair AX1500i, Antec HCP 1300, or PCP&C Silencer 1200?


----------



## Ceadderman

Try an EVGA 1000w or 1200w. Very good units from what I've read.









~Ceadder


----------



## bonami2

Quote:


> Originally Posted by *ban25*
> 
> My second Fury X arrived, but unfortunately my 850W PSU can't run both cards...it shuts down on boot...so now the question is Corsair AX1500i, Antec HCP 1300, or PCP&C Silencer 1200?


@shilka @TwoCables You can ask them


----------



## Zealon

My Fury X comes in on Friday


----------



## Mad Pistol

Quote:


> Originally Posted by *blue1512*
> 
> Different driver I believe. *That's 15% gain at 1440p with the latest driver.*
> 
> It seems that all the reviews should be redone. Also Fury air will surely be a hit in reviews.


15% is a massive jump in performance simply from a driver revision.

If true, then AMD's Fury X is just as powerful as we were led to believe in the first place... a proverbial beast.


----------



## edo101

Quote:


> Originally Posted by *Mad Pistol*
> 
> 15% is a massive jump in performance simply from a driver revision.
> 
> If true, then AMD's Fury X is just as powerful as we were led to believe in the first place... a proverbial beast.


No kidding. I honestly don't understand people on OCN at all. We all know from history that AMD cards get better as drivers roll out. The logic that you have to buy NOW, within the next femtosecond, to get max performance is something I never understood.

I just feel bad for AMD because they can't make a big splash, with early reviews downplaying the potential of the cards.
Quote:


> Originally Posted by *ban25*
> 
> My second Fury X arrived, but unfortunately my 850W PSU can't run both cards...it shuts down on boot...so now the question is Corsair AX1500i, Antec HCP 1300, or PCP&C Silencer 1200?


Go with an EVGA. From what I hear they are the gold standard these days


----------



## p4inkill3r

Quote:


> Originally Posted by *Mad Pistol*
> 
> 15% is a massive jump in performance simply from a driver revision.
> 
> If true, then AMD's Fury X is just as powerful as we were led to believe in the first place... a proverbial beast.


Before wccftech starts quoting this thread, I went back and reran the benchmarks:


These are the settings my initial runs were subject to, I believe.

I have reinstalled the game now twice since my Fury's arrival and I think that the runs showing higher results were at these settings:


I apologize for the confusion, and I will keep the game installed for the time being.


----------



## Thoth420

Quote:


> Originally Posted by *ban25*
> 
> My second Fury X arrived, but unfortunately my 850W PSU can't run both cards...it shuts down on boot...so now the question is Corsair AX1500i, Antec HCP 1300, or PCP&C Silencer 1200?


Super Flower OEM high-wattage Gold or Platinum, imo. EVGA G2 or be quiet! Dark Power Pro, or anything else of the same build as those that you like. I always give myself tons of overhead with power.


----------



## p4inkill3r

Now I have no clue what the difference is because even with FXAA off, I'm only getting a little over 70FPS at 1440p.

I'm going to start all over ;/


----------



## ban25

Quote:


> Originally Posted by *Ceadderman*
> 
> Try an EVGA 1000w or 1200w. Very good units from what I've read.


Quote:


> Originally Posted by *edo101*
> 
> Go with an EVGA. From what I hear they are the gold standard these days


Quote:


> Originally Posted by *Thoth420*
> 
> Superflower OEM high wattage gold or plat imo. EVGA G2 or BeQuiet! Dark Power Pro or anything else that are the same build as those that you like. I always give myself tons of overhead with power.


I *do* like Super Flower units and the EVGA T2 looks quite good...thanks!


----------



## p4inkill3r




----------



## ozyo

Quote:


> Originally Posted by *Forceman*
> 
> Looks familiar


http://www.computeruniverse.net/en/products/90610661/asus-strix-r9fury-dc3-4g-gaming.asp


----------



## Blackops_2

Quote:


> Originally Posted by *ban25*
> 
> My second Fury X arrived, but unfortunately my 850W PSU can't run both cards...it shuts down on boot...so now the question is Corsair AX1500i, Antec HCP 1300, or PCP&C Silencer 1200?


Seasonic X1250 Gold also. Though it might be on the pricier side it's solid.


----------



## blue1512

Quote:


> Originally Posted by *Derp*
> 
> Changing from 15.6 to 15.7 produced +43% DX11 multi-threaded draw calls and +34% DX11 single-threaded draw calls on my system.


The Fury X loves this. With this boost, the performance issue at 1080p/1440p should be fixed.


----------



## Casey Ryback

Quote:


> Originally Posted by *ozyo*
> 
> http://www.computeruniverse.net/en/products/90610661/asus-strix-r9fury-dc3-4g-gaming.asp


The picture isn't accurate, but we can presume it's using that cooler.

Note the description says no DVI, when there is clearly a DVI port in the pic.

Also, the PCB is full length, so it's most likely a 390X.


----------



## Blackops_2

Was going to say there is no way it's that big lol


----------



## Casey Ryback

Quote:


> Originally Posted by *Blackops_2*
> 
> Was going to say there is no way it's that big lol


It will be that big if the Sapphire Fury is anything to go by.



The air passing through the heatsink and out the back of the card should allow for good temps.


----------



## ECPowers

Get Trixx 5.0 here:

http://www.sapphiretech.com/productdetial.asp?pid=69ED4799-7518-434C-80CD-3FF8811F8648&lang=eng

TRIXX Features:

Performance Tune Memory and GPU Clock Speeds
Over and Under Voltage adjustment
Save configurations
Graphical H/W Monitor
H/W log file
Feedback option

Can anyone check voltage? My Fury is away for RMA.


----------



## Silent Scone

Quote:


> Originally Posted by *ECPowers*
> 
> Get Trixx 5.0 here:
> 
> http://www.sapphiretech.com/productdetial.asp?pid=69ED4799-7518-434C-80CD-3FF8811F8648&lang=eng
> 
> TRIXX Features:
> 
> Performance Tune Memory and GPU Clock Speeds
> Over and Under Voltage adjustment
> Save configurations
> Graphical H/W Monitor
> H/W log file
> Feedback option
> 
> Can anyone check voltage? My Fury is away for RMA.


That's been there a few days, no volt control yet. Adrian Thompson from Sapphire said it would be a week or two, that was sometime last week.


----------



## bkvamme

Has anybody tried out the new 15.7 drivers? It would be extremely interesting to see if the new driver has improved performance compared to the launch driver.


----------



## Gumbi

I'd love to see some proper benches ASAP


----------



## xer0h0ur

I am itching to see a reviewer re-test with 15.7. If the DX11 draw calls went up that much, then there is bound to be a tangible change in FPS at lower resolutions. When I say lower, it's all relative. I am at 4K.


----------



## Partogi

What are the advantages of this card compared to the 980 Ti? As far as I know the Ti is faster and more overclockable.


----------



## Gdourado

Quote:


> Originally Posted by *Partogi*
> 
> What are the advantages of this card compared to the 980 Ti? As far as I know the Ti is faster and more overclockable.


For me it's temperature and silence.
Sure, a 980 Ti is fast, but it gets so hot that to hold the overclock you mention, the fans have to run at least at 70% speed.
On a custom 980 Ti with 3 fans, like the Gigabyte, Asus or Zotac ones, that is really loud!

Also, the Fury was neck and neck with the Ti on launch drivers...
It will surely get better.
Also, AMD actually improves their products with driver releases.
Nvidia cuts the performance of previous-generation products to force an upgrade...

For all of the above, I ordered a Fury X.

Cheers!


----------



## bkvamme

Comparison video:
Quad Fury X, 980Ti and Titan X.




Numbers are pretty damn good. I wonder how much the Fury X can be overclocked. Would love to see it at the top of all of these.

*EDIT: For those too lazy to watch the video:*
Test system
Motherboard: Asus X99 ROG Rampage V Extreme
CPU: Intel i7 5960X 3.50 GHz
Memory: 32GB Corsair Dominator DDR4 3000
SSD: 250GB Samsung 850 EVO
PSU: Corsair AX1500i 1500 watt
Monitor: Asus 4K Monitor

3DMark 11 Extreme 4K - 3840x2160
4x 980Ti - 8468
4x 980Ti OC - 9961
4x Titan X - 9310
4x Titan X OC - 10287
4x Fury X - 9557
4x Fury X OC - 9893

3DMark Extreme 2.5K - 2560x1440
4x 980Ti - 19682
4x 980Ti OC - 21400
4x Titan X - 19500
4x Titan X OC - 21515
4x Fury X - 20771
4x Fury X OC - 20983

3DMark Firestrike Ultra 4K - 3840x2160
4x 980Ti - 11894
4x 980Ti OC - 14647
4x Titan X - 12123
4x Titan X OC - 14411
4x Fury X - 12577
4x Fury X OC - 13921

Unigine Valley Ultra No AA 4K - 3840x2160
4x 980Ti - 91.9
4x 980Ti OC - 95.3
4x Titan X - 93.5
4x Titan X OC - 94
4x Fury X - 94.2
4x Fury X OC - 95.7

Metro Last Light Very High no SSAA 4K - 3840x2160
4x 980Ti - 42.48
4x 980Ti OC - 47.12
4x Titan X - 41.21
4x Titan X OC - 46.87
4x Fury X - 65.87
4x Fury X OC - 72.6

Tomb Raider Ultra No TressFX 4K - 3840x2160
4x 980Ti - 177.7
4x 980Ti OC - 179.8
4x Titan X - 179.7
4x Titan X OC - 180.4
4x Fury X - 235.1
4x Fury X OC - 244.5

Far Cry 4 Very High 4K - 3840x2160
4x 980Ti - 81.3
4x 980Ti OC - 87.6
4x Titan X - 65.9
4x Titan X OC - 80.3
4x Fury X - 86.5
4x Fury X OC - 90.2

My comment: OC results for the Fury X are pretty weak so far, but overall, very promising. Excited to see how the card performs with unlocked voltage control.
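To put numbers on the "OC results are pretty weak" observation, here is a quick Python sketch computing the OC scaling from the Firestrike Ultra 4K scores listed above:

```python
# Percent gain from overclocking, using the quad-card 3DMark Firestrike Ultra 4K
# scores listed above as (stock score, OC score).
def oc_gain(stock, oc):
    return (oc - stock) / stock * 100

firestrike_ultra = {
    "4x 980 Ti": (11894, 14647),
    "4x Titan X": (12123, 14411),
    "4x Fury X": (12577, 13921),
}

for card, (stock, oc) in firestrike_ultra.items():
    print(f"{card}: +{oc_gain(stock, oc):.1f}% from OC")
# 980 Ti gains ~23.1%, Titan X ~18.9%, Fury X only ~10.7%
```

So in that test the Fury X picks up less than half the OC scaling of the 980 Ti, which is consistent with the cards being stuck without voltage control for now.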


----------



## Heimdallr

Guys, one quick question: is it possible to swap the fan on the Fury X? Do I need to open the card? I have an SP120 that I would like to use.

Thanks


----------



## bkvamme

Quote:


> Originally Posted by *Heimdallr*
> 
> Guys one quick question, it's possible to swap the fan in the Fury X? Do i need to open the card? I have a SP120 that I would like to use.
> 
> Thanks


It appears to be a non-standard connector inside the card, but it should be normal DuPont connectors inside the plug. If you have a needle or a thin pin, you should be able to extract the pins and replace the header on your SP120 (assuming it is a PWM SP120) and directly swap it. You definitely have to take off the front if you want to do this, but that should be enough. If there is very little space in the hole where the cables/tubing go into the card, just take off the plug and only feed the fan cables through.

If you need instructions on how to take off the plug, just see some of the fan sleeving guides. Lutr0 has a good one on YouTube. The same procedure applies here.


Spoiler: Warning: Spoiler!


----------



## sugarhell

Quote:


> Originally Posted by *Heimdallr*
> 
> Guys one quick question, it's possible to swap the fan in the Fury X? Do i need to open the card? I have a SP120 that I would like to use.
> 
> Thanks


Gentle typhoon >>>> SP120


----------



## p4inkill3r

Quote:


> Originally Posted by *Heimdallr*
> 
> Guys one quick question, it's possible to swap the fan in the Fury X? Do i need to open the card? I have a SP120 that I would like to use.
> 
> Thanks


Are you wanting to use the SP120 for looks or what? The stock fan is pretty high-end.


----------



## bkvamme

Quote:


> Originally Posted by *sugarhell*
> 
> Gentle typhoon >>>> SP120


Yes indeed. The everlasting question of looks <> performance.


----------



## Heimdallr

Quote:


> Originally Posted by *bkvamme*
> 
> Appears to be a non-standard connector inside the card, but it should be normal dupont connectors inside the plug. If you have a needle or a thin pin, you should be able to extract the pins and replace the header on your SP120 (Assuming that it is a PWM SP120), and directly swap it. You definitly have to take off the front if you want to do this, but that should be enough. If there is very little space in the hole where the cables/tubing go into the card, you just take off the plug and only take the fan cables through.
> 
> If you need instructions in how to take off the plug, just see some of the fan sleeving guides. Lutr0 has a good one on YouTube. Same procedure apply here.
> 
> 
> Spoiler: Warning: Spoiler!


thanks for the insight.
Quote:


> Originally Posted by *sugarhell*
> 
> Gentle typhoon >>>> SP120


Sure, but i have a spare SP120 at home, not a gentle typhoon








Quote:


> Originally Posted by *p4inkill3r*
> 
> Are you wanting to use the SP120 for looks or what? The stock fan is pretty high-end.


Just for looks, to match the rest of the fans








Quote:


> Originally Posted by *bkvamme*
> 
> Yes indeed. The everlasting question of looks <> performance.


Can't we have both? ;(


----------



## sugarhell

The Fury X uses a Gentle Typhoon. No point changing to an SP120, which is just a worse fan overall.


----------



## NBrock

Any of you guys/gals that are into Folding try the Fury X out yet?


----------



## bkvamme

FYI: Found a 3DMark comparison between previous driver revisions:

http://www.3dmark.com/compare/aot/37817/aot/37847/aot/38332/aot/41285/aot/41338/aot/41345

Can't remember where I found this link, but the results should still be valid. Check the graphics card section for the driver version. Pretty impressive results altogether.


----------



## Slink3Slyde

Quote:


> Originally Posted by *NBrock*
> 
> Any of you guys/gals that are into Folding try the Fury X out yet?


I'm waiting to hear about this too. You've probably seen it, but just in case: Anandtech includes Folding@home benches in their reviews. It's around GTX 980 performance in the single precision tests, which I think are the most important seeing how Maxwell really sucks at double precision, yet people with 980s are getting big PPD.

http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/24

Really strange I can't find any other references to folding on the Fury X. I guess no one else can either, as a couple of people have asked now.


----------



## NBrock

Quote:


> Originally Posted by *Slink3Slyde*
> 
> I'm waiting to hear about this too. You've probably seen it, but just in case: Anandtech includes Folding@home benches in their reviews. It's around GTX 980 performance in the single precision tests, which I think are the most important seeing how Maxwell really sucks at double precision, yet people with 980s are getting big PPD.
> 
> http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/24
> 
> Really strange I can't find any other references to folding on the Fury X. I guess no one else can either, as a couple of people have asked now.


Thanks for the link. Looks promising. I'd still like to see actual PPD numbers. If I had the money I would totally buy one and post as much info as possible.


----------



## Slink3Slyde

Quote:


> Originally Posted by *NBrock*
> 
> Thanks for the link. Looks promising. I'd still like to see actual PPD numbers. If I had the money I would totally buy one and post as much info as possible.


Me too. If the Fury reviews aren't so great I might step up to a Fury X. I read someone somewhere saying they couldn't get a work unit for some reason, maybe drivers, I don't know. I guess by then someone else will have posted something up though.


----------



## royfrosty

BOOM!

This is awesome. The new drivers really worked.

Farcry 4 on 1440p with all Preset Ultra settings.

Note that it is just on stock clock. No OC, nor HBM OC.

2015-07-09 22:44:11 - FarCry4
Frames: 7056 - Time: 87953ms - Avg: 80.225 - Min: 62 - Max: 98

Before on 15.15 drivers.

1440P with all Preset Ultra settings.

2015-06-26 16:04:54 - FarCry4
Frames: 5810 - Time: 88234ms - Avg: 65.848 - Min: 51 - Max: 95

EDIT: The benchmark was taken from the first time you meet Sabal, when he asks you to run through the doors to the truck. It ends when the truck is hit and goes off the cliff.
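As a sanity check on those FRAPS-style numbers: average FPS is just frames divided by elapsed seconds, and the driver uplift follows from the two averages. A quick Python sketch:

```python
# FRAPS-style sanity check: average FPS = frames / elapsed seconds,
# and the driver uplift is the relative change between the two averages.
def avg_fps(frames, time_ms):
    return frames / (time_ms / 1000.0)

new = avg_fps(7056, 87953)  # 15.7 run
old = avg_fps(5810, 88234)  # 15.15 run
gain = (new - old) / old * 100

print(f"15.7: {new:.3f} fps, 15.15: {old:.3f} fps, uplift: {gain:.1f}%")
# 15.7: 80.225 fps, 15.15: 65.848 fps, uplift: 21.8%
```

The computed averages match the logged 80.225 and 65.848, so the reported run really is a ~22% average-FPS jump between drivers.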


----------



## Casey Ryback

Wow that bump to minimum and average frames is huge.


----------



## Gumbi

Wow, are those solid bench numbers? That boost is HUGE.


----------



## Alastair

Someone please post these results in the Fury X reviews thread so that we can finally shut the green goblins up.


----------



## Agent Smith1984

Every AMD card has seen a boost in FC4 so far... Crysis 3 also improved for me...

Anyone tested other titles yet?

GTA V and Witcher 3 especially...


----------



## Gregster

I have tested The Witcher 3 and frames seem to have dropped off slightly. In truth, I am not really seeing a performance boost at all from the 15.15s on the Fury X. Maybe I am doing something wrong?

3930K at 4.4 XFX Fury X
Win 8.1


----------



## xer0h0ur

Holy crap, I am reading of people already ordering Fury? Did the launch get pushed up? Or are retailers just selling them early without permission?


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Every AMD card has seen a boost in fc4 so far.... Crysis3 also improved for me....
> 
> Anyone tested other titles yet?
> 
> Gta v and witcher 3 especially....


That's what I am most interested in seeing as well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gregster*
> 
> I have tested The Witcher 3 and frames seem to have dropped off slightly. I am not really seeing a performance boost at all in truth from the 15.15 on the Fury X Maybe I am doing something wrong?
> 
> 3930K at 4.4 XFX Fury X
> Win 8.1


The W3 performance hit is a known issue in the driver notes; I'm mainly curious how big the hit is...


----------



## Casey Ryback

Quote:


> Originally Posted by *xer0h0ur*
> 
> Holy crap, I am reading of people already ordering Fury? Did the launch get pushed up? Or are retailers just selling them early without permission?


Sapphire are releasing tomorrow afaik.

http://sapphirenation.net/theclock/


----------



## Gregster

Quote:


> Originally Posted by *Agent Smith1984*
> 
> W3 performance hit is a known issue in the driver notes, mainly curious as to how big the hit is....


I am encoding a video that shows performance going from 15.15 to 15.20 and just watching the little preview screen in Vegas as it goes, it is anywhere from 2 fps to 5 fps of a performance hit over the 15.15's.


----------



## Alastair

Anybody have any idea if Fury might get marked down for Amazon Prime Day? They say it will be like a Black Friday. How much does stuff normally get marked down for Black Friday?


----------



## Gregster

3 games tested on the new and old drivers with a Fury X.


----------



## optimumbox

Quote:


> Originally Posted by *royfrosty*
> 
> BOOM!
> 
> This is awesome. The new drivers really worked.
> 
> Farcry 4 on 1440p with all Preset Ultra settings.
> 
> Note that it is just on stock clock. No OC, nor HBM OC.
> 
> 2015-07-09 22:44:11 - FarCry4
> Frames: 7056 - Time: 87953ms - Avg: 80.225 - Min: 62 - Max: 98
> 
> Before on 15.15 drivers.
> 
> 1440P with all Preset Ultra settings.
> 
> 2015-06-26 16:04:54 - FarCry4
> Frames: 5810 - Time: 88234ms - Avg: 65.848 - Min: 51 - Max: 95
> 
> EDIT: Note that the benchmark was taken during the first time when you met Sabal and he asked you to run through the doors to the truck. The benchmark ends when the truck was hit and went off the cliff.


Could we get some screenshots of that in MSI Afterburner? I don't want to doubt you, and that's amazing news, but people on other sites are already taking it at face value.


----------



## blue1512

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> 
> 
> 3 games tested on the new and old drivers with a Fury X.


Thanks for your hard work, BUT...
I saw two. One is Project CARS, which refuses to run multithreaded DX11 with AMD GPUs (except on Win10, where it was forced) and has PhysX built into the engine. The second is TW3, which has a specific note about a performance issue with this driver. No offence.
Could you please test with GTA V, BF4 and FC4?


----------



## Gregster

Quote:


> Originally Posted by *blue1512*
> 
> Thanks for your hard work, BUT...
> I saw two. One is Project CARS, which refuses to run multithreaded DX11 with AMD GPUs (except on Win10, where it was forced) and has PhysX built into the engine. The second is TW3, which has a specific note about a performance issue with this driver. No offence.
> Could you please test with GTA V, BF4 and FC4?


Well, from what I am reading, GTA V is a bit of a mess at the mo for both AMD and Nvidia. I picked 3 games that got mentioned for optimisations; however, those are listed as "since the Omega drivers", so it's quite possible there aren't any since the last WHQL driver (Christmas?).


----------



## Newbie2009

Quote:


> Originally Posted by *Gregster*
> 
> Well from what I am reading, GTA V is a bit of a mess at the mo for both AMD and Nvidia and I picked 3 games that got mentioned for optimisations however, they are mentioning since the Omega drivers, so quite possible that there isn't any since the last WHQL driver (Christmas?).


GTA works perfectly


----------



## Gregster

Quote:


> Originally Posted by *Newbie2009*
> 
> GTA works perfectly


I will fire it up and give it a run. I was just going on hearsay


----------



## xer0h0ur

Who goes on hearsay when they have the hardware and the game to test it themselves?


----------



## Ceadderman

Someone who doesn't wish to waste their time if there are issues with a game. That's who.









~Ceadder


----------



## derickwm

3  water blocks inbound early next week


----------



## NBrock

O...M....G!!!! I am sooooo jelly

PLEASE do some Folding at home tests for us!!!


----------



## gamervivek

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> 
> 
> 3 games tested on the new and old drivers with a Fury X.


Not inspiring, except for the 200MB reduction in the already low VRAM usage in GRID.


----------



## derickwm

Quote:


> Originally Posted by *NBrock*
> 
> O...M....G!!!! I am sooooo jelly
> 
> PLEASE do some Folding at home tests for us!!!


Got an extra 100MHz out of the HBM and a few extra MHz on the core.










Going to try folding tonight but I'm not so sure it'll work as you know how new GPUs go...


----------



## NBrock

Quote:


> Originally Posted by *derickwm*
> 
> Got an extra 100Mhz out of HBM and a few extra Mhz on the core.
> 
> 
> 
> 
> 
> 
> 
> 
> Going to try folding tonight but I'm not so sure it'll work as you know how new GPUs go...


Nice. I know folding on new GPUs is never the best, but I am still curious how it does with just brute force. I appreciate any info you can get








+ Rep for doing the dirty work







just saw I cannot give you rep lol...but I would if I could


----------



## Gregster

Quote:


> Originally Posted by *xer0h0ur*
> 
> Who goes on hearsay when they have the hardware and the game to test it themselves


Quote:


> Originally Posted by *Ceadderman*
> 
> Someone who doesn't wish to waste their time, if there are issues with a game. That's who.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


You are both correct







Having jumped on it, it runs fine with no problems on either, but then again, no improvements on either. Maybe these improvements that people are claiming to see are for older cards? I have even run DDU just to make sure the drivers didn't have any nasties left, but still nothing









Video will be done soon.

Edit:




Just a quick recording.


----------



## rv8000

Quote:


> Originally Posted by *Gregster*
> 
> You are both correct
> 
> 
> 
> 
> 
> 
> 
> Having jumped on it, it runs fine and no problems on either but then again, no improvements on either. Maybe these improvements are for older cards that people are claiming seeing? I have even run DDU just to make sure the drivers didn't have any nasties left but still nothing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Video will be done soon.
> 
> Edit:
> 
> 
> 
> 
> Just a quick recording.


Even on a single GPU, the new drivers look MUCH smoother compared to the 15.15 drivers. VRAM usage has definitely changed too; huge gap, 2200 -> 3400


----------



## royfrosty

I found that SoM isn't the only game with gains; its FPS increase is just on the smaller side.

Alright SoM

1440p Ultra Preset settings and no OC stock settings.



Min: 61fps
Avg: 88fps

On 15.15



Min: 56fps
Avg: 85fps

Will bench the rest tomorrow. And compile everything.

Thief also shows some good improvements.

Thief

1440p with max Ultra preset settings on new drivers



Min: 54fps
Avg: 73fps

1440p on previous old drivers



Min: 47fps
Avg: 65fps
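For anyone compiling results like the post above, the percentage gains work out as follows. A quick sanity-check sketch; the fps figures are taken straight from the post:

```python
# Percentage improvement of the new driver over 15.15, using the
# min/avg fps figures quoted in the post above.
def pct_gain(old_fps: float, new_fps: float) -> float:
    """Return the percentage improvement of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100.0

results = {
    "SoM min":   (56, 61),
    "SoM avg":   (85, 88),
    "Thief min": (47, 54),
    "Thief avg": (65, 73),
}

for name, (old, new) in results.items():
    print(f"{name}: {old} -> {new} fps ({pct_gain(old, new):+.1f}%)")
```

Thief's minimums gain close to 15%, which matches the "good improvements" called out above.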


----------



## optimumbox

Quote:


> Originally Posted by *Gregster*
> 
> 
> 
> 
> 
> 3 games tested on the new and old drivers with a Fury X.


Quote:


> Originally Posted by *royfrosty*
> 
> I found out not only SoM has a smaller increase in FPS.
> 
> Alright SoM
> 
> 1440p Ultra Preset settings and no OC stock settings.
> 
> 
> 
> Min: 61fps
> Avg: 88fps
> 
> On 15.15
> 
> 
> 
> Min: 56fps
> Avg: 85fps
> 
> Will bench the rest tomorrow. And compile everything.
> 
> But also thief have some good improvements.
> 
> Thief
> 
> 1440p with max Ultra preset settings on new drivers
> 
> 
> 
> Min: 54fps
> Avg: 73fps
> 
> 1440p on previous old drivers
> 
> 
> 
> Min: 47fps
> Avg: 65fps


Would you say it would be worth it to get the FuryX for 1080P? I'm forced at 1080 due to only having DVI on my IPS monitor. I originally returned my FuryX due to driver issues in certain games, rendering them not playable. I'd much rather have the lower temp AMD card, but at this point I'm going for whatever gets a higher framerate for my buck.


----------



## royfrosty

Quote:


> Originally Posted by *optimumbox*
> 
> Would you say it would be worth it to get the FuryX for 1080P? I'm forced at 1080 due to only having DVI on my IPS monitor. I originally returned my FuryX due to driver issues in certain games, rendering them not playable. I'd much rather have the lower temp AMD card, but at this point I'm going for whatever gets a higher framerate for my buck.


Well, I guess it's a good time to move to 1440p at least. A Fury X for 1080p is really overkill IMHO.

But if you aren't going to upgrade to 1440p and choose to stay on 1080p instead, you might want to consider waiting for the Nano or Fury?


----------



## p4inkill3r

FWIW, my pump:


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> FWIW, my pump:


Ultimately pump noise, no pump noise, coil whine, no coil whine or both...what matters is that you are happy. So if you're not satisfied get it exchanged.


----------



## p4inkill3r

I'm totally happy; no pump noise, no whine.


----------



## Jflisk

PowerColor Fury X just showed up. It's sitting in the box waiting for my CPU water block to pass a leak test. Had to take it apart and clean it up; it was gunked up.





----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> I'm totally happy; no pump noise, no whine.


Nice, you lucked out on a perfectly working unit from the initial run then. How does your HBM and GPU take to overclocking?


----------



## ban25

Quote:


> Originally Posted by *NBrock*
> 
> O...M....G!!!! I am sooooo jelly
> 
> PLEASE do some Folding at home tests for us!!!


I have my Crossfire setup working now that I replaced the PSU. Running FAH, I'm seeing an estimated PPD of 72000 on the first card and 71643 on the second.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nice, you lucked out on a perfectly working unit from the initial run then. How does your HBM and GPU take to overclocking?


I haven't had much time to sit and mess with it (ergo my settings mistake earlier in thread) but here is a Firestrike Extreme run from just now:

[email protected] 5.0GHz
Fury X @ 1125mhz/600mhz

http://www.3dmark.com/fs/5355916


----------



## Orthello

Not sure how happy I can be about this owning a couple of Titans, but I actually am... kudos to AMD on CFX.

http://www.pcper.com/reviews/Graphics-Cards/AMD-Fury-X-vs-NVIDIA-GTX-980-Ti-2-and-3-Way-Multi-GPU-Performance/Power-Consu

"But what about that direct AMD and NVIDIA comparisons? Despite what we might have expected going in, the AMD Radeon R9 Fury X actually scaled in CrossFire better than the NVIDIA GeForce GTX 980 Ti. This comes not only in terms of average frame rate increases, but also in lower frame time variances that result in a smoother gaming experience. In several cases the extra scalability demonstrated by the Fury X allowed its dual-GPU performance to surpass a pair of GTX 980 Ti cards even though in a single GPU configuration the GeForce card was the winner. GRID 2 at 4K is one example of this result as is Bioshock Infinite at 4K. And even in a game like Crysis 3 at 4K where we saw NVIDIA's card scale by a fantastic 84%, AMD's Fury X card scaled by 95%!"
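For reference, the "scaling" percentages PCPer quotes are just the extra frame rate the second card adds, relative to one card. A small sketch; the fps numbers below are made up for illustration and are not PCPer's data:

```python
# CrossFire/SLI scaling: extra performance from the second GPU as a
# percentage of a single card. 100% would mean perfect doubling.
def scaling_pct(single_fps: float, dual_fps: float) -> float:
    return (dual_fps / single_fps - 1.0) * 100.0

# Illustrative numbers only (not from the review): a near-doubling result.
print(f"{scaling_pct(40.0, 78.0):.0f}% scaling")
```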


----------



## Ceadderman

Can't wait to pull the trigger on Fury or x2.









~Ceadder


----------



## Orthello

Quote:


> Originally Posted by *Ceadderman*
> 
> Can't wait to pull the trigger on Fury or x2.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


My cousin is pulling the trigger on a Fury X soon, so I'll get to set it up for him (and OC it too) - fun times ahead. He's busy saving for it... I hate waiting while he saves up lol...


----------



## ssateneth

Quote:


> Originally Posted by *optimumbox*
> 
> Would you say it would be worth it to get the FuryX for 1080P? I'm forced at 1080 due to only having DVI on my IPS monitor. I originally returned my FuryX due to driver issues in certain games, rendering them not playable. I'd much rather have the lower temp AMD card, but at this point I'm going for whatever gets a higher framerate for my buck.


If you are going to spend 600 bucks on a top-of-the-line graphics card, why in the hell are you using a $100 Walmart-knockoff PoS monitor? This GPU is for 1440p, 4K and higher (triple monitor, etc.).

Pony up the money and buy a 144Hz 1440p IPS, or a 4K (though I wouldn't recommend 4K yet, since the high PPI causes eye strain on the desktop without DPI scaling, which usually isn't compatible with a lot of software)


----------



## Ceadderman

Quote:


> Originally Posted by *Orthello*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Can't wait to pull the trigger on Fury or x2.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My cousin is pulling the trigger soon on a Fury X soon so i'll get to set it up for him (and oc it too) - fun times ahead. Hes busy saving for it .. i hate waiting while he saves up lol ...
Click to expand...

I would save for 2 months and get a Fury X, but am in the middle of a rebuild/update of my sig rig which includes *a lot* of cooling gear, so I have to stay the course until I'm finished with that. Performance-PCs is gonna make quite a bit off me over the next 4 months. PDXLan is just around the corner. Would *love* to pick one up there, but that's not my luck usually. If I do win one, I wouldn't be able to install it there unless I have some extra PETG and a bending kit on hand.









~Ceadder


----------



## derickwm

Quote:


> Originally Posted by *NBrock*
> 
> O...M....G!!!! I am sooooo jelly
> 
> PLEASE do some Folding at home tests for us!!!


----------



## flopper

Quote:


> Originally Posted by *Gregster*
> 
> You are both correct
> 
> 
> 
> 
> 
> 
> 
> Having jumped on it, it runs fine and no problems on either but then again, no improvements on either. Maybe these improvements are for older cards that people are claiming seeing? I have even run DDU just to make sure the drivers didn't have any nasties left but still nothing


Those that are CPU-bound see boosts, and lower resolutions benefit more.


----------



## Nizzen

Looks like a CPU bottleneck in some tests. BF4 screams for higher frequency in SLI/CFX.









Retest with 4500MHz+


----------



## Gdourado

Anyone bought a Sapphire this week?
If so, what's the pump model?

Cheers!


----------



## Gregster

Quote:


> Originally Posted by *Nizzen*
> 
> Looks like cpu bottleneck in some tests. Bf 4 scream for higher frequence for sli/cfx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Retest with 4500mhz +


Interesting you say that. Run the games I did on your system and log the results, then run again with 15.15, log the results again, and show us. With your CPU, it will show if there is a CPU bottleneck. Run at 1080p as well, where there is a bigger demand on the CPU.


----------



## NBrock

Quote:


> Originally Posted by *derickwm*


Nice, that looks promising for the card just being released. I wonder how much we will be able to overclock them once we get voltage control.


----------



## Sgt Bilko

Looks like Asus opted for a Full Length PCB on the Fury

http://hothardware.com/reviews/amd-radeon-r9-fury-review


----------



## ozyo




----------



## th3illusiveman

The Fury has launched: 8% slower than the Fury X at 1440p and 10% slower at 4K, for $100 less. Seems it performs at the level of a highly overclocked GTX 980
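In perf-per-dollar terms that trade-off favors the Fury. A rough sketch; the launch MSRPs of $549/$649 are my assumption, not figures from the post:

```python
# Perf-per-dollar comparison implied by the post: Fury at ~92% of the
# Fury X (8% slower at 1440p) for $100 less. Prices are assumed launch
# MSRPs ($549 Fury, $649 Fury X), not figures from the post.
fury_price, fury_x_price = 549.0, 649.0
rel_perf = 0.92  # Fury performance relative to Fury X at 1440p

value_ratio = (rel_perf / fury_price) / (1.0 / fury_x_price)
print(f"Fury offers {value_ratio:.2f}x the perf/$ of the Fury X")
```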


----------



## Casey Ryback

Looks to be a nice competitive card.

Both the asus and sapphire have zero fan modes for those wanting silent operation.


----------



## flopper

Quote:


> Originally Posted by *th3illusiveman*
> 
> The Fury has launched, 8% slower then the Fury X at 1440p and 10% slower at 4K for $100 less. Seems it performs at the level of a highly over-clocked GTX 980


Price/value at the enthusiast end just got interesting.


----------



## Gdourado

How about noise at full load against the X?
Those triple-fan coolers have to be noisy...


----------



## Agent Smith1984

Here's a good read!
http://www.tomshardware.com/reviews/sapphire-amd-radeon-r9-fury-tri-x-overclocked,4216-5.html

Very disappointed in that overclock result...

A few others broke 1100 though.

Stock Fury graphics score is around 14,400 in firestrike....
My 390 is scoring 14,000 flat with an OC

Looks like I'll hang tight and wait to see if voltage unlocking makes some kind of huge difference in overclocking capabilities.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gdourado*
> 
> How about noise at full load against the X?
> Those 3 fans coolers have to be noisy...


Sapphire is great as usual:


----------



## mav451

Fury vanilla is certainly compelling, and Sapphire is looking solid


----------



## Gdourado

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Sapphire is great as usual:


I think those numbers might be optimistic...
The same happened to me when choosing a card.
At first I was thinking about a 980 Ti.
Reviews always said the EVGA ACX and MSI TFV were very quiet.
But in real-world usage, in closed cases and with overclocks requiring fan speeds above 50%, the cards were really loud!
That's why I chose the Fury X.
If there is no pump noise, a single 120mm Typhoon will always be much quieter than three 80mm fans or two 100mm fans.


----------



## Ganf

http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/3

12-phase power for the Strix; that extra PCB isn't empty by any stretch of the imagination. Is this the Lightning of the Fury lineup?


----------



## Ha-Nocri

Quote:


> Originally Posted by *Gdourado*
> 
> I think those numbers might be optimistic...
> The same happened for me when choosing a card.
> At first I was thinking about a 980 TI.
> Reviews always said the EVGA ACX and MSI TfV were very quiet.
> But then In real world usage in closed cases and with overclock requiring fans speeds above 50%, the cards were really loud!
> That's why I chose the fury X.
> If there is no pump noise, a single 120mm typhoon will always be much quieter than 3 80mm fans or 2 100mm fans.


The new Tri-X cooler is dead silent at idle (the fans turn off), and under load it is quieter than any other aftermarket cooler. I guess that's the best you can get atm.


----------



## Casey Ryback

Quote:


> Originally Posted by *Gdourado*
> 
> I think those numbers might be optimistic...
> The same happened for me when choosing a card.
> At first I was thinking about a 980 TI.
> Reviews always said the EVGA ACX and MSI TfV were very quiet.
> But then In real world usage in closed cases and with overclock requiring fans speeds above 50%, the cards were really loud!
> That's why I chose the fury X.
> If there is no pump noise, a single 120mm typhoon will always be much quieter than 3 80mm fans or 2 100mm fans.


Well, of course the Fury X will be quieter; that goes without saying. But every review of the Sapphire reports low noise levels.

The biggest factor is probably the shorter PCB, with an open back on around a third of the card, allowing air to pass through the heatsink while the hot air goes up and out of the case.

The Asus uses a full-length PCB and doesn't perform as well on the noise front.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> New TriX cooler is dead silent at idle (fans turn off), and under load it is quieter than any other aftermarket cooler. I guess that's the best you can get atm.


In the Tom's review, it says they were having problems with the fans doing anything, and they didn't even spin up until late in the FireStrike test.

They tried a forced profile and had no luck improving overclocking.

They spoke with Sapphire about the fans, and they claimed to be releasing a new BIOS to "fix" the issue.

Under no circumstances though, did the card ever get that hot, so not sure why they need to "fix" the BIOS. I guess some may just leave it as is.

I think with some consumers' 25°C+ ambient temps, in a closed case, the temps could be more of an issue, but I'm sure the fans will compensate, even on the current BIOS.

I REALLY REALLY don't understand why these cards don't have voltage control yet, and even then, is it going to help that much?

Many of the Hawaii cards only hit around 1100 on stock voltage also, and adding voltage only gets them in the 1150-1180 in most cases, with a few nice samples around that do 1200+/-.

Fiji may have similar OC headroom, even after voltage control.


----------



## Ha-Nocri

Yeah, I think this will be similar, but it seems more cards are hitting 1100 compared to the 290(X), so we might see 1180-1200 more often.


----------



## Ganf

So are Sapphire and Asus the only companies with Furies out?

Not finding anything on any of the other companies' websites, Newegg, or Amazon.


----------



## NBrock

Quote:


> Originally Posted by *Agent Smith1984*
> 
> In the Tom's review, it says they were having problems with the fans doing anything, and they didn't even spin up until late in the FireStrike test.
> 
> They tried a forced profile and had no luck improving overclocking.
> 
> They spoke with Sapphire about the fans, and they claimed to be releasing a new BIOS to "fix" the issue.
> 
> Under no circumstances though, did the card ever get that hot, so not sure why they need to "fix" the BIOS. I guess some may just leave it as is.
> 
> I think with some consumer' 25C+ ambient temps, in a closed case, the temps could be more of an issue, but I'm sure fans will compensate, even on the current BIOS.
> 
> I REALLY REALLY don't understand why these cards don't have voltage control yet, and even then, is it going to help that much?
> 
> Many of the Hawaii cards only hit around 1100 on stock voltage also, and adding voltage only gets them in the 1150-1180 in most cases, with a few nice samples around that do 1200+/-.
> 
> Fiji may have similar OC headroom, even after voltage control.


My Hawaii cards did pretty well if you had the right cooling (I have a 290 Tri-X running 1215 core and 1300 mem, currently running Folding@home 24/7 without getting above 70°C). The other thing to remember about Hawaii is that performance scales pretty well with the overclock... you might not get 1500MHz core, but at 1100-1150 you get a pretty darn good boost in actual performance.


----------



## Casey Ryback

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I REALLY REALLY don't understand why these cards don't have voltage control yet, and even then, is it going to help that much?.


It's been stated a lot of times, but could easily be missed in these huge threads.

Afaik (and to put it simply) it's more work on the software side to unlock voltage control on GCN cards through the likes of MSI AB.

Sapphire has stated that it's coming soon to trixx, and I assume msi afterburner will follow suit.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> So are Sapphire and Asus the only companies with Furies out?
> 
> Not finding anything on any of the other company's websites, newegg or amazon.


Yes, they are the only 2 AIBs making cards right now. Having only 3 varieties of a new card at launch seems strange. Even with Titan/Fury X you had more partner choices, even if the card is the same.


----------



## Ganf

Quote:


> Originally Posted by *Forceman*
> 
> Yes, they are the only 2 AIBs making cards right now. Having only 3 varieties of a new card at launch seems strange. Even with Titan/Fury X you had more partner choices, even if the card is the same.


I'll hold my tongue until the cards actually go up for sale on the major vendors. None of them are available yet.


----------



## Casey Ryback

Quote:


> Originally Posted by *Ganf*
> 
> I'll hold my tongue until the cards actually go up for sale on the major vendors. None of them are available yet.


Asus and sapphire aren't major vendors?










Agreed, it would be nice to see others making them though.


----------



## Forceman

Quote:


> Originally Posted by *Ganf*
> 
> I'll hold my tongue until the cards actually go up for sale on the major vendors. None of them are available yet.


Sounds like more are coming, eventually.
Quote:


> Out of the gate the only partners launching cards are Sapphire and Asus, AMD's closest and largest partners respectively. Sapphire will be releasing stock and overclocked SKUs based on a semi-custom design that couples the AMD reference PCB with Sapphire's Tri-X cooler. Asus on the other hand has gone fully-custom right out of the gate, pairing up a new custom PCB with one of their DirectCU III coolers. Cards from additional partners will eventually hit the market, but not until later in the quarter.


----------



## Ganf

Quote:


> Originally Posted by *Casey Ryback*
> 
> Asus and sapphire aren't major vendors?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Agree would be nice to see others making them though.


Major vendors being Newegg, Amazon, NCIX, etc... Perhaps I should have said e-tailer.


----------



## Gdourado

My previous 290X with the reference cooler did 1175 core.
But the fan had to be at 85%; the core was stable, though.
I had to use noise-cancelling headphones.
That is why I sold the card and ordered the Fury X.


----------



## Casey Ryback

Quote:


> Originally Posted by *Ganf*
> 
> Major vendors being Newegg, Amazon, NCIX, etc... Perhaps I should have said e-tailer.


Ah my bad I getcha.


----------



## Gregster

I seriously don't think more voltage is going to push these cards much higher. I can't run 1120MHz stable, and 1100 seems to be my max stable clock. Three attempts to complete the Valley bench at 1120MHz


----------



## Noufel

Seeing the temp results of the Asus and Sapphire Fury, I can't see why AMD didn't let AIB partners make custom air-cooled Fury Xs. I don't think the extra 512 SPs the Fury X has over the regular Fury would have been a problem to cool


----------



## Ganf

Quote:


> Originally Posted by *Noufel*
> 
> seeing temps results of the asus and the sapphire fury i can't see the point why AMD didn't let AIB partners to make custum air cooled FuryX, i don't think that the 512 sp that the furyx has over the regular fury would have been a problem to cool


Margins. AMD can't make as much of a profit selling the chips in bulk.


----------



## Noufel

Quote:


> Originally Posted by *Ganf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Noufel*
> 
> seeing temps results of the asus and the sapphire fury i can't see the point why AMD didn't let AIB partners to make custum air cooled FuryX, i don't think that the 512 sp that the furyx has over the regular fury would have been a problem to cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Margins. AMD can't make as much of a profit selling the chips in bulk.
Click to expand...

that sounds logical.


----------



## Agent Smith1984

I wonder how much the BIOS for the Fury X differs from the Fury's? Maybe some things controlling the pump and fan??

I bet if the BIOS is that different, it will greatly reduce the likelihood of successfully flashing a Fury X BIOS to a Fury Pro to see if the shaders unlock....

It may even be the purpose of making the X reference-only in the first place?


----------



## bonami2

Quote:


> Originally Posted by *Gregster*
> 
> I seriously don't think more voltage is going to push these cards much higher. I can't run 1120MHz stable, and 1100 seems to be my max stable clock. Three attempts to complete the Valley bench at 1120MHz


Yeah, but maybe AMD didn't say what the real stock voltage is. They could have set it low, but in reality the chip might take 1.5V


----------



## hamzta09

http://www.sweclockers.com/test/20792-amd-radeon-r9-fury-fran-asus-och-sapphire

Fury X, waste of money it seems. As Fury > Fury X


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.sweclockers.com/test/20792-amd-radeon-r9-fury-fran-asus-och-sapphire
> 
> Fury X, waste of money it seems. As Fury > Fury X


Fury X clock for clock will always be faster than the Fury, same for 290x vs 290, 7970 vs 7950 etc etc.

if you want the fastest AMD card out there....get a Fury X, if you want a bang for buck beast, get a Fury


----------



## hyp36rmax

*OP +Added R9 Radeon FURY Specs*


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Fury X clock for clock will always be faster than the Fury, same for 290x vs 290, 7970 vs 7950 etc etc.
> 
> if you want the fastest AMD card out there....get a Fury X, if you want a bang for buck beast, get a Fury


Always a diminishing return at the enthusiast end.
The Fury is interesting, and I'll get one at the end of the month


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.sweclockers.com/test/20792-amd-radeon-r9-fury-fran-asus-och-sapphire
> 
> *Titan X*, waste of money it seems. As Fury > Fury X


Fixed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bonami2*
> 
> Yea but maybe amd did not says what is the real stock voltage. they could put it so low but in reality the chip can take 1.5v


That'd be nice, but I don't see how the air cooler could hold that kind of voltage on a GPU....

It is worth noting, though, that while Hawaii was released near the upper end of its clock space, the Tahiti core launched in the 800MHz range, and those cores just seem to keep climbing with voltage...

Mine ran at 1260MHz with 1.25V on my 280X.....

My brother's 7950 ran at 1200MHz on 1.2V....

My son's will run at 1000 on a measly 1V vcore.

If Fury owners are really lucky, the cards' clocks will scale nicely with voltage increases.

Yes, we see 1100-1150 OCs now on stock voltage, but that's not to say 200mV+ won't be a common thing to run on these cards, and if that kind of voltage increase were to yield clock speeds in the 1250-1300 range, I don't think you'd hear anyone complain at all.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Fury X clock for clock will always be faster than the Fury, same for 290x vs 290, 7970 vs 7950 etc etc.


Doesn't really look like it.

1-2 fps difference? Meh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Fury X clock for clock will always be faster than the Fury, same for 290x vs 290, 7970 vs 7950 etc etc.
> 
> 
> 
> Doesnt really look like it.
> 
> 1-2 fps difference? Meh.
Click to expand...

So, just like the Titan X and 980 Ti, the GTX 980 and 970, the R9 290X and 290; the list goes on tbh.

The gap does widen over time with drivers though; the difference between a 290 and a 290X isn't as small as it was on launch day.


----------



## Agent Smith1984

I just want to know when we are getting a new 3dmark....

And when we do, it had better not gimp AMD CPU owners the way FireStrike does....

3DMark 11 was such a better benchmark than FireStrike.

I hope the next one goes back to the realistic nature of 11.

Of course, along with a new 3dmark, comes a







when you see your expensive hardware get it's


----------



## liquidaim

Hello all,

I've heard rumors that the R9 Nano will have the full Fiji silicon operating at ~700MHz with a 175W TDP limit.

I would like to ask one of the Fury X owners to run a Fire Strike bench while limiting the power draw of the card to 175W. I know TDP != power draw, but I'm curious about the potential performance of the Nano if the rumored specs from AnandTech are true.

thanks in advance.
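Short of an owner actually capping the card, a crude back-of-envelope is possible: dynamic power scales roughly with f·V², and voltage tends to track frequency, so P ∝ f³ is a common rule of thumb. A sketch under that assumption; the 275W board power and 1050MHz clock for the Fury X are my assumptions, and this is speculation, not a measurement:

```python
# Rough estimate of the clock a Fiji card could sustain at a 175W cap,
# using the P ~ f^3 rule of thumb. Assumed Fury X figures: 275W, 1050MHz.
fury_x_power_w = 275.0
fury_x_clock_mhz = 1050.0
nano_cap_w = 175.0

est_clock_mhz = fury_x_clock_mhz * (nano_cap_w / fury_x_power_w) ** (1 / 3)
print(f"Estimated sustained clock at {nano_cap_w:.0f}W: ~{est_clock_mhz:.0f} MHz")
```

That lands around 900MHz, noticeably above the ~700MHz rumor, so real silicon could behave quite differently.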


----------



## Casey Ryback

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The gap does widen in time with drivers though, the difference between a 290 and 290x isn't as small as it was on launch day.


iirc the 7950/7970 gap got wider over time.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Casey Ryback*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> The gap does widen in time with drivers though, the difference between a 290 and 290x isn't as small as it was on launch day.
> 
> 
> 
> iirc the 7950/7970 gap got wider over time.
Click to expand...

Yep, you see it with everything tbh, but people who upgrade every gen or every second gen sometimes don't notice it that much


----------



## Ganf

Quote:


> Originally Posted by *liquidaim*
> 
> Hello all,
> 
> I've heard rumors that the R9 Nano will have the full fiji silicon operating at ~700 mhz with a 175W tdp limit.
> 
> I would like to request if one of the owners of fury x could get a bench score on firestrike while limiting the power draw of the card to 175W. I know tdp =/= power draw but i'm curious about the potential performance of the nano if the rumored specs from anandtech are true.
> 
> thanks in advance.


If that is true (which is incredibly doubtful, given that the Fury is cut down) and partners are allowed to alter the PCB, all hell will break loose. I wouldn't count on it.


----------



## Fyrwulf

Has anybody tested if these will Crossfire with an A10?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Fyrwulf*
> 
> Has anybody tested if these will Crossfire with an A10?


Why would these be able to crossfire with an A10?

And why would you try it?

You'd limit the card to lower performance than it would achieve by itself...

Unless you mean running a set of these IN crossfire ON an A10, in which case I could see you being extremely CPU-bound.....

Confused by the question.


----------



## Gdourado

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.sweclockers.com/test/20792-amd-radeon-r9-fury-fran-asus-och-sapphire
> 
> Fury X, waste of money it seems. As Fury > Fury X


That test is not accurate!
The Fury was tested on the 15.7 drivers,
the Fury X on the 15.15 drivers.

Not a fair test for the Fury X...


----------



## ozyo

I hope someone gives AMD the finger and makes a Fury X with a custom PCB


----------



## looncraz

Quote:


> Originally Posted by *ozyo*
> 
> I hope someone give AMD mad finger and make fury x with custom pcb


That would be AIB suicide


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That'd be nice, but I don't see how the air cooler could hold that kind of voltage on a GPU....
> 
> It is worth noting though, that while Hawaii was released near the upper end of it's clock space, the Tahiti core was launched in the 800MHz range, and those cores just seem to keep climbing with voltage...
> 
> Mine ran at 1260MHz with 1.25v on my 280X.....
> 
> My brothers 7950 ran at 1200MHz on 1.2v....
> 
> My sons will run at 1000 on a measly 1v vcore.
> 
> IF, Fury owners are really lucky, then the card's clocks will scale nicely with voltage increases.
> 
> Yes, we see 1100-1150 OC's now on stock voltage, but it's not to say that 200mv+ won't be a common thing to run on these cards, and if that type of voltage increase were to yield clock speeds in the 1250-1300 range, I don't think you'd hear anyone complain at all.


Make total sense









My current 7950 does 1070 core at 1.187V, but my VRMs are stuck at 90°C under load; not interested in pushing them more


----------



## looncraz

Quote:


> Originally Posted by *Gregster*
> 
> I seriously don't think more voltage is going to push these cards much higher. I can't run 1120MHz stable, and 1100 seems to be my max stable clock. Three attempts to complete the Valley bench at 1120MHz


I have to disagree, unless HBM voltage creeps with the core voltage (possible) and becomes unstable first (less likely).

Compared to Hawaii, Fiji is supposed to have lower voltage overhead, enabled by its ability to watch for voltage dips and respond very quickly by dropping the clock rate for a few hundred cycles. I think this might be partly responsible for the less-than-'normal' (for GCN) scaling with clocks that we see. In fact, it might be interesting to see whether voltage increases alone increase performance (very slightly, undoubtedly). I suspect we'll see slightly better scaling with voltage than we had with Hawaii: 1200MHz will be a given with an extra 60-80mV, 1250 with 120-150mV, with some golden samples doing much better.


----------



## looncraz

Quote:


> Originally Posted by *hamzta09*
> 
> Doesnt really look like it.
> 
> 1-2 fps difference? Meh.


The R9 290X improved more than the R9 290 simply because it had slightly more hardware. The Fury X will improve more than the Fury; however, the Fury X only comes in one design, which might inhibit its growth some.

It's a simple matter of math. Of course, they will usually stay within a few FPS of each other, and it is quite probable that factory-overclocked Fury samples will beat the Fury X.


----------



## snow cakes

do they plan on making a 395x2?


----------



## looncraz

Quote:


> Originally Posted by *liquidaim*
> 
> Hello all,
> 
> I've heard rumors that the R9 Nano will have the full fiji silicon operating at ~700 mhz with a 175W tdp limit.
> 
> I would like to request if one of the owners of fury x could get a bench score on firestrike while limiting the power draw of the card to 175W. I know tdp =/= power draw but i'm curious about the potential performance of the nano if the rumored specs from anandtech are true.
> 
> thanks in advance.


I expect the Fury Nano to have 3072 SPs at a higher frequency than 700MHz (probably 850). I hope some AIBs release some with two power connectors so it can make a nicely overclocking card (and a worthy successor to my 7870XT in my HTPC).


----------



## hyp36rmax

Quote:


> Originally Posted by *snow cakes*
> 
> do they plan on making a 395x2?


According to AMD we're expecting an R9 FURY X2 sometime this Fall.


----------



## Agent Smith1984

Quote:


> Originally Posted by *snow cakes*
> 
> do they plan on making a 395x2?


I doubt that, but they will definitely be making a Fury X2

I think they'd have a hard time finding a good price point for a 395x2


----------



## Agent Smith1984

Uggghhh

This guy on Craigslist is selling a new 980ti from Zotac for $500









He RMA'd it, wouldn't wait for the replacement, bought another card, and is selling the replacement (which came new in the box, not a refurb)

What a deal.......

Sorry, random off topic venting....


----------



## snow cakes

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I doubt that, but they will definitely be making a Fury X2
> 
> I think they'd have a hard time finding a good price point for a 395x2


Hmm, I am looking to go 4K around the spring of 2016. With BF5 probably coming out mid or late 2016, I want a nice quad-fire setup with at least a 32" 4K monitor by then, so I have time to wait.


----------



## Agent Smith1984

Quote:


> Originally Posted by *snow cakes*
> 
> hmm, I am looking to go 4k around the spring of 2016, with BF5 coming out probably mid or end of 2016 I want to have a nice quadfire setup with at least a 32" 4K monitor. So I have time to wait


You can scoop up 295x2's right now for $600.... just sayin'

Not sure how attractive those will be soooo far away from now though.

You may be more interested in the next gen of HBM cards, where there should be some 6 or 8GB frame buffers running at higher clock speeds...


----------



## p4inkill3r

Wrong thread, friend.


----------



## mav451

Brown coats forever!

Yeah seems to me that Sapphire made great use of those space savings for an ideal thermal solution.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> Wrong thread, friend.


You had a chance to log any games with Fury, using your regular 4.8 CPU clock?


----------



## ozyo

Quote:


> Originally Posted by *bastian*
> 
> But they have enjoyed better performance for much longer.


and how ?


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You had a chance to log any games with Fury, using your regular 4.8 CPU clock?


I had a few entries of Shadow of Mordor a few days ago, but the results were skewed by settings borkage.
I'm not playing anything remotely demanding right now, but I own BF4, GTA5, and SOM, so it's just a matter of installing them and playing around.

I hope to do so this weekend.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> I had a few entries of Shadow of Mordor a few days ago, but the results were skewed by settings borkage.
> I'm not playing anything remotely demanding right now but I own BF4, GTA5, and SOM, so its just a matter of installing them and playing around.
> 
> I hope to do so this weekend.


Really interested in the GTA V results when you get a chance









Thanks


----------



## p4inkill3r

IIRC, the GTA5 ingame benchmark is rather underwhelming, is it not?

TPU and [H] use it in their reviews but I'm not seeing what settings they use for me to compare with.

edit: also, I've been running @ 5GHz for a while now. I'm growing bored with this 8320 setup and I'm running at 1.55v all the time, summer be damned.


----------



## Neon Lights

Finally, Aqua Computer released water blocks for the Fury X! This would make them the fourth company to announce them and the second to make them available for ordering.

http://forum.aquacomputer.de/wasserk-hlung/106123-neu-kryographics-radeon-r9-fury-x/



I ordered two (standard ones or "acrylic glass edition") for my Fury Xs!


----------



## th3illusiveman

I hope voltage addition changes things.


----------



## rv8000

Quote:


> Originally Posted by *Neon Lights*
> 
> Finally, Aqua Computer released water blocks for the Fury X! This would make them the fourth company to announce them and the second to make them available for ordering.
> 
> http://forum.aquacomputer.de/wasserk-hlung/106123-neu-kryographics-radeon-r9-fury-x/
> 
> 
> 
> I ordered two (standard ones or "acrylic glass edition") for my Fury Xs!


Any price info?


----------



## Neon Lights

Quote:


> Originally Posted by *rv8000*
> 
> Any price info?


http://shop.aquacomputer.de/

100€ or 115€ (the same amount in $), the same as always.


----------



## optimumbox

Quote:


> Originally Posted by *ssateneth*
> 
> If you are going to spend 600 bucks on a top of the line graphics card, why in the hell are you using a $100 walmart knockoff PoS monitor? This GPU is for 1440p and 4k and higher (triple monitor, etc).
> 
> Pony up the money and buy a 144hz 1440p IPS or a 4k (but I wouldnt recommend on the 4k yet due to high PPI causing eye strain on desktop without DPI scaling which usually isnt compatible with a lot of things)


I would never buy a 144Hz monitor for a card that struggles to reach that high a frame rate in the first place. My Asus IPS was $250 when I bought it off Newegg, so I wouldn't really call it a knockoff, but that's just me. I'm personally not buying a monitor until the price of 4K monitors goes way down; that's one of the reasons I'm not upgrading. I'd also rather save my money for the time being and wait for the Fury X2 release. There's no point in pushing 4K for me if I have to do it under 60fps. Until then I'm keeping my 980ti; the single Fury is too underwhelming.


----------



## Ganf

Soon....


----------



## xer0h0ur

Quote:


> Originally Posted by *Ganf*
> 
> 
> 
> Soon....


Would also be compatible with the Tri-X Fury for that matter. They use the same PCB as the Fury X.


----------



## bonami2

Quote:


> Originally Posted by *p4inkill3r*
> 
> IIRC, the GTA5 ingame benchmark is rather underwhelming, is it not?
> 
> TPU and [H] use it in their reviews but I'm not seeing what settings they use for me to compare with.
> 
> edit: also, I've been running @ 5GHz for a while now. I'm growing bored with this 8320 setup and I'm running at 1.55v all the time, summer be damned.


Those things transfer heat for real!

You just reminded me of my old FX-6300 that ran super cool on a Hyper 212.

And now my 4790K heats up like crazy but dissipates almost no heat into the water loop: the CPU sits at 90°C while the water tops out at 30°C.


----------



## Zealon

I just got my Fury X today









Apparently it has the older pump version, but I'll see how it performs over time. No abnormal pump noise from it yet.


----------



## looncraz

Quote:


> Originally Posted by *bastian*
> 
> So basically, like AMD fans always say - wait for better drivers. Just hardly seems to happen. And while you say its being limited due to stock voltages... nVidia cards can overclock great on stock voltages to begin with. So again, AMD is poor at this and honestly due to the power and heat limitations I doubt it will be much better. But we'll see....
> Sure it does, you also get a free game with nVidia. And like I said, if you can get a good deal on the 980, which I'm sure through certain e-tailers you will be able to.


Well, my old 7870XT is nearly 25% faster on average than it was when I bought it and my R9 290 has seen a 15% jump in a shorter period of time. Of course, both cards saw much the same benefits at the same time since the 7870XT is a GCN 1.0 card and the 290 is a GCN 1.1 card and AMD treats them much the same.

Also, nVidia uses dynamic voltage; they don't have a single stock voltage. When you raise the clocks on an nVidia GPU the voltage goes up automatically. AMD may adopt this in the future, but I think it takes away some of the fun of discovery









That said, the Hawaii GPU responds fairly well to voltage, but not as well as Maxwell. Fiji GPUs are set to lower voltages and the GPUs will down-clock if the voltages are too low (and they can do this with astonishing speed, changing the clock speed in response to voltage ripple). These are very different strategies, and AMD's means less overclocking headroom without manual voltage control - which is due Any Day Now™









Many AMD cards come with bundles as well, that's nothing new, and you have to want the game to actually consider it as part of the value. So far, no card has given me a game I wanted.


----------



## Thoth420

I got Deus Ex: HR with my 6970 and it is probably my favorite recent game, but yeah, for the most part it's almost always a game I don't like in AMD bundles.

Bioshock Infinite notwithstanding as well.


----------



## Ceadderman

Stalker: Call of Pripyat was good. Dirt was good. Grand Theft Auto V? Borderlands?

There have been some reasonably solid games bundled with AMD cards imho.

~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ceadderman*
> 
> Stalker: Call of Pripyat was good. Dirt was good. Grand Theft Auto V? Borderlands?
> 
> Thee have been some reasonably solid games bundled with AMD cards imho.
> 
> ~Ceadder


Borderlands is a TWIMTBP title but in general yeah, AMD has been pretty good at lining up deals.

Tomb Raider, Deus Ex: HR, Dirt 3, Dirt Rally, Alien Isolation, Sniper Elite 3, Battlefield 3 + 4, Star Citizen, Bioshock Infinite, Murdered: Soul Suspect, Hitman: Absolution, Sleeping Dogs, etc. etc......Never Settle bundles are usually better than Nvidia's, but that's all dependent on gamers' choices of course


----------



## Jflisk

Finally have my Fury X installed. Time to go play some games and give this thing a workout.


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Stalker: Call of Pripyat was good. Dirt was good. Grand Theft Auto V? Borderlands?
> 
> Thee have been some reasonably solid games bundled with AMD cards imho.
> 
> ~Ceadder


Aside from Dirt I must have missed those...and indeed they are. I don't really like Borderlands, but it comes down to a matter of taste...I can see why people like it so much. Dirt on the other hand...meh.


----------



## Clockster

Well my rig is finally rebuilt.

So I'll start benching









http://www.3dmark.com/3dm/7711629?

http://www.3dmark.com/fs/5369662


----------



## joeh4384

Has anyone here run without the cover and used an IR gun to check VRM temps while gaming? Just curious.


----------



## p4inkill3r

Quote:


> Originally Posted by *Clockster*
> 
> Well my rig is finally rebuilt.
> 
> So I'll start benching
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7711629?
> 
> http://www.3dmark.com/fs/5369662


Dat physics result.


----------



## Silent Scone

Is low


----------



## xer0h0ur

If his is low what is mine? http://www.3dmark.com/fs/5346396


----------



## Ganf

Quote:


> Originally Posted by *Silent Scone*
> 
> Is low


No, it's not, it's dead on for a 5930k at 4.4-4.5ghz.

http://www.3dmark.com/fs/5347714


----------



## Silent Scone

Was in jest at the fact it's not an 8 core, it's a synthetic bench and honestly the 1080p test is way too CPU centric









Hence the tongue smiley. Ever present and ready to try and stand out and correct, Ganf







.


----------



## p4inkill3r

A few more tests this morning:

Heaven @ 1080p:




Same settings, but @ 1440p:


----------



## Ganf

Quote:


> Originally Posted by *Silent Scone*
> 
> Was in jest at the fact it's not an 8 core, it's a synthetic bench and honestly the 1080p test is way too CPU centric
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hence the tongue smiley. Ever present and ready to try and stand out and correct, Ganf
> 
> 
> 
> 
> 
> 
> 
> .


Tongue smiley doesn't tell me anything about whether or not you've seen relevant test results from a similar system, but thanks for pointing out that you were more interested in drawing attention to the fact that you own a 5960x than providing any relevant feedback.


----------



## Silent Scone

Quote:


> Originally Posted by *Ganf*
> 
> Tongue smiley doesn't tell me anything about whether or not you've seen relevant test results from a similar system, but thanks for pointing out that you were more interested in drawing attention to the fact that you own a 5960x than providing any relevant feedback.


Ever ready to also dig holes and go around in circles to make a point too lol. Move on







.

No I'm not at all well versed in X99 overclocking results, thank you for the links.


----------



## blue1512

Quote:


> Originally Posted by *p4inkill3r*
> 
> A few more tests this morning:
> 
> Same settings, but @ 1440p:


Why do you still use 15.15 driver?


----------



## p4inkill3r

Quote:


> Originally Posted by *blue1512*
> 
> Why do you still use 15.15 driver?


I got an infinite loop BSOD and DDU'd last night, thought I'd reinstalled with the latest but evidently I hadn't.


----------



## Ganf

Quote:


> Originally Posted by *p4inkill3r*
> 
> A few more tests this morning:
> 
> Same settings, but @ 1440p:


Fury X at stock better not trip and fall or I'm gonna get'em...


----------



## blue1512

Quote:


> Originally Posted by *p4inkill3r*
> 
> I got an infinite loop BSOD and DDU'd last night, thought I'd reinstalled with the latest but evidently I hadn't.


Which Windows, mate? For some ****** reason, win 7 and win 8 share the same driver, while win 8.1 has its own.


----------



## p4inkill3r

with 15.20 drivers

1050mhz/500mhz
FPS:
42.4
Score:
1069
Min FPS:
19.5
Max FPS:
87.1

1150mhz/600mhz
FPS:
45.9
Score:
1156
Min FPS:
19.5
Max FPS:
93.4

No difference from the 15.15 driver results.
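Setting the driver comparison aside, the overclock itself does scale in the numbers quoted above. A quick back-of-the-envelope check (a sketch; the clocks and FPS figures are taken straight from this post, and "efficiency" here just means the ratio of the two gains):

```python
# Back-of-the-envelope scaling check for the Heaven runs quoted above.
# Inputs come straight from the post; "efficiency" is simply
# (FPS gain fraction) / (core clock gain fraction).

def scaling_efficiency(base_clock, oc_clock, base_fps, oc_fps):
    """Return clock gain, FPS gain, and their ratio (all as fractions)."""
    clock_gain = oc_clock / base_clock - 1
    fps_gain = oc_fps / base_fps - 1
    return clock_gain, fps_gain, fps_gain / clock_gain

clock_gain, fps_gain, eff = scaling_efficiency(1050, 1150, 42.4, 45.9)
print(f"core: +{clock_gain:.1%}, fps: +{fps_gain:.1%}, efficiency: {eff:.0%}")
```

So the ~9.5% core bump bought roughly an 8% average-FPS gain, i.e. about 87% scaling - not nothing, even though the minimum FPS didn't move at all.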


----------



## Ceadderman

Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *p4inkill3r*
> 
> I got an infinite loop BSOD and DDU'd last night, thought I'd reinstalled with the latest but evidently I hadn't.
> 
> 
> 
> Which Windows, mate? For some ****** reason, win 7 and win 8 share the same driver, while win 8.1 has its own.

Likely 8. Unless he meant latest drivers.









~Ceadder


----------



## xer0h0ur

I thought AMD stopped supporting 8. In other words only making drivers for 7, 8.1, and 10


----------



## Ceadderman

Hadn't heard that. My brother is running 8/8.1. I'm still on 7 Ultimate.









~Ceadder


----------



## the9quad

You guys on Windows 10 with Furys: do you have AMD APP and VCE support? The AVT and OVE packages seem to be missing from the Windows 8.1 15.7 WHQLs, so there's no hardware video encoding acceleration or playback on Windows 8.1. Curious whether you have it on Windows 10 and whether it is working with your Furys.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Hadn't heard that. My brother is running 8/8.1. I'm still on 7 Ultimate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Well if you go to AMD's site and select Windows 8 instead of 8.1 then you're offered the Cat 14.4 instead of the 15.7 for Windows 8.1.


----------



## Ceadderman

Ahhhhhh. Understood.









~Ceadder


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Hadn't heard that. My brother is running 8/8.1. I'm still on 7 Ultimate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well if you go to AMD's site and select Windows 8 instead of 8.1 then you're offered the Cat 14.4 instead of the 15.7 for Windows 8.1.

Yeah, that's the last WHQL that is supported by AMD for Win 8.....they only support 7,8 and 10 now (as you said above)


----------



## p4inkill3r

Quote:


> Originally Posted by *Ceadderman*
> 
> Likely 8. Unless he meant latest drivers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I did mean drivers, and I"m on win7.


----------



## Ceadderman

Reinforcement of "ass-u-me" comes through loud and clear.









~Ceadder


----------



## cravinmild

wow 130 pages and im just finding this thread lol.

That aio cooled card is amazing





















Just a beautiful card


----------



## Jflisk

Ran my Fury X through its paces. It looks like it plays most games better than my tri-fire 290Xs. In 3DMark the one card is close to the three in score; one more of these and I think my 3DMark score will be over what I had before.

BF Hardline - all ultra, no stuttering.

Pretty much happy with the card, no odd noises. As a matter of fact my two D5s are the loudest things in my system.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jflisk*
> 
> Ran My Fury X thru Its paces. Looks like it plays most games better then my Tri-fire 290X's. In 3d mark looks like the one card is close to the 3 in score one more of these and I think my 3d mark score will be over what I had before.
> 
> BF Hardline - All ultra no stuttering
> 
> Pretty much happy with the card no odd noises. As a matter of fact my 2 D5 are the loudest things in my system.


Wat? You're getting some pretty low performance out of those 290X's if a single Fury X comes that close. Got any scores so I can compare with?

For comparison's sake this is my tri-fire performance in FS Ultra:

15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> Wat? You're getting some pretty low performance out of those 290X's if a single Fury X comes that close. Got any scores so I can compare with?
> 
> For comparison's sake this is my tri-fire performance in FS Ultra:
> 
> 15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
> 15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score


My results, 15.7 Firestrike Ultra. I think a second card would bring it up to par.
http://www.3dmark.com/fs/5374676

This is a run with the 3x290X
http://www.3dmark.com/fs/5163575


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Wat? You're getting some pretty low performance out of those 290X's if a single Fury X comes that close. Got any scores so I can compare with?
> 
> For comparison's sake this is my tri-fire performance in FS Ultra:
> 
> 15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
> 15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
> 
> 
> 
> My results 15.7 Firestrike ultra. Think the second card would bring it up on par.
> http://www.3dmark.com/fs/5374676
> 
> This is a run with the 3x290X
> http://www.3dmark.com/fs/5163575

What resolution are you running for games by chance?

This is what i'm getting: http://www.3dmark.com/3dm/7718265?

Graphics Score 7680 vs your 3 x 290x's:7996


----------



## xer0h0ur

Quote:


> Originally Posted by *Jflisk*
> 
> My results 15.7 Firestrike ultra. Think the second card would bring it up on par.
> http://www.3dmark.com/fs/5374676
> 
> This is a run with the 3x290X
> http://www.3dmark.com/fs/5163575


Dang, you're right. Taking crossfire scaling into account, two Fury Xs would seem to be practically equal to three 290Xs, and we're talking about overclocked cards at that. Out of curiosity, can you produce a gaming-stable overclocked Fury X bench on FS Ultra?
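The arithmetic behind that kind of comparison can be sketched like this - a rough model only. The single-card score and the per-extra-card scaling factor below are placeholder assumptions for illustration, not numbers from the linked 3DMark runs:

```python
# Toy model of multi-GPU score estimation: the first card counts fully,
# and each additional card contributes a fraction ("scaling") of a
# single card's score. Both inputs are illustrative assumptions.

def multi_gpu_score(single_score, n_cards, scaling=0.8):
    """Estimated graphics score for n_cards at the given per-extra-card scaling."""
    return single_score * (1 + scaling * (n_cards - 1))

fury_x_single = 3900  # hypothetical single Fury X FS Ultra graphics score
print(multi_gpu_score(fury_x_single, 2))  # two Fury Xs
print(multi_gpu_score(fury_x_single, 3))  # three, for comparison
```

Plugging real single-card and tri-fire scores into a model like this is how you back out the effective scaling factor people quote for XDMA crossfire.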


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> Dang you're right. Taking crossfire scaling into account 2 Fury X's would seem to be practically equal to 3 290X's and were talking about overclocked cards at that. Out of curiosity can you produce a gaming stable overclocked Fury X bench on FS Ultra?


I don't think they have the drivers straight yet. I tried a slight overclock, 5% over, and didn't have any problems in games. But when I go to the administrative logs my screen goes funky, so I took the overclock off and the logs work again. Then again, I am running Windows 10, so anything is possible at this point.


----------



## xer0h0ur

Ohhhhh, well yeah that will make a difference for sure. I haven't even touched Windows 10 yet. Can't compare there.


----------



## Jflisk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What resolution are you running for games by chance?
> 
> This is what i'm getting: http://www.3dmark.com/3dm/7718265?
> 
> Graphics Score 7680 vs your 3 x 290x's:7996


I run my games at 1080p, 60Hz (1920x1080). I still have a 3D monitor, so that's what I am stuck at for 3D games.


----------



## Partogi

I don't get it. Why would someone choose fury x over 980 ti?


----------



## flopper

Quote:


> Originally Posted by *Partogi*
> 
> I don't get it. Why would someone choose fury x over 980 ti?


Cool and silent.
980ti hot and loud.
Same gaming experience.

I would also go so far as to say that if you support a company that wants to close off its tech from everyone else, you should check your ethics and morals along the way too.
You can't trust Nvidia, it's as simple as that.


----------



## p4inkill3r

Quote:


> Originally Posted by *Partogi*
> 
> I don't get it. Why would someone choose fury x over 980 ti?


Assuming you're not trolling, because I like AMD.


----------



## Jflisk

Quote:


> Originally Posted by *Partogi*
> 
> I don't get it. Why would someone choose fury x over 980 ti?


Cool and quiet, and I dumped a rad box.


----------



## Partogi

Quote:


> Originally Posted by *p4inkill3r*
> 
> Assuming you're not trolling, because I like AMD.


No, I'm not trolling.


----------



## GorillaSceptre

Quote:


> Originally Posted by *flopper*
> 
> cool and silent.
> 980ti hot and loud.
> same gaming experience.
> 
> I would also go as far if you support a company that wants to close any tech from anyone else you should check your *ethic and morals* along the way also.
> You cant trust Nvidia its as simple as that.


----------



## p4inkill3r

Quote:


> Originally Posted by *Partogi*
> 
> No, I'm not trolling.


In that case, also because I've supported AMD since the 90s, have built dozens of computers with AMD/ATI components, and the relative difference between the 980Ti and the Fury X is within the margin of error.

I don't think nvidia is the devil, but my money goes to the Red team any time it can.


----------



## POOTYTANGASAUR

Sorry, but the 980ti isn't hot and loud. It isn't chilly, obviously, since it isn't water cooled. But it is priced the same and performs better. I went 980ti, coming from a 7950 (1200/1800) and an R9 290 (1225/1675). I definitely like AMD, but I am not biased and I will buy based on performance whenever it makes sense. Driver improvements should make the Fury X a lot more competitive in the future; tbh, based off of specs it should wreck the 980ti, but AMD just can't figure it out I guess lol.

EDIT: My favorite thing about this card is the overclocking (I have the EVGA ACX 2.0); mine is clocked at 1490/2004. The release of the Fury actually made me consider returning this for 2 of those. I am not ashamed to admit that hahaha.


----------



## Ceadderman

~Ceadder


----------



## derickwm

So close yet so far


----------



## mojobear

Hey guys,

As an owner of 4-way crossfire AMD, although the previous gen... I thought you all might like to see this assessment of scaling going from single to 4-way for the Fury X and Titan X. I'm not sure if it was the same Korean reviewer who did 4-way Titan vs 4-way 290X, but it's a very similar story: AMD pulls ahead once you start stacking multi-GPUs.

AMD scaling is quite good with their XDMA crossfire...it seems to be better than SLI.

http://www.techpowerup.com/forums/threads/an-epic-fury-x-review-quad-fury-x-vs-quad-titan-x.214231/


----------



## akumaburn

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> Sorry but the 980ti isn't hot and loud. It isn't chilly obviously, since it isn't water cooled. But is is priced the same and performs better. I went 980ti, coming from a 7950 (1200,1800) and r9 290 (1225,1675). I definitely like amd but I am not biased and I will buy based on performance whenever it makes sense. Driver improvements should make the Fury X alot more competitive in the future, tbh based off of specs it should wreck 980ti but amd just can't figure it out I guess lol.
> 
> EDIT: My favorite thing about this card is the overclocking, (i have evga acx 2.0) mine is clocked at 1490, 2004. The release of the Fury actually made me consider returning this for 2 of those. I am not ashamed to admit that hahaha.


It is an AIR-cooled card that is priced the same as a card that comes with a closed-loop liquid cooler...a darn good one too.

I'd say you get $60-100 in extra value from the cooler alone with the Fury X.


----------



## xer0h0ur

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> As a owner of 4 way crossfire AMD, although the prev gen... thought you all might like to see this assessment of scaling going from single to 4 way for fury x and titan x. I'm not sure if it was the same korean reviewer who did a 4 way titan vs 4 way 290x but its a very similar story. AMD pulls ahead once you start stacking multi-gpus.
> 
> AMD scaling is quite good with their XDMA crossfire...seems to be better than SLI.
> 
> http://www.techpowerup.com/forums/threads/an-epic-fury-x-review-quad-fury-x-vs-quad-titan-x.214231/


Thanks for that. Very interesting.


----------



## royfrosty

Finally my x99 + i7-5820k arrived.

Still waiting for my gskills ram though.

Gonna put this with my Fury X. Probably another for xfire.


----------



## HiTechPixel

I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


----------



## Casey Ryback

Quote:


> Originally Posted by *HiTechPixel*
> 
> I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


lol, then you can expect to keep crying many times.

The 4790K is a quad core that costs almost as much.

The 5930K is useless to someone not using the extra lanes.

To each their own.


----------



## blue1512

Quote:


> Originally Posted by *HiTechPixel*
> 
> I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


It's off topic, but for me the 5820K beats the 5930K on price/perf unless your plan is tri- or quad-fire/SLI. And a soldered 6-core with more PCIe lanes and DDR4 is better than a TIM'd 4-core any day of the week.


----------



## royfrosty

Quote:


> Originally Posted by *HiTechPixel*
> 
> I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


Never before has Intel priced a 6-core extreme-series i7 this close to the mainstream i7-4790K. Let's be honest: the i7-5820K is a good entry point to the LGA2011-v3 socket. In Singapore, this board + CPU combo costs S$890.

A high-end board from any of the major brands (ASRock, MSI, Asus, Gigabyte) paired with an i7-4790K would cost anywhere from S$820-860. It makes more sense to buy the i7-5820K with a small top-up of S$30-50.

DDR4 prices have taken a plunge - not locally in Singapore, but in the States - so I just bought the G.Skill Ripjaws 3000MHz 4x4GB kit at US$149.99.

Which is the cost of a DDR3 16GB (4x4GB) kit in Singapore.


----------



## Balsagna

Quote:


> Originally Posted by *HiTechPixel*
> 
> I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


Why?

As long as the 5820K is clocked about the same as a 4790K, the difference in performance is nearly the same... you still get a small platform upgrade using X99, even though it doesn't appear to help much in the gaming section. You can't judge what the user needs the 5820K for. Price/Performance it's a great CPU on the X99, I'd pick it over the 5930K if I was only using single GPU's.


----------



## xer0h0ur

LOL, yeah, a 4790K over a 5820K on the eve of 6-core optimized DX12. I will have what you're smoking please.


----------



## HiTechPixel

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, yeah, a 4790K over a 5820K on the eve of 6-core optimized DX12. I will have what you're smoking please.


Nowhere is it implied that DX12 is specifically or specially optimized for 6 cores and 6 cores only. So I'm not sure where you got that misinformation from.

And to all of you others, X99 is an enthusiast platform. If you want price per performance or vice versa you're seriously better off buying a 4790K.


----------



## xer0h0ur

I actually read several articles months back about DX12 being optimized for up to 6-core processors, though people can of course use 8-core processors with a marginal difference. Once I get my work done I will dig them up for you.


----------



## Casey Ryback

Quote:


> Originally Posted by *HiTechPixel*
> 
> Nowhere is it implied that DX12 is specifically or specially optimized for 6 cores and 6 cores only. So I'm not sure where you got that misinformation from.
> 
> And to all of you others, X99 is an enthusiast platform. If you want price per performance or vice versa you're seriously better off buying a 4790K.


You obviously didn't read royfrosty's post. It barely costs any more.

Plus, as another user said, how can you judge how a person uses a PC? They might want the extra cores and threads for various reasons.

You're just being ignorant atm.

Saying you cry when someone buys something you don't see any value in..............grow up.

Edit - Damn, just realised this is the owners thread for Fury cards; why would you troll someone's CPU purchase decision in here anyway?

My bad too for contributing to this stupid argument; I should've just reported your post and been done with it.


----------



## DNMock

Been hectic at work lately, haven't had a chance to keep up and just wanted to check in on the progress of overclocking the Fury-X.


----------



## xer0h0ur

The progress is no progress. Still waiting on Afterburner / Trixx updates to manage voltage on Fiji XT / Pro


----------



## xer0h0ur

Quote:


> Originally Posted by *HiTechPixel*
> 
> Nowhere is it implied that DX12 is specifically or specially optimized for 6 cores and 6 cores only. So I'm not sure where you got that misinformation from.
> 
> And to all of you others, X99 is an enthusiast platform. If you want price per performance or vice versa you're seriously better off buying a 4790K.


I can't find the exact article I was thinking of, but towards the bottom of this one: http://www.pcworld.com/article/2900814/tested-directx-12s-potential-performance-leap-is-insane.html they tested on 2- and 4-core processors with and without HT, and you can see how it scales linearly with more cores. Unfortunately they didn't have a 6- or 8-core processor to show it scaling further.

This is an AMD slide showing their draw-call scaling from 2-8 cores:


It is worth noting, however, that if you're not CPU bound - which you typically aren't outside of multi-GPU setups - then you won't actually get a benefit from having 6 (or 8, for that matter) versus 4 CPU cores. So if you're running a high-end single-GPU setup, you will likely be just fine with a 4-core CPU.

If I find the article I was thinking of with the 6/8-core CPU testing, I will post it.
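The scaling picture in those tests can be captured with a toy model - purely illustrative numbers, not benchmark data: draw-call throughput grows roughly linearly with CPU cores until the GPU becomes the bottleneck, which is why going from 6 to 8 cores often shows only a marginal difference.

```python
# Toy model of DX12 draw-call scaling: throughput rises linearly with
# CPU cores until it hits what the GPU can consume. Both the per-core
# rate and the GPU cap below are illustrative assumptions.

def draw_call_throughput(cores, per_core=2.0e6, gpu_limit=13.0e6):
    """Draw calls/sec: linear in core count, capped by the GPU limit."""
    return min(cores * per_core, gpu_limit)

for cores in (2, 4, 6, 8):
    print(cores, draw_call_throughput(cores))
```

With these made-up numbers, 2 to 6 cores scale linearly, while the jump from 6 to 8 is clipped by the GPU cap - matching the "marginal difference beyond 6 cores" point above.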


----------



## kayan

Quote:


> Originally Posted by *royfrosty*
> 
> Finally my x99 + i7-5820k arrived.
> 
> Still waiting for my gskills ram though.
> 
> Gonna put this with my Fury X. Probably another for xfire.


Quote:


> Originally Posted by *HiTechPixel*
> 
> I always cry when people choose the 5820K on the X99 platform. Might as well choose the 4790K instead or save up for the 5930K.


I too got a 5820K and the X99 platform over the 4790K, the reason being that the X99 CPU + motherboard was a whole 20 bucks more than the 4790K + mobo. Six cores vs four cores, and mine is clocked at 4.5GHz, so that's a wash. Why not, when there is little to no price difference? At least if you have a Microcenter nearby. The only real price difference is DDR3 vs DDR4, and even those prices have dropped a lot.

Anyway, on topic: how is Witcher 3 running on these Furys with the new drivers? If anyone has a 3440x1440 resolution, I'd love to see some benchmarks - W3 and BF4 mostly!


----------



## rv8000

Quote:


> Originally Posted by *kayan*
> 
> I too got a 5820k and x99 platform over the 4790k, reason being the x99 for cpu + motherboard was a whole 20 bucks more than the 4790k + mobo. 6 cores vs 4 cores, and mine is clocked at 4.5ghz, so that's a wash. Why not when there is little to no price difference? At least if you have a Microcenter nearby. The only real price difference is when it comes to DDR3 vs DDR4, and even those prices have dropped a lot.
> 
> Anyway, on topic, how is Witcher 3 running on these Furies with the new drivers? If anyone has 3440x1440 resolution, I'd love to see some benchmarks, W3 and BF4 mostly!


What settings did you want again? Finally got the new card in and everything all setup. Can only do 2560x1440 though.


----------



## armartins

Guys, don't forget that with 15.7 you can emulate 4K via AMD's VSR (AMD's counterpart to NVIDIA's DSR).


----------



## kayan

Quote:


> Originally Posted by *rv8000*
> 
> What settings did you want again? Finally got the new card in and everything all setup. Can only do 2560x1440 though.


W3 - Everything maxed except HairWorks. And also everything maxed without AA. Please and thank you.

BF4 - everything maxed as well. Thanks.


----------



## rv8000

Quote:


> Originally Posted by *kayan*
> 
> W3 - Everything maxed, except hairworks. And also everything maxed without AA. Please and thank you.
> 
> BF4 - everything maxed as well. Thanks.


4670k @ 4.2, 16GB of ddr3 2400, Fury X @ stock, cat 15.7

For W3 I ran from the first town in an arc into the forest past the river to the devil by the well, fought a pack of 3 ghouls on the way, about a 5 min run depending on the fight.

1440p W3 Ultra no GW no AA: Min - 47 Avg - 53 Max - 62
1440p W3 Ultra no GW in game AA: The same max and average, only saw a 1 fps decrease in min fps.

Don't own BF4 sorry


----------



## Shatun-Bear

A 5820K overclocked to its average achievable 4.4-4.5GHz will eat a 4790K for lunch in any multi-threaded task, even with the Haswell chip at its own average attainable overclock. And really, that's all that matters, as real-world gaming performance for any top-end Intel CPU is more or less the same, modern games not being CPU bound. Haswell-E really does overclock like a dream; mine's at 4.5GHz for 24/7 use with only 1.23V core voltage. Sorry, off topic.

A 5820K paired with a Fury is a very sensible setup for future games and DX12/Vulkan.


----------



## rv8000

Can some other Fury X owners run Firestrike @ stock with GPU-Z open to monitor max vcore? I'd like to get an idea of what the actual voltage seems to be on retail cards.

My card hit a max of 1.219v, and under load during the 3DMark tests it seems to sit around 1.18v.


----------



## xer0h0ur

Is anyone able to record a video of SoM performance for Fury X with the 15.7?


----------



## kayan

Quote:


> Originally Posted by *rv8000*
> 
> 4670k @ 4.2, 16GB of ddr3 2400, Fury X @ stock, cat 15.7
> 
> For W3 I ran from the first town in an arc into the forest past the river to the devil by the well, fought a pack of 3 ghouls on the way, about a 5 min run depending on the fight.
> 
> 1440p W3 Ultra no GW no AA: Min - 47 Avg - 53 Max - 62
> 1440p W3 Ultra no GW in game AA: The same max and average, only saw a 1 fps decrease in min fps.
> 
> Don't own BF4 sorry


Thanks. Rep for you.


----------



## josephimports

Quote:


> Originally Posted by *rv8000*
> 
> Can some other Fury X owners run Firestrike @ stock with GPU-Z open to monitor max vcore, I'd like to get an idea of what the actual voltage seems to be on retail cards.
> 
> My card hit a max of 1.219v and under load during the 3dmark tests seem to sit around 1.18v.


1.219v and 1.244v with the latter being the better OC'er.


----------



## rsiyasena

Hey guys, I'm currently experiencing some weird issues on my setup. I bought the Fury X a few weeks ago along with the necessary active DP->DVI-D adapter (http://www.amazon.com/gp/product/B00A493CNY?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00) to allow it to run my QX2710 monitor.

First thing I noticed is that whenever the input to the monitor gets "reset", aka whenever I apply new changes in CCC, the monitor flashes with lines all over the screen and then goes back to normal. Here is a picture of the response: http://i.imgur.com/KsOJEPL.jpg

Similarly, when I attempt to play certain games I observe those lines, except this time they don't disappear. Games like Battlefield 4, Witcher 3, Far Cry, Dota 2, Sniper V2, and DiRT Rally all work fine, albeit showing the lines for a second. However, games like Wolfenstein: The Old Blood and Hawken "reset" my monitor input and I'm presented with the no-signal color pattern on my screen. Response:


http://imgur.com/p7lVo


I've tried resetting my active adapter during the no-signal response to no effect. Any ideas?

My build:

http://pcpartpicker.com/user/rsiyasena/saved/36zxFT


----------



## Bludge

Quote:


> Originally Posted by *Casey Ryback*
> 
> Weren't 290X 8GB's enough for 4K? damn ultra HD...........


Quote:


> Originally Posted by *rsiyasena*
> 
> Hey guys, I'm currently experiencing some weird issues on my setup. I bought the Fury X a few weeks ago along with the necessary active DP->DVI-D adapter (http://www.amazon.com/gp/product/B00A493CNY?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00) to allow it to run my QX2710 monitor.
> 
> First thing I noticed is that whenever the input to the monitor gets "reset", aka whenever I apply new changes in CCC, the monitor flashes with lines all over the screen and then goes back to normal. Here is a picture of the response: http://i.imgur.com/KsOJEPL.jpg
> 
> Similarly, when I attempt to play certain games I observe those lines, except this time they don't disappear. Games like Battlefield 4, Witcher 3, Far Cry, Dota 2, Sniper V2, and DiRT Rally all work fine, albeit showing the lines for a second. However, games like Wolfenstein: The Old Blood and Hawken "reset" my monitor input and I'm presented with the no-signal color pattern on my screen. Response:
> 
> 
> http://imgur.com/p7lVo
> 
> 
> I've tried resetting my active adapter during the no-signal response to no effect. Any ideas?
> 
> My build:
> 
> http://pcpartpicker.com/user/rsiyasena/saved/36zxFT


Hi mate, I'm running a Dell adapter that looks very similar to yours, I can get you the model if needed. I have the adapter running a QX2710 like you, and a Samsung U28 4K through displayport. Initially on reset the Qnix monitor would display the same lines, and needed an unplug and replug to sort. I since moved the USB power to an onboard USB 2 port, rather than a USB 3 and reconnected everything, and since then all is fine.

No idea if the USB 3 -> 2 was it, or just retightening connections, sorry.


----------



## AKA1

What kind of memory clocks is everyone getting? I am at 555MHz: http://i.imgur.com/a5mEf9h.png I just enabled the official overclocking mode option in Afterburner, and the memory slider worked after that. I can't wait to overvolt this thing and see where it flies!!!
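For context, here is the back-of-envelope math on what that slider is worth, assuming Fiji's 4096-bit HBM interface and the 500 MHz stock memory clock:

```python
# Raw memory bandwidth for a wide HBM bus (assumed Fiji figures:
# 4096-bit interface, double data rate, 500 MHz stock clock).

def hbm_bandwidth_gbs(mem_clock_mhz, bus_width_bits=4096):
    # bits/s = bus width * 2 (DDR) * clock; convert down to GB/s
    return bus_width_bits * 2 * mem_clock_mhz * 1e6 / 8 / 1e9

print(hbm_bandwidth_gbs(500))  # stock: 512.0 GB/s
print(hbm_bandwidth_gbs(555))  # 555 MHz OC: ~568 GB/s
```

So a 55 MHz bump is roughly an extra 56 GB/s of theoretical bandwidth, which is why even small HBM overclocks are interesting.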


----------



## rsiyasena

Quote:


> Originally Posted by *Bludge*
> 
> Hi mate, I'm running a Dell adapter that looks very similar to yours, I can get you the model if needed. I have the adapter running a QX2710 like you, and a Samsung U28 4K through displayport. Initially on reset the Qnix monitor would display the same lines, and needed an unplug and replug to sort. I since moved the USB power to an onboard USB 2 port, rather than a USB 3 and reconnected everything, and since then all is fine.
> 
> No idea if the USB 3 -> 2 was it, or just retightening connections, sorry.


Sweet, I'll give it a shot when I get home Thursday. I'm assuming you play games on your 4K display only? If it's not too much trouble, could you test out a few games on your QNIX and let me know if you lose all display signal? Also, how do you like the Fury X?


----------



## p4inkill3r

Quote:


> Originally Posted by *AKA1*
> 
> What kind of memory clocks is everyone getting. I am at 555mhz http://i.imgur.com/a5mEf9h.png I just checked the use offical overclock method in afterburner and the slider for the memory worked after that
> 
> 
> 
> 
> 
> 
> 
> . I can't wait to overvolt this thing and see where it fly's!!!


Anything above 600mhz crashes the driver for me.


----------



## rv8000

Quote:


> Originally Posted by *josephimports*
> 
> 1.219v and 1.244v with the latter being the better OC'er.


Did the vcore hover about 40mV lower during the 3d tests 90% of the time? While my peak was 1.219, the card seemed to be closer to running @ 1.18v for the majority of the 3D tests.

As it stands, the vcore for Fiji seems VERY similar to Hawaii. I'm expecting to see ~1200MHz with around +140mV depending on the card, and maybe 1250MHz on an average card under a custom block with ~200mV. The golden cards may hit 1300MHz. I'm really not expecting any miracles at this point.


----------



## flopper

Quote:


> Originally Posted by *rsiyasena*
> 
> Hey guys, I'm currently experiencing some weird issues on my setup. I bought the Fury X a few weeks ago along with the necessary active DP->DVI-D adapter (http://www.amazon.com/gp/product/B00A493CNY?psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00) to allow it to run my QX2710 monitor.
> 
> First thing I noticed is that whenever the input to the monitor gets "reset", aka whenever I apply new changes in CCC, the monitor flashes with lines all over the screen and then goes back to normal. Here is a picture of the response: http://i.imgur.com/KsOJEPL.jpg
> 
> Similarly, when I attempt to play certain games I observe those lines, except this time they don't disappear. Games like Battlefield 4, Witcher 3, Far Cry, Dota 2, Sniper V2, and DiRT Rally all work fine, albeit showing the lines for a second. However, games like Wolfenstein: The Old Blood and Hawken "reset" my monitor input and I'm presented with the no-signal color pattern on my screen. Response:
> 
> 
> http://imgur.com/p7lVo
> 
> 
> I've tried resetting my active adapter during the no-signal response to no effect. Any ideas?
> 
> My build:
> 
> http://pcpartpicker.com/user/rsiyasena/saved/36zxFT


Quote:


> Originally Posted by *Bludge*
> 
> Hi mate, I'm running a Dell adapter that looks very similar to yours, I can get you the model if needed. I have the adapter running a QX2710 like you, and a Samsung U28 4K through displayport. Initially on reset the Qnix monitor would display the same lines, and needed an unplug and replug to sort. I since moved the USB power to an onboard USB 2 port, rather than a USB 3 and reconnected everything, and since then all is fine.
> 
> No idea if the USB 3 -> 2 was it, or just retightening connections, sorry.


Yeah, the adapter needs a charger (external USB power) to boost the voltage for stability.
That was my experience also.


----------



## Silent Scone

So diddy.


----------



## ban25

Does anyone know if the Fury X supports 4 concurrent displays? I'm currently trying to set up a 4-display desktop, but the monitor connected to the HDMI port is not getting a signal.


----------



## hyp36rmax

Quote:


> Originally Posted by *Silent Scone*
> 
> So diddy.


Nice! Hurry and post with it on your FURY!


----------



## bastian

Quote:


> Originally Posted by *kayan*
> 
> I too got a 5820k and x99 platform over the 4790k, reason being the x99 for cpu + motherboard was a whole 20 bucks more than the 4790k + mobo. 6 cores vs 4 cores, and mine is clocked at 4.5ghz, so that's a wash. Why not when there is little to no price difference? At least if you have a Microcenter nearby. The only real price difference is when it comes to DDR3 vs DDR4, and even those prices have dropped a lot.


The 5820k is the best bang for buck CPU on X99 and overclocks in many cases better than the top X99 CPU.


----------



## bonami2

Hey, uh, my post got deleted.

Anyway, I don't remember exactly what I wrote, haha.

I love my 4790K; with the Canadian dollar's current value I saved a lot of money.


----------



## nyboy42

Anyone else getting FreeSync NOT working when VSR is being used on 15.7? I have a native 1440p 144Hz monitor, and when I game in the supported 3200x1800 60Hz VSR mode I notice screen tearing, which means FreeSync is not working (it's definitely enabled in Catalyst).


----------



## aDyerSituation

Any word on the Nano? It seems the most interesting.


----------



## bonami2

Does VSR work in Eyefinity?

Tried to find the setting manually, but I think I need to Google it.


----------



## DividebyZERO

Quote:


> Originally Posted by *bonami2*
> 
> Does VSR work in Eyefinity?
> 
> Tried to find the setting manually, but I think I need to Google it.


Yes, VSR works in Eyefinity; the link in my sig is based on VSR in Eyefinity.


----------



## Tivan

Quote:


> Originally Posted by *bonami2*
> 
> Does VSR work in Eyefinity?
> 
> Tried to find the setting manually, but I think I need to Google it.


Seems to work on my 2x1080p monitors!

Set them to 1440p on the desktop after enabling VSR, then made the Eyefinity group.


----------



## bonami2

Quote:


> Originally Posted by *DividebyZERO*
> 
> Yes VSR works in eyefinity the link in my sig is based on VSR in eyefinity


OK, so I need to rob a bank now... Anyone? Haha.

So, if I understand correctly, it's like a 4K display split into four = 1080p x 4, and you add VSR on top of that?

Any idea if a 1080p IPS triple-monitor setup would look as awesome with VSR? All I know is it helps a lot with AA and such; in Euro Truck Simulator it makes the game look awesome.

I need a lot more GPU horsepower for that kind of resolution, haha.


----------



## bonami2

So a 7950 can push 1440p x 3 at 25 FPS in BeamNG on high, with some settings turned down.

Wow, at normal resolution I'm at like 35 FPS... I expected worse. If I'd known, I would have gotten three 1440p panels.

Anyway, I need a new GPU.

Looks awesome, except in desktop mode.


----------



## Alastair

Guys, what is the bolt spacing for the Fiji chip? I want to work out whether my current universal Watercool Heatkiller GPU-X3 blocks will be able to mount on the card. Then I will work out a way to cool the VRMs.


----------



## ozyo

So, any update on the voltage controller?


----------



## New green

"The card uses noticeably less power however and the custom PCB may prove to be a boon for overclocking." http://wccftech.com/wip-sapphire-fury-quieter-liquid-cooled-fury-asus-strix-significantly-power/

Would a 10+2 power phase design overclock the cut-down Fiji with better voltage control, or just more efficient voltage delivery? I know the reference PCB is easier to water block at the moment; not sure if there are any blocks that will fit the Strix. I guess other AIBs will be announcing their Fury lines soon as well.


----------



## Ceadderman

That R9 395X2 has gotta be making NVIDIA sweat bullets.

Same price point as the 980 Ti and as good as their cards?

What's this about AMD going bust again?









~Ceadder


----------



## hyp36rmax

Quote:


> Originally Posted by *Ceadderman*
> 
> That R9 395X2 has gotta be making NVIDIA sweat bullets.
> 
> Same price point as the 980 Ti and as good as their cards?
> 
> What's this about AMD going bust again?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


What's this news about the R9 395X2 Fury X2!?!? haha i want


----------



## boredmug

The solitaire benches were incredible I must say. Lol


----------



## Casey Ryback

Quote:


> Originally Posted by *hyp36rmax*
> 
> What's this news about the R9 395X2 Fury X2!?!? haha i want


Wouldn't a 395X2 just be a 295X2 but with double vram?

Wonder what price they'll release the fury X2 for?


----------



## Ceadderman

IKR?

I saw those and was like...

But to be fair to the reviewer, there really is no current game that could give the newer AMD cards much of a workout. Notice the reviewer said HBM regarding the R9 395X2, so until benchmarks are optimized we won't get real-world testing for a little while.

Did you see the Tri-X review? Made me want to pull the trigger on the Tri-X right now. A 300W card that's quieter than any card on the market right now, including the Fury X.

But the 395X2 is likely the card for me. Freaking 2 bills though. So that fits my time frame for the upgrade.









~Ceadder


----------



## Dtrain

Does anyone know when we'll be able to purchase the Fury cards? I figured they'd be up by now on Newegg since they supposedly release tomorrow, but I'm not seeing anything.


----------



## rv8000

Quote:


> Originally Posted by *Dtrain*
> 
> Does anyone know when we'll be able to purchase the Fury cards? I figured they'd be up by now on Newegg since they supposedly release tomorrow, but I'm not seeing anything.












The release date is the 14th; why would they be on the Egg on the 13th? I'd expect a similar schedule of mid-morning PST/EST again for the Egg.


----------



## Dtrain

Quote:


> Originally Posted by *rv8000*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The release date is the 14th, why would they be on the egg on the 13th? I'd expect it to have a similar time schedule of mid morning pst/est again for the egg.


Well, I was mainly thinking they would have them for pre-order or something of the sort. I could have sworn that after the NDA for the Fury X/300 series they were online but you couldn't buy. I'm just trying to purchase one and not get left out with a backorder like the Fury X.


----------



## Ceadderman

Newegg rarely if ever puts anything on Preorder.









~Ceadder


----------



## Dtrain

Maybe I'm just thinking of every other website having the Fury X on backorder and pre-order status. Here's hoping we all get our cards tomorrow.


----------



## bonami2

Quote:


> Originally Posted by *Ceadderman*
> 
> IKR?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I saw those an was like...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But to be fair to the reviewer, there really is no current game that could give the newer AMD cards much of a workout. Notice Reviewer said HBM regarding R9 395x2 so really, until benches are optimized we won't get real world testing for a little while.
> 
> Did you see the TriX Strix review? Made me want to pull the trigger on TriXX right now. 300w card that's quieter than any card on the market right now including Fury X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But 395x2 is likely the card for me. Freaking 2Bills though. So that fits my time frame for the upgrade.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


My 5760x1080 setup wants more power than a single Titan X to max out some of my games that don't support SLI or CrossFire.


----------



## DividebyZERO

So I have what I think is a good question about the Fury X, 4K over HDMI, and VSR.

We know the Fury X doesn't have HDMI 2.0, right? So what's stopping someone from taking a 4K monitor/TV, using CRU to set it up with a 1080p maximum resolution, then enabling VSR for 4K and checking whether the scaling is 1:1 pixel-wise? Wouldn't you then get "4K" at 60Hz over HDMI 1.4? I would love to test this, as I have both 4K 60Hz and 30Hz panels, but no damn Fury cards to try it on...
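For what it's worth, the napkin math behind the idea (approximate figures: HDMI 1.4 tops out around a 340 MHz pixel clock, and the flat 10% blanking overhead below is an assumption, since real blanking varies by timing standard) shows why the trick could only ever carry a 1080p signal over the cable, with the 4K render happening GPU-side via VSR:

```python
# Why 4K60 doesn't fit over HDMI 1.4 but 1080p60 easily does.
# 340 MHz is the approximate HDMI 1.4 TMDS limit; the 10%
# blanking overhead is a rough assumption, not a CVT calculation.

HDMI_14_MAX_PIXEL_CLOCK_MHZ = 340

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.10):
    """Crude pixel-clock estimate: active pixels plus ~10% blanking."""
    return width * height * refresh_hz * blanking / 1e6

print(approx_pixel_clock_mhz(1920, 1080, 60))  # ~137 MHz: fits with room to spare
print(approx_pixel_clock_mhz(3840, 2160, 60))  # ~547 MHz: far over the 1.4 limit
```

So the cable only ever sees a 1080p60 stream; whether VSR's downscale of the 4K render looks acceptably sharp on a native 4K panel is exactly the open question.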


----------



## bonami2

Just tried DSR at 5760x1080 to get 7680x1440.

Asked my friend if she saw anything different... Nope.

And I can't see anything either.

Tried 3 games: Arma 3 solo on my server, BeamNG, and Spintires.

Maybe my games aren't fluid enough at 10-20 FPS, but even then it should show.

AA makes all the difference with jaggies.

Seems I need a 1440p or 4K panel if I want higher quality.


----------



## aDyerSituation

Quote:


> Originally Posted by *bonami2*
> 
> Just tried dsr on 5760x1080 to get 7680x1440p
> 
> Asked my friend if she see something different... Nope
> 
> And i cant see anything either
> 
> Tried 3 game.... Arma 3 in solo on my server and beam ng + Spin tire
> 
> Maybe my game arent fluid enough. 10-20fps. But even then it should..
> 
> Aa make all the difference in jaggy..
> 
> Seem i need to get 1440p or 4k panel if i want higher quality


DSR just makes text too hard to read and the game super blurry for me


----------



## Ceadderman

Quote:


> Originally Posted by *DividebyZERO*
> 
> So i have a good question i think about Fury x and 4k with HDMI and VSR.
> 
> *So we know so far Fury x doesn't have hdmi 2.0 right?* So whats stopping someone from using a 4k monitor/tv and using CRU to set it up as a 1080p max on resolution? Then enable VSR for 4k and see if the scaling is 1:1 pixel wise? Then you would get 60hz 4k over HDMI 1.4? I would love to test this as i have both 4k 60hz and 30hz panels, but no damn fury cards to try.....


The AMD speaker stated they're working on that, suggesting a DisplayPort dongle to remedy the issue.

Probably similar to the dongles they came up with to run multiple displays on the 5000 series, but converting the signal to allow HDMI 2.0 bandwidth from the card's HDMI 1.4 output.









~Ceadder


----------



## tx12

According to the Fury non-X's BIOS, the air-cooled Fury may use software locks to disable the extra CUs.
In other words, the air Fury MAY be unlockable like some R9 290s before it, by reflashing with the Fury X BIOS.


----------



## blue1512

Quote:


> Originally Posted by *tx12*
> 
> According to Fury non-X's BIOS, Air Fury may use software locks to disable extra CU's.
> On other words, Air Fury MAY be unlockable like some of R9 290's before, by reflashing with Fury X BIOS.


The problem is that the Fury X BIOS also handles pump control, which means flashing it onto an air-cooled Fury is close to impossible.


----------



## tx12

Quote:


> Originally Posted by *blue1512*
> 
> The problem is the FuryX BIOS is dedicated to pump control, which means flashing on an air cooled Fury is close to impossible


That's not the problem. Software-unlockable air Furys could be rare or might not even exist. That's the problem.


----------



## tx12

Based on review photos I was able to find these part numbers:
*215-0862000* - Fiji ES chip, found on a Sapphire Fury Tri-X press photo.
*215-0862040* - Full Fiji, Fury X production and ES chips, found in every Fury X review.
*215-0862046* - Cut-down Fiji(?), air Fury, found on the Sapphire Fury Tri-X and Asus R9 Fury STRIX.

If there's any chance of an unlockable air Fury, it should carry the same number as the full Fiji; a different number most likely points to a hardware-locked variant.


----------



## Gdourado

Arrived!




I'm now officially in the club!








It's going to be a very interesting afternoon...


----------



## Luxkeiwoker

Quote:


> Originally Posted by *tx12*
> 
> According to Fury non-X's BIOS, Air Fury may use software locks to disable extra CU's.
> On other words, Air Fury MAY be unlockable like some of R9 290's before, by reflashing with Fury X BIOS.


Also hoping for that. Thinking of buying an R9 Fury right now, in case some early models can be unlocked.


----------



## swiftypoison

Quote:


> Originally Posted by *Gdourado*
> 
> Arrived!
> 
> 
> 
> 
> I'm now officially in the club!
> 
> 
> 
> 
> 
> 
> 
> 
> It's going to be a very interesting afternoon...


Nice!

Where are you going to install the rad? I have a 450D as well and can't place it in the back, so let me know if it's possible to install it in the front.


----------



## Ha-Nocri

Quote:


> Originally Posted by *swiftypoison*
> 
> Nice!
> 
> Where are you going to install the rad? I have a 450D as well and cant place it in the back so let me know if its possible to install it in the front.


In the top?


----------



## HaloFx

Anyone in the US seeing the fury pro for sale yet?


----------



## Forceman

Quote:


> Originally Posted by *tx12*
> 
> Based on review photos I were able to find these part numbers:
> *215-0862000* - Fiji ES chip, found on Sapphire Fury Tri-X press photo.
> *215-0862040* - Full Fiji, Fury X production and ES chips, found in every Fury X review.
> *215-0862046* - Cutdown Fiji(?), Fury Air found on Sapphire Fury Tri-X and Asus R9 Fury STRIX.
> 
> If there are any chances for unlockable Fury Air, they should have the same number as in full Fury. Different number must point to HW-locked variant.


We need a Fiji info tool to identify unlockable cards, like we had with Hawaii.


----------



## bonami2

Quote:


> Originally Posted by *blue1512*
> 
> The problem is the FuryX BIOS is dedicated to pump control, which means flashing on an air cooled Fury is close to impossible


I don't think it will change anything except the RPM. The fan and the pump are PWM-controlled by percentage, so the pump would just run slower; in that case you'd flash a high fan-speed profile or something, no?


----------



## xer0h0ur

Quote:


> Originally Posted by *blue1512*
> 
> The problem is the FuryX BIOS is dedicated to pump control, which means flashing on an air cooled Fury is close to impossible


And that close to impossible turns into very probable with people slapping waterblocks on Fury.


----------



## Clockster

If anyone is interested, I am busy with a review of the Asus R9 Fury X and thought I would update this thread as I go through the benches etc.

FireStrike Stock vs Overclock.
http://www.3dmark.com/compare/fs/5399120/fs/5399173

FireStrike Extreme Stock vs Overclock.
http://www.3dmark.com/compare/fs/5399267/fs/5399205

FireStrike Ultra Stock vs Overclock.
http://www.3dmark.com/compare/fs/5399291/fs/5399236


----------



## HaloFx

Sapphire Fury tri-x oc is on newegg.


----------



## rv8000

Quote:


> Originally Posted by *HaloFx*
> 
> Sapphire Fury tri-x oc is on newegg.


Do you have a link? It's not showing up under any category I search.


----------



## HaloFx

Filter on the left when looking at all cards. Under the R9 section.


----------



## Agent Smith1984

ATTACK

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202156&cm_re=sapphire_r9-_-14-202-156-_-Product


----------



## p4inkill3r

Quote:


> Originally Posted by *HaloFx*
> 
> Sapphire Fury tri-x oc is on newegg.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814202156&cm_re=amd_radeon_r9_fury-_-14-202-156-_-Product


----------



## HaloFx

http://m.newegg.com/Product/index?itemnumber=N82E16814202156


----------



## rv8000

Quote:


> Originally Posted by *HaloFx*
> 
> Filter on the left when looking at all cards. Under the R9 section.


There's only the Fury X under the R9 drop-down menu on both the US and CA sites. Post a link or stop trolling.

Edit: Thanks for the links.


----------



## Clockster

lol I told you guys


----------



## Ceadderman

That price is $200 more than the MSRP it was listed at. Blast, Newegg!

CPU shows that card at $365 or thereabouts.









~Ceadder


----------



## p4inkill3r

Quote:


> Originally Posted by *Ceadderman*
> 
> That price is $200 more than what MSRP is listed at. Blast Newegg!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU shows that card at $365 or theabouts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I wonder how many they'll sell before people realize that.


----------



## tx12

Quote:


> Originally Posted by *Ceadderman*
> 
> That price is $200 more than what MSRP is listed at. Blast Newegg!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU shows that card at $365 or theabouts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Isn't $549 the Fury's MSRP, plus a little extra from Sapphire for the OC?


----------



## Agent Smith1984

Fury pro was expected to be $550 all along....

Maybe Nano will be in the $365 range?


----------



## jase78

Is the air Fury supposed to be $369 or something? When I look at the price on Newegg it's $569, not $769. What do you mean, $200 over MSRP?


----------



## criminal

Quote:


> Originally Posted by *jase78*
> 
> Is fury air supposed to be 369 or something? When I look at price on newegg it's 569 not 769? What u mean 200 over msrp


Someone is confused I guess. Fury has always had a MSRP of $550.


----------



## Agent Smith1984

From the time the Fury air was announced the price was $549....
Not sure where the $36* numbers are coming from.


----------



## DividebyZERO

Limit of 1 yay!


----------



## Ceadderman

Quote:


> Originally Posted by *jase78*
> 
> Is fury air supposed to be 369 or something? When I look at price on newegg it's 569 not 769? What u mean 200 over msrp


Now I could be wrong here. But just pulled this out to check...



And looked at this again...



Now it looks exactly the same to me. Am I wrong?









Taken with my S4 so my apologies for the ghetto pix.









~Ceadder


----------



## jase78

You guys wanna chip up and get me one. Probably only cost you a few bux each. I'll let ya know how it performs


----------



## criminal

Quote:


> Originally Posted by *Ceadderman*
> 
> Now I could be wrong here. But just pulled this out to check...
> 
> 
> 
> And looked at this again...
> 
> 
> 
> Now it looks exactly the same to me. Am I wrong?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Taken with my S4 so my apologies for the ghetto pix.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


That is the R9 390 (I have that issue at home). It says R9 300 series right on the cover!


----------



## Ceadderman

I know what it says. I have it sitting on the floor right in front of me.

But I'm thinking this "Air" reference is hyperbole, because that card on the Egg looks exactly the same. CPU even mentions that the card I put up has been "optimized".

Which leads to my confusion: it's Sapphire, it's been optimized, and it's air-cooled. Also, these photos were released in time for this, and there isn't (last I checked, anyway) an R9 390 listed on the Egg. Unless that is this.









~Ceadder


----------



## swiftypoison

Quote:


> Originally Posted by *Ceadderman*
> 
> I know what it says. I have it sitting on the floor right in front of me.
> 
> But am thinking this "Air" reference is hyperbole. Cause that card on the Egg looks exactly the same. CPU even mentions that the card I put up has been "optimized".
> 
> Which leads to my confusion. It's Sapphire, it's been optimized and it's air. Also these photos have been released in time for this and there isn't (last I checked anyway) an R9 390 listed on the Egg. Unless that is this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah no. Whatever you're smoking, put it down.

Also, Asus Fury X in stock:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121969


----------



## Ceadderman

Just looked again and that card looks *exactly* like the one referred to in CPU.

So while I am likely wrong, I won't take offense and will pick my bleeding cigarette up and continue smoking it. I had stated that I could be wrong. Nothing hallucinogenic in my smokes since I get them at the local filling station.







lol

~Ceadder


----------



## p4inkill3r

I didn't even look, I just assumed they had marked it up 

Yeah, the 390 is not the Fury though, even though they look the same.


----------



## Ceadderman

Yup, I was assuming the same which is why I posted pics in order to get clarification.









~Ceadder


----------



## xer0h0ur

And people assumed when I posted the Asus Fury Strix that it was a 390. Even though it was not.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> And people assumed when I posted the Asus Fury Strix that it was a 390. Even though it was not.


When I saw the 1080/1440 benchmarks for the Fury Strix, I still thought it was a 390


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> When I saw the 1080/1440 benchmarks for the Fury Strix, I still thought it was a 390


Except the 1080p and 1440p benches aren't much better for the Fury X either. People keep claiming ROPs help at higher resolutions, but I still contend it's affecting the lower resolutions far more than 4K. Its 4K performance is still solid with only 64 ROPs, yet it's suffering badly at lower resolutions, and we know it's not a bandwidth limitation either.


----------



## littlestereo

It took less time and money for these to ship from EKWB in Slovenia to Colorado than for the rest of my loop components to ship from Florida to Colorado.









My current PSU is only rated for 800 watts (upgrading to an EVGA 1200 P2), and these cards in CrossFire are being throttled/underpowered; wall draw on the system is almost 850W during peaks (based on the UPS readout). I should have some benches up with the whole rig under water in a couple of weeks.
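A rough headroom check shows why that wall reading is alarming; the ~90% conversion efficiency assumed below is a guess for a decent supply at that load, since the real efficiency curve varies by unit:

```python
# Wall draw is measured on the AC side; the PSU's 800 W rating
# is DC output. Assuming ~90% efficiency converts between them.

def dc_load_watts(wall_watts, efficiency=0.90):
    """Estimated DC load implied by an AC wall-draw reading."""
    return wall_watts * efficiency

load = dc_load_watts(850)
print(round(load), "W of DC load against an 800 W rating")  # ~765 of 800
```

That leaves only a few percent of margin before transient spikes from two Fiji cards could trip overcurrent protection, which fits the throttling described above.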


----------



## Ceadderman

Happens like that if you choose cheapest FedEx/UPS option.

Be happy that you aren't up here in Washington State. It takes nearly 10 days for my orders to arrive from PPCs. So my last order *must* be placed in early October to make my November 1st deadline. When that hits it's time to fine tune and leak test.









~Ceadder


----------



## hyp36rmax

Quote:


> Originally Posted by *littlestereo*
> 
> It took less time and money for these to ship from EKWB in Slovenia to Colorado than the rest of my loop components to ship from Florida to Colorado
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My current PSU is only rated for 800 watts (upgrading to EVGA 1200 P2) and these cards in crossfire are being throttled/ underpowered, wall draw on the system is almost at 850 during peaks (based on UPS readout). I should have some benches up with the whole rig under water in a couple weeks.


----------



## gatygun

Quote:


> Originally Posted by *p4inkill3r*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202156&cm_re=amd_radeon_r9_fury-_-14-202-156-_-Product


Should have kept the yellow color on it.


----------



## Ceadderman

Eeeew, Lord no. I like it. And honestly, if I needed a color on it, I would just drag out the Frog Tape and some rattle cans.









~Ceadder


----------



## xer0h0ur

I would just break out my paint brushes and model paints. I hate dealing with overspray when I can be precise with a brush.


----------



## Ceadderman

Hence Frog Tape painter's tape. The stuff is as good as the 3M stuff that Bill Owens reps on MNPCTech. I love it so much. I applied some before I left Washington 3 years ago, got back last year, and finally peeled it away. No residue on my CPU block, and the surface was easily cleaned of any accumulated tarnish.









~Ceadder


----------



## Ganf

Quote:


> Originally Posted by *littlestereo*
> 
> It took less time and money for these to ship from EKWB in Slovenia to Colorado than the rest of my loop components to ship from Florida to Colorado
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My current PSU is only rated for 800 watts (upgrading to EVGA 1200 P2) and these cards in crossfire are being throttled/ underpowered, wall draw on the system is almost at 850 during peaks (based on UPS readout). I should have some benches up with the whole rig under water in a couple weeks.
> 
> 


I get the same thing from Aquatuning in Germany. Order from them and PPCs at the same time, and Aquatuning gets here a day earlier even though I'm in the same state as PPCs. It's glorious.


----------



## Maximization

Well guys, I committed...
See ya in a month, hahahaha. Hope my 4960X can handle the strain.

(Attached: Capture.PNG)


----------



## akumaburn

That graphics score.. http://www.3dmark.com/fs/5347923


----------



## Maximization

was that stock?


----------



## rv8000

Quote:


> Originally Posted by *Maximization*
> 
> was that stock?


No, stock is around 16.3k GPU score, sometimes 300 points higher with the newer 15.7 drivers; it really seems to differ from setup to setup, tbh. I scored around 17.6k GPU score on the 15.15 betas @ 1150/525. He may have a golden card on his hands; from that score I'd guess he could be around 1180/600, unless he found a way to trick his submission and has some form of tessellation setting tweak.


----------



## akumaburn

Quote:


> Originally Posted by *Maximization*
> 
> was that stock?


According to 3DMark, it was a stock score.

Maybe it has to do with him being on Windows 8.1?


----------



## Ganf

Quote:


> Originally Posted by *akumaburn*
> 
> According to 3DMark, it was a stock score.
> 
> Maybe it has to do with him being on windows 8.1?


That would only be to his detriment. 3dmark reads the clocks wrong all of the time. I swear there are more OC'd submissions listed under stock clocks than there are listed at their true clock settings.


----------



## royfrosty

Hi guys, I have a minor issue.

After moving my Fury X to the X99S SLI Plus, I can't enable fast boot after a fresh install of Windows 8.1.

Each time I enable Windows 8.1 features in the BIOS to enable either MSI Fast Boot or Fast Boot, after saving the configuration and restarting, it shows:

"System bios detected a non windows 8 logo graphic card.
there is no GOP (graphics output protocol) support detected on this card.
windows 8 feature settings in bios will be changed to disabled.

Press f1 to run setup. F2 to load default values"

Any idea?


----------



## Ganf

Quote:


> Originally Posted by *royfrosty*
> 
> Hi guys, i have a minor issue.
> 
> After porting my fury X to the x99 sli plus. I cant enable fast boot after i fresh installed Windows 8.1.
> 
> Each time i enable windows 8.1 features in BIOS to enable either msi fast boot or fast boot. After saving the BIOS configuration and restart, it shows....
> 
> "System bios detected a non windows 8 logo graphic card.
> there is no GOP (graphics output protocol) support detected on this card.
> windows 8 feature settings in bios will be changed to disabled.
> 
> Press f1 to run setup. F2 to load default values"
> 
> Any idea?


Sounds like something in the MB BIOS doesn't play well with the card; likely fast boot doesn't have the GPU up high enough in the boot order, so when it goes to verify that the card is working, the GPU hasn't even finished going through its own BIOS checks yet.

Maybe? I've never encountered this problem. What's your MB?


----------



## royfrosty

Quote:


> Originally Posted by *Ganf*
> 
> Sounds like something in the MB Bios doesn't play well with the card, likely that fast boot doesn't have the GPU up high enough in the boot order, so when it goes to verify that the card is working the GPU hasn't even finished going through it's own Bios checks yet.
> 
> Maybe? I've never encountered this problem. What's your MB?


I'm using the MSI X99S SLI Plus.


----------



## Neon Lights

Does anyone know what would have to be done to edit a Fury X BIOS?


----------



## Gdourado

Done.
No pump noise, no coil whine.
The system is almost totally silent.
What a difference from my 290x!



Cheers!


----------



## magicc8ball

Quote:


> Originally Posted by *Gdourado*
> 
> Done.
> No pump noise, no coil whine.
> The system is almost totally silent.
> What a difference from my 290x!
> 
> 
> 
> Cheers!


Looks very good Gdourado. The red and black theme is spot on.


----------



## Alastair

Quote:


> Originally Posted by *magicc8ball*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gdourado*
> 
> Done.
> No pump noise, no coil whine.
> The system is almost totally silent.
> What a difference from my 290x!
> 
> 
> 
> Cheers!
> 
> 
> 
> Looks very good Gdourado. The red and black theme is spot on.

I want to see two of these in a loop!!!!!


----------



## Maximization

Quote:


> Originally Posted by *Alastair*
> 
> I want to see two of these in a loop!!!!!


ditto


----------



## Gdourado

Quote:


> Originally Posted by *magicc8ball*
> 
> Looks very good Gdourado. The red and black theme is spot on.


Thanks!


----------



## Gdourado

On another note...
Any benefit on increasing the power limit on Afterburner?

Cheers!


----------



## By-Tor

Glad to see someone added a DVI out to a Fury card.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121975


----------



## Maximization

Overclocking AMD cards has always required the full power limit; this is such a new beast, it's exciting.


----------



## Bill D

Anyone seen anywhere whether a Fury X and a Sapphire Tri-X Fury will work in CFX, and how well at 4K?

My MAGNUM TX10-D is too big to mount the second rad anywhere other than on a post inside









and I want to run them for a bit before I switch to EK blocks

thanks


----------



## Agent Smith1984

Quote:


> Originally Posted by *Bill D*
> 
> anyone see anywhere if fury x and a Sapphire Tri-X fury will work in cfx and how well at 4k
> 
> my MAGNUM TX10-D is too big to mount the second rad other than on a post inside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and want to run them for a bit before I switch to EK blocks
> 
> thanks


Won't the slower card just limit the faster one? Unless the asymmetric rendering thing is really happening soon, I wouldn't advise it...
Of course, based on initial benchmarks, the difference in shaders is not having a big impact on the Fury's performance, so even in DX11, it may not hold the X back by much....


----------



## Ganf

Quote:


> Originally Posted by *Bill D*
> 
> anyone see anywhere if fury x and a Sapphire Tri-X fury will work in cfx and how well at 4k
> 
> my MAGNUM TX10-D is too big to mount the second rad other than on a post inside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and want to run them for a bit before I switch to EK blocks
> 
> thanks


Can't sandwich the rads together?


----------



## Bill D

Quote:


> Originally Posted by *Ganf*
> 
> Can't sandwich the rads together?


would hit my memory on my R4E


----------



## Maximization

2 Furys in crossfire outperform Titan X SLI






I got a Mountain Mods pedestal attached to my Raven 3 case... you need rad space for this stuff. I plan to order blocks after using them in stock config.


----------



## Ganf

Quote:


> Originally Posted by *Bill D*
> 
> would hit my memory on my R4E


Bench grinder on the memory heatsink?









I'm out of ideas, unless you wanted to pick up a PCI-e extender and put that card on the other side.









In all likelihood the cards will crossfire (same die, same architecture), but I wouldn't purchase one without confirmation, and confirmation will likely be a long time coming.


----------



## Bill D

Quote:


> Originally Posted by *Ganf*
> 
> Bench grinder on the memory heatsink?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm out of ideas, unless you wanted to pick up a PCI-e extender and put that card on the other side.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all likelihood the cards will crossfire, same die, same architecture, but I wouldn't purchase one without confirmation and confirmation will likely be a long time in the coming.


post it is then


----------



## Ceadderman

That's one way to geterdun.









~Ceadder


----------



## Thoth420

Quote:


> Originally Posted by *Gdourado*
> 
> Done.
> No pump noise, no coil whine.
> The system is almost totally silent.
> What a difference from my 290x!
> 
> 
> 
> Cheers!


Looks great and glad you got a good one!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Bill D*
> 
> post it is then


OH MAN!

Where did you find that at??


----------



## Bill D

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OH MAN!
> 
> Where did you find that at??


they have them for all their cases

http://www.caselabs-store.com/hd-vertical-accessory-mounts-pricing-varies/

http://www.caselabs-store.com/120-1-120-fan-radiator-mount/
http://www.caselabs-store.com/fan-attachments-120mm-pricing-varies/

I wish they had 140mm fan mounts


----------



## xer0h0ur

That post mount is a pretty ingenious idea.


----------



## Ceadderman

Quote:


> Originally Posted by *Bill D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> OH MAN!
> 
> Where did you find that at??
> 
> 
> 
> they have them for all their cases
> 
> http://www.caselabs-store.com/hd-vertical-accessory-mounts-pricing-varies/
> 
> http://www.caselabs-store.com/120-1-120-fan-radiator-mount/
> http://www.caselabs-store.com/fan-attachments-120mm-pricing-varies/
> 
> I wish they had 140mm fan mounts

Try a 120 to 140 adapter?









~Ceadder


----------



## Gdourado

What Core OC are you getting?
Still no voltage control?
What do you use to test stability?


----------



## josephimports

Quote:


> Originally Posted by *Gdourado*
> 
> What Core OC are you getting?
> Still no voltage control?
> What do you use to test stability?


*What Core OC are you getting?* GPU1 max 1125/550, GPU2 1150+/600+
*Still no voltage control?* Not yet.
*What do you use to test stability?* Games (GTAV, BF4) and benchmarks (3DMark, Valley).


----------



## Clockster

Quote:


> Originally Posted by *josephimports*
> 
> *What Core OC are you getting?* GPU1 max 1125/550, GPU2 1150+/600+
> *Still no voltage control?* Not yet.
> *What do you use to test stability?* Games (GTAV, BF4) and benchmarks (3DMark, Valley).


This is so funny. I know of someone who also bought a card at release and then grabbed a 2nd one a couple of days ago, and the 2nd one overclocks much better than the 1st... very weird.

Also, I'm still waiting for my damn second card to arrive


----------



## Gdourado

I will launch Unigine in windowed mode and then start increasing the core until artifacts start to show up.
Let's see how it goes...

Cheers!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Clockster*
> 
> This is so funny, I know of someone who also bought a card at release and then grabbed a 2nd one a couple of days ago and the 2nd one overclocks much better than the 1st...very weird.
> 
> Also I'm still waiting for my damn second card to arrive


Normally I'd chalk that up to temps on card 1 associated with crossfire, but with the watercooling on these things, that's not the case at all.









I can see the core results varying in the 1100-1150 range for Fury/Fury X but one thing I don't understand is why the HBM clocks are varying so much....

If they were 1500MHz chips varying by 50MHz I could see it, but the clock speed of 500MHz is so low....

Strange that some cards are doing 550 while others are easily hitting 600MHz.... Hitting 600MHz on those cards is a 20% increase in bandwidth; that's pretty significant.
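For context, peak memory bandwidth scales linearly with the memory clock, so the 500 to 600MHz jump works out exactly as stated. A minimal sketch (assuming HBM1's 4096-bit bus and double data rate; the function name is just for illustration):

```python
def mem_bandwidth_gbs(clock_mhz, transfers_per_clock, bus_bits):
    """Peak memory bandwidth in GB/s: clock x transfers per clock x bus width in bytes."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

# Fiji's HBM1: 4096-bit bus, double data rate
stock = mem_bandwidth_gbs(500, 2, 4096)  # 512.0 GB/s
oc = mem_bandwidth_gbs(600, 2, 4096)     # 614.4 GB/s
print(f"{stock:.1f} -> {oc:.1f} GB/s, +{oc / stock - 1:.0%}")  # 512.0 -> 614.4 GB/s, +20%
```

The same formula with GDDR5's four transfers per clock and a 512-bit bus gives the Hawaii-class numbers discussed elsewhere in the thread.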


----------



## Ganf

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Normally I'd chock that up to temps on card 1 associated with crossfire, but with the watercooling on these things, that's not the case at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can see the core results varying in the 1100-1150 range for Fury/Fury X but one thing I don't understand is why the HBM clocks are varying so much....
> 
> If they were 1500MHz chips varying by 50MHz I could see it, but the clock speed of 500MHz is so low....
> 
> Strange that some cards are doing 550, and others are easily hitting 600MHz.... Hitting 600MHz on those cards is a 20% increase in bandwidth, that's pretty significant.


This is HBM's beta test. They're proving its reliability and stability before putting it into full swing and taking it to the enterprise market. Low, locked-down clocks give them the feedback data they need to take to their engineers, who then write up specs to impress the enterprise buyers, so that the marketing team can start selling the next lineup of FirePros before they ever go into production.


----------



## Agent Smith1984

I can't believe you guys still have no voltage control....
I am dying to see how these cores react to juice.


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can't believe you guys still have no voltage control....
> I am dying to see how these cores react to juice.


We need proper WIN 10 drivers first.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> We need proper WIN 10 drivers first.


I was under the impression that the 15.20 driver was doing fine in Win 10?


----------



## Cyants

I will be part of the club someday, but right now I can't get a hold of one in Canada; they seem to be on sale for less than an hour at a time before they are sold out.


----------



## Ganf

Quote:


> Originally Posted by *Jflisk*
> 
> We need proper WIN 10 drivers first.


GTA V the only thing you're having Crossfire problems with? That isn't going to get better any time soon, no matter how much they polish the drivers.


----------



## Agent Smith1984

Has anyone verified whether or not the Fury Pro runs on lower voltage than the X??

Seems like all the Furys are hitting 1090-1110 OC, while the X's are in the 1125 to 1150 range...

If that's the result of binning, then I don't expect much headroom left in the core, even with voltage; however, if the VID of the core on the Fury (non-X) card is lower, then OC headroom with voltage control could be very promising.....


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Has anyone verified whether or not Fury pro runs on lower voltage that the X??
> 
> Seems like all the Furies are hitting 1090-1110 OC, while the X's are in the 1125 to 1150 range...
> 
> If that's the result of binning, then I don't expect much headroom left in the core, even with voltage, however, if the VID of the core on the Fury (non X) card is lower, then OC headroom with voltage control could be very promising.....


I'll know in 2-3 hours


----------



## Jflisk

Quote:


> Originally Posted by *Ganf*
> 
> GTA V the only thing you're having Crossfire problems with? That isn't going to get better any time soon, no matter how much they polish the drivers.


No. With the 15.7 Win 10 drivers, if I set the power options to turn off the monitor after an hour, my system blue-screen crashes. So I went back to the non-Windows 10 15.7 driver and, in power options, turned off "turn off monitor after 1 hour" (or x amount of time); I have not had a problem since, so it's either the driver or the power option. I am willing to bet that after the 29th, if enough people complain about it, AMD will fix it.


----------



## Ganf

Quote:


> Originally Posted by *Jflisk*
> 
> No with the 15.7 WIN 10 drivers if I set the power options to turn off the monitor after and hour my system blue screen crashes. So I went back to 15.7 non Windows 10 and went into power options turned off - Turn off monitor after 1 hour or x amount of time have not had a problem since so its either the driver or the power option.I am willing to bet after the 29th if enough people complain about it AMD will fix it.


I think that's a Windows problem. It was one of the W10 builds that did it for me: I started getting black screen lockups, and then a couple builds later my monitor just started refusing to go to sleep, and it's been that way for about a month now. Pretty sure none of that happened in relation to my driver updates though.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Has anyone verified whether or not Fury pro runs on lower voltage that the X??
> 
> Seems like all the Furies are hitting 1090-1110 OC, while the X's are in the 1125 to 1150 range...
> 
> If that's the result of binning, then I don't expect much headroom left in the core, even with voltage, however, if the VID of the core on the Fury (non X) card is lower, then OC headroom with voltage control could be very promising.....


Fury X - 1.212 V
Fury - 1.169 V
Sapphire Fury OC - 1.212 V

From AnandTech


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Fury X - 1.212 V
> Fury - 1.169 V
> Sapphire Fury OC - 1.212 V
> 
> From AnandTech


That's somewhat bad news....

All the Tri-X Fury reviews show it hitting around 1090MHz. If it's doing that on the same voltage as the Fury X, then that means a) these cores take pretty well to lower temps (don't they all?) and/or b) the silicon sucks and they need to be binned for a measly 50MHz.... And if that last part is the case, then the OC ceiling on these cards, even with voltage, is going to be very similar to Hawaii.... Luckily it's got a LOT of shaders to put any additional MHz gained to work; however, looking at Fury vs Fury X performance, those shaders aren't nearly as important as the ROPs.... That also explains why the 390X is right on its tail at 1080 and 1440.

That's just all theory though... I could be totally wrong, and these things may take to voltage like a fat kid at a Chinese buffet....


----------



## xer0h0ur

Who cares what the clock speed of HBM is? Ultimately what matters is the bandwidth produced, and it's a hell of a lot more than your 390's. The reason HBM has been hitting varying clocks is that AMD originally intended to run the HBM at a higher clock speed; I can't remember if it was supposed to be 550 or 600MHz. Either this backfired on them, or the current HBM yields simply weren't able to handle those higher clock speeds consistently, which is more likely.

The main difference I am noting between Hawaii and Fury, though, is the crossfire scaling. That Fury / Fury X crossfire scaling is ridiculous. Hawaii has never scaled that well.


----------



## Ganf

Quote:


> Originally Posted by *xer0h0ur*
> 
> Who cares about what the clock speed of HBM is. Ultimately what matters is the bandwidth produced and its a hell of a lot more than your 390. The reason HBM has been hitting varying clocks is because AMD originally intended to run the HBM at a higher clock speed, I can't remember if it was supposed to be 550 or 600MHz. This must have backfired on them or simply the current HBM yields weren't able to handle those higher clock speeds consistently which is more likely.
> 
> The main difference I am noting between Hawaii and Fury though is the crossfire scaling. That Fury / Fury X crossfire scaling is ridiculous. Hawaii has never scaled that well.


HBM clock speed can actually be a bottleneck in huge parallel processing tasks: tessellation and physics, namely. When the Fiji die starts to run full tilt, user benchmarks have shown that OCing the memory provides some pretty decent boosts.


----------



## blue1512

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's somewhat bad news....
> 
> All the Tri-X fury reviews show it hitting around 1090MHz.. if it's doing that on the same voltage as the Fury X, then that means a) These cores take pretty well to lower temps (don't they all?
> 
> 
> 
> 
> 
> 
> 
> ) and/or b) the silicon sucks and they need to be binned for a measly 50MHz.... and if that last part is the case, then then OC ceiling on these cards, even with voltage, is going to be very similar to Hawaii.... Luckily it's got a LOT of shaders to put any additional MHz gained to work, however, looking at Fury VS Fury X performance, those shaders aren't nearly as important as the ROP's.... That also explains why 390X is right on it's tail at 1080 and 1440.
> 
> That's just all theory though... I could be totally wrong, and these things may take to voltage like a fat kid at a Chinese buffet....


No, AnandTech's Sapphire Fury hit 1125MHz.
http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/18


----------



## xer0h0ur

AMD is way behind on tessellation and particle effect processing but I don't attribute that to HBM even in the least.


----------



## swiftypoison

I found an open-box Fury X at my local Microcenter. Hopefully I'll go pick it up today so I can bench it against my 980 Kingpin.


----------



## Ha-Nocri

Quote:


> Originally Posted by *swiftypoison*
> 
> I found a open box Fury X at my local Microcenter. hopefully ill go pick it up today so I can bench it against my 980 Kingpin


what resolution?


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Who cares about what the clock speed of HBM is. Ultimately what matters is the bandwidth produced and its a hell of a lot more than your 390. The reason HBM has been hitting varying clocks is because AMD originally intended to run the HBM at a higher clock speed, I can't remember if it was supposed to be 550 or 600MHz. This must have backfired on them or simply the current HBM yields weren't able to handle those higher clock speeds consistently which is more likely.
> 
> The main difference I am noting between Hawaii and Fury though is the crossfire scaling. That Fury / Fury X crossfire scaling is ridiculous. Hawaii has never scaled that well.


Well... I manage around 434GB/s @ 1750MHz, so I'm pretty well covered on bandwidth, but still a good bit behind Fury.









Crossfire scaling on Fury is really good, but from what I've seen, it has also improved quite a bit on Hawaii now....

The initial Fury X CF results were mostly on 15.15 drivers, which were proprietary for 300/Fury, and AMD had already implemented some tess improvements, and draw call improvements in that driver. It gave the Fury a chance to look great in crossfire right out of the gate.
Now that those improvements are part of 15.7, everyone using crossfire should see improved scaling.


----------



## Agent Smith1984

Quote:


> Originally Posted by *blue1512*
> 
> No, Anantech's Sapphire Fury hit 1125MHz.
> http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/18


They hit 1125 on the ref Fury X; the Sapphire only did 1100 according to the chart.


----------



## swiftypoison

Quote:


> Originally Posted by *Ha-Nocri*
> 
> what resolution?


At 1080p most likely. I may pick up a 1440p monitor while I am there, but not sure.


----------



## xer0h0ur

This is particularly impressive: http://iyd.kr/753

In fact it's scaling so well that even when the single cards lose to the 980 Ti, they catch up and pass it in crossfire. I wish this guy had done the same testing with 390X's just for shiz and giggles.


----------



## Ha-Nocri

Quote:


> Originally Posted by *swiftypoison*
> 
> At 1080p most likely. I may pick up a 1440p monitor while I am there, but not sure.


Well, you have VSR and whatever the NV equivalent is called


----------



## Nizzen

It is so boring to own a Fury X and not overclock higher than 1125MHz. If MSI Afterburner or Sapphire TriXX don't support voltage control on the FX soon, I'll throw it in the garbage


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nizzen*
> 
> It is so boring to own an Fury X and not overclocking higher than 1125 mhz. If not MSI afterburner or sapphire trixx supporting voltagecontrol soon on FX, I'll trow it in the garbage


Uhhhh

I'll take it off your hands pal!!


----------



## xer0h0ur

LOL, get rekt. You spend $3000 on Titan X's but that $650 Fury X is the card making you angry. Seems legit.


----------



## boredmug

Quote:


> Originally Posted by *Nizzen*
> 
> It is so boring to own an Fury X and not overclocking higher than 1125 mhz. If not MSI afterburner or sapphire trixx supporting voltagecontrol soon on FX, I'll trow it in the garbage


I'll trade you my two blocked 290x's and you can play with overclocking all you want. ;-)


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> This is particularly impressive: http://iyd.kr/753
> 
> In fact its scaling so well that even when single cards lose to 980 Ti they catch up and pass up in crossfire. I wish this guy had done the same testing with 390X's just for shiz and giggles.


Take a look at this:
http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1638&page=2

390 scaling like crazy in the titles that use crossfire... that is the 15.15 driver, so 15.7 may make an even better showing.


----------



## xer0h0ur

FWIW, I see far more people saying 15.15 was better than 15.20, which is the same thing as 15.7. So I doubt 15.7 would net better results.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> FWIW I see far more people saying 15.15 was better than 15.20 which is the same thing as the 15.7. So I doubt the 15.7 would net better results.


That's strange, 'cause I have noticed no difference on 15.7. I don't play Witcher 3 either, though, and that was supposed to be the title most hurt by 15.20.

Either way, 15.7 did bring improvements to Hawaii, so everyone on 290/x/295x2 who went to 15.7 should see big crossfire improvements.


----------



## xer0h0ur

There wasn't exactly a huge difference in FS/Extreme/Ultra, at least. I never make note of my FPS in games, so I can't really speak to that. Ultimately all I care about is fluidity and keeping the frametime down. AMD needs more driver work on crossfire's frametimes.


----------



## Ganf

Quote:


> Originally Posted by *xer0h0ur*
> 
> There wasn't exactly a huge difference in FS/Extreme/Ultra at least. I never make note of my FPS in games to compare that so can't really speak as to that. Ultimately all I care about is fluidity and keeping the frametime down. AMD needs more driver work on crossfire's frametimes.


More work on frametimes period. They're getting better but they're not perfect.


----------



## xer0h0ur

Yup. I wish AMD prioritized their driver development team far higher than they do. I realize they had to fire people and restructure but you can't skimp on your driver team.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> There wasn't exactly a huge difference in FS/Extreme/Ultra at least. I never make note of my FPS in games to compare that so can't really speak as to that. Ultimately all I care about is fluidity and keeping the frametime down. AMD needs more driver work on crossfire's frametimes.


So you didn't see a big increase in FireStrike with the 15.7 driver?

I thought everyone with a 290 got some big gains from the tess improvements.

Most people were furious with AMD for making the 15.15 driver proprietary, and wanted the notable improvements being seen on the 390's to be incorporated into a universal Radeon driver, so that's what the 15.20 (15.7 package) was meant to be.

I noticed huge differences coming from my 290 to the 390, obviously from the clock speed improvements, and the driver...

My 290 crossfire setup performed admirably, but once I add my second 390 this month, it should be a good bit faster. Between the overclocking improvements, and the driver improvements, I'm guessing around 15-20% faster.


----------



## xer0h0ur

15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
15.7 Firestrike Extreme: http://www.3dmark.com/fs/5358452 13290 overall score 16370 graphics score
15.7 Firestrike: http://www.3dmark.com/fs/5358521 22159 overall score 34095 graphics score

Wish I had tested FS Extreme and standard FS with the 15.6 to give comparisons there too but I didn't since 4K gaming is my concern.


----------



## rv8000

Stock run: *weird, for some reason it didn't save under my results; GPU score was around 15.3k. I'll have to rerun later.
1110/550: http://www.3dmark.com/fs/5421687

Was seeing a consistent 1.200-1.212v under load, so about 20mV higher than my Fury X. Driver crash @ 1125; only pushed up to 1110, might have an extra 5 to 10MHz, we'll see. Cranked the fans up just to ensure there were no temp issues. I will check the secondary BIOS later, as it does have an increased TDP; interesting that my Fury X would artifact on the core when unstable, while the Fury just seems to driver-reset instead of showing instability first.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> 15.6 Firestrike Ultra: http://www.3dmark.com/fs/5055862 7437 overall score 7905 graphics score
> 15.7 Firestrike Ultra: http://www.3dmark.com/fs/5346396 7571 overall score 8100 graphics score
> 15.7 Firestrike Extreme: http://www.3dmark.com/fs/5358452 13290 overall score 16370 graphics score
> 15.7 Firestrike: http://www.3dmark.com/fs/5358521 22159 overall score 34095 graphics score
> 
> Wish I had tested FS Extreme and standard FS with the 15.6 to give comparisons there too but I didn't since 4K gaming is my concern.


Nice scores man, your rig is an absolute powerhouse as it sits.... You should have 4K pretty well covered for a while, provided you aren't hitting a VRAM limitation in anything.


----------



## xer0h0ur

Yeah this rig isn't getting any more changes other than perhaps adding an Intel NVMe SSD. Unless ASUS ends up deciding to support NVMe boot drives on its X79 boards at which point I will drop this motherboard like a bad habit.


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So you didn't see a big increase in FireStrike with the 15.7 driver?
> 
> I thought everyone with a 290 got some big gains from the tess improvements.
> 
> Most people were furious with AMD for making the 15.15 driver proprietary, and wanted the notable improvements being seen on the 390's to be incorporated in a universal Radeon driver, so that's what the 15.2 (15.7 package) was meant to be.
> 
> I noticed huge differences coming from my 290 to the 390, obviously from the clock speed improvements, and the driver...
> 
> My 290 crossfire setup performed admirably, but once I add my second 390 this month, it should be a good bit faster. Between the overclocking improvements, and the driver improvements, I'm guessing around 15-20% faster.


Seems I saw the same thing with my 7950... benchmarks showed poor scores at first, and in the end, some years later, it seems to beat a 680.


----------



## POOTYTANGASAUR

How long till the Fury and Fury X get full voltage control? Hopefully memory control as well. I wanna see these cards posting scores that'll push Nvidia.


----------



## Noufel

The Nano is confirmed for August








http://www.guru3d.com/news-story/amd-confirms-radeon-r9-nano-launching-in-august.html
I hope it will launch with optimized drivers, especially with Win 10


----------



## Gumbi

Quote:


> Originally Posted by *Noufel*
> 
> The Nano is confirmed for August
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/news-story/amd-confirms-radeon-r9-nano-launching-in-august.html
> I hope it will launch with optimized drivers, especially with Win 10


That reporter is very misleading. The card will not provide twice the performance of a 290X; rather, due to the reduced PCB size, twice the "performance density" and twice the efficiency of the 290X.

i.e. it's a 290X with a 175 W TDP and a small form factor.

Cooling will suck though; I wish they'd mounted a decent cooler on it. That way one would get the equivalent 290X performance of something like 1200/1600 if overclocked, and it'd be fantastic!
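To put rough numbers on the efficiency claim: a minimal sketch, assuming the Nano really does deliver 290X-level performance at its 175 W TDP, and taking ~290 W as the 290X's typical board power (that 290 W figure is an assumption on my part, not an official spec):

```python
# If performance is equal, the perf/W ratio is just the power ratio.
R9_290X_POWER_W = 290.0   # assumed typical board power, not an official TDP
NANO_TDP_W = 175.0        # AMD's stated figure for the Nano

perf_per_watt_ratio = R9_290X_POWER_W / NANO_TDP_W
print(f"perf/W vs 290X: {perf_per_watt_ratio:.2f}x")
```

That lands around 1.66x, so the "2x efficiency" claim implies the Nano either beating the 290X outright or being measured against a higher 290X power draw.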


----------



## Munkypoo7

I'm kinda butthurt I can't seem to purchase a Fury. Newegg has them listed as sold out / auto notify, and Amazon doesn't even have them listed...

Any other (USA) retailers / e-tailers I should keep an eye on?


----------



## p4inkill3r

Keep an eye on this page: http://www.nowinstock.net/computers/videocards/amd/r9furyx/


----------



## Munkypoo7

That's fantastic, I can even set it to text me. Thank you so much p4inkill3r, +rep


----------



## rsiyasena

Quote:


> Originally Posted by *Bludge*
> 
> Hi mate, I'm running a Dell adapter that looks very similar to yours, I can get you the model if needed. I have the adapter running a QX2710 like you, and a Samsung U28 4K through displayport. Initially on reset the Qnix monitor would display the same lines, and needed an unplug and replug to sort. I since moved the USB power to an onboard USB 2 port, rather than a USB 3 and reconnected everything, and since then all is fine.
> 
> No idea if the USB 3 -> 2 was it, or just retightening connections, sorry.


Hey! So I figured it out, it ended up being a resolution issue and not a voltage problem. Apparently my monitor was not receiving an input from the video card because the games that I attempted to run were starting up with unsupported resolutions. I was able to verify this by VPN'ing into my computer from a different PC while the monitor was showing the no-input-signal message. The fix was pretty simple too: I went into CCC and enabled GPU Up-Scaling. That allowed me to get into the game and apply the correct launch resolution. I can finally play Wolfenstein! Woohoo.


----------



## Maximization

Quote:


> Originally Posted by *Munkypoo7*
> 
> I'm kinda butthurt I can't seem to purchase a Fury. Newegg has them listed as sold out / auto notify, and Amazon doesn't even have them listed...
> 
> Any other (USA) retailers / e-tailers I should keep an eye on?


You can back order on Amazon, that's what I did. Be prepared to wait a month.


----------



## Munkypoo7

Quote:


> Originally Posted by *Maximization*
> 
> You can back order on Amazon, that's what I did, be prepared to wait a month


That's for the Fury X; I'm aiming for the Fury, non-X, just because I'm a quiet air cooling fanboy. The Fury isn't listed on Amazon for preorder, so I'm using that tracking site p4inkill3r linked, and I'll check it daily for extra links to add to my following list.


----------



## Maximization

article is 16 hours old

they are nowhere

http://techreport.com/news/28651/radeon-r9-fury-and-r9-fury-x-availability-check-nope


----------



## tx12

I can't imagine Furys selling like hot cakes, so the supply must be ridiculously low.
Meanwhile in Russia, the Fury X is stocked in several big and small online shops in the capital. Prices vary from $800 to $1160 in US$ equivalent.


----------



## Agent Smith1984

That's ridiculous...

Using low supply to force limited stock, creating false exclusivity, to generate an uptick in future sales when the supply increases...

NICE!!

"Na bro.... I don't have anymore of that Kush X 99, everybody keeps buying it all, but I'm getting a lot in a month, you probably want to go ahead and get all of it when it comes in..."


----------



## BaddParrot

Tiger Direct seems to be keeping them in stock by charging an extra $50 ($698). I refuse!

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9781592&Sku=


----------



## Forceman

Quote:


> Originally Posted by *BaddParrot*
> 
> Tiger Direct seems to be keeping them in stock by charging an extra $50 ($698). I refuse!
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9781592&Sku=


Or not. Out of stock.


----------



## xer0h0ur

Quote:


> Originally Posted by *rsiyasena*
> 
> Hey! So I figured it out, it ended up being a resolution issue and not a voltage problem. Apparently my monitor was not receiving an input from the video card because the games that I attempted to run were starting up with unsupported resolutions. I was able to verify this by VPN'ing into my computer from a different PC while the monitor was showing the no-input-signal message. The fix was pretty simple too: I went into CCC and enabled GPU Up-Scaling. That allowed me to get into the game and apply the correct launch resolution. I can finally play Wolfenstein! Woohoo.


That is actually good to know. I had never heard of something like that and one could easily blame a video card / driver for that sort of issue.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gumbi*
> 
> That reporter is very misleading. The card will not provide twice the performance of a 290X; rather, due to the reduced PCB size, twice the "performance density" and twice the efficiency of the 290X.
> 
> i.e. it's a 290X with a 175 W TDP and a small form factor.
> 
> Cooling will suck though; I wish they'd mounted a decent cooler on it. That way one would get the equivalent 290X performance of something like 1200/1600 if overclocked, and it'd be fantastic!


You don't get it at all. The Nano is aimed at a market segment that doesn't have anything even remotely as powerful available in that small a form factor. It's perfectly fine as is with that cooler. The part that is truly exciting about this card is the rumor that it may have a fully unlocked Fiji XT die. If that is the case, then people like me who have no interest in mini ITX builds can slap a waterblock on a Nano and essentially turn it into a cheaper Fury X.


----------



## BaddParrot

Quote:


> Originally Posted by *Forceman*
> 
> Or not. Out of stock.


Understood, but they have had them in stock 5x since yesterday alone. Wednesday the 15th they seemed to have them all day long (Sapphire brand); I just refused to pay the extra $50. I caught the Newegg batch. I was lucky.

As pointed out by other posters. Anyone looking can just use Nowinstock.net.

http://www.nowinstock.net/computers/videocards/amd/r9furyx/full_history.php

Note: Wednesday (15th) they had them from 6:23 AM till Thursday (16th) at 10:09 AM. Probably due to the price gouging.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> The part that is truly exciting about this card is the rumor that it may have a fully unlocked Fiji XT die. If that is the case then people like me which have no interest in mini ITX builds can then slap a waterblock on a Nano and essentially turn it into a Fury X that is cheaper.


If they can't keep actual full-die Fury X (or cut Fury) cards in stock, how are they going to have enough full dies to throw on yet a third card? And if it had a full die, why would it be cheaper?

I can't imagine a world where the Nanos are full chips.


----------



## xer0h0ur

Well mind you, the Nano isn't "launching" until some time next month. I can only presume they are expecting their yields to go up by then. I still think the problem with keeping stock has everything to do with HBM and not the Fiji dies themselves. The way it's currently operating, AMD is shipping dies to SK Hynix, where they are mated with the HBM and interposer, then sent back to AMD, who ships the product to the AIBs. Who is handling the PCB assembly from there on out, I have no idea.


----------



## en9dmp

Quote:


> Originally Posted by *xer0h0ur*
> 
> You don't get it at all. The Nano is aimed at a market segment that doesn't have anything even remotely as powerful available in that small a form factor. It's perfectly fine as is with that cooler. The part that is truly exciting about this card is the rumor that it may have a fully unlocked Fiji XT die. If that is the case, then people like me who have no interest in mini ITX builds can slap a waterblock on a Nano and essentially turn it into a cheaper Fury X.


If it only has one 8-pin connector, surely it's not going to be able to provide anywhere near the performance potential of the Fury X? Plus the PCB is reportedly 1.5 inches shorter (6 vs 7.5), so you won't necessarily be able to just slap a Fury X block on it... Also, it's doubtful a waterblock will even be necessary on a card with such a low TDP, so why would EK manufacture a separate block for it?


----------



## en9dmp

Would a Fury X boot with only one of the 8-pin connectors attached? If so, maybe someone can underclock it until stable to estimate Nano performance? It still makes no sense for them to use fully unlocked XT dies and underpower them like that. Like running a V12 Ferrari on 80 octane petrol.


----------



## fewness

Quote:


> Originally Posted by *en9dmp*
> 
> Would a Fury X boot with only one of the 8-pin connectors attached? If so, maybe someone can underclock it until stable to estimate Nano performance? It still makes no sense for them to use fully unlocked XT dies and underpower them like that. Like running a V12 Ferrari on 80 octane petrol.


Or these are full chips but cannot run at full speed due to some "technical" reason... It could be that the Nano is only a byproduct, a way to rescue these handicapped chips.


----------



## xer0h0ur

Quote:


> Originally Posted by *en9dmp*
> 
> If it only has one 8-pin connector, surely it's not going to be able to provide anywhere near the performance potential of the Fury X? Plus the PCB is reportedly 1.5 inches shorter (6 vs 7.5), so you won't necessarily be able to just slap a Fury X block on it... Also, it's doubtful a waterblock will even be necessary on a card with such a low TDP, so why would EK manufacture a separate block for it?


Crap, you're absolutely right about that. I forgot altogether that the Nano will only carry one PCI-E power connector. However, the Nano, like the Fury, is not reference-locked like the Fury X. So AIBs are welcome to slap on whatever cooler they want or create a non-reference PCB for it, which I suppose could end up having more than just one 8-pin power connector.

You do realize that multiple companies make waterblocks right? EK isn't the only one making Fury X / Fury blocks right now.


----------



## xer0h0ur

This basically confirms my suspicions about HBM being the root cause for inventory shortages of Fury / Fury X: http://wccftech.com/amd-hbm-ramp-expected-shortages-fury-fury/

'"Our initial ramp-up has been as expected, we're pleased with the Fury X ramp-up. Certainly the fact that it's out of stock is not a bad thing cause it gives us good confidence that the customers are appreciating the product. Fury just launched actually this week and we will be launching Nano in the August timeframe. I think overall the High Bandwidth Memory ramp-up is going as expected and we have a number of products coming out." Said Dr Lisa Su, AMD President and CEO.'


----------



## Maximization

The yields are still sucky; teething pains, that is all.

I think the Nano is where all the "sucky" ones go.


----------






## Ceadderman

Don't expect nVidia to get their grubby paws on HBM or to come out with HBM2. According to that article, AMD invented the tech. They would be mind-numbingly stupid to share it.







lulz

~Ceadder


----------



## Casey Ryback

Quote:


> Originally Posted by *Gumbi*
> 
> That reporter is very misleading. The card will not provide twuce the performance of a 290x, rather, due to reduced PCB size, twice the "performance density" and twice the efficiency of the 290X.
> 
> ie It's a 290x with a 175w tdp and small form factor.
> 
> Cooling will suck though, I wish they'd mounted a decent cooler on it. That way one would get the equivalent 290X performance of somwthing like 1200/1600 if overclocked and it'd fantastic!


Then it would no longer have 2x the performance per watt of a 290X, lol.

It's not a magical card that is far more efficient than the current Fury cards; it's the combination of low power and low clocks, and therefore staying cool, that gives it that performance per watt.

If you want a high-performance Fury, you buy the Fury X or Fury Pro.


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Don't expect nVidia to get their grubby paws on HBM or to come out with HBM 2. According to that article AMD invented the tech. They would be mind numbingly stupid to share it.
> 
> 
> 
> 
> 
> 
> 
> lulz
> 
> ~Ceadder


They co-developed HBM with SK Hynix and had begun work on it going back 7 years, I believe. They had the foresight to see that GDDR's lifecycle was drawing to a close. AMD had exclusive access to HBM1, and they retain priority access to HBM2, so Nvidia is screwed on initial yields of HBM2 since AMD will get priority over them.


----------



## swiftypoison

Weeee. Picked up an open-box Fury X from Microcenter. From the looks of it, it's pretty good. No major damage other than a few bent fins on the rad. I also picked up a 1440p Asus PB258Q monitor. I'll test it against my 980 Kingpin tonight. I don't have a lot of games other than BF4 and GTA5, so there's that. I also don't have 3DMark, so the basic version will have to do.


----------



## tx12

Quote:


> Originally Posted by *Forceman*
> 
> If they can't keep actual full die Fury X (or cut Fury) cards in stock, how are they going to have enough full dies to throw on yet a third card? And if it had a full die, why would it be cheaper? .


Why do you think the Nano would be cheaper?
It's an unrivaled product in terms of power/area, and I'm sure AMD will ask good money for it. At least, the Nano should be more expensive than the air-cooled Fury.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> They co-developed HBM with SK Hynix and had begun work on it going back 7 years I believe. They had the foresight to see that GDDR's lifecycle was drawing to a close. AMD had exclusive access to HBM1 and they retain priority access to HBM2 so Nvidia is screwed on initial yields of HBM2 since AMD will get priority over them.


I have trouble believing that Nvidia would have designed Pascal around HBM2 without being assured of their access to it.
Quote:


> Originally Posted by *tx12*
> 
> Why do you think Nano would be cheaper?
> That's unrivaled product in terms of power/area and I'm sure AMD will ask good money for it. At least, nano should be more expensive than fury air.


That's what I'm saying. I don't see why people think this is going to be a $350 (or even $450) card.


----------



## Jflisk

I have to admit this Fury X is awesome. I have been playing Crysis 3 on ultra with no problems with one card. I have the card overclocked to 1082; how are you guys overclocking the RAM to 550? Thanks.

It chews through BF4 and Hardline.
It chews through Crysis 2 and 3.
Actually, Batman Arkham Knight seems to be better with this card too. Not perfect, but playable.

I give it


----------



## djsatane

Problem is that due to shortages, the prices of these cards are only gonna go up... Also, did AMD state anything officially about having solved the coil whine issue with the pumps?


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> I have trouble believing that Nvidia would have designed Pascal around HBM2 without being assured of their access to it.
> That's what I'm saying. I don't see why people think this is going to be a $350 (or even $450) card.


I didn't say they wouldn't get HBM2 at some point, but all things being equal AMD still gets first dibs. Nvidia doesn't have an option at this point. The memory technology they originally wanted for Pascal is not ready, and they committed to HBM afterwards knowing full well it was Hynix's and AMD's baby. It's a situation completely out of their control.


----------



## Shatun-Bear

Quote:


> Originally Posted by *xer0h0ur*
> 
> I didn't say they wouldn't get HBM2 at some point but all things being equal AMD still gets first dibs. Nvidia doesn't have an option at this point. *The memory technology they originally wanted for Pascal is not ready* and they committed to HBM afterwards knowing full on well it was Hynix's and AMD's baby. Its a situation completely out of their control.


This is why I find it amusing that upon the Fury X's release, I kept seeing posts saying "just wait for Pascal instead of getting a Fury X" and intimating that Pascal is just around the corner. We're looking at Q3 2016 at the earliest for Arctic Islands and Pascal GPUs. We're stuck with Fury X, Nano, and all the non-reference Fury and 980 Ti until then. The only noteworthy release coming soon is the Fury X2. After that comes out (September-October), we'll have a long wait until the next line of cards from either AMD or Nvidia.


----------



## wanna_buy

Can anyone post a screenshot or video showing FPS in the original Crysis and Crysis Warhead with every setting set to Ultra? I couldn't find any video on YouTube.


----------



## New green

http://forums.guru3d.com/showthread.php?p=5110209#post5110209

If I am following it correctly the fury x looks like it can go up to +100mV. Unwinder still doesn't have a card.


----------



## TK421

Has anyone managed to hack the vBIOS to get a higher power limit + overvolting, and possibly HBM overclocking?

Also, does the Gentle Typhoon have a standard 4-pin or 3-pin connector?


----------



## royfrosty

Quote:


> Originally Posted by *TK421*
> 
> Has anyone managed to hack the vBIOS to get a higher power limit + overvolting, and possibly HBM overclocking?
> 
> Also, does the Gentle Typhoon have a standard 4-pin or 3-pin connector?


No, it does not have a standard 3- or 4-pin connector. If you remove the front faceplate you can see it.

You can easily solder one on yourself (DIY).


----------



## xer0h0ur

Quote:


> Originally Posted by *Shatun-Bear*
> 
> This is why I find it amusing that upon the Fury X's release, I kept seeing posts saying "just wait for Pascal instead of getting a Fury X" and intimating that Pascal is just around the corner. We're looking at Q3 2016 at the earliest for Arctic Islands and Pascal GPUs. We're stuck with Fury X, Nano, and all the non-reference Fury and 980 Ti until then. The only noteworthy release coming soon is the Fury X2. After that comes out (September-October), we'll have a long wait until the next line of cards from either AMD or Nvidia.


Precisely. After that there is nothing in the pipeline for either camp unless Nvidia decides to make a GTX 990, or whatever a dual-GPU card would be called. Given how much larger it would be than a Fury X2, I don't know if Nvidia would want to make one. Then again, I never underestimate Huang's ego. He may want to deliver a kill shot, not caring about the size or power consumption of the card, just for the sake of being able to say they still have the world's strongest graphics card.


----------



## swiftypoison

Some updates:
Stock Fury X
http://www.3dmark.com/fs/5435280

980 Kingpin
http://www.3dmark.com/fs/5434805

A 5% difference.

The open-box card did have coil whine like crazy, so I am going to return it tomorrow and wait for a good batch. Material-wise, the Fury X is super nice! The fan is amazing! Running at 4000 RPM you can't even hear it.


----------



## Cool Mike

Received my new Sapphire Fury x yesterday.

Ran Valley @ 4K ultra settings for 30 minutes. Core - 1125, Memory - 550, Fan - 35%, Max Temp - 54C

Very quiet, no coil noise.


----------



## boi801

Can I join the club?

Got a Sapphire Fury X today and here are my first results:

core - 1130
mem - 610
Valley Benchmark 1.0
Extreme HD Preset
-FPS 84.0
-Score 3515
-min 35.6
-max 163.1

CPU: i7 [email protected]
Windows 10
15.200.1023.10 WHQL

Now waiting for voltage control!


----------



## Jflisk

How do we open the memory with these. Thanks

Never mind found it anyone looking for it can be found here
http://wccftech.com/unlock-memory-overclocking-amd-r9-fury/


----------



## Gumbi

Quote:


> Originally Posted by *boi801*
> 
> Can I join the club?
> 
> Got a Sapphire Fury X today and here are my first results:
> 
> core - 1130
> mem - 610
> Valley Benchmark 1.0
> Extreme HD Preset
> -FPS 84.0
> -Score 3515
> -min 35.6
> -max 163.1
> 
> CPU: i7 [email protected]
> Windows 10
> 15.200.1023.10 WHQL
> 
> Now waiting for voltage control!


Jeez, that's some pretty legit overclocking given there's no voltage control yet. I think you're looking at 1200 plus mhz when it's all said and done!


----------



## boi801

Quote:


> Originally Posted by *Gumbi*
> 
> Jeez, that's some pretty legit overclocking given there's no voltage control yet. I think you're looking at 1200 plus mhz when it's all said and done!


Sorry... too soon, not stable for gaming, crashing after 30 min of gameplay...


----------



## Cool Mike

I'm hitting 600 on the memory also. I had to back it off to 575 because after running Valley @ 4K Ultra the greens begin to turn blue after 15 minutes or so. It never locked up, just blue colors. Could this be heat-related? My fans are at 35%.

Update: the Fury X really responds to added fan speed. Bumped the fan speed to 40% and now running 600 MHz on the memory, Valley and Far Cry 4 @ 4K stable. Now, if we could add some core voltage we would be off and running.


----------



## Jflisk

Quote:


> Originally Posted by *Cool Mike*
> 
> I'm hitting 600 on the memory also. Had to back it off to 575 because after running valley @ 4K Ultra the greens begin to turn Blue after 15 minutes or so. Never locked up, just blue colors. Could this be heat related? My fans are at 35%.


That's your RAM.


----------



## Maximization

Is EK, Aquacomputer, or Koolance better for Fury X blocks? Or is it too soon to tell?


----------



## Ceadderman

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> They co-developed HBM with SK Hynix and had begun work on it going back 7 years I believe. They had the foresight to see that GDDR's lifecycle was drawing to a close. AMD had exclusive access to HBM1 and they retain priority access to HBM2 so Nvidia is screwed on initial yields of HBM2 since AMD will get priority over them.
> 
> 
> 
> I have trouble believing that Nvidia would have designed Pascal around HBM2 without being assured of their access to it.
> Quote:
> 
> 
> 
> Originally Posted by *tx12*
> 
> Why do you think Nano would be cheaper?
> That's unrivaled product in terms of power/area and I'm sure AMD will ask good money for it. At least, nano should be more expensive than fury air.
> 
> Click to expand...
> 
> That's what I'm saying. I don't see why people think this is going to be a $350 (or even $450) card.
Click to expand...

I have trouble believing they have access to HBM2 for Pascal at all, considering HBM2 hasn't hit the engineering table that we know of. AMD owns the rights to the tech, and if nVidia gets access to it before AMD does, expect to see headlines of "AMD to sue Hynix".

~Ceadder


----------



## flopper

Quote:


> Originally Posted by *Ceadderman*
> 
> I have trouble believing they have access to HBM2 for Pascal at all. Considering HBM2 hasn't hit the engineering table that we know of. AMD owns the rights to the tech and if nVidia gets access to it before AMD gets to it, expect to see headlines of "AMD to sue Hynix".
> 
> ~Ceadder


If you expect Nvidia to start fresh from scratch once HBM2 goes into production, you won't have cards until 2018.
They know the specs beforehand, and smaller samples reach them way before production starts.


----------



## alcal

Just built my new rig with a 5930K and Fury X in a Node 804.



http://www.techpowerup.com/gpuz/details.php?id=mx9q2

Sign me up for the fun.

Haven't started gaming or benching yet because I was an idiot and didn't get a DisplayPort -> dual-link DVI adapter, so my 30q5 is out of action for the moment.

I have the grinding pump sound, but I frankly don't care too much -- will probably shake it around to see if it's an air bubble issue I can deal with by applying brute force.


----------



## alcal

Quote:


> Originally Posted by *flopper*
> 
> if you expect nvidia to go fresh start from scratch once HBM2 goes into production you wont have cards until 2018.
> they know the specs before hand and smaller samples reach out way before production starts.


If Nvidia were as rich/smart as Intel, they would have made their own HBM implementation, called it "FragRAM," released their own line of DDR3/DDR4 desktop memory, and then made FragRAM require you to have Nvidia's desktop DIMMs installed to function.









In all seriousness though, I bet the only way they get their hands on HBM2 in a timely fashion is if they cough up a hefty fee for it. IP ain't free.


----------



## Superplush

Am I the only one waiting on the Fury X2? Given how the 295X2 held up, I'm interested to see the stats on that thing. The Nano looks 'cute'. Still AMD all the way, not like those crappy Nvidia fryers!


----------



## Alastair

So, I have gone ahead and ordered two Sapphire Furys. When I say ordered, I told my friend up there in the US to buy two as soon as he sees them in stock.









Now. Here are some questions and concerns I have.

Firstly, I plan on using my universal GPU blocks on my Furys.
Is there anything you guys are doing with your blocks to protect the HBM? 'Cause I hear it sits at a slightly different height to the rest of the GPU. Or do you just slap the block on and be done with it? I am using Heatkiller GPU-X3's (the ones without expandable VRM cooling).

Since I am going GPU-block-only, I need to make a plan to cool my VRMs. Any suggestions as to what to do about VRM cooling?

Also, I have a BeQuiet Dark Power Pro P10 850 W power supply. It's basically a rebadged Seasonic Platinum.

Now, if you just work on the stock TDP of 275 W per card, that's 550 W.

That leaves me with 300 W. Well, my 5 GHz 8350 probably consumes close to that. So that probably means I am going to be very limited in OC'ing my new cards, or no overclocking at all. What say ye.
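The power budget in that last paragraph can be sketched out quickly; the CPU and miscellaneous draws below are rough assumptions on my part, not measured figures:

```python
# Back-of-envelope PSU budget for two Furys on an 850 W unit.
PSU_W = 850            # BeQuiet Dark Power Pro P10
FURY_TDP_W = 275       # stock board power per card
NUM_CARDS = 2
CPU_W = 220            # rough guess for a 5 GHz FX-8350 (assumption)
MISC_W = 50            # board, drives, fans, pump (assumption)

gpu_total = FURY_TDP_W * NUM_CARDS
headroom = PSU_W - gpu_total - CPU_W - MISC_W
print(f"GPUs: {gpu_total} W, headroom left: {headroom} W")
```

On those assumptions there are only a few tens of watts spare at stock, which backs up the conclusion that overclocking on this PSU would be very tight.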


----------



## wanna_buy

I HAVE A REQUEST FOR THE OWNERS!
Can anyone post a screenshot or video showing FPS in the original Crysis and Crysis Warhead with every setting set to Ultra? I couldn't find any video on YouTube.


----------



## wanna_buy

Either Full HD or 1440p will suffice.


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> So, I have gone ahead and ordered two Sapphire Furys. When I say ordered, I told my friend up there in the US to buy two as soon as he sees them in stock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now. Here are some questions and concerns I have.
> 
> Firstly, I plan on using my universal GPU blocks on my Furys.
> Is there anything you guys are doing with your blocks to protect the HBM? 'Cause I hear it sits at a slightly different height to the rest of the GPU. Or do you just slap the block on and be done with it? I am using Heatkiller GPU-X3's (the ones without expandable VRM cooling).
> 
> Since I am going GPU block only I need to make a plan to cool my VRM's. Any suggestions as what to do about VRM cooling?
> 
> Also. I have a BeQuiet Dark Power Pro P10 850w power supply. It's basically a rebadged Seasonic platinum.
> 
> Now if you just work on the stock TDP of 275w per card. That's 550w.
> 
> That leaves me with 300w. Well my 5GHz 8350 probably consumes close to that. So that probably means I am going to be very limited to OC'ing on my new cards. Or no overclocking at all. What say ye.


EK makes a full water block and backplate for the Fury. Is there any reason you would want to risk a $650.00 video card on a universal block with no protection for the RAM or VRM area?

http://www.performance-pcs.com/catalogsearch/result/?q=EK-FC-R9-FURYX


----------



## Alastair

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So. I have gone a head and ordered to Sapphire Fury's. When I say ordered, I told my friend up there in the US to by two as soon as he sees them in stock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now. Here are some questions and concerns I have.
> 
> Firstly. I plan on taking my usiversal GPU blocks on my Fury.
> Firstly. Is there anything you guys are doing with your blocks to protect the HBM? Cause I hear it's on a slightly different hight to the rest of the GPU. Or do you just slap the block on and be done with it. I am using Heatkiller GPU-X3's (the ones without expandable VRM cooling.)
> 
> Since I am going GPU block only I need to make a plan to cool my VRM's. Any suggestions as what to do about VRM cooling?
> 
> Also. I have a BeQuiet Dark Power Pro P10 850w power supply. It's basically a rebadged Seasonic platinum.
> 
> Now if you just work on the stock TDP of 275w per card. That's 550w.
> 
> That leaves me with 300w. Well my 5GHz 8350 probably consumes close to that. So that probably means I am going to be very limited to OC'ing on my new cards. Or no overclocking at all. What say ye.
> 
> 
> 
> EK makes a full water block and back plate for the Fury. Is there any reason you would want to risk a 650.00 video card on a universal block with no protection on the ram or vrm area.
> 
> http://www.performance-pcs.com/catalogsearch/result/?q=EK-FC-R9-FURYX
Click to expand...

I had a look at pictures of the block. The contact area seems to be flat, so there doesn't seem to be any additional machining for the HBM memory. It seems you have forgotten that a universal GPU-only block would cover both the GPU and the HBM. The only thing needing additional cooling would be the VRMs.

I am spending all I have managed to save on the GPUs; there isn't additional budget for blocks. I bought universal blocks originally because I intended to keep them across multiple GPUs. I don't have the funds to buy new blocks every time I get new cards.


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> I had a look at pictures of the block. The contact area seems to be flat. So there doesn't seem to be any additional machining for the HBM memory. Seems you have forgotten that a universal GPU only block would cover both the GPU and HBM. The only thing needing additional cooling would be the VRM.
> 
> I am spending all I have managed to save on the GPU's there isn't additional budget for blocks. I bought universal blocks originally because I intended to keep them for multiple GPU's. I don't have the funds to buy new blocks every time I get new cards.


Don't forget the GPU footprint is also bigger than older dies, so the universal GPU block may not fit the top of the die. I remember reading this in a few places. I am fully with you on wanting to save cash.

Check this one
http://cxzoid.blogspot.com/2015/06/r9-fury-x-pcb-analysis.html


----------



## Alastair

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I had a look at pictures of the block. The contact area seems to be flat. So there doesn't seem to be any additional machining for the HBM memory. Seems you have forgotten that a universal GPU only block would cover both the GPU and HBM. The only thing needing additional cooling would be the VRM.
> 
> I am spending all I have managed to save on the GPU's there isn't additional budget for blocks. I bought universal blocks originally because I intended to keep them for multiple GPU's. I don't have the funds to buy new blocks every time I get new cards.
> 
> 
> 
> Don't forget the GPU foot print is also bigger then older Dies. So the universal GPU block may not fit the top of the DIE. Remember reading this in a few places. I am fully with you on wanting to save cash.
Click to expand...

Well, can someone give me the measurements of the die? I know the area, but I want to know the length and width, and also the spacing of the mounting holes for heatsinks/blocks around the die. Then I can work out in advance whether my blocks will work.

Also remember EK makes the blocks for the Fury X, but I am buying a non-reference Sapphire Tri-X Fury. I doubt the EK blocks will fit.


----------



## xer0h0ur

Quote:


> Originally Posted by *Alastair*
> 
> well can someone give me the measurements of the die? I know the area but I want to know the length and width. And then also the mounting holes for heatsinks/blocks around the die. Cause then I can work out in advance if my blocks will work.
> 
> Also remember EK makes the blocks for Fury X. But I am buying non-reference Sapphire Tri-x Fury. I doubt EK blocks will fit.


Actually, that card uses the same PCB as the Fury X. The only existing non-reference Fury is the Asus Strix.


----------



## rv8000

Quote:


> Originally Posted by *Alastair*
> 
> well can someone give me the measurements of the die? I know the area but I want to know the length and width. And then also the mounting holes for heatsinks/blocks around the die. Cause then I can work out in advance if my blocks will work.
> 
> Also remember EK makes the blocks for Fury X. But I am buying non-reference Sapphire Tri-x Fury. I doubt EK blocks will fit.


The Sapphire Fury Tri-X uses the reference Fury X PCB, so the EK and other Fury X blocks will fit.


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> well can someone give me the measurements of the die? I know the area but I want to know the length and width. And then also the mounting holes for heatsinks/blocks around the die. Cause then I can work out in advance if my blocks will work.
> 
> Also remember EK makes the blocks for Fury X. But I am buying non-reference Sapphire Tri-x Fury. I doubt EK blocks will fit.
> 
> 
> 
> Actually, that card uses the identical PCB as Fury X. The only existing non-reference Fury is the Asus Strix.
Click to expand...

I see. Blocks are still a load of cash though.


----------



## xer0h0ur

Oh, there is no denying that. I spent far more money than I wanted to on my current setup. This is why I am skipping this generation altogether and going for a Zen + Arctic Islands build next year.


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh there is no denying that. I spent entirely more money that I wanted to to make my current setup. This is why I am skipping this generation altogether and going for a Zen + Arctic Islands build next year.


No, I definitely want to buy into this gen. I don't see my HD 6850s holding out till Q3 2016 or later; I need something now. And if I am going to buy, I'm going to get the top end so that it sees me through another three-plus years.


----------



## xer0h0ur

Yeah, I was talking about my setup allowing me to skip Fury/Fury X. I am on triple Hawaii XTs, so realistically, if I wanted to, I could skip Arctic Islands too. I just want to give AMD some monetary support on the next generation of enthusiast processors and see what Arctic Islands looks like on a node shrink + HBM2. If it also ends up carrying a new generation of GCN, as is being speculated, then even better.


----------



## Alastair

How much does the cheapest EK Fury block cost? I'm thinking Acetal + normal copper?


----------



## xer0h0ur

Off hand I don't know EK's pricing, but other blocks were announced as well, so you will end up having options from other companies too.


----------



## xer0h0ur

Jflisk linked performance-pcs.com: http://www.performance-pcs.com/catalogsearch/result/?q=EK-FC-R9-FURYX

Looks like the block you're describing is $113.99 there, and if you wanted the black backplate that is another $34.99. Multiply that by two since you wanted dual Furys; it adds up quickly to a chunk of change.
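For anyone tallying it up, the arithmetic for the listed prices is simply (shipping and loop fittings excluded):

```python
# Cost tally for the dual-Fury EK option at the prices listed above
# (shipping and any water-loop fittings excluded).
block, backplate, cards = 113.99, 34.99, 2
total = (block + backplate) * cards
print(f"${total:.2f}")  # $297.96 for two blocks plus two backplates
```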

Aquacomputer block:





XSPC block:


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Jflisk linked performance-pcs.com:http://www.performance-pcs.com/catalogsearch/result/?q=EK-FC-R9-FURYX
> 
> Looks like the block you're describing is $113.99 there and if you wanted the black backplate that is another $34.99. Multiply that by two since you wanted dual Fury. It adds up quickly to a chunk of change.
> 
> Aquacomputer block:
> 
> 
> 
> 
> 
> XSPC block:


Yeah, exactly. Looks to be too much, methinks. The next step would probably be figuring out whether my current Heatkiller blocks will work.

So can anyone give the measurements for the length and breadth of the entire chip, and also the distance between two of the mounting holes for heatsinks/blocks?
Like this.

If they do not work, I could always sell them. How much do you think I could sell a pair of Heatkiller GPU-X3 Core-LCs + multi-link for? Keep in mind that water-cooling parts are few and far between in SA, so that would mark the price up a bit.


----------



## TK421

Concerning the stock radiator, does the bottom part really stick out, or can you remove it?

What is it, really? A reservoir or a cover?


----------



## xer0h0ur

Nothing on that AIO solution is meant to be removable unless you were taking the entire unit off for a waterblock. To answer your question, though, it's acting as a small reservoir.


----------



## BaddParrot

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nothing on that AIO solution is meant to be removable unless you were taking the entire unit off for a waterblock. To answer your question though its acting as a small reservoir.


Stupid question here:
So does it come with the fan set to push or pull? Also, is the fan reversible? (The owner's manual says not to mess with it!)


----------



## rv8000

Quote:


> Originally Posted by *BaddParrot*
> 
> Stupid question here-
> So does it come with the fan set to push or pull? Also, is the fan reversible (Owners manual says not to mess with it!) ?


Push; by default AMD wants you to set up the radiator as an exhaust at the rear of the case.


----------



## BaddParrot

Thanks for the quick answer. I'm still wanting to put it up front to pull in clean air (Corsair 760T case that is in my signature).


----------



## rv8000

Quote:


> Originally Posted by *BaddParrot*
> 
> Thanks for the quick answer. I'm still wanting to put it up front to pull in clean air (Corsair 760T case that is in my signature).


Just flip the fan and run it in pull; temps for the card should be no different (maybe 1-2°C), but you will be heating up the inside of your case a few degrees more, and this may adversely affect CPU temps if you don't already have an AIO or loop for your CPU and are running an air cooler.


----------



## Alastair

Upon closer inspection, it would seem that I cannot use my blocks and will have to get new ones.


----------



## Alastair

Do I HAVE to get the EK backplate for the EK blocks to work? Or is it just an aesthetic thing?


----------



## xer0h0ur

I honestly don't know what to tell you there. An EK rep here claimed that the Fury X's backplate did nothing and that their backplate cools the VRMs on the back side of the PCB, but that sounded to me like a load of bull meant to upsell you on the EK backplate (http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block/80#post_24140194). He may be right, but I have a hard time believing that AMD would willingly increase the cost of the Fury X by including a backplate that provides no cooling effect. Hence why I believe it was a load of you know what.

As for the backplate already on the Tri-X Fury, I don't know if you would be able to re-apply it once the EK block has been installed; the screws for the block may not allow the backplate back on.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> I honestly don't know what to tell you there. An EK rep here claimed that the Fury X's backplate did nothing and that their backplate cooled VRMs on the back side of the PCB but that sounded like a load of bull to me to upsell you on EK backplate (http://www.overclock.net/t/1561907/ek-fc-r9-fury-x-amd-radeon-r9-fury-x-full-cover-water-block/80#post_24140194). He may be right but I truly have a hard time believing him that AMD willingly increases the cost of their Fury X by including a backplate on it that provides no cooling effect. Hence why I believe it was a load of you know what.
> 
> As for the backplate already on the Tri-X Fury, I don't know if you would be able to re-apply the same backplate once the EK block has been installed. The screws for the block may not allow the backplate back on.


He would likely need longer screws or have to mod/drill the Tri-X backplate to fit properly.


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> Do I HAVE to get the EK backplate for the EK blocks to work? Or is it just an aesthetic thing?


The backplate is mostly aesthetic, but it also usually adds some extra passive cooling for the card.


----------



## Ceadderman

For those wishing to use one, Tiborrr (I think it was) already confirmed that a Thermosphere will not cover the HBM stacks, so they will not work on Fury cards, only on previous-gen GDDR5 cards.

Hopefully EK will make a Fury-capable Thermosphere soon.









~Ceadder


----------



## stoker

Quote:


> Originally Posted by *alcal*
> 
> If Nvidia were as rich/smart as Intel, they would have made their own HBM implementation, called it "FragRAM," released their own line of DDR3/DDR4 desktop memory, and then made FragRAM require you to have Nvidias desktop DIMMS installed to function


LOL, Don't give them ideas


----------



## Mega Man

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Do I HAVE to get the EK backplate for the EK blocks to work? Or is it just an aesthetic thing?
> 
> 
> 
> The back plate is usually for aesthetic But also usually adds extra passive cooling for the card.
Click to expand...

they also add protection to the back of the pcb from things like minor water leaks and shorting out if you drop something


----------



## Silent Scone

I have three blocks available that I'm not going to use. Change of plans. Nickel plexi.


----------



## en9dmp

Quote:


> Originally Posted by *xer0h0ur*
> 
> Precisely. After that there is nothing in the pipeline for either camp unless Nvidia decides to make a GTX 990 or whatever a dual GPU card would be called. Given how much larger it would be than a Fury X2 I don't know if Nvidia would want to make one. Then again I never underestimate Huang's ego. He may want to deliver a kill shot not caring about the size or power consumption of the card just for the sake of being able to say they still have the world's strongest graphics card.


It's doubtful that a dual 980 Ti or dual Titan card would come close to a Fury X2, given how much better AMD scales across multiple GPUs than Nvidia does. When you also consider that the 295X2 performs better than two 290Xs in most applications, I would expect the Fury X2 to absolutely wipe the floor with any dual-GPU card Nvidia could put out, especially after their heavily underclocked previous effort.


----------



## en9dmp

Quote:


> Originally Posted by *Mega Man*
> 
> they also add protection to the back of the pcb from things like minor water leaks and shorting out if you drop something


Yeah, it's good for protection for sure; it's nice to be able to pick the card up by the block and backplate without worrying. And if you touch the backplate when it's in use it gets pretty toasty, so it's definitely doing something. At the very least it's not insulating the card like the standard Fury X backplate does!


----------



## mustrum

Quote:


> Originally Posted by *Superplush*
> 
> Am I the only one waiting on Fury x2 ? Given how the 295x2 held up I'm interested to see the stats on that thing. Nano looks 'cute' still AMD all the way not like those crappy Nvidia fryers !


That was my plan as well. I was casually browsing availability and suddenly I had ordered a Sapphire Fury X.
No idea how that happened!









Can always add a second.


----------



## TK421

Quote:


> Originally Posted by *mustrum*
> 
> That was my plan as well. Casually browsed availability and suddenly i ordered a sapphire fury X.
> No idea how that happened!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can allways add a second.


Fury X2 Devil 13, anyone?


----------



## escksu

I am waiting for the Fury X2... using a Fury X for now.


----------



## en9dmp

Quote:


> Originally Posted by *escksu*
> 
> I am waiting for Fury X2........ Now using Fury X.


Any chance they will have figured out how to put 8GB per stack by then? Otherwise I don't really understand what's stopping them from releasing the card sooner, unless it's linked to the production-capacity issues...

I'm likely to swap out one of my Fury Xs when the X2 comes out and go TriFire. Hoping for some DX12 games by then as well, which will hopefully have been programmed with SFR and memory-sharing capabilities...


----------



## GorillaSceptre

Any word on voltage control yet?


----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Any word on voltage control yet?


Unwinder doesn't have a Fury X to work with yet, so we are waiting on that for Afterburner.


----------



## Ha-Nocri

Not sure why Sapphire doesn't add voltage control in the TriXX utility. They clearly did up the voltage on their Fury (non-X) OC card, from 1.169V to 1.212V.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> unwinder doesn't have a Fury X to work with yet at this point. So we are waiting on that for afterburner.


Still? ...

Nice to see a fellow South African








Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not sure why Sapphire doesn't add voltage control in the TriXX utility. They clearly did up the voltage on their Flurry (non-X) OC card, from 1.169 to 1.212V


Agreed.


----------



## Agent Smith1984

My concern, though, is that Sapphire upped the voltage on the non-X Fury, and it still doesn't clock any better.


----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> unwinder doesn't have a Fury X to work with yet at this point. So we are waiting on that for afterburner.
> 
> 
> 
> Still? ...
> 
> Nice to see a fellow South African
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ha-Nocri*
> 
> Not sure why Sapphire doesn't add voltage control in the TriXX utility. They clearly did up the voltage on their Flurry (non-X) OC card, from 1.169 to 1.212V
> 
> Click to expand...
> 
> Agreed.
Click to expand...

Have you got your Fury, or are you thinking of buying? They are almost 10 grand for a non-X Fury here. What are you planning on doing, buying local or importing yourself? Also, do you watercool? If so, where do you get your bits? I normally use Titan Ice, but they are limited to universal GPU blocks, which it seems won't be any good for Fury.


----------



## josephimports

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not sure why Sapphire doesn't add voltage control in the TriXX utility. They clearly did up the voltage on their Flurry (non-X) OC card, from 1.169 to 1.212V


It advertises voltage adjustment, but the slider is N/A.





















----------



## alcal

Update on my Fury X pump grinding noise: it appears to have gone away, which makes me suspect bubbles in the pump. Now all I hear is a slight coil whine, but I can only notice it when my head is at a certain angle, so whatever.


----------



## Munkypoo7

Sapphire TriX OC in stock @ $569.99: Newegg Link


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> have you got your Fury? Or thinking of buying. They are almost 10 grand for non X Fury here. What are you planning on doing? Buying local or importing yourself? Also do you watercool? If so where do you get your bits? Cause I normally use Titan ice but they are limited to universal GPU blocks. Which seem won't be any good for Fury.


I'm thinking of buying one, just waiting for voltage control to make sure; I'm leaning towards a 980 Ti at the moment, though.

I don't watercool, but I've always wanted a custom loop. I'm doing a whole new build next year, so I'm hoping to go all out with it.

Can't you just use a mail-forwarding service from the States for the WC parts?

I'm just going to buy through Amazon; I always save a ton of money going that route, and the service is even better than most of our local places. I once got a faulty amp from them and they covered the return shipping both ways.

I wouldn't say this on local forums, but I'll never buy a PC part locally again (except cheaper components); they can go jump with the ridiculous prices they charge.


----------



## Agent Smith1984




----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> have you got your Fury? Or thinking of buying. They are almost 10 grand for non X Fury here. What are you planning on doing? Buying local or importing yourself? Also do you watercool? If so where do you get your bits? Cause I normally use Titan ice but they are limited to universal GPU blocks. Which seem won't be any good for Fury.
> 
> 
> 
> I'm thinking of buying one, just waiting for voltage control to make sure, i'm leaning towards a 980Ti atm though.
> 
> I don't watercool, always wanted a custom loop though. I'm doing a whole new build next year so hoping to go all out with it
> 
> 
> 
> 
> 
> 
> 
> Can't you just use a mail-forwarding service from the states for the WC parts?
> 
> I'm just going to buy through Amazon, i always save a ton of money going that route, and the service is even better than most of our local places. I got a faulty Amp from them once and they covered the return shipping both ways.
> 
> I wouldn't say this on local forums, but i'll never buy a PC part locally again (except cheaper components), they can go jump with the ridiculous prices they charge.
Click to expand...

Think you could PM me? I would like to know how to go about doing this.


----------



## Ceadderman

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mustrum*
> 
> That was my plan as well. Casually browsed availability and suddenly i ordered a sapphire fury X.
> No idea how that happened!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can allways add a second.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> fury x2 devil13 anyone?
Click to expand...

Pretty sure that's *not* Fury. The GDDR5 surrounding the GPUs says it's not.

And *four* 8-pin power connectors?!









Quote:


> Originally Posted by *Agent Smith1984*












*ahem*...

...Afterburner?









~Ceadder


----------



## TK421

Quote:


> Originally Posted by *Ceadderman*
> 
> Pretty sure that's *not* Fury. DDR5 surrounding the GPUs say it's not.
> 
> And *Four* 8pin power connections?!?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *ahem*...
> 
> ...Afterburner?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


That's the Devil 13 295X2 PCB, but I would want to see a similar treatment done for a Fury X2.


----------



## Nizzen

Quote:


> Originally Posted by *Agent Smith1984*


Tried TriXX 5.0.0, and voltage control is grayed out. I can easily overclock memory with the slider. Have an MSI Fury X.

TRIXX_installer_5.0.0_58787851.zip 1967k .zip file


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nizzen*
> 
> Tried Trixx 5.00, and voltagecontrol is grayed out. I can easily overclock memory with slide. Have an MSI Fury X


Surprised they released 5.0.0 without voltage control...

I guess they wanted to get easy HBM overclocking out there in the meantime.

We are almost a month past the Fiji launch now, and there is NO ONE with confirmed voltage control... how long did it take for Hawaii?


----------



## xer0h0ur

Quote:


> Originally Posted by *en9dmp*
> 
> Any chance they will have figured out how to put 8Gb per die by then? Otherwise I don't really understand what's stopping them releasing the card sooner. Unless it's linked to the production capacity issues...
> 
> I'm likely to swap out one of my fury x when the x2 comes out and go with trifire. Hoping for some dx12 games by then as well which will hopefully have been programmed with SFR and memory sharing capabilities...


Nothing has changed. The ramp-up of HBM production is still taking place. They aren't ready to release the Nano, much less the Fury X2.

HBM2 is not going to be ready until Q2 2016, so don't expect 8GB on a single interposer either. While in theory they could still just add more stacks of HBM on the existing interposer, I highly doubt we see that happen.


----------



## DividebyZERO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Surprised they released 5.00 without voltage control....
> 
> I guess they wanted to get easy HBM overclocking out there in the mean time.
> 
> We are almost a month past Fiji launch now, and there is NO ONE with confirmed voltage control.... how long did it take for Hawaii?


Right now the biggest issue is inventory. They are not in stock, and when you look at nowinstock.net the restock dates are just horribad. They have seriously screwed the pooch on this launch. I am just at a loss for words, and all it does is feed ammo to the green-team shills. It's so damn frustrating.


----------



## Nizzen

Quote:


> Originally Posted by *DividebyZERO*
> 
> Right now the biggest issue is inventory. They are not in stock and when you look at nowinstock.net the restock dates are just horribad. They have seriously screwed the pooch on this launch. I just am at a loss of words and all it does is feed ammo to the green team shills. It's so damn frustrating.


Rule no. 1: buy the new GPU on release day.








Only noobs do not get cards.









There's stock here in Norway.


----------



## DividebyZERO

Quote:


> Originally Posted by *Nizzen*
> 
> Rule nr 1. Buy new gpu release date
> 
> 
> 
> 
> 
> 
> 
> 
> Only noobs do not get cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock here in Norway


Clearly I'm wrong and you're right. I didn't say I couldn't get a card anywhere in that. If I really wanted one I would just go to eBay and pay $200+ over MSRP. Sorry for "lying" about the stock issue.


----------



## Ceadderman

If you don't want one, then why bother complaining about lack of stock?









I cannot find fault with the lack of stock. This happens whether it's AMD or Nvidia: new tech hits the market and gets gobbled up early, creating a gap in supply that the AIBs need to fill.

How is this AMD's fault?

I remember when my motherboard first launched; I had to wait *months* to get mine. I guess that too was AMD's fault, because it's an AMD platform.









AMD *DOESN'T* supply the market. They produced the reference boards and distributed them to the AIBs. They don't manufacture the cards and split them between the vendors; that's not how it's done at all. Blaming them is incredibly myopic.









~Ceadder


----------



## rv8000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Not sure why Sapphire doesn't add voltage control in the TriXX utility. They clearly did up the voltage on their Flurry (non-X) OC card, from 1.169 to 1.212V


On average it's 20mV higher than the Fury X; the default voltage is not 1.169V. Remember the Strix PCB is custom built, has a totally different power-delivery setup, and has a hugely crippled stock BIOS.


----------



## xer0h0ur

Quote:


> Originally Posted by *rv8000*
> 
> On avg its 20mV higher than Fury X, the default voltage is not 1.169; remember the strix PCB is custom built, has a totally different power delivery setup, and has a hugely crippled stock bios


^ This. The BIOS on the Strix isn't even remotely taking advantage of the 12-phase power on their custom PCB design. Rumor has it that this custom PCB will end up on overclocked variants, but take that for what it's worth: a rumor.


----------



## Joshy Ocuk

There's your voltage control, still a little buggy. So, to confirm: the voltage is not "locked"; the current ATI I2C driver is just not fully compatible, but it does work with Afterburner's third-party voltage control.


----------



## DividebyZERO

Quote:


> Originally Posted by *Ceadderman*
> 
> If you don't want one, then why bother complaining about lack of stock?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I cannot find fault with lack of stock. This happens whether AMD or nVidia. New tech hits the market, gets gobbled up early. Creates glut in the market and needs to be filled by the AIBs.
> 
> How is this AMDs fault?
> 
> I remember when my Motherboard first launched. I had to wait *months* to get mine. I guess that too was AMDs fault because it's an AMD platform.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AMD *DOESN'T* supply the market. They produced the Reference boards and distributed them to the AIBs. They don't manufacture them and split them between the vendors. That's not how it's done at all. Blaming them is incredibly myopic.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I'm not sure if I am speaking English or gibberish. This is twice now someone has put words in my mouth. I never said I couldn't get a card, and I never said I didn't want one, in either of my last two posts here. It's AMD's product, and they are doing extremely poorly at supplying vendors. You can twist it any way you want to justify it to yourself. AMD may correct this soon, but we're already past a month, so look at it how you want.


----------



## Ceadderman

Firstly, you stated "not that I want one". I didn't put those words in your mouth; you did.

Secondly, do you have *ANY* idea what the terms "reference" or "reference design" mean in engineering and manufacturing?









It's the design on which all manufacturing is based, meaning AMD builds a working model that all manufacturers/vendors refer to schematically to manufacture the product.

How difficult is this to understand? AMD doesn't build the end product; the manufacturers putting their name on it do.

Their fault extends no further than supplying the GPU and HBM. If they're shipping those, then whose fault is it that we don't have cards available? Try googling "reference design".









~Ceadder


----------



## rv8000

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> 
> 
> 
> 
> theres your voltage control, still alittle buggy. so to confirm voltage is not 'locked' just the current ati i2c driver is not fully 100% compatable but does work with afterburner 3rd party volt control


Seems like what I expected, if that OC of 1242MHz on the core is stable at ~1.4V (even better if GPU-Z is just bugging out and it's really +100mV, which would end up somewhere around 1.3-1.33V); it appears to be very similar to Hawaii. Curse you, MSI, for not getting Unwinder a Fury X ASAP.









A 1250MHz core OC should put a non-X Fury around stock/reference 980 Ti performance; a 1250MHz core OC on the Fury X will place it somewhere around a 1400MHz 980 Ti, all depending on the game and on whether my scaling tests were accurate.


----------



## p4inkill3r

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> 
> 
> 
> 
> theres your voltage control, still alittle buggy. so to confirm voltage is not 'locked' just the current ati i2c driver is not fully 100% compatable but does work with afterburner 3rd party volt control


where dat link doe


----------



## royfrosty

Quote:


> Originally Posted by *rv8000*
> 
> Seems like what I expected if that oc of 1242 on the core is stable @ ~1.4v (even better if GPU-Z is just bugging out and it's really +100mV which would end up somewhere around 1.3-1.33v), appears to be very similar to hawaii. Curse you MSI for not getting unwinder a Fury X ASAP
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1250 core oc should put a non X Fury around stock/ref 980ti performance, 1250 core oc on the Fury X will place it somewhat equivalent to a 1400mhz 980ti; all depending on the game and if any of my scaling tests were accurate


Hmm, not sure about that. I only know that at an OC of 1110MHz core / 570MHz memory, the benchmark results were just on par with a reference 980 Ti.

I remember JayzTwoCents mentioning that a slightly overclocked Fury X matches the performance of a reference 980 Ti.

Only if it is OC'd further, with voltage unlocked, will it be like a non-reference 980 Ti.


----------



## Orthello

Quote:


> Originally Posted by *royfrosty*
> 
> Hmm. Not sure about that. But I only know at an oc core clock at 1110mhz 570mhz the results of benchmark test it was just on par with 980ti ref card.
> 
> I remember jay2cents was mentioning it that it was the same performance of a 980ti ref if the fury x is on a slight oc.
> 
> Unless it is oced further with voltages unlocked, it will be like a non ref 980ti.


I think you have to look at percentages and then compare. E.g. the base Fury X core is 1050MHz, so if you get to 1250MHz with voltage, that's 19%. It's hard to say what the average Fury X with untapped voltage will clock to, but if that 19% were roughly average, then a 1380-1400MHz 980 Ti would be the comparative clock for the same OC percentage (they boost to ~1150MHz in most games by default). Another way to look at it: if the 980 Ti led the benchmark/game at default, I feel it would still lead at 1400MHz vs 1250MHz on the Fury X.

The reality is some 980 Tis will go further than 1400MHz; 1529-1550MHz for air-cooled custom versions is not unheard of. So while this is great for the Fury X, it's really just catching up, and I think we still need driver improvements and Windows 10 to help out too. I'm looking forward to that.

If it's CFX vs SLI, however, then yeah, AMD's XDMA efficiency is really going to help out there as well.

I'm going to have some fun with my cousin's Fury X and the voltage bits soon. We were getting lockups in The Witcher 3 at 1120/550; not sure if it was the RAM or the core at fault. Didn't have much time.
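The percentage comparison above can be sketched as follows; the 1050MHz Fury X base clock is from the post, while the ~1150MHz typical 980 Ti in-game boost is the assumed figure the comparison uses:

```python
# Compare overclocks as percentage gains over each card's typical clock,
# per the reasoning in the post above.
def oc_percent(base_mhz, oc_mhz):
    """Overclock expressed as a percentage gain over the base clock."""
    return (oc_mhz - base_mhz) / base_mhz * 100

def equivalent_clock(other_base_mhz, percent):
    """Clock another card would need for the same percentage gain."""
    return other_base_mhz * (1 + percent / 100)

gain = oc_percent(1050, 1250)               # Fury X: 1050 -> 1250 MHz
print(round(gain, 1))                       # 19.0 (%)
print(round(equivalent_clock(1150, gain)))  # 1369 -> the ~1380-1400 MHz ballpark
```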


----------



## Gumbi

Quote:


> Originally Posted by *Orthello*
> 
> I think you have to look at percentages and then compare. E.g. the base Fury X core is 1050 MHz, so if you get to 1250 MHz with voltage then that's 19%. It's hard to say what the average Fury X with untapped voltage will clock to, but if that 19% were roughly average, then a 1380-1400 MHz 980 Ti would be the comparable clock for the same OC percentage (they boost to ~1150 MHz in most games by default). Another way to look at it: if the 980 Ti led the benchmark/game at default, I feel it would still lead at 1400 MHz vs 1250 MHz on the Fury X.
> 
> The reality is some 980 Tis will go further than 1400 MHz; 1529-1550 MHz for air-cooled custom versions is not unheard of. So whilst this is great for the Fury X, it's really just catching up, and I think we still need driver improvements and Windows 10 to help out too - I'm looking forward to that.
> 
> If it's CFX vs SLI, however, then yeah, AMD's XDMA efficiency is really going to help out there as well.
> 
> I'm going to have some fun with my cousin's Fury X and the voltage bits soon. We were getting lockups in The Witcher 3 at 1120/550; not sure if it was RAM or core at fault - didn't have much time.


I'm pretty sure a reference 980 Ti boosts quite a bit higher than 1150 MHz.


----------



## blue1512

Quote:


> Originally Posted by *Gumbi*
> 
> I'm pretty sure a reference 980 Ti boosts quite a bit higher than 1150 MHz.


Yes. Most of them boost to 1200 MHz; some good cards boost to 1250 MHz.


----------



## Alastair

Amazon has ONE Sapphire Fury Tri-x in stock. Get yours now! (non-OC version)


----------



## Alastair

You guys think the manufacturing node for Fiji might have matured a bit by the time the Nano comes out? Possibly slightly newer chips produced around August will have a lower stock VID than current chips, and so maybe be slightly better at overclocking?


----------



## xer0h0ur

Quote:


> Originally Posted by *Alastair*
> 
> You guys think the manufacturing node for Fiji might have matured a bit by the time Nano comes out? Possibly of slightly newer chips produced around August having a lower stock VID than current chips and so maybe slightly better at overclocking?


It has nothing to do with the node or the Fiji dies. The 28-nanometer node has matured about as much as it freaking can. The HBM supply is literally what's holding back keeping Fury / Fury X in stock. They're ramping up HBM manufacturing right now trying to keep up with demand, but clearly it's not there yet. I would hope they have it sorted out by the time the R9 Nano is released in August, and it sure as hell better be ready for the Fury X2 release.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> You guys think the manufacturing node for Fiji might have matured a bit by the time Nano comes out? Possibly of slightly newer chips produced around August having a lower stock VID than current chips and so maybe slightly better at overclocking?
> 
> 
> 
> It has nothing to do with the node or the Fiji dies. The 28-nanometer node has matured about as much as it freaking can. The HBM supply is literally what's holding back keeping Fury / Fury X in stock. They're ramping up HBM manufacturing right now trying to keep up with demand, but clearly it's not there yet. I would hope they have it sorted out by the time the R9 Nano is released in August, and it sure as hell better be ready for the Fury X2 release.
Click to expand...

This.

You can only build what you have the necessary parts for. If you are missing spark plugs for that shiny new motor on the bench, it's gonna be a shiny paperweight. All show and no go.









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> You guys think the manufacturing node for Fiji might have matured a bit by the time Nano comes out? Possibly of slightly newer chips produced around August having a lower stock VID than current chips and so maybe slightly better at overclocking?
> 
> 
> 
> It has nothing to do with the node or the Fiji dies. The 28-nanometer node has matured about as much as it freaking can. The HBM supply is literally what's holding back keeping Fury / Fury X in stock. They're ramping up HBM manufacturing right now trying to keep up with demand, but clearly it's not there yet. I would hope they have it sorted out by the time the R9 Nano is released in August, and it sure as hell better be ready for the Fury X2 release.
Click to expand...

Well, I thought since Fiji is a new chip, even though it's on 28nm it would still have some manufacturing difficulties? Anyways, I am still gonna wait for when the Nano drops. Maybe get better HBM that OCs better. I think I have a problem. I have an addiction. Do we have a subforum for Overclockers Anonymous?


----------



## Ceadderman

That's like having an Overeaters Anonymous thread on a Food Lovers forum.


















~Ceadder


----------



## xer0h0ur

http://wccftech.com/nvidias-gp100-pascal-flagship-pack-4096-bit-memory-bus-8hi-hbm-stacks/

"Although notably, unlike Nvidia which has confirmed that its Pascal GPUs will be manufactured using TSMC's 16nm FinFET process, AMD has yet to announced whether the Arctic Islands family of GPUs will be made on TSMC's 16nm or Samsung's 14nm process. Both nodes are very similar, so which process AMD ends up using will be primarily dictated by yields and time-to-market.

*Also unlike Nvidia, AMD has a much more powerful incentive to launch its next generation of FinFET GPUs first. This is because the company has priority to HBM2 capacity - which is going to be limited initially - as a result of co-inventing the technology with Hynix.* By pushing its graphics products to launch first AMD can establish two competitive advantages over its rival. The first obvious advantage is being first to market by launching its products earlier than its rival. *But most importantly this enables AMD to capture much of that initial HBM2 capacity away from Nvidia and extend its time-to-market lead substantially.* This could create an interesting market dynamic but whether it can succeed remains to be seen."

They will truly be green with envy


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> http://wccftech.com/nvidias-gp100-pascal-flagship-pack-4096-bit-memory-bus-8hi-hbm-stacks/
> 
> "Although notably, unlike Nvidia which has confirmed that its Pascal GPUs will be manufactured using TSMC's 16nm FinFET process, AMD has yet to announced whether the Arctic Islands family of GPUs will be made on TSMC's 16nm or Samsung's 14nm process. Both nodes are very similar, so which process AMD ends up using will be primarily dictated by yields and time-to-market.
> 
> *Also unlike Nvidia, AMD has a much more powerful incentive to launch its next generation of FinFET GPUs first. This is because the company has priority to HBM2 capacity - which is going to be limited initially - as a result of co-inventing the technology with Hynix.* By pushing its graphics products to launch first AMD can establish two competitive advantages over its rival. The first obvious advantage is being first to market by launching its products earlier than its rival. *But most importantly this enables AMD to capture much of that initial HBM2 capacity away from Nvidia and extend its time-to-market lead substantially.* This could create an interesting market dynamic but whether it can succeed remains to be seen."
> 
> They will truly be green with envy












I knew nVidia wouldn't be coming out with HBM2 tech before AMD. It just wasn't feasible.









~Ceadder


----------



## Skinnered

Is it possible to OC two Furys in CF? I tried to unlock the memory OC with MSI Afterburner (extended OC), but I couldn't get the second GPU's memory slider working (in CCC; couldn't get settings applied in AB?), resulting in asynchronous clock speeds reported by AB and crashes. I aimed at 1150+/600 core/mem on both, but no luck... I had ULPS disabled btw.


----------



## xer0h0ur

Quote:


> Originally Posted by *Skinnered*
> 
> Is it possible to OC two Furys in CF? I tried to unlock the memory OC with MSI Afterburner (extended OC), but I couldn't get the second GPU's memory slider working (in CCC; couldn't get settings applied in AB?), resulting in asynchronous clock speeds reported by AB and crashes. I aimed at 1150+/600 core/mem on both, but no luck... I had ULPS disabled btw.


Have you by any chance attempted to disable CrossFire first so that Afterburner treats each card individually? That may allow you to set your overclocks on each card individually, then just enable CrossFire again and bang, you're good to go. I assume you're already going into the settings to specifically select each card you're trying to overclock.


----------



## xer0h0ur

Please do tell if this ends up working for you. I had thought of this before when people were having trouble overclocking in crossfire but I never spoke up and forgot the suggestion altogether till now. Would be useful info if it works.


----------



## josephimports

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Skinnered*
> 
> Is it possible to OC two Furys in CF? I tried to unlock the memory OC with MSI Afterburner (extended OC), but I couldn't get the second GPU's memory slider working (in CCC; couldn't get settings applied in AB?), resulting in asynchronous clock speeds reported by AB and crashes. I aimed at 1150+/600 core/mem on both, but no luck... I had ULPS disabled btw.








Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *xer0h0ur*
> 
> Please do tell if this ends up working for you. I had thought of this before when people were having trouble overclocking in crossfire but I never spoke up and forgot the suggestion altogether till now. Would be useful info if it works.






Memory OC on the second GPU began to work after driver 15.7.





Spoiler: Warning: Spoiler!







http://www.3dmark.com/3dm/7828165


----------



## By-Tor

Quote:


> Originally Posted by *josephimports*
> 
> 
> 
> Memory OC on the second GPU began to work after driver 15.7.
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7828165


I would have thought that the FS score would be better than that with two Furys... compared to 290Xs.


----------



## sugarhell

I don't understand why you compare different settings. Or am I missing something?


----------



## rv8000

Quote:


> Originally Posted by *By-Tor*
> 
> I would have thought that the FS score would be better than that with 2 Fury's...
> 
> 290x's.


His benchmark was Fire Strike Extreme; check again.


----------



## By-Tor

Quote:


> Originally Posted by *rv8000*
> 
> His benchmark was Firestrike Extreme, check again


Wow, just noticed that... Nice score!

TY


----------



## xer0h0ur

For comparison, my 295X2 and 290X match up nearly perfectly with his dual Fury Xs. So three Hawaii XTs OCed match dual Fiji XTs OCed: http://www.3dmark.com/fs/5358452


----------



## fewness

I downloaded 15.7 for Win10 from the AMD site and installed it, but 3DMark still reads my driver as 15.20 and says it's not approved for a valid score. What's wrong?


----------



## Skinnered

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you by any chance attempted to disable crossfire first so that Afterburner treats each card individually? This may allow you to set your overclocks on each card individually then just enable crossfire again and bang you're good to go. I assume you're already going into the settings to specifically select each card you're trying to overclock.


Thanks for the tip.

I will try this later and let you know.


----------



## blue1512

Quote:


> Originally Posted by *fewness*
> 
> I downloaded 15.7 for Win10 from AMD site, installed, but 3DMark still reads my driver as 15.20, and says it's not approved for valid score. What's wrong?


Because win10 is not a valid OS at this moment.


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> For comparison, my 295X2 and 290X match up nearly perfectly with his dual Fury X's. So 3 Hawaii XT's OCed match dual Fiji XT's OCed: http://www.3dmark.com/fs/5358452


Your graphics score and his look low. 3x R9 290X:

http://www.3dmark.com/fs/5313971

http://www.3dmark.com/fs/5256629

Fury X:
http://www.3dmark.com/fs/5450036


----------



## rv8000

Quote:


> Originally Posted by *Jflisk*
> 
> Yours and his graphics scores look low. 3xR9 290X
> 
> http://www.3dmark.com/fs/5313971
> 
> http://www.3dmark.com/fs/5256629
> 
> FuryX
> http://www.3dmark.com/fs/5450036


They were showing/talking about FS Extreme, not FS. If you thought the score was fishy, the first thing to do is check the title of the benchmark and the system info.


----------



## Jflisk

Quote:


> Originally Posted by *rv8000*
> 
> They were showing/talking about FS Extreme, not FS. If you thought the score was fishy, first thing i always do is check the title of the benchmark and system info.


Six of one, half a dozen of the other.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jflisk*
> 
> Yours and his graphics scores look low. 3xR9 290X
> 
> http://www.3dmark.com/fs/5313971
> 
> http://www.3dmark.com/fs/5256629
> 
> FuryX
> http://www.3dmark.com/fs/5450036


Were those your 3 290Xs? My regular Fire Strike graphics score kinda makes those look pitiful: 26389/25080 to my 34095. http://www.3dmark.com/fs/5358521


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> Were those your 3 290X's? My regular firestrike graphics score kinda makes those look pitiful. 26389/25080 to my 34095 http://www.3dmark.com/fs/5358521


That's all I got out of them, and that's with a slight overclock. The i7 probably helps matters. Looks like yours are overclocked to 1150/1500; mine were at 1050/1150.


----------



## xer0h0ur

You had 'em on air, I am guessing? I couldn't sustain the OCs I use in benching until I put everything under water.


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> You had em on air I am guessing? I couldn't sustain my OCs I use in benching until I put everything under water.


Nope, that's water, but I was not out to kill them either, and it's 1060/1250 - had to go look.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fewness*
> 
> I downloaded 15.7 for Win10 from AMD site, installed, but 3DMark still reads my driver as 15.20, and says it's not approved for valid score. What's wrong?


You are confusing the CCC release number with the driver number...

15.20.1046 is the driver version and is a Windows 10-compatible driver.

15.7 is the CCC package release that incorporates the 15.20 driver.

Windows 10 is not an official OS yet, so the driver is not treated as valid when 3DMark checks the results. Just give it a little time.


----------



## xer0h0ur

Well, I dropped Fujipoly Ultra Extreme pads onto the VRAM, which let me safely push 1500 MHz. The 295X2 has Hynix chips that will go 1700 MHz before adversely affecting performance, but the pitiful Elpida chips on the 290X will only push 1500 MHz as a slave card; if it's the primary graphics card it will only do 1400 MHz. But I can't complain, since a friend sold me his 290X for a song - $160, and he had only used it for a month. Those clocks are with +100mV and +50 power limit, all on stock BIOSes. Never had it in me to flash them.


----------



## gamervivek

Quote:


> Originally Posted by *Orthello*
> 
> I think you have to look at percentages and then compare. E.g. the base Fury X core is 1050 MHz, so if you get to 1250 MHz with voltage then that's 19%. It's hard to say what the average Fury X with untapped voltage will clock to, but if that 19% were roughly average, then a 1380-1400 MHz 980 Ti would be the comparable clock for the same OC percentage (they boost to ~1150 MHz in most games by default). Another way to look at it: if the 980 Ti led the benchmark/game at default, I feel it would still lead at 1400 MHz vs 1250 MHz on the Fury X.
> 
> The reality is some 980 Tis will go further than 1400 MHz; 1529-1550 MHz for air-cooled custom versions is not unheard of. So whilst this is great for the Fury X, it's really just catching up, and I think we still need driver improvements and Windows 10 to help out too - I'm looking forward to that.
> 
> If it's CFX vs SLI, however, then yeah, AMD's XDMA efficiency is really going to help out there as well.
> 
> I'm going to have some fun with my cousin's Fury X and the voltage bits soon. We were getting lockups in The Witcher 3 at 1120/550; not sure if it was RAM or core at fault - didn't have much time.


Depends on how well the OC does in practice. Maxwell does get bandwidth-limited with higher core overclocks at 4K, and that could help Fury catch up even with a lower OC. Going by the GamersNexus review, a 980 Ti Hybrid OC'd to 1514 MHz vs. a stock 980 Ti:

Witcher 3 - 24%
SoM - 21%
GRID - 21%

http://www.gamersnexus.net/hwreviews/1983-evga-gtx-980-ti-hybrid-review-and-benchmarks/Page-2

If Fury X is slightly faster at stock, say 5%, it could end up about equal with a 1.25 GHz core, 600 MHz HBM, and a 15% improvement in practice. A 1.453V reading in Afterburner in that video doesn't give me hope, though.
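One way to sanity-check that break-even claim (assuming the gains simply multiply, which is a rough approximation): a ~5% stock lead combined with a ~15% practical OC gain lands right around the ~21% uplift the OC'd 980 Ti showed in those titles.

```python
# If the Fury X were ~5% faster at stock and gained ~15% from its OC in
# practice, the combined uplift over a stock 980 Ti roughly matches the
# ~21% the OC'd 980 Ti Hybrid showed - i.e. the two cards end up about equal.
fury_stock_lead = 1.05   # assumed Fury X lead over a stock 980 Ti
fury_oc_gain    = 1.15   # assumed practical gain at 1.25 GHz core / 600 MHz HBM

combined = fury_stock_lead * fury_oc_gain
print(f"Combined Fury X uplift over a stock 980 Ti: {combined:.2f}x")  # ~1.21x
```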


----------



## Yorkston

Finally got my Fury X RMA back and installed; another sticker version, unfortunately. The pump noise is still there but less awful than on the first one - I can drown this one out with fans. The fan on this one is really bad, though; it makes a howling whine at anything over 1200 RPM. I'll probably end up pulling it off and sticking one of my own GTs on there.

This one seems to clock far better than the first, at least; it seemed stable at 1140 core. I'm occasionally getting BSODs at the end of benchmark runs, so I'm going to do a driver reinstall tonight.


----------



## Gumbi

Quote:


> Originally Posted by *Yorkston*
> 
> Finally got my Fury X RMA back and installed, another sticker version unfortunately. Pump noise is still there but less awful than the first one, I can drown this one out with fans. The fan on this one is really bad though, makes this howling whine at anything over 1200rpm. Will probably end up pulling that off and sticking one of my own GTs on it.
> 
> This one seems to clock far better than the first at least, seemed stable at 1140 core. I'm occasionally getting BSODs at the end of benchmark runs so i'm going to do a driver reinstall tonight.


Nothing wrong with 1140 core, especially considering we haven't got voltage control yet


----------



## fjordiales

I will join as soon as I get these bad boys.



Also, it is now available @ Amazon US.

Regular (1000mhz)
http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-11247-00-40G/dp/B011D7A526/ref=sr_1_1?s=electronics&ie=UTF8&qid=1437583894&sr=1-1&keywords=11247-00-40G

http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Delectronics&field-keywords=11247-00-40G

OC(1040mhz)
http://www.amazon.com/s/ref=nb_sb_noss_2?url=search-alias%3Delectronics&field-keywords=11247-01-40G

http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-11247-01-40G/dp/B011D79Z6S/ref=sr_1_1?s=electronics&ie=UTF8&qid=1437584196&sr=1-1&keywords=11247-01-40G


----------



## Agent Smith1984

It's just so hard to know whether Fiji will clock poorly with voltage the way Hawaii does, or scale nicely with voltage the way Tahiti did...

If it's the latter, it really would be the "overclocker's dream" they claimed it to be.

If it's more like Hawaii, then they have no idea what kind of dreams overclockers really have...


----------



## fewness

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are confusing the CCC release number with the driver number....
> 
> 15.20.1046 is the driver version and is a windows 10 compatible driver.
> 
> 15.7 is the CCC package release that incorporates the 15.2 driver.
> 
> Windows 10 is not an official OS yet, so the driver is not valid either when 3dmark validates the results. Just give it a little time.


That makes sense. Thank you!


----------



## criminal

Quote:


> Originally Posted by *fjordiales*
> 
> I will join as soon as I get these bad boys.
> 
> 
> 
> Also, it is now available @ Amazon US.
> 
> Regular (1000mhz)
> http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-11247-00-40G/dp/B011D7A526/ref=sr_1_1?s=electronics&ie=UTF8&qid=1437583894&sr=1-1&keywords=11247-00-40G
> 
> http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Delectronics&field-keywords=11247-00-40G
> 
> OC(1040mhz)
> http://www.amazon.com/s/ref=nb_sb_noss_2?url=search-alias%3Delectronics&field-keywords=11247-01-40G
> 
> http://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-11247-01-40G/dp/B011D79Z6S/ref=sr_1_1?s=electronics&ie=UTF8&qid=1437584196&sr=1-1&keywords=11247-01-40G


Good deal. Glad Amazon has some now and is not price gouging.


----------



## DNMock

Still locked voltage, I see. Anyone been brave enough to hard-mod a Fury X yet to take more voltage?


----------



## huzzug

So, who has started with the CNC & milling?


----------



## TK421

Quote:


> Originally Posted by *DNMock*
> 
> Still locked voltage I see, anyone been brave enough to hard mod a fury X yet to take more voltage?


I believe the voltage controller has the ability to overvolt; once Afterburner updates, you should be able to increase the voltage without any hardware mods.


----------



## Mr.N00bLaR

I'm terribly out of the loop on new GPUs... Anyone have more specific information on this "R9 Nano" and where it fits into the market performance and price wise?


----------



## xer0h0ur

Quote:


> Originally Posted by *DNMock*
> 
> Still locked voltage I see, anyone been brave enough to hard mod a fury X yet to take more voltage?


There is no lock by way of hardware; the third-party applications just haven't been updated yet to control the voltage. I have no idea what the hold-up is on Sapphire's end with their TriXX app, but progress is being made using Afterburner. Nothing is fully operational yet.


----------



## xer0h0ur

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> I'm terribly out of the loop on new GPUs... Anyone have more specific information on this "R9 Nano" and where it fits into the market performance and price wise?


The card hasn't even been paper-launched yet, much less released for sale. All we have is a nebulous August release timeframe. Some believe it's a further cut-down variant of Fiji, and some believe it's the full-fat Fiji XT die simply underclocked to greatly reduce TDP. Either way, most people agree it's not going to be a cheap card; it should be more expensive than the R9 390X and cheaper than the Fury (non-X). This card will of course also carry HBM, just as Fiji Pro and XT do.

This card is going to completely dominate its market segment, though. There is nothing even remotely close to it in terms of performance in that size of package.


----------



## criminal

In stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121975


----------



## Mr.N00bLaR

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr.N00bLaR*
> 
> I'm terribly out of the loop on new GPUs... Anyone have more specific information on this "R9 Nano" and where it fits into the market performance and price wise?
> 
> 
> 
> The card hasn't even been paper-launched yet, much less released for sale. All we have is a nebulous August release timeframe. Some believe it's a further cut-down variant of Fiji, and some believe it's the full-fat Fiji XT die simply underclocked to greatly reduce TDP. Either way, most people agree it's not going to be a cheap card; it should be more expensive than the R9 390X and cheaper than the Fury (non-X). This card will of course also carry HBM, just as Fiji Pro and XT do.
> 
> This card is going to completely dominate its market segment, though. There is nothing even remotely close to it in terms of performance in that size of package.
Click to expand...

Interesting. I've discovered the wonderfulness of virtualizing my gaming PC. Hopefully that kind of GPU will fit into a server well.


----------



## xer0h0ur

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> Interesting. I've discovered the wonderfulness of virtualizing my gaming PC. That kind of GPU will hopefully fit into a server well.


FWIW: http://wccftech.com/amd-r9-nano-pictures-unigine-heaven-benchmark/


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> FWIW: http://wccftech.com/amd-r9-nano-pictures-unigine-heaven-benchmark/


Oh my goodness, I want to put two of those little jokers in CrossFire!

890 MHz with 4096 shaders is pretty hefty.

It probably won't OC all that great, but it's only going to be a tad slower than the Fury X if those specs are correct. How can they say it's the same performance as a 290X???

It's faster on paper than the Fury Pro???


----------



## swiftypoison

The Asus is in stock right now. Thinking about getting one to replace my 980 Kingpin... ahhhhh


----------



## aznever

Asus Strix R9 Fury is available for order @ newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121975


----------



## Joshy Ocuk

OK, I have some scores for Fire Strike. I used a test SSD and didn't realise SystemInfo was unable to read my card's clocks; I'll also need to update the Vantage version. This will be done for my next runs.

This is my best FS score so far:
www.3dmark.com/fs/5476446 - 1195 MHz core / 595 MHz HBM
Extreme: www.3dmark.com/fs/5476467 - same clocks
Still difficult to maintain high clocks in these.

New Cloud Gate and Vantage scores:

www.3dmark.com/cg/2912011 - core 1220 MHz / 630 MHz HBM
Vantage (v1.1): www.3dmark.com/3dmv/5309828 - I think same clocks as CG


----------



## Neon Lights

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> OK, I have some scores for Fire Strike. I used a test SSD and didn't realise SystemInfo was unable to read my card's clocks; I'll also need to update the Vantage version. This will be done for my next runs.
> 
> This is my best FS score so far:
> www.3dmark.com/fs/5476446 - 1195 MHz core / 595 MHz HBM
> Extreme: www.3dmark.com/fs/5476467 - same clocks
> Still difficult to maintain high clocks in these.
> 
> New Cloud Gate and Vantage scores:
> 
> www.3dmark.com/cg/2912011 - core 1220 MHz / 630 MHz HBM
> Vantage (v1.1): www.3dmark.com/3dmv/5309828 - I think same clocks as CG


Was that on 1.453V?


----------



## Nizzen

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> OK, I have some scores for Fire Strike. I used a test SSD and didn't realise SystemInfo was unable to read my card's clocks; I'll also need to update the Vantage version. This will be done for my next runs.
> 
> This is my best FS score so far:
> www.3dmark.com/fs/5476446 - 1195 MHz core / 595 MHz HBM
> Extreme: www.3dmark.com/fs/5476467 - same clocks
> Still difficult to maintain high clocks in these.
> 
> New Cloud Gate and Vantage scores:
> 
> www.3dmark.com/cg/2912011 - core 1220 MHz / 630 MHz HBM
> Vantage (v1.1): www.3dmark.com/3dmv/5309828 - I think same clocks as CG


What about giving more info?


----------



## Joshy Ocuk

No, not able to use full volts yet due to the power limit throttling the card back under intense 3D load; hopefully I'll have the power limit sorted soon. All results were at +100mV in AB, giving 1.29/1.32V on the core. I really do believe it can do better than that in FS - I just don't think the drivers are up to it yet, or fixing the power limit could sort it, if it's not the drivers.


----------



## By-Tor

Quote:


> Originally Posted by *xer0h0ur*
> 
> FWIW: http://wccftech.com/amd-r9-nano-pictures-unigine-heaven-benchmark/


Would be sweet in a mini ITX build..


----------



## tx12

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh my goodness, I want to put two of those little jokers in crossfire!
> 
> 890MHz with 4096 shaders is pretty hefty.


I can't believe 890 MHz in a 175W TDP. 650-750 MHz tops, with pre-binned low-leakage chips for a premium price.
And the Fury X2 is two Nanos in terms of clocks and binning.


----------



## xer0h0ur

Quote:


> Originally Posted by *tx12*
> 
> I can't believe 890 MHz in a 175W TDP. 650-750 MHz tops, with pre-binned low-leakage chips for a premium price.
> And the Fury X2 is two Nanos in terms of clocks and binning.


It's not hard to believe at all. Take a Hawaii XT and downclock it to 890 MHz and you can see its TDP drops like a rock. Now take into account that Fiji already has an improved power architecture, as does the 3xx series, and an underclocked Fiji XT hitting a TDP that low is entirely plausible.


----------



## xer0h0ur

Well... this is what was holding back keeping Fury / Fury X in stock:

http://wccftech.com/umc-begins-volume-production-tsv-hbm/

"Let me give a short overview of the HBM technology first. The interposer is made by UMC as well, and employs the TSV (the standard tech for coupling stacked dies) process for stacking DRAM dies. A CMOS redistribution layer is used to place micro bumps to make the dies communicate with the interposer while TSVs are drilled through. All this is done in the 300mm Fab 12 foundry (UMC) in Singapore. Here is the thing however, UMC hadn't entered into volume production of Through Silicon Vias until very recently (20th July 2015) and this meant that there was a very big bottleneck in the Radeon R9 Fury X supply chain."


----------



## Shatun-Bear

Aw man. If the Nano is around the same performance as the 290X or a little faster, that's very disappointing, as it is not going to be cheap. I'll take that WCCFtech article with a grain of salt though, obviously, as I think they were the ones who predicted the Nano would be within 10% of a stock Fury in terms of performance.


----------



## xer0h0ur

The statement from AMD was that it performs above a 290X while delivering 2x the performance per watt. I highly doubt it's merely going to match the 290X's performance; that wouldn't make any sense, since this thing can't possibly be cheap. At least we aren't too far away from its launch. We will know soon enough.


----------



## Neon Lights

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> No, not able to use full volts yet due to the power limit throttling the card back under intense 3D load; hopefully I'll have the power limit sorted soon. All results were at +100mV in AB, giving 1.29/1.32V on the core. I really do believe it can do better than that in FS - I just don't think the drivers are up to it yet, or fixing the power limit could sort it, if it's not the drivers.


And do you think you will be able to tweak the BIOS so that these power limits are removed?
Wouldn't it be best to just create a BIOS with a set voltage and power limit, so that the whole MSI Afterburner voltage probing or whatever isn't needed?


----------



## Maximization

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well ...this is what was holding back keeping Fury / Fury X in stock:
> 
> http://wccftech.com/umc-begins-volume-production-tsv-hbm/
> 
> "Let me give a short overview of the HBM technology first. The interposer is made by UMC as well, and employs the TSV (the standard tech for coupling stacked dies) process for stacking DRAM dies. A CMOS redistribution layer is used to place micro bumps to make the dies communicate with the interposer while TSVs are drilled through. All this is done in the 300mm Fab 12 foundry (UMC) in Singapore. Here is the thing however, UMC hadn't entered into volume production of Through Silicon Vias until very recently (20th July 2015) and this meant that there was a very big bottleneck in the Radeon R9 Fury X supply chain."


THIS IS AN OUTRAGE!!! The back order is killing me


----------



## Ceadderman

Have patience, young man. The Fury X is as young as my son -
both had the same launch date.









~Ceadder


----------



## huzzug

Quote:


> Originally Posted by *Ceadderman*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have patience young man. Fury X is as young as my son
> Both had the same launch date.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Does that mean you'd be overclocking him too?


----------



## Ceadderman

Hahaha yeah um he came that way. Lil dude has a pair of lungs on him.









~Ceadder


----------



## huzzug

Congrats buddy. So the count now is 8, 9 ...


----------



## fewness

Quote:


> Originally Posted by *Joshy Ocuk*
> 
> OK, I have some scores for Fire Strike. I used a test SSD and didn't realise SystemInfo was unable to read my card's clocks; I also need to update the Vantage version. This will be done for my next runs.
> 
> This is my best FS score so far:
> www.3dmark.com/fs/5476446 1195MHz core / 595MHz HBM
> Extreme: www.3dmark.com/fs/5476467 same clocks
> Still difficult to maintain high clocks in these.
> 
> New Cloud Gate and Vantage scores:
> 
> www.3dmark.com/cg/2912011 core 1220 / 630MHz HBM
> Vantage (v1.1) www.3dmark.com/3dmv/5309828 I think same clocks as CG


Care to share more about this? Is this done with special OC software, a modded BIOS, or something else?


----------



## Ceadderman

4 and no more. Got that taken care of on his birthday. No more bundles of joy for this boy. And I didn't have to get snipped.









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *fewness*
> 
> Care to share more about this? Is this done by a special oc software or mod BIOS or something else?


Go back and see his posts; he does it with Afterburner, though it's not working completely.
Quote:


> Originally Posted by *Joshy Ocuk*
> 
> 
> 
> 
> 
> there's your voltage control, still a little buggy. So to confirm: voltage is not 'locked', just the current ATI i2c driver is not fully 100% compatible, but it does work with Afterburner 3rd-party volt control
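For anyone trying to reproduce this: the "unofficial overclocking" switch lives in MSIAfterburner.cfg. The key names below are from memory of Afterburner 4.x builds, so treat them as a pointer and verify against your own install rather than copy-pasting blindly:

```ini
; MSIAfterburner.cfg fragment (assumed key names - check your version)
[ATIADLHAL]
; You must type the EULA sentence in yourself; Afterburner checks it verbatim.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 1 = enable extended clock limits on AMD cards
UnofficialOverclockingMode = 1
```

Third-party voltage control itself is toggled in the GUI under Settings > General; the cfg keys above only lift the clock sliders' official limits.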


----------



## Scorpion49

Got my Sapphire Fury Tri-X in today, hopefully this weekend I'll have time to fiddle with it beyond just running some basic benches: http://www.3dmark.com/fs/5484647


----------



## xer0h0ur

Interesting that 3dmark calls it a Fury X in your bench.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> Interesting that 3dmark calls it a Fury X in your bench.


Does the same for my trix


----------



## xer0h0ur

Has anyone confirmed yet whether Fury is hardware-gimped or software-gimped? In other words, would flashing Fury with the Fury X BIOS unlock those shaders?


----------



## fjordiales

Has anyone tried tri-fire benchmarks yet? I know there are some pics online but didn't see benchmarks. Also, is there even a benefit to that? Was thinking of 3x Fury since it fits, but it seems like it's just bragging rights.


----------



## Thoth420

If I am happy with my fury x after seeing the nano... totally building a mini itx system for fun.


----------



## xer0h0ur

Quote:


> Originally Posted by *fjordiales*
> 
> Has anyone tried trifire benchmarks yet? I know there are some pics online but didn't see benchmarks. Also, is there even a benefit on that? Was thinking of 3x fury since it fits but seems like it's just bragging rights


http://iyd.kr/753


----------



## lucasj1974

https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fiyd.kr%2F753&edit-text=&act=url

above link (translated)


----------



## en9dmp

Let's see what these can do in crossfire with blocks on....

To the guy after tri-fire benchmarks: amdmatt posted some a few days after release that showed diminishing returns beyond 2x cards. That's likely to improve over time though, especially when games start to properly use the DX12 capability. Right now it's probably pointless, and I would recommend CrossFire only. If in a few months games start properly scaling on multiple GPUs and sharing memory, then get an X2 and add it to your existing setup; otherwise get a third single Fury X then....


----------



## Orthello

Quote:


> Originally Posted by *en9dmp*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Let's see what these can do in crossfire with blocks on....
> 
> To the guy after tri-fire benchmarks, amdmatt posted some a few days after release that showed diminishing returns after 2x cards. That's likely to improve over time though especially when games start to properly use the dx12 capability. Right now it's probably pointless, and would recommend crossfire only. If in a few months games start properly scaling on multiple CPUs and sharing memory then get an x2 and add it to your existing setup, otherwise get a third single fury x then....


Man that form factor looks great .. so much power in such a small space









Be curious if you're going to overclock to see how you get on with those EK blocks.


----------



## Ceadderman

You really won't see a major increase running TriFire tbh. But hey if you can afford nearly $2k for three Furies and don't care about the expense, more power to ya.









~Ceadder


----------



## fjordiales

Quote:


> Originally Posted by *xer0h0ur*
> 
> http://iyd.kr/753


Thanks.
Quote:


> Originally Posted by *lucasj1974*
> 
> https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fiyd.kr%2F753&edit-text=&act=url
> 
> above link (translated)


Thanks. Was gonna ask my wife to translate it for me. LOL.
Quote:


> Originally Posted by *en9dmp*
> 
> 
> 
> 
> 
> Let's see what these can do in crossfire with blocks on....
> 
> To the guy after tri-fire benchmarks, amdmatt posted some a few days after release that showed diminishing returns after 2x cards. That's likely to improve over time though especially when games start to properly use the dx12 capability. Right now it's probably pointless, and would recommend crossfire only. If in a few months games start properly scaling on multiple CPUs and sharing memory then get an x2 and add it to your existing setup, otherwise get a third single fury x then....


I have 2 r9 fury arriving tomorrow that will be in X-fire.
Quote:


> Originally Posted by *Ceadderman*
> 
> You really won't see a major increase running TriFire tbh. But hey if you can afford nearly $2k for three Furies and don't care about the expense, more power to ya.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I probably need to upgrade my PSU before even adding a 3rd card. The SuperNOVA T2 1600 is my target but it feels like overkill.

Thank you guys/gals for the response. I will post pics as soon as I can. Also, I was able to order 2 from Newegg and "bypass" that rule of "Only 1 per customer".

2 accounts: mine (gets shipped to my house, paid with PayPal) and my wife's account (gets shipped to her work, which I will pick up tomorrow, paid with the other PayPal). If not, you gotta wait 48 hours, but as of today they're already out of stock. So... most likely the 3rd Fury will have to wait.


----------



## Ceadderman

Well that's one way to make sure everyone has access to Fury X.









I might pick one up before heading out to PDXLan, and pick up an EK block and install it onsite. Kinda doubt that will be able to happen (logistics and all to be considered), but it would be kewl to bend some tubing and include Fury in my loop in front of everyone.









~Ceadder


----------



## Thoth420

Almost done... thinking about swapping the rad positions, as the Swiftech tube you really can't see is very taut. So much so that it concerns me.

I can barely wiggle it... thoughts, advice? I am a novice builder.


----------



## Ceadderman

I would put the Fury's radiator on the front fan mount and put the Swiftech radiator at the exhaust point.

But that's just me. The Swiftech's radiator/reservoir should remain upright, since there is a bleeder valve on it; you could develop a leak there otherwise.









~Ceadder


----------



## p4inkill3r

You could move the Fury's radiator to the front of the case and move the Swiftech back a slot perhaps.


----------



## Thoth420

Thanks guys, I was just thinking a straight swap. Also thanks for the info on the Swiftech!









I was thinking of top-mounting the Fury X where the Swiftech rad is now, since the tubing is super long, and then putting the Swiftech in rear exhaust.

Kind of want to keep the front intake case fan in its spot and possibly add one below as well.


----------



## Ceadderman

Well tbh, you're better off mounting and keeping Fury as intake at that point.

If you mount it where the Swiftech radiator is, you will be fighting thermal airflow, and while you *could* mount it as exhaust, you would do better with it as intake with that GT. What's even better: you could put the GT side of that radiator as intake and mount your other fan in pull, since it's the weaker of the two, giving your card even better temps. That's *if* you can mount another fan to the other side, of course. If not, you can relocate it to the open spot you're already contemplating.









~Ceadder


----------



## en9dmp

Quote:


> Originally Posted by *Orthello*
> 
> Man that form factor looks great .. so much power in such a small space
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be curious if you're going to overclock to see how you get on with those EK blocks.


Here it is in my rig...




Without the blocks on I was only getting about a 20MHz overclock, and even then I was getting some crashes... Really need a lot more voltage, unfortunately. I seem to have got a couple of duff ones, most likely because I was lucky enough to get them on launch day.

Temps with the blocks on are insane. Barely going into the 40°C range, so hopefully with a lot more voltage I'll have some O/C headroom. Otherwise I can live with a totally silent stock rig...


----------



## tx12

Quote:


> Originally Posted by *xer0h0ur*
> 
> Has anyone confirmed yet if Fury is hardware gimped or software gimped? In other words if flashing Fury with Fury X's BIOS would unlock those shaders?


I'm still waiting for my Fury to arrive. Will do a tool ASAP. I can try to make something without actual hardware in hand, but that's a very error-prone way to do things.
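For context on what an unlock would actually buy, Fiji's published CU counts give the shader math below. This is purely illustrative arithmetic from the spec sheets, not a claim that the disabled CUs are soft-fused:

```python
# Fiji shader arithmetic: each GCN compute unit (CU) carries 64 stream processors.
SP_PER_CU = 64

fury_x_cus = 64   # full Fiji die (R9 Fury X)
fury_cus = 56     # R9 Fury ships with 8 CUs disabled

fury_x_sps = fury_x_cus * SP_PER_CU   # 4096 stream processors
fury_sps = fury_cus * SP_PER_CU       # 3584 stream processors

# What a successful BIOS-flash unlock would recover, IF the CUs are only soft-locked:
print(fury_x_sps - fury_sps)  # 512
```

Whether those 512 shaders are recoverable is exactly what tx12's tool would have to determine; laser-cut (hard-fused) CUs can't be brought back by any flash.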


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> 
> 
> Almost done... thinking about swapping the rad positions, as the Swiftech tube you really can't see is very taut. So much so that it concerns me.
> 
> I can barely wiggle it... thoughts, advice? I am a novice builder.


I see you have a Dark Power Pro 11. I have the DPP10. I think they are really awesome PSUs. I hope my 850W will have enough power for 2 Furys and my [email protected]


----------



## Gumbi

Quote:


> Originally Posted by *Alastair*
> 
> I see you have a Dark Power Pro 11. I have the DPP10. I think they are really awesome PSU's. I hope my 850w will have enough power for 2 Fury's and my [email protected]


AMD CPUs absolutely devour power, so it's gonna be tight, but you should be OK.

I have a Dark Rock Pro 2 CPU cooler - best air cooler I've ever owned, an amazing balance between performance and silence. be quiet! makes fantastic products!


----------



## royfrosty

I'm still waiting for Fury X water blocks from bitspower.

Anyone here has any info on when are they releasing? lol


----------



## Talon720

Quote:


> Originally Posted by *fjordiales*
> 
> Thanks.
> Thanks. Was gonna ask my wife to translate it for me. LOL.
> I have 2 r9 fury arriving tomorrow that will be in X-fire.
> I probably need to upgrade my PSU before even adding a 3rd card. Supernova T2 1600 is my target but feels like an overkill.
> 
> Thank you guys/gals for the response. I will post pics as soon as I can. Also, I was able to order 2 from Newegg and "bypass" that rule of "Only 1 per customer".
> 
> 2 accounts, mine(gets shipped to my house with Paypal) and my wife's account(gets shipped to her work which I will pick up tomorrow other Paypal). If not, you gotta wait 48hours but as of today, they're already out of stock. So... Most likely the 2rd fury will have to wait.


After running tri-fire 290X on an EVGA 1300W G2, I'd go 1600W if I were you. Also, if it is a little overkill, then the PSU's fan profile won't be at full RPM. There have been times when really pushing my CPU and GPUs that I'm pretty sure I exceeded the 1300W and it shut off. As for getting the T2, that's not necessary; the 1600W G2 or P2 would work just the same, though I gotta believe the more efficient models run cooler. If I could do it again I'd go for the 1600W P2, and if I had the money I'd go with a T2. Not like there's a drawback other than a hole in the wallet.


----------



## fjordiales

Quote:


> Originally Posted by *Talon720*
> 
> After running tri-fire 290X on an EVGA 1300W G2, I'd go 1600W if I were you. Also, if it is a little overkill, then the PSU's fan profile won't be at full RPM. There have been times when really pushing my CPU and GPUs that I'm pretty sure I exceeded the 1300W and it shut off. As for getting the T2, that's not necessary; the 1600W G2 or P2 would work just the same, though I gotta believe the more efficient models run cooler. If I could do it again I'd go for the 1600W P2, and if I had the money I'd go with a T2. Not like there's a drawback other than a hole in the wallet.


I'm still thinking about it anyway but as far as pricing, these got me thinking especially on what you said.

http://www.amazon.com/EVGA-SuperNOVA-TITANIUM-Crossfire-220-T2-1600-X1/dp/B00R33ZBQU/ref=sr_1_1?ie=UTF8&qid=1437747659&sr=8-1&keywords=supernova+1600

http://www.amazon.com/EVGA-SuperNOVA-PLATINUM-Crossfire-220-P2-1600-X1/dp/B00NJG61JQ/ref=sr_1_2?ie=UTF8&qid=1437747659&sr=8-2&keywords=supernova+1600

http://www.amazon.com/EVGA-SuperNOVA-Crossfire-Warranty-120-G2-1600-X1/dp/B00MMLUIE8/ref=sr_1_3?ie=UTF8&qid=1437747659&sr=8-3&keywords=supernova+1600

Prices are not that far apart, and it's still a plan I'm thinking about. I'm doing more research on whether tri-fire is OK in my Air 540 and Maximus VI Z97 mobo. Very tight fit for sure. I still have to see the spacing, then I'll decide. Thanks for the input though.


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Well tbh, you're better off mounting and keeping Fury as Intake at that point.
> 
> If you mount it where the Swiftech Radiator is you will be fighting thermal airflow and while you *could* mount as Exhaust you would do better in Intake with that GT. What's even better is that you could put the GT side of that Radiator as Intake and mount your other fan in Pull since it's the weaker of the two. Giving your card even better temps. Of course that's *if* you can mount another fan to the other side of course. If not you can relocate it to the open spot you're already contemplating.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I don't think I can set it up push pull but thanks for the advice. I will try her in the front as intake.

Also to the others in regard to the remarks about BeQuiet...great company!


----------



## josephimports

My previous statement about unlocking memory OC on the second GPU was inaccurate. One must disable the first GPU and apply "extend official overclocking limits" to the second GPU alone, then restart as normal and it should work.


----------



## Talon720

Quote:


> Originally Posted by *fjordiales*
> 
> I'm still thinking about it anyway but as far as pricing, these got me thinking especially on what you said.
> 
> http://www.amazon.com/EVGA-SuperNOVA-TITANIUM-Crossfire-220-T2-1600-X1/dp/B00R33ZBQU/ref=sr_1_1?ie=UTF8&qid=1437747659&sr=8-1&keywords=supernova+1600
> 
> http://www.amazon.com/EVGA-SuperNOVA-PLATINUM-Crossfire-220-P2-1600-X1/dp/B00NJG61JQ/ref=sr_1_2?ie=UTF8&qid=1437747659&sr=8-2&keywords=supernova+1600
> 
> http://www.amazon.com/EVGA-SuperNOVA-Crossfire-Warranty-120-G2-1600-X1/dp/B00MMLUIE8/ref=sr_1_3?ie=UTF8&qid=1437747659&sr=8-3&keywords=supernova+1600
> 
> Prices are not that far apart & It's still a plan I'm thinking about. I'm doing more research if it's ok for trifire in my Air 540 & maximus vi z97 mobo. Very tight fit for sure. T still have to see the spacing then I'll decide. Thanks for the input though.


Well, I have a very similar setup: 540 Air, Z87 Formula, 3 cards. I eventually mounted the case to a file caddy, cut out the bottom and added a 3rd 45mm 240 rad in push/pull. For the most part, though, the 30mm 360 in push/pull in the front and a 30mm 240 push/pull at the top worked fine at stock, just not when OCing. Also, with Fury X being short, you could get away with a thicker front rad if you chose to go down that route. My biggest complaint about my tri-fire system is the heat it produces. I have my own window AC unit in addition to central air, but Fury X is a little more efficient, which should help. You could probably get away with keeping that AIO on and putting all the rads up front. T2 price has come down a lot.


----------



## Scorpion49

I'm liking this Fury Tri-X that I got but man the coil whine is awful. I've never had a card this loud, I can hear it in my office from the kitchen if I pause a game to go get a drink or something.


----------



## fjordiales

Quote:


> Originally Posted by *Talon720*
> 
> Well I have very similar setup 540 air z87 formula I got 3 cards, and eventually mounted the case to a file caddy cut out the bottom and added a 3rd 45mm 240 rad in push/pull. For the most part though the 30mm 360 in push/pull in the front and a 30mm 240 push/pull at the top worked fine at stock just not when ocing. Also with fury x being short you could get away with a thicker front rad if you chose to go down that route. My biggest complaint of my tri Fire system is the heat produced by it. I have my own window ac unit in addition to central air but fury x is a little more efficient that should help. Could probably get away with keeping that aio on and putting all the rads up front. T2 price has come down a lot.


Nice setup. I stayed away from water cooling since I have bad luck with it. Also, I had a typo: I have the same board as yours, Z87 Formula, but with a 4790K.

I'll have to plan it out especially with heat output from trifire.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> I'm liking this Fury Tri-X that I got but man the coil whine is awful. I've never had a card this loud, I can hear it in my office from the kitchen if I pause a game to go get a drink or something.


Man, I hear so much about the coil whine on these cards.... not typical for AMD at all, however I have had several NVIDIA cards with coil whine over the years.

Maybe this is AMD's way of competing??


----------



## xer0h0ur

That is precisely why, if I were in the market for a Fury, it would have been the Strix. 12-phase power and premium components. No coil whine.


----------



## Cool Mike

Not sure anyone has mentioned this.
Fury x available at Newegg. Powercolor


----------



## Thoth420

Well, I hit another stall... no way to configure this setup without the tubing looking like crap or without stretching something. I can't even top- or front-mount this GPU rad in this case... IDK who the moron is, AMD, Swiftech or Fractal, but GG. It won't front-mount as intake either... Two small rads in a full ATX and only one bad option...


----------



## p4inkill3r

You cannot remove the drive cage and mount the Fury's radiator to the bottom of the case?


----------



## BaddParrot

Quick question here. I installed my new Fury X a couple of days ago. Everything seems to be running fine/great (no noise etc).
When I loaded GTA 5, I went into settings to make sure everything was still maxed out. I noticed this in the video memory:

2760 MB / 8384386 MB, and the bar was empty! (Note: I have not grabbed a 4K monitor yet, so this is at 1080p, and I had to use windowed mode to take the screenshot, but the numbers were basically the same in fullscreen mode.)



Is the game (GTA5) confused by the new HBM??


----------



## fjordiales

Hey guys, remember when I got this?


I got impatient, and the color scheme isn't my type, so I got these instead... A bit cheaper too since I was not taxed.


"Unboxing"





Size "comparison"






Installed in System







System Noise


----------



## By-Tor

Very Sexy...


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> You cannot remove the drive cage and mount the Fury's radiator to the bottom of the case?


I need it for my drives.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> That is precisely why if I were in the market for a Fury it would have been the Strix. 12 phase power and premium components. No coil whine.


That doesn't mean there won't be coil whine; my 290X Lightning and 780 Lightning both had coil whine. There are so many contributing factors, and when you're pulling as much power as these cards do, there is never a guarantee coil whine won't be present. Both my G1 970 and MSI Gaming 970 had the same amount of whine as my Tri-X.


----------



## Neon Lights

Aqua Computer water blocks got delivered to me today; a few hours ago I mounted them and integrated them into my loop (which is shared with my mainboard water blocks). Here is how it looks:




I have to admit, though, that I totally stripped part of the threads in the Aqua Computer kryoConnect. The set screw that I screwed into it to direct the water flow just would not go into the thread past the gap where the opening of the water outlet is. I could not get it out again by turning it backwards either, so I tried to get it as far as possible into the thread. It is now screwed in totally wrong, but there does not seem to be a problem with the water flow.


----------



## xer0h0ur

Quote:


> Originally Posted by *rv8000*
> 
> That doesn't mean there won't be coil whine, my 290x Lightning and 780 Lightning both had coil whine. There are so many contributing factors, and when you're pulling as much power as these cards do there is never a guarantee coil whine won't be present. Both my G1 970 and MSI Gaming 970 had the same amount of whine as my TriX.


Did you even read a review of the Strix?

"The ASUS STRIX R9 Fury uses ASUS' custom 12-phase Super Alloy Power II capacitors. This means 2.5X extended lifespan, Super Alloy Power II MOSFETs for lower temperature and increased efficiency and *Super Alloy Power II Chokes for reduced noise with concrete cores.*"

http://www.hardocp.com/article/2015/07/10/asus_strix_r9_fury_dc3_video_card_review#.VbMao5XbLTg

Sure, it may not eliminate coil whine altogether, but there isn't a chance in hell it's going to be the same as the reference board.


----------



## Thoth420

Can anyone recommend an AIO cooler that will fit in the same position (front top bay, or if 240 or 280, then front and mid) as the Swiftech pictured here? One of the 140X's tubes is too tight to leave like this, and the case won't allow top-mounting the Fury X rad. Front is not an option.

Long, preferably thin, tubes would be great, and quiet over performance. I want the tubes as clean as possible.

Thanks in advance.


----------



## en9dmp

I seem to have literally unleashed the Fury after putting my EK blocks on... With the stock AIO cooler I couldn't get even 20MHz more on either card without a crash; now I've been able to crank up to 1120 on each card without any signs of artifacting or crashing in Fire Strike Ultra. Not tried any games yet though, so that could be a different story.



Still can't get the second card's memory slider to move, though, or the power limit to change through CCC or Afterburner... Is there a better third-party app available yet?


----------



## xer0h0ur

Did you enable unofficial overclocking?


----------



## New green

Is there a better BIOS to use for the Fury Strix to take advantage of its 10+2 power phases? Also, does anyone know if the Fury Strix can be unlocked by flashing the Fury X BIOS, or would that only work on a reference Fury, if it works at all?


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> Can anyone recommend an AIO cooler that will fit in the same position(front top bay OR if 240 or 280 then front and mid) as the swiftech pictured here? One of the 140x tubes are too tight to leave like this and the case won't allow top mounting of the fury x rad. Front is not an option.
> 
> Long pref thin tubes would be great and quiet over performance. I want the tubes as clean as possible.
> 
> Thanks in advance


I thought the whole point of that thing is that it's expandable? Just get some tubing, lol. Make it how you want it for way cheaper than buying another AIO.


----------



## bonami2

Can anyone tell me the power consumption of 2 Fury X maxed out, plus a CPU like an AMD FX-8000 or a 5820K/5960X (the high-power-consumption kind), maxed out?

I'm looking to get an idea of the PSU needed.

At the same time, has anyone tried 4K on x4 PCI Express? I can do SLI at x8/x8 or CrossFire at x4/x4 + x4, I think, with my Z97 Gaming 7, and I want to push the 7950 to [email protected] until it blows up.


----------



## Scorpion49

Quote:


> Originally Posted by *bonami2*
> 
> Anyone can tell me the power comsumption of 2 fury x max out + CPU like amd fx 8000 or 5820k 5960x kind of high power comsumption maxed out?
> 
> Im looking to give me an idea of psu needed
> 
> At the same time anyone tried 4k with 4x pci express ? i can do sli 8x 8x or crossfire x4 x4 + x4 i think with my z97 gaming 7 and i want to put the 7950 to [email protected] until it blow up


You should be looking for a quality ~1000W PSU for that kind of setup. My 5820K with 290X CrossFire could pull around 750W with everything loaded up.

Also, I don't know about Fury, but I was just running my 290Xs at x8/x4 on my Z97 board (my SSD takes x4 lanes on the M.2 Ultra) and I had no issues with games/benches at 4K. I was hesitant at first, but I guess my fears were for nothing.
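For a rough sanity check on that ~1000W figure, here's the back-of-the-envelope math using published TDP/board-power numbers (Fury X ~275W typical board power, 5820K 140W TDP). Treat every value as a ballpark assumption; overclocked transient draw runs higher:

```python
# Rough PSU sizing for 2x Fury X + a 5820K, from published TDP/board-power figures.
# Real transient draw (especially overclocked) can exceed these, hence the margin.
fury_x_board_power = 275   # W, AMD's typical board power for Fury X
cpu_tdp = 140              # W, i7-5820K TDP (more when overclocked)
rest_of_system = 100       # W, board, fans, pumps, drives (rough estimate)

load = 2 * fury_x_board_power + cpu_tdp + rest_of_system  # -> 790 W sustained
headroom = 1.3             # ~30% margin keeps the PSU in its efficiency sweet spot

recommended = load * headroom
print(load)                # -> 790
print(round(recommended))  # ~1027, so a quality 1000-1200 W unit fits
```

That also matches Talon720's experience: a third card adds another ~275W on top, which is where a 1300W unit starts getting marginal under OC.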


----------



## bonami2

Quote:


> Originally Posted by *Scorpion49*
> 
> You should be looking for ~1000W quality PSU for that kind of setup. My 5820K with 290X Crossfire could pull around 750W with everything loaded up.
> 
> Also, I don't know about Fury but I was just running my 290X's at x8/x4 on my Z97 board (my SSD takes x4 lanes on the m.2 ultra) and I had no issues with games/benches at 4K. I was hesitant at first but I guess my fears were for nothing.


That's a great example; the 290X has almost as much bandwidth as the Fury X.









Well, I'm looking at it overkill-wise, thinking it's probably the last build I will be able to afford, haha.

The EVGA Platinum 1200W P2, or the G2 1600W, but it's almost 2x the price.









By 2016 I should have the money.

Thank you


----------



## bonami2

Quote:


> Originally Posted by *Thoth420*
> 
> Can anyone recommend an AIO cooler that will fit in the same position(front top bay OR if 240 or 280 then front and mid) as the swiftech pictured here? One of the 140x tubes are too tight to leave like this and the case won't allow top mounting of the fury x rad. Front is not an option.
> 
> Long pref thin tubes would be great and quiet over performance. I want the tubes as clean as possible.
> 
> Thanks in advance


Well, maybe look at the Corsair units; they have great warranty and support. I ain't a fanboy anyway, I just trust what works.


----------



## Scorpion49

Quote:


> Originally Posted by *bonami2*
> 
> That a great example the 290x has almost as much bandwith as the fury x.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well im looking overkill wise thinking it probably the last build i will be able to afford ahah
> 
> The EVGA plat 1200w p2 or the g2 1600w but it almost 2 x the price
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be 2016 i should have the money
> 
> Thank you


No problem. I'm actually excited for Skylake because I can get 20 PCIe 3.0 lanes from the PCH and run my SSD off of that instead of using the x16 lanes direct to the CPU. I'm hoping to ditch the two 290Xs and get a second Fury.
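For anyone weighing up the lane-count question above, the raw numbers work out as follows. This is a sketch from the PCIe 3.0 spec's 8 GT/s line rate and 128b/130b coding; real-world throughput is a bit lower after protocol overhead:

```python
# Theoretical PCIe 3.0 throughput per link width.
LINE_RATE = 8e9        # 8 GT/s per lane
ENCODING = 128 / 130   # usable fraction after 128b/130b line coding

per_lane_GBps = LINE_RATE * ENCODING / 8 / 1e9  # bits -> bytes -> GB/s

for lanes in (4, 8, 16):
    print(f"x{lanes}: {lanes * per_lane_GBps:.2f} GB/s")
```

So an x4 slot still offers ~3.9 GB/s, which is why the x8/x4 CrossFire setup held up fine at 4K: games rarely saturate even that between frames.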


----------



## royfrosty

Finally, after much consideration, with all the funds I had left, I managed to squeeze out another R9 Fury X.









Warning large image ahead.

Yet again the wrong memory bandwidth printing.


Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/royfrosty/media/Mobile Uploads/20150725_191646_zpsrbfqy907.jpg.html



Managed to get the revised pump...


Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/ro...oads/IMG-20150725-WA0013_zpsxmxxfggm.jpg.html



Sorry it looks like a Frankenstein setup.


Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/royfrosty/media/20150726_000043_zpstyhrashz.jpg.html



Running on....

MSI X99S SLI Plus + i7-5820k (4.5ghz 1.226v) <--- Cooled by Noctua NH-U12S with 2x NF-12 vomit color fans
Gskills Ripjaws 4 3000mhz 4x4gb kit
Crucial m550 512gb
2x Sapphire R9 Fury X (1110mhz core 570mhz mem)
Seasonic P1000 1000w 80+ Platinum with custom white sleeves



Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/ro...mem_CPU44_Firestrike_1.1_zpswddyaiar.png.html


----------



## xer0h0ur

Quote:


> Originally Posted by *royfrosty*
> 
> Finally, after much consideration. With all the funds i had left, I have managed to squeeze out another r9 Fury X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Warning large image ahead.
> 
> Yet again the wrong memory bandwidth printing.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/royfrosty/media/Mobile Uploads/20150725_191646_zpsrbfqy907.jpg.html
> 
> 
> 
> Managed to get the revised pump...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/ro...oads/IMG-20150725-WA0013_zpsxmxxfggm.jpg.html
> 
> 
> 
> Sorry it look like a Frankenstein setup.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/royfrosty/media/20150726_000043_zpstyhrashz.jpg.html
> 
> 
> 
> Running on....
> 
> MSI X99S SLI Plus + i7-5820k (4.5ghz 1.226v) <--- Cooled by Noctua NH-U12S with 2x NF-12 vomit color fans
> Gskills Ripjaws 4 3000mhz 4x4gb kit
> Crucial m550 512gb
> 2x Sapphire R9 Fury X (1110mhz core 570mhz mem)
> Seasonic P1000 1000w 80+ Platinum with custom white sleeves
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/ro...mem_CPU44_Firestrike_1.1_zpswddyaiar.png.html


I salute you sir. Ramen noodles for a month if need be. Amirite?


----------



## Scorpion49

I just tried the FFXIXIXIXIXIXIXLXLXLXIXLXXIXLXLXIXXXIULXIXWASDOMGBBQ bench (seriously, how many of those games are there even?) and my Fury absolutely crushed my 290X, not even remotely close. My 290X at 1100MHz under water scored ~8500 points max, and the Fury managed to crack this off:










I guess that tessellation performance borrowed from Tonga paid off.


----------



## xer0h0ur

Unfortunately, even still, AMD's architecture lags considerably behind Maxwell in tessellation performance. It's even worse at particle effects. These are things NVIDIA is pushing hard in GameWorks to keep gimping AMD's performance in games.


----------



## royfrosty

Quote:


> Originally Posted by *xer0h0ur*
> 
> I salute you sir. Ramen noodles for a month if need be. Amirite?


lol, probably just rice with Soy sauce.


----------



## Thoth420

Quote:


> Originally Posted by *Scorpion49*
> 
> I thought the whole point of that thing is its expandable? Just get some tubing lol. Make it how you want it for way cheaper than buying another AIO


It's all new hardware and the thing is also brand new. If I have to modify it on day 1, no thanks. Also there is stuff floating inside. I think for the price it's going back... What I don't get is that the 240X has the same tube length... seems moronic that it isn't just a tad longer, expandable or not.


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> It's all new hardware and the thing is also brand new. If I have to modify it day 1 no thanks. Also there is stuff floating inside. I think for the price it's going back...what I don't get is that the 240x has the same tube length..seems moronic that it isn't just a tad longer...expandable or not.


Yeah, I get ya. I personally won't ever touch one of those Swiftech units again. I have two dead 220Xs and a dead Glacer 240L, which is the same thing. They all wore the bearing out in the first 2-3 weeks of use and had tons of black nasty crap in the loop. Never again.


----------



## Ceadderman

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> I thought the whole point of that thing is its expandable? Just get some tubing lol. Make it how you want it for way cheaper than buying another AIO
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's all new hardware and the thing is also brand new. If I have to modify it day 1 no thanks. Also there is stuff floating inside. I think for the price it's going back...what I don't get is that the 240x has the same tube length..seems moronic that it isn't just a tad longer...expandable or not.

Shoot me a pic of that reservoir in PM.









~Ceadder


----------



## Shatun-Bear

Quote:


> Originally Posted by *royfrosty*
> 
> Finally, after much consideration. With all the funds i had left, I have managed to squeeze out another r9 Fury X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/royfrosty/media/Mobile Uploads/20150725_191646_zpsrbfqy907.jpg.html
> 
> 
> 
> Managed to get the revised pump...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/ro...oads/IMG-20150725-WA0013_zpsxmxxfggm.jpg.html
> 
> 
> 
> Sorry it look like a Frankenstein setup.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/royfrosty/media/20150726_000043_zpstyhrashz.jpg.html
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s995.photobucket.com/user/ro...mem_CPU44_Firestrike_1.1_zpswddyaiar.png.html


Wow, great build there. +REP for that.


----------



## Scorpion49

Here is what my card sounds like after a few days of 4K games/benches to break in lol:


----------



## xer0h0ur

Have you by any chance tried to leave it at load overnight? Basically use whatever causes the worst coil whine for you and leave it idling on that for a night or two. It doesn't have to be something that necessarily causes high temps either. I used the CS:GO menu and uncapped the framerate so it was shooting into the thousands, which in itself didn't really spike my temps but gave me coil whine. Left it like that overnight for two nights. My coil whine is now nearly inaudible unless I shut off all my radiator fans, which aren't loud to begin with.


----------



## Scorpion49

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you by any chance tried to leave it at load over night? Basically use whatever it is that causes the worst coil whine for you and leave it idling on that for a night or two. It doesn't have to be something that necessarily causes high temps either. I used the CS:GO menu and uncapped the framerate so it was shooting into the thousands which in itself didn't really spike my temps but gave me coil whine. Left it like that over night for two nights. My coil whine is nearly inaudible anymore unless I shut off all my radiator fans which aren't loud to begin with.


Yeah, I left it on the Valley closing screen, which causes it to wail like a banshee. It's better than it was out of the box but still not what I would call good. It doesn't really bother me that much, but I may try to get it replaced if I can get my hands on a second one for the meantime.


----------



## xer0h0ur

Well if you're seeing improvement it may be worth leaving it idling another night. That is why I did it for two nights. It worked well the first time and I left it a second night to see if it would be any better and it was.


----------



## zhoulander

Might be a bit late to the party, but I put together a video about my Fury X including recordings of fan, pump (first production run, with the whine), and coil noise (like Scorpion49 says, the Unigine Valley exit splash screen causes mine to wail like a banshee).

I also have full recordings of Unigine Valley test for the 780Ti & Fury X that I can upload if there is interest.


----------



## xer0h0ur

As for the guys with the revised pump, are any of you experiencing this level of coil whine? I wonder if AMD also changed the chokes on the reference boards after the pump noise negative PR.

Edit: Not to nitpick, but you weren't testing the fans at an equal distance from the microphone. Sure, it's not a big deal, but it's not the exact same test.


----------



## royfrosty

No coil whine or pump whine on either of my cards. Guess I'm lucky.

Coil whine? Hmm, only if it outputs extremely high frame rates, like above 200 FPS in old game titles. Then yes, but it's not the very loud kind.


----------



## zhoulander

Quote:


> Originally Posted by *xer0h0ur*
> 
> Edit: Not to nitpick, but you weren't testing the fans at an equal distance from the microphone. Sure its not a big deal but its not the exact same test.


No prob, I appreciate the feedback. In practice the Fury X radiator fan didn't add an appreciable amount of noise during load like the 780Ti Reference fan did which is what I was trying to show, and the audio recordings matched closely to what I heard standing behind the camera. The 1-2 combo of pump whine and coil whine is why I opted to return the card, even though I got in on the early TigerDirect deal. It would have just sat on the shelf until VisionTek had a replacement unit ready, and who knows how long that would have been.


----------



## Jflisk

My second PowerColor Fury X is on its way.


----------



## xer0h0ur

Quote:


> Originally Posted by *zhoulander*
> 
> No prob, I appreciate the feedback. In practice the Fury X radiator fan didn't add an appreciable amount of noise during load like the 780Ti Reference fan did which is what I was trying to show, and the audio recordings matched closely to what I heard standing behind the camera. The 1-2 combo of pump whine and coil whine is why I opted to return the card, even though I got in on the early TigerDirect deal. It would have just sat on the shelf until VisionTek had a replacement unit ready, and who knows how long that would have been.


Oh absolutely. I can't blame you one bit for returning it. I would have too. One does not pay that much money for pump noise + coil whine. At least in my case I wouldn't have cared about the pump noise since I waterblock my cards anyway, but the coil whine? No thanks.


----------



## tx12

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you even read a review of the Strix?
> 
> "The ASUS STRIX R9 Fury uses ASUS' custom 12-phase Super Alloy Power II capacitors. This means 2.5X extended lifespan, Super Alloy Power II MOSFETs for lower temperature and increased efficiency and *Super Alloy Power II chokes for reduced noise with concrete cores.*"


ASUS kills the idea of Fury's compactness. With the reference design you can add a waterblock to get a small card, but the Strix is huge and nothing can be done about that.
Sure, a 12-phase design helps reduce coil current (and allows cheaper MOSFETs), so coil noise should generally be reduced.
Still, there is no such thing as a silent coil. Even when not audible, coils may still be hitting you with undetectable ultrasound.


----------



## royfrosty

Quote:


> Originally Posted by *en9dmp*
> 
> I seem to have literally unleashed the Fury after putting on my EK blocks... With the stock AIO cooler I couldn't get even 20Mhz more on either card without a crash, now I've been able to crank up to 1120 on each card without any signs of artifacting or crashing in Firestrike Ultra. Not tried any games yet tho, so that could be a different story
> 
> 
> 
> Still can't get the second card's memory slider to move though, or the power limit to change through CCC or Afterburner... Is there a better third-party app available yet?


Yeah, I'm having issues moving the slider on one GPU rather than the other.

Not sure what the issue is.


----------



## TK421

sooooooo

voltage unlock yet? :v


----------



## richie_2010

Can someone advise me on what LEDs are used for the Radeon sign and its power source? I can't make out if it's a single diode or a glass-style strip.

I'm currently in the process of making backplate LED kits from acrylic, and wonder if I could modify the power to come from this connector or not.


----------



## xer0h0ur

Quote:


> Originally Posted by *tx12*
> 
> Asus kills the idea of fury's compactness. With ref design you can add a waterblock to get small card but strix is huge and nothing could be done with that.
> Sure 12-phase design helps to reduce coil current (and to use cheaper mosfets), so generally coil noise should be reduced.
> Still, there are no such thing as a silent coil. Even then non-audible, coils may still be killing you with undetectable ultrasounds.


LOL, as if the Tri-X is any shorter...next time try making sense.


----------



## xer0h0ur

Quote:


> Originally Posted by *richie_2010*
> 
> can someone advise me on what leds are used for the radeon sign and its power source, i cant make out if its a single diode or a glass style strip.
> 
> im currently in the process of making back-plate led kits from acrylic and wonder if i could modify the power to come from this connector or not


It's the exact same logo used on the 295X2; it's a strip.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, as if the Tri-X is any shorter...next time try making sense.


Instead of being rude, because English doesn't seem like his first language, be patient and try to make sense of his post. One of his points was that when using a waterblock the Tri-X would be significantly shorter than the custom PCB on the Strix.

The majority of what he posted is correct; it simply isn't expressed in the easiest way to understand.


----------



## xer0h0ur

There will be waterblocks made for the Strix at some point by at least one of the many block manufacturers. There is no chance in hell they will only make one revision for the reference PCB and call it done, especially considering that this is literally the only non-reference PCB and it's better suited for overclocking.


----------



## Scorpion49

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL, as if the Tri-X is any shorter...next time try making sense.


The card itself is shorter; it's a reference PCB. The Strix is shorter counting the air cooler, but if you waterblock it the Tri-X is only 6 inches long. I think that's what he meant.


----------



## xer0h0ur

Quote:


> Originally Posted by *Scorpion49*
> 
> The card itself is shorter, its a reference PCB. The strix is shorter counting the air cooler but if you waterblock it the Tri-X is only 6 inches long. I think thats what he meant.


Yeah, sure, I get that, but if people are following the conversation then they realize it all began over a comment about which of the two AIR COOLED Furys I would have picked and why: little to no coil whine, better components, 12-phase power.


----------



## Scorpion49

Need some help guys, what is the best way to re-TIM this thing? I'm thinking a big pea-sized dot for the core and tiny ones for the HBM stacks? They put a whole tube of TIM onto my card and it was like chalk, and the VRM pad crumbled to dust as well. Good thing I still have some Fujipoly pads left over.

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah sure I get that but if people are following the conversation then they realize it all began over a comment made about which of the two AIR COOLED Furys I would have picked and why. Little to no coil whine, better components, 12 phase power.


I understand. I kind of wish I got the Strix as well, but they're so rarely in stock I just got what I could.


----------



## xer0h0ur

Honest question: does using that much TIM produce adverse effects? Was the card getting hot on you?

As for your question, I normally use the big-pea method for dies, but with the EK blocks I just followed their instructions and used the double-X method they called for. Over several block mounts/re-mounts that turned out to give good results, although it is quite a bit more TIM than the big-pea method uses.


----------



## Scorpion49

Quote:


> Originally Posted by *xer0h0ur*
> 
> Honest question, does using that much TIM produce adverse effects? Was the card getting hot on you?
> 
> As for your question, I normally use the big pea method for dies but with the EK blocks I just followed their instructions and used the double X method they called for which on several block mounts/re-mounts turned out to give good result although it is quite a bit more TIM than using the big pea method.


Getting hot? Not particularly, but there isn't an HBM temp sensor that I know of, and one of the stacks was only half covered even with all of that stuff on there. I was just curious to see what they did. Too much TIM is bad; it's an insulator when you get more than a thin layer.

EDIT: I also noticed there is a tiny chip in the core by one of the HBM stacks. I don't like that.

This is what I ended up doing:


----------



## PEJUman

A long-time lurker, noob poster here. I've been tinkering with PCs since the i486 days, and generally stay quiet unless I have something to contribute.
I finally got my R9 FURY Tri-X (Sapphire non-OC version) yesterday. I had some time for benching and was generally able to verify the online benches from Anand/Tom's/TechReport etc.

The next step was to test the Fury X BIOS. I grabbed and flashed both BIOSes (AMD & ASUS) from TechPowerUp:
http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=&model=R9+Fury+X&interface=&memType=&memSize=

The old R9 290 WinFlash doesn't work, but I was able to get a newer atiwinflash from ASUS GPU Tweak II that does:
https://www.asus.com/Graphics-Cards/R9FURYX4G/HelpDesk_Download/
If you install GPU Tweak II, it includes an atiwinflash from 11/2014 that seems to work (more on this below).

So far, it is looking like a hardware-locked chip: my shader count remains at 3584 with either Fury X BIOS, and my 3DMark score went from 3600ish to 3800ish. This corresponds to the 1050 MHz core of the Fury X (vs. 1000 MHz on my stock Fury). I was expecting 4000ish if it were to unlock correctly.

Cautionary note: I am not responsible for any damage if you follow my steps below.

Make sure to grab a stock copy of your BIOS before flashing. The FURY and FURY X have the normal dual-BIOS switch. I used the 'fast' BIOS position to play with (this is the one closer to the DisplayPort connectors).

The normal command to flash, from an admin cmd prompt in Win 8.1 x64, works:
atiwinflash -f -p 'x' 'y.rom'

where 'x' is the adapter number (for most people this will be 0) and 'y.rom' is the 128KB ROM file.

Confirmed: the new flash changes the BIOS version and the default clock switches to 1050 MHz. A restart is required to complete the flash.

***IMPORTANT***
COLD BOOT NO LONGER WORKS AFTER FLASHING. I CAN ONLY COLD BOOT WITH THE 2ND STOCK BIOS SWITCH ENABLED ('NORMAL' OR BACK BIOS POSITION). RESTARTS FROM WINDOWS WORK FINE. WHAT THIS MEANS:

IN FAST POSITION: HOT RESTARTS ONLY. EVERYTHING WORKS BUT SHADER COUNT REMAINS AT 3584. COLD BOOT FAILS WITH A BLANK DISPLAY AND BOOT DEBUG CODE b2 ON ASROCK X99M + XEON 1660V3.
IN NORMAL POSITION (STOCK, UNTOUCHED): COLD BOOT AND HOT RESTARTS WORK.
WORKFLOW: LEAVE BIOS AT 'NORMAL' POSITION >>> BOOT >>> SWITCH TO 'FAST' IN WINDOWS >>> atiwinflash -f -p 'x' 'y.rom' >>> RESTART >>> BENCHES, GPU-Z, ETC >>> REPEAT WITH 2ND BIOS.
I FLASHED THE 'FAST' SAPPHIRE TRI-X OEM BIOS BACK TO THE 'FAST' POSITION, BUT THE COLD BOOT ISSUE ABOVE REMAINS. MY GUESS IS AMD IMPLEMENTED A CHECKSUM COUNTER/MEMORY TRAINING STEP THAT THE 11/14 ATIWINFLASH FAILS TO ACCOUNT FOR.

Hope this helps you guys. I am open to testing anything reasonable/technically sound to further this investigation.
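For anyone scripting the procedure above, a size sanity check on the ROM file before pointing atiwinflash at it is cheap insurance against flashing a truncated download. This is only a sketch: `check_rom` is a hypothetical helper name, and the 131072-byte (128KB) size is taken from the ROM size mentioned in the post; only the `atiwinflash -f -p` invocation itself comes from the steps above.

```shell
# Hypothetical pre-flash sanity check for a Fiji BIOS image.
# Assumes the 128 KiB (131072-byte) ROM size described in the post above.
check_rom() {
    rom="$1"
    if [ ! -f "$rom" ]; then
        echo "refusing to flash: $rom does not exist" >&2
        return 1
    fi
    # wc -c counts bytes in the file
    size=$(wc -c < "$rom")
    if [ "$size" -ne 131072 ]; then
        echo "refusing to flash: $rom is $size bytes, expected 131072" >&2
        return 1
    fi
    echo "ok: $rom looks like a 128 KiB Fiji BIOS image"
}

# Usage, following the post's own command (adapter 0, admin prompt),
# and only after backing up the stock BIOS via the dual-BIOS switch:
#   check_rom fury_x.rom && atiwinflash -f -p 0 fury_x.rom
```

The `&&` chaining means atiwinflash is never invoked if the image fails the size check.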


----------



## richie_2010

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its the exact same logo used on the 295X2, its a strip.


Thanks. From what I can Google, it's powered off a 2-pin connector on the PCB. I have found these, which I can splice a new header onto to power another LED set:
http://www.ebay.co.uk/itm/111677294197?ru=http%3A%2F%2Fwww.ebay.co.uk%2Fsch%2Fi.html%3F_from%3DR40%26_sacat%3D0%26_nkw%3D111677294197%26_rdc%3D1#shpCntId


----------



## richie_2010

Quote:


> Originally Posted by *Scorpion49*
> 
> Getting hot, not particularly but there isn't an HBM temp sensor that I know of, and one of them was only half covered even with all of that stuff on there. Was just curious to see what they did. Too much TIM is bad, its an insulator when you get more than a thin layer.
> 
> EDIT: I also noticed there is a tiny chip in the core by one of the HBM stacks. I don't like that.
> 
> This is what I ended up doing:


Do you happen to know the size of the screws used to hold the cooler to the card? I'm thinking they are M2/M2.5 as they look quite small.


----------



## xer0h0ur

Quote:


> Originally Posted by *richie_2010*
> 
> thanks, from what i can google its powered of a 2 pin connector on the pcb i have found these i can splice a new header to to power another led set
> http://www.ebay.co.uk/itm/111677294197?ru=http%3A%2F%2Fwww.ebay.co.uk%2Fsch%2Fi.html%3F_from%3DR40%26_sacat%3D0%26_nkw%3D111677294197%26_rdc%3D1#shpCntId


I'll tell ya what though: if you look at the last photo in my sig, I had modded a bay cover with my 295X2's logo and the damn thing didn't even last a week. It died out, even though I know it's a 12V accessory and there is nothing wrong with connecting it directly to the PSU with a molex-to-mini-2-pin cable. You may be disappointed with a similar result, and no one will give them up from their shrouds even if they are waterblocked. You can't find them for sale anywhere either.

FWIW: http://www.moddiy.com/products/4%252dPin-Molex-Connector-(Male)-to-2%252dPin-GPU-Mini-Fan-Connector-(Male-2.0mm).html


----------



## Cool Mike

A lot of overclockers are wondering:

*Any new news on voltage unlock?

*Seems to be some work going on in the background.


----------



## Scorpion49

Quote:


> Originally Posted by *richie_2010*
> 
> do you happen to know the size of the screws used to hold the cooler to the card im thinking they are m2/m2,5 as they look quite small.


I don't know the size, but they are the same thread as all of the other GPU screws I have from Nvidia and AMD reference cards. I have a pile from 580's/680's and they are the same thread. Hope that helps you narrow it down.


----------



## richie_2010

Quote:


> Originally Posted by *xer0h0ur*
> 
> I'll tell ya what though. If you look at the last photo in my sig, I had modded a bay cover with my 295X2's logo and the damn thing didn't even last a week. It died out. Even though I know its a 12V accessory and there is nothing wrong with connecting it directly to the PSU with a molex to mini 2 pin cable. You may be disappointed with a similar result and no one will give them up from their shrouds even if they are waterblocked. You can't find them for sale anywhere either.
> 
> FWIW: http://www.moddiy.com/products/4%252dPin-Molex-Connector-(Male)-to-2%252dPin-GPU-Mini-Fan-Connector-(Male-2.0mm).html


I wasn't meaning removing it and powering it another way. I was looking at making an acrylic piece that sits under the front panel and mounting an LED strip, so it ends up similar to this (but on the front).

I could use the wires I linked to connect the LEDs to the PCB using the 2-pin connector the Radeon sign uses, or splice two of them together to allow both parts to light up off the one connector.


----------



## richie_2010

Quote:


> Originally Posted by *Scorpion49*
> 
> I don't know the size, but they are the same thread as all of the other GPU screws I have from Nvidia and AMD reference cards. I have a pile from 580's/680's and they are the same thread. Hope that helps you narrow it down.


It certainly does.

Sometimes they change things, but I'm glad they haven't; I've got loads of screws to use.


----------



## xer0h0ur

Ohhhhh, that is sexy. This man gives the thumbs up


----------



## richie_2010

Quote:


> Originally Posted by *xer0h0ur*
> 
> Ohhhhh, that is sexy. This man gives the thumbs up




Cheers. I've been making LED kits for a while for the backs of other cards' PCBs (check out my sig, The Artisan Store).
With the Fury X I could make them for both sides of the backplate, though I've moved from aluminum to acrylic.


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Shoot me a pic of that reservoir in PM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder




I'd rather put them on blast publicly. Definitely never again...

Also, that is uninstalled; when it was mounted it looked far worse than what you can see. Worst part: I waited a week for it.


----------



## xer0h0ur

I love my EK reservoir. The pump can be noisy at full clip, but at 90% speed it's not even audible to me.


----------



## Ceadderman

Did you fill it or did it come prefilled?

I'm with you however. There shouldn't be stuff floating around in it at all.









Unless you filled it with old distilled. Any chance that happened?









~Ceadder


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Did you fill it or did it come prefilled?
> 
> I'm with you however. There shouldn't be stuff floating around in it at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless you filled it with old distilled. Any chance that happened?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Prefilled. Definitely not worth the price or the wait time. I was considering rush-ordering an H50 and just mounting it in the front-most top slot for now.


----------



## Orivaa

Quote:


> Originally Posted by *zhoulander*
> 
> Might be a bit late to the party, but I put together a video about my Fury X including recordings of fan, (first production run whining) pump, and coil noise (like Scorpion49 says, Unigine Valley exit splash screen causes mine to wail like a banshee).
> 
> I also have full recordings of Unigine Valley test for the 780Ti & Fury X that I can upload if there is interest.


That Fury X has a sticker for the pump logo, meaning it's not the revised version. At this point you shouldn't be getting a non-revised version unless the vendor "accidentally" sells you one, so the video is kind of redundant.


----------



## Ceadderman

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Did you fill it or did it come prefilled?
> 
> I'm with you however. There shouldn't be stuff floating around in it at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless you filled it with old distilled. Any chance that happened?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Prefilled. Def not worth the price or the wait time. I was considering rush ordering an h50 and just mounting it in the front most top slot for now.

Get a Hyper 212 for now and upgrade to an AIO from EK when they launch theirs, likely this fall. They posted pics on their FB and it looks awesomesauce.









~Ceadder


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> Get a Hyper 212 for now and upgrade to an AIO from EK when they launch theirs, likely this fall. They posted pics on their FB and it looks awesomesauce.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Say what?! EK is making an AIO!? Thanks, I'll research that!

I think an air cooler might be too cramped to fit; also, wouldn't it dump the CPU heat toward the Fury X rad? Assuming I can fit it in there... I really wanted to go water for both CPU and GPU this time.


----------



## Ceadderman

The Hyper 212 really isn't that big. In fact, it's rather tiny compared to a Noctua HS. I ran one in between using an H50 and a custom loop. They're rather good for temps, and it would only be for a short time until you got your new AIO.









~Ceadder


----------



## Thoth420

Thanks will consider it for sure. I just want to get to testing phase before my hardware passes its 30 day mark.


----------



## Cool Mike

Anybody notice that Newegg only shows 3 different brands for the Fury X currently?


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> 
> 
> Rather put them on blast publicly. Definitely never again...
> 
> Also that is unistalled, when it was mounted it looked worse tons more than what you can see. Worst part...waited a week for it.


That is exactly what the fluid out of my H220s looked like. I think they're not flushing the radiators before assembling them.


----------



## Thoth420

Quote:


> Originally Posted by *Scorpion49*
> 
> That is exactly what the fluid out of my H220's looked like. I think they're not flushing the radiators before assembling them.


Whatever it is...it's their problem now.


----------



## Ceadderman

Quote:


> Originally Posted by *Cool Mike*
> 
> Anybody notice that Newegg only shows 3 different brand names for the Fury x currently.


And all out of stock.









~Ceadder


----------



## p4inkill3r

Amazon has the Sapphire for sale right now: http://www.nowinstock.net/computers/videocards/amd/r9fury/


----------



## Ceadderman

He was referring to Fury X.









~Ceadder


----------



## bonami2

http://asetek.com/customers/do-it-yourself/corsair/corsair-hydro-series-h75/

Gonna stick with Asetek; at least they know what they're doing.

Well, I think you never know with companies these days.


----------



## Scorpion49

SO, I accidentally my memory. I was fiddling with AB and unlocked the memory slider, and there were some actual gains doing so even at the stock clocks. Check the 3Dmark11 at the bottom.



http://www.3dmark.com/compare/3dm11/10099705/3dm11/10099711


----------



## bonami2

Quote:


> Originally Posted by *Scorpion49*
> 
> SO, I accidentally my memory. I was fiddling with AB and unlocked the memory slider, and there were some actual gains doing so even at the stock clocks. Check the 3Dmark11 at the bottom.
> 
> 
> 
> http://www.3dmark.com/compare/3dm11/10099705/3dm11/10099711


Yeah, I said a while back that Afterburner could overclock the VRAM speed... But people are blind, haha.

Some French guy found that about a month ago.


----------



## Scorpion49

Quote:


> Originally Posted by *bonami2*
> 
> Yea i told a while back that afterburner could overclock vram speed... But people are blind ahah
> 
> Some french guy found that about a month ago


Well I knew it could be done, but I wasn't expecting it to do much. In Valley at the same core speed I gained 2fps overall with 550mhz memory. You would think with the big fat bus and relatively poor throughput that memory speeds would have little effect on anything.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I knew it could be done, but I wasn't expecting it to do much. In Valley at the same core speed I gained 2fps overall with 550mhz memory. You would think with the big fat bus and relatively poor throughput that memory speeds would have little effect on anything.


I've noticed that outside of short benchmarks it's really hard for the card to sustain any memory OC; none of my Fiji cards have really been able to bench any higher than 530 on the memory.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> Ive noticee that outside of short benchmarks its really hard for the card to sustain any memory oc, none of my fiji cards have really been able to bench any higher than 530 on the memory.


Hm, seems fine at 550 to me. It's been looping Valley for about 30 minutes now while I do other things. Maybe it's making up for the fact that the core clocking sucks.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Hm, seems fine at 550 to me. Its been looping Valley for about 30 minutes now while I do other things. Maybe its making up for the fact that the core clocking sucks.


The FPS gain from memory OC'ing so far is almost non-existent; at best it's going to give 1-2 FPS, better than nothing I guess. Clocking the memory on my 2x Fury X and Fury so far has just not been worth it.

The Tri-X card never ceases to amaze me though: at 50% fan speed (basically where I can barely begin to hear it over case fans, and it's hardly loud), my core hasn't broken 54°C after almost an hour of benchmarking. But I'm very curious about VRM temps right now, as there's no readout...

Where's that voltage control?


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> FPS gain from memory OC'ing so far is almost non-existant at best its going to give 1-2fps, better than nothing I guess. Clocking the memory on my 2x Fury X and Fury so far have just not been worth it.
> 
> TriX card never ceases to amaze me though, at 50% fan speed (basically where I can barely begin to hear it over case fans, and its hardly loud), my core hasn't broken 54c after almost an hour of benchmarking, but I'm very curious about VRM temps right now as theres no readout..
> 
> Wheres that voltage control


I'm wondering the same thing, but maybe the memory might help more at 4K? I'll have to do some checking. I just thought it was cool it showed any gains at all.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> FPS gain from memory OC'ing so far is almost non-existant at best its going to give 1-2fps, better than nothing I guess. Clocking the memory on my 2x Fury X and Fury so far have just not been worth it.
> 
> TriX card never ceases to amaze me though, at 50% fan speed (basically where I can barely begin to hear it over case fans, and its hardly loud), my core hasn't broken 54c after almost an hour of benchmarking, but I'm very curious about VRM temps right now as theres no readout..
> 
> Wheres that voltage control


Knowing Sapphire, they can be cold as ice or hot as lava... I'm in love with high phase counts, so ASUS/EVGA/MSI for me.

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I knew it could be done, but I wasn't expecting it to do much. In Valley at the same core speed I gained 2fps overall with 550mhz memory. You would think with the big fat bus and relatively poor throughput that memory speeds would have little effect on anything.


CPU-bound scenarios show RAM speed improving performance in some cases... BF4 beta, Arma 3, and other CPU-bound titles.

So maybe the GPU shows the same improvement.


----------



## rv8000

Quote:


> Originally Posted by *bonami2*
> 
> Knowing sapphire they can be cold as ice or hot as lava... Im in love with high phase count so asus evga msi for me
> Cpu bound scenario show Ram speed in some case to improve performance... Bf4 beta Arma 3 and other cpu bound
> 
> So maybe gpu show the same improvement


Too bad ASUS seems to be applying their own "red light" initiative with their Fury card: they release a freak of a card from a hardware standpoint, then cripple its potential with low voltages, a $*@#$ BIOS, and a worse-performing cooler than the Tri-X at the same fan speeds and loudness.


----------



## bonami2

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I knew it could be done, but I wasn't expecting it to do much. In Valley at the same core speed I gained 2fps overall with 550mhz memory. You would think with the big fat bus and relatively poor throughput that memory speeds would have little effect on anything.


Quote:


> Originally Posted by *rv8000*
> 
> To bad ASUS seems to be applying their own "red light" initiative with their Fury card, they release a freak of a card from a hardware standpoint then cripple it's potential with low voltages, $*@#$ bios and a worse performing cooler than the TriX @ the same fan speeds and loudness.


+ a crappy stock 8-phase

ASUS seems to hate AMD recently, with the crappy R9 290X and now the overpriced Fury, etc.


----------



## Ehsteve

Long time lurker.

Got a set of Gigabyte Fury X in 2x CF with some EK waterblocks in a custom loop. That said, the shroud is beautiful; it made me sad to see it go (nowhere near as tacky/cheap as it looked in the previews).

Noticed some whine under full load (not pump whine, just from the card). Nothing too distracting though (going to burn it in over a couple of days to see if it improves). Anyone else had this using non-AIO solutions?


----------



## rv8000

Quote:


> Originally Posted by *Ehsteve*
> 
> Long time lurker.
> 
> Got a set of Gigabyte Fury X in 2xCF with some EK waterblocks in a custom loop. That said the shroud is beautiful, made me sad to see it go (nowhere near as tacky/cheap) as it looked in the previews)
> 
> Noticed some whine under full load (not pump whine, just from the card). Nothing too distracting though (going to burn over a couple of days to see if it improves). Anyone else had this using non-AIO solutions?


Yes, the majority of high-end cards have this issue. It's called coil whine (electricity passing through the coils in the power circuitry at a frequency that causes them to vibrate/resonate, creating the buzzing noise), and its severity will differ from card to card and setup to setup. I've had instances of coil whine on 7970s, 780s, 970s, 290s, 290Xs, etc.


----------



## EpicOtis13

Quote:


> Originally Posted by *rv8000*
> 
> Yes, the majority of high-end cards have this issue. It's called coil whine (electricity passing through the coils in the power circuitry at a frequency that causes them to vibrate/resonate, creating the buzzing noise), and its severity will differ from card to card and setup to setup. I've had instances of coil whine on 7970s, 780s, 970s, 290s, 290Xs, etc.


Yup, I get coil whine on my 7970s when I decide to unleash their full 1300/1750 OCs.


----------



## Ehsteve

Quote:


> Originally Posted by *rv8000*
> 
> Yes, the majority of high-end cards have this issue. It's called coil whine (electricity passing through the coils in the power circuitry at a frequency that causes them to vibrate/resonate, creating the buzzing noise), and its severity will differ from card to card and setup to setup. I've had instances of coil whine on 7970s, 780s, 970s, 290s, 290Xs, etc.


I know what coil whine is; I just hadn't specifically heard of this particular card having coil whine in and of itself, only from the stock pump due to insufficient insulation on its coils.

It's not terrible, and after some burn-in it should probably quiet down, but in a quiet rig it can be noticeable (again, only at full load).


----------



## Newbie2009

Quote:


> Originally Posted by *Cool Mike*
> 
> A lot of overclockers are wondering:
> 
> *Any new news on the voltage unlock?*
> 
> Seems to be some work going on in the background.


Yeah, I have been keeping an eye on this thread for the same reason. Looks like no joy yet.


----------



## en9dmp

Quote:


> Originally Posted by *royfrosty*
> 
> Yeah, I'm having issues sliding the 2nd GPU rather than the 1st.
> 
> Not sure what the issue is.


Sorry for the late reply... I've enabled extended overclocking limits in AB and noticed that if the checkbox to apply the overclock to both cards is ticked, it doesn't work, so uncheck it. On GPU1 you can overclock core and memory and adjust the power limit, but on GPU2 you can only adjust the core clock: memory is greyed out, and the power limit slider moves but jumps back to 0 when you hit apply.


----------



## boi801

Quote:


> Originally Posted by *Newbie2009*
> 
> Yeah, I have been keeping an eye on this thread for the same reason. Looks like no joy yet.


http://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/

The wait is almost over!!!!!!


----------



## en9dmp

Quote:


> Originally Posted by *boi801*
> 
> http://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/
> 
> The wait is almost over!!!!!!


Thanks for the link; a shame the OC results, even with the extra voltage, don't look very good. Although I'm slightly confused as to why he used a pretty ancient game to test with... maybe the gains will be more noticeable in newer games.


----------



## New green

Yeah, from that link it looks like in order to run +144 mV you need a waterblock, since the VRMs are not cooled well enough by the stock AIO. +40 mV gets less than a frame in BF3 @ 4K. Seems underwhelming.


----------



## en9dmp

Quote:


> Originally Posted by *New green*
> 
> Yeah, from that link it looks like in order to run +144 mV you need a waterblock, since the VRMs are not cooled well enough by the stock AIO. +40 mV gets less than a frame in BF3 @ 4K. Seems underwhelming.


When you look at how much the power consumption and heat increase, it barely seems worth it... At the moment my cards don't even hit 40°C under load, but it hardly seems worth cranking them up if the gains are negligible.


----------



## boi801

Quote:


> Originally Posted by *en9dmp*
> 
> Thanks for the link; a shame the OC results, even with the extra voltage, don't look very good. Although I'm slightly confused as to why he used a pretty ancient game to test with... maybe the gains will be more noticeable in newer games.


And the "Maximum GPU clock" blue line starts at 1135Mhz... so its more than 3 frames gain from stock... don't forget the memory clock also...


----------



## Gumbi

Quote:


> Originally Posted by *New green*
> 
> Yeah, from that link it looks like in order to run +144 mV you need a waterblock, since the VRMs are not cooled well enough by the stock AIO. +40 mV gets less than a frame in BF3 @ 4K. Seems underwhelming.


How hot are they getting under load under the stock setup?


----------



## Alastair

Aaai. Those initial overclocking results don't look too great. I mean, 1200 MHz is bad. But I want to see what these things do with proper VRM cooling as well.


----------



## New green

Quote:


> Originally Posted by *Gumbi*
> 
> How hot are they getting under load under the stock setup?


The link said the VRMs were hitting over 95°C. If the non-X Fury is software locked rather than hardware locked, so it can be BIOS flashed into a Fury X, it would be more cost-effective to waterblock that instead.

It would be interesting to see how the ASUS Strix's custom PCB compares to the reference cards in power draw and OC potential.


----------



## Gumbi

Quote:


> Originally Posted by *New green*
> 
> The link said the VRMs were hitting over 95°C. If the non-X Fury is software locked rather than hardware locked, so it can be BIOS flashed into a Fury X, it would be more cost-effective to waterblock that instead.
> 
> It would be interesting to see how the ASUS Strix's custom PCB compares to the reference cards in power draw and OC potential.


He also stated 95°C isn't great for long-term viability; tbh I disagree, given they are rated for 125°C.

In any case, I wonder what they were at +100 mV; he stopped at +144 mV. I wonder if the VRM temperatures were acceptable (sub-90°C, we'll say) at a +100 mV overvolt.


----------



## Cool Mike

Would be nice to see voltage control this week.


----------



## en9dmp

Has anyone on here upgraded from Windows 8.1 to 10? I'm contemplating upgrading when it's released in a couple of days, but it would be good to know if anyone here has taken any before-and-after benches to see if there is any improvement in existing games using the new API...


----------



## rv8000

For all the people going on about this "overclockers dream" stuff and being disappointed in Fiji hitting 1215 on stock cooling, mind you with +144 mV at that: step back and look at the situation. Fiji is a huge and dense die, yet here it is clocking just as high as Hawaii (if not marginally better, as I couldn't get 1215 stable with +200 mV on my Lightning). The scaling is EXACTLY what I expected and said almost 2-3 weeks ago now.

Consider the Fury and its stock core of 1000 MHz: if they can hit 1200, that's still a 20% OC; how is that bad? In the end it's still pretty much the same GCN. I don't understand why you guys expected more and are dogging Fiji just because some AMD employee made a comment I doubt they had any proof for.
Quote:


> Originally Posted by *New green*
> 
> The link said the VRMs were hitting over 95°C. If the non-X Fury is software locked rather than hardware locked, so it can be BIOS flashed into a Fury X, it would be more cost-effective to waterblock that instead.
> 
> It would be interesting to see how the ASUS Strix's custom PCB compares to the reference cards in power draw and OC potential.


They will run cooler on average: more phases means less current per phase, and therefore less resistive heating.
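The rule of thumb here is really about current per phase: each phase's resistance is roughly fixed, but splitting the load across more phases lowers the current each one carries, and conduction loss goes as I²R. A minimal sketch; the 300 A load and 1.5 mΩ per-phase resistance are hypothetical illustrative numbers, not measurements of any Fury board:

```python
# Rough I^2 * R conduction-loss comparison for a multi-phase VRM.
# HYPOTHETICAL figures: 300 A total GPU load, 1.5 mOhm per phase.

def conduction_loss_w(total_current_a: float, phases: int,
                      r_phase_ohm: float = 0.0015) -> float:
    """Total I^2*R loss summed over all phases, current split evenly."""
    i_per_phase = total_current_a / phases
    return phases * i_per_phase ** 2 * r_phase_ohm

six_phase = conduction_loss_w(300, 6)      # 50 A per phase -> 22.5 W total
twelve_phase = conduction_loss_w(300, 12)  # 25 A per phase -> 11.25 W total
print(six_phase, twelve_phase)
```

Doubling the phase count halves the total conduction loss for the same load, which is why a 12-phase Strix board would be expected to run its VRM cooler than the 6-phase reference design, all else being equal.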


----------



## p4inkill3r

Don't waste your time trying to understand it, man. Enjoy your purchase!


----------



## en9dmp

Quote:


> Originally Posted by *rv8000*
> 
> For all the people going on about this "overclockers dream" stuff and being disappointed in Fiji hitting 1215 on stock cooling, mind you with +144 mV at that: step back and look at the situation. Fiji is a huge and dense die, yet here it is clocking just as high as Hawaii (if not marginally better, as I couldn't get 1215 stable with +200 mV on my Lightning). The scaling is EXACTLY what I expected and said almost 2-3 weeks ago now.
> 
> Consider the Fury and its stock core of 1000 MHz: if they can hit 1200, that's still a 20% OC; how is that bad? In the end it's still pretty much the same GCN. I don't understand why you guys expected more and are dogging Fiji just because some AMD employee made a comment I doubt they had any proof for.
> They will run cooler on average: more phases means less current per phase, and therefore less resistive heating.


It's not the clock speed that's the issue, I think anything around 1200 is decent, but if that doesn't translate into any appreciable performance increase then there's not much point...


----------



## Jflisk

Did we manage to get the second card's clocks/RAM unlocked in CrossFire configurations yet? Any help greatly appreciated. Second Fury X on its way. Might waterblock them eventually, though with the stock cooler I don't see much advantage in blocking them.


----------



## Cool Mike

I will be happy with a stable 1200 MHz core and 550 MHz memory clock. Ready to download now.

It's not clear: is Alex working on Afterburner as we speak?


----------



## bonami2

Quote:


> Originally Posted by *Gumbi*
> 
> He also stated 95°C isn't great for long-term viability; tbh I disagree, given they are rated for 125°C.
> 
> In any case, I wonder what they were at +100 mV; he stopped at +144 mV. I wonder if the VRM temperatures were acceptable (sub-90°C, we'll say) at a +100 mV overvolt.


That's relative: my MSI mobo's throttle point is 140°C max and my ASUS was 180°C max.

I'd say 90-100°C is safe if the capacitors aren't cooking 1 cm away. On the Fury X the caps are the same type as on mobile boards, so I think they can handle high heat (tantalum, I think it's called, can handle high-heat applications).

400 W to 600 W power consumption is sickness, though.


----------



## josephimports

Quote:


> Originally Posted by *Jflisk*
> 
> Did we manage to get the second card's clocks/RAM unlocked in CrossFire configurations yet? Any help greatly appreciated. Second Fury X on its way. Might waterblock them eventually, though with the stock cooler I don't see much advantage in blocking them.


You can enable memory OC on GPU2 by disabling GPU1. I used the PCIe switch on my motherboard; removing the power connectors or the GPU completely may work as well. Restart the PC, open AB, and enable "extend official overclocking limits". Then restart the PC as normal and voila. Core clock should work by default.


----------



## rv8000

Quote:


> Originally Posted by *en9dmp*
> 
> It's not the clock speed that's the issue, I think anything around 1200 is decent, but if that doesn't translate into any appreciable performance increase then there's not much point...


GCN actually scales very well, in fact much better than Maxwell. I'll show some rough math using example scores from *my Fury @ 1100/530 vs a 980 G1 @ 1540/7800* from the bench-off thread. Note that both Valley and Heaven often favor Nvidia, and as synthetics these will generally represent the best-case scaling from overclocking.

*Valley @ 1920x1080, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 79.9 FPS R9 Fury - 78.5 FPS % vs. 980 - -1.78%

*Valley @ 2560x1440, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 48.1 FPS R9 Fury - 50.2 FPS % vs. 980 - +4.18%

*Valley @ 3840x2160, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 22.3 FPS R9 Fury - 23.8 FPS % vs. 980 - +6.3%
________________________________________________

*Heaven @ 1920x1080, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 76.6 FPS R9 Fury - 68.4 FPS % vs. 980 - -11.98%

*Heaven @ 2560x1440, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 44.8 FPS R9 Fury - 44.7 FPS % vs. 980 - ~0%

*Heaven @ 3840x2160, Fullscreen, 8xAA, Quality - Ultra:*

GTX 980 - 18.7 FPS R9 Fury - 20.6 FPS % vs. 980 - +9.23%
________________________________________________

Now for some per-MHz scaling on each card. I will be using the Valley results @ 1440p for this comparison. The clocks for the 980 G1 are 1404 vs 1540, and the clocks for the Fury are 1000 vs 1100, an effective 136 MHz and 100 MHz OC respectively.

GTX 980 @ 1540 - 48.1 FPS GTX 980 @ 1404 - 46.1 FPS % increase - +4.33%

R9 Fury @ 1100 - 50.2 FPS R9 Fury @ 1000 - 46.7 FPS %increase - + 7.5%

*Core mhz required for a 1% increase in score on a GTX 980*

136 MHz / 4.33% ≈ 31.4 MHz per 1% score increase

*Core mhz required for a 1% increase in score on an R9 Fury*

100 MHz / 7.5% ≈ 13.3 MHz per 1% score increase

So in conclusion, at 1440p, *overclocking for a 1% increase in score is about 2.3x more effective on an R9 Fury than on a GTX 980*.

To sum things up, with a few assumptions: an R9 Fury, whose direct competition is the GTX 980 regardless of the current pricing scheme, will 9 times out of 10 be the faster card at 1440p and above *even before voltage control and going past 1100 MHz on the core*; with any higher OC on the Fury, the 980 won't keep up. Higher OCs on the R9 Fury will generally make 1080p a wash (a tie), with each camp favored in the games that normally run best on its hardware, but I do expect the R9 Fury to lead a heavily OC'd 980 once past 1150 MHz on the core (with voltage control). Max OC on air/water will 99.9% show the Fury with a small lead @ 1080p, a 5-10% lead depending on the game @ 1440p, and a 10%+ lead at 4K. Ultimately the R9 Fury will be far better for multi-card setups thanks to CrossFire scaling, with much heftier % leads at every resolution.
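The per-MHz figures above can be reproduced straight from the quoted clocks and FPS numbers; this is just the post's own arithmetic, re-done from the raw Valley 1440p results:

```python
# Recompute the Valley 1440p per-MHz scaling from the numbers in the post.

def mhz_per_percent(stock_mhz, oc_mhz, stock_fps, oc_fps):
    """Core MHz of overclock needed for each 1% gain in benchmark score."""
    pct_gain = (oc_fps - stock_fps) / stock_fps * 100
    return (oc_mhz - stock_mhz) / pct_gain

gtx980 = mhz_per_percent(1404, 1540, 46.1, 48.1)  # -> ~31.4 MHz per 1% gain
fury = mhz_per_percent(1000, 1100, 46.7, 50.2)    # -> ~13.3 MHz per 1% gain
print(f"Fury gains {gtx980 / fury:.1f}x more per MHz than the 980")  # 2.3x
```

Note the script computes the 980's overclock delta from the clocks themselves (1540 - 1404 = 136 MHz), which lands on the same ~2.3x conclusion.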


----------



## bonami2

I may add, I've been running my 7950 for like 4 months' worth of time 24/7 at 90°C on the VRMs, and the thing still looks OK.
Quote:


> Originally Posted by *rv8000*
> 
> GCN actually scales very well, in fact much better than maxwell. I'll show some rough math using example scores from *my Fury @ 1100/530 vs a 980 G1 @ 1540/7800* from the bench off thread. [...]
> 
> So in conclusion, at 1440p, *overclocking for a 1% increase in score is 2.3x better on an R9 Fury than a GTX 980* [...]


So I had the right idea: the Nvidia card overclocks high but doesn't show a great improvement per MHz.

Let's wait for Windows 10 with the new AMD driver.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rv8000*
> 
> GCN actually scales very well, in fact much better than maxwell. I'll show some rough math using example scores from *my Fury @ 1100/530 vs a 980 G1 @ 1540/7800* from the bench off thread. [...]
> 
> So in conclusion, at 1440p, *overclocking for a 1% increase in score is 2.3x better on an R9 Fury than a GTX 980* [...]


Thanks for posting those numbers, and thanks also for expressing something (with actual data) that so many overlook: the fact that overclocking itself does not always correlate with actual performance improvements, and hardly ever scales proportionally.

Wanted to add that AMD's per-MHz core overclocking should always scale better than NVIDIA's when NVIDIA has a comparative memory-bandwidth limitation.
AMD's clock ceiling is unfortunately lower, so Maxwell can gain some of that ground back, but for what you get up to that ceiling, AMD does well.


----------



## en9dmp

Quote:


> Originally Posted by *rv8000*
> 
> GCN actually scales very well, in fact much better than maxwell. I'll show some rough math using example scores from *my Fury @ 1100/530 vs a 980 G1 @ 1540/7800* from the bench off thread. [...]
> 
> So in conclusion, at 1440p, *overclocking for a 1% increase in score is 2.3x better on an R9 Fury than a GTX 980* [...]


Great analysis! I guess the link is slightly misleading, as at first look it only shows a 1.5 fps increase all the way from -48 mV up to +144 mV, which would indicate a poor performance increase. Looking closer, the clock speed at -48 mV is 1125 MHz, which is already a 75 MHz overclock.

From the other chart in the link, the core and memory overclock yields about a 12% performance increase for about a 20% overclock on the core and 12% on the memory, which seems reasonable. As for the statement that performance scales better than on Maxwell, that may be true, but Maxwell overclocks significantly higher.
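That 12%-for-20% reading can be expressed as a simple scaling-efficiency check; the percentages here are as read off the linked review's charts, not re-measured:

```python
# How much of the applied core overclock shows up as real performance?
core_oc_pct = 20.0    # ~20% core overclock, per the linked charts
perf_gain_pct = 12.0  # ~12% measured performance gain

# 12 / 20 = 0.6, i.e. 60% of the core overclock is realized as FPS;
# the rest is lost to memory, driver, or other bottlenecks.
efficiency = perf_gain_pct / core_oc_pct
print(f"{efficiency:.0%} of the core overclock translated into FPS")
```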


----------



## Orthello

Some really interesting posts; first time I've seen max-OC benches of a reasonable 980 vs max-OC Fury results.

Shows that the 980 needs all the overclock it can get; even then, once the Fury is in the 1200 MHz realm (more than possible, I'd say, considering the great coolers these cards have been designed with), it will still leave a heavily OC'd 980 behind by 10%+, possibly only on par at 1080p, as previously said, with the current state of drivers.

Comparing to a 980, however, I'm not sure that's great for AMD. The Fury should by all accounts wipe the floor with the 980; thanks to the overclocking headroom on Maxwell, it's not quite the AMD dominance it should be.

What I would like to see is Fury versions coming out waterblocked... that would be interesting. A waterblocked Fury at 1250+ would give some really nice performance. The smaller active transistor count could make up some of the OC % difference vs the Fury X; time will tell.

Who is W1zzard (the guy at TechPowerUp who did that Fury voltage overclocking review)? I haven't seen any overclocking tools from him before (I think)?


----------



## rv8000

Quote:


> Originally Posted by *Orthello*
> 
> Some really interesting posts, first time i've seen some max oc benches of reasonable 980 vs max oc of a Fury bench results.
> 
> Shows that the 980 needs all the overclock it can get , even then once the Fury is in the 1200mhz realm (i would say more than possible considering the great coolers these cards have been designed with) that it will still leave a heavily oc'd 980 behind by 10% + , possibly only on par in 1080p as previously said with current state of drivers.
> 
> Comparing to a 980 however , i'm not sure if thats great for AMD. The Fury should by all accounts wipe the floor with the 980 , thanks to overclocking headroom on maxwell its not quite what it should be in terms of AMD dominance.
> 
> What i would like to see is Fury versions coming out waterblocked .. that would be interesting. A waterblocked fury 1250 ish + .. would give some really nice performance. The smaller active transistor count could make up some difference in OC % vs fury x, time will tell.
> 
> Who is W1zzard (the guy at TechPowerUp who did that Fury voltage overclocking review)? I haven't seen any overclocking tools from him before (I think)?


He's the developer of Sapphire's TriXX overclocking software.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> He's the developer of Sapphire's TriXX overclocking software.


Who made Sapphire TriXX? I need to send him like 5 bug reports for the software... reminds me of the old Intel Extreme motherboard utility: sometimes that thing worked fine and other times like crap.


----------



## dir_d

W1zzard, the owner of TPU.


----------



## Orthello

Quote:


> Originally Posted by *rv8000*
> 
> He's the developer of Sapphire's TriXX overclocking software.


Ah, that makes sense. I'll keep my eye out for a TriXX update.


----------



## Orthello

Quote:


> Originally Posted by *dir_d*
> 
> W1zzard the owner of TPU


Now I'm informed: owner of TPU and creator of the TriXX software... hopefully I'll OC my cousin's Fury X like mad once the software is released. Any ETAs on this?


----------



## dir_d

Quote:


> Originally Posted by *Orthello*
> 
> Now I'm informed: owner of TPU and creator of the TriXX software... hopefully I'll OC my cousin's Fury X like mad once the software is released. Any ETAs on this?


I'd say pretty close for this new version of TriXX, since he just wrote up his findings. Probably in a week or 2.


----------



## bonami2

So they're telling me that a random dude is making the Sapphire program... am I the only one scratching my head? It's professional software; isn't it supposed to be built by Sapphire employees?

I say that, but I have no idea who builds the MSI, EVGA, or ASUS programs either.


----------



## Orthello

Quote:


> Originally Posted by *bonami2*
> 
> So they're telling me that a random dude is making the Sapphire program... am I the only one scratching my head? It's professional software; isn't it supposed to be built by Sapphire employees?
> 
> I say that, but I have no idea who builds the MSI, EVGA, or ASUS programs either.


Sapphire most likely licenses it, e.g. like I believe MSI does with Unwinder's Afterburner program.

It's good in a sense; it means these guys are getting some monetary support for their endeavors. I actually offered Unwinder some donation $$ to get negative temperature readings supported in Afterburner; he told me he would never accept a donation for it. He is a principled man. About 2 months later, AB was modded for negative temperature readings, and I've never gone back to Prec X since (major kudos to Unwinder).


----------



## bonami2

Quote:


> Originally Posted by *Orthello*
> 
> Sapphire most likely licenses it, e.g. like I believe MSI does with Unwinder's Afterburner program.
> 
> It's good in a sense; it means these guys are getting some monetary support for their endeavors. I actually offered Unwinder some donation $$ to get negative temperature readings supported in Afterburner; he told me he would never accept a donation for it. He is a principled man. About 2 months later, AB was modded for negative temperature readings, and I've never gone back to Prec X since (major kudos to Unwinder).


Well, I love MSI Afterburner; it's the best I've used.


----------



## BrotherBeast

Asus Strix Fury







Card is huge but surprisingly lightweight. Will be gaming shortly.


----------



## royfrosty

Guys, just wanna ask: is the Sapphire R9 Fury Tri-X a reference PCB, the same as the R9 Fury X?


----------



## Jflisk

Looks like they have fixed the monitor-sleep problem in Windows 10. Tried monitor sleep last night and no crashes. Looks like they are pushing minor driver or Catalyst updates every other day. This is with a Fury X.

Trying to also figure out where to put the radiator when my new Fury gets here. I can dump the top rad and put them up there, but I will have to drill the screw holes out, and I am pretty sure the rads will not lie back to back. If I go for the bottom, I have to flip the fan and still drill screw holes, and my power supply overhangs the bottom fan mount. Don't think I'll lose much radiator cooling if I put it at the bottom. Looks like either way I am... you get the idea.


----------



## rv8000

Quote:


> Originally Posted by *royfrosty*
> 
> Guys, just wanna ask: is the Sapphire R9 Fury Tri-X a reference PCB, the same as the R9 Fury X?


Yes, it's the same PCB.


----------



## Scorpion49

I'll copy this over here, something I noticed when I was redoing my thermal paste. The Tri-X cooler should work on a Fury X.
Quote:


> Originally Posted by *Scorpion49*
> 
> What's interesting is I think the Tri-X cooler from Sapphire was designed for the Fury X and then slapped on the Fury later, which suggests AMD might not always have been saying it was reference-only. I say this because it has provisions for the extra VRM locations on the card, complete with attached thermal pads that aren't actually doing anything, coinciding with the extra VRM location on the Fury X. All just speculation on my part, of course.
> 
> Here is what I mean:
> 
> Fury
> 
> 
> Fury X
> 
> 
> Tri-X cooler


----------



## Agent Smith1984

Why do manufacturers continue to GLOB TIM on GPUs... I never understood that.

Anyway, it looks like Fury owners can mount used AIOs off eBay onto their cards, and vice versa for anyone wanting to air-cool a Fury X.
Not sure there's much sense to any of that, but I've seen stranger.


----------



## xer0h0ur

Quote:


> Originally Posted by *Scorpion49*
> 
> I'll copy this over to here, something I noticed when I was re-doing my thermal paste. The Tri-X cooler should work on a Fury X.


Good eye. Honestly, there isn't a damn thing holding AIBs back from slapping an air cooler on the Fury X other than AMD themselves. They are complete buffoons for locking them out of putting air coolers on the Fury X. I know for a fact there are a boatload of people who would like that card but can't fit a radiator into their cases.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Good eye. Honestly, there isn't a damn thing holding AIBs back from slapping an air cooler on the Fury X other than AMD themselves. They are complete buffoons for locking them out of putting air coolers on the Fury X. I know for a fact there are a boatload of people who would like that card but can't fit a radiator into their cases.


Agreed.....

I could see some $599 air-cooled Fury X cards doing well, and they would fill the middle ground between the Fury and Fury X.... Though I think some people would still opt for a water-cooled Fury X, many would save $50 and get the air-cooled version. I don't think it would cannibalize their sales of the AIO X at all, though....


----------



## xer0h0ur

The thing is, I can still see people paying full price for the Fury X even if it were air-cooled. I have seen people say as much.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> Yes, it's the same PCB.


It ain't exactly the same... the stock Fury X is 6-phase

The Fury is probably 8-phase


----------



## xer0h0ur

Quote:


> Originally Posted by *bonami2*
> 
> It ain't exactly the same... the stock Fury X is 6-phase
> 
> The Fury is probably 8-phase


I don't know where you're getting this from. Both the Fury and Fury X use the same PCB with the same 6-phase power. The only card that is different is the Strix with 12-phase power.

http://www.legitreviews.com/sapphire-radeon-r9-fury-tri-x-oc-video-card-review_169018

"The AMD Radeon R9 Fury is a 275W card that has two 8-pin PCIe power connectors on it. *The Radeon R9 Fury has a 6-phase power design that is capable of delivering up to 400 Amps of power to the Fiji GPU.* There are two DIP switches on the backplate that allow you to enable or disable the GPU Tach and allows you to change the color of the LEDs between red and blue to go with your case theme better. Eight of the LEDs are for the GPU load level and can be red or blue. The ninth LED light is green and when it is lit up it visually lets you know that the GPU is in AMD's ZeroCore power mode."
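For what it's worth, that quoted rating can be put in perspective with a quick back-of-envelope sketch (idealized assumptions on my part: the 400 A rating is split evenly across the 6 phases, and most of the 275 W board power flows through the core rail at roughly 1.2 V; neither is exactly true in a real design):

```python
# Rough sanity check of the quoted VRM spec. Assumes (idealized) even
# current sharing across phases; real phase balancing varies.
TOTAL_RATED_AMPS = 400
PHASES = 6

per_phase = TOTAL_RATED_AMPS / PHASES
print(f"Rated current per phase: {per_phase:.1f} A")

# Loose ballpark of actual draw: if most of the 275 W board power went
# through the core VRM at ~1.2 V (memory, fans, etc. actually take a cut),
# the core rail would carry on the order of 275 / 1.2 amps, well under
# the 400 A rating.
core_amps_estimate = 275 / 1.2
print(f"Ballpark core current at stock: {core_amps_estimate:.0f} A")
```

On those assumptions the rating works out to roughly 67 A per phase, with stock draw sitting comfortably below the claimed 400 A ceiling.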


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know where you're getting this from. Both Fury and Fury X use the same PCB with the same 6 phase power. The only card that is different is the Strix with 12 phase power.
> 
> http://www.legitreviews.com/sapphire-radeon-r9-fury-tri-x-oc-video-card-review_169018
> 
> "The AMD Radeon R9 Fury is a 275W card that has two 8-pin PCIe power connectors on it. *The Radeon R9 Fury has a 6-phase power design that is capable of delivering up to 400 Amps of power to the Fiji GPU.* There are two DIP switches on the backplate that allow you to enable or disable the GPU Tach and allows you to change the color of the LEDs between red and blue to go with your case theme better. Eight of the LEDs are for the GPU load level and can be red or blue. The ninth LED light is green and when it is lit up it visually lets you know that the GPU is in AMD's ZeroCore power mode."


Who would be crazy enough to put that kind of power delivery on a GPU with that kind of power consumption?

7950s are known to melt their 6-phase VRMs if the voltage is pushed

7970s are on 8 phases

The Asus Matrix had 20 phases.

And you're telling me those Furys have 6 phases with air cooling...

How are the temps on those phases?









Found it: http://www.tomshardware.com/reviews/sapphire-amd-radeon-r9-fury-tri-x-overclocked,4216-7.html

Melting piece of crap

They didn't even touch the voltage.......

I love Sapphire: "we put the best cooler in the world on a stock power design" and people think they can overclock

At least the Strix is decent enough


----------



## fjordiales

Quote:


> Originally Posted by *bonami2*
> 
> Who would be crazy enough to put that kind of power delivery on a GPU with that kind of power consumption?
> 
> 7950s are known to melt their 6-phase VRMs if the voltage is pushed
> 
> 7970s are on 8 phases
> 
> The Asus Matrix had 20 phases.
> 
> And you're telling me those Furys have 6 phases with air cooling...
> 
> How are the temps on those phases?
> 
> Found it: http://www.tomshardware.com/reviews/sapphire-amd-radeon-r9-fury-tri-x-overclocked,4216-7.html
> 
> Melting piece of crap
> 
> They didn't even touch the voltage.......
> 
> I love Sapphire: "we put the best cooler in the world on a stock power design" and people think they can overclock
> 
> At least the Strix is decent enough


The Strix is 10+2, BUT it only runs 1.169v so far. I can get to 1040/540/120% in CrossFire. Anything more on the core, crash. Anything more on the mem and power target, same FPS. 570 on the mem, some artifacts.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fjordiales*
> 
> Strix is 10+2 BUT it only runs 1.169v so far. I can get to 1040/540/120% in xfire. Anything more in core, crash. Anything more in mem and power target, same fps. 570 in mem, some artifacts.


It makes no sense that they would build such nice power circuitry and then set the voltage so low.... The Tri-X ships with 1.2** like the Fury X....

Weren't they aiming the Strix at overclockers? Not that it will matter once you get voltage control, but no one even knows when that is coming. This is the longest wait I've seen for voltage control in a long, long time....


----------



## bonami2

Quote:


> Originally Posted by *fjordiales*
> 
> Strix is 10+2 BUT it only runs 1.169v so far. I can get to 1040/540/120% in xfire. Anything more in core, crash. Anything more in mem and power target, same fps. 570 in mem, some artifacts.


Voltage is locked?

Edit: it is, it seems


----------



## Orthello

Quote:


> Originally Posted by *fjordiales*
> 
> Strix is 10+2 BUT it only runs 1.169v so far. I can get to 1040/540/120% in xfire. Anything more in core, crash. Anything more in mem and power target, same fps. 570 in mem, some artifacts.


Once the new version of TriXX with voltage control is out you might have some fun. I wonder whether it will give the Asus Fury vcore control, or if that card is non-standard? Let us know either way.

If it works, I'm especially keen to see your results with the Asus Fury.


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> I don't know where you're getting this from. Both Fury and Fury X use the same PCB with the same 6 phase power. The only card that is different is the Strix with 12 phase power.
> 
> http://www.legitreviews.com/sapphire-radeon-r9-fury-tri-x-oc-video-card-review_169018
> 
> "The AMD Radeon R9 Fury is a 275W card that has two 8-pin PCIe power connectors on it. *The Radeon R9 Fury has a 6-phase power design that is capable of delivering up to 400 Amps of power to the Fiji GPU.* There are two DIP switches on the backplate that allow you to enable or disable the GPU Tach and allows you to change the color of the LEDs between red and blue to go with your case theme better. Eight of the LEDs are for the GPU load level and can be red or blue. The ninth LED light is green and when it is lit up it visually lets you know that the GPU is in AMD's ZeroCore power mode."
> 
> 
> 
> Who would be crazy enough to put that kind of power delivery on a GPU with that kind of power consumption?
> 
> 7950s are known to melt their 6-phase VRMs if the voltage is pushed
> 
> 7970s are on 8 phases
> 
> The Asus Matrix had 20 phases.
> 
> And you're telling me those Furys have 6 phases with air cooling...
> 
> How are the temps on those phases?
> 
> Found it: http://www.tomshardware.com/reviews/sapphire-amd-radeon-r9-fury-tri-x-overclocked,4216-7.html
> 
> Melting piece of crap
> 
> They didn't even touch the voltage.......
> 
> I love Sapphire: "we put the best cooler in the world on a stock power design" and people think they can overclock
> 
> At least the Strix is decent enough
Click to expand...

Because it's not rocket science: it's a 6-phase system capable of pushing 400 amps. You still have a lot to learn about computers. The 6-phase system isn't crap; your lack of knowledge is. The lesson here is that it is the QUALITY of the phases that matters, not the QUANTITY. Oh, but wait, I've already told you this in threads about 990FX motherboards.

Just because the 6-phase system on your 7950 gets hot doesn't mean the Fury will have the same issue. I'm almost willing to bet the 6-phase system on your 7950 would struggle to push even half that sort of amperage.

Before you label something as crap, please learn the specifics of the technology you are trying to label. Because, once again, your ranting will potentially scare potential buyers off the product.


----------



## Agent Smith1984

I'm surprised there is nothing out of the Asus GPUTweak camp regarding this....

I have seen several cards over the last few years where certain versions of GPUTweak were either the only app that would unlock voltage, or the only app to give you max voltage control. They were on the cutting edge of voltage control for a while there, especially when it came to their own cards, but they seem to have drifted away from that some....


----------



## xer0h0ur

I am waiting to see if MSI is going to step in with a Fury Lightning using their own custom PCB design. Some reviewers alluded to the Asus Strix's PCB design possibly being used later on in an OC variant of the Fury, which I get and all, but then what was the point of using it at all on the Strix? Testing the design?


----------



## bonami2

Quote:


> Originally Posted by *Alastair*
> 
> Because it's not rocket science: it's a 6-phase system capable of pushing 400 amps. You still have a lot to learn about computers. The 6-phase system isn't crap; your lack of knowledge is. The lesson here is that it is the QUALITY of the phases that matters, not the QUANTITY. Oh, but wait, I've already told you this in threads about 990FX motherboards.
> 
> Just because the 6-phase system on your 7950 gets hot doesn't mean the Fury will have the same issue. I'm almost willing to bet the 6-phase system on your 7950 would struggle to push even half that sort of amperage.
> 
> Before you label something as crap, please learn the specifics of the technology you are trying to label. Because, once again, your ranting will potentially scare potential buyers off the product.


Wow. Quality is one thing and operating temperature is another. It can do 400 amps on LN2, yeah; that is marketing crap...

And the link I shared shows the VRM hitting 100°C at stock..

That is a piece of crap; stop defending crap..

I'm at 1.168v at 1070 core, hitting 90°C on my 7950

So you think the Sapphire is as strong as the Asus, or what? The Asus phases are higher quality... higher-quality soldering, a higher number of phases, etc.

Anyway, I'm happy with my 12-phase MSI mobo that had 55-60°C VRM load temps at 4.7 GHz and 1.32v; that's cold as ice


----------



## fjordiales

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Makes no sense as to why they would build such a nice power circuitry, and then set the voltage so low.... The Tri-x ships with 1.2** like the Fury X....
> 
> Weren't they aiming the strix at overclockers? Not that it will matter once you get voltage control, but no one even knows when that is coming. This is the longest wait I've seen for voltage control in a long long time....


Quote:


> Originally Posted by *bonami2*
> 
> Voltage is locked?
> 
> Edit it is it seem


Quote:


> Originally Posted by *Orthello*
> 
> Once that new version of Trix is out with voltage control you might have some fun , i wonder if that will give the asus fury vcore control or if its non standard ? let us know anyway.
> 
> If it works especially keen to know your results with the Asus Fury .


I'm just gonna reply to everyone since I'm on my phone. The Sapphire does get to 1.212v and has two BIOSes. From my understanding, the Strix is aimed at silence first, then performance. It seems to be BIOS/voltage locked to 1.169v.

When the new voltage control comes out, I will experiment as soon as I can. With the Strix's custom PCB and power phases but a BIOS/voltage lock, it feels like driving a BMW M3 with the rev limiter at 4k RPM. LOL


----------



## bonami2

What is wrong with you? I'm the one with two infrared thermometers, I'm the one who reads Sin's motherboard specs, I'm the one who always watches my VRM temps...

And you're the one telling me that the stock factory VRMs are high quality.

Well, that's why LN2 overclockers overclock on stock Sapphire GPUs, for sure..........................................................................

MSI Lightning, Asus Matrix, TriXX, etc.

And the soldering on the new Asus lineup is probably beating anything from anyone currently

http://www.legitreviews.com/asus-auto-extreme-technology-fully-automated-video-card-production_165392

Bam


----------



## bonami2

I must say I don't understand at all what is happening

All I said was that it's stupid to put 6 phases on air....

I'm told they are higher quality....

They run at 100°C stock..

It's an enthusiast GPU with a triple fan

It's bad like the Gigabyte and runs hot as hell

Put that in 3-way CrossFire and it's going to melt those VRMs

But I'm wrong

Even the local Canada Computers told me they couldn't handle the Gigabyte triple-fan cards in SLI/CrossFire setups because their phases are crappy and heat up like mad.

But the Asus Strix is just marketing......


----------



## rv8000

Because I'm sick and tired of all the people saying Fiji OC scaling sucks and that voltage scaling is going to be terrible. Well, guess what: it's not, and here's why:

Test system - i5 4670k @ 4.4ghz/ 16GB DDR3 @ 2400 Cas 10/ Cat 15.7 w/default CCC settings

Witcher 3 @ 1440/ Quality - Ultra *Foliage visibility set to high/ Gameworks - off/ In Game AA - on

The following scores were averaged from 3 consecutive runs from the first town in the game, on a short run to the devil at the well, including combat against 3 ghouls for each clock setting; data was captured using FRAPS. 2 minute and 23 second run time.
_____________________________________

*R9 Fury @ 1000/500*

Min - 45 FPS Avg - 59.196 FPS Max - 75 FPS

*R9 Fury @ 1100/530 +50% PL no overvoltage*

Min - 54 FPS Avg - 64.585 FPS Max - 81 FPS
_____________________________________

Min FPS % increase - 20%

Avg FPS % increase - 9.11%

Max FPS % increase - 8%
_____________________________________

So to summarize: for roughly a 10% OC on the card, with no voltage control, using a stock BIOS, *avg FPS in TW3 saw an increase of 9.11% and, surprisingly, the min framerate saw an increase of 20%!* If anyone really wants to say Fiji is a dud of an overclocker, let them; I'll be sitting here laughing while I bench and game on my Fury with a 10-15% lead on average vs an *overclocked* 980, which people likely paid upwards of $600 for when they originally bought their AIB cards. A 20% OC on an R9 Fury is likely to show an increase of ~18.5% before really messing with the vram, which I also don't find to be worth it.
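The percentage gains above can be reproduced from the raw FRAPS numbers with a quick sketch (`pct_gain` is just a hypothetical helper for the arithmetic; the average works out to ~9.10% by this math, essentially the 9.11% quoted):

```python
# Reproduces the FPS percentage gains from the posted benchmark numbers.
def pct_gain(base: float, oc: float) -> float:
    """Percentage increase from base to oc."""
    return (oc - base) / base * 100

results = {  # metric: (stock 1000/500, overclocked 1100/530)
    "min": (45, 54),
    "avg": (59.196, 64.585),
    "max": (75, 81),
}

for metric, (base, oc) in results.items():
    print(f"{metric}: {pct_gain(base, oc):.2f}%")

# Scaling efficiency of the average FPS relative to the 10% core
# overclock (1000 -> 1100 MHz): gains track the clock bump closely.
clock_gain = pct_gain(1000, 1100)
print(f"avg scaling efficiency: {pct_gain(59.196, 64.585) / clock_gain:.0%}")
```

By this arithmetic the average FPS scales at roughly 91% of the core clock increase, which is the near-linear scaling being argued for here.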

See you guys when TrixX lets me pump volts through this thing









P.S. Forgive my rage; I'm really sick of the people blasting the forums today who have no clue what they're talking about.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> Just because I'm sick and tired of all the people saying Fiji OC scaling sucks, and voltage scaling is going to be terrible, well guess what it's not, and here's why:
> 
> Test system - i5 4670k @ 4.4ghz/ 16GB DDR3 @ 2400 Cas 10/ Cat 15.7 w/default CCC settings
> 
> Witcher 3 @ 1440/ Quality - Ultra *Foliage visibility set to high/ Gameworks - off/ In Game AA - on
> 
> *The following scores were averaged from 3 consecutive runs from the first town in the game, on a short run to the devil at the well, including combat against 3 ghouls for each clock setting and data was captured using FRAPS. *2 minute and 23 second run time.
> _____________________________________
> 
> *R9 Fury @ 1000/500*
> 
> Min - 45 FPS Avg - 59.196 FPS Max - 75 FPS
> 
> *R9 Fury @ 1100/530 +50% PL no overvoltage*
> 
> Min - 54 FPS Avg - 64.585 FPS Max - 81 FPS
> _____________________________________
> 
> Min FPS % increase - 20%
> 
> Avg FPS % increase - 9.11%
> 
> Max FPS % increase - 8%
> _____________________________________
> 
> So to summarize, for roughly a 10% oc on the card with no voltage control using a stock bios, *avg FPS in TW3 saw an increase of 9.11% and surprisingly the min framerate saw an increase of 20%!* If anyone would really like to say Fiji is a dud of an overclocker, let them, I'll be sitting here laughing while I bench and game on my Fury with 10-15% on average vs an *overclocked* 980, that people likely paid upwards of $600 for when they originally bought their AIB cards. A 20% OC on an r9 Fury is likely to show an increase of ~18.5% before really messing with the vram, which I also don't find to be worth it.
> 
> See you guys when TrixX lets me pump volts through this thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. forgive my rage, really sick of people blasting the forums who have no clue what they're talking about today


Those are awesome results...

But be careful with voltage; reviews show those VRMs to be hot as lava at stock


----------



## fjordiales

Quote:


> Originally Posted by *rv8000*
> 
> Just because I'm sick and tired of all the people saying Fiji OC scaling sucks, and voltage scaling is going to be terrible, well guess what it's not, and here's why:
> 
> Test system - i5 4670k @ 4.4ghz/ 16GB DDR3 @ 2400 Cas 10/ Cat 15.7 w/default CCC settings
> 
> Witcher 3 @ 1440/ Quality - Ultra *Foliage visibility set to high/ Gameworks - off/ In Game AA - on
> 
> *The following scores were averaged from 3 consecutive runs from the first town in the game, on a short run to the devil at the well, including combat against 3 ghouls for each clock setting and data was captured using FRAPS. *2 minute and 23 second run time.
> _____________________________________
> 
> *R9 Fury @ 1000/500*
> 
> Min - 45 FPS Avg - 59.196 FPS Max - 75 FPS
> 
> *R9 Fury @ 1100/530 +50% PL no overvoltage*
> 
> Min - 54 FPS Avg - 64.585 FPS Max - 81 FPS
> _____________________________________
> 
> Min FPS % increase - 20%
> 
> Avg FPS % increase - 9.11%
> 
> Max FPS % increase - 8%
> _____________________________________
> 
> So to summarize, for roughly a 10% oc on the card with no voltage control using a stock bios, *avg FPS in TW3 saw an increase of 9.11% and surprisingly the min framerate saw an increase of 20%!* If anyone would really like to say Fiji is a dud of an overclocker, let them, I'll be sitting here laughing while I bench and game on my Fury with 10-15% on average vs an *overclocked* 980, that people likely paid upwards of $600 for when they originally bought their AIB cards. A 20% OC on an r9 Fury is likely to show an increase of ~18.5% before really messing with the vram, which I also don't find to be worth it.
> 
> See you guys when TrixX lets me pump volts through this thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. forgive my rage, really sick of people blasting the forums who have no clue what they're talking about today


I'm jealous of your OC results since we have the same cards. Might try your settings and give it a shot again.


----------



## xer0h0ur

Quote:


> Originally Posted by *bonami2*
> 
> Those are awesome results...
> 
> But be careful with voltage; reviews show this GPU to be hot as lava at stock


The GPU does not run hot. You keep speaking nonsense. Is nonsense your first language?


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> The GPU does not run hot. You keep speaking nonsense. Is nonsense your first language?


Edit: my error, I wrote GPU but meant VRM


----------



## xer0h0ur

Learn the terminology you're using, at the very least. The VRMs != the GPU. You flat out said GPU.... which is not running "hot as lava"....


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> Learn the terminology you're using at the very least. The VRMs /= GPU. You flat out said GPU....which is not running "hot as lava"....


Oh, well, yeah, sorry, I meant the voltage regulator modules

Sorry about that; I didn't really catch it. I was thinking "graphics card"...... not "graphics processing unit"....


----------



## Ceadderman

Quote:


> Originally Posted by *rv8000*
> 
> 
> 
> Spoiler: Tell us how you really feel!
> 
> 
> 
> Just because I'm sick and tired of all the people saying Fiji OC scaling sucks, and voltage scaling is going to be terrible, well guess what it's not, and here's why:
> 
> Test system - i5 4670k @ 4.4ghz/ 16GB DDR3 @ 2400 Cas 10/ Cat 15.7 w/default CCC settings
> 
> Witcher 3 @ 1440/ Quality - Ultra *Foliage visibility set to high/ Gameworks - off/ In Game AA - on
> 
> *The following scores were averaged from 3 consecutive runs from the first town in the game, on a short run to the devil at the well, including combat against 3 ghouls for each clock setting and data was captured using FRAPS. *2 minute and 23 second run time.
> _____________________________________
> 
> *R9 Fury @ 1000/500*
> 
> Min - 45 FPS Avg - 59.196 FPS Max - 75 FPS
> 
> *R9 Fury @ 1100/530 +50% PL no overvoltage*
> 
> Min - 54 FPS Avg - 64.585 FPS Max - 81 FPS
> _____________________________________
> 
> Min FPS % increase - 20%
> 
> Avg FPS % increase - 9.11%
> 
> Max FPS % increase - 8%
> _____________________________________
> 
> So to summarize, for roughly a 10% oc on the card with no voltage control using a stock bios, *avg FPS in TW3 saw an increase of 9.11% and surprisingly the min framerate saw an increase of 20%!* If anyone would really like to say Fiji is a dud of an overclocker, let them, I'll be sitting here laughing while I bench and game on my Fury with 10-15% lead on average vs an *overclocked* 980, that people likely paid upwards of $600 for when they originally bought their AIB cards. A 20% OC on an r9 Fury is likely to show an increase of ~18.5% before really messing with the vram, which I also don't find to be worth it.
> 
> See you guys when TrixX lets me pump volts through this thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. forgive my rage, really sick of people blasting the forums who have no clue what they're talking about today


If this is Rage bro, then I don't wanna be anywhere near you when you really have a meltdown.









+Rep for this.









~Ceadder


----------



## rx7racer

Has anybody reported temps after putting either a Fury X or Fury under the EK-FC block?

I've been in a huge debate about maybe springing for a Fury, since one can at least find them, and dunking it under some water. But if it isn't much better than Fury X temps, then meh. It should be much better, but I haven't seen anyone who got an EK-FC block report back or show results; I might have missed it if they did.


----------



## bonami2

Quote:


> Originally Posted by *rx7racer*
> 
> Has anybody reported temps after putting either a Fury X or Fury under the EK-FC block?
> 
> I've been in a huge debate about maybe springing for a Fury, since one can at least find them, and dunking it under some water. But if it isn't much better than Fury X temps, then meh. It should be much better, but I haven't seen anyone who got an EK-FC block report back or show results; I might have missed it if they did.


It would surely have better temps with a big rad. But the VRMs are another story, unless you water-cool them too, with pads and a pipe and such, or something with a fan...

The Strix on water should do pretty well, but it's a custom PCB, so I don't know


----------



## rv8000

Quote:


> Originally Posted by *bonami2*
> 
> It sure would have better temp with a big rad. But the vrm is another story except if you watercool them with pad and pipe and etc. or something with fan...
> 
> The strix on water should do pretty well. but it a custom pcb so idk


Please stop this nonsense about the VRMs; under water there would be no issue with a full-cover block. Stop posting about something you are proving to know very little about.


----------



## fjordiales

Quote:


> Originally Posted by *rv8000*
> 
> Just because I'm sick and tired of all the people saying Fiji OC scaling sucks, and voltage scaling is going to be terrible, well guess what it's not, and here's why:
> 
> Test system - i5 4670k @ 4.4ghz/ 16GB DDR3 @ 2400 Cas 10/ Cat 15.7 w/default CCC settings
> 
> Witcher 3 @ 1440/ Quality - Ultra *Foliage visibility set to high/ Gameworks - off/ In Game AA - on
> 
> *The following scores were averaged from 3 consecutive runs from the first town in the game, on a short run to the devil at the well, including combat against 3 ghouls for each clock setting and data was captured using FRAPS. *2 minute and 23 second run time.
> _____________________________________
> 
> *R9 Fury @ 1000/500*
> 
> Min - 45 FPS Avg - 59.196 FPS Max - 75 FPS
> 
> *R9 Fury @ 1100/530 +50% PL no overvoltage*
> 
> Min - 54 FPS Avg - 64.585 FPS Max - 81 FPS
> _____________________________________
> 
> Min FPS % increase - 20%
> 
> Avg FPS % increase - 9.11%
> 
> Max FPS % increase - 8%
> _____________________________________
> 
> So to summarize, for roughly a 10% oc on the card with no voltage control using a stock bios, *avg FPS in TW3 saw an increase of 9.11% and surprisingly the min framerate saw an increase of 20%!* If anyone would really like to say Fiji is a dud of an overclocker, let them, I'll be sitting here laughing while I bench and game on my Fury with 10-15% lead on average vs an *overclocked* 980, that people likely paid upwards of $600 for when they originally bought their AIB cards. A 20% OC on an r9 Fury is likely to show an increase of ~18.5% before really messing with the vram, which I also don't find to be worth it.
> 
> See you guys when TrixX lets me pump volts through this thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. forgive my rage, really sick of people blasting the forums who have no clue what they're talking about today


You handled your rage better than me. I PMed others about what I did, and I have no regrets, so I might as well tell everyone. Basically, last week I got these.





I also still have my 780 Ti Classy in SLI in my wife's build.



But unfortunately I got disappointed on this.





And this...

https://forums.geforce.com/default/topic/833016/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/

So I went on a "rage refund" since I was disappointed. I'm not into the GPU brand/tech wars; I just love new tech. If the Fury X had a custom air cooler, I would've gotten it. But with the R9 Fury being $100 less per card and performing close to it, I went with that. I know I'm gonna get a lot of flak for this, but the Zotac card wasn't all that compared to the Strix & Tri-X. Granted, it is the "fastest" Maxwell out of the box, but still, there are a lot of complaints about coil whine, erratic fans, noise, the wrong backplate, and, my complaint, WARRANTY. I'm happy with my Fury. I need to do more research on AMD stuff, though, because I've only had Nvidia since 2011.


----------



## bonami2

Well, did you say full-cover block? I didn't see that..... I saw "EKFC" and had no idea whether you were talking about a full-cover block or not (I see it now). Like people slapping an H60 on a GPU; I was just sharing.

Not my fault if I know how blocks work but didn't think of the VRM part...... I was thinking of a core-only block, uh

But anyway, the nonsense is you asking whether a water loop with a 360 or 480 rad will cool better than a 120mm.

Thank you for the friendliness...............

Edit: I'M NOT EVEN RESPONDING TO THE RIGHT PERSON


----------



## bonami2

Quote:


> Originally Posted by *fjordiales*
> 
> You handled your rage better than me. I PM others regarding what I did and I have no regrets. So I might as well tell everyone what I did. Basically, last week I got these.
> 
> 
> 
> 
> 
> I also still have my 780 Ti Classy in SLI in my wife's build.
> 
> 
> 
> But unfortunately I got disappointed on this.
> 
> 
> 
> 
> 
> And this...
> 
> https://forums.geforce.com/default/topic/833016/gtx-780-possible-fail-as-performance-in-the-witcher-3-wild-hunt-/
> 
> So I went on a "rage refund" since I got disappointed. I'm not into the GPU brand/tech wars. I just love new tech. If fury X had a custom aircool, I would've got it. But with r9 fury being $100 less per card and performs close, I went with it. I know I'm gonna get a lot of flak on this but the zotac card wasn't all that compared to Strix & Tri-X. Granted it is the "fastest" Maxwell out of the box but still, there are a lot of complaints regarding coil whine, erratic fan, noise, wrong backplate, and my complaint being WARRANTY. I'm happy with my Fury. I need to do more research on AMD stuff though because I only had Nvidia since 2011.


Happy to see you got those Asus cards. Three-year warranty

Even Sapphire is on two years now; it was one year not so long ago


----------



## rx7racer

Quote:


> Originally Posted by *bonami2*
> 
> Well, did you say full-cover block? I didn't see that..... I saw "EKFC" and had no idea whether you were talking about a full-cover block or not (I see it now). Like people slapping an H60 on a GPU; I was just sharing.
> 
> Not my fault if I know how blocks work but didn't think of the VRM part...... I was thinking of a core-only block, uh
> 
> But anyway, the nonsense is you asking whether a water loop with a 360 or 480 rad will cool better than a 120mm.
> 
> Thank you for the friendliness...............


Well, your input is fine in my book. I ask because, looking at the back of the EK-FC for the Fury X and Fury, it doesn't look like there's an active channel for water to flow through there.

I'm actually not as worried about the GPU as about the VRMs because of this. Someone might have asked EK already and I missed it, but like I said, it appears the full block doesn't actively cool the 6-phase VRM section.

Details, always in the details.

And I'd be running it in my current loop with a 360 rad and a 240 rad. I'm just curious about the specifics, if anyone has an idea how the full-cover block handles the VRMs.


----------



## bonami2

Quote:


> Originally Posted by *rx7racer*
> 
> Well your input is fine in my book, and I ask because it doesn't look like they have an active channel to flow water looking at the back of the EKFC for the Fury X and Fury.
> 
> I'm actually not worried about the gpu as much as the vrm's due to this. Someone might have asked EK already and I missed it but like I said it appears the full block doesn't actively cool the 6 phase vrm section.
> 
> Details, always in the details.
> 
> And I'd be running it in my current loop with a 360 rad and a 240 rad. I'm just curious about the specifics, if anyone has an idea how the full-cover block handles the VRMs.


Oh crap, there are two guys; sorry, I was sure you had replied to me rudely, but it was the other guy...... Damn, I love those guys who are always trying to control everyone when they should just shut up and hide...

Well, I was asking myself the same question, considering water for a possible future extreme build of my own. The VRMs seem to be well cooled on most blocks.... but those crappy 6 phases may be harder to cool; look at the Fury, where the heatpipe goes right over them..

That may be exactly the problem: if AMD put a pipe on it, that means it's hot and hard to cool, I think.

Personally, I'd take the Asus if the core temp is good enough. Sure, it's going to be noisier than water, though

Thanks for being friendly


----------



## rv8000

Quote:


> Originally Posted by *rx7racer*
> 
> Well your input is fine in my book, and I ask because it doesn't look like they have an active channel to flow water looking at the back of the EKFC for the Fury X and Fury.
> 
> I'm actually not worried about the gpu as much as the vrm's due to this. Someone might have asked EK already and I missed it but like I said it appears the full block doesn't actively cool the 6 phase vrm section.
> 
> Details, always in the details.
> 
> And I'd be running it in my current loop with a 360 rad and a 240 rad. I'm just curious about specifics if anyone might have an idea how the full cover block handles the vrm's.


Directly from the product page on EK's site for the Fury/Fury X Block (ref pcb)...
Quote:


> EK-FC R9 Fury X *directly cools the GPU, HBM as well as VRM* (voltage regulation module) as water flows directly over these critical areas, thus allowing the graphics card and it's VRM to remain stable under high overclocks.


No one would touch their water blocks if they overlooked something as simple as cooling the VRMs properly.


----------



## Ehsteve

Quote:


> Originally Posted by *rx7racer*
> 
> Has anybody reported temps associated with putting either Fury X or Fury under the EKFC block?
> 
> Been in a huge debate on maybe dropping for Fury since one can at least find them and dunking it under some water. But if it isn't much better than Fury X temps then meh, although it should be much better I haven't seen anyone who said they got an EKFC report back or show, but I might have missed it as well if they did.


I've got crossfire Fury X's on EK full-cover blocks plus backplates, running cool at around 50-58°C under max load and 22-25°C at idle.

Components:

Radiators: 2x240 Koolance 30FPI radiator
CPU Block: Koolance 380I
GPU Blocks: 2xEK-FC Fury X NickelxAcetal
Fans: 4xNoctua Industrial 2000RPM PWM (running 10%-15% under full load)
Tubing: PrimoChill Clear 3/8ID 5/8OD
Fluid: Distilled water + silver coil.

Water loops runs: Pump>Rad>GPU>GPU>CPU>Rad>Reservoir>Pump

No overclock in place.


----------



## rx7racer

Quote:


> Originally Posted by *rv8000*
> 
> Directly from the product page on EK's site for the Fury/Fury X Block (ref pcb)...
> No one would touch their waterblocks if they overlooked something as simple as cooling the VRMs properly.


Right, right; the base plate is technically actively cooled, but I'm hesitant to fully trust that without seeing the block taken apart.

Looking at the back of the block, I see no room for much of a channel. And the connector plate, which is more than likely the channel guide for the VRMs, doesn't let you see what is actually going on.

Not going to apologize for questioning things.

I've read a few people saying they snagged a block; hopefully one of them will pop in and chime in a bit.

It's a basic question: if I go with a full-cover block I'll snag a Fury, but if the VRMs are still going to hit 85-90°C, I might as well get a Fury X instead, break my loop down to CPU-only, and run its simple AIO.

Just weighing my options out on the table.
Quote:


> Originally Posted by *Ehsteve*
> 
> .
> I've got crossfire Fury X's on EK full cover blocks plus backplate running cold at around 50-58 under max load and 22-25 without load.
> 
> Components:
> 
> Radiators: 2x240 Koolance 30FPI radiator
> CPU Block: Koolance 380I
> GPU Blocks: 2xEK-FC Fury X NickelxAcetal
> Fans: 4xNoctua Industrial 2000RPM PWM (running 10%-15% under full load)
> Tubing: PrimoChill Clear 3/8ID 5/8OD
> Fluid: Distilled water + silver coil.
> 
> Water loops runs: Pump>Rad>GPU>GPU>CPU>Rad>Reservoir>Pump
> 
> No overclock in place.


Thanks, I appreciate that tidbit; I will be grabbing the backplate as well. Depending on which version you got (not sure if yours is clear acrylic), could you see how it was channeled for the VRMs?

I should mention I also have an EK Supremacy VGA block, so I have a few options; technically I can drop whatever card I want on water and air-cool the VRMs. I want to make sure it'd even be worth springing for a FC block. I stopped using FC blocks after my GTX 480 SLI setup, since as we all know FC blocks usually take a heck of a hit on resale.

Anyway, I appreciate any info everyone has given me to better inform my purchase.


----------



## Ceadderman

Anyone running a Thuban with a Fury X or Fury X crossfire?

Doubtful but I figure it's worth the question.

If there are, are there any issues to report? Considering throwing in the towel and scooping up a Fury X and EKWB.









~Ceadder


----------



## Ehsteve

Quote:


> Originally Posted by *rx7racer*
> 
> Right right, which the base plate is technically actively cooled but I'm hesitant without seeing the block taken apart to fully trust that.
> 
> If you look at the back of the block I see no room for much of a channel. And that connector plate which is more than likely the channel guide for the VRM if that's it doesn't let you see what is actually going on.
> 
> Not going to apologize because I question things.
> 
> Have read a few saying they snagged a block, hopefully one of them will pop in and chime in a bit.
> 
> It's a basic question just because if I go with a full cover block I'm going to snag Fury but if vrms are still going to be hitting 85-90c then I might as well get Fury X instead and just break my loop to cpu only and run a simple AIO.
> 
> Just weighing my options out on the table.
> Thanks, appreciate that tidbit, I will be grabbing the backplate as well for use. Did you notice depending on which version you got, not sure if you got clear acrylic but could you see how it was channeled for the vrm's?
> 
> I should mention I also have a EK Supremacy VGA block so I have a few options because technically I can drop whatever card I want on water and aircool the vrms. Want to make sure it'd even be worth dropping for a FC block. I had stopped using FC blocks after my GTX 480 SLI setup as we all know FC blocks take a heck of a hit for resale usually.
> 
> Anyway I do appreciate any info all has given me to be able to educate my buy even more.


Just have a squiz at these instructions for the EK blocks to show where the thermal pads are placed:

Water Block: https://shop.ekwb.com/EK-IM/EK-IM-3831109830826.pdf
Backplate: https://shop.ekwb.com/EK-IM/EK-IM-3831109830840.pdf

The Nickel/Acetal version is completely opaque, so there's no chance to see the internal channels, but comparing the stock plate on the AIO to the EKWB solution, they're pretty much identical in contact points on the card (just from eyeballing at the time; I didn't pull out a ruler). But the numbers speak for themselves (though again it might be the loop setup or the extra radiator space).

The main advantage I personally saw with the EK-FC blocks was the single-slot solution (as much as I love the retro-style RADEON logo in neon red etched into the I/O plate), because I had to fit in a sound card and a PCIe SSD, which would otherwise be impossible with the stock cooler. Funnily enough, the Fury X wasn't the first choice for this build (got a pair of 980 Tis, but both mysteriously bricked with the EK solution after testing fine on the dry run). Took all the same precautions with the Fury X's and they worked; luck of the draw, I guess. I'm much happier with the Fury X's though, since they're no longer blocking vital airflow over the SSHDs or access to the SATA ports on the mobo, and they're MUCH lighter.

That said, I'm currently working through some crossfire idling issues where the second card sometimes won't wake from idle (verified with green load LED and 0MHz clock in MSI Afterburner), forcing a restart to get the lazy bastard to work again. Won't even get going if you start another benchmark or crossfire-compatible game or even force a crossfire profile. Performance is fine otherwise.


----------



## semitope

Quote:


> Originally Posted by *fjordiales*
> 
> You handled your rage better than me. I PM others regarding what I did and I have no regrets. So I might as well tell everyone what I did. Basically, last week I got these.


That looks so wrong: no space at all between the cards. Usually water > air in multi-GPU setups, but the Fury cards look like they leave more room. Should be fine.


----------



## rx7racer

Quote:


> Originally Posted by *Ehsteve*
> 
> Just have a squiz at these instructions for the EK blocks to show where the thermal pads are placed:
> 
> Water Block: https://shop.ekwb.com/EK-IM/EK-IM-3831109830826.pdf
> Backplate: https://shop.ekwb.com/EK-IM/EK-IM-3831109830840.pdf
> 
> The NickelxAcetal version is completely opaque, so no chance to see the internal channels, but I do know that from comparing the stock plate on the AIO to the EKWB solution, they're pretty much identical for contact points on the card (just from eyeballing at the time, didn't pull out a ruler). But the number speak for themselves (though again it might be the loop setup or the extra radiator space).
> 
> snip.


Maybe this will show why I'm asking. I know the copper plate touches the VRMs, but I don't see how it's actually in direct contact with the water without the heat having to travel farther down the block.

My quick snippet skills at work.











Just a simple observation and curiosity; I'm not saying it's a bad design or questioning that, just interested in the details.


----------



## Forceman

Quote:


> Originally Posted by *rx7racer*
> 
> Maybe this will show why i am asking. I know there is the copper plate touching vrm but I don't see how it's actually in direct contact with water without the heat having to travel farther down the block.
> 
> My quick snippet skills at work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just a simple observation and curiosity, not a bad design or anything not questioning that, just interested in the details.


I think the water channel is in the plexiglass in that area. The 290X blocks look the same, and they have the water channel on that side. That's why there's a nickel cover in that area.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> Directly from the product page on EK's site for the Fury/Fury X Block (ref pcb)...
> No one would touch their waterblocks if they overlooked something as simple as cooling the VRMs properly.


Well, the rule is to trust no manufacturer. You clearly didn't learn that one from overclock.net.

You own the Sapphire, with its high stock VRM temps, while the ASUS is the same price and has a better warranty.









So that means even if the product cooled the VRMs badly, people would still buy it thinking it didn't, because, well...









99% of people don't even know what VRM means. How do I know? I've worked in three shops and never seen anyone care at all; they just read the specs on the box.

Anyway, in 99% of cases you're right: that block should cool the VRMs extremely well.


----------






## xer0h0ur

This is the best I could manage to find to give you a view of the channel, since it's the acrylic block.


----------



## Ehsteve

Quote:


> Originally Posted by *xer0h0ur*
> 
> This is the best I could manage to find to give you a view at the channel since its the acrylic block.


Good spot, you can see the gasket from that angle showing the channel in and out of the VRM.


----------



## xer0h0ur

Yeah, I mean, I had no doubt there was a water channel for the VRMs. No one at EK is that full potato. If cooling the GPU is waterblocking 101, then cooling the VRMs is waterblocking 102.


----------



## en9dmp

Quote:


> Originally Posted by *rx7racer*
> 
> Has anybody reported temps associated with putting either Fury X or Fury under the EKFC block?
> 
> Been in a huge debate on maybe dropping for Fury since one can at least find them and dunking it under some water. But if it isn't much better than Fury X temps then meh, although it should be much better I haven't seen anyone who said they got an EKFC report back or show, but I might have missed it as well if they did.


I have two Fury X cards with EK blocks and an overclocked i5-4670K in my loop, cooled by an Alphacool NexXxoS UT280 rad. Under full GPU load on both cards for 30 minutes, I still can't get either core temp to hit 40°C...

The difference between the stock cooling and a full custom loop has been night and day for me. It's allowed me to get much better overclocks as well, even without voltage control, and it's now completely silent (apart from coil whine).

Sorry, I haven't checked the VRM temps, as I didn't think they were reported by any software. Are they available through GPU-Z?


----------



## Newbie2009

Quote:


> Originally Posted by *en9dmp*
> 
> I have 2 fury x cards with EK blocks and an overclocked i5-4670K in my loop cooled by an alphacool nexxxos UT280 rad. Under full GPU load on both cards for 30 mins, I still can't get either core temps to hit 40°C...
> 
> The difference between the stock cooling and a full custom loop has been night and day for me. Allowed me to get much better overclocks as well, even without voltage control, and its now completely silent (apart from coil whine).
> 
> Sorry I haven't checked the vrm temps as I didn't think these were being reported on any software. Are these available through GPU-Z?


Once you go custom loop on GPUs, there is no going back.


----------



## en9dmp

Quote:


> Originally Posted by *Newbie2009*
> 
> Once you go custom loop on GPUs, there is no going back.


Yeah, I've been custom ever since the first Zalman reserator came out back in the early 2000s... This was the first time I've bought a card before the blocks were available. I'm totally amazed by the temps, I've never seen a card run so cool. Dying to pump some more volts through and see what can be achieved!

My i5-4670K, on the other hand, is a different story... a poor overclocker that hits 95°C in Prime95. Its idle temps are only a few degrees lower than both Fury X's at load!


----------



## Newbie2009

Quote:


> Originally Posted by *en9dmp*
> 
> Yeah, I've been custom ever since the first Zalman reserator came out back in the early 2000s... This was the first time I've bought a card before the blocks were available. I'm totally amazed by the temps, I've never seen a card run so cool. Dying to pump some more volts through and see what can be achieved!
> 
> My i5-4670K on the other hand is a different story... Poor overclocker and hits 95°C on prime95. Its idle temps are only a few degrees lower than than both Fury Xs at load!


Very impressive indeed. Have you tried Witcher 3? That is the game that makes my cards run hottest, low 50s.


----------



## en9dmp

Quote:


> Originally Posted by *Newbie2009*
> 
> Very impressive indeed. Have you tried Witcher 3? That is the game that makes my cards run hottest, low 50s.


Haven't given W3 a try, but the temps I'm getting are the same across GTA V, Project CARS, Far Cry 4, Next Car Game, and 3DMark.


----------



## spyshagg

Quote:


> Originally Posted by *Forceman*
> 
> I think the water channel is in the plexiglass in that area. The 290X blocks look the same, and they have the water channel on that side. That's why there's a nickel cover in that area.


The EK block on the 290X does not do a good job cooling the VRMs compared to, say, the Aqua Computer full-cover block.

I have both the EK and the Aqua Computer. VRM temps on the Aqua Computer are below 40°C, while the EK's are almost 60°C.

Even though both blocks cool the GPU very well (1°C difference!), the VRMs on the EK are much worse.

The Xtreme Rigs 290X full-cover block round-up agrees with my findings.


----------



## Newbie2009

Quote:


> Originally Posted by *en9dmp*
> 
> Not given W3 a try, but the temps I'm getting are the same across GTA V, project cars, far cry 4, next car game, and 3d mark.


Still, sub-40°C is very impressive. I'd say most games push my cards to about 45°C.


----------



## rx7racer

Went ahead and pulled the trigger on the EK block and the Fury. Thanks for helping feed my curiosity, everyone.









You just never know, and there are always differences between full-cover blocks, so I had to ask.









Also, the block being reference-only is the main reason I'm asking; I just want the VRMs as cool as I can hope them to be for OC play.


----------



## Alastair

Does Aqua Computer sell a backplate for their Fury block? And which block performs better overall for core and VRM temps, EK or Aqua Computer?


----------



## 00riddler

Aqua Computer backplates are in development (stated on their forum).
There will be an active and a passive version.


----------



## Maximization

Quote:


> Originally Posted by *00riddler*
> 
> aquacomputer backplates are in development (stated in their forum).
> There will be an active and a passive version.


interesting


----------



## xer0h0ur

Quote:


> Originally Posted by *spyshagg*
> 
> The EK on the 290x does not do a good job cooling the VRM's compared to, say, the Aquacomputer full cover block.
> 
> I have both the EK and the Aquacomputer. The VRM temps on the Aquacomputer are bellow 40c and the EK temps are almost 60
> 
> Even though both blocks cool the GPU very well (1ºc difference!), the VRMs on the EK are much worse.
> 
> The xtremerigs 290x WC full blocks round up agrees with my findings.


LOL, we're talking about VRMs rated at 125°C. I personally couldn't care less if they were at 80°C. As long as the block keeps them well away from the original air-cooled VRM temps, it's fine by me. Interestingly enough, I've never noted my VRM temps at load throughout a gaming session. I suppose you have piqued my curiosity to log them next time I play.


----------



## spyshagg

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL were talking about VRMs rated at 125C. I personally couldn't care less if they were at 80C. As long as the block keeps them well away from the original air cooled VRM temps its fine by me. Interestingly enough I never have noted my VRM temps at load throughout a gaming session. I suppose you have peaked my curiosity to log it next time I play.


True, but we always want colder stuff, god damn it, lol.

Without backplates, the EK is much better than the Aqua Computer on the VRMs, though.

As you said, I'm happy with mine. 57°C is OK for 1.325V.


----------



## Scorpion49

Quote:


> Originally Posted by *rx7racer*
> 
> Went ahead and pulled the trigger for the EK block and the Fury, Thanks for helping to feed some curiosity all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just never know and there is always differences in full cover blocks so just had to ask.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also since the block is reference only that's main reason I'm asking, just want the vrms as cool as I can hope them to be for OC play.


Hope yours isn't a whiner like mine is bud. Good luck with it.


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL were talking about VRMs rated at 125C. I personally couldn't care less if they were at 80C. As long as the block keeps them well away from the original air cooled VRM temps its fine by me. Interestingly enough I never have noted my VRM temps at load throughout a gaming session. I suppose you have peaked my curiosity to log it next time I play.


While they may be rated for 125°C, they won't be very stable if they get anywhere near that hot.

Why bother going water if you're happy with 80°C VRM temps? My Vapor-X 290 keeps my VRMs happily under 60°C even when overvolted +75mV at 1175/1650MHz.


----------



## xer0h0ur

My hat's off to the people who go so overboard that they manage to keep Hawaii cores under 40°C. That is a serious amount of radiator real-estate overkill, and I have seen a number of people go that route.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gumbi*
> 
> While they may be rated for 125c, they won't be very stable if you get that hot at all.
> 
> Why bother going water if you're happy with 80c VRM temps? My Vapor X 290 keeps my VRMs happily under 60c even when overvolted to 75mv @1175/1650mhz.


Isn't it obvious? Because I value keeping the GPUs cool over keeping the VRMs cool, although the water block accomplishes both. If I didn't care about keeping things as cool as possible, I wouldn't have replaced all the thermal pads EK supplies with Fujipoly Ultra Extreme either. I merely believe it's nitpicking to want 40°C over 60°C when they are rated up to 125°C. For what it's worth, you're also comparing different PCB designs, which isn't an apples-to-apples comparison. A reference card like my 290X will never keep its VRMs as cool as, say, your Vapor-X does.


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Isn't it obvious? Because I value keeping the GPUs cool over keeping the VRMs cool although the water block accomplishes both. If I didn't care about keeping things as cool as possible I wouldn't have replaced all the thermal pads EK gives with Fujipoly Ultra Extreme either. I merely believe its nitpicking to want 40C temps over 60C temps when they are rated up to 125C. For what its worth you're also comparing different PCB designs which isn't an apples to apples comparison. A reference card like my 290X will never keep its VRMs as cool as say your Vapor X's.


But why do you consider keeping the core cool important and not the VRMs? Why not both? The core is perfectly safe at higher temps too, and yet you seem to value keeping it cool.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gumbi*
> 
> But why do you consider keeping the core cool and not consider keeping the VRMs as cool important? Why not both? The core is perfectly safe at higher temps too and yet you seem to value keeping that cool.


Because you're far more likely to run into performance issues related to GPU temps than VRM temps, although obviously things change when you're adding voltage and making those VRMs nice and toasty. Again, it's nitpicking to act as if 60°C isn't as tolerable as 40°C when they're rated up to 125°C. Plain and simple. The GPUs, on the other hand, aren't rated to handle nearly as high a temperature, so in my mind the priority will always be cooling the GPUs as well as possible first, and the VRMs after that.


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Because you're far more likely to run into performance issues related to GPU temps than VRM temps. Although obviously things change when you're adding voltage and making those VRMs nice and toasty. Again, you would be nitpicking to act as if 60C isn't perfectly as tolerable as 40C when they are rated up to 125C. Plain and simple. The GPUs on the other hand aren't rated to handle nearly as high of a temperature so clearly the priority will always in my mind be first cooling the GPUs as well as possible and then the VRMs after that.


Well, they're rated to at least 95°C per AMD's reference specs.

In any case, I see your point to an extent (stability has more to do with core temps than VRM temps).


----------



## sugarhell

125°C is the shutdown threshold. VRMs in general are rated for optimal efficiency at 70-80°C at stock settings, not 125°C.


----------



## xer0h0ur

The main reason I even water cooled my current rig was all because of the half-measure AIO solution Asetek provided AMD for the 295X2 which didn't water cool the VRMs coupled with the fact they lowered the max GPU temp to 75C due to the water pumps. Unless you have a decently large case with optimal airflow then you would end up reaching 74C, throttling AND your VRMs would be nice and toasty without even overclocking to begin with. I later ended up adding a 290X I got for a song and I am sure you know how noisy and crappy the reference design blower style cooler is with 290X's so I said screw it, revamped the loop, went full hog and dropped a matching block on the 290X. Haven't looked back since.


----------



## Agent Smith1984

I've found that keeping VRMs under 80°C can be a key factor in maintaining an overclock, whereas the core itself doesn't offer up much headroom whether it's hitting 65°C or 85°C...

Of course, that's speaking in terms of air cooling. I know the core's behavior changes a great deal when you shave 20°C or more off with water cooling, but I'm not sure whether the VRMs benefit from a temp drop that significant or not.


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> The main reason I even water cooled my current rig was all because of the half-measure AIO solution Asetek provided AMD for the 295X2 which didn't water cool the VRMs coupled with the fact they lowered the max GPU temp to 75C due to the water pumps. Unless you have a decently large case with optimal airflow then you would end up reaching 74C, throttling AND your VRMs would be nice and toasty without even overclocking to begin with. I later ended up adding a 290X I got for a song and I am sure you know how noisy and crappy the reference design blower style cooler is with 290X's so I said screw it, revamped the loop, went full hog and dropped a matching block on the 290X. Haven't looked back since.


Thumbs up and +1 to you ^^^^^^^^^^^^^^^^^^^









I had 3x R9 290X with full-coverage EK blocks and backplates. They stayed under 55°C at full tilt for hours.


----------



## xer0h0ur

Thank you. I love my setup, but tri-fire is oftentimes more of a headache than if I had just stayed in crossfire with the 295X2 alone. That's why I'm impressed by people's benches of moderately overclocked dual Fury X's matching my moderately overclocked 295X2 and 290X. For people who can afford dual Fury / Fury X: you have a hell of a nice setup.


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> LOL were talking about VRMs rated at 125C. I personally couldn't care less if they were at 80C. As long as the block keeps them well away from the original air cooled VRM temps its fine by me. Interestingly enough I never have noted my VRM temps at load throughout a gaming session. I suppose you have peaked my curiosity to log it next time I play.


Well, I would say up to 100°C is safe, so those blocks are effective as hell.

If you want better readings of VRM temps, an infrared thermometer does a great job, and they're cheap on Amazon.

You can even check water temps and such.


----------



## xer0h0ur

Well, for the 295X2 at least, I don't have much of an option for reading VRM temps, since there's no diode readout on that card. The 290X does give me VRM1 and VRM2 readouts, if I remember correctly. I've never gotten around to buying a halfway decent IR thermometer.


----------



## fjordiales

I'm at work and will be slowly upgrading to Win 10. I have to install a 1TB SSD, install Win 8.1, then upgrade to 10, then do a clean install. Same with my wife's PC on Win 7.

With that said, has anyone tried this yet?

http://www.guru3d.com/news-story/download-catalyst-15-7-1-drivers.html


----------



## xer0h0ur

People that were reporting issues with the 15.7 in Windows 10 seem to be okay with the latest Windows 10 driver available.

Edit: Although apparently there are some complaining VSR isn't working for them.


----------



## Jflisk

Quote:


> Originally Posted by *fjordiales*
> 
> I'm at work and will be slowly upgrading to win10. I have to install 1tb ssd, install win 8.1 tun upgrade to 10 then clean install. Same with my wife's pc with win 7.
> 
> With that said, has anyone tried this yet?
> 
> http://www.guru3d.com/news-story/download-catalyst-15-7-1-drivers.html


The Windows 10 drivers have been updating over the past week and getting better with every build.


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well as for the 295X2 at least I don't have much else of an option at reading VRM temps since there is no diode readout on that card. The 290X though does give me VRM1 and VRM2 readouts if I remember correctly. I have never gotten around to buying a halfway decent IR thermometer.


Well, I got one cheap from Canadian Tire and a gun-style one from Amazon. Both are about as accurate as the socket and/or GPU sensors.

The problem is that reflective surfaces and the like can't be read very well.

This is the one I just got: http://www.amazon.com/Etekcity-Lasergrip-Non-contact-Thermometer-Temperature/dp/B00K5QVBCU/ref=sr_1_3?ie=UTF8&qid=1438196677&sr=8-3&keywords=etekcity+infrared


----------



## Cyants

Got in the club just in time for Windows X!! I mean Windows 10


----------



## xer0h0ur

Windows X, Fury X. Half-Life 3 confirmed.


----------



## tx12

Quote:


> Originally Posted by *xer0h0ur*
> 
> Has anyone confirmed yet if Fury is hardware gimped or software gimped? In other words if flashing Fury with Fury X's BIOS would unlock those shaders?


Please Fury / Fury Air owners try this tool and post results in that thread:
http://www.overclock.net/t/1567179

The most interesting is the air-cooled Fury, of course, but with a Fury X it's possible to prove the tool works.


----------



## Ehsteve

Finally figured out I just had to disable ULPS to stop the second card idling (I thought I already had; apparently the settings didn't take last time).

Ran another demanding non-benchmark gaming session (real-world test) and couldn't hit 40°C (sat around 34-38°C) on either GPU under full load. I did have a shutdown (first card clocked down to 0MHz and forced a power down; second card was fine) after some subsequent stability testing, but it seems to have been a one-off.

Managed 1120/500 (a 6.7% increase in core clock) for a 5.4% increase in benchmark score (7109 > 7493 in Fire Strike Ultra: http://www.3dmark.com/fs/5519138 vs. http://www.3dmark.com/fs/5536271). Restored defaults while I keep working on memory and CPU clocks.
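Those scaling figures can be sanity-checked with quick arithmetic. Note the 1050MHz stock core clock below is my assumption (it isn't stated in the post); the scores are the ones quoted above:

```python
# Scaling check for the overclock figures quoted above.
# Assumes a 1050 MHz stock Fury X core clock (not stated in the post).
stock_clock, oc_clock = 1050, 1120
stock_score, oc_score = 7109, 7493  # Fire Strike Ultra scores from the post

clock_gain = (oc_clock / stock_clock - 1) * 100
score_gain = (oc_score / stock_score - 1) * 100

print(f"Core clock: +{clock_gain:.1f}%")   # +6.7%
print(f"Benchmark:  +{score_gain:.1f}%")   # +5.4%
print(f"Scaling efficiency: {score_gain / clock_gain:.0%}")
```

Roughly 81% of the clock gain showing up as benchmark gain suggests the run wasn't fully core-bound at these settings.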


----------



## Maximization

Quote:


> Originally Posted by *xer0h0ur*
> 
> Windows X, Fury X. Half-Life 3 confirmed.


huh????


----------



## xer0h0ur

LOL, how have you never seen the random correlations between things attached to the phrase "Half-Life 3 confirmed"?


----------



## bonami2

Half life 3 yeaaaaaaaaaaaaaaaaaaaaaaaaaa


----------



## Jflisk

So I have my second Fury installed, and my basic Fire Strike scores are the same as when I had 3x R9 290X installed. These cards are nice, to say the least. I need to get the memory overclocking going, and then I should be over my usual 3x R9 290X score. This is at 1100 on the core clocks. I decided to put this Fury X at the bottom of my case with the fan on intake.

Had to do one more edit: what's the release date on Half-Life 3?


----------



## By-Tor

Somebody started something with HL 3...lol


----------



## ozyo

I never play Half-Life


----------



## BaddParrot

Fury X + Win X + DX 12 + Gordon Freeman = Happy Pants!


----------



## xer0h0ur

Well I guess I am showing my age then. I realize younger generations didn't grow up playing those games but mine did. I'm 30 years old though.


----------



## Ceadderman

Shoot I am 46 and "grew up"(Yeah right) playing Half-Life. I've got them all.









Cannot wait for HL3. Was kind of hoping they would do more with Lost Coast, but so long as HL:3 is good, I'm good.









~Ceadder


----------



## BaddParrot

Black Mesa was just released 2-3 months ago.


----------



## Ceadderman

Drat!









Chalk that one on the to do list.


















~Ceadder


----------



## diggiddi

Guess what this is?


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Guess what this is?


A custom backplate?

Guys, I have a question.
Some rumours suggest the Nano is going to be a full Fiji die, just clocked insanely low. I doubt that will be the case.

But hypothetically speaking, what is stopping AIB partners from taking the Fiji XT dies they receive for the Nano and using them to make custom high-end cards, a la a Fury X Lightning, Matrix, or Toxic?


----------



## ozyo

TBH I'm 25, but back then I was a little too poor to have a PC.


----------



## WinterQuinn

Ended up getting this yesterday at Micro Center; slightly more expensive than Amazon's price, but at least I didn't have to wait (impatient even with two-day Amazon Prime shipping).

I'll def be getting another one in the future, but first I need to upgrade my PSU and monitors.


----------



## huzzug

Quote:


> Originally Posted by *WinterQuinn*
> 
> 
> 
> Ended up getting this yesterday at Micro Center, slightly more expensive than Amazon's price, but at least I didn't have to wait. (inpatient even with 2-day Amazon Prime shipping)
> 
> I'll def be getting another one in the future, but first I need to upgrade my PSU and monitors.


You should be fine with your current PSU even with a CFX setup, unless it's dead. Besides, it's a good model.


----------



## WinterQuinn

Quote:


> Originally Posted by *huzzug*
> 
> You should be fine with your current PSU even with a CFX unless its dead. Besides, its a good model.


For 2x Furys? I don't know, I still think it might be best to get a 1000W unit, just to be safe. But I'll do that once I'm ready to get my second one.


----------



## huzzug

Quote:


> Originally Posted by *WinterQuinn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *huzzug*
> 
> You should be fine with your current PSU even with a CFX unless its dead. Besides, its a good model.
> 
> 
> 
> For 2x Fury's? I don't know, I still think it might be best to get a 1000W version, just to be safe. But I'll do that once I'm ready to get my second one.
Click to expand...

Yup. You do not need any bigger PSU. In fact, even 750W ought to be enough for that CFX config along with an OC
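For anyone wondering where these wattage claims come from, here's a rough back-of-envelope headroom check. Every wattage below is a ballpark assumption, not a measurement from anyone's rig:

```python
# Rough PSU headroom estimate for a 2x Fury X (CrossFire) system.
# All wattages are ballpark assumptions, not measured values.
gpu_board_power = 275   # W per Fury X, typical board power (assumed)
gpu_oc_margin = 1.15    # ~15% extra for a mild overclock (assumed)
cpu_power = 140         # W for an overclocked CPU (assumed)
rest_of_system = 75     # W for board/RAM/drives/fans (assumed)

total = 2 * gpu_board_power * gpu_oc_margin + cpu_power + rest_of_system
print(f"Estimated worst-case DC load: {total:.0f}W")

for psu in (750, 850, 1000):
    print(f"{psu}W PSU load: {total / psu:.0%}")
```

With these generous worst-case margins the total lands around 850W, which is why 750W looks tight on paper; typical gaming draw sits well below that ceiling most of the time, which is how people get away with it.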


----------



## Agent Smith1984

Oh man, all this talk about Half Life 3..... I gotta admit, I don't usually go bananas for new games, but that will be purchased at launch for sure!!!

Half Life 2, in my opinion of course, was the dawn of really high end PC graphics.

I remember it like it was yesterday.... playing those crazy custom multiplayer matches, where you are the size of a mouse, trying to throw giant coffee mugs around.....

9800 Pro FTW









8 pipelines, 128MB of 256bit GDDR, at like 250MHz, hahaha


----------



## boi801

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man, all this talk about Half Life 3..... I gotta admit, I don't usually go bananas for new games, but that will be purchased at launch for sure!!!
> 
> Half Life 2, in my opinion of course, was the dawn of really high end PC graphics.
> 
> I remember it like yesterday.... playing those crazy custom multi player matches, where you are the size of a mouse, trying to throw giant coffee mugs around.....
> 
> 9800 Pro FTW
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 8 pipelines, 128MB of 256bit GDDR, at like 250MHz, hahaha


those were the days...

never forget 

https://www.techpowerup.com/gpudb/481/radeon-9500.html

modded to a 9700, it almost doubled the performance...

sorry for the off-topic post...


----------



## Agent Smith1984

Quote:


> Originally Posted by *boi801*
> 
> those were the days...
> 
> never forget
> 
> https://www.techpowerup.com/gpudb/481/radeon-9500.html
> 
> moded to 9700 almost doubled the performance...
> 
> sorry for the off post...


Went from a GeForce Ti4200, to an FX5500, to a 9600 Pro, to a 9700 Pro, to a 9800 Pro within a 6-month time frame..... those were my early days of hardcore PC.

Then 6600GT for Doom 3....

Then back to AMD for X800XL

Then X850XT PE

Then NVIDIA 7800GT

Then 7900GT (with the circuit pen volt mod and GTX BIOS)

Then an additional 7900GT in SLI with the volt mod.

Then I went on hiatus for about 7 years.....

Came back with 6670, then 7770, then 280x, then 290, then 290 Crossfire, now a single 390 and going crossfire 390's in two weeks.....

AAAHHHHHH









Oh, and don't even get me started on all the cards my little brother had, including a lot of BIOS-modded stuff, like 6800->6800GT, X850 Pro->X850XT.

Friggin love this stuff man!!!!


----------



## diggiddi

Quote:


> Originally Posted by *Alastair*
> 
> A custom backplate?
> 
> quote]
> 
> Does it not look like the side of the fury X?


----------



## Ceadderman

Fury X, I presume.









~Ceadder


----------



## Newbie2009

For people in the EU, overclockers have a few in stock https://www.overclockers.co.uk/showproduct.php?prodid=GX-375-AS&groupid=701&catid=56&subcat=3068


----------



## hyp36rmax

*+Update OP*

Added Signature: *Link*



Spoiler: Signature Code:






Spoiler: Official AMD R9 Radeon FURY / NANO / X/ X2 Owners Club



Code:


[center][url="http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club"][B]Official AMD R9 Radeon FURY / NANO / X/ X2 Owners Club[/B][/url][/center]


----------



## xer0h0ur

Quote:


> Originally Posted by *WinterQuinn*
> 
> For 2x Fury's? I don't know, I still think it might be best to get a 1000W version, just to be safe. But I'll do that once I'm ready to get my second one.


FWIW I am cranking out just a shade over 1000W total system power draw. I have never even once spiked over 1060W. Tri-fired Hawaii cores, overclocked. I think he is right in saying you're okay with that PSU you have.


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh man, all this talk about Half Life 3..... I gotta admit, I don't usually go bananas for new games, but that will be purchased at launch for sure!!!
> 
> Half Life 2, in my opinion of course, was the dawn of really high end PC graphics.
> 
> I remember it like yesterday.... playing those crazy custom multi player matches, where you are the size of a mouse, trying to throw giant coffee mugs around.....
> 
> 9800 Pro FTW
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 8 pipelines, 128MB of 256bit GDDR, at like 250MHz, hahaha


It felt like my 9800XT lasted for ages. It was a golden age of gaming for me back then.


----------



## Ceadderman

Haha 9800 was my first ATi. Man the drivers, the stories I can tell.









~Ceadder


----------



## BaddParrot

I have a cheap 750W power supply myself. I'm running an 8350 & one Fury X, both slightly OC'd. My battery backup software says I'm pulling 600-630W when I'm running most benchmarks on the card (and that's without any voltage OCs on the Fury).

Personally, I would not grab a 2nd Fury X without upgrading my PSU. But then, I'm probably the only person on here powering a $650 vid card with a $59 power supply


----------



## Ceadderman

Quote:


> Originally Posted by *BaddParrot*
> 
> I have a cheap 750w power supply my self. I'm running an 8350 & 1-Fury X both slightly OC. My Battery Back up software says i'm pulling over 600-630w when I'm running most bench marks on the card. ( & thats with out any voltage Oc's on the fury).
> 
> Personaly, I would not grab a 2nd Fury X with out Upgrading my PSU. But then, i'm probably the only person on here powering a 650$ vid card with a 59$ power supply


I am running an 850W with two 6870s and an 1100T. Still running a single 6870 though; ATM the whole system is down for a mod.









OVERKILL FTW!!!









~Ceadder


----------



## By-Tor

I remember watching The Screen Savers with Leo Laporte and Patrick Norton on Tech TV a lot; they premiered the 9700 prior to release and I jumped on one as soon as it hit the stores.

The ATI Rage 128 Pro was my first ATI card, and I loved it.


----------



## bonami2

Quote:


> Originally Posted by *BaddParrot*
> 
> I have a cheap 750w power supply my self. I'm running an 8350 & 1-Fury X both slightly OC. My Battery Back up software says i'm pulling over 600-630w when I'm running most bench marks on the card. ( & thats with out any voltage Oc's on the fury).
> 
> Personaly, I would not grab a 2nd Fury X with out Upgrading my PSU. But then, i'm probably the only person on here powering a 650$ vid card with a 59$ power supply


Cheap = low efficiency? At 75% efficiency, 600W from the wall minus 25% is only ~450W actually delivered to the system.









Hey, I'm going to power two 7950s for fun in a few days with my 650W, haha.

Just got my GPU to 307GB/s with the 1600MHz Elpida VRAM; that 7950 overclocks now. IDK, last time I tried, it crashed everything.







RMAs are fun... finally


----------



## Skinnered

Anybody running W10 with Fury CF? Any benefits, or issues? (BTW, do you have to update all the drivers for W10, i.e. chipset, Creative X-Fi, etc.?)

I'm hesitating upgrading from W8.1.


----------



## HellBoundgr

Quote:


> Anybody running W10 with Fury CF? Any benefit, or issue's? (btw, do man have to update all the drivers for W10 ie, chipset, creative X-Fi etc?)


I'm running W10 with Fury X in CF and it's working very well. The only problem I had with an older build of W10 was that it auto-installed back to older drivers the first time you install new drivers, but in the retail build it is fixed. No need to install chipset drivers on my MB, and if you need any other drivers just download the Win8 ones. So far, it's great


----------



## Skinnered

Quote:


> Originally Posted by *HellBoundgr*
> 
> Im running W10 with fury x in cf and its working very good. Only problem I had with older build in w10 was that it auto installed back to older drivers the first time you install new drivers. But retail build it is fixed. No need for install of chipset drivers on my mb, and if you need any drivers just download for win8. So far, its great


Thanks, good to hear. Gonna give it a try this weekend.


----------



## xer0h0ur

Yeah, I only have two issues with upgrading to W10 on my main installation's SSD. First, I don't know how well my Windows 7 drivers will work after upgrading, and second, I heard people are having issues with Steam or Steam games on W10, which would be a bad time for me.


----------



## p4inkill3r

I upgraded two of my comps via TeamViewer this morning but I've yet to test anything out.


----------



## Jflisk

Quote:


> Originally Posted by *Skinnered*
> 
> Anybody running W10 with Fury CF? Any benefit, or issue's? (btw, do man have to update all the drivers for W10 ie, chipset, creative X-Fi etc?)
> 
> I'm hesitating upgrading from W8.1.


I have Fury X in CrossFire on Windows 10. Just make sure you grab the latest drivers and you should be okay.


----------



## boi801

Hello,

Is there any software that can read the VRM temps on the Fury X?


----------



## bastian

Quote:


> https://www.reddit.com/r/3er4a7/amd_fury_x_fiji_voltage_scaling_techpowerup/cti3dx3


So much for Fury being a better overclocker with voltage changes.


----------



## xer0h0ur

As far as I know they are using an unpolished method of overvolting, which is causing conflicts with the voltage regulation. I wonder if they were actually monitoring their voltages while testing, to even verify whether it's holding onto the voltage or keeps dropping the voltage back down thinking there was an error.


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> As far as I know they are using an unpolished method of overvolting which is causing conflicts with the voltage regulation. I wonder if they were actually monitoring their voltages while testing to even verify if its holding onto the voltage or if it keeps dropping the voltage back down thinking there was an error.


Well, exactly, or a deadly vdroop. I was at 1.187V on my 7950, ran OCCT, and the voltage dropped to 1.108V.


----------



## diggiddi

Quote:


> Originally Posted by *Ceadderman*
> 
> Fury X, I presume.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Actually no,







it's the back of my phone case that I was trying to pass off as a Fury X. Alastair was not biting, I guess


----------



## Orivaa

Quote:


> Originally Posted by *bastian*
> 
> So much for Fury being a better overclocker with voltage changes.


Are those the guys who overvolted (is that right?) the card and then tested it with the stock fan profile? Yeah, bogus test.


----------



## xer0h0ur

Well, for what it's worth, I am not a fan of TechPowerUp because they are constantly shilling for Nvidia. Hell, not long ago they threw out a piece saying that AMD had not released a driver since the Omega. They were literally not counting any of the beta drivers.


----------



## Neon Lights

Quote:


> Originally Posted by *xer0h0ur*
> 
> As far as I know they are using an unpolished method of overvolting which is causing conflicts with the voltage regulation. I wonder if they were actually monitoring their voltages while testing to even verify if its holding onto the voltage or if it keeps dropping the voltage back down thinking there was an error.


I contacted W1zzard and he said that he does not know of anything that could be improved in the method that will be used in Sapphire Trixx / the method that he used in the test.


----------



## xer0h0ur

Then what's the holdup with releasing it, if that is the case? Then people can do legitimate testing showing more than a single arbitrary number. Everyone knows there are three framerates given in legit tests: min, max, and avg. This guy posts one, then makes no mention of the cooling: did he run the fan speed higher, was there any clock throttling, and he only tested @ 4K? How about some relevant testing like 1080p and 1440p? You know, what the vast majority of gamers use.


----------



## Ehsteve

A new architecture getting a rudimentary voltage crack, showing marginal performance in an unoptimized game, without the ability to replicate results, and as of now it remains unvalidated?

I suggest waiting for something more substantial before coming to any conclusions. Then again, it is a post from a 980 Ti fanboi, so a bit of a laugh.


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Fury X, I presume.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Actually no,
> 
> 
> 
> 
> 
> 
> 
> its the back of my phone case that I was trying to pass it off as a Fury X Alastair was not biting I guess
Click to expand...

You sly fox you! I presumed that because you had not given us a full shot of the card, it was not a card, so I assumed it was a part of one, like a custom 3D-printed backplate.

A question: can anyone tell me if the 15.7 Windows 10 drivers support the HD6000 series? Or is HD6000 now legacy? Maybe I need to get my rear into gear and get a Fury already.


----------



## Ceadderman

They should, but you won't get DX12 capability; only high-end HD7*** and above will get that.









~Ceadder


----------



## royfrosty

I just love the performance of 2x Fury X. In most games at higher resolutions, with the addition of the 2nd card, I have almost gotten twice the frame rate. Only certain console ports such as Far Cry 4 could not do that.


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> They should but you won't get DX12 capability. Only high end HD7*** and above will get DX12 capability.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Not worried about DX12. I just wanna know if there are Windows 10 drivers for the older cards


----------



## Dupl3xxx

Quote:


> Originally Posted by *Alastair*
> 
> not worried about DX12. I just wanna know if there are windows 10 drivers for the older cards



This should answer your question.


----------



## tx12

Good news everyone!
*Fury Air is unlockable!*

At least, sort of. More details soon.


----------



## rx7racer

Well it finally arrived, guess this will make for an interesting weekend if I can get some down time.



I'm prepared to be let down though, not expecting much more from this Fury than my 290X but after about 2 years can't help but want a new toy.


----------



## richie_2010

My small batch of Fury X backplates and LED kits has arrived.

Need to fix a slight boo-boo I've made, and then they're available.


----------



## Jflisk

Quote:


> Originally Posted by *rx7racer*
> 
> Well it finally arrived, guess this will make for an interesting weekend if I can get some down time.
> 
> 
> 
> I'm prepared to be let down though, not expecting much more from this Fury than my 290X but after about 2 years can't help but want a new toy.


Oh, not much, around 2X the performance of the R9 290X.


----------



## Scorpion49

Quote:


> Originally Posted by *rx7racer*
> 
> Well it finally arrived, guess this will make for an interesting weekend if I can get some down time.
> 
> 
> 
> I'm prepared to be let down though, not expecting much more from this Fury than my 290X but after about 2 years can't help but want a new toy.


All that power to play world of tanks...

Quote:


> Originally Posted by *Jflisk*
> 
> Oh not much around 2X the performance of the R9 290X.


Not really, no.


----------



## rx7racer

Quote:


> Originally Posted by *Scorpion49*
> 
> All that power to play world of tanks...
> Not really, no.


Hahaha, ah so true, but I'll make use of it with other titles eventually.....









No, really just hoping to play Witcher a bit better and a few other games here and there. Plus the wife needed an upgrade for Dragon Age so meh she can have the 290x now, if I can figure out a way to cool it in her setup.


----------



## Scorpion49

Quote:


> Originally Posted by *rx7racer*
> 
> Hahaha, ah so true, but I'll make use of it with other titles eventually.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No, really just hoping to play Witcher a bit better and a few other games here and there. Plus the wife needed an upgrade for Dragon Age so meh she can have the 290x now, if I can figure out a way to cool it in her setup.


Fury is a good card, I like mine even if it is a little expensive and a little slower compared to Nvidia. I like to try different things









Check yours for coil whine before you block it, that's all I can say! Mine is TERRIBLE. I'm considering trying to RMA it for that, but I'm not sure if another one will have the same problem.


----------



## Jflisk

Quote:


> Originally Posted by *Scorpion49*
> 
> All that power to play world of tanks...
> Not really, no.


No, really, yes. About 1.5x an R9 290X.

Fury X x2, slight overclock
http://www.3dmark.com/fs/5547294

R9 290X x3, slight overclock or no overclock. I have so many runs
http://www.3dmark.com/fs/5255598


----------



## Scorpion49

Quote:


> Originally Posted by *Jflisk*
> 
> No really yes. About 1.5 R9 290X
> 
> Fury X X2 Slight overclock
> http://www.3dmark.com/fs/5547294
> 
> R9 290X X 3 Slight overclock or no over clock. I have so many runs
> http://www.3dmark.com/fs/5255598


Look, a Fury isn't twice as fast as a 290X. If it were, it would be dominating overclocked Titan Xs and 980 Tis. A full Fury X can be around 40% faster depending on the game; in my own rig a single Fury is 15-25% faster than an 1100MHz 290X depending on what I'm doing, which is exactly what every review out there shows.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> All that power to play world of tanks...
> Not really, no.
> 
> 
> 
> No really yes. About 1.5 R9 290X
> 
> Fury X X2 Slight overclock
> http://www.3dmark.com/fs/5547294
> 
> R9 290X X 3 Slight overclock or no over clock. I have so many runs
> http://www.3dmark.com/fs/5255598
Click to expand...

Don't use Firestrike as a comparison







.....

Run both rigs at Extreme or Ultra if you want to compare Graphics score; your CPU cannot feed the GPUs fast enough at 1080p


----------



## Cyants

Quote:


> Originally Posted by *Scorpion49*
> 
> All that power to play world of tanks...


I have a Fury X and get around 60 FPS at Ultra High settings with HD textures and high AA at 2560x1080; it's a demanding game with the eye candy turned on...


----------



## Jflisk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Don't use Firestrike as a comparison
> 
> 
> 
> 
> 
> 
> 
> .....
> 
> Run both rigs at Extreme or Ultra if you want to compare Graphics score, your CPU cannot feed the GPU's fast enough at 1080p


I'll have to wait till I get home to do the run at Ultra or Extreme. I know I have a couple of runs from the 3x 290X somewhere. Thanks


----------



## Jflisk

Anyone that was in the Windows 10 Insider program have a problem installing the latest 15.7.1 drivers on the Fury X? Just seeing if anyone else has had the problem of the driver install hanging with a black screen, even though when I check Device Manager the video cards show the latest driver is installed. Just thinking they sent this driver in the last Win 10 driver update before the release, and that is why it's not installing.


----------



## Maximization

Stupid question: are the Fury X and Fury the same PCB?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Maximization*
> 
> Stupid question is fury x and fury same pcb?


If it's a Sapphire Fury then yes; if it's an Asus then no.

Most likely anything with a half-size PCB will be reference


----------



## rx7racer

Scaling with CF is nice, but it doesn't fully relate to a single Fury vs. a single 290X. I've read the numbers; I don't expect a 2x performance increase at all.

And yeah, Scorp, I am going to run it for probably a day or two without the block, to hopefully be free of coil whine, or at least hope it's not very bad. I've always been lucky in that regard, so here's to continued luck.









And only 60fps with a Fury X at that res is kinda scary; lord, this is going to perform worse than my 290X at those numbers in Tanks. But I kind of expect that until I get a couple of new monitors to be able to step up my needs. Honestly, @ 1080p even my HD6950 still hangs just fine in my HTPC (yeah, it's overkill haha).


----------



## Scorpion49

Quote:


> Originally Posted by *Cyants*
> 
> I have a Fury X and get around 60 FPS at Ultra High setting with HD textures everything with high AA in 2560x1080, it's a demanding game with the eye candy turned on...


It's very CPU demanding; a fast Intel will net you more gains than a better GPU, unfortunately. Check how your FPS goes up if you turn shadows, blur, and lighting down (those are all rendered by the CPU, because Russian).


----------



## Cyants

Quote:


> Originally Posted by *Scorpion49*
> 
> Its very CPU demanding, a fast Intel will net you more gains than a better GPU unfortunately. Check how your FPS goes up if you turn shadows, blur and lighting down (those are all rendered by CPU because russian).


I'll try that, but really, with FreeSync it does not matter; it's smooth as butter









BTW I have an i7 930 OC'd to 4GHz


----------



## Scorpion49

Quote:


> Originally Posted by *Cyants*
> 
> I'll try that but really with Freesync does not mater it,s smooth as butter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW I have a i7 930 OC to 4Ghz


It helps, I run it at 4K locked at 60fps and don't even need Freesync, max settings besides shadows and blur (I don't like the blur they put in).


----------



## Cyants

Got over 110 FPS with the shadows at medium; guess I'll turn on Vsync + FreeSync for 75Hz action.


----------



## rv8000

Quote:


> Originally Posted by *tx12*
> 
> Good news everyone!
> *Fury Air is unlockable!*
> 
> At least, sort of. More details soon.


What's the "sort of" part mean? D:


----------



## Cyants

Can be unlocked but keeps crashing?


----------



## tx12

Some Fury Air quick & dirty unlocking benchmarks:

Fury Air (Sapphire Tri-X) was used for this test. All settings at defaults, 1000MHz/500MHz, no overclocking. Driver 15.7.1 under Windows 7 x64.
Based on the cuinfo data, the card has at least one failed CU, so it was cut down to Fiji PRO for a solid reason.
However, each of Fiji PRO's shader engines contains TWO disabled CUs, for 8 inactive CUs in total. With one failed core it's still possible to enable 4 CUs (1 per SE) and maintain full symmetry. If all cores are enabled, the failed one turns on for work it can't properly handle and the system becomes unstable.

Original 3584 shaders:
FAH benchmark 1.2.0 implicit, single: 216.092 ns/day
Unigine Heaven 4.0 @1920x1200 (Ultra/Normal/x8): 1717


Added 4 CU, 3840 shaders:
FAH benchmark 1.2.0 implicit, single: 241.014 ns/day
Unigine Heaven 4.0 @1920x1200 (Ultra/Normal/x8): 1756
That's +11.5% for FAH and a tiny +2.3% for Heaven.


Enabled all CUs, including at least one dead core:

The card barely works; image artifacts all over the desktop coming from the failed core.
Any load hangs the card.
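A quick sanity check of those percentages against the raw figures above (straight arithmetic, nothing assumed):

```python
# Verify the quoted unlock gains from the raw benchmark figures above.
fah_stock, fah_unlocked = 216.092, 241.014        # ns/day, FAH benchmark 1.2.0
heaven_stock, heaven_unlocked = 1717, 1756        # Unigine Heaven 4.0 score

def gain(before, after):
    """Percentage improvement going from `before` to `after`."""
    return (after / before - 1) * 100

print(f"FAH:    +{gain(fah_stock, fah_unlocked):.1f}%")        # +11.5%
print(f"Heaven: +{gain(heaven_stock, heaven_unlocked):.1f}%")  # +2.3%
```

Worth noting the +11.5% FAH gain is even larger than the +7.1% theoretical shader increase (3840/3584), while Heaven barely moves, so the two workloads are clearly bound by different things.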


----------



## Crisium

Interesting. I just put together my Sapphire Tri X Fury build yesterday, running it at the stock OC of 1040/500 and am pleased with the performance, temps, and noise. I'll wait for some other people to take the plunge before trying this.


----------



## tx12

Anyone with Fury Air, please run cuinfo and post its report in the thread:
http://www.overclock.net/t/1567179

I'm still confused why Fiji PRO unlocks. It shouldn't. That could be a mistake on some first batches, or just ES chips.
More reports are needed to gather better statistics.


----------



## BaddParrot

I was running 2x Sapphire 100364-4GL Radeon R9 270X 4GB GDDR5 in CrossFire (both OC'd some).

Arguably, the R9 270X is the best "value" card on the market ATM. I could run pretty much any game at 1080p on max settings, including Witcher 3.
Crossfired 270Xs should be real close to a single 290X, especially at lower resolutions.

I chose to upgrade to the Fury X and a single-card solution (for now) and grab a 2K monitor.

I had to wait a week for Newegg to ship the Fury from CA to Florida. During that week, I had plenty of time to run benchmarks on my 270Xs!

The day I installed the Fury, I took the time to also flash my BIOS to the newest one (Asus Crosshair V Formula-Z MB). I forgot to save all my OC settings









I reset my 8350 to 4700MHz (it's close to where it was).

My results for now are basically showing only a 45% increase in 3DMark scores. I have no doubt that this will do nothing but go up as I install Win 10 and DX12 and the drivers get better.

http://www.3dmark.com/compare/fs/5460553/fs/5570171

2763/4008

My pc always takes a dump on the physics scores! It hates the 8350.

4040 was my best ultra score so far.



----------



## Dupl3xxx

Quote:


> Originally Posted by *BaddParrot*
> 
> I chose to upgrade to the fury X & a single card solution (For now) & grab a 2k monitor.


So 2048 x 1080? or did you mean 1920 ("2K") x 1080? Or did you mean 2560 x 1440, 2560 x 1600 or 2560 x 1080?

Please just use the actual resolution.


----------



## BaddParrot

Quote:


> Originally Posted by *Dupl3xxx*
> 
> 2k monitor.
> So 2048 x 1080? or did you mean 1920 ("2K") x 1080? Or did you mean 2560 x 1440, 2560 x 1600 or 2560 x 1080?
> 
> Please just use the actual resolution.


It doesn't matter at all. I have not ordered a new monitor yet. The Firestrike benchmarks are both set to Ultra.


----------



## forg3600

Anyone here have dual R9 Furys? I want to know about VRAM usage; can you post a screenshot?


----------



## diggiddi

Quote:


> Originally Posted by *Scorpion49*
> 
> All that power to play world of tanks...
> Not really, no.


LOL


----------



## Alastair

Seems I'm gonna play the lotto with some Sapphire Tri-X cards and try to partially unlock one. If I manage to re-enable 4 of the 8 disabled CUs, then I can close the gap to a full Fiji a bit, down to about a theoretical 7% gap between the two. And if I lose the lottery and get hard-locked cards? Well, I was always planning on buying a normal Fury anyways.


----------



## Alastair

Quote:


> Originally Posted by *tx12*
> 
> Good news everyone!
> *Fury Air is unlockable!*
> 
> At least, sort of. More details soon.


How do you only partially unlock it? Is it a BIOS? How does one go about doing this?


----------



## tx12

Quote:


> Originally Posted by *Alastair*
> 
> How do you only partially unlock it? Is it a BIOS? How does one go about doing this?


These results are from a "lab environment". I hope it will be possible to unlock cores via the BIOS.
Since all the information involved in unlocking is public, I'm not going to be secretive about it. As soon as I find out how to implement it (in the BIOS?), it will be published.

The main possible obstacle for the BIOS route is a signature. Only IF the Fury's BIOS is protected by a signature would this way be blocked. Otherwise, I think it could be done.

Flashing a Fury Air with a Fury X BIOS won't help, at least not for Fiji PROs with failed cores. So far, only 3 or 4 cuinfo dumps of the Fury Air are available, and none of them features a full Fiji chip. Need more statistics; same story as with the 290s.


----------



## Scorpion49

Quote:


> Originally Posted by *tx12*
> 
> Anyone with Fury Air, please run cuinfo and post its report in the thread:
> http://www.overclock.net/t/1567179
> 
> I'm still confused why Fiji PRO unlocks. It shouldn't. That could be by mistake on some first batches or just ES chips.
> More reports are needed to gather better statistics.


I'll check mine when I get home.


----------



## Thoth420

Quote:


> Originally Posted by *Scorpion49*
> 
> Fury is a good card, I like mine even if it is a little expensive and a little slower compared to Nvidia. I like to try different things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Check yours for coil whine before you block it, thats all I can say! Mine is TERRIBLE. I'm considering trying to RMA it for that but I;m not sure if another one will have the same problem.


Did you block it right off and then notice the whine, or are you holding off on putting a block on because your card has whine?

I ask because I still haven't gotten to test mine and was considering doing a custom loop...err having one done for me. I don't have that level of expertise or time to invest...


----------



## Cool Mike

*Not sure anyone has noticed: I just loaded Win 10 and the latest AMD driver, and CCC now has memory adjustment. This may be old news, just putting it out there. Not sure if this is Win10-related or not.*


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> Did you block it right off and notice the whine or holding off on putting a block on because your card has whine?
> 
> I ask because I still haven't gotten to test mine and was considering doing a custom loop...err having one done for me. I don't have that level of expertise or time to invest...


I didn't put a block on mine; I was commenting on racer's picture with the water block ready to go. Mine is two weeks old; the whine died down a bit, but not completely.

Quote:


> Originally Posted by *tx12*
> 
> Anyone with Fury Air, please run cuinfo and post its report in the thread:
> http://www.overclock.net/t/1567179
> 
> I'm still confused why Fiji PRO unlocks. It shouldn't. That could be by mistake on some first batches or just ES chips.
> More reports are needed to gather better statistics.


Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 04080000 / 00000000 [.....x......x...]
SE2 hw/sw: 00050000 / 00000000 [.............x.x]
SE3 hw/sw: 02010000 / 00000000 [......x........x]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.
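For anyone wondering how those hex masks turn into the dot/x rows: bits 16-31 of each per-SE word appear to flag the disabled CUs, printed MSB-first. A small decoder sketch follows; this is my own reading of the output format (it reproduces the rows above), not anything taken from the cuinfo source:

```python
def cu_row(se_mask: int) -> str:
    """Render a cuinfo-style row: 'x' = CU disabled by the HW lock, '.' = active.

    Bits 16..31 of the per-SE word carry the 16 CU lock flags; the printed
    row runs MSB-first, so bit 31 is the leftmost character.
    """
    bits = (se_mask >> 16) & 0xFFFF
    return "".join("x" if bits & (1 << (15 - i)) else "." for i in range(16))

# Masks from Scorpion49's dump above:
masks = [("SE1", 0x04080000), ("SE2", 0x00050000),
         ("SE3", 0x02010000), ("SE4", 0x00030000)]
for name, mask in masks:
    print(name, cu_row(mask))

disabled = sum(cu_row(m).count("x") for _, m in masks)
print(f"{64 - disabled} of 64 CUs active")  # 56 of 64
```

Each "x" is a CU fused off by the HW lock; four rows with two x's each account for the 8 disabled CUs cuinfo reports.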


----------



## Thoth420

Ah thanks Scorp


----------



## rx7racer

Also thought I'd pop this out: had to do a fresh Win 10 install as CCC wouldn't open. Bit annoying, but oh well. Also good to report no coil whine.









Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00030000 / 00000000 [..............xx]
SE2 hw/sw: 40010000 / 00000000 [.x.............x]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.


----------



## bonami2

Am I the only one with Windows 8?
Quote:


> Originally Posted by *rx7racer*
> 
> Also thought I'd pop this out. Had to do a fresh Win 10 install as CCC wouldn't open, bit annoying but oh well. Also good to report no coil whine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Adapters detected: 1
> Card #1 PCI ID: 1002:7300 - 174B:E329
> DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
> Fiji-class chip with 16 compute units per Shader Engine
> SE1 hw/sw: 00030000 / 00000000 [..............xx]
> SE2 hw/sw: 40010000 / 00000000 [.x.............x]
> SE3 hw/sw: 00030000 / 00000000 [..............xx]
> SE4 hw/sw: 00030000 / 00000000 [..............xx]
> 56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
> 8 CU's are disabled by HW lock, override is possible at your own risk.


As a pure noob:

Are those x's the disabled ones?


----------



## Alastair

Quote:


> Originally Posted by *tx12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> How do you only partially unlock it? Is it a BIOS? How does one go about doing this?
> 
> 
> 
> These results are from "lab environment". I hope it would be possible to unlock cores with BIOS.
> Since all information involved into unlocking is public, I'm not going to get hideous about it. As soon as I'll find how to implement it (in BIOS?) it will be published.
> 
> The main possible obstacle for BIOS way is signature. Only IF Fury's BIOS protected by signature this way would be blocked. Otherwise, I think it could be done.
> 
> Flashing Fury Air with Fury X BIOS won't help, at least not for Fiji PRO's with failed cores. So far, only 3 or 4 cuinfo dumps of Fury Air are available and none of them features full Fiji chip. Need more statistics, the same story as with 290's.
Click to expand...

Those screenshots you posted. Is that your card that you managed to unlock? Or is it just something you happened to find online?


----------



## xer0h0ur

Quote:


> Originally Posted by *Alastair*
> 
> Those screenshots you posted. Is that your card that you managed to unlock? Or is it just something you happened to find online?


In Soviet Russia, card unlock you.


----------



## fjordiales

R9 Fury Strix.

Adapters detected: 2

Card #1 PCI ID: 1002:7300 - 1043:049E
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 80010000 / 00000000 [x..............x]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.

Card #2 PCI ID: 1002:7300 - 1043:049E
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00210000 / 00000000 [..........x....x]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Those screenshots you posted. Is that your card that you managed to unlock? Or is it just something you happened to find online?
> 
> 
> 
> In Soviet Russia, card unlock you.
Click to expand...


----------



## swiftypoison

FYI, there is stock over at Newegg for Fury X cards. They seem to have a large batch, as I saw them in stock last night and this morning they are still available to purchase.


----------



## richie_2010

Hope you don't mind the crappy pics, but here are 2 pics of the Fury X plates I had cut.
Let me know what you think


----------



## tx12

OK, the BIOS unlock works. On Windows 7, at least with the 15.7.1 drivers, there were no troubles with the signature.
Fancy pic, everything but the one failed core is enabled. Still, zero performance gain compared with 3840 shaders because SE symmetry was lost:


----------



## Mr.N00bLaR

Quote:


> Originally Posted by *tx12*
> 
> OK, BIOS unlock works. In Windows 7, at least with 15.7.1 drivers were no troubles with signature.
> Fancy pic, everything but one failed core is enabled. Still, zero performance rise if compared with 3840 shaders because SE symmetry was lost:


That's quite neat. How did you measure performance?


----------



## huzzug

Quote:


> Originally Posted by *richie_2010*
> 
> hope you dont mind the crappy pics but here is 2 pics of the fury x plates i had cut
> let me know what you think


Eh....showoff


----------



## richie_2010

Quote:


> Originally Posted by *huzzug*
> 
> Eh....showoff


They are available in my artisan store if you're interested. These are pure acrylic: black tops, clear frames for the LEDs.
The tops can be covered with different colored vinyl if so wished; black comes as standard.


----------



## tx12

The guide is ready. If you're willing to take a risk and love to tinker with hardware, you can give it a try:
http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool#post_24235287


----------



## rv8000

Huzzah, successful "partial" unlock. Going for the full 64 CUs is something I may try in the future, but the info tool did show a possible failure in one of the units, so I don't want to risk it.
Quote:


> Originally Posted by *rv8000*
> 
> Apologies in advance for the double post....
> 
> It seems my activation was successful.
> 
> 
> 
> 3dmark GPU score went from 15425 to 15799, which is outside the normal margin of error for 3dmark (normally about +/- 100 points)
> 
> Stock bios
> 
> Unlocked bios
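Sanity-checking the quoted score change with only the numbers above (the "2x margin" comparison is my own conservative framing, since both runs could drift):

```python
stock, unlocked = 15425, 15799
margin = 100  # the ~+/-100 point run-to-run variance mentioned above

delta = unlocked - stock
print(delta)                          # 374 points
print(delta > 2 * margin)             # True: outside noise even if both runs drifted
print(round(100 * delta / stock, 1))  # 2.4 (% GPU-score gain)
```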


----------



## looncraz

Quote:


> Originally Posted by *en9dmp*
> 
> It's not the clock speed that's the issue, I think anything around 1200 is decent, but if that doesn't translate into any appreciable performance increase then there's not much point...


Since when has overclocking a GPU led to appreciable performance increases? I mean, a 20% increase in performance is rarely enough to even turn on any extra eye candy.


----------



## xer0h0ur

Quote:


> Originally Posted by *looncraz*
> 
> Since when has overclocking a GPU lead to appreciable performance increases? I mean, a 20% increase in performance is rarely enough to even turn on any extra eye candy.


Speak for yourself. Every frame counts for me. When you get around to gaming at 4K you will welcome every extra frame you can pick up on avg FPS but the largest difference overclocking tends to make is raising your FPS min.


----------



## en9dmp

Quote:


> Originally Posted by *looncraz*
> 
> Since when has overclocking a GPU lead to appreciable performance increases? I mean, a 20% increase in performance is rarely enough to even turn on any extra eye candy.


Always has done for me in the past. An extra 5-10 fps can make the difference. From what I've seen on 980Ti benches, overclocking those chips yields much better performance.

We won't know for sure until we finally get a tool to unlock the voltage... Then we can all have a go and see what we can reach.


----------



## fjordiales

Quick gameplay vid on Win 10.


----------



## ozyo

Finally!
Now I need to find another one for CF.


----------



## Evil Penguin

Quote:


> Originally Posted by *ozyo*
> 
> 
> 
> 
> finally
> now i need to find another one for cf


Got one with a slightly whiny pump, I see.








So did I. :\


----------



## xer0h0ur

Actually, some people who got the sticker pump were blessed with a quiet unit. I was going to ask him if his pump is noisy and/or has coil whine.


----------



## ozyo

Quote:


> Originally Posted by *Evil Penguin*
> 
> Got one with a slightly whiny pump, I see.
> 
> 
> 
> 
> 
> 
> 
> 
> So did I. :\


It's OK.
Simple solution.


----------



## ozyo

Quote:


> Originally Posted by *xer0h0ur*
> 
> Actually some people who got the sticker pump were blessed with a non-noisy unit. I was actually going to ask him if his pump is noisy and/or has coil whine.


The pump noise is low.
I'm not sure about coil whine.
Can't run my monitor and I don't know why; it's normal DVI s@#t


----------



## xer0h0ur

If you had coil whine you would certainly know it. More than likely it's minimal or non-existent for you then. Unless your pump is noisy enough to cover up the coil whine.


----------



## localh85

Question about overclocking Fury X as I just got one:

"Power Limit Settings" - Is it ok to leave it at max 50% or should I find the stable power limit settings with GPU OC %? It seems I am only getting a 7% OC with 50% on.

Also, I am not crashing in FireStrike 4k tests, but playing CSGO I am getting slight freezes with OCs.


----------



## ozyo

As far as I know, coil whine starts at high FPS.


----------



## xer0h0ur

Quote:


> Originally Posted by *localh85*
> 
> Question about overclocking Fury X as I just got one:
> 
> "Power Limit Settings" - Is it ok to leave it at max 50% or should I find the stable power limit settings with GPU OC %? It seems I am only getting a 7% OC with 50% on.
> 
> Also, I am not crashing in FireStrike 4k tests, but playing CSGO I am getting slight freezes with OCs.


If you're attempting to run Crossfire in CS:GO, don't. It becomes a stutter/freezefest.


----------



## localh85

Quote:


> Originally Posted by *xer0h0ur*
> 
> If you're attempting to run Crossfire in CS:GO, don't. It becomes a stutter/freezefest.


Not crossfiring. Just a single card at this moment. Do you have any recommendation on power limit settings?


----------



## ozyo

Quote:


> Originally Posted by *localh85*
> 
> Not crossfiring. Just a single card at this moment. Do you have any recommendation on power limit settings?


You need to find the sweet spot for the power limit.
Also, don't use FireStrike for stability testing.


----------



## localh85

Quote:


> Originally Posted by *ozyo*
> 
> u need to find sweet spot for power limit


Great, thanks!

I know this is probably dumb, but leaving it at 50% is probably bad, or at least not optimal, right?


----------



## ozyo

Quote:


> Originally Posted by *localh85*
> 
> Great, thanks!
> 
> I know this is probably dumb, but leaving it at 50% is probably bad, or at least not optimal, right?


Depends on your OC, but increasing the power limit helps.
Don't put it at max; find the sweet spot.


----------



## Ceadderman

Quote:


> Originally Posted by *looncraz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *en9dmp*
> 
> It's not the clock speed that's the issue, I think anything around 1200 is decent, but if that doesn't translate into any appreciable performance increase then there's not much point...
> 
> 
> 
> Since when has overclocking a GPU lead to appreciable performance increases? I mean, a 20% increase in performance is rarely enough to even turn on any extra eye candy.
Click to expand...

Since nVidia slapped their stupid "Gaming as it's meant to be..." moniker on all their ads. Even if they don't see 20% returns from their OCs.









~Ceadder


----------



## xer0h0ur

I just cranked it to +50 power limit and +100mV without a care, stepped up the clocks until I experienced artifacting in FireStrike Ultra, then backed the clocks off 10MHz. However, my cards are waterblocked and I use Fujipoly Ultra Extreme thermal pads on the VRAM/VRMs, so I can overclock care-free.


----------



## ozyo

1215686 rpm


----------



## xer0h0ur

ITS OVER 9000!


----------



## ozyo

Quote:


> Originally Posted by *xer0h0ur*
> 
> ITS OVER 9000!


----------



## tconroy135

What are the highest clock speeds people are getting on the core?


----------



## rv8000

Quote:


> Originally Posted by *tconroy135*
> 
> What are the highest clock speeds people are getting on the core?


Fastest I've seen for a Fury X is 1175MHz without voltage control; the average seems to be around 1130MHz without voltage control (bench-stable, not necessarily game-stable).

Fastest I've seen for a Fury is 1140MHz without voltage control; the average seems to be around 1080MHz (as long as you don't have a Strix, which seems to fare even worse with its crippled BIOS).

Remember this is a very small sample pool to judge from; I've only seen about 15-20 people report OCs for the Fury X and maybe 10 people OC a Fury on forums.
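Put in percentage terms, those figures work out roughly as below. The 1050MHz baseline is my assumption (the Fury X reference core clock), not a number from the posts above:

```python
reference = 1050  # MHz, Fury X stock core clock (assumed reference card)

for label, mhz in [("best Fury X OC", 1175),
                   ("typical Fury X OC", 1130)]:
    headroom = 100 * (mhz / reference - 1)
    print(f"{label}: {headroom:.1f}% over stock")
# best Fury X OC: 11.9% over stock
# typical Fury X OC: 7.6% over stock
```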


----------



## xer0h0ur

We need the overvolting to go public so that people can try it and see where these Fiji cores average out to. I am not going to take a single person's result as final say.


----------



## Elmy

Hi 

Anyone looking for one of these?


----------



## dir_d

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Hi
> 
> Anyone looking for one of these?


jelly


----------



## Ceadderman

You giving it away? Or selling?









~Ceadder


----------



## rv8000

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Hi
> 
> Anyone looking for one of these?


How in the devil did you happen to come across a Nano?


----------



## Elmy

Quote:


> Originally Posted by *rv8000*
> 
> How in the devil did you happen to come across a nano


Doing a secret squirrel project for AMD. It will be up on the worklog section here next week.


----------



## rv8000

Quote:


> Originally Posted by *Elmy*
> 
> Doing a secret squirrel project for AMD. It will be up on the worklog section here next week.


I'll keep my eyes peeled, enjoy the nano


----------



## ozyo

my card can do 1150


----------



## xer0h0ur

God, that thing looks even smaller than expected. It's like a toy.


----------



## Elmy

Just hanging out with my brother from another mother....


----------



## Jflisk

In case anyone needs the info, the screws are 6-32 (Lowes) x whatever length you need for the radiator. I needed a couple of longer screws to mount the rad to my case. Get the nuts/washers also. Thread the nut onto the screw with washers to the length you need, then cut off the leftover screw at the nut while holding it with channel locks (Dremel with a cutting wheel; always wear your safety glasses). Screw and unscrew the nut a couple of times to clear the threads. This is the important part: check that the screw does not go through the radiator when threaded through the hole. Look between the end of the thread and the radiator.


----------



## xer0h0ur

I love my dremel. Can't imagine PC modding without one.


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> I love my dremel. Can't imagine PC modding without one.


Dremel and Bondo, can't go wrong.


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> God that thing looks even smaller than expected. Its like a toy.


And I was thinking my GTS 250 was small


----------



## Jflisk

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Just hanging out with my brother from another mother....


How about benching the little guy out and letting us see some scores.


----------



## fewness

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Hi
> 
> Anyone looking for one of these?


You're lucky....









Can you post a GPU-Z or 3DMark? If that's not gonna cause you any trouble, of course....


----------



## Ehsteve

One game,
Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Just hanging out with my brother from another mother....


Wow, makes me want to put together a media centre.

Does the Nano have a backplate?


----------



## Elmy

Quote:


> Originally Posted by *fewness*
> 
> Your lucky....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you post a GPU-Z or 3DMark? if that's not gonna cause you any trouble of course....


NDA....


----------



## Elmy

Quote:


> Originally Posted by *Ehsteve*
> 
> One game,
> Wow, makes me want to put together a media centre.
> 
> Does the Nano have a backplate?


No backplate


----------



## Elmy

Quote:


> Originally Posted by *Jflisk*
> 
> How about benching the little guy out and letting us see some scores.


NDA ... Sorry


----------



## EpicOtis13

Quote:


> Originally Posted by *Elmy*
> 
> NDA ... Sorry


Quote:


> Originally Posted by *Elmy*
> 
> No backplate


Quote:


> Originally Posted by *Elmy*
> 
> NDA....


Try to keep triple/double posting to a minimum, just multi quote instead.


----------



## Alastair

Quote:


> Originally Posted by *Elmy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> How about benching the little guy out and letting us see some scores.
> 
> 
> 
> NDA ... Sorry
Click to expand...

Is there any information you can give us without breaking NDA? Is it full-fat Fiji XT just clocked very low, or is it a cut-down chip? Should some of us wait for the Nano, or should I just go with a normal Sapphire Fury?


----------



## ZohanOP

When does an NDA expire?


----------



## tx12

Quote:


> Originally Posted by *ZohanOP*
> 
> When does an NDA expire?


I bet the NDA expiry date is under NDA


----------



## Elmy

Quote:


> Originally Posted by *tx12*
> 
> I bet NDA expire date is under NDA


All I can do is release pictures of it since AMD already has done that themselves.

Can't release any info about architecture, performance, or release date.


----------



## ZohanOP

Quote:


> Originally Posted by *Elmy*
> 
> All I can do is release pictures of it since AMD already has done that themselves.
> 
> Can't release any info about architecture , performance or release date.


You are right, but I'm asking about the NDA expiry date, not about the R9 Nano release date, specs, pink unicorns...


----------



## flopper

Quote:


> Originally Posted by *ZohanOP*
> 
> You are right, but i'm asking about NDA expire date, not about R9 Nano release date, specs, pink unicorns...


You never signed an NDA, apparently.
They can't say, period.


----------



## Jflisk

I just started messing with VSR; this stuff is interesting. 4K on a 1080p monitor. I have my games at 2560x1440; 3840x2160 seems laggy even with 2 Furys.

Tried Crysis 3 and Battlefield Hardline so far. Works great on these two. A little laggy in Titanfall at any of the resolutions above.


----------



## ZohanOP

Quote:


> Originally Posted by *flopper*
> 
> you never signed an NDA apperantly.
> they cant say period.


But the Fury X NDA expiry date wasn't confidential. I'm confused


----------



## tx12

Quote:


> Originally Posted by *Elmy*
> 
> All I can do is release pictures of it since AMD already has done that themselves.


OK then, what about a backside picture, please?


----------



## tx12

Quote:


> Originally Posted by *flopper*
> 
> you never signed an NDA apperantly.
> they cant say period.


At least most of AMD's published slides feature an "under embargo until 19.01.2038" string.


----------



## mustrum

Quote:


> Originally Posted by *ozyo*
> 
> its ok
> simple solution


Hehe that's what i use.
There might be Pump rev 1 or 2 and then there are EK Blocks. Gotta love their quality too.


----------



## Elmy

Quote:


> Originally Posted by *tx12*
> 
> OK then, that's about backside picture, please?


Sorry, can't do that either. The only thing I can release is the same pictures AMD has released themselves. They haven't released a picture of the PCB side.


----------



## flopper

Quote:


> Originally Posted by *tx12*
> 
> At least most AMD's published slides feature a "under embargo until 19.01.2038" string.


This one doesn't, so again: NDA.
Can't say.


----------



## ZohanOP

Quote:


> Originally Posted by *tx12*
> 
> OK then, that's about backside picture, please?



Oh, wait, it's under NDA


----------



## SpeedyVT

Quote:


> Originally Posted by *Elmy*
> 
> NDA....


How does one work for you or get to work with you?







I'm loving the perks.


----------



## tx12

If someone accidentally finds a Nano BIOS, please share


----------



## Elmy

Quote:


> Originally Posted by *ZohanOP*
> 
> 
> Oh, wait, it's under NDA


That one isn't mine.  I've got different stickers on my PCB....

But there you go for anyone asking to see the other side.


----------



## SpeedyVT

Quote:


> Originally Posted by *Elmy*
> 
> That one isn't mine  I got Different stickers on my PCB....
> 
> But there you go for anyone asking to see the other side.


Since you seem to get all the nice information, could you get me the release date for NMS?







I'll keep it to myself.


----------



## Elmy

Quote:


> Originally Posted by *SpeedyVT*
> 
> Since you seem to get all the nice information, could you get me the release date for NMS?
> 
> 
> 
> 
> 
> 
> 
> I'll keep it to myself.


What is NMS?


----------



## SpeedyVT

Quote:


> Originally Posted by *Elmy*
> 
> What is NMS?


No Man's Sky. Probably the largest Indie game of all time!


----------



## Elmy

Here is another picture for you guys with the Nano on a full size X99 Motherboard.

And no this is not the motherboard I am using for the Nano build.


----------



## SpeedyVT

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Here is another picture for you guys with the Nano on a full size X99 Motherboard.
> 
> And no this is not the motherboard I am using for the Nano build.


I told people the previous pictures of nano were of someone with baby sized man hands. The early July ones, not yours.


----------



## rv8000

Quote:


> Originally Posted by *ZohanOP*
> 
> 
> Oh, wait, it's under NDA


4-phase power and the already-known single 8-pin: great for ITX/small form factor builds, but if it is just a downclocked Fiji XT or Fiji Pro, there's likely no way to get it back up to higher clocks unless AIB partners get to go balls to the wall with custom PCBs. Then again, we still have no idea if it's cut down, downclocked, or both.


----------



## SpeedyVT

Quote:


> Originally Posted by *rv8000*
> 
> 4 Phase power and the already known single 8-pin, great for itx/small form factor builds but if it is just a downclocked Fiji XT or Fiji Pro theres likely no way to get it back up to higher clocks unless AIB partners get to go balls to the walls with custom PCB's. Then again we still have no idea if it's cutdown, downclocked, or both.


Or another revision of GCN.


----------



## Ceadderman

Double slot card the size of a Matchbox car. Interesting.









~Ceadder


----------



## ozyo

Cutdown of course


----------



## rv8000

Quote:


> Originally Posted by *ozyo*
> 
> Cutdown of course


I have a feeling it will have the same number of CUs that Fiji Pro has while being underclocked to ~800MHz on the core; even with this revision of GCN, the combination of those two is the only way it could fit in the 175W TDP and still be faster than a 290X/390X. Then of course there is the chance it is slower than Hawaii/Grenada XT.


----------



## tx12

Quote:


> Originally Posted by *ZohanOP*


Even shorter than a standard PCI half-length card.
But the VRM is weak, only 4 phases with dual drivers. It wouldn't be possible to attach a waterblock and run it at a full boost to Fury X clocks.


----------



## rv8000

Quote:


> Originally Posted by *tx12*
> 
> Even shorter than standard PCI Half-Length card.
> But VRM is weak, only 4 phases with dual drivers. Wouldn't be possible to attach waterblock and run a full boost to Fury X.


Better than some reference 970s, which as far as I remember were 3+1 phase setups. It's not meant to be a power muncher; it's meant for small form factor builds.

I even wonder if AIB partners will bother with custom PCBs. I'm not expecting much more than a custom cooler/shroud on the stock PCB.


----------



## huzzug

Quote:


> Originally Posted by *Elmy*
> 
> Just hanging out with my brother from another mother....


Actually, they may be from the same mother, just that one had a growth deficit


----------



## richie_2010

From KitGuru's website: they say it's going to be a fully-fledged Fury X GPU, just clocked down.

Here is an extract:


Spoiler: Warning: Spoiler!



AMD's small form-factor graphics card for gamers who use mini ITX chassis - the AMD Radeon R9 Nano - will be based on the fully-fledged "Fiji" graphics processor with 4096 stream processors, according to AnandTech, which cites sources from AMD. The graphics card will be only 6" long and will feature one 8-pin PCIe power connector. The Radeon R9 Nano will carry 4GB of HBM memory operating at 1GHz.
Keeping in mind that the card needs to offer leading-edge performance while consuming just 175W, it might be easier for AMD to hit necessary performance-per-watt targets using the fully-unlocked GPU operating at relatively low clock-rates rather than to play with both configuration and frequency. In fact, to fulfil the promise of offering two times higher performance-per-watt compared to the Radeon R9 290X, AMD's Radeon R9 Nano should hit around 6.8TFLOPS FP32 rate (or 38.8GFLOPS/W). To do so with a fully-fledged "Fiji", AMD needs to clock it at around 830MHz.
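The extract's arithmetic checks out. A quick sketch using only the figures quoted above (FP32 FMA counting as 2 ops per stream processor per clock is the standard convention, not stated in the article):

```python
sps = 4096           # stream processors on a full Fiji
ops_per_clock = 2    # FP32 FMA = 2 ops per SP per clock
clock_hz = 830e6     # the ~830MHz the article arrives at
tdp_w = 175

tflops = sps * ops_per_clock * clock_hz / 1e12
print(round(tflops, 2))                 # 6.8 TFLOPS
print(round(tflops * 1000 / tdp_w, 1))  # 38.9 GFLOPS/W (article rounds to 38.8)
```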


----------



## tx12

These photos from Lowyat are so low-res, but really awesome.
Surprised they weren't on videocardz.

Based on these photos, the dual Fury seems to share the same VRM design as the Nano, so it could be a 350W card.
Also, both the Nano and the dual may use specially selected low-leakage chips.


----------



## Ha-Nocri

Flurry X has 7 power phases? They said it can do 400-500W?! If that's true and they are using the same PP's, it will OC to 1050MHz core no problem. But I have a feeling we won't be able to unlock voltage.


----------



## rv8000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Flurry X has 7 power phases? They said it can do 400-500W?! If that's true and they are using the same PP's, it will OC to 1050MHz core no problem. But I have a feeling we won't be able to unlock voltage.


For one stop saying "Flurry".

Secondly, Fury X has 6 phases (the reference based Fury cards share the design).

And lastly, reference Nano will likely have less OC headroom than both Fury and Fury X, and likely have more bios restrictions than the rest of the Fury series.


----------



## Ha-Nocri

Flurry? Fluuuurry ;D

6 PP's, even better for Nano. And yeah, I agree, I feel they will do what they can to restrict OCing


----------



## royfrosty

Hi guys,

Sorry, I remember someone mentioned here that they had issues with HBM overclocking in Crossfire mode?

I couldn't find the post in here.

I'd like to check if there is any way to get around it.

The 1st card's HBM can be OC'd.

But the 2nd card just won't allow the HBM to be OC'd. Any workaround?


----------



## rv8000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Flurry? Fluuuurry ;D
> 
> 6 PP's, even better for Nano. And yeah, I agree, I feel they will do what they can to restrict OCing


Nano, at least from the rear pcb shots, only has 4 phases









You're hopeless


----------



## littlestereo

A Fury X + 8320 at 4.5 vs Fury X + 2500k at 4.8 (with Fury X mem OC'd) and they throw down almost the same score. So much for bottlenecking! http://www.3dmark.com/compare/fs/5598293/fs/5344228


----------



## Ha-Nocri

Quote:


> Originally Posted by *rv8000*
> 
> Nano, at least from the rear pcb shots, only has 4 phases
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're hopeless


Yes, who said it has more, lel. Enough to reach Flurry's clocks (which won't happen most likely due to locked voltage).


----------



## SpeedyVT

Quote:


> Originally Posted by *littlestereo*
> 
> A Fury X + 8320 at 4.5 vs Fury X + 2500k at 4.8 (with Fury X mem OC'd) and they throw down almost the same score. So much for bottlenecking! http://www.3dmark.com/compare/fs/5598293/fs/5344228


WOW! Nice benches!


----------



## rv8000

Just ran a few benches after a fresh W10 install and unlocking 4 extra CUs on my Fury Tri-X OC (3840 shaders)

Stock 1040/500
OC'd 1110/550
Ultra OC'd 1110/550

Not too shabby. Still can't wait for the new version of TriXX


----------



## Clockster

So guys, my sponsored gear is being sent back and I'm having to get a new board, chip and CPU.

Thinking along the lines of a Z170, i5 6600C, G.Skill Ripjaws 4 16GB (4GBx4) 2666MHz DDR4.


----------



## bonami2

Quote:


> Originally Posted by *Clockster*
> 
> So guys my sponsored gear is being sent back and I'm having to get a new board, chip and cpu.
> 
> Thinking along the lines of a Z170, I5 6600C, G.Skill Ripjaws 4 16GB (4GBx4) 2666MHz DDR4.


direct x12 = i7


----------



## bonami2

Quote:


> Originally Posted by *littlestereo*
> 
> A Fury X + 8320 at 4.5 vs Fury X + 2500k at 4.8 (with Fury X mem OC'd) and they throw down almost the same score. So much for bottlenecking! http://www.3dmark.com/compare/fs/5598293/fs/5344228


Well, that's pretty good.

My 4790K at 4.7 does 13000 in Fire Strike normal. That FX is pretty close considering it costs almost 2.5x less.

I just wish I could get one of those Furys


----------



## Loeschzwerg

Quote:


> Originally Posted by *rv8000*
> 
> Nano, at least from the rear pcb shots, only has 4 phases


Have you looked closely enough?

On the back you can see four pairs of capacitors in a row, but there are two additional capacitors next to this row. I think it's a 5 phase design.


----------



## bonami2

You know, you can have the worst phases in the world, but if they are under water it cancels everything out, because VRMs scale with temperature: the lower it is, the more power they can output.


----------



## Clockster

Quote:


> Originally Posted by *bonami2*
> 
> direct x12 = i7


Why would you say that, mate? I'm pretty sure the i5 will benefit more from DX12 than the i7.
Would you mind elaborating?


----------



## Shatun-Bear

Quote:


> Originally Posted by *littlestereo*
> 
> A Fury X + 8320 at 4.5 vs Fury X + 2500k at 4.8 (with Fury X mem OC'd) and they throw down almost the same score. So much for bottlenecking! http://www.3dmark.com/compare/fs/5598293/fs/5344228


Nice. But it says your 2500K is @ 3300MHz though, not 4.8GHz?


----------



## bonami2

Quote:


> Originally Posted by *Clockster*
> 
> Why would you say that mate? I'm pretty sure the i5 will benefit more from DX12 than the I7.
> Would you mind to elaborate.


3DMark with DX12 and other tests, if I remember well, showed that an i3 destroyed a Pentium G3258 at 4.8.

Same thing for an i7, except twice as pronounced. That's why I do 930 in Cinebench and 13k in PassMark; no i5 can come close to that. An AMD FX can come close sometimes, a 5.0GHz+ one.

I don't know if HT will perform as expected yet, but I'm sure DX12 will help, and if games start to be better multithreaded like GTA V,

it's going to make the i5 and i7 completely different levels of CPU


----------



## huzzug

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *littlestereo*
> 
> A Fury X + 8320 at 4.5 vs Fury X + 2500k at 4.8 (with Fury X mem OC'd) and they throw down almost the same score. So much for bottlenecking! http://www.3dmark.com/compare/fs/5598293/fs/5344228
> 
> 
> 
> Nice. But it says that your 2500K is @ 3300Mhz though, not 4.8Ghz?
Click to expand...

3DMark does not report the actual frequency of the processor, just its factory frequency. I know


----------



## Clockster

Quote:


> Originally Posted by *bonami2*
> 
> 3dmark with dx12 and other test if i remember well showed that a i3 destroyed a pentium g3258 at 4.8
> 
> Same thing for a i7 except it x2 worse. that why i do 930 in cinebench and 13k in passmark no i5 can come close to that. A amd fx can come close sometime 5.0ghz + one
> 
> i dont know if ht will perform like expected on yet but im sure with dx12 it will help and if game start to be multithreaded better like gta v
> 
> it gonna make the i5 and i7 be completly different level of cpu


Mmmm, the only thing is the pricing on the Skylake i7 is madness. Better off getting a 5820K and clocking it to 4.5GHz for the same money.


----------



## bonami2

Quote:


> Originally Posted by *Clockster*
> 
> Mmmm the only thing is the pricing on the skylake i7 is madness. Better off getting a 5820K and clocking it to 4.5Ghz for the same money.


Yeah, sure. Even at 4.7 I'm just about beating a 5820K in multithreaded benchmarks, well, in some I beat it... And the 5820K is soldered, so it's easier to OC; Skylake will probably not overclock, so.................


----------



## mustrum

Quote:


> Originally Posted by *rv8000*
> 
> Just ran a few benchs after a fresh w10 isntall and unlocked 4 extra CU's on my Fury Tri-X OC (3840 shaders)
> 
> Stock 1040/500
> OC'd 1110/550
> Ultra OC'd 1110/550
> 
> Not too shabby, still can't wait for the new version of TrixX


What do you mean by "unlocked 4 extra CUs"?
Is an unlock via BIOS mod possible like it was on some R9 290s?
I haven't read about this anywhere yet. I am sure many Fury owners would be happy to know.


----------



## rv8000

Quote:


> Originally Posted by *Loeschzwerg*
> 
> Have you looked closely enough?
> 
> On the back you can see four pairs of capacitors in a row, but there are two additional capacitors next to this row. I think it's a 5 phase design.


There doesn't look to be room for another MOSFET/driver on the front side of the PCB; it's far more likely to be a 4-phase design. You also cannot see the back end of a 5th MOSFET driver on the rear of the PCB like you can see all 6 on the Fury/Fury X.


----------



## Forceman

Quote:


> Originally Posted by *mustrum*
> 
> What do you mean by unlocking 4 extra CUs?
> Is an unlock via BIOS mod possible like it was on some R9 290s?
> I haven't read about this anywhere yet. I am sure many Fury owners would be happy to know.


Yes, some can be unlocked just like Hawaii.

http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool
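For reference, GCN shader counts follow directly from the compute-unit count (64 stream processors per CU), which is a quick way to sanity-check an unlock result like the "4 extra CUs = 3840 shaders" one quoted above. A minimal sketch (the CU figures are the publicly listed Fiji configurations):

```python
# Each GCN compute unit (CU) contains 64 stream processors,
# so total shader count = CUs * 64.
def shaders(cus: int) -> int:
    return cus * 64

print(shaders(56))  # Fury:   3584
print(shaders(64))  # Fury X: 4096
print(shaders(60))  # Fury with 4 extra CUs unlocked: 3840
```

So a Fury reporting 3840 shaders after flashing is consistent with exactly 4 of the 8 fused-off CUs having been re-enabled.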


----------



## xer0h0ur

Quote:


> Originally Posted by *Clockster*
> 
> Why would you say that, mate? I'm pretty sure the i5 will benefit more from DX12 than the i7.
> Would you mind elaborating?


Actually the few benches that were shown pre-W10 release were showing that it scaled linearly up to 6 cores and past that it was giving diminishing returns. So you could have an 8-core processor out-performing a 6 but not by much while there is just as much of a gap between 2 and 4 cores as there is between 4 and 6 cores.
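That shape of curve (near-linear up to 6 cores, diminishing past that) is what Amdahl's law predicts for a workload that is mostly but not fully parallel. A rough illustration; the 90% parallel fraction below is an assumption for the sketch, not a measured figure from those benchmarks:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the workload.
def speedup(n_cores: int, p: float = 0.9) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

for n in (2, 4, 6, 8):
    print(n, round(speedup(n), 2))
# Gains per extra core shrink: ~1.82x at 2 cores, 4.0x at 6 cores,
# but only ~4.71x at 8 -- diminishing returns past 6.
```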


----------



## xer0h0ur

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Nice. But it says that your 2500K is @ 3300MHz though, not 4.8GHz?


I have been told this is a result of powerplay. Disabling it altogether should stop this from happening in 3DMark.


----------



## Shatun-Bear

Quote:


> Originally Posted by *huzzug*
> 
> 3DMark does not report the actual frequency of the processor, only its factory frequency, as far as I know.


Quote:


> Originally Posted by *xer0h0ur*
> 
> I have been told this is a result of powerplay. Disabling it altogether should stop this from happening in 3DMark.


Oh, thought as much; it's just that the FX-8320 is displayed with its true frequency of 4515MHz and the 2500K wasn't.


----------



## SpeedyVT

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Oh, thought as much; it's just that the FX-8320 is displayed with its true frequency of 4515MHz and the 2500K wasn't.


Ironically, it's not an IPC thing. It's why some weaker GPUs will bench on really slow processors just as well as on the fastest processor. However, Fury isn't a weak card; it did hit its peak DX11 performance under the current driver platform.


----------



## bonami2

If you want a bottleneck, go into Unigine Heaven on the Extreme preset (1600x resolution).

It puts one of my cores at 100% feeding 1 GPU.

And I destroy the score of my FX 6300 at 4.4, by like 250 points.


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> If you want a bottleneck, go into Unigine Heaven on the Extreme preset (1600x resolution).
> 
> It puts one of my cores at 100% feeding 1 GPU.
> 
> And I destroy the score of my FX 6300 at 4.4, by like 250 points.


makes 0 sense at all


----------



## bonami2

Quote:


> Originally Posted by *Alastair*
> 
> makes 0 sense at all


Can you stop saying this on every one of my posts... If I remember right, this is at least the third time....

Just click on the red flag and use IGNORE, and stop being annoying.

It makes sense; scrap the stuff about resolution: if I can't push 60 fps at 1600x, I can't at 4K either...


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> makes 0 sense at all
> 
> 
> 
> Can you stop saying this on every one of my posts... If I remember right, this is at least the third time....
> 
> Just click on the red flag and use IGNORE, and stop being annoying.
> 
> It makes sense; scrap the stuff about resolution: if I can't push 60 fps at 1600x, I can't at 4K either...
Click to expand...

The reason I keep saying it is because it is true. Most of the stuff you post makes zero sense. If you can't handle the truth stop posting rubbish.

I mean let's look at your latest example. You claim that somehow a purely GPU-based test brings your CPU to its knees. Red Flag 1. It isn't unheard of, but you have to be running insanely low resolutions to get results like that. You claim the results are at 1600 resolution. What is 1600? 1600x1? 1600x900? 1600x2560? You gave incomplete information; that is Red Flag 2.

Thirdly, if it is indeed 1600x900, it can still be a relatively intense enough resolution not to be bottlenecked by CPU draw calls. An FX 6300 WILL NEVER bottleneck a 7950. Unless you do everything at 800x600. Or unless your CPU was throttling, which I find likely since you didn't know how to keep a 6-core FX cool. But since you are claiming a bottleneck at 1600xSOMETHING, I call bull, unless it's throttled.

Honestly I have the opportunity to prove you wrong at almost every foolish claim you have made. Shall I post a refresher to your memory?

http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/600_40#post_23412842

Remember?

Honestly you seem to post around on OCN like some big-shot computer enthusiast, yet most of your posts seem to prove quite the opposite. Your knowledge is clearly lacking. Clearly you do not know how to operate your computer if you are still bottlenecking a 7950 with a 6300. Learn how to overclock and run your system properly, bro.

Why don't I block your posts? Cause people like me who know better have to run around cleaning up your mess. (And heck I do not even get paid for it.







)


----------



## bonami2

Quote:


> Originally Posted by *Alastair*
> 
> The reason I keep saying it is because it is true. Most of the stuff you post makes zero sense. If you can't handle the truth stop posting rubbish.
> 
> I mean let's look at your latest example. You claim that somehow a purely GPU-based test brings your CPU to its knees. Red Flag 1. It isn't unheard of, but you have to be running insanely low resolutions to get results like that. You claim the results are at 1600 resolution. What is 1600? 1600x1? 1600x900? 1600x2560? You gave incomplete information; that is Red Flag 2.
> 
> Thirdly, if it is indeed 1600x900, it can still be a relatively intense enough resolution not to be bottlenecked by CPU draw calls. An FX 6300 WILL NEVER bottleneck a 7950. Unless you do everything at 800x600. Or unless your CPU was throttling, which I find likely since you didn't know how to keep a 6-core FX cool. But since you are claiming a bottleneck at 1600xSOMETHING, I call bull, unless it's throttled.
> 
> Honestly I have the opportunity to prove you wrong at almost every foolish claim you have made. Shall I post a refresher to your memory?
> 
> http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/600_40#post_23412842
> 
> Remember?
> 
> Honestly you seem to post around on OCN like some big-shot computer enthusiast, yet most of your posts seem to prove quite the opposite. Your knowledge is clearly lacking. Clearly you do not know how to operate your computer if you are still bottlenecking a 7950 with a 6300. Learn how to overclock and run your system properly, bro.
> 
> Why don't I block your posts? Cause people like me who know better have to run around cleaning up your mess.


Hey dude, I have over 200 games I've played in the last 4 years.

I do know what a CPU bottleneck is.

................................................

BeamNG

Arma 3

WoW

RIFT

Next Car Game

Far Cry 3

And many other games in my list gained at least 20 fps from getting a 4790K, and even more when I overclocked and added RAM.... Benchmarked Arma 3 with every CPU too.

Could not push any decent fps on my 4.4 FX 6300.

THAT'S WHY I SOLD IT.

So go play Minecraft and stop annoying me for no reason.

Btw, I say 1600x because we all know the Extreme preset is 1600x900 or something, and I have never in my entire life seen 1600x2560 written anywhere, because it never happened.

I did 1200 in the Extreme preset with the same 7950 that did 1000 max at 1150 core with the FX at 4.4. But the GPU was at just 1070 core.

Anyway, I had a Core 2 Duo E8400 before the FX 6300, and the single-thread performance is almost the same....


----------



## xer0h0ur

Here we find ourselves, nearly a week into August and no news yet on the Nano's release.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Here we find ourselves, nearly a week into August and no news yet on the Nano's release.


Yet we've seen the little guy lurking around, and know he exists.....


----------



## xer0h0ur

I am guessing it's going to be a late August launch, since we're not even getting any news yet, or even a paper launch.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am guessing it's going to be a late August launch, since we're not even getting any news yet, or even a paper launch.


Sep 1?

The Nano is so small they keep losing the card under some sofa all the time, so they can't even bench it.
3 weeks for Elmy to finish his mod.


----------



## xer0h0ur

Weeks ago they said August but didn't give a date. Who knows. It's possible they push it back, but supposedly they fixed the issue in production that was holding them back from mass producing in higher quantities.


----------



## Agent Smith1984

I want to see the performance, and price..... A single card wouldn't do me a bit of good, but if they aren't too expensive, 2 of those things in crossfire would be great (depending on how hot they get).


----------



## xer0h0ur

Still think you're better off with crossfired 390's @ 8GB a pop than Nanos. I am not even considering HBM cards until we reach HBM2 capacities.


----------



## SpeedyVT

Quote:


> Originally Posted by *xer0h0ur*
> 
> Still think you're better off with crossfired 390's @ 8GB a pop than Nanos. I am not even considering HBM cards until we reach HBM2 capacities.


Effectively with Windows 10 and DX12 the memory will be shared (4 + 4). Would be nice to see the experience.


----------



## xer0h0ur

For the billionth time: DX12 does not automatically stack vRAM. The application must specifically be developed using alternate frame rendering and stacked vRAM to be able to use this feature of DX12. Don't expect very many developers to take this route either, as it's complicated to implement, simply because there are so many varying amounts of vRAM on different graphics solutions out there.
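In DX12 terms, that means the engine itself has to drive each GPU through explicit multi-adapter, for example by alternating frames between devices. A toy sketch of just the AFR dispatch decision (the device names are hypothetical placeholders, not a real D3D12 API):

```python
# Toy alternate-frame-rendering (AFR) dispatcher: under explicit
# multi-adapter, the application, not the driver, decides which GPU
# renders which frame. Each GPU still needs its own copy of most
# resources, which is why vRAM does not simply "stack" for free.
class AfrDispatcher:
    def __init__(self, gpus):
        self.gpus = gpus  # hypothetical device handles

    def gpu_for_frame(self, frame_index: int):
        # Round-robin: frame N goes to GPU (N mod gpu_count).
        return self.gpus[frame_index % len(self.gpus)]

d = AfrDispatcher(["gpu0", "gpu1"])
print([d.gpu_for_frame(i) for i in range(4)])  # alternates gpu0/gpu1
```

The dispatch itself is trivial; the hard part the post alludes to is managing per-GPU resource copies and cross-adapter synchronization for every vRAM configuration out there.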


----------



## Jflisk

Don't know about the 4K thing. I have 2 of the Fury Xs, and the higher you go in VSR, the bigger the performance hit. Maybe the drivers just need some work.


----------



## bonami2

Just made a post about my 7950 and GTA V vRAM and RAM usage.

I add 2.5GB of RAM usage from pushing MSAA 2x to 4x with my CrossFire 7950s in GTA V at 5760x1080.

So I'm thinking the Fury X would be having a hard time in those cases, haha, with only 4GB.

R9 390X 8GB cards are the perfect GPUs for CrossFire currently.


----------



## mustrum

Quote:


> Originally Posted by *bonami2*
> 
> Just made a post about my 7950 and GTA V vRAM and RAM usage.
> 
> I add 2.5GB of RAM usage from pushing MSAA 2x to 4x with my CrossFire 7950s in GTA V at 5760x1080.
> 
> So I'm thinking the Fury X would be having a hard time in those cases, haha, with only 4GB.
> 
> R9 390X 8GB cards are the perfect GPUs for CrossFire currently.


Nope. Been tested by ComputerBase extensively. Apparently AMD manages vmem so well with Fiji that it does not slow down or even generate bad frametimes, even in situations where it should slow down (when other GPUs use way more than 4GB of vmem).

In short: nope, 2x 390X 8GB definitely aren't the best CrossFire solution. Fury X will be faster.


----------



## xer0h0ur

Quote:


> Originally Posted by *mustrum*
> 
> Nope. Been tested by ComputerBase extensively. Apparently AMD manages vmem so well with Fiji that it does not slow down or even generate bad frametimes, even in situations where it should slow down (when other GPUs use way more than 4GB of vmem).
> 
> In short: nope, 2x 390X 8GB definitely aren't the best CrossFire solution. Fury X will be faster.


True and False actually. AMD has to specifically optimize vRAM usage on a game by game basis and if they don't then you get hitching/stuttering (in games which would normally go past 4GB usage). You can find videos on youtube of people playing GTAV and SoM while experiencing this hitching on Fury X. Me personally, I don't trust AMD to optimize every title out there so I default to avoiding this generation like the plague so I am not dependent on them. They can't even deliver crossfire profiles in a timely manner. Much less vRAM optimizations.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> True and False actually. AMD has to specifically optimize vRAM usage on a game by game basis and if they don't then you get hitching/stuttering (in games which would normally go past 4GB usage). You can find videos on youtube of people playing GTAV and SoM while experiencing this hitching on Fury X. Me personally, I don't trust AMD to optimize every title out there so I default to avoiding this generation like the plague so I am not dependent on them. They can't even deliver crossfire profiles in a timely manner. Much less vRAM optimizations.


If anything, the value on running 2x 390's is superb.

I really want to see some 390x CF vs Fury Pro CF results (actual gaming benchmarks) though, still trying to find some.


----------



## Alastair

Why is the Sapphire Tri-X more expensive than the Strix on Amazon now? I wanted to order a normal Sapphire Tri-X and now they are like $584; what is going on?


----------



## xer0h0ur

Amazon inflating the price. Probably using the scarcity to take advantage of people despite AMD claiming they fixed the issue causing the hold up in manufacturing.


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> Amazon inflating the price. Probably using the scarcity to take advantage of people despite AMD claiming they fixed the issue causing the hold up in manufacturing.


Dammit. I guess I will wait until the price drops back down. I wanted to buy today because the ZAR keeps weakening against the dollar. But now I need to wait until the price goes down again, because I am in no way paying an extra 400 bucks for it.


----------



## bonami2

Well, I agree, I would not wait on AMD drivers.

Well, I love AMD and have had no problems with drivers for 2 years now, but you need to be patient.

And for Amazon, those are independent sellers on Amazon; it has nothing to do with Newegg or NCIX...... I could probably sell stuff on Amazon if I tried. Like eBay.

Well Alastair, you're stuck with the same problem as me... The currency is so bad in Canada.... we are worth like 74 US cents.


----------



## Alastair

I assume the price will go down again when Amazon gets new stock? Cause they say only one is available at this point?


----------



## bonami2

Quote:


> Originally Posted by *Alastair*
> 
> I assume the price will go down again when Amazon gets new stock? Cause they say only one is available at this point?


I do expect them to drop.

Amazon and Newegg have a tendency to raise and drop prices for no reason.

My 4790K was $400, dropped to $350 a week later, back to $400, then $320, etc. They just like to play with prices to make a profit.


----------



## antonis21

Do we have any info about ASUS Fury Strix availability?


----------



## denman

Is there any hint at when the X2 will be released? Looking to upgrade my GTX 690 and would love to go with a Fury X2.


----------



## bonami2

Quote:


> Originally Posted by *denman*
> 
> Is there any hint at when the X2 will be released? Looking to upgrade my GTX 690 and would love to go with a Fury X2.


Well, in at least 6 months I would think, maybe even more... The Fury is just out; I don't think they expect to ship a new variant right now.


----------



## xer0h0ur

Quote:


> Originally Posted by *denman*
> 
> Is there any hint at when the X2 will be released? Looking to upgrade my GTX 690 and would love to go with a Fury X2.


AMD has not given a month for the Fury X2's release. The last thing they mentioned was an August release for the R9 Nano and I would imagine that once that time comes around we may get some more news on the Fury X2.


----------



## Kaapstad




----------



## SpeedyVT

Quote:


> Originally Posted by *Kaapstad*


OMG you are my hero.


----------



## weinstein888

Any updates on unlocking the voltage/overclocking for these things? Very close to picking up a couple and tossing EK blocks on them.


----------



## ssateneth

Quote:


> Originally Posted by *weinstein888*
> 
> Any updates on unlocking the voltage/overclocking for these things? Very close to picking up a couple and tossing EK blocks on them.


Probably not. It's probably also not worth buying these. Word on the street is the Fury non-X can be unlocked on a per-card basis. Some can't be unlocked at all, some can be partially unlocked, and some can be fully unlocked. Find a Fury that can be fully unlocked to a Fury X on the best available custom PCB and you've got a killer card, since you will be able to OC it much better.

I can see a possible business in binning unlockable Furys and selling fully unlocked Furys on non-reference PCBs for a large premium. Of course, a big name brand won't do this, due to agreements with AMD that a full-fledged Fury X can only be on AMD's reference boards. You will just have to find an individual doing binning. Silicon Lottery comes to mind, though they only do Intel chips.


----------



## xer0h0ur

Quote:


> Originally Posted by *Kaapstad*


What manner of video card porn is this? I was under the impression there was no 4-way crossfire EK block terminal. Where did that sucker come from?


----------



## weinstein888

I don't know why AMD didn't let board partners develop custom Fury X models. Really killed it for me honestly.


----------



## xer0h0ur

Quote:


> Originally Posted by *ssateneth*
> 
> Probably not. It's probably also not worth buying these. Word on the street is the Fury non-X can be unlocked on a per-card basis. Some can't be unlocked at all, some can be partially unlocked, and some can be fully unlocked. Find a Fury that can be fully unlocked to a Fury X on the best available custom PCB and you've got a killer card, since you will be able to OC it much better.
> 
> I can see a possible business in binning unlockable Furys and selling fully unlocked Furys on non-reference PCBs for a large premium. Of course, a big name brand won't do this, due to agreements with AMD that a full-fledged Fury X can only be on AMD's reference boards. You will just have to find an individual doing binning. Silicon Lottery comes to mind, though they only do Intel chips.


This doesn't make any sense. The only existing custom PCB design so far for Fury is the Asus Strix, which is BIOS-gimped to a lower voltage than the reference-PCB Tri-X. At this point that custom PCB gives no benefit whatsoever apart from little to no coil whine and premium components. We are still waiting for the Afterburner and TriXX releases that allow Fury / Fury X voltage control. It's believed Asus is going to end up using that custom PCB design on more extreme variants, but there hasn't been anything shown or announced yet, so that may be little more than speculation.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> This doesn't make any sense. The only existing custom PCB design so far for Fury is the Asus Strix, which is BIOS-gimped to a lower voltage than the reference-PCB Tri-X. At this point that custom PCB gives no benefit whatsoever apart from little to no coil whine and premium components. We are still waiting for the Afterburner and TriXX releases that allow Fury / Fury X voltage control. It's believed Asus is going to end up using that custom PCB design on more extreme variants, but there hasn't been anything shown or announced yet, so that may be little more than speculation.


Don't mind him; he has zero clue what he's talking about...


----------



## p4inkill3r

Quote:


> Originally Posted by *Kaapstad*


----------



## bonami2

The Strix is built to be pushed; the VRMs are able to handle the load.

But Asus was drunk, like with the R9 290X...............

And stop saying Tri-X; those are the equivalent of the 1.25V 7950s from Sapphire that caught fire (well, BURNED UP IN SMOKE MOSTLY) because they had higher voltage (too much)

in mining and heavy loads.... In gaming they would probably survive some years. Thinking 100C; those 7950s were hitting 120C from what I remember.


----------



## Elmy

https://flic.kr/p/w5nGFB by Anthony Lackey, on Flickr

Great things come in small packages......


----------



## Mr.N00bLaR

do want


----------



## bonami2

They did say we could do CrossFire/SLI with more GPUs with DirectX 12, no? Or maybe that was just the load-sharing stuff, like offloading to the iGPU.

Could be awesome: 6-7-way Nano.


----------



## Ehsteve

Quote:


> Originally Posted by *Elmy*
> 
> 
> https://flic.kr/p/w5nGFB by Anthony Lackey, on Flickr
> 
> Great things come in small packages......


Get a tiny case and a PCI extender, then jury-rig the Nano fan over an intake/exhaust, fit a water cooler on the CPU, and suddenly the form factor becomes insanely compact, so long as the PSU orientation doesn't cause any issues.


----------



## mustrum

Quote:


> Originally Posted by *bonami2*
> 
> They did say we could do CrossFire/SLI with more GPUs with DirectX 12, no? Or maybe that was just the load-sharing stuff, like offloading to the iGPU.
> 
> Could be awesome: 6-7-way Nano.


I hope the multi-GPU DX12 rumor is a thing. I still have an R9 290 lying around that could accompany my Fury X.


----------



## Loeschzwerg

@Elmy:
Nano on an ASRock X99E-ITX









Which CPU would you use? i7 Extreme or Xeon?


----------



## fjordiales

R9 Fury Strix now available at Newegg for those interested. Just got a 3rd one 3 mins ago.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fjordiales*
> 
> R9 Fury Strix now available at Newegg for those interested. Just got a 3rd one 3 mins ago.


Those are really nice looking cards, and the components are high quality from my reading, but why did they set the core to 1.Gimp volts???


----------



## Alastair

Sapphire Radeon Fury Tri-X is now $599 on Amazon. *** is going on!

I do not want to buy the Strix because EKWB has no blocks and the BIOS is gimped to a low voltage. What am I to do?


----------



## bonami2

Well, 2 choices:

Low volt with overpowered VRMs

High volt with underpowered VRMs

I vote low volt.

Because my Windforce with quality VRMs stays under load at 1070 core with 1.090V. Under load to death at 1.090V, the VRMs don't budge at all.

Whereas my crappy Sapphire needs 1.187V for 1070, WITH A DEAD VDROOP to ABOUT 1.110 (I had 3 Dual-X cards that did about the same voltage).

That's with VRMs under 80C on the Sapphire.

But if you go on water it will probably make the Sapphire worth it, if the VRMs can be cooled right.

But for now I'm not impressed with Sapphire's soldering quality.


----------



## fjordiales

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Those are really nice looking cards, and the components are high quality from my reading, but why did they set the core to 1.Gimp volts???


Quote:


> Originally Posted by *Alastair*
> 
> Sapphire Radeon Fury Tri-X is now $599 on Amazon. *** is going on!
> 
> I do not want to buy the Strix because EKWB has no blocks and the BIOS is gimped to a low voltage. What am I to do?


I emailed an Asus rep about a BIOS, regarding their 1.7v compared to 1.212V from Sapphire; no response yet.

http://www.nowinstock.net/computers/videocards/amd/r9fury/


----------



## THUMPer1

Quote:


> Originally Posted by *fjordiales*
> 
> R9 fury Strix now available at newegg for those interested. Just got a 3rd one 3mins ago.


Is 1 Fury worth it for 1440p?


----------



## p4inkill3r

Yes.


----------



## bonami2

Quote:


> Originally Posted by *THUMPer1*
> 
> Is 1 Fury worth it for 1440p?


Perfect GPU.

You don't need that much vRAM for 1440p,

while 4K and surround need more.

In short, it's about two 7950s in performance.


----------



## THUMPer1

Quote:


> Originally Posted by *bonami2*
> 
> Perfect GPU.
> 
> You don't need that much vRAM for 1440p,
> 
> while 4K and surround need more.
> 
> In short, it's about two 7950s in performance.


Well, I was thinking 390, 390X, or Fury. Haven't made up my mind.


----------



## bonami2

Quote:


> Originally Posted by *THUMPer1*
> 
> Well, I was thinking 390, 390X, or Fury. Haven't made up my mind.


I would say go with the Fury.

The 390X is good in CrossFire for the vRAM, but it's still behind a Fury by a large amount.


----------



## Agent Smith1984

Quote:


> Originally Posted by *THUMPer1*
> 
> Well, I was thinking 390, 390X, or Fury. Haven't made up my mind.


At 1440, the difference is actually slim between all of those....



$330 for 390 vs $550 for Fury is a steep money hill to climb for a few more frames, but some people just got it like that... more power to 'em


----------



## Agent Smith1984

Quote:


> Originally Posted by *bonami2*
> 
> I would say go with the Fury.
> 
> The 390X is good in CrossFire for the vRAM, but it's still behind a Fury by a large amount.


NOT a large amount.... see the chart above... almost every major title shows the same results...


----------



## THUMPer1

Ya, I've been poring over reviews and benches. I also overclock everything, and I know Fury overclocking may not be all there.


----------



## Agent Smith1984

Quote:


> Originally Posted by *THUMPer1*
> 
> Ya, I've been poring over reviews and benches. I also overclock everything, and I know Fury overclocking may not be all there.


Yeah, OC potential kind of applies to everything though.

I'd bet with the 1170/1700 daily clocks I run on my 390, that I'm right there with that fury in actual gameplay performance. Especially at 1440....

Of course you can OC the Fury and then open the gap all over again..... Guess I'm just saying that I'd sooner see myself spending $660 on two cards than $550 on one that would be a good bit slower, and I would never spend over $1k on two.....

I just personally have financial limitations, and also "common sense" boundaries I refuse to break.

I certainly would LOVE to have a Fury, and two would be tits, but in the real world, with a wife and 4 kids, I just can't make sense of spending $200 more for an 8% performance increase.









I'm completely supportive of anyone in here who chooses to do so though


----------



## bonami2

Well, show a 3DMark Fire Strike score; if you reach 17-18k then your R9 390 is close, but I'm sure it ain't, because a 7970 at 1300 core was shown to get 290X-level performance.

Can't be that big of a gap between the R9 290X and 390.

I did 16800 and could probably do 18k with my 7950 CrossFire if I had voltage unlocked.

1080p Fire Strike.


----------



## fjordiales

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, OC potential kind of applies to everything though.
> 
> I'd bet with the 1170/1700 daily clocks I run on my 390, that I'm right there with that fury in actual gameplay performance. Especially at 1440....
> 
> Of course you can OC the Fury and then open the gap all over again..... Guess I'm just saying that I'd sooner see myself spending $660 on two cards than $550 on one that would be a good bit slower, and I would never spend over $1k on two.....
> 
> I just personally have financial limitations, and also "common sense" boundaries I refuse to break.
> 
> I certainly would LOVE to have a Fury, and two would be tits, but in the real world, with a wife and 4 kids, I just can't make sense of spending $200 more for an 8% performance increase.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm completely supportive of anyone in here who chooses to do so though


Well, what you said makes sense. With a family, the budget is very calculated and common-sense boundaries need to be set.

No kids for me yet, and I had some saved up for a GPU, so my common sense went out the door. Lol. After I get the 3rd Fury, I will see if it can be unlocked. Prepped for the 3rd with a T2 1600 (I know, overkill).


----------



## Forceman

Quote:


> Originally Posted by *THUMPer1*
> 
> Ya, I've been poring over reviews and benches. I also overclock everything, and I know Fury overclocking may not be all there.


I agree with Agent Smith1984, the Fury isn't worth it for 1440p. Crossfire 4K is prime Fury territory, but anything less than that the 290X/390/390X is just too much competition at a much better price.


----------



## bonami2

4GB for 4K is crap.

I'm hitting vRAM hard with CrossFire 7950s at only 5760x1080.

GTA V showed 7GB of vRAM used with SLI Titan Xs.... So that means 4GB + 3GB in system RAM, slow as hell. I was myself pushing 3GB + 2.5GB of system RAM and stuttering like mad.

But yeah, those R9 390Xs aren't that bad.


----------



## gatygun

Quote:


> Originally Posted by *bonami2*
> 
> 4GB for 4K is crap.
> 
> I'm hitting vRAM hard with CrossFire 7950s at only 5760x1080.
> 
> GTA V showed 7GB of vRAM used with SLI Titan Xs.... So that means 4GB + 3GB in system RAM, slow as hell. I was myself pushing 3GB + 2.5GB of system RAM and stuttering like mad.
> 
> But yeah, those R9 390Xs aren't that bad.


What it reports as used isn't really what is needed. Some games report towards 9GB of vRAM usage on a Titan X while a 4GB 980 has no issues playing the game. This just means the game uses the vRAM to cache data instead of the system memory. There really is no need for this, as that data is perfectly fine to be pushed into your system memory.

The only way to know what a game really needs vRAM-wise is to put GPUs with limited vRAM into a setup and test how far you can push the game.

Any 4GB vRAM card will play 4K GTA 5 without any issues if you've got enough GPU performance; there is absolutely no need for 7GB of vRAM at all.

Also, GTA 5 adds up vRAM from CrossFire/SLI setups, so an SLI Titan X will have 24GB of vRAM reported, while in reality it's just 12GB. If it showcases 7GB on an SLI Titan X, it basically means it uses a maximum of 3.5GB at times.

That your 7950 CrossFire setup struggles at that resolution isn't shocking, as you simply are vRAM starved.
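The CrossFire/SLI reporting point is simple arithmetic: if a tool sums vRAM across GPUs, divide by the GPU count to get the per-GPU figure. A sketch, using the 7GB SLI Titan X reading from the post above as the example:

```python
# With AFR, each GPU holds its own copy of the working set, so some
# tools report the sum across GPUs rather than per-GPU usage.
def per_gpu_vram(reported_gb: float, gpu_count: int) -> float:
    return reported_gb / gpu_count

# The 7GB reading on a 2-way SLI Titan X from the post above:
print(per_gpu_vram(7.0, 2))  # 3.5GB actually in use per GPU
```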


----------



## bonami2

Quote:


> Originally Posted by *gatygun*
> 
> What it reports as used isn't really what is needed. Some games report on a titan x towards 9gb of usaged v-ram while a 4gb 980 has no issue's playing the game. This just means it uses the v-ram to cache data instead of the system memory. There really is no need for this as that data is perfectly fine to be pushed on your system memory.
> 
> The only way to know what a game really needs v-ram wise is to push gpu's with limited v-ram inside a setup and test how far you can push the game.
> 
> Any 4gb v-ram card will play 4k gta 5 without any issue's if you got enough gpu performance, there is absolute no need for 7gb of v-ram at all.
> 
> Also gta 5 adds v-ram from crossfire/sli setups, so a sli titan x will have 24gb of v-ram reported. while in reality it's just 12gb. If it showcases 7 gb on a sli titan x, it basically means it uses a maximum of 3,5gb at times.
> 
> That your 7950 crossfire setup struggles on that resolution isn't shocking as you simple are v-ram starved.


I know all that. I see 3GB used in GPU-Z + 2.5GB of system RAM from just using 4x MSAA; at 2x I'm only using 3GB.

Those benchmarks are made without anti-aliasing and MSAA because 4K doesn't need it, etc.

Surround setups are still stuck running it even worse, because the side screens are stretched.

My GPUs are doing 288GB/s and 3GB each, so you're telling me that 1GB will save my life?

Not many people play in surround either; well, Eyefinity for me.

I may add that my GPU usage drops when going to 4x, while at 2x I can hold 60fps with the settings I use.. That means the system RAM is too slow (2400MHz CL10).


----------



## Agent Smith1984

I use 2.9GB vRAM in BF4 @ 4K ultra with no AA, and 2.3GB vRAM in Crysis 3 at 4K with very high textures and high system-spec settings, no AA.....

I haven't tried GTA yet, but the fact that the two I mentioned don't break 3GB is a pretty good indication that vRAM is fine for now. I did try 4x AA in BF4 and, though unplayable from an fps standpoint, the vRAM usage was 3.5GB....


----------



## bonami2

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I use 2.9GB of VRAM in BF4 @ 4K ultra with no AA, and 2.3GB of VRAM in Crysis 3 at 4K with very high textures and high system spec settings, no AA.....
> 
> I haven't tried GTA yet, but the fact that the two I mentioned don't break 3GB is a pretty good indication that VRAM is fine for now. I did try 4x AA on BF4, and though it was unplayable from an fps standpoint, the VRAM usage was 3.5GB....


BF4 looks like crap

Crysis 3 is a closed map

GTA V is a full map where I can see cars moving 3 km away.... Lots more textures to load


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I use 2.9GB of VRAM in BF4 @ 4K ultra with no AA, and 2.3GB of VRAM in Crysis 3 at 4K with very high textures and high system spec settings, no AA.....
> 
> I haven't tried GTA yet, but the fact that the two I mentioned don't break 3GB is a pretty good indication that VRAM is fine for now. I did try 4x AA on BF4, and though it was unplayable from an fps standpoint, the VRAM usage was 3.5GB....
> 
> 
> 
> BF4 looks like crap
> 
> Crysis 3 is a closed map
> 
> GTA V is a full map where I can see cars moving 3 km away.... Lots more textures to load
Click to expand...

Crysis 3 still uses one of the most advanced graphics engines to date.


----------



## SpeedyVT

Quote:


> Originally Posted by *Alastair*
> 
> Crysis 3 still uses one of the most advanced graphics engines to date.


That runs like a turd solidified by concrete. It is indeed outstanding, but it is by far the least optimized. Advanced doesn't always mean advancement.


----------



## the9quad

Quote:


> Originally Posted by *SpeedyVT*
> 
> That runs like a turd solidified by concrete. It is indeed outstanding, but it is by far the least optimized. Advanced doesn't always mean advancement.


One of the few games that will take what you throw at it and utilize it. Not sure what you are talking about. It runs at 1440p at a solid 100+ fps with everything cranked, utilizing all 3 cards here. Definitely a better-running game than a whole ton of other games that look worse: Witcher 3, Dying Light, Far Cry 4, AC Unity, GTA V, etc.. they all run worse and are terrible at utilizing your resources.


----------



## Agent Smith1984

Crysis 3 is great in my opinion....
With the most recent AMD drivers it still saw yet another jump in performance.

It's a great game, and still a great benchmark, and still looks outstanding.

Not to mention Crysis 3 gets along with AMD GPU's very well!

Not sure what the gripes are with that game?

Also, I think FireStrike is a cool benchmark and all, but I put very little stock in the results now considering the numbers we see in real gaming benchmarks, versus the numbers that some of these cards are putting up in FireStrike.

Currently, 4GB VRAM is not an issue, even at 4K... not to say it won't be when using unplayable settings, but then again, unplayable settings aren't usable to begin with, are they?


----------



## bonami2

The only game that pegs my 4790K at 80°C loading the map is GTA V; I call that optimization. And all the threads are working while in game....

Unplayable settings? I'm playing at 60fps on mostly high settings with 2x MSAA. I could do 40fps with 4x SMAA, but I lack the VRAM to do it; my GPU usage drops.


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Crysis 3 is great in my opinion....
> With the most recent AMD drivers it still saw yet another jump in performance.
> 
> It's a great game, and still a great benchmark, and still looks outstanding.
> 
> Not to mention Crysis 3 gets along with AMD GPU's very well!
> 
> Not sure what the gripes are with that game?
> 
> *Also, I think FireStrike is a cool benchmark and all, but I put very little stock in the results now considering the numbers we see in real gaming benchmarks, versus the numbers that some of these cards are putting up in FireStrike.*
> 
> Currently, 4GB VRAM is not an issue, even at 4K... not to say it won't be when using unplayable settings, but then again, unplayable settings aren't usable to begin with, are they?


You do realize that synthetics are 99% of the time the best-case scenario when it comes to testing a GPU's actual potential. The issue lies not on the GPU side of things but on the software/game-engine side: engines, at least for PC games (since PCs can have numerous types of configurations), are not infinitely scalable from software to hardware. We also have to take into consideration that some engines may simply run better on a different graphics architecture. Simple examples can be found by looking at games like Metro, Far Cry 4, Crysis 3, and SoM, where Fiji has much larger performance differentials compared to Hawaii and is competitive with or outperforms GM200. On the flip side, if we look at TW3, GTA V, and other games, Fiji will sometimes show single-digit performance increases over its predecessor and fall way behind GM200.

Now that's not necessarily the whole story for the performance differences between Hawaii and Fiji, as there do seem to be some possible architectural bottlenecks that are not simple to explain (and I certainly can't say for sure). When you look at the broader picture it is very messy, and performance differences are all across the spectrum from bad to good. Whatever the issue is, Fiji remains a fantastic advancement in my book. If there are people that don't think it's a worthy investment, they must step back and consider that in any circumstance there tend to be diminishing returns as the input ($$$ in this case) gets higher relative to the output that results.


----------



## TK421

Is Fury X performance still below the 980 Ti/TX? Has anyone managed to beat the TX/980 Ti by overclocking, new drivers, etc. with their Fury X? :|


----------



## cowie

asus strix voltage mods and -0c results








http://kingpincooling.com/forum/showthread.php?p=30820#post30820


----------



## bonami2

That thing rules


----------



## p4inkill3r

Quote:


> Originally Posted by *cowie*
> 
> asus strike voltage mods and -0c results
> 
> 
> 
> 
> 
> 
> 
> 
> http://kingpincooling.com/forum/showthread.php?p=30820#post30820


1450/1000


----------



## TK421

Quote:


> Originally Posted by *cowie*
> 
> asus strike voltage mods and -0c results
> 
> 
> 
> 
> 
> 
> 
> 
> http://kingpincooling.com/forum/showthread.php?p=30820#post30820


Anything that doesn't require LN2 and soldering? :V


----------



## SpeedyVT

Quote:


> Originally Posted by *cowie*
> 
> asus strike voltage mods and -0c results
> 
> 
> 
> 
> 
> 
> 
> 
> http://kingpincooling.com/forum/showthread.php?p=30820#post30820


MY GOD! That's insane!


----------



## flopper

Quote:


> Originally Posted by *THUMPer1*
> 
> Is 1 fury worth it for 1440p?


Depends.
The cost jump from the 390 to the Fury is really steep.
Here it's twice the money, from 330 euro to 600 euro.
Unless you get at least 30% more fps with the Fury, I would go with the 390 in every other case.
They both OC around the same, so it's one of those cases, as with the 980 Ti and such, where you really don't get much value for twice the money today.
Once the die shrink happens next year, maybe then, but at the moment it's a really difficult choice what to buy.

My 290 gave up, so I'm running a 6850 at the moment, and since I bought a new screen the Fury is now out of it for me; a 390 will replace it.
It's still an upgrade from the previous 290, albeit not huge, but enough to justify it vs a Fury.
I'll buy a 390 today and wait out next year's die shrink, and hopefully it happens fast.
If that die shrink really changes things, I wouldn't feel OK with a Fury/980 Ti buy today.
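The value comparison in this post can be put into rough numbers. A minimal sketch, assuming the 330/600 euro prices quoted here and granting the Fury the full 30% fps advantage named as the break-even point (the function name is illustrative):

```python
# Frames-per-second per euro as a crude value metric. The 390 is
# normalized to 1.0x performance; the Fury is given the +30% fps that
# the post treats as the minimum needed to justify the price.
def fps_per_euro(relative_fps: float, price_eur: float) -> float:
    return relative_fps / price_eur

r390 = fps_per_euro(1.00, 330.0)
fury = fps_per_euro(1.30, 600.0)

# The 390 still wins on value: +30% fps doesn't offset a ~1.8x price.
assert r390 > fury
assert round(600.0 / 330.0, 2) == 1.82
```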


----------



## Agent Smith1984

Quote:


> Originally Posted by *rv8000*
> 
> You do realize that synthetics are 99% the best case scenario when it comes testing a GPU's actual potential. The issue lies not on the GPU side of things, but the software/game engine side as engines, at least for pc games as pcs can have numerous types of configurations, are not infinitely scale-able software to hardware; we also have to take into consideration that some engines may simply run better on a different graphics architecture. Simple examples can be found by looking at games like Metro, Far Cry 4, Crysis 3, and SoM where Fiji has much larger performance differentials compared to Hawaii, and is competitive or out performs GM200. On the flipside if we look at W3, GTA V, and other games Fiji will sometimes show single digit performance increases over its predecessor and fall way behind GM200.
> 
> Now that's not necessarily the whole story for performance differences between Hawaii and Fiji, as there do seem to be some possible architectural bottlenecks that are not simple to explain (and I certainly can't say for sure). When you look at the broader picture, it is very messy, and performance differences are all across the spectrum of bad to good. Whatever the issue is, Fiji remains a fantastic advancement in my book. If there are people that don't think it's a worthy investment they must step back and look at the fact that in any circumstance there tends to be diminishing returns as input ($$$ in this case) gets higher and what output will result.


I agree with all of that, i guess i should have elaborated...

My point is, we never get to see that true potential until 2 years later, when it's already time for another release, just like with Hawaii in the 390 series.

It's primarily a software issue, from both a game developer standpoint and a driver standpoint.
I agree that benchmarks are the best case scenario, and a lot of that has to do with the fact that it's a predetermined sequence of scenes rendered by the gpu only.

In that scenario the Fury is 15-20% faster, but in real-world tests it tends to only be 5-10% faster, especially at resolutions of 1440p or less.

I know new tech always has a premium involved, i was just stating that the diminishing returns for that premium aren't part of my spending logic.

Happy for anyone who chooses to go that route though!

I'm a big follower of this thread, despite my lack of intentions of purchasing one of these cards any time soon.


----------



## fewness

Quote:


> Originally Posted by *cowie*
> 
> asus strike voltage mods and -0c results
> 
> 
> 
> 
> 
> 
> 
> 
> http://kingpincooling.com/forum/showthread.php?p=30820#post30820


So far we still can't do anything on ref card?


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I agree with all of that, i guess i should have elaborated...
> 
> My point is, we never get to see that true potential until 2 years later when it's already time for another release, just like with the Hawaii in the 390 series.
> 
> It's primarily a software issue, from both a game developer standpoint and a driver standpoint.
> I agree that benchmarks are the best case scenario, and a lot of that has to do with the fact that it's a predetermined sequence of scenes rendered by the gpu only.
> 
> In that scenario, the fury is 15-20% faster, but in real world tests, it tends to only be 5-10% faster, especially in resolutions of 1440 or less.
> 
> I know new tech always has a premium involved, i was just stating that the diminishing returns for that premium aren't part of my spending logic.
> 
> Happy for anyone who chooses to go that route though!
> 
> I'm a big follower of this thread, despite my lack of intentions of purchasing one of these cards any time soon.


In synthetics the gap is more like 20-30%. I've also found in practice the gap between my 290X Lightning and my Sapphire Fury Tri-X to be much larger, roughly 15% in TW3 at a slightly lower core clock on the Fury (with some more concrete testing I think the difference would actually prove to be slightly higher, ~20%). Apples-to-apples settings in my personal test showed my 290X (1140/1500) averaging around 52 fps @ 1440p, while my Fury (1100/530) was able to average ~62 fps @ 1440p. Almost all retail reviews have very high variation between results and actually misrepresent Fiji performance depending on where you look. I for one lack faith in almost every review site now aside from temps and noise (and even then the consistency is quite poor).
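The uplift implied by those two fps averages is simple percentage math; a quick sketch (the function name is mine):

```python
# Percentage uplift of one average fps figure over another.
def uplift_pct(baseline_fps: float, new_fps: float) -> float:
    return (new_fps / baseline_fps - 1.0) * 100.0

# 290X at ~52 fps vs Fury at ~62 fps, both at 1440p:
assert round(uplift_pct(52.0, 62.0), 1) == 19.2
```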


----------



## xer0h0ur

Games rarely if ever live up to a synthetic benchmark's performance.


----------



## bonami2

I always trusted benchmarks and always had the equivalent increase.

My 3DMark score doubled and I had double the fps in all my games, with 90-100% scaling...... I'm currently able to push close to Fury and 980 Ti levels of performance (not the fluidity).

But yeah, the Fury is not that good at 1080p.


----------



## th3illusiveman

So you can unlock Fury cards now?


----------



## Forceman

Quote:


> Originally Posted by *th3illusiveman*
> 
> So you can unlock Fury cards now?


You can partially unlock some of them (4 of the 8 disabled CUs). Seems like very, very few can be fully unlocked.
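For context, each Fiji compute unit carries 64 stream processors (the standard GCN organization), so those unlock steps translate directly into shader counts. A small sketch of that arithmetic:

```python
SP_PER_CU = 64  # GCN architecture: 64 stream processors per compute unit

def stream_processors(enabled_cus: int) -> int:
    """Shader count for a Fiji die with this many CUs enabled."""
    return enabled_cus * SP_PER_CU

assert stream_processors(56) == 3584  # Fury as shipped (8 CUs disabled)
assert stream_processors(60) == 3840  # the common partial unlock (+4 CUs)
assert stream_processors(64) == 4096  # full Fury X configuration
```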


----------



## xer0h0ur

Yeah, it's going to be a while before fully unlockable Fiji dies are more common. It's not like AMD is swimming in parts availability right now. At some point they will run out of dies with issues and will have no choice but to put fully functioning Fiji XT (Fury X) dies into the Fury.


----------



## provost

Quote:


> Originally Posted by *cowie*
> 
> asus strike voltage mods and -0c results
> 
> 
> 
> 
> 
> 
> 
> 
> http://kingpincooling.com/forum/showthread.php?p=30820#post30820


Great Graphics Score







but, needs some soldering to get there...








Physics score is holding the total score back; maybe due to his 4790K vs a 5960X..... or RAM?

Maybe we can see a Kingpin version of this, eh?







(if NV lets EVGA out of its exclusivity .... lol)


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> You can partially unlock some of them (4 of the 8 disabled CUs). Seems like very, very few can be fully unlocked.


Very few? Just glancing at the unlock thread shows a fairly high percentage of cards unlocking at least 4 extra CUs. Just from a guess I'd say there's a greater than 60% chance to unlock at least 4; all 8 is another story entirely.


----------



## xer0h0ur

I see nothing wrong with his statement. He only said "very, very few" about a full unlock.


----------



## Forceman

Quote:


> Originally Posted by *rv8000*
> 
> Very few? Just glancing at the unlock thread shows a fairly high percentage of cards unlocking at least 4 extra CUs. Just from a guess I'd say there's a greater than 60% chance to unlock at least 4; all 8 is another story entirely.


Yeah, some can unlock 4, and very very few can unlock 8. Was it so confusing the way I wrote it originally?


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, some can unlock 4, and very very few can unlock 8. Was it so confusing the way I wrote it originally?


I probably stopped reading halfway through


----------



## Neon Lights

Hey, does anyone know what would happen if that LN2 BIOS was flashed on a Fury X (reference of course)?

Could the voltage then be only regulated using ASUS GPU Tweak?


----------



## xer0h0ur

There is no hardware or software lock prohibiting voltage changes to begin with. There simply hasn't been any software released by anyone that controls it yet. At this point I wouldn't be surprised if Nvidia paid off Unwinder and the guy from TechPowerUp who does TriXX not to release it.


----------



## Neon Lights

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is no hardware or software lock prohibiting voltage changes to begin with. There simply hasn't been any software released by anyone that controls it yet. At this point I wouldn't be surprised if Nvidia paid off Unwinder and the guy from TechPowerUp who does TriXX not to release it.


So there would be no problems if I did so?

I hope the person who created the thread soon uploads the BIOS!


----------



## xer0h0ur

Nooooo. I didn't say that. I wouldn't be surprised if the LN2 BIOS makes voltage changes which aren't kosher under standard use. I can't really speak on the matter as I am not familiar with that BIOS at all.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is no hardware or software lock prohibiting voltage changes to begin with. There simply hasn't been any software released by anyone that controls it yet. *At this point I wouldn't be surprised if Nvidia paid off Unwinder and the guy from TechPowerUp who does TriXX not to release it.*


Wizzard has already submitted a beta version to Sapphire for testing (I personally asked him this in a PM), and it's hardly been a week since voltage control was first announced working. Unwinder unfortunately has not had a Fiji card in hand to test with, and therefore can't properly develop the software currently. Quite the ridiculous comment to make.


----------



## xer0h0ur

Yeah, July 27th was the date on his piece on TechPowerUp. Obviously I have no proof. It just seemed suspect as hell to me considering Wizzard already said it's as good as it can be done, and he released a piece on TechPowerUp basically sour-graping the results on his card while in effect providing half-measured benchmarking. People other than Unwinder had also made progress getting it working in Afterburner, but that hasn't materialized publicly either. Clearly I can't put any blame on Unwinder, because it's not like he should have to pony up cash every time to have a video card on hand for Afterburner support.


----------



## Elmy

Quote:


> Originally Posted by *Loeschzwerg*
> 
> @Elmy:
> Nano on an ASRock X99E-ITX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What CPU to be used? i7 Extreme or Xeon?


5960X

For those that want to see what I am building. I have posted in the case mod section here.

http://www.overclock.net/t/1568676/amd-nano-project


----------



## littlestereo

Got my Fury X's in the EKWB nickel acrylic blocks finally, and they haven't gotten over 30-35°C running Unigine for almost an hour. The UV blue coolant looks amazing through the acrylic! (2 x 240mm Black Ice Nemesis GTS radiators with a Swiftech MCP50X pump in its own loop, 4 x Thermaltake Riing 12 fans.)



Spoiler: More Images!


----------



## eucalyptus

How is it going with the AMD Radeon Fury X2?

I am building an ITX rig, and I have an EVGA 980 Ti Hydro Copper ordered.

But since I've got ITX with only one PCI Express slot, I need as much power as possible from one card.

Guess I will have to go for the 980 Ti, since there is nothing new about the Fury X2? No release or anything?


----------



## xer0h0ur

They haven't even mentioned a month for Fury X2, much less given a release date. They still have to release the R9 Nano this month. We may get more information around the Nano's release.


----------



## ozyo

any news about voltage control ?


----------



## Medusa666

These cards do look amazing; have they sorted out the pump noise problem as of today?


----------



## xer0h0ur

Fury X? That was sorted out immediately. Not that it matters, though, since if any store has old stock you would still get the first revision. I doubt any old stock still exists anywhere other than perhaps mom-and-pop stores that don't move merchandise quickly.


----------



## rv8000

Quote:


> Originally Posted by *ozyo*
> 
> any news about voltage control ?


If the beta version of the new TriXX checks out with Sapphire, I'm guessing we should see it out sometime this week if all goes well, as Wizzard sent them a build for testing late last week.

As far as AB is concerned, Unwinder has no Fiji card to work with so no ETA on AB support for voltage control.

GPUTweak is a mystery at this point.


----------



## dir_d

How does he still not have a Fiji?


----------



## xer0h0ur

Because obviously MSI are a bunch of morons. They should be responsible for providing Unwinder with cards to update afterburner but they didn't.


----------



## rv8000

Quote:


> Originally Posted by *dir_d*
> 
> How does he still not have a Fiji?


I don't think he ever goes out and buys the newest GPU's, AFAIK MSI often provides him with a sample and for one reason or another this is not the case for Fiji.


----------



## Agent Smith1984

Does MSI even offer a Fiji card? I haven't seen one at all?


----------



## p4inkill3r

They have a Fury X, I haven't seen a non-X model though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> They have a Fury X, I haven't seen a non-X model though.


That's pretty surprising considering how well the gaming models sell for every other GPU series they offer.

I run the 390 owners club, and the amount of MSI Gaming cards I have on the roster compared to all other brands, is staggering!!

Can't believe they don't have a fiji offering???


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's pretty surprising considering how well the gaming models sell for every other GPU series they offer.
> 
> I run the 390 owners club, and the amount of MSI Gaming cards I have on the roster compared to all other brands, is staggering!!
> 
> Can't believe they don't have a fiji offering???


Apparently they opted out. I don't think Furys are selling particularly well :/


----------



## Ceadderman

Pretty sure they haven't. Stock is low due to HBM die stocks being low. MSi isn't the only producer affected by this.









~Ceadder


----------



## Agent Smith1984

Maybe MSI told AMD they didn't want to move forward with a Fury card until AMD could guarantee x amount of parts?
I mean, I'm sure MSI expects to sell the crap out of it.... why bother doing these tiny production runs on limited stock when you can let AMD get their availability issues straightened out and make a bunch at once....


----------



## dir_d

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Maybe, MSI told AMD they didn't want to move forward with a Fury card until AMD could guarantee x amount of parts?
> I mean, I'm sure MSI expects to sell the crap out of it.... why bother doing these tiny production runs on limited stock, when you can let AMD get their availability issues straightened, and make a bunch at once....


This could be true, i want that Lightning Fury


----------



## Alastair

Anyone have any idea when the price of the Sapphire Fury Tri-X is going to go down on Amazon? They are now $599! It's goddamned ridiculous.


----------



## Kuivamaa

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Maybe, MSI told AMD they didn't want to move forward with a Fury card until AMD could guarantee x amount of parts?
> I mean, I'm sure MSI expects to sell the crap out of it.... why bother doing these tiny production runs on limited stock, when you can let AMD get their availability issues straightened, and make a bunch at once....


Limited production; Sapphire and ASUS claimed almost all the chips.


----------



## bonami2

Quote:


> Originally Posted by *Alastair*
> 
> Any one have any idea when price on Sapphire Fury Tri-x is going to go down on Amazon? They are now 599! It's God damned ridiculous.


It's $750 CAD currently here


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Any one have any idea when price on Sapphire Fury Tri-x is going to go down on Amazon? They are now 599! It's God damned ridiculous.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It 750$ cad currently here
Click to expand...

It is ridiculous. I am going to buy two. That's a big sale. I won't touch the ASUS Fury with a 10-foot barge pole. I want the Sapphire card. I do not understand why.


----------



## bonami2

Well, only 2 manufacturers = "hey dude, want to play with prices? oh yeah, good idea"........ And we customers are screwed again


----------



## rv8000

Quote:


> Originally Posted by *bonami2*
> 
> Well only 2 manufacturer = hey dude we play with price? oh yea good idea........ And us customer are screwed again


Has nothing to do with Sapphire and ASUS, they set an MSRP and prices will only be reduced by them. You only have 3rd parties like Newegg, Amazon, and others to blame as they're the ones who jack the prices up.


----------



## bonami2

Quote:


> Originally Posted by *rv8000*
> 
> Has nothing to do with Sapphire and ASUS, they set an MSRP and prices will only be reduced by them. You only have 3rd parties like Newegg, Amazon, and others to blame as they're the ones who jack the prices up.


Well, that's what I meant to say, sorry...

Someone somewhere is playing with prices

I have a bad habit of writing faster than I read ahahah


----------



## Ceadderman

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bonami2*
> 
> Well only 2 manufacturer = hey dude we play with price? oh yea good idea........ And us customer are screwed again
> 
> 
> 
> Has nothing to do with Sapphire and ASUS, they set an MSRP and prices will only be reduced by them. You only have 3rd parties like Newegg, Amazon, and others to blame as they're the ones who jack the prices up.
Click to expand...

*QFT*









~Ceadder


----------



## Yorkston

Looks like my replacement Fury X is also getting RMA'ed to Newegg this week. Nasty coil whine/buzzing that gets worse the higher I set the framerate cap. The fan also makes an awful howling noise at anything over ~1100rpm. Pity they were smart and don't allow refunds for the Fury X, only replacement.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Yorkston*
> 
> Looks like my replacement Fury X is also getting RMA'ed to Newegg this week. Nasty coil whine/buzzing that gets worse the higher I set the framerate cap. The fan also makes an awful howling noise at anything over ~1100rpm. Pity they were smart and don't allow refunds for the Fury X, only replacement.


Replacement only is a crappy policy considering that they may not even have one in stock for you!!


----------



## Alastair

Is there any way to complain about the inflating of prices? Because Amazon is the only place I can find that ships to SA, and the fact that the prices have been inflated is making me so damn angry. I just got the money in a few days ago and now the price has gone up. Really screws the pooch. There has to be something consumers can do.


----------



## Jflisk

I am thinking about doing a PowerColor RMA on my Fury X. Every so often it looks like my refresh rate is changing: my screen halves, with green at the bottom, while the top has the Windows screen with lines through it. If I go into preferences and change either 3D on/off or the resolution, the screen goes back to normal; no logs, no driver crashes. Like whatever it's doing is normal, other than the screen looking like heck. Think I'll throw my bottom card to the top and see what happens. Could be Windows 10 and the drivers. Plays games like a champ though; it can go hours on all the BF and Crysis titles with no problems. This is a weird one.


----------



## Medusa666

Ordered a Fury X card to play around with today, will be fun to see how it performs.


----------



## Alastair

When do you guys think powercolor will bring their Fury to market?


----------



## Nizzen

Quote:


> Originally Posted by *Medusa666*
> 
> Ordered a Fury X card to play around with today, will be fun to see how it performs.


It is not fun at all. I've had mine for 1.5 months. It sucks at overclocking and performs worse than a 980 Ti when overclocked.

Good thing I have some Titan X`s in the other PC...


----------



## p4inkill3r

Quote:


> Originally Posted by *Nizzen*
> 
> It is not fun at all. I've had mine for 1.5 months. It sucks at overclocking and performs worse than a 980 Ti when overclocked.
> 
> Good thing I have some Titan X`s in the other PC...


I've had mine for just as long and love it.
It is quiet, cool, and performs great in every game I play.


----------



## Medusa666

Quote:


> Originally Posted by *Nizzen*
> 
> It is not fun at all. I've had mine for 1.5 months. It sucks at overclocking and performs worse than a 980 Ti when overclocked.
> 
> Good thing I have some Titan X`s in the other PC...


Quote:


> Originally Posted by *p4inkill3r*
> 
> I've had mine for just as long and love it.
> It is quiet, cool, and performs great in every game I play.


Sounds like a mixed bag of reactions here.

I'm going to test the card out, and have some fun with it, then see if I want to stay with it or not.

The things that really appeal to me are the quietness of the card (given that the pump issue is resolved on the one I'm getting), the low temperatures, the lower power draw, and the non-throttling.

What I do not like is that it has an AIO; eventually it will fail. Neither do I like that the card is so small. The fan seems hard to replace on the radiator; it was better on the 295X2, where you could simply disconnect it.

My hopes are that the card will outperform my 295X2 in general, mainly by removing the throttling, which disturbs gaming somewhat at times, and by delivering a smoother, more refined experience.


----------



## Medusa666

Does anyone here have information about the quality grade of the components used on the Fury X PCB?

Same goes for the pump and loop: how good are these compared to Asetek's solution that was used on the 295X2, the Swiftech technology, or the do-it-yourself AIOs for Nvidia and AMD cards such as the Kraken variants?


----------



## nickcnse

Just picked up a used XFX R9 Fury X for $550, did I do good? Also, does anyone know if throwing an EK waterblock on it would void the warranty? Thanks guys.

In answer to my own question and for anyone else who was wondering, from the XFX website:

"Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Products returned to XFX must be fully assembled with the original thermal solution (heatsink, fansink, etc) that was installed at the time of purchase.

XFX graphics card were designed to perform optimally with our manufacturers thermal solutions, however you may feel the need to push your performance higher than what we have designed it for, so we can only gurantee the performance and quality of the product as it was originally intended so you should keep in mind that over clocking your graphics card and the use of water cooling solutions is at your own risk and that damage to the card via improper use such as unregulated power overages will not be covered. Any physical damage such as burn marks or damaged PCB will void ALL warranties.

This modder friendly policy only applies to the United States and Canada.

- See more at: http://xfxforce.com/en-us/support/faq#sthash.HXYFx6B5.dpuf"


----------



## bonami2

Quote:


> Originally Posted by *nickcnse*
> 
> Just picked up a used XFX R9 Fury X for $550, did I do good? Also, does anyone know if throwing an EK waterblock on it would void the warranty? Thanks guys.
> 
> In answer to my own question and for anyone else who was wondering, from the XFX website:
> 
> "Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Products returned to XFX must be fully assembled with the original thermal solution (heatsink, fansink, etc) that was installed at the time of purchase.
> 
> XFX graphics card were designed to perform optimally with our manufacturers thermal solutions, however you may feel the need to push your performance higher than what we have designed it for, so we can only gurantee the performance and quality of the product as it was originally intended so you should keep in mind that over clocking your graphics card and the use of water cooling solutions is at your own risk and that damage to the card via improper use such as unregulated power overages will not be covered. Any physical damage such as burn marks or damaged PCB will void ALL warranties.
> 
> This modder friendly policy only applies to the United States and Canada.
> 
> - See more at: http://xfxforce.com/en-us/support/faq#sthash.HXYFx6B5.dpuf"


Paper-wise it's against the warranty, but if you put the cooler back on they will, 98% of the time, RMA it without a problem.


----------



## Alastair

Guys, any news on PowerColor's air-cooled Furys?


----------



## Jflisk

Quote:


> Originally Posted by *nickcnse*
> 
> Just picked up a used XFX R9 Fury X for $550, did I do good? Also, does anyone know if throwing an EK waterblock on it would void the warranty? Thanks guys.
> 
> In answer to my own question and for anyone else who was wondering, from the XFX website:
> 
> "Installing third party cooling solutions does not void warranty on our products. Just be sure to keep the original cooling solution as it will have to be on the card if it is ever sent in for RMA. Products returned to XFX must be fully assembled with the original thermal solution (heatsink, fansink, etc) that was installed at the time of purchase.
> 
> XFX graphics card were designed to perform optimally with our manufacturers thermal solutions, however you may feel the need to push your performance higher than what we have designed it for, so we can only gurantee the performance and quality of the product as it was originally intended so you should keep in mind that over clocking your graphics card and the use of water cooling solutions is at your own risk and that damage to the card via improper use such as unregulated power overages will not be covered. Any physical damage such as burn marks or damaged PCB will void ALL warranties.
> 
> This modder friendly policy only applies to the United States and Canada.
> 
> - See more at: http://xfxforce.com/en-us/support/faq#sthash.HXYFx6B5.dpuf"


I have had XFX cards for the longest time; they will honor the warranty as long as you put their cooler back on before shipping it back, in the US. Also make sure the previous user did not register the card with XFX. If they did, then they have the warranty.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> Guys. Any news on Powercolour Fury airs?


I was reading an article about AMD stocks, and the author delved into a lengthy bit about Fury.....

He said that "some of AMD's partners are not even able to bring third party variants of the Fury air to the market yet, due to a lack of supply"

I'm guessing one of two things has happened....

Either the partners said "screw making a Fury air until you can get me [x] amount of parts for a mass production"

Or AMD said, "sorry MSI, PowerColor, and XFX, we can't give you guys anything yet, we need to take care of Asus and Sapphire first"


----------



## en9dmp

I just uninstalled and reinstalled Sapphire TriXX and Afterburner, and now I don't get the core or memory clocks on the sliders... they just show 0-100 and can't be adjusted!

It was working fine before... Anyone had this problem with their Furies?


----------



## rv8000

Quote:


> Originally Posted by *en9dmp*
> 
> I just uninstalled and reinstalled sapphire trixx and afterburner and now I don't get the core or memory clocks on the sliders... They just show 0-100 and can't be adjusted!
> 
> It was working fine before... Anyone had this problem with their Furies?


Did you re-enable unofficial overclocking and set "extended" overclocking limits in the settings tab?


----------



## en9dmp

Quote:


> Originally Posted by *rv8000*
> 
> Did you re-enable unofficial overclocking and set "extended" overclocking limits in the settings tab?


Yeah, I'm pretty certain I set all the extended options, disabled ULPS, etc.; it's pretty strange. I'll send a screenshot when I get back home. I can still overclock OK in CCC, but I'm hoping voltage control is around the corner, so I'll need the new Trixx(xxx?) app to work properly when it comes out...


----------



## rv8000

Quote:


> Originally Posted by *en9dmp*
> 
> Yeah, I'm pretty certain I set all the extended options, disabled ULPS etc, it's pretty strange. I'll send a screenshot when I get back home. I can still overclock OK in CCC but hoping voltage control is around the corner so will need the new trixx(xxx?) app to work properly when it comes out...


I'm sure it's just a fluke with the AB settings. I couldn't get the slider to move either, so I ended up reinstalling, and after the reinstall and setting the proper settings I had no issues with the memory slider again.
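For anyone hitting the same stuck-slider issue, the two settings rv8000 mentions also live in `MSIAfterburner.cfg` next to the Afterburner executable, so you can check whether a reinstall actually reset them. A sketch of the relevant section (key names recalled from Afterburner 4.x, so verify against your own file before editing; the "extend official overclocking limits" checkbox is in the UI's Settings tab):

```ini
; MSIAfterburner.cfg -- [Settings] section (Afterburner 4.x, AMD cards)
[Settings]
; Unofficial overclocking only activates when this line contains the exact
; EULA acceptance string; ticking the checkbox in the UI writes it for you.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = official limits, 1 = unofficial with PowerPlay, 2 = without PowerPlay
UnofficialOverclockingMode = 1
```

If the EULA line is blank after a reinstall, the sliders fall back to the locked 0-100 behavior described above.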


----------



## Ceadderman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys. Any news on Powercolour Fury airs?
> 
> 
> 
> I was reading an article about AMD stocks, and the author delved into a lengthy bit about Fury.....
> 
> He said that "some of AMD's partners are not even able to bring third party variants of the Fury air to the market yet, due to a lack of supply"
> 
> I'm guessing one of two things has happened....
> 
> Either the partners said "screw making a Fury air until you can get me [x] amount of parts for a mass production"
> 
> *Or AMD said, "sorry MSI, PowerColor, and XFX, we can't give you guys anything yet, we need to take care of Asus and Sapphire first"*
Click to expand...

I seriously doubt that's what happened.

It's likely lack of HBM modules from supplier that's hamstringing the production level of complete cards. It's new tech and there are always issues with new tech in the run up process imho.

Fury X is two months old today, and people want the world available to them without hassle. I get it. So do I. But you gotta have patience where new tech is concerned. Sadly, the demand created by the shortage inspires vendors to stick it to the consumer.









~Ceadder


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ceadderman*
> 
> I seriously doubt that's what happened.
> 
> It's likely lack of HBM modules from supplier that's hamstringing the production level of complete cards. It's new tech and there are always issues with new tech in the run up process imho.
> 
> Fury X is two months old today. And people want the world available to them without hassle. I get it. So do I. But ya gotta have patience where new tech is concerned. Sadly the demand created by the glut inspires vendors to stick it to the consumer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I am more inclined to believe that XFX, MSI, and PowerColor have postponed making a Fury card of their own accord, due to supply limitations. But you also have to suspect that part of the limitation is due to AMD wanting to scratch Sapphire's and Asus' backs first......


----------



## p4inkill3r

One could assume that if MSI wasn't such a big player.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> One could assume that if MSI wasn't such a big player.


I thought that too, but if you figure that NVIDIA has 75% of the GPU market share, and MSI manufactures cards for both, maybe they didn't want any part of Fury until AMD could guarantee a substantial enough supply to mass-produce 30-90 days' worth of inventory.

Think of all the cards they are producing there... it would be very inefficient to make a limited number of Furys per batch. I'm sure the cooler is already designed, and I wouldn't be shocked if MSI uses something really close to the reference board design. I'm guessing we will see the Twin Frozr V on their Fury, almost identical to the one used on the 970, 980, 980 Ti, and the 390/390X.


----------



## p4inkill3r

That is a more likely scenario for sure.


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> I seriously doubt that's what happened.
> 
> It's likely lack of HBM modules from supplier that's hamstringing the production level of complete cards. It's new tech and there are always issues with new tech in the run up process imho.
> 
> Fury X is two months old today. And people want the world available to them without hassle. I get it. So do I. But ya gotta have patience where new tech is concerned. Sadly the demand created by the glut inspires vendors to stick it to the consumer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am more inclined to believe that XFX, MSI, and PowerColor has postponed making a Fury card on their own accord, due to supply limitations. But you also have to suspect that part of the limitation is due to AMD wanting to scratch Sapphire and Asus' backs first......
Click to expand...

But I mean, PowerColor has pictures of their air-cooled PCS+ Fury on their website, which must mean something is happening. But it's been ages since those pics came out and we are still waiting. I'm just keen to see those get released because I am not too impressed with Amazon's latest price gouging on the Sapphire cards. And I don't want the Asus because of its crippled BIOS and lack of water blocks.


----------



## bonami2

Well, I would say be patient, but at the same time it's a trap. Just wait next year for 8GB HBM, haha.

And next for HBM25555 with 42GB.


----------



## Alastair

Quote:


> Originally Posted by *bonami2*
> 
> Well i would says be patient but at the same time it a trap. Just wait next years for 8gb hbm ahah
> 
> And next for hbm25555 with 42gb.


Nah, no time to wait. My 6850s are aging, struggling more and more with newer games by the day. I want to buy my Furys yesterday. And the longer I wait, the more the ZAR-to-dollar rate deteriorates due to the damage the ANC is doing to this country. If I don't see a price drop on Amazon within the next day or so, I'm going to buy from the Egg and just ask my friend in the Virgin Islands to ship them to me.


----------



## Henderjc




----------



## rx7racer

That looks pretty darn smexxy, kinda sad I went EK now.


----------



## Orthello

Sorry fellas, haven't kept up to date. Has voltage control been released yet for Fury? I know W1zzard was making progress?


----------



## rv8000

Quote:


> Originally Posted by *Orthello*
> 
> Sorry fellas havn't kept up to date, has voltage control been released yet for fury ? i know wizard was making progress ?


Not yet, he has submitted a new beta to Sapphire for testing recently, still hoping to see it sometime this week unless there are major issues.


----------



## looncraz

Quote:


> Originally Posted by *Ceadderman*
> 
> I seriously doubt that's what happened.
> 
> It's likely lack of HBM modules from supplier that's hamstringing the production level of complete cards. It's new tech and there are always issues with new tech in the run up process imho.
> 
> Fury X is two months old today. And people want the world available to them without hassle. I get it. So do I. But ya gotta have patience where new tech is concerned. Sadly the demand created by the glut inspires vendors to stick it to the consumer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


You know, this got me to thinking about the bitter reality here:

nVidia may end up directly benefiting from the fact that AMD is pushing for mainstream production of HBM first. This way, by the time they have a solution ready for market, many of the kinks will have been worked out - and only AMD will have had to deal with the negative market effects involved.

Sometimes I think AMD should just stop innovating, it almost always gets them burned.


----------



## Ceadderman

Meh, somebody had to do it, and there is nothing wrong with the tech other than the lack of HBM supply from the only chip builder of HBM.

Not even nVidia coulda cranked these out faster. It's okay; the longer I have to wait, the sooner Fury X2 will be here.







lulz

I gotta focus on my mod first so...









~Ceadder


----------



## fewness

Quote:


> Originally Posted by *rv8000*
> 
> Not yet, he has submitted a new beta to Sapphire for testing recently, still hoping to see it sometime this week unless there are major issues.


Can I volunteer to beta test it? Yes, I'm that eager to unlock the voltage of my Fury X


----------



## fewness

Quote:


> Originally Posted by *looncraz*
> 
> You know, this got me to thinking about the bitter reality here:
> 
> nVidia may end up directly benefiting from the fact that AMD is pushing for mainstream production of HBM first. This way, by the time they have a solution ready for market, many of the kinks will have been worked out - and only AMD will have had to deal with the negative market effects involved.
> 
> Sometimes I think AMD should just stop innovating, it almost always gets them burned.


There is nothing wrong with AMD trying to lead in HBM technology. But I feel it was a wrong decision to rush it out with Fury. They could have kept HBM to an even smaller scale of experimental products, while mass-producing the Fury line with 8GB of GDDR5.


----------



## Yorkston

^My card. Cellphone camera doesn't really capture the whine but you can hear the buzzing under load. Off to Sapphire with you, since Newegg counts the 30 days from the original purchase instead of when they ship replacements. Hopefully 3rd time is the charm.


----------



## Alastair

Quote:


> Originally Posted by *Yorkston*
> 
> 
> 
> 
> 
> 
> ^My card. Cellphone camera doesn't really capture the whine but you can hear the buzzing under load. Off to Sapphire with you, since Newegg counts the 30 days from the original purchase instead of when they ship replacements. Hopefully 3rd time is the charm.


Instead of just sending it back, why don't you take steps to reduce the whine? Ultimately, coil whine isn't really a fault; it's just an audible vibration of the power components. It's not like the product is defective. Running programs that cause heavy load for 24-48 hours can reduce coil whine.


----------



## Alastair

Any one know of any promo codes for Newegg so that I can shave a bit off the cost of my Fury's?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yorkston*
> 
> 
> 
> 
> 
> 
> ^My card. Cellphone camera doesn't really capture the whine but you can hear the buzzing under load. Off to Sapphire with you, since Newegg counts the 30 days from the original purchase instead of when they ship replacements. Hopefully 3rd time is the charm.
> 
> 
> 
> Instead of just sending it back why don't you just take steps to reduce the whine. Cause ultimately coil whine isn't really a fault. It's just an audible vibration of the power components. It's not like the product is defective. Run programs that cause heavy load over 24-48 hours. That can reduce coil whine.
Click to expand...

Launch CS:GO and leave the main menu up overnight with vsync off... it usually helps a lot with coil whine.


----------



## p4inkill3r

The post-run credits screen in Heaven is my favorite cardblaster.


----------



## Sgt Bilko

Quote:


> Originally Posted by *p4inkill3r*
> 
> The post-run credits screen in Heaven is my favorite cardblaster.


Never thought of that one before.....thank you









It's basically anything that gives you stupidly high fps, really, but the higher the fps, the less time it takes to wear the whine down.


----------



## jase78

Newegg promo code: 0u812


----------



## Alastair

It does not want to work. Oh well.


----------



## royfrosty

I'm still awaiting my Water blocks for my 2x fury x from Bitspower. Cant wait!


----------



## xer0h0ur

Quote:


> Originally Posted by *looncraz*
> 
> You know, this got me to thinking about the bitter reality here:
> 
> nVidia may end up directly benefiting from the fact that AMD is pushing for mainstream production of HBM first. This way, by the time they have a solution ready for market, many of the kinks will have been worked out - and only AMD will have had to deal with the negative market effects involved.
> 
> Sometimes I think AMD should just stop innovating, it almost always gets them burned.


Except for the fact that AMD retains priority access to HBM2 so Nvidia gets to stand in line with their hands out hoping to get some HBM2 put on their interposers.

Edit: The only chance in hell Nvidia has of circumventing AMD's priority access with Hynix is if Samsung begins production of HBM themselves.


----------



## xer0h0ur

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Launch CS:GO, leave up the main menu overnight with vsync off....usually helps alot with coil whine


I did this, but I opened the developer console and put in the command fps_max 0 to uncap the framerate.
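For anyone repeating this trick, the relevant commands can also go in an autoexec so the uncapped main menu is one launch away. A minimal sketch using standard Source-engine cvars (double-check them against your own build's console):

```
// autoexec.cfg -- uncapped CS:GO main menu for coil-whine burn-in
fps_max 0      // 0 removes the frame cap entirely
mat_vsync 0    // ensure vsync is off so the framerate can spike
```

Leaving the menu like this overnight is the same load pattern Sgt Bilko describes above.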


----------



## Yorkston

I'll give the Heaven credits thing a shot this weekend.

On a semi-related note, I've also been having some issues with the MG279Q I got to go along with the Fury X. I'm occasionally getting DisplayPort link failures while gaming, even after a Windows re-install and trying 4 different cables. Since I got an actual VESA-certified cable I have been fine as long as I keep the card at near-stock clocks, but overclocking it gives the chance of a link failure. I would chalk that up to card instability, but switching back to my old 1080p monitor lets me run 1140/550 all day with no issues whatsoever. Likewise, clocks that were stable on my old R9 290 also cause link errors with the new monitor.

Any ideas what is going on here? Bad monitor, bad cable, or does DisplayPort just hate AMD cards?


----------



## Ceadderman

Which version of Windows are you running?









~Ceadder


----------



## Yorkston

Win7 x64, just did a clean install over the weekend.


----------



## Mr.N00bLaR

No nano info yet?


----------



## xer0h0ur

Still waiting on its launch this month. When? Who knows. However, considering we're near mid-month with nothing yet, I am guessing the end of the month.


----------



## Jflisk

It's probably so nano you can't even see it


----------



## fjordiales

Got my 3rd Strix...


----------



## CM Felinni

Quote:


> Originally Posted by *fjordiales*
> 
> Got my 3rd Strix...


I spy a HAF XB! Those Fury cards are looking good in that chassis!


----------



## looncraz

Quote:


> Originally Posted by *xer0h0ur*
> 
> Except for the fact that AMD retains priority access to HBM2 so Nvidia gets to stand in line with their hands out hoping to get some HBM2 put on their interposers.
> 
> Edit: The only chance in hell Nvidia has of circumventing AMD's priority access with Hynix is if Samsung begins production of HBM themselves.


Assuming AMD really has priority access, of course, rather than just some profit share (both would be awesome).


----------



## xer0h0ur

I feel as if people are sometimes intentionally ignorant. I informed you on something. You could have just as easily googled what I told you. Instead you decided to live in la la land. *shrug*

http://www.tweaktown.com/news/46420/amd-priority-access-hbm2-advantage-over-nvidia/index.html

http://hexus.net/tech/news/graphics/84662-amd-said-secured-priority-access-sk-hynixs-hbm2-chips/

http://www.eteknix.com/amd-priority-access-nvidia-hbm2/

http://wccftech.com/amd-working-entire-range-hbm-gpus-follow-fiji-fury-lineup/


----------



## bonami2

Edit nothing


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> I feel as if people are sometimes intentionally ignorant. I informed you on something. You could have just as easily googled what I told you. Instead you decided to live in la la land. *shrug*
> 
> http://www.tweaktown.com/news/46420/amd-priority-access-hbm2-advantage-over-nvidia/index.html
> 
> http://hexus.net/tech/news/graphics/84662-amd-said-secured-priority-access-sk-hynixs-hbm2-chips/
> 
> http://www.eteknix.com/amd-priority-access-nvidia-hbm2/
> 
> http://wccftech.com/amd-working-entire-range-hbm-gpus-follow-fiji-fury-lineup/


All those articles use the WCCF article as a source. So really it's one unattributed WCCF "source" that is making that claim.


----------



## Elmy

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> No nano info yet?


I got one.... whats your problem? LoL


----------



## Mr.N00bLaR

Quote:


> Originally Posted by *Elmy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr.N00bLaR*
> 
> No nano info yet?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got one.... whats your problem? LoL
Click to expand...

What's my problem? WHAT'S MY PROBLEM!!!!?????

This is my problem:

It just BARELY fits, lol. I'm moving to a 4U, but I'd still like more horsepower in a smaller package while sucking down less juice.


----------



## Alastair

And still the Amazon price gouging on the Sapphire Fury continues. Can anyone tell me: should I get the Strix instead? There are no blocks for it; will there be blocks? Also, still nothing on PowerColor's Fury. So get the Strix now, or wait and see for PowerColor?


----------



## Ceadderman

Quote:


> Originally Posted by *Alastair*
> 
> And still the Amazon price rape of Sapphire Fury continues. Can anyone tell me. Should I get strixx instead? There are no blocks for it, will there be blocks? Also still nothing on Powercolors Fury. So get the strixx now. Or see and wait for powercolor?


I don't know what the issue is. I just found an MSI for $699, a Sapphire for less, and VisionTek and PowerColor cards split the difference. Seems they have good stock too, but I'm on my phone atm.

Yes, there are some serious gougers on Amazon, but I found the cards mentioned simply by searching "Fury X" in their search box.









Now I am torn between my mod *OR* a Fury X...









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> And still the Amazon price rape of Sapphire Fury continues. Can anyone tell me. Should I get strixx instead? There are no blocks for it, will there be blocks? Also still nothing on Powercolors Fury. So get the strixx now. Or see and wait for powercolor?
> 
> 
> 
> I don't know what the issue is, I just found an MSi for $699 a Sapphire for less a VisionTek and PowerColor cards split the difference seems they have good stock too but am on my phone atm.
> 
> Yes there are some serious gougers' on Amazon but I found the cards mentioned simply by searching "Fury X" in their search box.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I am torn between my mod *OR* a Fury X...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
Click to expand...

The prices for Fury X are still reasonable. But I am talking about the Sapphire Fury Tri-X. The price is supposed to be 549, but they are selling it at 600. At that point it seems almost pointless to get, and you'd rather just get a full-size Fury X. But I don't want a Fury X; can't afford it. I wanted the Sapphire Tri-X Fury, and then I wanted to experiment with unlocking and other BIOS hacks just for the lols.


----------



## Ceadderman

Shop on Google, mate. Found all kinds of Tri-Xs.









~Ceadder


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> The prices for Fury X are still reasonable. But I am talking about the Sapphire Fury Tri-x. Price is supposed to be 549. But they are selling it at 600. At that point it seems almost pointless to get and rather just get a full size Fury X. But I don't want Fury X. Can't afford it. I wanted Sapphire Tri-x Fury. And then I wanted to see and experiment with unlocking and other bios hacks just for the Lols.


You're importing to SA aren't you? Why are you looking to go with a Fury/X over a 980 Ti?


----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> The prices for Fury X are still reasonable. But I am talking about the Sapphire Fury Tri-x. Price is supposed to be 549. But they are selling it at 600. At that point it seems almost pointless to get and rather just get a full size Fury X. But I don't want Fury X. Can't afford it. I wanted Sapphire Tri-x Fury. And then I wanted to see and experiment with unlocking and other bios hacks just for the Lols.
> 
> 
> 
> You're importing to SA aren't you? Why are you looking to go with a Fury/X over a 980 Ti?
Click to expand...

Simply because I don't like what Nvidia has been doing with GameWorks. Didn't like what they did with the GTX 970 issue. Don't like how they are pushing to lock out overclocking on mobile parts. Besides, I have been using AMD since the HD 5770, and I have never wanted to switch because I have never had a problem. I also want the chance to own the latest and greatest in technology, aka HBM, and get the chance to tinker with it a bit.


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> Shop on Google Mate. Found all kinds of Tri-Xs'.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


How does one shop on Google?


----------



## Alastair

Also, dunno if it's an error, but an Amazon seller with a good reputation has the Sapphire Fury Tri-X for 252 a piece. I have ordered two. Will see what happens.


----------



## Origondoo

Is there any chance to get more info about the Fury X2 on the Nano release day?

Really curious about Nano and Double Nano


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> Also. Dunno if it's an error. But an Amazon seller with good reputation has Sapphire Fury Tri-x for 252 a piece. I have ordered two. Will see what happens.


252 a piece?

Don't be shocked when some 290 tri-x show up!!

Then again, Amazon is well known to have pricing errors sometimes, and if you really end up with two Fury cards for the price of one, then I'm jelly!!


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Also. Dunno if it's an error. But an Amazon seller with good reputation has Sapphire Fury Tri-x for 252 a piece. I have ordered two. Will see what happens.
> 
> 
> 
> 252 a piece?
> 
> Don't be shocked when some 290 tri-x show up!!
> 
> Then again, Amazon is well known to have pricing errors sometimes, and if you really end up with two Fury cards for the price of one, then I'm jelly!!
Click to expand...

Yeah, I am also prepared to pull the plug at a moment's notice, do not worry. Also, if need be: someone mentioned shopping on Google. How do you shop on Google?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Also. Dunno if it's an error. But an Amazon seller with good reputation has Sapphire Fury Tri-x for 252 a piece. I have ordered two. Will see what happens.
> 
> 
> 
> 252 a piece?
> 
> Don't be shocked when some 290 tri-x show up!!
> 
> Then again, Amazon is well known to have pricing errors sometimes, and if you really end up with two Fury cards for the price of one, then I'm jelly!!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> yeah I am also prepared to pull the plug in a moments notice do not worry. Also if need be. Someone mentioned shop on Google. How do you shop on Google?
Click to expand...

Hell of a deal if it's true, but here's the Google shopping thingy: LINK


----------



## Alastair

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Also. Dunno if it's an error. But an Amazon seller with good reputation has Sapphire Fury Tri-x for 252 a piece. I have ordered two. Will see what happens.
> 
> 
> 
> 252 a piece?
> 
> Don't be shocked when some 290 tri-x show up!!
> 
> Then again, Amazon is well known to have pricing errors sometimes, and if you really end up with two Fury cards for the price of one, then I'm jelly!!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> yeah I am also prepared to pull the plug in a moments notice do not worry. Also if need be. Someone mentioned shop on Google. How do you shop on Google?
> 
> Click to expand...
> 
> Hell of a deal if it's true but here the Google shopping thingy: LINK
Click to expand...

If only that were true. The Amazon seller contacted me; they had made an error. So yeah, oh well. It was worth a shot anyway.

I found the shopping thingy link, but it doesn't seem to be much use to me here in South Africa. Or is there a way I can select preferences?

If the price of the Sapphire Fury is going to continue to be way up at 600 dollars... I can't believe people even buy it at that price. As for the Strix, still sitting merry at 569. I'm stuck between waiting for PowerColor or buying the Strix.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> If only that were true. Amazon seller Contacted me. They had made an error. So yeah. Oh well. It was worth the shot any way.
> 
> I found the shopping thingy link. But it doesn't seem to be much use to me here in South Africa. Or is there a way I can select preferences.
> 
> It the price of the Sapphire Fury is going to continue to be way up at 600 dollars. I can't believe people even buy it at that price. As for the Strixx. Still sitting merry at 569. I'm stick between wait for powercolour or buy strixx.


I wish for you, that the voltage control would roll out, so that buying the strix (unless you plan to go water) would be a viable option.

The tri-x is in no way, shape, or form worth $600. I'm sorry, it's just not that much card.


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> If only that were true. Amazon seller Contacted me. They had made an error. So yeah. Oh well. It was worth the shot any way.
> 
> I found the shopping thingy link. But it doesn't seem to be much use to me here in South Africa. Or is there a way I can select preferences.
> 
> It the price of the Sapphire Fury is going to continue to be way up at 600 dollars. I can't believe people even buy it at that price. As for the Strixx. Still sitting merry at 569. I'm stick between wait for powercolour or buy strixx.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wish for you, that the voltage control would roll out, so that buying the strix (unless you plan to go water) would be a viable option.
> 
> The tri-x is in no way, shape, or form worth $600. I'm sorry, it's just not that much card.
Click to expand...

I know, that's why I am SOOOOO angry. I was happy to buy two Sapphires at 549, even 559, but at 600 on Amazon it's a no-no. And Amazon is like the only place that will ship to South Africa. I dunno about the Strix. Can anyone confirm if there will be blocks for it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> I know that's why I am SOOOOO angry. I was happy to buy two sapphires at 549 even 559 but at 600 on Amazon it's a no no. And Amazon is like the only place that will ship to South Africa I dunno about the strixx. Can anyone confirm if there will be blocks for it?


I would think there would be blocks, but again, you'll lose the luxury of the short PCB (not sure that matters much since we are all used to large boards at this point anyways).

Have you considered holding out until the end of the month for the Nano (we hope)?


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I know that's why I am SOOOOO angry. I was happy to buy two sapphires at 549 even 559 but at 600 on Amazon it's a no no. And Amazon is like the only place that will ship to South Africa I dunno about the strixx. Can anyone confirm if there will be blocks for it?
> 
> 
> 
> I would think there would be blocks, but again, you'll the luxury of the short PCB (not sure that matters much since we are all used to large boards at this point anyways).
> 
> Have you considered holding until the end of the month for Nano (we hope)?
Click to expand...

The long pub of the strixx does not bother me. I have plenty of room in my Phantom 820. I am just worried about blocks and overclock ability. I know the Sapphire wins that at this stage simply because of a slightly higher stick voltage.

I don't know how long I can wait. I want to rebuild my computer before the NAG rAge gaming expo here in SA. That is in October. And also, now that it's public knowledge, AMD might try to find a way to prevent people from unlocking the disabled cores on Fiji, which might happen on later-batch cards. Because I want to try my hand at that. And the Strix doesn't have a backup BIOS like the Sapphire, so again, another point to the Sapphire, especially if I'm gonna try to unlock. So yeah. Sigh.


----------



## Ceadderman

I suggest going to EK and using their cooling configurator to see if blocks are available. Pretty sure there are but not so much that I would guarantee availability.









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> I suggest going to EK and using their cooling configurator to see if blocks are available. Pretty sure there are but not so much that I would guarantee availability.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I did. All it says is "coming soon".


----------



## Ceadderman

There you go. It's a positive answer at least.









~Ceadder


----------



## looncraz

Quote:


> Originally Posted by *xer0h0ur*
> 
> I feel as if people are sometimes intentionally ignorant. I informed you on something. You could have just as easily googled what I told you. Instead you decided to live in la la land. *shrug*
> 
> http://www.tweaktown.com/news/46420/amd-priority-access-hbm2-advantage-over-nvidia/index.html
> 
> http://hexus.net/tech/news/graphics/84662-amd-said-secured-priority-access-sk-hynixs-hbm2-chips/
> 
> http://www.eteknix.com/amd-priority-access-nvidia-hbm2/
> 
> http://wccftech.com/amd-working-entire-range-hbm-gpus-follow-fiji-fury-lineup/


I was actually the first one to suggest this was the case, however there is no official confirmation. It is just everyone parroting rumor, quite possibly starting from ME. Until there is an official confirmation of it, it may as well not be the case. It certainly is not yet fact.


----------



## Ceadderman

Actually, there was a writeup in CPU magazine where they mention that not only did AMD create the tech, they contracted with the HBM manufacturer in Korea for exclusive access to it. If you follow the bouncing ball, logic dictates that NVIDIA is stuck out in the cold.







lulz

~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> All those articles use the WCCF article as a source. So really it's one unattributed WCCF "source" that is making that claim.


Have you been keeping score with WCCF's Fiji reports for the past year? They have been wrong once, on their performance estimate for Fiji XT, but every piece with insider information and photos has been bang on. Whoever their source happens to be is clearly working for AMD, or quite closely with them within an AIB. They were the ones that debunked the report of there being fewer than 30,000 Fury X's for sale this year, they were the first to report HBM1 was exclusive to AMD (and obviously the first to report HBM2 priority access for AMD), and the first, as far as I know, to show accurate renders of the Fury X and the Strix Fury. I am sure I am forgetting a lot of things in between there.

The fact remains though that if Samsung decides to crash the HBM party then that entire advantage goes up in smoke.


----------



## Jflisk

I was looking around not much of a price difference between Fury(Strix) and Fury X .


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you been keeping score with WCCF's Fiji reports for the past year? They have been wrong once on their performance estimate for Fiji XT but every piece with insider information and photos has been bang on. Whomever their source happens to be is clearly working for AMD or quite closely with them within an AIB. They were the ones that debunked the report of there being less than 30,000 Fury X's for sale this year, they were the first to report HBM1 was exclusive to AMD (obviously first to report HBM2 priority access for AMD), first far as I know to show accurate renders of Fury X and the Strix Fury. I am sure I am forgetting a lot of things in between there.
> 
> The fact remains though that if Samsung decides to crash the HBM party then that entire advantage goes up in smoke.


AMD sources a very small amount of VRAM from Samsung these days, and with NVIDIA still holding 75% market share, you can believe they'd love the opportunity to go straight to NVIDIA with HBM2 or perhaps a proprietary equivalent. (Though I've not seen anything like that happen before outside of the Blu-ray/HD DVD battle.)


----------



## xer0h0ur

There would be no point to reinvent the wheel. HBM is a JEDEC standard, so any member of JEDEC has the right to produce HBM if they felt so inclined.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> There would be no point to re-invent the wheel. HBM is a Jedec standard so any member of Jedec has the right to produce HBM if they so felt inclined to.


Yep, and I assume AMD has first dibs at HBM2 with regards to Hynix.

But with AMD showing Samsung very little love right now (and maybe Elpida too now, who came to bat for them in a major way during the bitcoin fiasco), NVIDIA may knock on their door(s) with a suitcase full of money and their palms wide open, lol.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> There would be no point to re-invent the wheel. HBM is a Jedec standard so any member of Jedec has the right to produce HBM if they so felt inclined to.


True, but they open themselves up for litigation if they do since AMD created the tech. Unless they alter the tech enough to make it worth pushing that envelope. Very savvy of AMD to protect themselves on two fronts imho.









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> True, but they open themselves up for litigation if they do since AMD created the tech. Unless they alter the tech enough to make it worth pushing that envelope. Very savvy of AMD to protect themselves on two fronts imho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


What? I just finished saying HBM is a JEDEC standard. Every single member of JEDEC has the right to manufacture any technology standardized by JEDEC. While SK Hynix and AMD co-developed HBM, it's not exclusive to them altogether. Hynix is, however, the only company currently manufacturing HBM, and they aren't stupid enough to give anyone any pointers on how to do it either. There are only two other companies with the fabs and enough money to throw around to manufacture HBM if they really wanted to, and that's Samsung and Intel. Both of which are members of JEDEC.


----------



## Kana-Maru

AMD Radeon R9 Fury X user here. Upgraded from dual Nvidia GTX 670s in SLI. I'm fed up with Nvidia at the moment and voted with my dollars instead of speech. You know how some people give lengthy speeches about Nvidia's serious issues and problems, but excuse all the negativity by buying another Nvidia product? Well, I'm not that person, and I went AMD this time around. I was tired of seeing my dual 670s' overclock and performance decrease from mid-2014 to late 2014. The drivers in 2015 were simply pathetic compared to the 2014 drivers: crashing, causing color bleeds, etc. I had many reasons for changing brands this time around.

Overall I'm happy with the Fury X. Over 15,000 points in 3DMark Fire Strike, more than 8,000 points in Fire Strike Extreme, and more than 4,300 in Fire Strike Ultra @ stock clocks. All while maintaining great temperatures. I haven't attempted to overclock my Fury X just yet since I haven't really needed to. I might mess around with OC'ing this weekend.

Running Shadow of Mordor 100% maxed [Ultra + 6GB texture pack] @ *4K*, my setup [X5660+X58+Fury X] matched an i7-4960X+X79+Fury X, giving me a very smooth 48fps average during the "in-game" benchmark. The "built-in" benchmarking tool gave me a 48.73fps average. Temps stayed below 60°C.









The card idles around 26-29°C depending on the ambient and case temperature.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> AMD Radeon R9 Fury X user here. Upgraded from dual Nvidia GTX 670s SLI. I'm fed up with Nvidia at the moment and voted with my dollars instead of speech. You know how some people give lengthy speeches about Nvidia serious issues and problems, but excuse all Nvidia negativity by buying another Nvidia product. Well I'm not that person and went AMD this time around. I was tired of seeing my dual 670s overclock and performance decrease from mid 2014 to late 2014. The drivers in 2015 was simply pathetic compared to 2014 drivers. Crashing and causing color bleeds etc. I had many reasons for changing brands this time around.
> 
> Overall I'm happy with the Fury X. Over 15,000 points in 3DMark Fire Strike, more than 8,000 points in Fire Strike Extreme, and more than 4,300 in Fire Strike Ultra @ stock clocks. All while maintaining great temperatures. I haven't attempted to overclock my Fury X just yet since I haven't really needed to. I might mess around with OC'ing this weekend.
> 
> Running Shadows of Mordor 100% maxed [Ultra + 6GB Texture Pack] @ *4K* matched a i7-4960X+X79+Fury X, giving me [X5660+X58+Fury X] a very smooth 48fps average during "in-game" benchmark. The "built in" Benchmarking Tool gave me 48.73fps Average. Temps stayed below 60c.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The card idles around 26c-29c depending on the ambient and case temperature.


Nice to see you take your stand, and also glad you are happy with the performance.

I really want to get SoM and see how it runs on my 390..... I've literally not played that game at all yet.

I'm always one of the "late to the party" guys with games and hardware, though. It's mainly a money thing for me... but I also like to wait until plenty of refined patches and drivers are out. The only thing I've been "first to the table" on around here was this R9 390, but there's not really much "new" going on with these. I sold off my pair of Tri-X 290's in order to ultimately run 2x 390's for the larger frame buffer needed in some cases (and possibly more cases soon) @ 4K.

My theory is that the Fury pushes information through its memory so quickly that the VRAM size is not an issue, as it inherently doesn't need to store as much information at one time. That could be completely inaccurate though?
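For scale, the raw bandwidth gap that theory leans on can be sketched from the widely published interface specs (the function and figures below are a back-of-the-envelope illustration, not measurements):

```python
# Peak-bandwidth comparison using commonly reported specs:
#   Fury X : 4096-bit HBM interface, 500 MHz DDR -> 1 Gbps per pin
#   R9 390 : 512-bit GDDR5 interface, 6 Gbps effective per pin
#   R9 290X: 512-bit GDDR5 interface, 5 Gbps effective per pin

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s (bits -> bytes: divide by 8)."""
    return bus_width_bits * gbps_per_pin / 8

cards = {
    "Fury X (HBM)":    (4096, 1.0),
    "R9 390 (GDDR5)":  (512, 6.0),
    "R9 290X (GDDR5)": (512, 5.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")
# Fury X works out to 512 GB/s vs 384 GB/s for the 390.
```

So the Fury X has roughly a third more raw bandwidth than a 390, though bandwidth alone wouldn't explain lower reported VRAM *usage*, only faster swapping.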


----------



## xer0h0ur

It's not the speed; it's literally how the VRAM is being used that differs between the GDDR5 cards and the HBM cards. They aren't clogging up the VRAM with as many things as the GDDR5 cards are. This is why usage is significantly lower on the Fury/X compared to the Hawaii-based cards even when using the same Ultra/uncompressed textures in SoM. At some point there was hitching/stuttering in SoM while using the Fury X; I presume it was fixed since then. I just wish AMD was more specific on how they are managing to pull this off.


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice to see you take your stand, and also glad you are happy with the performance.
> 
> I really want to get SoM and see how it runs on my 390..... I've literally not played that game at all yet.
> 
> I'me always one of the "late to the party" guys with games and hardware though. It's mainly a money thing for me... but I also like to wait until plenty of refined patches and drivers are out also. The only thing I've been "first to the table" on around here was this R9 390, but not really much "new" going on with these. I sold off my pair of Tri-X 290's in order to ultimately run 2x 390's for the larger frame buffer needed in some cases (and possibly more cases soon) @ 4K.


Thanks. So far it's dominating my older 670s as I hoped. The 670s were amazing until Nvidia stopped optimizing the drivers and flat out degraded performance on my Kepler cards. I lost more than 1,000 points in 3DMark Fire Strike. The Fury X is much better than my highest dual-670 score.

SoM is a pretty great game. It's a pretty deep action/RPG/fantasy game. The graphics are great and the gameplay is very smooth. There was a patch for the game, and it has been out since last year. It's worth the purchase. I didn't know it was linked to The Lord of the Rings; I'm not a LotR fan or anything like that. I prefer SoM anyway. I restarted the game and put 2 hours into it with my gf. We are only 6% done. So yeah, you will have to make time to complete this title for sure.

I thought about getting the R9 390X. I actually considered getting dual 390Xs for CFX, but instead I wanted the water cooler. It has paid off and kept my room much cooler. Those air blowers can heat up the room quickly, especially when running dual GPUs.

Quote:


> My theory is that the frame buffer of the Fury processes information through the memory so quickly, that the VRAM size is not an issue, as it inherently doesn't need to store as much information at one time. That could be completely inaccurate though?


Well, I need to investigate this. I believe you are correct though. I had no micro-stutter. With only 4GB of HBM vs the 6GB texture pack requirement, I had no issues @ 4K, which requires a decent amount of VRAM on its own. Even when the screen was filled with many enemies and monsters I had no issues with frame rates or frame times. HBM does appear to be very fast. My computer is also fast enough to handle the data, I imagine. I will try to get around to playing SoM @ 4K again while monitoring the VRAM info to see how high it goes. I'm also going to check out other things as well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its not the speed, its literally how the vRAM is getting used that is different between the GDDR5 cards and the HBM cards. They aren't clogging up the vRAM with as many things as the GDDR5 cards are. This is why usage is significantly lower on the Fury/X compared to the Hawaii based cards even when using the same Ultra / uncompressed textures in SoM. At some point there was hitching / stuttering in SoM while using Fury X. I presume it was fixed since then. I just wish AMD was more specific on how they are managing to pull this off.


Interesting....

Can anyone with a Fury or Fury X please verify their VRAM usage at these settings (in BF4):

4k (I use 4096x2160, but also have VRAM usage numbers for 3840)
DEFAULT Ultra settings but with AA turned off
100% Res scale...

Very curious to see the VRAM usage for Fury VS my Hawaii!
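For anyone who'd rather log hard numbers than eyeball an overlay: on Linux the open amdgpu driver exposes VRAM counters in sysfs, so a tiny poller can record usage during a run. (A sketch only: `card0` and the sample count are assumptions, and on Windows/fglrx you'd use GPU-Z or Afterburner logging instead.)

```python
import time
from pathlib import Path
from typing import Optional

# amdgpu exposes VRAM counters (in bytes) under the card's device node.
# "card0" is an assumption -- pick whichever /sys/class/drm/card*/ is the GPU.
VRAM_USED = Path("/sys/class/drm/card0/device/mem_info_vram_used")

def bytes_to_mib(n: int) -> float:
    """Convert raw byte counts from sysfs into MiB for readable logs."""
    return n / (1024 ** 2)

def sample_vram_mib(path: Path = VRAM_USED) -> Optional[float]:
    """Return current VRAM usage in MiB, or None if the counter is absent."""
    try:
        return bytes_to_mib(int(path.read_text().strip()))
    except (FileNotFoundError, ValueError, PermissionError):
        return None

if __name__ == "__main__":
    # Log a few quick samples; raise the count/interval for a full session.
    for _ in range(3):
        used = sample_vram_mib()
        print("VRAM used:", f"{used:.0f} MiB" if used is not None else "n/a")
        time.sleep(1)
```

Run it in a terminal while BF4 is up and compare the peaks between a Fury and a Hawaii card at the same settings.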


----------



## Kana-Maru

As I said in my previous post, I will try to find some time to monitor the VRAM usage @ 4K and report back. I had no micro-stutter, so the 4GB of HBM is definitely doing its job.


----------



## criminal

Quote:


> Originally Posted by *jase78*
> 
> New egg promo code :0u812


I can't believe no one commented on this... lol. And someone actually tried it as a promo code. I actually laughed out loud!

On topic: We need some Nano information. Really digging that card!


----------



## blue1512

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Interesting....
> 
> Can anyone with a Fury or Fury X please verify their VRAM usage at these settings (in BF4):
> 
> 4k (I use 4096x2160, but also have VRAM usage numbers for 3840)
> DEFAULT Ultra settings but with AA turned off
> 100% Res scale...
> 
> Very curious to see the VRAM usage for Fury VS my Hawaii!


From what AMD said, they optimized VRAM usage on HBM in their driver on a game-by-game basis. Not many games need this optimization, by the way.


----------



## josephimports

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Interesting....
> 
> Can anyone with a Fury or Fury X please verify their VRAM usage at these settings (in BF4):
> 
> 4k (I use 4096x2160, but also have VRAM usage numbers for 3840)
> DEFAULT Ultra settings but with AA turned off
> 100% Res scale...
> 
> Very curious to see the VRAM usage for Fury VS my Hawaii!


Operation Locker 64P


Spoiler: Warning: Spoiler!


----------



## battleaxe

Quote:


> Originally Posted by *josephimports*
> 
> Operation Locker 64P
> 
> 
> Spoiler: Warning: Spoiler!


Wow... not even 3GB in XFire... am I seeing that right?


----------



## PontiacGTX

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its not the speed, its literally how the vRAM is getting used that is different between the GDDR5 cards and the HBM cards. They aren't clogging up the vRAM with as many things as the GDDR5 cards are. This is why usage is significantly lower on the Fury/X compared to the Hawaii based cards even when using the same Ultra / uncompressed textures in SoM. At some point there was hitching / stuttering in SoM while using Fury X. I presume it was fixed since then. I just wish AMD was more specific on how they are managing to pull this off.


The VRAM demand is the same on GDDR5 and HBM; it's just that the smaller frame buffer that remains forces the driver to use the least it can. Either way, a game that requires extra VRAM will have poor performance.
Quote:


> Originally Posted by *battleaxe*
> 
> Wow... not even 3Gb in Xfire... do I see that right?


Battlefield 4 doesn't require that much VRAM unless you use the downsampling option.


----------



## bonami2

I need a minimum of 5GB of VRAM to be safe at 5760x1080 in GTA V, and that is not maxed out.

I would say a minimum of 8GB of VRAM is needed in my setup.....

So I'm gonna wait for some 12GB VRAM GPU with ultra-fast bandwidth to upgrade.

Most of my games use less than 3-4GB though.

Some even use less than 1.5GB with pretty nice graphics.


----------



## Ceadderman

~Ceadder


----------



## Agent Smith1984

Quote:


> Originally Posted by *josephimports*
> 
> Operation Locker 64P
> 
> 
> Spoiler: Warning: Spoiler!


Thanks for posting, i get the same results.


----------



## bonami2

Quote:


> Originally Posted by *josephimports*
> 
> Operation Locker 64P
> 
> 
> Spoiler: Warning: Spoiler!


4.8 at 1.21V? Damn, is that a god sample? I want one.


----------



## Gumbi

Quote:


> Originally Posted by *bonami2*
> 
> 4.8 1.21v Damn is that a god sample i want one


And I thought *I* had a good sample (4.9 at 1.31v). Don't think even my card can go that low, maybe 1.24/1.25 for 4.8?


----------



## battleaxe

Are there any plans for non reference models on the FuryX?


----------



## ozyo

Quote:


> Originally Posted by *bonami2*
> 
> i need a minimum of 5gb vram to be sure at 5760x1080 in gta v and that is not maxxed out.
> 
> I would says i minimum of 8gb vram is needed in my setup.....
> 
> So im gonna wait to get some 12gb vram gpu with ultra fast bandwith to upgrade
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Most of my game use less than 3-4gb thought
> 
> even one less than 1.5gb with pretty nice graphic




Quote:


> Originally Posted by *battleaxe*
> 
> Are there any plans for non reference models on the FuryX?


no


----------



## battleaxe

Quote:


> Originally Posted by *ozyo*
> 
> 
> no


Darn it... well then the regular Fury looks enticing. I think I may just wait until they come out with an 8GB version then. I've got the GPU itch to buy, though, so it's hard not to pull the trigger on these.


----------



## Maticb

Quote:


> Originally Posted by *battleaxe*
> 
> Darnit... well then the regular Fury look enticing. I think I may just wait until they come out with an 8GB version then. I've got the GPU itch to buy though, so hard not to pull the trigger on these.


There will be no 8GB version; it's a limitation of the current HBM. Higher VRAM will only be possible with HBM2 next year (or so).


----------



## ozyo

Quote:


> Originally Posted by *battleaxe*
> 
> Darnit... well then the regular Fury look enticing. I think I may just wait until they come out with an 8GB version then. I've got the GPU itch to buy though, so hard not to pull the trigger on these.


You're going to wait a long time.
4GB is the limit for HBM1.


----------



## battleaxe

Quote:


> Originally Posted by *Maticb*
> 
> There will be no 8GB version it's a limitation of the current HBM. Higher VRAM will only be possible on HBM2 next year (or so).


That's fine by me. I can wait a year.
Quote:


> Originally Posted by *ozyo*
> 
> you going to wait for long time
> 4gb is the limit for hbm 1


Ditto.

I'll just have to find something else to spend my money on. Maybe it's time to start paying down the mortgage after all. LOL


----------



## localh85

What OCs is everyone getting on their Fury X?

I am getting only like 4% on the GPU with +50% power via OverDrive.

Feels bad man.


----------



## Kana-Maru

I haven't really overclocked my Fury X at all. It has been running everything marvelously. I'm getting together a ton of benchmarks. I guess I'll mess around with it and check back later.


----------



## p4inkill3r

Quote:


> Originally Posted by *localh85*
> 
> What OCs is everyone getting on their Fury X?
> 
> I am getting like only a 4% on the GPU with 50% power via OverDrive.
> 
> Feels bad man.


I'm rock solid at 1115/520.


----------



## Kana-Maru

Alright I ran some OC benchmarks. I'm stable with:

*Power Limit:* 0% - Untouched -
*Core Clock:* 1100Mhz
*Memory Clock:* 570Mhz

In Fire Strike, temps never went above 43°C.


----------



## Agent Smith1984

What are people getting in Heaven 4.0 @ 1080P max settings, with these cards?


----------



## p4inkill3r




----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*


That was fast! Thanks man

I can get real close to the stock score with my 390 @ 1200/1700. I get 1608 (63.8 fps)...

That Fury scales well with such small clock increases!

Any pro owners have a heaven run?


----------



## Medusa666

Anyone who bought either the Asus Strix or the Sapphire Tri-X cooler versions?

Can you elaborate on your satisfaction with the product: the good things, the bad, and how is the power draw?


----------



## Alastair

Quote:


> Originally Posted by *Medusa666*
> 
> Anyone who bought either the Asus Strix or the Sapphire Tri-X cooler versions?
> 
> Can you elaborate on your satisfication with the product, good things, bad, how is the power draw?


I have been saving for two Sapphires. Imma get them soon.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Anyone who bought either the Asus Strix or the Sapphire Tri-X cooler versions?
> 
> Can you elaborate on your satisfication with the product, good things, bad, how is the power draw?


Strix here in 3x CrossfireX


Satisfaction? Very satisfied. I had the 980 Ti AMP Omega in SLI. Single card, the 980 Ti wins. Multi-GPU, the Fury and especially the Fury X win at 1440p and above (most games). $100 cheaper per card at the time I got mine.

http://www.pcworld.com/article/2947547/components-graphics/amd-radeon-fury-crossfire-review-2-fast-2-furious.html

http://www.tweaktown.com/articles/7226/amd-radeon-r9-fury-video-cards-crossfire/index8.html

http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/16/

If anyone can add on to these, or even correct me, that would be appreciated.

Pros on Strix:
Looks. I have a red & black theme.
2 slots instead of 2.5 on the Sapphire.
Custom PCB*
Quiet fans**
DVI port (that I don't even use)
Performance close to Fury X for $100 less. (CUs can be unlocked)
Not heavy (the 980 Ti AMP Omega felt like I was holding a newborn)

Cons:
Performance/Noise/OC potential*** is 2nd place to Sapphire.

AVAILABILITY. My first 2 cards are right next to each other in serial number. After 2-3 weeks of waiting for availability, my 3rd card is +28 on the serial number. So between the time I got 2 cards and the 3rd, they only had 28 cards?

Mark ups on price (Granted I didn't experience it)

Single bios

-=-=-=-=-=-=-

Explanation of *, **, ***

*Even though it has a custom PCB, the BIOS is locked to run at 1.7v, compared to 1.8v (normal BIOS) and 1.212v (unlocked BIOS)

**This card gets up to 84°C in XFire. If you look at the pic, the middle card hovers between 75-84°C in heavy gaming. The fan profile is very mellow.

***OC potential is "better" on the Sapphire out of the box since there is an unlocked BIOS. BUT, if you want to go LN2, get the Strix.

http://forum.hwbot.org/showthread.php?t=142320

Bottom line:

Performance out of the box: Sapphire

Size out of the box: Asus since it's 2 slots

Color & looks: Depends on your taste

Noise: Sapphire. The Strix fans are annoying when they run @ 70-80%, and my middle card ends up running at 70-80% to cool it. I had to set a custom fan profile.

Basically, out of the box, go Sapphire. For LN2, go Strix.


----------



## rx7racer

Quote:


> Originally Posted by *Medusa666*
> 
> Anyone who bought either the Asus Strix or the Sapphire Tri-X cooler versions?
> 
> Can you elaborate on your satisfication with the product, good things, bad, how is the power draw?


Can't say I've measured power draw, as that's a moot point in my book. But as far as being satisfied, I can't argue against it, that's for sure. The Sapphire Tri-X is what I got, and the cooler is astounding; absolutely no reason to even contemplate water cooling, to be honest, after my experience.

The good: it's nice and stable, and most are unlocking 4 CUs, so it only lags about 7% behind the full Fiji in the Fury X. I honestly can't list anything bad about it so far, as it's been just fine. The only thing I will say is I don't see a point going to it if you have your 295X2, though, as it's not even beating most of my own benches I did with my 290X OC'ed; I figure you will be greatly disappointed in that department.

Fun new toy to play with, but nothing special on the performance side, which makes its price kinda janky for performance/$$.


----------



## Medusa666

Quote:


> Originally Posted by *fjordiales*
> 
> Strix here in 3x CrossfireX
> 
> 
> Satisfaction? Very Satisfied. I had the 980 Ti AMP Omega in SLI. Single card, 980 Ti wins. Multi GPU, Fury & especially Fury X wins in 1440p & above(Most games). $100 cheaper per card at the time I got mine.
> 
> http://www.pcworld.com/article/2947547/components-graphics/amd-radeon-fury-crossfire-review-2-fast-2-furious.html
> 
> http://www.tweaktown.com/articles/7226/amd-radeon-r9-fury-video-cards-crossfire/index8.html
> 
> http://www.eteknix.com/amd-r9-fury-x-4gb-graphics-card-crossfire-review/16/
> 
> If anyone can add on to these, or even correct me, that would be appreciated.
> 
> Pros on Strix:
> Looks. I have red & black theme.
> 2 slots instead of 2.5 on sapphire.
> Custom PCB*
> Quiet Fans**
> DVI port (that I don't even use)
> Performance close to fury X for $100 less. (CU can be unlocked)
> Not heavy (980Ti AMP omega felt like I was holding a newborn)
> 
> Cons:
> Performance/Noise/OC potential*** is 2nd place to Sapphire.
> 
> AVAILABILITY. My 1st 2 cards are right next to each other when it comes to serial number. 2-3weeks later of waiting for availability, my 3rd card is +28 on the serial number. So between the time I got 2 cards and the 3rd they only had 28 cards?
> 
> Mark ups on price (Granted I didn't experience it)
> 
> Single bios
> 
> -=-=-=-=-=-=-
> 
> Explanation of *, **, ***
> 
> *Even though it has a custom PCB, bios is locked to run at 1.7v compared to 1.8v(normal bios) and 1.212v(unlocked bios)
> 
> **This card gets up to 84 deg in Xfire. If you look at the pic, the middle card hovers in between 75-84 in heavy gaming. Fan profile is very mellow.
> 
> ***OC potential is "better" on the Sapphire out of the box since there is an unlocked bios. BUT, if you want to LN2, get Strix.
> 
> http://forum.hwbot.org/showthread.php?t=142320
> 
> Bottom line:
> 
> Performance out of the box: Sapphire
> 
> Size out of the box: Asus since it's 2 slots
> 
> Color & looks: Depends on your taste
> 
> Noise: Sapphire. Strix fans that run @ 70-80% are annoying. My middle card ends up running at 70-80% to cool it. I had to set a custom fan profile.
> 
> Basically, out of the box, go sapphire. LN2, go Strix.


Thank you for your short review of the Asus R9 Fury Strix.

I like the idea of having a custom full length PCB, ASUS always use high quality components on their cards, and they really hit home with their Strix cooler. The backplate and the looks in general are just superb.

I do not really believe that the differences between these two cards can be called pros or cons; they're more like aspects of two different kinds of beasts.

Would you say that the fan during 1-2 hour load is
a) Silent
b) Whisper
c) Audible
d) Loud

?

Impressive system btw : )
Quote:


> Originally Posted by *rx7racer*
> 
> Can't say I've measured power draw as that's a moot point in my book. But as far as being satisfied I can't argue against it that's for sure. Sapphire Tri-X is what I got and the cooler is astounding, absolutely no reason to even contemplate water cooling to be honest after my experience.
> 
> The good is it's nice and stable and most are unlocking 4 CU's so it only lags about 7% behind Full Fiji in the Fury X. And honestly can't list anything bad about it so far as it's been just fine. Only thing I will say is I don't see a point going to it if you have your 295X2 though as it's not even beating most of my own benches I did with my 290X oc'ed, figure you will be disappointed in that dept. greatly.
> 
> Fun new toy to play with but nothing special on performance side which makes it's price kinda janky for performance/$$.


When you say that the cooler is astounding, what exactly do you mean by that?

I love the 295X2 cards, but what I do not like is that in some titles textures flicker, CrossFire is not supported, etc., and I do not have the time or the patience to fix stuff like that. I'm just going to go with a powerful single card in the future, hence the Fury X or the Fury, leaning towards the Fury.


----------



## Alastair

I might also add that if you want to water cool, go Sapphire. I contacted EK about Strix blocks, and they let me know that they have no plans at this point to develop a block for it.


----------



## rv8000

AFAIK the Strix is louder and hotter at the same fan settings than the Tri-X. Whenever I play games I currently set the fan to 45% (waiting on the new TriXX and AB to mess with curves), and my Tri-X sits around 61°C in a 27-30°C room, with peak temps hitting 66°C at that fan speed. The card cannot be heard above my PSU fan and CPU fan; the Tri-X Fury is actually the quietest card I've ever owned, and although it does put out a lot of heat, the stock cooler is amazing.

*Small update for people waiting on TriXX and voltage control: there were apparently a few issues/bugs found with the newest beta from W1zzard, so he's working on some more fixes. I'll try to ask him when the newest beta gets sent to Sapphire for testing.


----------



## xer0h0ur

LOL, didn't he say there was nothing else he could change? Apparently there was.


----------



## Alastair

Quote:


> Originally Posted by *rv8000*
> 
> AFAIK The Strix is louder and hotter at the same fan settings on the Tri-X. Whenever ever I play games I set the fan to 45% currently (waiting on new TrixX and AB to mess with curves) and my Tri-X sits around 61c in a 27-30c idle room, peak temps hit 66c at that fan speed. The card cannot be heard above my PSU fan and CPU fan, the Tri-X Fury is actually the quietest card I've ever owned, and although it does put out a lot of heat the stock cooler is amazing.
> 
> *Small update to people waiting for TrixX and voltage control. There were apparently a few issues/bugs found with the newest beta from W1zzard, so he's working on some more fixes, I'll try to ask him when the newest beta gets sent to Sapphire for testing.


I think the Tri-X cools better because the air can pass through that 1/3 of the cooler where there is no PCB. The air has a clear path, so there is less turbulence and interference from air getting deflected back into the fans by the PCB.


----------



## rx7racer

Quote:


> Originally Posted by *Medusa666*
> 
> Thank you for your short review of the Asus R9 Fury Strix.
> 
> I like the idea of having a custom full length PCB, ASUS always use high quality components on their cards, and they really hit home with their Strix cooler. The backplate and the looks in general are just superb.
> 
> I do not really believe that the differences between these two cards can be called pros or cons, more like aspects of two different kind of beasts.
> 
> Would you say that the fan during 1-2 hour load is
> a) Silent
> b) Whisper
> c) Audible
> d) Loud
> 
> ?
> 
> Impressive system btw : )
> When you say that the cooler is astounding, what exactly do you mean by that?
> 
> I love the 295X2 cards, but what I do not like is that in some titles textures flicker, CrossFire is not supported, etc., and I do not have the time or the patience to fix stuff like that. I'm just going to go with a powerful single card in the future, hence the Fury X or the Fury, leaning towards the Fury.


What I mean by that is the air cooler on the Sapphire cools better than my watercooling setup with a full-cover EK block hahaha

So yes, time to get rid of this 800D and my rads cause they obviously suck.


----------



## xer0h0ur

Or might be time to bake and flush your radiators. I wouldn't be surprised if they are gunked up.


----------



## fjordiales

There are a lot of you to multi-quote, but yes, my #1 choice would be the Sapphire, #2 the Strix. If the Tri-X were red & black, I would get it. Temps I get are usually 75/80/65 with what I have. I have all my case/CPU fans @ 60% at 30C, then straight to 100% @ 60C, so my cooling/noise is balanced. Thankfully I have 5 be quiet! Silent Wings 2 PWM as case fans and a TY-147A on the CPU heatsink.

Also, the fans on GPU2 rub on GPU3's backplate. Had to use these to stop the rubbing.
http://www.amazon.com/gp/product/B001QCXXWS?psc=1&redirect=true&ref_=oh_aui_detailpage_o05_s00



@Medusa

"Would you say that the fan during 1-2 hour load is
a) Silent
b) Whisper
c) Audible
d) Loud"

Top = Whisper
Bottom = Silent
Middle = AUDIBLE

These are at default settings. For my custom fans, I have them this way.


All fans are connected to this, so when the CPU gets hot, all fans ramp up.

http://www.amazon.com/Swiftech-8-Way-Splitter-Power-Connector/dp/B00IF6R4C8/ref=sr_1_4?ie=UTF8&qid=1439663848&sr=8-4&keywords=fan+hub
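The curve described above (hold 60% from 30C, then straight to 100% at 60C) can be sketched as a tiny step function. This is purely illustrative; the function name and the idle duty below 30C are assumptions, not part of any vendor's fan-control software:

```python
# Illustrative sketch of the fan policy described above, not any vendor's
# actual software: hold 60% duty from 30C, jump straight to 100% at 60C.
# The temperatures and duty cycles are the poster's numbers; the function
# name and the idle duty below 30C are assumptions for the example.

def fan_duty(temp_c, idle_duty=30):
    """Return fan duty cycle (%) for a given temperature in degrees C."""
    if temp_c < 30:
        return idle_duty  # below the curve's first point: assumed idle speed
    if temp_c < 60:
        return 60         # flat 60% across the normal gaming range
    return 100            # 60C and up: full speed

# Example: a 45C load reading stays at the quiet 60% step.
print(fan_duty(45))  # 60
```

The appeal of a flat step like this over a sloped curve is that the fans hold one steady (and therefore less noticeable) speed across the whole gaming range instead of constantly hunting up and down.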


----------



## Scorpion49

Has anyone RMA'd a Tri-X with Sapphire for coil whine? It's getting to the point where I can't take it any more; it's been getting louder for days now.


----------



## Ceadderman

Stress the hades out of it for a 12-24hr cycle and see if that clears it up.

It should if you run a capable bench on it.









~Ceadder


----------



## Alastair

I hope that my Sapphires do not have major coil whine.


----------



## Scorpion49

Quote:


> Originally Posted by *Ceadderman*
> 
> Stress the hades out of it for a 12-24hr cycle and see if that clears it up.
> 
> It should if you run a capable bench on it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


It's been almost a month and the whine is getting worse, way worse. This is not like normal coil whine I've heard before; it is extremely loud. I've gamed for dozens of hours and left benches that cause the whine running overnight several times, and it keeps getting worse instead of better.


----------



## Medusa666

Quote:


> Originally Posted by *Scorpion49*
> 
> It's been almost a month and the whine is getting worse, way worse. This is not like normal coil whine I've heard before; it is extremely loud. I've gamed for dozens of hours and left benches that cause the whine running overnight several times, and it keeps getting worse instead of better.


Do you mind recording it and putting it on YouTube?

I'm looking at this card as an option, but now you've scared me : )


----------



## Ceadderman

Do you have another PSU around? It seems like that could possibly be the problem; some people have reported that their coil whine went away with a PSU change. I can't say for sure, since I've never had a persistent case myself, but I did get coil whine with my HX850, and that went away after I folded 24/7 for a week or so.









~Ceadder


----------



## Thoth420

Also, are you using an AVR or something to rule out dirty power from your wall? That was my issue at my folks' house, and at an older apartment building I was staying in for a while as well. Using my UPS suppressed it to a frequency range that my ear cannot detect. It never really goes away, if we are being technical.

I also found Super Flower OEM power supplies to be the best at eliminating coil whine on a GPU caused by the PSU. However, the Super Flower units exhibit the whine on their end instead, so the sound then comes from the PSU. It is so low you have to have your ear right up against the unit. Again, a case of how it never goes away completely; it just changes frequency.

Also you could try to locate the source and try the old epoxy trick.

In my experience, if it gets worse over time it is either the PSU or the power from the wall, not necessarily the card... but ya never know...


----------



## Scorpion49

It's been in 6 different machines in two different towns; my own rig is on a nice UPS, but plugged into that or not, it makes no difference. The card is just whiny, that's all. I'll try to record it again, as it has increased in volume significantly since the last time I recorded it.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> It's been in 6 different machines in two different towns; my own rig is on a nice UPS, but plugged into that or not, it makes no difference. The card is just whiny, that's all. I'll try to record it again, as it has increased in volume significantly since the last time I recorded it.


Just FYI, back when I had my 7950 from Sapphire I had some serious issues with whine, and I was able to get it RMA'd specifically for that. While the newer card was better, there were still some issues, and I was able to RMA a second time with Althon Micro; I even got upgraded to a 7970. As long as you're clear and patient, Althon Micro is a great support group to work with.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> Just FYI, back when I had my 7950 from Sapphire I had some serious issues with whine, and I was able to get it RMA'd specifically for that. While the newer card was better, there were still some issues, and I was able to RMA a second time with Althon Micro; I even got upgraded to a 7970. As long as you're clear and patient, Althon Micro is a great support group to work with.


Sweet, I might give it a shot. I was reading a lot of horror stories about Sapphire RMA, so I was hesitant to try. I'll just use my R9 380 Nitro in the meantime. I love the Sapphire coolers; they are so good.


----------



## Alastair

What's the epoxy method for coil whine though? What is it? How does one do it?


----------



## Thoth420

Quote:


> Originally Posted by *Scorpion49*
> 
> Its been in 6 different machines in two different towns, my own rig is on a nice UPS but plugged into that or not it makes no difference. The card is just whiny, thats all. I'll try to record it again as it has increased in volume significantly since the last time I recorded it.


Yep from my experience it sounds like that is the case.
Quote:


> Originally Posted by *Alastair*
> 
> What's the epoxy method for coil whine though? What is it? How does one do it?


http://lifehacker.com/this-video-explains-what-coil-whine-is-and-how-to-avoid-1669522880

Take a look at this. I have never tried it, but I have seen numerous threads in the past on various forums where users have fixed the issue with some modding. The stuff used varies, so it's definitely worth doing a bit of research and maybe speaking to someone with actual experience. I won't vouch for it because I am super paranoid about modding my stuff, being prone to breaking things just by standing near them. Just tossing it out there as a potential "ghetto bandaid".


----------



## rv8000

Quote:


> Originally Posted by *Alastair*
> 
> What's the epoxy method for coil whine though? What is it? How does one do it?


It can only be done with open chokes. Almost all GPUs have a closed choke housing now, so you would have to break open the housing and void your warranty, and you would likely risk damaging the chokes in the process.


----------



## looncraz

Quote:


> Originally Posted by *Alastair*
> 
> What's the epoxy method for coil whine though? What is it? How does one do it?


I've had really good luck with just "burn-in."

My 7870 XT had the worst coil whine I've ever heard, but I ignored it and just played my games with headphones on. I didn't think it would go away, but it disappeared completely. My current R9 290, luckily, didn't have any. My X850 XT did, and it also went away after some hours of intense gaming.

Before doing anything crazy, just give that a go.

From there, using ferrite half-moons near the coils may make a difference as coil whine can easily be magnetic fields from neighboring components causing a resonance.


----------



## Clockster

Some sad news...

I'll be leaving the Fury X side of life tomorrow.
Moving over to a Gigabyte GTX 980 Ti G1 Gaming. I adore the silence of the Fury card and I love the temps, but the 980 Ti is simply a much faster card.


----------



## Medusa666

Quote:


> Originally Posted by *Clockster*
> 
> Some sad news...
> 
> I'll be leaving the Fury X side of life tomorrow.
> Moving over to a Gigabyte GTX 980 Ti G1 Gaming. I adore the silence of the Fury card and I love the temps, but the 980 Ti is simply a much faster card.


Define "much faster"? I'm curious because I'm just about to buy either a Fury X or a Sapphire Fury with the Tri-X cooler.


----------



## ozyo

Just wondering if I'm the only one who gets a 39215686 RPM fan reading?

Anyone know how I can fix it?
It's a Fury X btw


----------



## rv8000

Quote:


> Originally Posted by *ozyo*
> 
> Just wondering if I'm the only one who gets a 39215686 RPM fan reading?
> 
> Anyone know how I can fix it?
> It's a Fury X btw


Fiji isn't fully supported, you'll have to wait for a newer version of GPU-Z and Afterburner, both have the same weird quirks reading fan speeds.
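For what it's worth, an absurd figure like 39,215,686 RPM is the classic signature of a monitoring tool converting a raw, not-yet-supported tach reading without a sanity check. A purely illustrative sketch (this is not GPU-Z's or Afterburner's actual code; the function names, the microsecond units, and the 2-pulses-per-revolution figure are all assumptions):

```python
# Purely illustrative: how an unvalidated tach reading can become an absurd
# RPM figure. Fan tachs typically emit 2 pulses per revolution, and RPM is
# derived from the measured pulse period. If the sensor interface returns a
# bogus near-zero period before the GPU is fully supported, the division
# explodes. All names and constants here are assumptions for the example.

PULSES_PER_REV = 2

def rpm_from_period(period_us):
    """Convert one tach pulse period (microseconds) to RPM, unguarded."""
    pulses_per_min = 60_000_000 / period_us
    return pulses_per_min / PULSES_PER_REV

def rpm_from_period_safe(period_us, max_rpm=10_000):
    """Same conversion, but reject readings outside a plausible fan range."""
    if period_us <= 0:
        return None
    rpm = rpm_from_period(period_us)
    return rpm if rpm <= max_rpm else None

print(rpm_from_period(10))           # bogus 10 us period -> 3,000,000 "RPM"
print(rpm_from_period_safe(10))      # None: implausible reading filtered out
print(rpm_from_period_safe(20_000))  # 1500.0, a sane fan/pump speed
```

Which is why the fix here really is just waiting for tool updates: once the monitoring software knows how to read (and validate) Fiji's sensors, the garbage numbers go away on their own.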


----------



## ozyo

Quote:


> Originally Posted by *rv8000*
> 
> Fiji isn't fully supported, you'll have to wait for a newer version of GPU-Z and Afterburner, both have the same weird quirks reading fan speeds.


I thought I got the golden typhoon


----------



## Kana-Maru

I've finally finished my Fury X review. If anyone is on the fence or if anyone is interested you can read it here:

http://www.overclock-and-game.com/hardware/computer-tech-reviews/40-amd-fury-x-review

Quote:


> Originally Posted by *Clockster*
> 
> Some sad news...
> 
> I'll be leaving the Fury X side of life tomorrow.
> Moving over to a Gigabyte GT980Ti G1 Gaming. I adore the silence of the fury card and I love the temps but the 980Ti is simply a much faster card.


Well of course it is. It's obvious that AMD was going after the non-Ti GTX 980. Nvidia just can't stand to let AMD have anything at all; they released the 980 Ti about a month before the Fury X's E3 reveal and launch date. With that said, both cards are good. I feel bad for the 980 purchasers; I see more people trying to upgrade from the 980 to the 980 Ti across the web. The 980 Ti does overclock better, but you put so much strain on the GPU and components, and let's not forget the heat and the power consumption; too much heat will cause downclocking anyway. Apparently OC heat and power consumption are no concern as long as it's Nvidia doing it. I do love Gigabyte and their G1 Gaming cards as well, but I think I'll do without the excessive heat and fan noise for a while.


----------



## Jflisk

Quote:


> Originally Posted by *Kana-Maru*
> 
> I've finally finished my Fury X review. If anyone is on the fence or if anyone is interested you can read it here:
> 
> http://www.overclock-and-game.com/hardware/computer-tech-reviews/40-amd-fury-x-review
> Well of course it is. It's obvious that AMD was going after the non-Ti GTX 980. Nvidia just can't stand to let AMD have anything at all; they released the 980 Ti about a month before the Fury X's E3 reveal and launch date. With that said, both cards are good. I feel bad for the 980 purchasers; I see more people trying to upgrade from the 980 to the 980 Ti across the web. The 980 Ti does overclock better, but you put so much strain on the GPU and components, and let's not forget the heat and the power consumption; too much heat will cause downclocking anyway. Apparently OC heat and power consumption are no concern as long as it's Nvidia doing it. I do love Gigabyte and their G1 Gaming cards as well, but I think I'll do without the excessive heat and fan noise for a while.


Nice thank you


----------



## Kana-Maru

Quote:


> Originally Posted by *Jflisk*
> 
> Nice thank you


No problem at all.


----------



## Ceadderman

Yeah, the Ti is a good card, but it's not all-out faster than the Fury X from the reviews I have seen. And the R9 390X closes the gap somewhat.

IMHO, Nvidia needed the Ti in order to remain on top, and even then that's due to OCing, not stock. Cannot wait for the 390s to get consistent clockability, and for the Fury X2 to launch.









~Ceadder


----------



## Jflisk

Fury X2? I might have to sell one of my Furies for that one.


----------



## fjordiales

Just an update: 3x CrossFireX on Fury has negative scaling (Witcher 3, some benchmarks). My observations are from Afterburner's OSD. Witcher 3 at 1440p ultra, no GimpWorks (I mean HairWorks): 2x GPU is 65-75 fps, 3x GPU is 45-55 fps. Feels like a waste for now, so I deactivated the middle card, since it's the hottest.


----------



## PontiacGTX

Quote:


> Originally Posted by *fjordiales*
> 
> Just an update: 3x CrossFireX on Fury has negative scaling (Witcher 3, some benchmarks). My observations are from Afterburner's OSD. Witcher 3 at 1440p ultra, no GimpWorks (I mean HairWorks): 2x GPU is 65-75 fps, 3x GPU is 45-55 fps. Feels like a waste for now, so I deactivated the middle card, since it's the hottest.


That mobo doesn't have a PLX chip... and you're running XDMA at x4...


----------



## jase78

Newegg codes: if 0u812 is no longer working, try bndovr or psngas


----------



## Ceadderman

I hope you're not serious.









~Ceadder


----------



## littlestereo

After a bit of tweaking I finally managed to hit the top Firestrike and Firestrike Ultra scores for the 4790k + Fury X (2 in CF) combo category (7th overall for any cpu +2 Fury X's).

Firestrike: *21734* overall *33960* graphics.
Firestrike Ultra Score: *7750* overall and *8273* graphics.

http://www.3dmark.com/fs/5739163

http://www.3dmark.com/fs/5738978

It is a bit of a bummer that the top 10 results for Fury Xs in CF only reach the top 15% of 980 Tis in SLI for Firestrike Ultra. If the no-solder voltage mod scaled like the LN2 build, we could probably crack 9000 (Hall of Fame / top 100 overall) with +200 core and +200 mem and tie the 980 Tis in SLI. Heat really isn't an issue with these cards, especially under full waterblocks (still not hitting 40C under load, even at +95MHz).


----------



## Alastair

So my Sapphire Fury Tri-x cards are on their way. Should be here by the 1st of September!


----------



## Thoth420

Hey all, a couple of queries:

1. Is there an EK block for the Fury X that has a backplate? Or if not, is there a way to add one? If I'm losing my sexy shroud, I at least don't want to be staring at bare PCB.

2. Has anyone with a waterblocked Fury X experienced coil whine? Was it worse before the block, or better? Possibly no change? Also, if you have whine: what PSU model and age, and are you using an AVR or not?

3. I am new to custom loops and wonder how much radiator I need to cool a single Fury X and a Skylake i7. I will be OCing both for performance, not the max ceiling.

I should note the chassis is a Corsair Air 540 or a Fractal Define R5... leaning toward the Air, but I do prefer the system to be as silent as possible. I would also consider any quiet ATX case with a window and good dust filters.


----------



## littlestereo

1. All the Fury X EKWB blocks have optional backplates you can buy in a variety of colors.
2. No coil whine here with either a Corsair GS800 or an EVGA P2 1200.
3. The Corsair 540 would be a much better choice for a custom loop; the two 280mm radiator mounting points on the front and top are perfect for keeping the whole setup around 40-50C under load.
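On question 3, a common community rule of thumb (not a spec) is roughly one 120mm radiator section per 100 W of heat load, plus a spare section for quiet fans and overclocking headroom. A rough sketch, where the wattage figures are assumed ballpark numbers for an overclocked Fury X and Skylake i7, not measurements:

```python
import math

# Back-of-envelope radiator sizing using the common "~one 120mm section per
# 100 W, plus one spare section for headroom" rule of thumb. The heat-load
# figures below are rough assumptions, not measured values.

def radiator_120mm_sections(total_watts, watts_per_section=100.0, headroom=1):
    """Recommended number of 120mm radiator sections for a given heat load."""
    return math.ceil(total_watts / watts_per_section) + headroom

fury_x_watts = 300    # assumed heat load for an overclocked Fury X
skylake_watts = 120   # assumed heat load for an overclocked Skylake i7

print(radiator_120mm_sections(fury_x_watts + skylake_watts))  # 6
```

Six 120mm-equivalent sections is roughly what a 280mm front plus 280mm top pairing provides, so this crude estimate lands in the same ballpark as the case recommendation above.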


----------



## Medusa666

Quote:


> Originally Posted by *Alastair*
> 
> So my Sapphire Fury Tri-x cards are on their way. Should be here by the 1st of September!


Congratulations, I ordered my Fury X this morning and will see how I like it, my system is 100% silent so if it is too loud I will get the Sapphire.

Let us know what you think of it.


----------



## p4inkill3r

You guys that may be on the fence about the Fury/Fury X might be interested in seeing some performance numbers under DX12: http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/0_100


----------



## rv8000

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all, a couple of queries:
> 
> 1. Is there an EK block for the Fury X that has a backplate? Or if not, is there a way to add one? If I'm losing my sexy shroud, I at least don't want to be staring at bare PCB.
> 
> 2. Has anyone with a waterblocked Fury X experienced coil whine? Was it worse before the block, or better? Possibly no change? Also, if you have whine: what PSU model and age, and are you using an AVR or not?
> 
> 3. I am new to custom loops and wonder how much radiator I need to cool a single Fury X and a Skylake i7. I will be OCing both for performance, not the max ceiling.
> 
> I should note the chassis is a Corsair Air 540 or a Fractal Define R5... leaning toward the Air, but I do prefer the system to be as silent as possible. I would also consider any quiet ATX case with a window and good dust filters.


Do not buy the Air 540; it has no top or bottom air filters. It's also a flimsy piece of garbage and totally overpriced: while the concept was great, the execution and quality are extremely poor. The R5 or Define S are much, much better cases for less money. Just some things to point out about the Air 540...

- Cheap plastic
- The case would creak randomly (likely due to heat)
- Flimsy top; poor frame support for rads and fans will cause tons of vibrations, with or without rubber grommets
- Lack of dust filters for the PSU intake, top exhaust/intake, and bottom
- The bottom of the case has huge openings where the 3.5" slide bays are, letting TONS of dust into your case with no simple way to prevent it
- The side window is made of cheap plastic and will scratch easily; mine came with the window bowing outwards pretty badly
- I cannot say this enough: it is OVERPRICED. I don't care how good this case looks; it is very low quality overall and has tons of glaring issues for anyone wanting a silent case.


----------



## richie_2010

Does anyone with a Fury X card fancy trying one of my LED backplate kits? I have sold a couple, but the buyers have not been able to send any pictures of the finished look to update my thread.









I only need one person, and that person needs to be able to send pics back to me of it installed and advise of any issues they come across so I can fix them. The product will be yours to keep, free of charge.









Shoot me a PM if you're interested, or make a quick post here for me to see.


----------



## Jflisk

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all, a couple of queries:
> 
> 1. Is there an EK block for the Fury X that has a backplate? Or if not, is there a way to add one? If I'm losing my sexy shroud, I at least don't want to be staring at bare PCB.
> 
> 2. Has anyone with a waterblocked Fury X experienced coil whine? Was it worse before the block, or better? Possibly no change? Also, if you have whine: what PSU model and age, and are you using an AVR or not?
> 
> 3. I am new to custom loops and wonder how much radiator I need to cool a single Fury X and a Skylake i7. I will be OCing both for performance, not the max ceiling.
> 
> I should note the chassis is a Corsair Air 540 or a Fractal Define R5... leaning toward the Air, but I do prefer the system to be as silent as possible. I would also consider any quiet ATX case with a window and good dust filters.


You could do the Arc XL, like the build in my avatar, instead of the R5. The thing is built like a tank. Also, Fractal Design is excellent about parts: if you prove you own the case and break anything, they will ship you a replacement part. I somehow managed to break the power button on my case, and they replaced it within the week.

280mm/240mm rads in the top, and a 240mm in the front.


----------



## criminal

Quote:


> Originally Posted by *rv8000*
> 
> Do not buy the Air 540; it has no top or bottom air filters. It's also a flimsy piece of garbage and totally overpriced: while the concept was great, the execution and quality are extremely poor. The R5 or Define S are much, much better cases for less money. Just some things to point out about the Air 540...
> 
> - Cheap plastic
> - The case would creak randomly (likely due to heat)
> - Flimsy top; poor frame support for rads and fans will cause tons of vibrations, with or without rubber grommets
> - Lack of dust filters for the PSU intake, top exhaust/intake, and bottom
> - The bottom of the case has huge openings where the 3.5" slide bays are, letting TONS of dust into your case with no simple way to prevent it
> - The side window is made of cheap plastic and will scratch easily; mine came with the window bowing outwards pretty badly
> - I cannot say this enough: it is OVERPRICED. I don't care how good this case looks; it is very low quality overall and has tons of glaring issues for anyone wanting a silent case.


You can't be any more right about the Air 540. JUNK!


----------



## p4inkill3r

Quote:


> Originally Posted by *rv8000*
> 
> Do not buy the Air 540; it has no top or bottom air filters. It's also a flimsy piece of garbage and totally overpriced: while the concept was great, the execution and quality are extremely poor. The R5 or Define S are much, much better cases for less money. Just some things to point out about the Air 540...
> 
> - Cheap plastic
> - The case would creak randomly (likely due to heat)
> - Flimsy top; poor frame support for rads and fans will cause tons of vibrations, with or without rubber grommets
> - Lack of dust filters for the PSU intake, top exhaust/intake, and bottom
> - The bottom of the case has huge openings where the 3.5" slide bays are, letting TONS of dust into your case with no simple way to prevent it
> - The side window is made of cheap plastic and will scratch easily; mine came with the window bowing outwards pretty badly
> - I cannot say this enough: it is OVERPRICED. I don't care how good this case looks; it is very low quality overall and has tons of glaring issues for anyone wanting a silent case.


I'll agree with the dust filter and side window complaints, but otherwise I've had little issue with mine. I bought it for right around $100 when it released, and I think it has held up well.

Seriously though, the dust issues are pretty bad.


----------



## Shatun-Bear

Quote:


> Originally Posted by *rv8000*
> 
> Do not buy the Air 540; it has no top or bottom air filters. It's also a flimsy piece of garbage and totally overpriced: while the concept was great, the execution and quality are extremely poor. The R5 or Define S are much, much better cases for less money. Just some things to point out about the Air 540...
> 
> - Cheap plastic
> - The case would creak randomly (likely due to heat)
> - Flimsy top; poor frame support for rads and fans will cause tons of vibrations, with or without rubber grommets
> - Lack of dust filters for the PSU intake, top exhaust/intake, and bottom
> - The bottom of the case has huge openings where the 3.5" slide bays are, letting TONS of dust into your case with no simple way to prevent it
> - The side window is made of cheap plastic and will scratch easily; mine came with the window bowing outwards pretty badly
> - I cannot say this enough: it is OVERPRICED. I don't care how good this case looks; it is very low quality overall and has tons of glaring issues for anyone wanting a silent case.


Opinions and all that but I think you are wide of the mark. The build quality is good, and the case is far from flimsy, I think it's really solid. Yes, the lack of top filters is an irritation but I just bought 2x 140mm ultra-fine fan filters for £7 a pop and that fixed that issue.

The holes in the bottom are not an issue unless you plonk your PC on a dirty carpet. There is hardly any dust in my case with the 2x 140mm fan filters on the top. I'm looking in there right now, in fact: no dust









As for the side window, mine isn't bowed or cheap-looking, and I dunno how you'd scratch it unless you're cleaning it with a scourer or something...

But I do agree with you that it is overpriced. Then again, so is the R5, which is a bit of a hot box in terms of airflow; airflow is one of the perks of the Air 540.


----------



## rv8000

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Opinions and all that but I think you are wide of the mark. *The build quality is good*, and the case is far from flimsy, I think it's really solid. Yes, the lack of top filters is an irritation but I just bought 2x 140mm ultra-fine fan filters for £7 a pop and that fixed that issue.
> 
> The holes in the bottom are not an issue unless you plonk your PC on a dirty carpet. There is hardly any dust in my case with the 2x 140mm fan filters on the top. I'm looking in there right now, in fact: no dust
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the side window, mine isn't bowed or cheap looking, and I dunno how you scratch it, unless you're cleaning it with a scourer or something...
> 
> But I do agree with you that it is overpriced. Then again, so is the R5, which is a bit of a hot box in terms of airflow; airflow is one of the perks of the Air 540.


It isn't; there's nothing to argue. If you've owned some of the older high-end Antec cases, Lian Li, or any of the good Fractal cases, you'll know that from a material and quality standpoint they put the Air 540 to absolute shame. Corsair build quality across nearly all their products is extremely lacking and cheap. I really don't want to sound like a snob, but having owned several competing products across their line (fans, cases, PSUs, coolers), there is always a higher-quality, better-performing option. It's a shame too: normally when you buy cheap you get cheap quality, but with Corsair they charge you an arm and a leg and give you bottom-of-the-barrel materials and build quality for the majority of their products.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rv8000*
> 
> It isn't; there's nothing to argue. If you've owned some of the older high-end Antec cases, Lian Li, or any of the good Fractal cases, you'll know that from a material and quality standpoint they put the Air 540 to absolute shame. Corsair build quality across nearly all their products is extremely lacking and cheap. I really don't want to sound like a snob, but having owned several competing products across their line (fans, cases, PSUs, coolers), there is always a higher-quality, better-performing option. It's a shame too: normally when you buy cheap you get cheap quality, but with Corsair they charge you an arm and a leg and give you bottom-of-the-barrel materials and build quality for the majority of their products.


Wow, and to think, I was this close to ordering the Air 540 before I decided to go even cheaper and get the S340.

I ended up seeing an Air 540 in person some weeks later and was so glad I didn't get it..... it was a bit wider than I had expected, and it did indeed feel a bit cheap (even for me).

The build quality per dollar of the little NZXT S340 I bought is bar none. Besides the snap-on plastic front, it is of no lower quality whatsoever than my full-tower Lian Li 201B.

Best $70 I ever spent on anything PC related, for sure....


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Wow, and to think, I was this close to ordering the Air 540 before I decided to go even cheaper and get the S340.
> 
> I ended up seeing an Air 540 in person some weeks later and was so glad I didn't get it..... it was a bit wider than I had expected, and it did indeed feel a bit cheap (even for me).
> 
> The build quality per dollar of the little NZXT S340 I bought is bar none. Besides the snap-on plastic front, it is of no lower quality whatsoever than my full-tower Lian Li 201B.
> 
> Best $70 I ever spent on anything PC related, for sure....


S340 and Define S are easily two of the most versatile, well built, and affordable cases on the market. Fractal and NZXT seem to have nailed most everything with these cases.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rv8000*
> 
> S340 and Define S are easily two of the most versatile, well built, and affordable cases on the market. Fractal and NZXT seem to have nailed most everything with these cases.


Couldn't agree more!


----------



## fjordiales

Quote:


> Originally Posted by *rv8000*
> 
> It isn't; there's nothing to argue. If you've owned some of the older high-end Antec cases, Lian Li, or any of the good Fractal cases, you'll know that from a material and quality standpoint they put the Air 540 to absolute shame. Corsair build quality across nearly all their products is extremely lacking and cheap. I really don't want to sound like a snob, but having owned several competing products across their line (fans, cases, PSUs, coolers), there is always a higher-quality, better-performing option. It's a shame too: normally when you buy cheap you get cheap quality, but with Corsair they charge you an arm and a leg and give you bottom-of-the-barrel materials and build quality for the majority of their products.




The case is OVERPRICED for what you get. Mine is full of scratches (my fault for being a brute), but they're being masked by the blinding light. Lol. I'm happy that it's able to hold 3 R9 Furys, but as far as price/function/looks, I don't think it's worth more than $100, and for $100 there are other options. Also, the white looks better; my wife has it. It's like Stormtrooper white.


----------



## devilhead

Tested a couple of Fury Xs; it looks like the memory on all of them can reach 600MHz








It would be nice to have some extra voltage on the core...








Here's a Firestrike Ultra score: http://www.3dmark.com/fs/5744659


----------



## Thoth420

Quote:


> Originally Posted by *littlestereo*
> 
> 1. All the Fury X EKWB blocks have optional backplates you can buy for them in a variety of colors
> 2. No Coil whine with a corsair GS800 or EVGA P2 1200
> 3. The Corsair 540 would be a much better choice for a custom loop and the two 280mm radiator mounting points on the front and top are perfect for keeping the whole setup around 40-50c under load


Thank you so much sir!

Also thank you everyone with the feedback on the 540. I guess I will take a look at some other options.

I also should note I already have an R5 sitting at home. I am now eyeing the NZXT H440, but wonder if a cylinder res will be visible and fit... guessing it will be fine because the Fury X is so short.

I am doing the rig, but the watercooling is being done by Maingear, as I have no experience doing a custom loop and am prone to breaking things. Still deciding between acrylic or PVC... either way, I want a cylinder res, not a bay-style one.


----------



## Ceadderman

Depending on your system, go with EVGA. Corsair PSUs combined with ASUS X99 boards have crushed some hopes and dreams, mainly due to the former changing their rail system without telling their customers. I'm pretty sure it's been addressed by both companies, but I personally wouldn't take the chance.

I'm on my phone atm so I can't see the rig in question. Just something thought provoking.









~Ceadder


----------



## Alastair

So, just two days after my Sapphire Fury was ordered, the price on Amazon went down to $562!









I feel ripped off.


----------



## Clockster

Decided to stick with my Fury X; seems like these cards will be great when devs finally make use of DX12.


----------



## Gumbi

Quote:


> Originally Posted by *Alastair*
> 
> So just two days after my Sapphire Fury was ordered. The price on Amazon went down to 562!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel ripped off.


Email them asking for the difference to be reimbursed.

Not sure what your rights are in Murica (generally a lot weaker than here in the EU), but I ordered a monitor for 210 pounds recently, and the day I received it the price had dropped to 165. I sent an email and was reimbursed the difference without any hassle... 60 euro back in my account, thank you very much.

Amazon customer service are awesome here.


----------



## battleaxe

Quote:


> Originally Posted by *Clockster*
> 
> Decided to stick with my Fury X, seems like these cards will be great when devs finally make use of DX12


^^ This










Just waiting for my bank account to fill up a bit more...


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> So just two days after my Sapphire Fury was ordered. The price on Amazon went down to 562!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel ripped off.


Contact them and see if they will price match for you.


----------



## Alastair

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So just two days after my Sapphire Fury was ordered. The price on Amazon went down to 562!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I feel ripped off.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Contact them see if they will price match for you.
Click to expand...

I did so, and I worded it nicely so as to not sound angry or anything. I know you're often bound to get better service when you don't sound like you're angry. Let's see what happens.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> Do not buy the Air 540, it has no top or bottom air filters. It's also a flimsy piece of garbage, it is totally overpriced, and while the concept was great the execution and quality are extremely poor. The R5 or Define S are much much better cases for less money. Just some things to point out about the Air 540...
> 
> - Cheap plastic
> - Case would creak randomly (likely due to heat)
> - Flimsy top; poor frame support for rads and fans will cause tons of vibrations with or without rubber grommets
> - Lack of Dust filters for PSU intake, top exhaust/intake, and bottom
> - Bottom of the case has huge openings where the 3.5 slide bays are, leading to TONS of dust getting in your case and no simple way to prevent this.
> - Side window is made of cheap plastic and will scratch easily; mine came with the window bowing outwards pretty badly
> - I cannot say this enough, it is OVERPRICED; I don't care how good this case looks, it is very low quality overall and has tons of glaring issues for anyone wanting a silent case.


lol this is so true. I've had 3 different Air 540s by now and they were all trash. All of them had some damage to the side panels from shipping, all of them had rattles and creaks, and on two of them I had to remove the front filter because no matter how much I bent it, it would hit the front fans and cause an ungodly racket. Put a 7200rpm HDD in the bottom of that thing and you'll feel the vibrations in your desk so badly you'd think your desk was on a train.

Plus the window scratches if you look at it wrong.


----------



## acidr4in

I received my Sapphire Tri-X Fury (non-OC) yesterday. I love the performance boost and hate the coil whine. I can't stand it. I'm building myself a silent watercooling rig and this is just frustrating to deal with.

Now I'm hesitating to put my waterblock on it; I just might send it back.

I tried recording it: I put my ModMic on the card and recorded it during the Valley benchmark. I ALT+TAB to the desktop a few times so you can hear the difference. You hear it the most at the end, during the credits.

__
https://soundcloud.com/acidr4in%2Ffury-coil-whine

Last night I ran the CS:GO menu for about 7-8 hours, I didn't notice any difference.

Is there anything else I can do?

Is every Sapphire Card affected? Should I send it back and hope for a better sample?


----------



## Scorpion49

Quote:


> Originally Posted by *acidr4in*
> 
> I received my Sapphire Tri-X Fury (non OC) yesterday, love the performance boost and hate the coil whine . I can't stand it. I'm building myself a silent watercooling rig and this is just frustrating to deal with.
> 
> Now I'm hesitating to put my waterblock on it, I just might send it back.
> 
> I tried recording it, I put my ModMic on the card and recorded it during Valley benchmark, I do ALT+TAB to Desktop a few times so you can hear the difference. In the end you here It the most during the credits.
> 
> __
> https://soundcloud.com/acidr4in%2Ffury-coil-whine
> 
> Last night I ran the CS:GO menu for about 7-8 hours, I didn't notice any difference.
> 
> Is there anything else I can do?
> 
> Is every Sapphire Card affected? Should I send it back and hope for a better sample?


Looks like you're in the same boat as me. Nothing you can do about it, they are all reference PCB except for the Asus Strix so you'll have a decent shot at getting whine no matter how many times you send it back. AMD's reference board this time around is really bad for that.


----------



## acidr4in

Quote:


> Originally Posted by *Scorpion49*
> 
> Looks like you're in the same boat as me. Nothing you can do about it, they are all reference PCB except for the Asus Strix so you'll have a decent shot at getting whine no matter how many times you send it back. AMD's reference board this time around is really bad for that.


Well that sucks. I was thinking about the Asus Strix, but there is no waterblock from EK for that one.
AFAIK the board is the same for the Fury and Fury X, right? So not even that would change the fact that I could get one with whine, right?

With the Fury X, people were talking about the pump whining, but it might not be the pump after all if it's based on the same board and AMD's reference design is just ****.


----------



## Alastair

Well, Amazon said that they would refund me the 66 dollars from the price drop. I'll have to check with my bank to see if it does indeed come in. But yeah, things look good.

I heard the coil whine of that card. It actually sounds like a car revving!







Anyway, you would probably want to stick with the Sapphire card, because I emailed EK about blocks for the Strix Fury and they told me they had no plans of making one at this point. I hope it just wears away for you over time. I also hope that my cards come back relatively whine-free.


----------



## Medusa666

Getting my Fury X by mail tomorrow, got two Noctua NF-P12 lying around, what (if any) would the differences be using them in push/pull compared to going with the stock fan, lower noise? Temps?


----------



## Alastair

Quote:


> Originally Posted by *Medusa666*
> 
> Getting my Fury X by mail tomorrow, got two Noctua NF-P12 lying around, what (if any) would the differences be using them in push/pull compared to going with the stock fan, lower noise? Temps?


GT fans are probably better than Noctua fans anyways.


----------



## rv8000

Quote:


> Originally Posted by *acidr4in*
> 
> Well that sucks, I was thinking about the Asus Strix but there is no waterblock from EK for that one.
> Afaik the board is the same regardless of Fury and Fury X right, so not even that would change the fact that I could get one with whine, right?
> 
> With the Fury X people were talking about the pump whining but it might not be the pump after all if it's based on the same board and AMD's reference design is just ****.


One, the reference design isn't awful, and as always it's above Nvidia's reference cards in terms of quality. I'm pretty sure the 6-phase design was for both space savings and cost savings on AMD's part, unfortunately, but the reference 290X/290 was the same.

Secondly, there's a very distinct difference between pump whine and coil whine. 99% of the time, if you do have pump whine you will notice it, and the other downside is that it's a constant noise, not dependent on load/high FPS. Coil whine can diminish; pump whine will not.

To people buying a Fury/Fury X: make sure that your power supply is providing clean power, that the outlets in your house are grounded properly (I found out my ground was disconnected back when I was having issues with my 7950), and avoid placing your PC near any large electronic components with strong magnets/electrical interference. These won't necessarily stop it, but they can help at the very least.


----------



## Medusa666

Quote:


> Originally Posted by *Alastair*
> 
> GT fans are probably better than Noctua fans anyways.


I'm really sound sensitive, so I'm actually contemplating getting the card you got, just for the zero RPM mode.


----------



## acidr4in

Quote:


> Originally Posted by *rv8000*
> 
> One the reference design isn't awful, and as always its above nvidia reference cards in terms of quality. I'm pretty sure the 6 phase design was for both space savings and cost savings on AMD's part unfortunately, but reference 290x/290 was the same.
> 
> Secondly there's a very distinct difference between pump whine and coil whine, 99% of the time if you do have pump whine you will notice that and the other downside is that it's a constant noise and not dependent on load/high fps. Coil whine can diminish, pump whine will not.
> 
> To people buying Fury/Fury X make sure that your power supply is providing clean power, also that outlets in your house are grounded properly (found out my ground was disconnected back when I was having issues with my 7950), and avoid placing your pc near any large electronic components with strong magnets/electrical interference. These won't necessarily stop it, but it can help at the very least.


Sorry, but how can you call a PCB design not awful if there is very noticeable coil whine on a lot of cards? AMD should've invested more time and/or money to produce a satisfactory product for their loyal customers. It's not some childish war about AMD and Nvidia; I don't care for that. I just want a premium product for a premium price.

I did not know that about pump whine, but pump whine would not concern me because I would've put a waterblock on it anyway. Hopefully my coil whine will diminish over time.

Lastly, I bought a new RMi1000 as an upgrade for my rig.


----------



## Clockster

Quote:


> Originally Posted by *acidr4in*
> 
> Sorry, but how can you call a PCB design not awful, if there is a very noticeable coil whine on a lot of cards. AMD should've invested more time and/or money to produce a satisfactory product for their loyal customers. It's not always some childish war about AMD and Nvidia, I don't care for that. I just want for a premium product for a premium price.
> 
> I did not know that about pump whine, but pump whine would not concern me because I would've put a waterblock on it anyways. Hopefully my coil whine will diminish over time.
> 
> Lastly I bought a new RMi1000 as a upgrade for my rig.


Mate have you owned a GTX970/980/980Ti/Titan X? They have some of the worst coil whine around...lol
Both Nvidia and AMD have fairly crappy coil whine. My Fury X is dead silent though...unlike my Titan X.

The Fury X is a premium product.


----------



## p4inkill3r

All this whining about coil noise is really getting out of control.


----------



## rioja

Quote:


> Originally Posted by *Clockster*
> 
> Mate have you owned a GTX970/980/980Ti/Titan X? They have some of the worst coil whine around...lol
> Both Nvidia and AMD have fairly crappy coil whine. My Fury X is dead silent though...unlike my Titan X.
> 
> The Fury X is a premium product.


Is that true? I'd never thought about this aspect.


----------



## acidr4in

A friend of mine owns two 980s; there is hardly any coil whine to hear.


----------



## p4inkill3r

Quote:


> Originally Posted by *acidr4in*
> 
> A friend of mine owns two 980s, there is hardly any coil whine to hear


So?

Go to any of our threads here for the 970/980/980Ti/Titan X and read the hundreds of posts complaining of coil noise.

People's ears these days sure are sensitive.


----------



## xer0h0ur

Coil whine in reference PCBs is a luck of the draw sort of thing. There are plenty of 970s and 980's with coil whine just the same as there are plenty of Fury / Fury X's that have coil whine. Things become further complicated by external factors like your power supply.


----------



## rioja

Is this whining due to power overload? How do the versions with 8-pin + 8-pin power run then, such as the Asus Strix 980 Ti or Gigabyte G1 Gaming 980 Ti?


----------



## p4inkill3r

Quote:


> Originally Posted by *rioja*
> 
> Is this whining due to overload of power? How those versions with 8pin + 8pin run then, such as Asus Strix 980 Ti or Gigabyte G1 Gaming 980 Ti?


Whine/whine.


----------



## xer0h0ur

I am no expert here, but I know that cards with a higher number of power phases tend to have less to no coil whine. I presume it's simply because there are more chokes to spread it across, or perhaps it's due to them using higher-quality chokes; I can't pinpoint it. However, someone here claims that coil whine never goes away, and that the frequency of the sound simply changes to being inaudible. Again, I can't confirm that either.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am no expert here but I know that cards with *higher amount of power phases tend to have less to no coil whine*. I presume its simply because there are more chokes to spread it across. Perhaps its due to them using higher quality chokes. I can't pinpoint it. However someone here claims that coil whine never goes away, that the frequency of the sound simply changes to being inaudible. Again, I can't confirm that either.


That's incredibly misleading, and not the realistic truth.

The VRM portion of a card's job is to step the 12V line down to what the core and memory require to function. The number of phases a card has is essentially the number of times the voltage is reduced and filtered. Sure, more phases will definitely allow for cleaner and more stable power, but that has nothing to do with the physical quality and characteristics of the chokes. Coil whine is caused by electrical current moving through the copper wires that are part of the choke (the wires wrapped around the donut-looking thing). The size of the copper wire, the choke's entire mass, and how the choke is insulated within its housing are what ultimately determine the presence of coil whine. Regardless of everything I've mentioned, the chokes will resonate. When the frequency of the electrical signal approaches the resonant frequency of the choke, you will begin to hear the buzz which is commonly known as coil whine.

Take a Lightning, Vapor-X, Matrix, Classified, Kingpin, whatever you like, as it really doesn't matter. There is no possible way manufacturers are going to take the time to ensure each choke is 100% physically the same in terms of total mass and is completely insulated with some form of epoxy (or whatever they use inside the choke housing). The amount or presence of coil whine could come down to a single choke on the card, or all of them, regardless of their number or quality. My guess would be that the only way to ensure that higher-end cards, or cards using significantly more power than they used to, don't have coil whine would be to use incredibly large, expensive, and/or out-of-spec chokes. Cost and size of the components are massive factors in the overall design, and thus what we get is likely a compromise, and the unfortunate side effect in many cases is coil whine.

Due to those limitations, and unpreventable manufacturing imperfections, coil whine is completely unpredictable, and saying that a card with simply more phases is guaranteed to have less or no coil whine unfortunately isn't the case. Fortunately, there are ways we may be able to reduce it: breaking in a GPU with high-FPS scenarios for long durations, ensuring clean power is provided by a good PSU, proper grounding and electrical wiring within the house/apartment/office, keeping as few electrical and magnetic interference sources near the GPU as possible, and, in the case of open chokes, covering them with epoxy or another material to change the mass/physical aspects of the choke and thus its resonant frequency.
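
The resonance point in that explanation can be sketched in a few lines of Python. All the numbers below are purely illustrative assumptions (real chokes and VRM/load frequencies vary card to card); the sketch just shows why whine appears only when the excitation is both audible and near the choke's resonant frequency:

```python
# Toy model of the post above: a choke "sings" when the frequency content
# of the current through it lands near the choke's mechanical resonant
# frequency, AND that frequency sits in the audible band (~20 Hz - 20 kHz).
# The 8 kHz resonance and the test frequencies are made-up example values.

AUDIBLE_BAND_HZ = (20, 20_000)

def whine_risk(excitation_hz, choke_resonance_hz, tolerance=0.10):
    """True if the excitation is audible AND within `tolerance`
    (as a fraction) of the choke's resonant frequency."""
    lo, hi = AUDIBLE_BAND_HZ
    audible = lo <= excitation_hz <= hi
    near_resonance = (
        abs(excitation_hz - choke_resonance_hz) <= tolerance * choke_resonance_hz
    )
    return audible and near_resonance

# Hypothetical choke resonating at 8 kHz; only excitation near 8 kHz
# (e.g. ripple from uncapped-FPS menus) would be flagged as risky.
for f in (300, 7_800, 25_000):
    print(f, whine_risk(f, choke_resonance_hz=8_000))
```

This also matches the advice at the end of the post: changing the choke's mass (epoxy) shifts `choke_resonance_hz`, while capping FPS shifts `excitation_hz` away from it.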


----------



## brazilianloser

Still waiting on the non-X Fury... Sure is taking them some time to bring the card to the public.


----------



## xer0h0ur

Quote:


> Originally Posted by *brazilianloser*
> 
> Still waiting on the non x Fury... Sure taking them some time to bring the card to the public.


You mean the R9 Nano? Fury X and Fury are being sold already.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Things become further complicated by external factors like your power supply.


I think that's a largely overlooked factor. I know that, for whatever reason, any Corsair PSU I've ever owned has caused coil whine on whatever card is in the system.


----------



## xer0h0ur

Quote:


> Originally Posted by *rv8000*
> 
> That's incredibly misleading, and not the realistic truth.
> 
> The VRM portion of a card's job is to step the 12v line down to what the core and memory require to function. The number of phases a card has is essentially the number of times the voltage is reduced and filtered. Sure more phases will definitely allow for cleaner and more stable power, but that has nothing to do with the physical quality and qualities of the chokes. Coil whine is caused by electrical current moving through the copper wires that are part of the choke (the wires wrapped around the donut looking thing). The size of the copper wire, the chokes entire mass, and how the choke is insulated within the housing is what ultimately determines the presence of coil whine. Regardless of everything I've mentioned the chokes will resonate. When the frequency of the electrical signal approaches the resonate frequency of the choke you will begin to hear the buzz which is commonly known as coil whine.
> 
> Take a Lightning, Vapor-X, Matrix, Classified, Kingpin, whatever you like as it really doesn't matter. There is no possible way manufacturers are going to take the time to ensure each choke is 100% physically the same in terms of total mass and being completely insulated with some form of epoxy (or whatever they use inside the choke housing). The amount or presence of coil whine could come down to a single choke on the card, or all of them regardless of the number or quality. My guess would be the only way to ensure that higher end cards or cards using significantly more power than they use to not have coil whine would be to use incredibly large, expensive, and or out of spec chokes. Cost and size of the components are massive factors in the overall design, and thus what we get is likely a compromise, and the unfortunate side effect in many cases is coil whine.
> 
> Due to those limitations, and unpreventable manufacturing imperfections, coil whine is completely unpredictable and saying that a card with simply more phases is guaranteed to have less or no coil whine unfortunately isn't the case. Fortunately there are ways we may be able to reduce it: breaking in a GPU with high FPS scenarios for long durations, ensuring clean power is provided from a good psu, proper grounding and electrical wiring within the house/apartment/office, as few electrical and magnetic interference's near the GPU as possible, and in the case of open chokes covering them with epoxy or any material to change mass/physical aspects of the choke to change the resonant frequency of the choke itself.


So it's fair to assume, then, that the only reason cards with a higher number of power phases have less or no coil whine is likely higher-quality chokes on a non-reference design.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> So its good to assume then that the only reason cards with higher amounts of power phases have less or no coil whine is likely due to higher quality chokes on a non-reference design.


Yes, though that's not to say there can't be manufacturing imperfections that will still allow for coil whine even on better-quality chokes.

Well, it isn't as simple as that. While it can be true, again there are too many variables. When designing a card, I feel major concerns about the resonant frequencies of the chokes are very low on the importance list. I guess the moral to take from all of this is that regardless of quality, if the electrical signal through the chokes is close to or the same as the choke's resonant frequency, you will get some form of coil whine no matter what.


----------



## brazilianloser

Quote:


> Originally Posted by *xer0h0ur*
> 
> You mean the R9 Nano? Fury X and Fury are being sold already.


I have seen a few reviews pop up, but from what I can find here in the US they are not on sale yet other than the Sapphire model...


----------



## xer0h0ur

Are you talking about online retailers or retail stores?


----------



## brazilianloser

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you talking about online retailers or retail stores?


Online. I'm only seeing the Sapphire card for the Fury non-X.


----------



## duox

Quote:


> Originally Posted by *brazilianloser*
> 
> Online. Only seeing the saphire card for the fury non x.


Is there a problem with the Sapphire card ?


----------



## brazilianloser

Quote:


> Originally Posted by *duox*
> 
> Is there a problem with the Sapphire card ?


Previous experience with the company. Will just wait for others to come out with their options.


----------



## duox

Quote:


> Originally Posted by *brazilianloser*
> 
> Previous experience with the company. Will just wait for others to come out with their options.


oh say no more, been there.


----------



## Alastair

Quote:


> Originally Posted by *brazilianloser*
> 
> Quote:
> 
> 
> 
> Originally Posted by *duox*
> 
> Is there a problem with the Sapphire card ?
> 
> 
> 
> Previous experience with the company. Will just wait for others to come out with their options.
Click to expand...

Asus has released the Strix version. Keep your eyes open for that one.


----------



## sygnus21

Quote:


> Originally Posted by *Alastair*
> 
> So my Sapphire Fury Tri-x cards are on their way. Should be here by the 1st of September!


I ordered my Sapphire R9 Fury from Newegg Saturday; it was shipped Monday and should be here tomorrow. To be honest I can't believe I spent that much on a GPU, since the most I ever spent before this purchase was around $430 for the ATI RADEON ALL-IN-WONDER 9800 PRO, purchased in 2003...



My how times have changed









Anyway, the R9 Fury will be replacing my Sapphire Vapor-X R9 280X OC. I did want the OC version, but Newegg was out, and those that do have it want 980 Ti prices









BTW the Asus Strix is out now, but I chose to go with Sapphire as I think it looks really "cool"









Until then...


----------



## BlackyMeow

Hello everyone,

my second fury X arrived yesterday. The first one (MSI) had the high pitched pump noise issue, the second one (Sapphire) doesn't.

However, the pump still makes noise. It's kind of the same noise as a 7200rpm hard drive (spinning noise), but maybe it's normal for an AIO liquid-cooler (can someone confirm?).

Also, the card has coil whine, but that's not really bothering me.

What's bothering me is the noise coming from the fan. The fan is supposed to be dead silent according to reviews, but mine has noise coming from the motor. It's a rattling noise; I don't really know how to explain it. It's like a mix of "brr" and "hum". At low speeds it's more like a "brr", and since the frequency changes with speed, it's more like a "hum" at high speed.

Is this normal? Maybe it comes from the PWM circuitry? The sound isn't super loud, but I can definitely hear it through open-back headphones while gaming...

Should I return the card? I'm tired of waiting, I was supposed to get this card at the beginning of July...

Maybe the sound will go away with time? I'm leaving it at 100% fan speed today while Folding@home is running; I'll see if it has gone away when I get home.

Or maybe I should just put a Noctua NF-P12 on it and plug it into a fan header on my motherboard? I'd lose fan speed control, but the included fan never spins above 1200rpm anyway, and the Noctua is really silent.

Any piece of advice would be appreciated.

Thanks in advance.


----------



## rioja

Quote:


> Originally Posted by *BlackyMeow*
> 
> Also, the card has coil whine, but that's not really bothering me.


Oh no,

do both cards have this whining?


----------



## BlackyMeow

Quote:


> Originally Posted by *rioja*
> 
> Oh no,
> 
> is it both cards have this whining?


Yes, both have it. (I have returned the first Fury X, FYI).

But it's really not a big deal, I only really hear it when my side panel is off. The coil whine is not present at idle, and I can't hear it while gaming.


----------



## ozyo

How about an AM1 Athlon 5350 / Fury X benchmark?









Spoiler: Heaven Benchmark 4.0









Spoiler: Valley Benchmark 1.0









Spoiler: gta-5



Code:


Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 6.452031, 36.712173, 22.406473
Pass 1, 18.520264, 27.094330, 22.031418
Pass 2, 14.807896, 34.682663, 21.890619
Pass 3, 16.300716, 32.307831, 25.703424
Pass 4, 2.770581, 48.184399, 27.067884

Time in milliseconds(ms). (Lower is better). Min, Max, Avg
Pass 0, 27.238922, 154.989960, 44.629959
Pass 1, 36.908092, 53.994911, 45.389725
Pass 2, 28.832850, 67.531540, 45.681667
Pass 3, 30.952248, 61.346996, 38.905323
Pass 4, 20.753605, 360.935089, 36.944149

Frames under 16ms (for 60fps): 
Pass 0: 0/211 frames (0.00%)
Pass 1: 0/208 frames (0.00%)
Pass 2: 0/200 frames (0.00%)
Pass 3: 0/240 frames (0.00%)
Pass 4: 0/3477 frames (0.00%)

Frames under 33ms (for 30fps): 
Pass 0: 2/211 frames (0.95%)
Pass 1: 0/208 frames (0.00%)
Pass 2: 7/200 frames (3.50%)
Pass 3: 5/240 frames (2.08%)
Pass 4: 891/3477 frames (25.63%)

Percentiles in ms for pass 0
50%,    44.00
75%,    46.00
80%,    47.00
85%,    48.00
90%,    50.00
91%,    50.00
92%,    50.00
93%,    51.00
94%,    52.00
95%,    53.00
96%,    54.00
97%,    54.00
98%,    56.00
99%,    62.00

Percentiles in ms for pass 1
50%,    45.00
75%,    46.00
80%,    47.00
85%,    48.00
90%,    49.00
91%,    49.00
92%,    50.00
93%,    50.00
94%,    50.00
95%,    50.00
96%,    51.00
97%,    51.00
98%,    52.00
99%,    53.00

Percentiles in ms for pass 2
50%,    45.00
75%,    56.00
80%,    56.00
85%,    58.00
90%,    58.00
91%,    58.00
92%,    59.00
93%,    59.00
94%,    60.00
95%,    60.00
96%,    61.00
97%,    62.00
98%,    64.00
99%,    65.00

Percentiles in ms for pass 3
50%,    37.00
75%,    42.00
80%,    44.00
85%,    46.00
90%,    48.00
91%,    49.00
92%,    50.00
93%,    50.00
94%,    50.00
95%,    50.00
96%,    51.00
97%,    51.00
98%,    52.00
99%,    52.00

Percentiles in ms for pass 4
50%,    36.00
75%,    43.00
80%,    44.00
85%,    46.00
90%,    48.00
91%,    49.00
92%,    49.00
93%,    50.00
94%,    51.00
95%,    52.00
96%,    53.00
97%,    55.00
98%,    59.00
99%,    65.00

=== SYSTEM ===
Windows 8.1 Pro 64-bit (6.2, Build 9200)
DX Feature Level: 11.0
AMD Athlon(tm) 5350 APU with Radeon(tm) R3      (4 CPUs), ~2.0GHz
8192MB RAM
AMD Radeon (TM) R9 Fury Series, 4268MB, Driver Version 15.200.1062.1004
Graphics Card Vendor Id 0x1002 with Device ID 0x7300

=== SETTINGS ===
Display: 2560x1440 (FullScreen) @ 60Hz VSync OFF
Tessellation: 2
LodScale: 1.000000
PedLodBias: 0.200000
VehicleLodBias: 0.000000
ShadowQuality: 3
ReflectionQuality: 2
ReflectionMSAA: 8
SSAO: 2
AnisotropicFiltering: 16
MSAA: 8
MSAAFragments: 0
MSAAQuality: 0
SamplingMode: 0
TextureQuality: 2
ParticleQuality: 2
WaterQuality: 2
GrassQuality: 2
ShaderQuality: 2
Shadow_SoftShadows: 4
UltraShadows_Enabled: false
Shadow_ParticleShadows: true
Shadow_Distance: 1.000000
Shadow_LongShadows: false
Shadow_SplitZStart: 0.930000
Shadow_SplitZEnd: 0.890000
Shadow_aircraftExpWeight: 0.990000
Shadow_DisableScreenSizeCheck: false
Reflection_MipBlur: true
FXAA_Enabled: true
TXAA_Enabled: false
Lighting_FogVolumes: true
Shader_SSA: true
DX_Version: 2
CityDensity: 1.000000
PedVarietyMultiplier: 1.000000
VehicleVarietyMultiplier: 1.000000
PostFX: 2
DoF: true
HdStreamingInFlight: false
MaxLodScale: 0.000000
MotionBlurStrength: 0.000000







Spoiler: BioShockInfinite-Fraps



Code:


2015-08-19 13:12:19 - BioShockInfinite
Frames: 26393 - Time: 454766ms - Avg: 58.036 - Min: 28 - Max: 61
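
As a sanity check, the Fraps `Avg` above is just total frames divided by elapsed seconds. A quick sketch, using the numbers straight from that log line:

```python
# Verify the Fraps summary: Frames: 26393, Time: 454766ms, Avg: 58.036.
# Average FPS = frames / elapsed seconds.
frames = 26_393
elapsed_ms = 454_766

avg_fps = frames / (elapsed_ms / 1000)
print(round(avg_fps, 3))  # 58.036, matching the reported average
```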


----------



## mRYellow

Hi guys

Finally got my Fury - Sapphire. Took me some time to find stock.
Will post more info and screenies once I have this bad boy in.


----------



## BlackyMeow

Quote:


> Originally Posted by *mRYellow*
> 
> Hi guys
> 
> Finally got my Fury - Sapphire. Took me some time to get stock.
> Will post more info once and screenies once I have this bad boy in.


Please tell me if the fan makes a buzzing noise


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> Please tell me if the fan makes a buzzing noise


Will do.


----------



## ceVoIX

@ozyo
Thx for the bench








Can you bench the DX12 test from 3DMark vs DX11?


----------



## escksu

Quote:


> Originally Posted by *rioja*
> 
> Oh no,
> 
> is it both cards have this whining?


It's normal for the Fury X to have coil whine; both my cards have it. The fans do make a slight noise at low RPM. This is normal; fans are not supposed to be dead quiet.

BTW, the pair can run Witcher 3 even with HairWorks turned on.


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> Please tell me if the fan makes a buzzing noise


Card is silent.

The GPU meter (flashing night rider effect on PCB) on my card doesn't seem to light up? Is this normal on Fury?
Any ideas?


----------



## Jflisk

Quote:


> Originally Posted by *mRYellow*
> 
> Card is silent.
> 
> The GPU meter (flashing night rider effect on PCB) on my card doesn't seem to light up? Is this normal on Fury?
> Any ideas?


The Knight Rider effect does not light up under load?


----------



## Jflisk

Quote:


> Originally Posted by *mRYellow*
> 
> Card is silent.
> 
> The GPU meter (flashing night rider effect on PCB) on my card doesn't seem to light up? Is this normal on Fury?
> Any ideas?


Check this
One other new feature on the AMD Radeon R9 Fury X that has never been used on a GPU before is the addition of the GPU Tach. AMD soldered nine LEDs onto the back of the card that allow you to see the GPU load level. There are two DIP switches on the backplate that allow you to enable or disable the GPU Tach and to change the color of the LEDs between red and blue to better match your case theme. Eight of the LEDs are for the GPU load level and can be red or blue. The ninth LED is green, and when it is lit it visually lets you know that the GPU is in AMD's ZeroCore power mode.

Read more at http://www.legitreviews.com/amd-fiji-arrives-radeon-r9-fury-x-details_166515#CeelGGJ2yujKMe2h.99

Half way down the page tells you the settings
http://www.hardwarezone.com.sg/review-amd-radeon-r9-fury-x-reaching-out-maxwell
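
The GPU Tach behavior described in that excerpt (eight load LEDs plus a green ZeroCore LED) can be sketched roughly like this. This is a toy model for illustration only, not AMD's actual firmware logic:

```python
# Toy model of the Fury X GPU Tach: map a 0-100% GPU load onto the
# 8-LED load bar, with the separate green LED lit in ZeroCore mode.
# The linear load->LED mapping is an assumption, not documented behavior.

def gpu_tach(load_pct, zerocore=False):
    """Return (lit_load_leds, zerocore_led_on) for a given load %."""
    if zerocore:
        return 0, True            # card asleep: only the green LED on
    load_pct = max(0, min(100, load_pct))   # clamp to a valid range
    lit = round(load_pct / 100 * 8)         # 0..8 LEDs lit
    return lit, False

print(gpu_tach(50))                 # (4, False)
print(gpu_tach(100))                # (8, False)
print(gpu_tach(0, zerocore=True))   # (0, True)
```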


----------



## mRYellow

Quote:


> Originally Posted by *Jflisk*
> 
> Check this
> One other new feature on the AMD Radeon R9 Fury X that has never been used on a GPU before is the addition of the GPU Tach. AMD soldered down nine LEDs on the back of the card that allow you to see the GPU load level. There are two DIP switches on the backplate that allow you to enable or disable the GPU Tach and allows you to change the color of the LEDs between red and blue to go with your case theme better. Eight of the LEDs are for the GPU load level and can be red or blue. The ninth LED light is green and when it is lit up it visually lets you know that the GPU is in AMD's ZeroCore power mode.
> Read more at http://www.legitreviews.com/amd-fiji-arrives-radeon-r9-fury-x-details_166515#CeelGGJ2yujKMe2h.99


Thank you.

Meanwhile, I unlocked this baby.

Here are my results for locked CUs

*Before*
Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00030000 / 00000000 [..............xx]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 02010000 / 00000000 [......x........x]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.

Firestrike extreme GPU score before: 7015

*After*
Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00020000 / 00000000 [..............x.]
SE2 hw/sw: 00020000 / 00000000 [..............x.]
SE3 hw/sw: 00020000 / 00000000 [..............x.]
SE4 hw/sw: 02000000 / 00000000 [......x.........]
60 of 64 CUs are active. HW locks: 4 (R/W) / SW locks: 0 (R/W).
4 CU's are disabled by HW lock, override is possible at your own risk.

*Firestrike extreme GPU score after: 7203*
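For anyone curious how those per-SE masks add up to the totals in the printout, here's a small sketch that decodes them the same way. The assumption that the lock bits sit in the upper 16 bits of each hw mask is inferred from the bracketed x-patterns above, not from any official tool documentation:

```python
# Sketch: count active CUs from the per-Shader-Engine hw lock masks above.
# ASSUMPTION (inferred from the bracketed patterns): each set bit in the
# upper 16 bits of a hw mask marks one hardware-locked CU in that SE.
CUS_PER_SE = 16  # "Fiji-class chip with 16 compute units per Shader Engine"


def active_cus(hw_masks):
    """Total active CUs given one hw lock mask per Shader Engine."""
    locked = sum(bin(mask >> 16).count("1") for mask in hw_masks)
    return len(hw_masks) * CUS_PER_SE - locked


before = [0x00030000, 0x00030000, 0x00030000, 0x02010000]  # SE1..SE4
after = [0x00020000, 0x00020000, 0x00020000, 0x02000000]

print(active_cus(before))  # 56 of 64
print(active_cus(after))   # 60 of 64
```

Matches the tool output: 8 hardware-locked CUs before the override, 4 after.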


----------



## sygnus21

How do you unlock these bad boys? - Sapphire R9 Fury (Non OC).

Thanks


----------



## battleaxe

Quote:


> Originally Posted by *sygnus21*
> 
> How do you unlock these bad boys? - Sapphire R9 Fury (Non OC).
> 
> Thanks


Read the opening post.


----------



## sygnus21

Ah, ok.

Thanks.


----------



## BlackyMeow

I think I'm gonna return my Fury X... I gave it two tries; the first one was horrible, the second one is a noisy mess... It's a shame because it looks really good and performs really well.

The pump on the second card makes the same noise as the Cooler Master Seidon 120V I installed in a friend's computer, and that's their crappiest AIO cooler... Why did they use the same mechanism on AMD's most expensive graphics card?

I'm thinking about getting a Sapphire Fury (non-X) or a 980 Ti (but I don't like Nvidia at all, so...). Any advice?


----------



## rioja

Quote:


> Originally Posted by *BlackyMeow*
> 
> Any advice ?


This











In addition you'll get lower temps


----------



## Orthello

Quote:


> Originally Posted by *rioja*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In add you'll get lower temps


Dang, that looks nice. How does she overclock? Does it help much vs the stock cooler?


----------



## BlackyMeow

Quote:


> Originally Posted by *rioja*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In add you'll get lower temps


Yeah, that would be nice... But I don't have any watercooling stuff. I would need: radiator, pump, res, tubing, waterblock, fittings, and probably a new case (I only have an NZXT Source 220).

Even if I went full custom watercooling, it would still make sense to get a Fury non-X instead, since the waterblock fits both cards, wouldn't it?


----------



## rioja

Quote:


> Originally Posted by *Orthello*
> 
> Dang that looks nice , how does she overclock , help much vs stock cooler ?


No: 1110-1120 / 600 on stock, 1130/620 on EK.

But we're waiting for a way to push VGPU higher than 1.2150, and hope to reach 1250 then.


----------



## Orthello

20 MHz on each; not bad at all for just enhanced water cooling. And yeah, once voltage is fully unlocked, that VRM cooling you have there should let you go quite a bit further. Those RAM clocks are pretty nice already.

What's the update on that, btw? I thought it was meant to be implemented in Trixx by now?


----------



## mRYellow

I can recommend the Sapphire Fury. The card is silent and runs cool.
And you're 99% guaranteed to unlock at least 4 CUs.


----------



## Jflisk

Quote:


> Originally Posted by *rioja*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In add you'll get lower temps


That Fury X temperature chart is wrong. I have never seen either of my cards above 52C max. They idle at 30 and 34C. The ambients they were using for that test must have been high.

Never mind, I also have a custom fan profile that keeps the cards at 52C.

One other thing, though: the EK water block route depends on the number of radiators.


----------



## rioja

I think it just depends on the fan profile they used; they got 59C on Auto and 47C at full speed.

Here's the link:

https://translate.google.com/translate?hl=ru&sl=ro&tl=en&u=http%3A%2F%2Fwww.techview.ro%2Fr9-fury-waterblock-review%2F


----------



## MunneY

Looks like Amazon FINALLY has a Fury card in stock...

http://www.amazon.com/gp/product/B011D7A526/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B011D7A526&linkCode=as2&tag=them0971-20&linkId=T547NACOMBLTRT3

If only it was the strix....


----------



## Ceadderman

Somebody is smoking crack. Over $1k for that card before discount to put it where the MSRP has it?

But at that price I will take the 290 version if I can still get it.









~Ceadder


----------



## looncraz

Quote:


> Originally Posted by *BlackyMeow*
> 
> I think I'm gonna return my Fury X... I gave it two tries, the first one was horrible, the second one is a noisy mess... It's a shame because it looks really good and performs really well.
> 
> The pump on the second card makes the same noise as the Cooler Master Seidon 120V I installed in a friend's computer, and that's their crappiest AIO cooler... Why did they use the same mechanism on AMD's most expensive graphics card ?
> 
> I'm thinking about getting a Sapphire Fury (non-X) or a 980Ti (but I don't like Nvidia at all, so...). Any advice ?


If you can get a refund, then I'd get a non-X Fury... though I might try my hand at the pump lottery once more, first.

nVidia cards are too weak in the color department for me.


----------



## Medusa666

Ordered two Fury X, both had the pump noise and extremely high coil whine compared to both my previous 295X2.

Returned both for refunds, sticking to the 295X2 now until Fiji X2 is released, or 16nm with HBM2.


----------



## Alastair

Quote:


> Originally Posted by *MunneY*
> 
> Looks like Amazon FINALLY has a Fury card in stock...
> 
> http://www.amazon.com/gp/product/B011D7A526/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B011D7A526&linkCode=as2&tag=them0971-20&linkId=T547NACOMBLTRT3
> 
> If only it was the strix....


Quote:


> Originally Posted by *MunneY*
> 
> Looks like Amazon FINALLY has a Fury card in stock...
> 
> http://www.amazon.com/gp/product/B011D7A526/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B011D7A526&linkCode=as2&tag=them0971-20&linkId=T547NACOMBLTRT3
> 
> If only it was the strix....


The Tri-X has been in stock for AGES! You are a bit late to the party my friend.


----------



## xer0h0ur

Quote:


> Originally Posted by *Medusa666*
> 
> Ordered two Fury X, both had the pump noise and extremely high coil whine compared to both my previous 295X2.
> 
> Returned both for refunds, sticking to the 295X2 now until Fiji X2 is released, or 16nm with HBM2.


How are people still getting cards with pump noise? This doesn't make any sense to me.


----------



## Medusa666

Quote:


> Originally Posted by *xer0h0ur*
> 
> How are people still getting cards with pump noise? This doesn't make any sense to me.


No idea. I was extremely disappointed by this, but I can't be bothered at this point. It was a downgrade for me anyway compared to the 295X2; I wanted to support AMD and get the newest flagship, but they lost me, for this generation at least.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> How are people still getting cards with pump noise? This doesn't make any sense to me.


¯\_(ツ)_/¯


----------



## xer0h0ur

Quote:


> Originally Posted by *Medusa666*
> 
> No idea, was extremely dissappointed at this but I can't be bothered at this point, it was a downgrade for me anyway compared to 295X2, wanted to support AMD and get the newest flagship, but they lost me for this generation at least. Can't be bothered.


I am genuinely shocked people are still getting new stock with pump noise.

Well, yeah: two Fury X roughly equal three 290X, so there was no way you would approach or pass dual-295X2 territory unless you had three Fury X, and that creates case problems trying to accommodate three radiators.


----------



## GorillaSceptre

Quote:


> Originally Posted by *xer0h0ur*
> 
> How are people still getting cards with pump noise? This doesn't make any sense to me.


What evidence has there been that shows the pump noise was fixed? Not being facetious, just asking.

I've seen a few people say that they still have noise even though they have the new logo on the pump.


----------



## xer0h0ur

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What evidence has there been that shows the pump noise was fixed? Not being facetious, just asking.
> 
> I've seen a few people say that they still have noise even though they have the new logo on the pump.


Google it?

Edit: Or if you can't be bothered: http://www.legitreviews.com/amd-adjusts-sound-baffling-adhesive-to-calm-radeon-r9-fury-x-noise-complaints_168284


----------



## Jflisk

I have 2x Fury X; I'll have to go listen to them. I remember hearing a pump noise initially, but not after they ran for a little while. I think they need time to settle the air, then they go quiet. The only thing that annoys me pump-wise is my fish tank air pump; I don't think replacing the Furies will fix that.


----------



## sygnus21

Quote:


> Originally Posted by *MunneY*
> 
> Looks like Amazon FINALLY has a Fury card in stock...
> 
> http://www.amazon.com/gp/product/B011D7A526/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=B011D7A526&linkCode=as2&tag=them0971-20&linkId=T547NACOMBLTRT3
> 
> If only it was the strix....


They've had that card for a while. I was going to get it from Amazon, but as of last week it was $595. Newegg had it for $559, so I got it from them. Nice to see it dropped in price, but it's still cheaper at Newegg.

Had I acted sooner I could have got the OC edition for $569 from Newegg, but I waited too long and no one had them for less than $600+, so I just got the regular Sapphire Tri-X R9 Fury.

Anyway, yeah, Amazon has always had the card (at least when I was looking at it last week); they just dropped the price.


----------



## rioja

Quote:


> Originally Posted by *Medusa666*
> 
> No idea, was extremely dissappointed at this but I can't be bothered at this point, it was a downgrade for me anyway compared to 295X2, wanted to support AMD and get the newest flagship, but they lost me for this generation at least. Can't be bothered.


Don't you want to get second 295x2 instead?


----------



## Malamute3511

Hi all, just had a few questions. Does anyone here own 2x Fury X in CrossFire? I am curious about benchmark scores and temperatures. Sorry if this is the wrong forum; it just seemed like the right place to ask.


----------



## Medusa666

Quote:


> Originally Posted by *rioja*
> 
> Don't you want to get second 295x2 instead?


Been there, done that. I used to own two of those beasts, but the heat output was insane and quad-GPU CrossFire was a royal pita. The few times it worked it was wicked sick, but in the end it wasn't worth it. One of these cards is more than enough to completely crush everything else.


----------



## sygnus21

Question...

As stated earlier, I have the Sapphire Tri-X R9 Fury. Prior to buying this card I watched a few videos, and one of them showed that the fans of the card would not run until it was under heavy load. You can see this YouTube review for what I'm talking about; go to the 6-minute mark, where the reviewer states the fans aren't running...




Anyway, on my card the fans are running all the time. As of this writing they are at 19 percent. Are anyone else's fans dead stopped until the card is under heavy load, as the video shows?

Thanks.

BTW if this card has coil whine, I can't hear it. The card is pretty quiet, and extremely cool.


----------



## rv8000

Quote:


> Originally Posted by *sygnus21*
> 
> Question...
> 
> As stated earlier, I have the Sapphire Tri-X R9 Fury. Prior to buying this card I watched a few videos and in one of them it showed that the fans of the card wound not run until it was under heavy load. You can see this YouTube review to see what I'm talking about. Go to the 6 minute mark where the reviewer states the fans aren't running...
> 
> 
> 
> 
> Anyway on my card the fans are running all the time. As of this writing they are at 19 percent. Is anyone's fans dead stopped until the card is under heavy load as the video shows?
> 
> Thanks.
> 
> BTW if this card has coil whine, I can't hear it. The card is pretty quiet, and extremely cool.


The fans are stopped; don't let the 19% in any monitoring program fool you, they're off. That's simply the fan speed where the startup/cutoff voltage happens to sit, which is why it displays 19% even when they're stopped. Open up your case and look if you're really that curious.


----------



## xer0h0ur

If you're that concerned over the fans just set a custom fan curve in afterburner?
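For what it's worth, a custom curve is just a set of (temperature, fan %) points with interpolation in between. A rough sketch of the idea; the points here are made up for illustration, not Afterburner's defaults:

```python
# Sketch of a custom fan curve: (temp C, fan %) points, linearly
# interpolated. The points below are hypothetical, not any vendor default.
CURVE = [(40, 0), (50, 20), (65, 40), (80, 70), (90, 100)]


def fan_percent(temp_c):
    """Interpolate the fan duty cycle for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]  # below the first point: fans off
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:    # interpolate within this segment
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]     # above the last point: full speed
```

With a first point at 0%, the fans stay stopped below the cutoff temperature, which is exactly the zero-RPM behavior discussed above.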


----------



## sygnus21

Quote:


> Originally Posted by *rv8000*
> 
> The fans are stopped, do not let the 19% fool you in any monitoring program they're off, that's simply the fan speed where the startup/cut off voltage happens to be which is why it displays 19% even when they're stopped. Open up your case and look if you're really that curious.


Yeah, I was just about to post that a physical eye test does indeed show the fans are off... despite what both CCC and GPU-Z show. They even show RPMs, which is obviously wrong...



Thanks.


----------



## sygnus21

Quote:


> Originally Posted by *xer0h0ur*
> 
> If you're that concerned over the fans just set a custom fan curve in afterburner?


Can anyone ask a question without getting snide comments??? This is a forum, people ask questions. Good grief!


----------



## xer0h0ur

*Double take* Is that not a valid suggestion?


----------



## Jflisk

Quote:


> Originally Posted by *Malamute3511*
> 
> Hi al just had a few questions. Does any 1 here own 2 FuryX in Crossfire. I am curious of benchmark scores and temperatures. Sorry if wrong forum just seem like the right place to ask.


3dmark benchmark Fury x2 slight overclock
http://www.3dmark.com/fs/5547294

No overclock
http://www.3dmark.com/fs/5695098

Max temps never above 52C on the top card, 49C on the lower card. Idle at 30C and 34C. Custom fan profiles set to 52C.


----------



## jase78

It is a valid suggestion. It's the tone in which it reads. Maybe it wasn't your intention, but it comes across as arrogant and/or snobbish.


----------



## sygnus21

Dang! Seems I'll have to send my card back to Newegg as I'm now getting massive artifacting to the point of having to reboot the PC. Happened twice within the last hour. Damn!









Funny thing is, yesterday when I ran both 3DMark's Firestrike and Unigine Valley I had no issues. I even played Dragon Age: Inquisition with no problems. Today I'm just on the internet and bam, the screen res decreases and artifacting starts. Reboot; about an hour later, same thing: on the internet and artifacting.

Also, something strange happened when trying to run 3DMark today. Yesterday I was able to run Firestrike no problem; today I run Firestrike and 3DMark tells me my card is not recognized! Yeah, definitely sounds like the card is bad.

Anyway, now I get to go back to my R9 280X with its failing fan and wait a week for the replacement R9 Fury.


----------



## Malamute3511

Quote:


> Originally Posted by *Jflisk*
> 
> 3dmark benchmark Fury x2 slight overclock
> http://www.3dmark.com/fs/5547294
> 
> No overclock
> http://www.3dmark.com/fs/5695098
> 
> Max temps never above 52C top card 49C lower card . Idel 30C and 34C . Custom fan profiles to set 52C.


Thank you. I own Nvidia and I wondered how scores correlated. Got to love the AIO cooler; those temps are amazing. Thanks again, the info was very helpful. I don't want to start a flame war, so sorry for talking Green in a Red thread.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> 3dmark benchmark Fury x2 slight overclock
> http://www.3dmark.com/fs/5547294
> 
> No overclock
> http://www.3dmark.com/fs/5695098
> 
> Max temps never above 52C top card 49C lower card . Idel 30C and 34C . Custom fan profiles to set 52C.


You should tune that Vishera up, buddy.
Physics looks a little low.... Tried overclocking it?

I ask because it appears to be bottlenecking those Fijis a bit, so every ounce of CPU performance you can get will help a lot.


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You should tune that vishera up buddy.
> Physics looks a little low.... Tried overclocking it?
> 
> I ask because it appears to be bottling those fiji's up a bit, so every ounce of CPU performance you can get will help a lot.


I am trying to take it back to 5.0; I seem to have a problem with that lately.

My CPU seems constipated.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> I am trying to take it back to 5.0 seem to have a problem with that one lately.


You've certainly got the board and chip needed to get to 5.

Based on the physics score, there's some tuning to do too, even if you don't oc more. My 8300 at 4.8 pulls 9300+ points on physics.

Too off topic for this thread, but would love to help in pm, or in the vishera owners thread.

Either way, congrats on scoring two furies!


----------



## mRYellow

Quote:


> Originally Posted by *sygnus21*
> 
> Dang! Seems I'll have to send my card back to Newegg as I'm now getting massive artifacting to the point of having to reboot the PC. Happened twice within the last hour. Damn!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Funny thing is yesterday when I ran both 3DMark's Firestrike and Unigine Valley I had no issues. Even played Dragon Age: Inquisition and no problems. Today I'm just on the internet and bam.. screen res decreases and artifacting. Reboot, about an hour later, same thing - on the internet and artifiacting.
> 
> Also, something strange happened when trying to run 3DMark today - Yesterday I was able to run Firestrike no problems. Today I run Firestrike and 3DMark now tells me my card is not recognized! Yeah, definitely sounds like the card is bad.
> 
> Anyway now I get to go back to my failing fan R9 280X card and wait a week for the replacement R9 Fury


You didn't by any chance unlock the CUs?


----------



## BlackyMeow

I just ordered a Sapphire Fury Tri-X OC, it should arrive tomorrow.

I also filed an RMA form for my Fury X... Man, this is heartbreaking...

I might be able to do some Fury/Fury X crossfire benchmarks before I send it back, though. Would you guys be interested in that? I can do 1080p, 4K VSR and Eyefinity 5760x1080. Let me know.


----------



## mRYellow

Has anyone managed to flash the Sapphire Fury OC BIOS onto the Sapphire non-OC Fury?


----------



## spyshagg

Quote:


> Originally Posted by *Jflisk*
> 
> 3dmark benchmark Fury x2 slight overclock
> http://www.3dmark.com/fs/5547294
> 
> No overclock
> http://www.3dmark.com/fs/5695098
> 
> Max temps never above 52C top card 49C lower card . Idel 30C and 34C . Custom fan profiles to set 52C.


Something's up, mate. I get a higher GPU score with two 290Xs than your Fury Xs score. http://www.3dmark.com/fs/5662654


----------



## Agent Smith1984

Quote:


> Originally Posted by *spyshagg*
> 
> somethings up mate. I have more GPU score with 2 290x's than your fury x's score. http://www.3dmark.com/fs/5662654


I noticed the same thing.... that Fury X CF score is low....

Your 290x's are looking great though!! Nice job







That's with a modded BIOS and water cooling, I assume?

A big part of his problem, though, is that something is going on with his system.... his CPU, even for an FX-8, is scoring really low, which in turn is obviously causing major GPU bottlenecks.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I noticed the same thing.... that Fury X CF score is low....
> 
> Your 290x's are looking great though!! Nice job
> 
> 
> 
> 
> 
> 
> 
> That with modded BIOS and water cooling I assume?
> 
> A big part of his problem though, is that something is going on with his system.... his CPU, even for an FX-8, is scoring really low, which in tern, is obviously causing major GPU bottlenecks.


1333 MHz RAM, and likely running a 2200 MHz NB (which a lot of 990FX boards default to) instead of 2600, will have a significant impact on the physics score when the clock speed is as high as it is on that 9590. It's possible he is also getting some throttling; I know when I bought a 9590 to play around with, it needed a small fan on the VRM 24/7 to maintain clocks (Sabertooth R2.0).


----------



## Jflisk

I don't know what the heck it is. I just went to look at the scores on 3DMark. The scores of 2x 295 and 3x R9 290X are in line with mine, almost the same scores, at least in the 13000 to 14000 range. Then there are 2x R9 290X results in line with mine or higher. These are all scores with a 9590.

This is the list of the breakdown:
http://www.3dmark.com/search?_ga=1.181218976.664798282.1440168318#/?mode=basic&url=/proxycon/ajax/search/cpuname/fs/P/AMD%20FX-9590&cpuName=AMD%20FX-9590


----------



## Sgt Bilko

Quote:


> Originally Posted by *spyshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> 3dmark benchmark Fury x2 slight overclock
> http://www.3dmark.com/fs/5547294
> 
> No overclock
> http://www.3dmark.com/fs/5695098
> 
> Max temps never above 52C top card 49C lower card . Idel 30C and 34C . Custom fan profiles to set 52C.
> 
> 
> 
> somethings up mate. I have more GPU score with 2 290x's than your fury x's score. http://www.3dmark.com/fs/5662654

It's because he is doing Firestrike Performance; the CPU is holding them back a bit. Load up Extreme or Ultra and the gap will appear.
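That squares with how the overall number is computed: 3DMark combines the sub-scores as a weighted harmonic mean, so a weak physics (CPU) score drags the total down more than a simple average would. A sketch of the idea; the 0.75/0.15/0.10 weights are my assumption for illustration, so check UL's 3DMark technical guide for the real Fire Strike values:

```python
# Sketch: weighted harmonic mean of Fire Strike sub-scores.
# ASSUMPTION: the weights below are illustrative, not UL's published values.
WEIGHTS = {"graphics": 0.75, "physics": 0.15, "combined": 0.10}


def overall(scores):
    """Weighted harmonic mean: a low sub-score hurts disproportionately."""
    return sum(WEIGHTS.values()) / sum(w / scores[k] for k, w in WEIGHTS.items())
```

With equal sub-scores the overall equals them exactly; halve only the physics score and the overall falls further than its weight alone would suggest, which is why a bottlenecked CPU shows up so clearly in the total.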


----------



## spyshagg

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I noticed the same thing.... that Fury X CF score is low....
> 
> Your 290x's are looking great though!! Nice job
> 
> 
> 
> 
> 
> 
> 
> That with modded BIOS and water cooling I assume?


thanks mate







Yeah, 390X BIOS and watercooled.
Cracked 15K on a single 290X also; that was fun. http://www.3dmark.com/fs/5662841
If the DX12 gains pan out to be true, I might have cards for at least another year or two.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's because he is doing Firestrike Performance, CPU is holding them back a bit, load up Extreme or Ultra and and the gap will appear


Yes, that makes sense. FS Extreme is paid, so I wouldn't know the difference. But something is up with his rig; Fury Xs should always be faster than my 290Xs. Hopefully he finds the issue.


----------



## Sgt Bilko

Quote:


> Originally Posted by *spyshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I noticed the same thing.... that Fury X CF score is low....
> 
> Your 290x's are looking great though!! Nice job
> 
> 
> 
> 
> 
> 
> 
> That with modded BIOS and water cooling I assume?
> 
> 
> 
> thanks mate
> 
> 
> 
> 
> 
> 
> 
> yeah 390x bios and watercooled.
> cracked 15K on a single 290x also. that was fun. http://www.3dmark.com/fs/5662841
> If the Dx12 gains pans out to be true, I might have cards for at least another year or two
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's because he is doing Firestrike Performance, CPU is holding them back a bit, load up Extreme or Ultra and and the gap will appear
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yes that makes sense. FS extreme is paid so I wouldn't know the difference. But something is up with his rig, fury x's should always be faster than my 290x's. Hopefully he finds the issue

Firestrike runs at 1080p, and at 1080p you'd need that FX chip clocked at 5.5 GHz or so to get those Fury Xs to run at 100% load all the time; Extreme and Ultra let them stretch their legs a bit more.









Having owned Crossfire 290's and a 295x2 and benched FS quite a bit you can take my word on that


----------



## schubaltz

Can anyone confirm whether the new version of Sapphire Trixx now supports Fury/Fury X overclocking with unlocked voltage?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Firestrike is at 1080p, at 1080p you'd need that FX Chip clocked at 5.5Ghz or so to get those Fury X's to run at 100% load all the time, Extreme and Ultra lets them stretch their legs a bit more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Having owned Crossfire 290's and a 295x2 and benched FS quite a bit you can take my word on that


Sarge is 100% correct, and that's why that low physics score was alarming to me.... If his CPU was pulling the proper weight it would still bottleneck the Furies a bit, but not as badly as in this case (with a physics score in the 8k's)...
He'd probably turn that 27k graphics score into 29 or 30k with a solid 9500+ point physics score if everything were dialed in nicely.

Of course, 2 Furies in crossfire are geared more towards 4K gaming anyway, so I wouldn't sweat it!

Post up some Ultra scores


----------



## sygnus21

Quote:


> Originally Posted by *sygnus21*
> 
> Dang! Seems I'll have to send my card back to Newegg as I'm now getting massive artifacting to the point of having to reboot the PC. Happened twice within the last hour. Damn!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Funny thing is yesterday when I ran both 3DMark's Firestrike and Unigine Valley I had no issues. Even played Dragon Age: Inquisition and no problems. Today I'm just on the internet and bam.. screen res decreases and artifacting. Reboot, about an hour later, same thing - on the internet and artifiacting.
> 
> Also, something strange happened when trying to run 3DMark today - Yesterday I was able to run Firestrike no problems. Today I run Firestrike and 3DMark now tells me my card is not recognized! Yeah, definitely sounds like the card is bad.
> 
> Anyway now I get to go back to my failing fan R9 280X card and wait a week for the replacement R9 Fury


Quote:


> Originally Posted by *mRYellow*
> 
> You didn't by any chance unlock the CUs?


I haven't done anything to the card. I don't even overclock my system. And I certainly wouldn't tweak or modify a brand new card, as I definitely need to make sure it's working as advertised out of the box.

Anyway I'm not sure what's going on now because I've not had any issues since yesterday. I've since done a couple of gaming sessions with Dragon Age: Inquisition, running the game on ultra in both DX11 and Mantle. There wasn't even a hiccup so.... Still, the fact that I had artifacting issues concerns me.

That said, I'll give it the weekend and see what goes from there before deciding to return it to Newegg for replacement. I'm going to try a few more games such as Skyrim, Fallout NV, and Bioshock Infinite to see what happens. Will also run a few benches again.

Will keep you guys posted.

Thanks.


----------



## Thoth420

Looking for some advice on a custom cooling config. I was curious if 2x 240 EK rads (one front and one top), a D5 pump and a cylinder res (unsure what size) would be adequate to cool an i7 6700K and one Fury X (both EK Supremacy blocks). Tubing is PETG, but the bends will use fittings so it won't be heat-gunned. I have zero experience building custom loops, but plenty with air and AIO coolers etc.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Looking for some advice on custom cooling config. I was curious if 2 x 240 EK Rads (one front and one top), D5 Pump and a cylinder res(unsure what size) would be adequate to cool an i7 6700k and one fury x (both ek supremacy blocks). Tubing is is PETC but the bends will use fittings so it won't be heat gunned. I have 0 experience building with custom loops but plenty of air and aio coolers etc.


I don't think you can use Supremacy on the Fury due to the HBM.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> I don't think you can use Supremacy on the Fury due to the HBM.


Ok thanks








Whatever full coverage block they make for it then.


----------



## BlackyMeow

Quote:


> Originally Posted by *schubaltz*
> 
> can anyone confirm the new version of Sapphire Trixx now support Fury/Fury X overclocking w/ unlocked voltage?


No, the voltage isn't unlocked in Trixx.


----------



## rx7racer

A bit of an oddball question: anyone running any type of Fury/Fury X notice the power-off state not happening for your monitors?


----------



## Nilsom

Hello everyone,
I would like to hear your opinions and advice.
I am seriously considering buying 2x Fury X to CrossFire.
What feedback can you give me about this setup? Or is it better to go to the green side? I currently have 2x R9 290X 8GB Vapor-X.
Thank you all.


----------



## Malamute3511

Asking about Green in a Red thread, you are a brave man, lol. I have personally used red and green through the ages. As of right now it comes down to budget. THIS IS NOT FACT BUT PERSONAL OPINION.

As it stands right now, the 980 Ti beats the Fury X, and 980 Ti SLI beats Fury X CrossFire.
The Fury X beats the 980 Ti in temps hands down. Of course you can block your 980 Ti, but the Fury X is already done for you.
The 980 Ti draws less power on average. So it really comes down to what you prefer.
Side note: the Fury X card itself is much smaller than the 980 Ti, though mind you, you have to place the rad somewhere. Sorry if I overstepped in a Red thread; these are just my opinions.

Side note: love your avatar.


----------



## EpicOtis13

Quote:


> Originally Posted by *Malamute3511*
> 
> Asking about Green in a Red Thread u r a brave man lol. I have personally used red and green though the ages. As of right now it comes down to budget. THIS IS NOT FACT BUT PERSONAL OPINION.
> 
> as it stands right now 980ti beats the fury x. 980TI in SLI beats Fury X in Crossfire.
> Fury X beats 980ti in temps hands down. OFC you can block your 980ti but Fury X is already done fore you.
> 980TI draws less power on average. SO it really comes down to what you prefer.
> side note fury x card its self is much smaller than the 980ti. Mind you have to place a rad some where. Sorry if I over stepped in a Red thread. These just my opinions.
> 
> side note Love your Avatar


Wait what? Fury X in crossfire beats both 980ti Sli and TitanX Sli. (Pardon the French: http://www.hardware.fr/focus/111/crossfire-radeon-r9-fury-x-fiji-vs-gm200-round-2.html)


----------



## Malamute3511

Quote:


> Originally Posted by *Jflisk*
> 
> 3dmark benchmark Fury x2 slight overclock
> http://www.3dmark.com/fs/5547294
> 
> No overclock
> http://www.3dmark.com/fs/5695098
> 
> Max temps never above 52C top card 49C lower card . Idel 30C and 34C . Custom fan profiles to set 52C.


His score is 13390. My single 980 Ti beats that score on its own. When I have both in SLI I'm way above that score. Keep in mind overclock/CPU can make a large impact.
SINGLE CARD


SLI


Even if you go by pure graphics score alone, SLI wins, at least in my particular case. I still like the Fury X; it's the first generation of HBM and can only get better. I'm in no way saying green over red at all; just that in this 1-card-vs-1-card comparison I feel green has the advantage.


----------



## GorillaSceptre

Quote:


> Originally Posted by *EpicOtis13*
> 
> Wait what? Fury X in crossfire beats both 980ti Sli and TitanX Sli. (Pardon the French: http://www.hardware.fr/focus/111/crossfire-radeon-r9-fury-x-fiji-vs-gm200-round-2.html)


At stock, yes.


----------



## Nilsom

Quote:


> Originally Posted by *Malamute3511*
> 
> Asking about Green in a Red thread, u r a brave man lol. I have personally used red and green through the ages. As of right now it comes down to budget. THIS IS NOT FACT BUT PERSONAL OPINION.
> 
> As it stands right now the 980 Ti beats the Fury X, and 980 Ti SLI beats Fury X Crossfire.
> Fury X beats the 980 Ti in temps hands down. OFC you can block your 980 Ti, but the Fury X is already done for you.
> The 980 Ti draws less power on average, so it really comes down to what you prefer.
> Side note: the Fury X card itself is much smaller than the 980 Ti, but mind you, you have to place a rad somewhere. Sorry if I overstepped in a Red thread. These are just my opinions.
> 
> side note Love your Avatar


Thank you
After several years with Nvidia I returned to AMD, and I am very happy. The 970 was a disappointment for me with 3 monitors, so I came back to AMD. But now I have doubts about the Fury X: I'm afraid 4GB will be too little for the future, despite being HBM. Will AMD improve the Fury X with new drivers and make it better than the 980 Ti?
thank you all.


----------



## rx7racer

I hate to say it but imo you've gotta be an AMD Fan to buy a Fury X and especially do CF with it.

980 Ti for a slight nickel more makes way more sense. We're on OCN so overclocking makes it an obvious price/performance winner.

If anything I'd recommend at least waiting for the FuryX2 if you want CF and AMD, should be only a month or two out now from at least getting details on it.

Ever since Fermi (which I loved), when NV decided to short-change us by selling a mid-tier core at high-tier pricing, I just won't buy from or support them. AMD does well enough, but for pure performance NV does have it on lock right now.

Of course, all of this is just my opinion. I'd love to say Fiji is awesome, but realistically, once real-world overclocks are taken into consideration, it's no contest.


----------



## Malamute3511

As far as future-proofing goes, no card is, lol. I don't know enough about HBM to answer any VRAM questions, sorry; I just know the benches I have done and others I have seen. Will drivers improve the performance and overall quality of the Fury X in the long run? I would say yes. Any card from any manufacturer will improve over time with drivers, to a point. Will it double your bench score? Doubtful. I always look at it like this: the GPU is a car with a set amount of horsepower, and the GPU drivers are the transmission. All the power in the world means nothing unless the driver can use it appropriately, and the best drivers in the world mean nothing if there is not enough power behind them. Very give-and-take relationship between the two.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Malamute3511*
> 
> His score is 13390. My single 980 Ti beats that score on its own, and with both in SLI I'm way above it. Keep in mind overclock/CPU can make a large impact.
> SINGLE CARD
> 
> 
> SLI
> 
> 
> Even if you go by PURE graphics score alone, SLI wins, at least in my particular case. Still like the Fury X; it's the first generation of HBM and can only get better. I'm in no way saying green over red at all. Just in this 1 card vs 1 card matchup I feel green has the advantage.


Nice scores, but the GPUs have nothing to do with the results he posted. He is CPU bound at 1080p, a completely different issue...

The fury x in crossfire IS faster than 980ti SLI in almost every situation.

That is a bench/review-proven fact.


----------



## mRYellow

BTW, here's my score, no OC. For some reason 3DMark doesn't recognise my card. Wonder if it's the unlocked CUs.

http://www.3dmark.com/fs/5783413

*Score*
13157 with Generic VGA(1x) and Intel Core i7-5820K Processor

*Graphics Score*
15062

*Physics Score*
16959

*Combined Score*
5759
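For anyone wondering how that 13157 falls out of the sub-scores above: the Fire Strike total is a weighted harmonic mean of the three, assuming the weights from Futuremark's 3DMark technical guide (0.75 graphics, 0.15 physics, 0.10 combined). A quick sketch:

```python
# Fire Strike overall score as a weighted harmonic mean of the sub-scores.
# Weights assumed from Futuremark's 3DMark technical guide (Fire Strike):
# graphics 0.75, physics 0.15, combined 0.10.
def fire_strike_overall(graphics, physics, combined):
    return 1.0 / (0.75 / graphics + 0.15 / physics + 0.10 / combined)

# Sub-scores from the run above:
total = fire_strike_overall(15062, 16959, 5759)
print(round(total))  # -> 13157, matching the reported overall score
```

Because the mean is harmonic, the weakest sub-score drags the total the hardest, which is why the 5759 combined score pulls the overall so far below the 15062 graphics score.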


----------



## Malamute3511

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice scores, but the GPUs have nothing to do with the results he posted. He is CPU bound at 1080p, a completely different issue...
> 
> The fury x in crossfire IS faster than 980ti SLI in almost every situation.
> 
> That is bench/review proven fact


OK, don't take the total score; take just the graphics score. I'm still much higher.


----------



## mRYellow

Quote:


> Originally Posted by *Malamute3511*
> 
> OK, don't take the total score; take just the graphics score. I'm still much higher.


OK, bye bye now


----------



## Agent Smith1984

Quote:


> Originally Posted by *Malamute3511*
> 
> OK, don't take the total score; take just the graphics score. I'm still much higher.


Right, because his CPU is limiting the overall graphics score too.

2 Fury X on a nice Intel CPU will score
32-34k depending....

Then keep in mind this is Fire Strike, which favors NVIDIA and Intel anyway.

Just look at actual gaming benchmarks....

980Ti is a great card, but Fury is better in dual GPU

Single card, the ti wins in most cases.

And when overclocking both, the ti wins there too, at least until these guys get voltage control..... Then it's game on!


----------



## Malamute3511

I saw benchmarks with the 980 Ti and Fury X in GTA V; the Fury X had random stutters. Either way, 32k-34k is still lower than my cards at their normal boost clock. Not gonna start a flame war, as that was not my intention. Was just giving my 2 cents.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Malamute3511*
> 
> I saw benchmarks with the 980 Ti and Fury X in GTA V; the Fury X had random stutters. Either way, 32k-34k is still lower than my cards at their normal boost clock. Not gonna start a flame war, as that was not my intention. Was just giving my 2 cents.


I hear you man, no flame intended









Was just saying that firestrike favors NVIDIA while actual reviews show the fury in crossfire matches right up with it.


----------



## Malamute3511

Curious whether anyone with Fury X in Crossfire can do a Valley bench at stock settings. I don't care if you overclock your cards, just use stock 3D settings; no changing CCC or the Nvidia Control Panel for higher scores. I'd love to see the score vs my 980 Ti HOF cards in SLI.


----------



## Kana-Maru

Quote:


> Originally Posted by *Malamute3511*
> 
> I saw benchmarks with the 980 Ti and Fury X in GTA V; the Fury X had random stutters. Either way, 32k-34k is still lower than my cards at their normal boost clock. Not gonna start a flame war, as that was not my intention. Was just giving my 2 cents.


You do realize that the Fury X was originally supposed to tackle the GTX 980 [non-Ti], right? Nvidia got nervous as heck and rushed the 980 Ti to market to undercut AMD's Fury X sales and hype. Obviously Nvidia doesn't want AMD to have anything positive. At the time I was running GTX 670s and was looking to upgrade to a GTX 980 Ti Hybrid, which would've cost me $749.99+. Ultimately I decided to go with AMD this time around, since I'm no fanboy of either company. Anyway, the 980 Ti released, and there was no way AMD was about to delay their Fury lineup and HBM memory; E3 was the plan and they stuck with it.

There's no flame war to start. I really enjoyed my GTX 670 two-way SLI and now I'm enjoying my Fury X. My two cents is basically what I said above, plus the fact that you can't go wrong with either card. Nvidia has been doing a lot of dirty crap lately and I'm not going to support that type of marketing and lying, so I checked out the Fury X and liked what I saw. The 4K performance is great, and with DirectX 12 games on the way I'm sure both companies will battle neck and neck.


----------



## sygnus21

Quote:


> Originally Posted by *mRYellow*
> 
> BTW, Here's my score, no OC. For some reason 3DMark doesn't recognise my card. Wonder if its the unlocked CUs.
> 
> http://www.3dmark.com/fs/5783413
> 
> *Score*
> 13157 with Generic VGA(1x) and Intel Core i7-5820K Processor
> 
> *Graphics Score*
> 15062
> 
> *Physics Score*
> 16959
> 
> *Combined Score*
> 5759


I have the same issue as posted *here*, and it's not because the card is unlocked. BTW, the day I got the card 3DMark recognized it; the next day, it didn't. Very odd. I just ran 3DMark again today and the card still isn't recognized. I'm thinking there's an issue on 3DMark's end.


----------



## Forceman

Quote:


> Originally Posted by *Thoth420*
> 
> Looking for some advice on custom cooling config. I was curious if 2 x 240 EK Rads (one front and one top), D5 Pump and a cylinder res(unsure what size) would be adequate to cool an i7 6700k and one fury x (both ek supremacy blocks). Tubing is PETC but the bends will use fittings so it won't be heat gunned. I have 0 experience building with custom loops but plenty of air and aio coolers etc.


I've got a 280 and a 120 cooling my 4790K and 290X, so you should be fine with 2 x 240s.
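Rough numbers back that up. As a back-of-the-envelope sketch (the watts-per-120mm figure below is a common community rule of thumb, not a spec; real capacity depends on fin density, fan speed and the delta-T you'll accept):

```python
# Back-of-the-envelope loop sizing. The ~120 W per 120 mm radiator section
# figure is an assumed community rule of thumb, not a manufacturer spec.
def rad_capacity_watts(sections_120mm, watts_per_section=120):
    """Approximate heat (W) a radiator stack can shed at moderate fan speeds."""
    return sections_120mm * watts_per_section

capacity = rad_capacity_watts(4)   # two 240 mm rads = four 120 mm sections
load = 275 + 100                   # Fury X (~275 W board power) + OC'd 6700K (~100 W)
print(capacity, load, capacity >= load)  # 480 375 True
```

So two 240s should shed a Fury-X-plus-overclocked-CPU load with some headroom, which matches the experience above.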


----------



## Jflisk

Quote:


> Originally Posted by *rx7racer*
> 
> Little bit of an oddball question. Anyone running any type of Fury/X notice your turn off state not happening for your monitors?


Mine's shutting off the monitor, but ask me what happens every so often when it does shut off: blue screen and restart.


----------



## Jflisk

I just got this score from fire strike extreme

http://www.3dmark.com/fs/5784573


----------



## Malamute3511

Nice, well done. What temps do your cards reach?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> I just got this score from fire strike extreme
> 
> http://www.3dmark.com/fs/5784573


That's looking much better









Here's my 295x2 (Stock): http://www.3dmark.com/fs/5465183

And 295x2 + 290x Trifire (Stock): http://www.3dmark.com/fs/5465137

Your Fury X crossfire is just as fast as my 290x Trifire, pretty much the way it's supposed to be


----------



## Thoth420

Quote:


> Originally Posted by *Forceman*
> 
> I've got a 280 and a 120 cooling my 4790K and 290X, so you should be fine with 2 x 240s.


Cheers otherwise I would have to lose all HDD and Optical Bays...in a case that hides them anyway...


----------



## Jflisk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's looking much better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's my 295x2 (Stock): http://www.3dmark.com/fs/5465183
> 
> And 295x2 + 290x Trifire (Stock): http://www.3dmark.com/fs/5465137
> 
> Your Fury X crossfire is just as fast as my 290x Trifire, pretty much the way it's supposed to be


By the way thanks. I keep forgetting to run the ultra.


----------



## Jflisk

Quote:


> Originally Posted by *Malamute3511*
> 
> Nice, well done. What temps do your cards reach?


51C is the max my cards ever hit.









I just got done playing BFH for an hour.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That's looking much better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's my 295x2 (Stock): http://www.3dmark.com/fs/5465183
> 
> And 295x2 + 290x Trifire (Stock): http://www.3dmark.com/fs/5465137
> 
> Your Fury X crossfire is just as fast as my 290x Trifire, pretty much the way it's supposed to be
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way thanks. I keep forgetting to run the ultra.
Click to expand...

All good man, the FX chips do well enough but high end Crossfire and 1080p (even in a synthetic bench) just doesn't really give out any decent results for anyone


----------



## Jflisk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All good man, the FX chips do well enough but high end Crossfire and 1080p (even in a synthetic bench) just doesn't really give out any decent results for anyone


A little off topic: did you switch out your R9 290X cooler for the R9 390X cooler? Keeping an eye on that one.

And to keep it on topic FURY X


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> All good man, the FX chips do well enough but high end Crossfire and 1080p (even in a synthetic bench) just doesn't really give out any decent results for anyone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A little off topic: did you switch out your R9 290X cooler for the R9 390X cooler? Keeping an eye on that one.
> 
> And to keep it on topic FURY X
Click to expand...

I am planning on getting a Fury X at some point, so I do like to keep up on the news in here, but my 290x is in another rig. My PSU is getting RMA'd, so I'm using a spare atm; the 295x2 is boxed up, and this 390x landed on my doorstep.









And not yet; I am planning on it, but I need some more thermal pads and some time to get it done.


----------



## rx7racer

Quote:


> Originally Posted by *Jflisk*
> 
> Mine's shutting off the monitor, but ask me what happens every so often when it does shut off: blue screen and restart.


Well, that's interesting; I haven't run into that yet. Oddly, my wife's rig with the 290X did, though. Maybe a fresh driver install is in order, last thing to check really.

Just for giggles anyone else with anomalies with monitor power off or restart states?

And please, don't anyone take away from what I said earlier that I'm not enjoying my Fury. I'm pleased with it, and I feel AMD is still very capable when it comes to their GPUs.


----------



## Malamute3511

Quote:


> Originally Posted by *Jflisk*
> 
> 51C is the max my cards ever hit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just got done playing BFH for an hour.


Nice. When I stress test I hit 65C on my bottom card and 80-85C on my top card, lol. The AIO coolers are a nice addition.


----------



## BlackyMeow

I got my Sapphire Fury Tri-X OC this morning. I managed to unlock all 4096 shaders. No stability issues whatsoever.

I tried to overclock it to 1050MHz and it crashed multiple times in Dirt Rally. But it's stable at 1040MHz (factory OC) and gives me performance on par with the Fury X (and even beats it in some benchmarks). AND IT'S SOOOO SILENT, OMG. Even at 75 degrees (default target temp), the fans are only spinning at 1000rpm. I can barely hear them. That card is amazing, seriously.


----------



## Malamute3511

Nice glad to hear shaders unlocked with no issues. Post some bench scores


----------



## BlackyMeow

Quote:


> Originally Posted by *Malamute3511*
> 
> Nice glad to hear shaders unlocked with no issues. Post some bench scores


I have run some 5760*1080 benchmarks, but it was before I unlocked the shaders (tbh I've been playing Dirt Rally for 5 hours since I unlocked the shaders). I'll run the benchmarks again tomorrow morning (and post them ofc).


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> I got my Sapphire Fury Tri-X OC this morning. I managed to unlock all 4096 shaders. No stability issues whatsoever.
> 
> I tried to overclock it to 1050MHz and it crashed multiple times in Dirt Rally. But it's stable at 1040MHz (factory OC) and gives me performance on par with the Fury X (and even beats it in some benchmarks). AND IT'S SOOOO SILENT, OMG. Even at 75 degrees (default target temp), the fans are only spinning at 1000rpm. I can barely hear them. That card is amazing, seriously.


Congrats. I've unlocked four but will give them all a go later.
Can you maybe post your CU layout before and after?


----------



## BlackyMeow

Quote:


> Originally Posted by *mRYellow*
> 
> Congrats. I've unlocked four but will give them all a go later.
> Can you maybe post your CU layout before and after?


Sure, it was:
Before (don't take into account the number of dots, only the x's placements are important):
[.......xx]
[.......xx]
[x.......x]
[.......xx]

After: all dots.


----------



## Crisium

I'm not sure what happened, but I can't overclock my Fury anymore. I used to have it running at 1080/550MHz, but now MSI Afterburner ignores me. Of its own accord, the program turned off "Extend official overclocking limits"; of course I turned it back on. I can move the sliders, but when I hit apply, nothing happens.

edit: Well, I'll just have to use MSI Afterburner to monitor temps only. I was able to use AMD's CCC to get my 1080/550 back.


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> Sure, it was:
> Before (don't take into account the number of dots, only the x's placements are important):
> [.......xx]
> [.......xx]
> [x.......x]
> [.......xx]
> 
> After: all dots.


I tried all of them, but I wasn't so lucky; got artifacting on the desktop.
Well, a partial unlock is better than nothing.

Quote:


> Originally Posted by *Crisium*
> 
> I'm not sure what happened, but I can't overclock my Fury anymore. I used to have it running at 1080/550MHz, but now MSI Afterburner ignores me. Of its own accord, the program turned off "Extend official overclocking limits"; of course I turned it back on. I can move the sliders, but when I hit apply, nothing happens.
> 
> edit: Well, I'll just have to use MSI Afterburner to monitor temps only. I was able to use AMD's CCC to get my 1080/550 back.


What power limit did you use to get 1080 stable on core?


----------



## Medusa666

You guys who got the Sapphire Fury with the Tri-X cooler, how is the coil whine on your cards?

I bought two Fury Xs but returned them, as both had pump whine and coil whine, which was beyond irritating and couldn't be ignored, so I might get this card instead.


----------



## Ceadderman

Might it have something to do with your PSU?

This should also be asked of the members who own those cards: whether you have whine or not, which PSU do you run...

Other members have reported losing their coil whine (not just AMD owners) after they swapped out their PSU.

So it's worth looking into at least.









~Ceadder


----------



## mRYellow

Quote:


> Originally Posted by *Medusa666*
> 
> You guys who got the Sapphire Fury with the Tri-X cooler, how is the coil whine on your cards?
> 
> I bought two Fury Xs but returned them, as both had pump whine and coil whine, which was beyond irritating and couldn't be ignored, so I might get this card instead.


I can confirm that there is NO coil whine. Confirmed by multiple users as well.
Card is quiet and cool.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Ceadderman*
> 
> Might it have something to do with your PSU?
> 
> This should also be asked of the members who own those cards: whether you have whine or not, which PSU do you run...
> 
> Other members have reported losing their coil whine (not just AMD owners) after they swapped out their PSU.
> 
> So it's worth looking into at least.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Changing the PSU won't do anything for pump whine.


----------



## BlackyMeow

Quote:


> Originally Posted by *Medusa666*
> 
> You guys who got the Sapphire Fury with the Tri-X cooler, how is the coil whine on your cards?
> 
> I bought two Fury Xs but returned them, as both had pump whine and coil whine, which was beyond irritating and couldn't be ignored, so I might get this card instead.


My Sapphire Fury Tri-X has some coil whine. But it's really not as loud as the coil whine on the Fury X. Don't know why, because that's essentially the same card. It doesn't bother me, even when my side panel is off.
Quote:


> Originally Posted by *mRYellow*
> 
> I tried all but i wasn't so lucky. Got artifacting in the desktop.
> Well, a partial unlock is better than nothing smile.gif


I actually just ran into an issue with my full unlock... It works without issues in games, but when I run Folding@home, the desktop freezes for 5 seconds every 10 seconds. I can't figure out why; I tried underclocking the card, and nothing works.

The card works great with only 4CUs unlocked though.


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> My Sapphire Fury Tri-X has some coil whine. But it's really not as loud as the coil whine on the Fury X. Don't know why, because that's essentially the same card. It doesn't bother me, even when my side panel is off.
> I actually just ran into an issue with my full unlock... It works without issues in games, but when I run Folding@home, the desktop freezes for 5 seconds every 10 seconds. I can't figure out why; I tried underclocking the card, and nothing works.
> 
> The card works great with only 4CUs unlocked though.


I guess that's the reason they locked that CU. It isn't as stable as you think, so it must be faulty.


----------



## BlackyMeow

Here are some 5760x1080 Eyefinity benchmarks with Fury, Fury with partial unlock, and Fury X.



EDIT : please note that "Fury X" means "actual Fury X". It's not a fully unlocked regular Fury.


----------



## Alastair

Quote:


> Originally Posted by *BlackyMeow*
> 
> Here are some 5760x1080 Eyefinity benchmarks with Fury, Fury with partial unlock, and Fury X.


To be honest, it doesn't look like that one core is completely stable. If it were me, I would just run partially unlocked rather than fully unlocked. But that's just me.


----------



## GorillaSceptre

Dammit AMD, just let Sapphire make an air-cooled Fury X for $599 and I'll buy it now.

Not everyone wants the CLC, but they do want the GPU. What a stupid business decision..


----------



## BlackyMeow

Quote:


> Originally Posted by *Alastair*
> 
> To be honest, it doesn't look like that one core is completely stable. If it were me, I would just run partially unlocked rather than fully unlocked. But that's just me.


I didn't include fully unlocked benchmarks since it's not completely stable. It did beat the Fury X though.


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> Here are some 5760x1080 Eyefinity benchmarks with Fury, Fury with partial unlock, and Fury X.


From those results it does seem that your Fury X mode has a faulty CU.

BTW, so far I'm stable with an OC of 1080 on core and 550 on mem.
Net result is a 7676 graphics score.

http://www.3dmark.com/fs/5797041


----------



## Medusa666

Quote:


> Originally Posted by *Ceadderman*
> 
> Might think have something to do with your PSU?
> 
> This should also be asked of the members who own those cards. If not which PSU do you run or if so which...
> 
> Other members have reported losing their coil whine (not just AMD owners) after they swapped out their PSU.
> 
> So it's worth looking into at least.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah ok, I see.

Thing is, I just got a new Corsair RM1000i (RMi series) and my 295X2 is dead silent, no coil whine, while both Fury Xs I got had very high coil whine.


----------



## BlackyMeow

Quote:


> Originally Posted by *mRYellow*
> 
> From those results it does seem that your Fury X mode has a faulty CU.


It's not a Fury X "mode", it's an actual Fury X that I'm sending back on monday.


----------



## mRYellow

Quote:


> Originally Posted by *BlackyMeow*
> 
> It's not a Fury X "mode", it's an actual Fury X that I'm sending back on monday.


My bad, I only noticed it after I made my post.


----------



## BlackyMeow

Quote:


> Originally Posted by *mRYellow*
> 
> My bad, only noticed it after i made my post.


But you know, my Fury X scores are weird... Maybe my CPU was doing stuff in the background.


----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Dammit AMD, just let Sapphire make an air-cooled Fury X for $599 and I'll buy it now.
> 
> Not everyone wants the CLC, but they do want the GPU. What a stupid business decision..


I wish manufacturers would just let us buy cards without coolers installed in the first place. It isn't nice having to buy a card and then take off an air cooler or CLC that will probably only be used once in its lifetime.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> I wish manufacturers would just let us buy cards without coolers installed in the first place. It isn't nice having to buy a card and then take off an air cooler or CLC that will probably only be used once in its lifetime.


Yup, but that would actually mean saving consumers money..

I'm hoping that Fury X sales are tanking, maybe then they'll wake up and give it to AIB's to manufacture.


----------



## Thoth420

Quote:


> Originally Posted by *Medusa666*
> 
> Yeah ok, I see.
> 
> Thing is I just got a new Corsair RM1000i, (RMi series) and my 295X2 is dead silent, no coil whine, both fury X's I got had very high coil whine.


Wait... a Corsair PSU that isn't junk? I tried the original RM; it had whine, and I hate those flat cables.


----------



## mRYellow

OK, card seems to be stable at 1100 on core and 550 on mem.
Firestrike 16238
Firestrike Extreme 7787


----------



## ozyo

Am I the only one who can't OC the mem?


----------



## rioja

Quote:


> Originally Posted by *Medusa666*
> 
> Yeah ok, I see.
> 
> Thing is I just got a new Corsair RM1000i, (RMi series) and my 295X2 is dead silent, no coil whine, both fury X's I got had very high coil whine.


So changing the PSU may solve the whine issue?


----------



## p4inkill3r

Quote:


> Originally Posted by *ozyo*
> 
> Am I the only one who can't OC the mem?


OC it to what degree?
I haven't seen many people able to run huge numbers on the memory, but I'd be willing to bet you could get at least something out of it.


----------



## p4inkill3r

Quote:


> Originally Posted by *rioja*
> 
> So changing the PSU may solve the whine issue?


If you have a cheap PSU, maybe. Coil noise isn't anything new, however, and many times goes away of its own accord.


----------



## Medusa666

Quote:


> Originally Posted by *rioja*
> 
> So changing the PSU may solve the whine issue?


No idea. Before this Corsair unit I had a LEPA G1600 and the 295X2 behaved exactly the same, so in my case, no, it made no difference (i.e. it did not get worse).


----------



## ozyo

Quote:


> Originally Posted by *p4inkill3r*
> 
> OC it to what degree?
> I haven't seen many people able to run huge numbers on the memory, but I'd be willing to bet you could get at least something out of it.


501MHz


----------



## Crisium

Quote:


> Originally Posted by *Medusa666*
> 
> You guys who got the Sapphire Fury with the Tri-X cooler, how is the coil whine on your cards?
> 
> I bought two Fury Xs but returned them, as both had pump whine and coil whine, which was beyond irritating and couldn't be ignored, so I might get this card instead.


I use a notoriously silent case, the Define R4, but I've never even heard my Sapphire Fury Tri-X in nearly a month of ownership.


----------



## sygnus21

Quote:


> Originally Posted by *rioja*
> 
> So changing the PSU may solve the whine issue?


If you know you have a cheap (quality-wise) power supply, maybe. But if you know your power supply is a quality one, you could be chasing ghosts. As has been stated, there are many things that could contribute to coil whine, from board design/layout to the power coming out of your outlet. And to be honest, you're hearing a lot of complaints about coil whine with these cards; does that mean everyone's power supply is the issue? I'm just saying.

I wouldn't waste my money on something I've no control over.

Anyway, my Sapphire Tri-X Fury was quiet. Unfortunately it has to be returned because it was defective (artifacting), so I just hope the replacement is just as quiet, coil-whine wise.

My two cents.


----------



## Ceadderman

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I wish manufacturers would just let us buy cards without coolers installed in the first place. It isn't nice having to buy a card and then take off an air cooler or CLC that will probably only be used once in its lifetime.
> 
> 
> 
> Yup, but that would actually mean saving consumers money..
> 
> I'm hoping that Fury X sales are tanking, maybe then they'll wake up and give it to AIB's to manufacture.
Click to expand...

Why does that myth persist?









Once again...

REFERENCE design means "FOR REFERENCE".

Limited numbers of reference cards are produced as a reference for the manufacturers who build and distribute the cards. By agreement, the tech can never be less than what is on the reference model, but that doesn't stop a manufacturer from improving upon the design, unless by agreement the manufacturer must leave the design alone and build the card as-is.

The only difference here is that AMD controls the HBM tech. Their only role in this regard is making sure that HBM is true to design and is properly implemented by their HBM manufacturer, who ships to the card manufacturers with AMD's approval.

Seriously, what did you think Reference meant anyway?









~Ceadder


----------



## looncraz

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Dammit AMD, just let Sapphire make an air-cooled Fury X for $599 and I'll buy it now.
> 
> Not everyone wants the CLC, but they do want the GPU. What a stupid business decision..


They just copied nVidia.


----------



## xer0h0ur

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Yup, but that would actually mean saving consumers money..
> 
> I'm hoping that Fury X sales are tanking, maybe then they'll wake up and give it to AIB's to manufacture.


They are selling out non-stop since they can't keep up with demand. Either way your wish can't happen. The dies are sent to SK Hynix for them to assemble onto the interposer along with the HBM. AIBs aren't equipped to handle that part of the process. The rest of the assembly, sure.

Edit: Not to mention AMD would have to remove the reference design lock on Fury X.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GorillaSceptre*
> 
> Yup, but that would actually mean saving consumers money..
> 
> I'm hoping that Fury X sales are tanking, maybe then they'll wake up and give it to AIB's to manufacture.
> 
> 
> 
> They are selling out non-stop since they can't keep up with demand. Either way your wish can't happen. The dies are sent to SK Hynix for them to assemble onto the interposer along with the HBM. AIBs aren't equipped to handle that part of the process. The rest of the assembly, sure.
> 
> Edit: Not to mention AMD would have to remove the reference design lock on Fury X.
Click to expand...

AMD has *NOTHING* related to Fury being manufactured within their own building. Nothing. The reference lock only applies to going below reference specs. AIBs, unless held to a stricter agreement than on previous reference designs, can make the cards better; they just cannot cut the design down.

Years back (HD 5000 series), XFX was allowed to drop one of the CrossFire contacts because they added performance to their reference design.

Hynix is likely having issues keeping up with demand, so the AIBs aren't able to produce cards at a level that keeps up either. After designing and cobbling together the reference cards for the AIBs to work from, AMD is, for all intents and purposes, out of the building business for everything unrelated to the HBM/GPU die.









That's the only thing AMD would be working on.









~Ceadder


----------



## xer0h0ur

Try learning how to read. I didn't say AMD was manufacturing anything.


----------



## Ceadderman

Sorry my bad, my comment was for the gent who posted about AMD manufacturing process and hoping they sell less to push them to get the AIBs more involved.

My phone simply pushed your reply together with his post when I was meaning to just grab his comment.









~Ceadder


----------



## Scorpion49

Well, here is a run at stock settings on the Fury with the new 6600K. Dat physics score doe.


----------



## Alastair

What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.


This again?

Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which covers mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do tiled resources natively on the chip.

That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.

Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual

Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too
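The breakdown above can be sketched as a quick lookup table. This is purely illustrative (my own summary of the post, not an official AMD/Nvidia API; the dict keys and helper name are made up):

```python
# Illustrative summary of the DX12 support situation described above.
# Groupings follow the post: GCN 1.1/1.2 -> feature level 12_0, resource
# binding Tier 3; GTX 900 series -> feature level 12_1, Tier 1 for the most part.
DX12_SUPPORT = {
    "R9 290/290X":    {"arch": "GCN 1.1",   "feature_level": "12_0", "binding_tier": 3},
    "R9 390/390X":    {"arch": "GCN 1.1",   "feature_level": "12_0", "binding_tier": 3},
    "R9 285/380":     {"arch": "GCN 1.2",   "feature_level": "12_0", "binding_tier": 3},
    "R9 Fury/Fury X": {"arch": "GCN 1.2",   "feature_level": "12_0", "binding_tier": 3},
    "GTX 900 series": {"arch": "Maxwell 2", "feature_level": "12_1", "binding_tier": 1},
}

def supports_dx12(card):
    """Every card in the table supports DX12; feature level only gates extras."""
    return card in DX12_SUPPORT

for card, info in DX12_SUPPORT.items():
    print(f"{card}: FL {info['feature_level']}, resource binding tier {info['binding_tier']}")
```

The point of the table is the same as the post's: feature level differs, but DX12 support itself doesn't.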


----------



## Alastair

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.
> 
> 
> 
> This again?
> 
> Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which are mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do Tiled Resources natively on the chip.
> 
> That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.
> 
> Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual
> 
> Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too

Yeah, sorry. I haven't really been following the DX game. I just know I made the jump to AMD back with HD5xxx when they were the first to the party with DX11 support. So tiled resources isn't a loss on AMD's side, since they can implement it in a different fashion? Sorry for bringing the subject up again sarge.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.
> 
> 
> 
> This again?
> 
> Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which are mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do Tiled Resources natively on the chip.
> 
> That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.
> 
> Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual
> 
> Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah sorry. I haven't really been following the DX game. I just know I made the jump to AMD back with HD5xxx when they were the first to the party with DX 11 support. So tiled resources isn't a loss on AMD's side? Cause they can implement it in a different fashion? Sorry for bringing the subject up again sarge.

It's ok man, I don't know a massive amount on the subject myself, but from what AMD has said it's not a big loss because they can use an alternate code path to achieve the same effect (I'm assuming at a slight performance hit).

And I read it wrong: AMD does have support for Tiled Resources with GCN 1.1 and 1.2.

There was something else too; I was reading a really good thread about it a while ago and now I can't find it.


----------



## Alastair

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.
> 
> 
> 
> This again?
> 
> Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which are mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do Tiled Resources natively on the chip.
> 
> That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.
> 
> Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual
> 
> Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah sorry. I haven't really been following the DX game. I just know I made the jump to AMD back with HD5xxx when they were the first to the party with DX 11 support. So tiled resources isn't a loss on AMD's side? Cause they can implement it in a different fashion? Sorry for bringing the subject up again sarge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's ok man, I don't know a massive amount on the subject myself but from what AMD has said it's not a big loss because they can use an alternate code path to achieve the same effect (I'm assuming at a slight performance hit)
> 
> and i read it wrong, AMD does have support for Tiled Resources with GCN 1.1 and 1.2.
> 
> There was something, i was reading a really good thread about it a while ago and now i can't find it

Well, if you do happen to find it please link me to it. Would love to give it a read. Thanks man.

Doesn't really matter for me anyways, as I'm already committed to two Furys.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.
> 
> 
> 
> This again?
> 
> Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which are mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do Tiled Resources natively on the chip.
> 
> That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.
> 
> Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual
> 
> Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah sorry. I haven't really been following the DX game. I just know I made the jump to AMD back with HD5xxx when they were the first to the party with DX 11 support. So tiled resources isn't a loss on AMD's side? Cause they can implement it in a different fashion? Sorry for bringing the subject up again sarge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's ok man, I don't know a massive amount on the subject myself but from what AMD has said it's not a big loss because they can use an alternate code path to achieve the same effect (I'm assuming at a slight performance hit)
> 
> and i read it wrong, AMD does have support for Tiled Resources with GCN 1.1 and 1.2.
> 
> There was something, i was reading a really good thread about it a while ago and now i can't find it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well if you do happen to find it please link me to it. Would love to give it a read. Thanks man.
> 
> Doesn't really matter for me anyways as I am already committed to two Fury's anyways.

I found it!!









Linky

Was very interesting to read and see what's what


----------



## GorillaSceptre

Quote:


> Originally Posted by *xer0h0ur*
> 
> They are selling out non-stop since they can't keep up with demand. Either way your wish can't happen. The dies are sent to SK Hynix for them to assemble onto the interposer along with the HBM. AIBs aren't equipped to handle that part of the process. The rest of the assembly, sure.
> 
> Edit: Not to mention AMD would have to remove the reference design lock on Fury X.


Quote:


> Originally Posted by *Ceadderman*
> 
> Sorry my bad, my comment was for the gent who posted about AMD manufacturing process and hoping they sell less to push them to get the AIBs more involved.
> 
> My phone simply pushed your reply together with his post when I was meaning to just grab his comment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah, I wasn't talking about the interposer assembly, nor did I mention AMD's manufacturing process.

AIB partners could easily make custom Fury Xs; the only one stopping that from happening is AMD. They are locking the Fury X down like a Titan; the problem is it's not one.

But if they can't even keep up with current demand, then I guess it doesn't matter either way.


----------



## BlackyMeow

Quote:


> Originally Posted by *Alastair*
> 
> Doesn't really matter for me anyways as I am already committed to two Fury's anyways.


Damn, my room gets super toasty even with one Fury. My case is exhausting hot (like really hot) air from three fans... I can't even imagine having two Furys x)


----------



## flopper

Still waiting for the Nano.
It's so small it's difficult to manufacture.


----------



## nickcnse

Hey guys, just loaded up my R9 Fury X. Quick question though: what should I use to hook it up to my 3x Asus VG236HE monitors? Do I need some sort of adapter to run from DVI-D to DisplayPort in order to run at 120Hz? Thanks everyone!


----------



## en9dmp

Anyone have any idea what has happened to the voltage control? It was seemingly imminent a month ago but there's been no news since... The card came out two months ago now; can't believe we're all still on stock voltage!


----------



## flopper

Quote:


> Originally Posted by *en9dmp*
> 
> Anyone have any idea what has happened to the voltage control? It was seemingly imminent a month ago but there's been no news since... The card came out 2 months ago now, can't believe we're all still on stock voltage!


You have a card that is an OC dream.
Be happy that you can't OC it.

/irony sarcasm switch off


----------



## BlackyMeow

Quote:


> Originally Posted by *flopper*
> 
> you have a card that is a OC dream.
> be happy that you cant OC it.
> 
> /Irony sarcastic such off


Actually I think I'm not even going to OC my Sapphire Fury that much. It's already factory OCed to 1040MHz, and powerful enough for anything I might want to do with it... I prefer having a completely silent card than having a louder card that performs 3% better.


----------



## huzzug

Quote:


> Originally Posted by *BlackyMeow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> you have a card that is a OC dream.
> be happy that you cant OC it.
> 
> /Irony sarcastic such off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Actually I think I'm not even going to OC my Sapphire Fury that much. It's already factory OCed to 1040MHz, and powerful enough for anything I might want to do with it... I prefer having a completely silent card than having a louder card that performs 3% better.

I think you were supposed to throw a fit & protest eating your vegetables tonight.


----------



## Alastair

So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!


Congrats unleash the awesomeness .


----------



## BlackyMeow

Quote:


> Originally Posted by *huzzug*
> 
> I think you were supposed to throw a fit & protest eating your vegetables tonight.


What do you mean? I don't understand (sorry, English isn't my first language).

Anyway, here's a family picture (6870, 7870, 7950, Fury):


Spoiler: Image


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!


----------



## Alastair

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of DX12 compliance does Fury have? I heard something that GCN cards don't have full DX12 compliance.
> 
> 
> 
> This again?
> 
> Alrighty, well AMD supports DX12 feature level 12_0 with GCN 1.1 and 1.2, which are mainly the R9 290/X, 390/X, 285, 380 and Fury/X cards, while Nvidia's GTX 900 series supports feature level 12_1 iirc, which means Nvidia can do Tiled Resources natively on the chip.
> 
> That said AMD supports Tier 3 of DX12 on GCN 1.2 while Nvidia only supports Tier 1 for the most part.
> 
> Now the feature levels don't really matter to the end consumer that much seeing as afaik AMD can just emulate that partition and carry on business as usual
> 
> Either way, doesn't matter, all GCN GPU's support DX12 and the GTX 900 series support it too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah sorry. I haven't really been following the DX game. I just know I made the jump to AMD back with HD5xxx when they were the first to the party with DX 11 support. So tiled resources isn't a loss on AMD's side? Cause they can implement it in a different fashion? Sorry for bringing the subject up again sarge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's ok man, I don't know a massive amount on the subject myself but from what AMD has said it's not a big loss because they can use an alternate code path to achieve the same effect (I'm assuming at a slight performance hit)
> 
> and i read it wrong, AMD does have support for Tiled Resources with GCN 1.1 and 1.2.
> 
> There was something, i was reading a really good thread about it a while ago and now i can't find it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well if you do happen to find it please link me to it. Would love to give it a read. Thanks man.
> 
> Doesn't really matter for me anyways as I am already committed to two Fury's anyways.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I found it!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Linky
> 
> Was very interesting to read and see what's what

Thanks Sarge! So the way I see it, AMD pretty much supports all of what is required for 12_0 and most of 12_1 as well. I really don't see much of a difference there between 12_0 and 12_1, and judging from what some of the features are called, it won't make a difference in visual quality. Which was a different story in the 11 vs 10 days back when I made the jump to AMD. +REP


----------



## p4inkill3r

Quote:


> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!


Very cool.


----------



## mRYellow

Congrats Alastair








Benoni has become so much more awesome


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!






That looks so freaking awesome. I'm sorry, but Nvidia has nothing on this... IMO as drivers start catching up and voltage is unlocked this card is going to be fantastic. Its already good, but it will be great. Just like most other AMD cards, just have to wait a bit for the value to increase. Good times.


----------



## ozyo

I can't overclock my memory at all.
When I click Apply a blue screen shows up.
Fury X / TriXX.
Any help?
I can downclock it btw.


----------



## looncraz

Quote:


> Originally Posted by *nickcnse*
> 
> Hey guy, just loaded up my r9 fury x. Quick question though, what should I use to hook it up to my 3x Asus VG236HE monitor? Do I need some sort of adapter to run from the dvi-d to the displayport in order to run at 120hz? Thanks everyone!


You need an active adapter.

http://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY

I think that is the most highly rated one. Pricey.

I think AMD really screwed the pooch by not having native DL-DVI. I'm just not interested in Fury products for my gaming PC due to that. The Nano may still find its way into my HTPC (which is also quite good for gaming), but now I'm leaning more towards just waiting and updating it to an AM4 platform next year, depending on what Carrizo APUs bring on the GPU front. If they can deliver about 7850-level performance, then I'm sold on an APU to rid my HTPC of a dGPU. It won't matter to me if the APU is $200+ if it can do that.
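For context on why an *active* adapter is needed at all: 1080p at 120Hz requires far more pixel clock than single-link DVI (which is all a passive DP-to-DVI dongle can carry) provides. A rough back-of-the-envelope sketch, assuming ~20% blanking overhead (real CVT/CVT-RB timings differ somewhat):

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Very rough pixel-clock estimate; actual video timings vary."""
    return width * height * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_DVI_MHZ = 165.0  # TMDS pixel-clock limit of one DVI link

clk = approx_pixel_clock_mhz(1920, 1080, 120)
print(f"~{clk:.0f} MHz needed; single-link DVI tops out at {SINGLE_LINK_DVI_MHZ:.0f} MHz")
# Well over one link's worth, hence dual-link DVI on the monitor side
# and an active DP-to-DL-DVI converter on the card side.
```

At 60Hz the same estimate lands under 165MHz, which is why cheap passive adapters are fine for ordinary monitors but not for 120Hz panels like the VG236HE.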


----------



## p4inkill3r

Quote:


> Originally Posted by *ozyo*
> 
> i cant overclock my memory at all
> when i click apply blue screen shows up
> fury x / TriXX
> any help
> 
> 
> 
> 
> 
> 
> 
> ?
> i can downclockit btw


Try using Afterburner?


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Congrats Alistair
> 
> 
> 
> 
> 
> 
> 
> 
> Benoni has become so much more awesome


Benoni was always awesome. It's had me for a while. But now... It just got better!


----------



## ozyo

Quote:


> Originally Posted by *p4inkill3r*
> 
> Try using Afterburner?


Can't move the slider in AB.


----------



## mRYellow

Quote:


> Originally Posted by *ozyo*
> 
> can't move slider in ab


Which drivers are you using? Also, try a different PCI-E slot.


----------



## Scorpion49

Quote:


> Originally Posted by *nickcnse*
> 
> Hey guy, just loaded up my r9 fury x. Quick question though, what should I use to hook it up to my 3x Asus VG236HE monitor? Do I need some sort of adapter to run from the dvi-d to the displayport in order to run at 120hz? Thanks everyone!


You're screwed, look forward to spending almost another GPU's worth on active adapters that support 120hz. Make sure you buy extras because at least half of them are dead out of the box. I played that game before, never again.


----------



## ozyo

Quote:


> Originally Posted by *mRYellow*
> 
> Which drivers are you using? Also, try a different PCI-E slot.


15.7
Second slot, no difference.


----------



## mRYellow

Quote:


> Originally Posted by *ozyo*
> 
> 15.7
> second slot no difference


There's a newer version out, 15.701.

How did you upgrade, from what card?


----------



## Medusa666

Quote:


> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!


Wow, that is so frikkin nice, can you please let me know what you think of the cards, first impressions, sound levels, performance, any coil whine?

Much jealous now, but happy for you, fine fine GPUs!


----------



## p4inkill3r

Quote:


> Originally Posted by *ozyo*
> 
> can't move slider in ab


You have to tick the 'extend official overclocking' box in settings and restart before you can manipulate the slider.


----------



## flopper

Quote:


> Originally Posted by *looncraz*
> 
> You need an active adapter.
> 
> http://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY
> 
> I think that is the most highly rated one. Pricey.
> 
> I think AMD really screwed the pooch by not having native DL-DVI. I'm just not interest in Fury products for my gaming PC due to that. The Nano may still find its way into my HTPC (which is also quite good for gaming), but now I'm leaning more towards just waiting and updating it to an AM4 platform next year, depending on what Carrizo APUs bring on the GPU front. If need about 7850-level of performance, then I'm sold for an APU to rid my HTPC of a dGPU. It won't matter to me if the APU is $200+ if it can do that.


The easier way: sell your old screens and buy new ones with DisplayPort. Adapters for 120Hz aren't worth it.
I'm using old DVI screens myself and I just buy new ones, as the future needs DisplayPort-capable 120Hz anyhow.
Recently got an Acer XG270 and it's amazing.
Now I wait for the Nano to be out to find out what to buy.


----------



## ozyo

Quote:


> Originally Posted by *mRYellow*
> 
> There's a newer version out 15.701.
> 
> How did you upgrade, from what card?


I'm on 15.701, plus a fresh Windows install.
Quote:


> Originally Posted by *p4inkill3r*
> 
> You have to tick the 'extend official overclocking' box in settings and restart before you can manipulate the slider.


Thx, it works.
1150/520








http://www.3dmark.com/fs/5810463


----------



## Thoth420

Well, seems I'm back to the drawing board on a chassis again. The H440 seems problematic for a vertically mounted cylinder reservoir that would be visible through the window. The only config I have seen so far in an H440 had it horizontal, and it took one of the SSD slots to fit.

I was thinking Fractal Define S since it has a pre drilled reservoir mount.


----------



## ENTERPRISE

Hey guys, not that I have seen anything but has there been any update pertaining to the X2 or what may be known as the 395X2 ?

Thanks guys.


----------



## mRYellow

Nice Ozyo, finally









BTW my OC isn't stable @ 1100 on core; W3 will hang. Not sure how much benefit there is in OC'ing the memory.


----------



## mRYellow

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Hey guys, not that I have seen anything but has there been any update pertaining to the X2 or what may be known as the 395X2 ?
> 
> Thanks guys.


It will be called Fury X2. It's been confirmed.


----------



## looncraz

Quote:


> Originally Posted by *flopper*
> 
> the easier way, sell your old screens buy new screens with displayport. adapters for 120hz not worth it.
> sell old buy new with DP.
> using old dvi myself and I just buy new ones as the future needs displayport capable 120hz anyhow.
> recently got acer xg270 and its amazing.
> now I wait for the Nano to be out to find out what to buy


Well, if you have the money









It is something that should not be required. DL-DVI is prolific and capable, AMD was beyond stupid for not natively supporting it.


----------



## Ceadderman

Quote:


> Originally Posted by *looncraz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *flopper*
> 
> the easier way, sell your old screens buy new screens with displayport. adapters for 120hz not worth it.
> sell old buy new with DP.
> using old dvi myself and I just buy new ones as the future needs displayport capable 120hz anyhow.
> recently got acer xg270 and its amazing.
> now I wait for the Nano to be out to find out what to buy
> 
> 
> 
> Well, if you have the money
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is something that should not be required. DL-DVI is prolific and capable, AMD was beyond stupid for not natively supporting it.

I believe they didn't natively support it because they can't. It has something to do with nVidipoopooinAMDsWheatiesitis.

Seriously though, more often than not, if AMD doesn't support something it's generally due to the competition throwing up roadblocks along the way. If there is some sort of license on a tech applied by nVidia you can be sure they will be watching.









~Ceadder


----------



## ENTERPRISE

Quote:


> Originally Posted by *mRYellow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hey guys, not that I have seen anything but has there been any update pertaining to the X2 or what may be known as the 395X2 ?
> 
> Thanks guys.
> 
> 
> 
> It will be called Fury X2. It's been confirmed.

Cheers, I just saw that as I was researching. I think they said something about this coming fall?


----------



## looncraz

Quote:


> Originally Posted by *Ceadderman*
> 
> I believe they didn't natively support it because they can't. It has something to do with nVidipoopooinAMDsWheatiesitis.
> 
> Seriously though, more often than not, if AMD doesn't support something it's generally due to the competition throwing up roadblocks along the way. If there is some sort of license on a tech applied by nVidia you can be sure they will be watching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


DVI was developed by the DDWG, a group created (and apparently no longer in existence) by Intel, SI, Compaq, Fujitsu, HP, IBM, and NEC.

DVI is so ubiquitous for two reasons:

1. It is a capable and flexible interface (allowing analog capabilities as well as different levels of digital capabilities).
2. It is royalty free.

AMD seriously had no excuses.

ASUS Strix R9 Fury, for example, has a DL-DVI port.


----------



## nickcnse

Thanks everyone for the quick and knowledgeable replies. Kind of sad, though: I thought I was finally upgrading my video card from the GTX 690 to something a bit newer. I guess I should have done a bit of research to avoid this hassle. It's going to cost me almost $300 to use my old monitors!


----------



## Scorpion49

Quote:


> Originally Posted by *nickcnse*
> 
> Thanks everyone for the quick and knowledgeable replies. Kind of sad, thought I was finally upgrading my video card from the gtx 690 to something a bit newer. I guess I should have done a bit of research in order to avoid this hassle. Its going to cost me almost $300 to use my old monitors!


Buy them from amazon or somewhere you can return them easily and get an extra if you can afford it, you'll thank me later. Also double check reviews for people who have used them for similar applications, there are a lot out there and not all are created equal. I had to get some for a triple monitor setup before and out of 5 I bought 3 worked out of the box and one of those had intermittent display shutdowns.


----------



## Dupl3xxx

Quote:


> Originally Posted by *bonami2*
> 
> Just made a post about my 7950 and gta v vram and ram usage
> 
> I pop 2.5gb ram usage from pushing msaa 2x to 4x with my crossfire 7950 in gta v at 5760x1080
> 
> So im thinking that the fury x would be having a hard time in those case ahah with only 4gb
> 
> R9 390x 8gb are the perfect gpu for crossfire currently


I'm running GTA V at 5440x1600, or if you'd like, more than 4K (8.7MP vs 8.3MP) with 2x MSAA. I get from the high 50s to just under 100 when playing online. I'm having a blast, and I run out of GPU power before I run out of VRAM, assuming the tell-tale sign is stuttering and generally uneven gameplay. My Fury X plays the game smoother than my 7970 did at 2560x1600 (4.1MP) with almost everything set to the lowest it could go except textures.
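The megapixel comparison above checks out; a quick sanity check (plain arithmetic, nothing card-specific):

```python
def megapixels(w, h):
    """Resolution in megapixels (millions of pixels)."""
    return w * h / 1e6

print(f"5440x1600 = {megapixels(5440, 1600):.1f} MP")  # triple-wide setup -> 8.7 MP
print(f"3840x2160 = {megapixels(3840, 2160):.1f} MP")  # 4K UHD -> 8.3 MP
print(f"2560x1600 = {megapixels(2560, 1600):.1f} MP")  # the old 7970's res -> 4.1 MP
```

So 5440x1600 really does push slightly more pixels per frame than 4K UHD, and over twice as many as 2560x1600.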


----------



## sygnus21

Quote:


> Originally Posted by *looncraz*
> 
> DVI was developed by DDWG, a group created (and no longer apparently in existence) by Intel, SI, Compaq, Fujitsu, HP, IBM, and NEC.
> 
> DVI is so ubiquitous for to reasons:
> 
> 1. It is a capable and flexible interface (allowing analog capabilities as well as different levels of digital capabilities).
> 2. It is royalty free.
> 
> AMD seriously had no excuses.
> 
> ASUS Strix R9 Fury, for example, has a DL-DVI port.


Really? If you're using a high-end card like this you probably aren't looking at DVI, since these cards are also designed to run multiple monitors, which is better suited to DisplayPort as opposed to DVI or even HDMI. Notice there is only one HDMI connection on these cards.

And you can always get an HDMI-to-DVI cable if you really need one (pay no attention to that ridiculous price, you can get them a lot cheaper). Or you can get the Asus R9 Fury Strix, which does have a DVI port.

My two cents.


----------



## Scorpion49

Quote:


> Originally Posted by *sygnus21*
> 
> Really If you're using a high end card like this you probably aren't looking at DVI since these cards are also designed to run multiple monitors, which is better suited to DisplayPort as opposed to DVI or even HDMI. Notice there is only one HDMI connection on these cards.
> 
> And you can always get an HDMI to DVI cable if you really need one (pay no attention to that ridiculous price, you can get them a lot cheaper).
> 
> My two cents.


Too bad many 120Hz or 1440p/1600p monitors people still have, which are perfectly good to use for many years to come, don't have DP or HDMI that supports the refresh rate/resolution, right? Screw those guys. When I was in between monitors with my Fury I wanted to use my VG236HE, but I couldn't thanks to the lack of DVI ports. You can justify it any way you want; it doesn't mean the large segment of the market that still uses DVI monitors will agree with you.


----------



## sygnus21

Quote:


> Originally Posted by *Scorpion49*
> 
> Too bad many 120hz or 1440p/1600p monitors people still have that are perfectly good to use for many years to come don't have DP or HDMI that supports the refresh rate/resolution, right? Screw those guys. When I was in between monitors with my Fury I wanted to use my VG236HE, but I couldn't thanks to the lack of DVI ports. You can justify it any way you want, it doesn't mean a large segment of the market that still use DVI monitors will agree with you.


I'm not sure why you're jumping on me. I'm not necessarily agreeing with the move one way or the other, I'm simply stating what I "think" AMD and some partners are/were thinking. That said, and as I pointed out, Asus DOES make a Fury with a DVI port. How effective it is I can't say.

Yes, it does suck, but technology sometimes tends to move forward while leaving legacy in the past. That said, hopefully other Fury vendors besides Asus will come along and fill the void; perhaps AMD may even allow it on the X series. The card is still relatively new.

At any rate, I'm simply stating an opinion; I didn't build the cards


----------



## looncraz

Quote:


> Originally Posted by *sygnus21*
> 
> Really If you're using a high end card like this you probably aren't looking at DVI since these cards are also designed to run multiple monitors, which is better suited to DisplayPort as opposed to DVI or even HDMI. Notice there is only one HDMI connection on these cards.
> 
> And you can always get an HDMI to DVI cable if you really need one (pay no attention to that ridiculous price, you can get them a lot cheaper). Or you can get the Asus R9 Fury Strix which does have a DVI port.
> 
> My two cents.


Anytime you create a higher burden of ownership, you reduce ownership. Perhaps AMD wanted to ensure the Fury X was a niche product due to yield issues?

I love AMD, but I wouldn't buy a Fury X; it's simply too expensive. The Nano looks interesting, even without DVI. And, probably, in a year or so I'll end up buying a used Asus Strix Fury for $250


----------



## sygnus21

I'm not buying a Fury X either. I just went with the "regular" Fury - Sapphire's Tri-X R9 Fury. Unfortunately I've had to return it to Newegg due to artifacting issues. Hopefully I get my replacement this week.

Oh, and good luck finding the Asus Fury for $250. Probably by the time that happens we'll be on the second going on third generation of Fury


----------



## looncraz

Quote:


> Originally Posted by *sygnus21*
> 
> I'm not buying a Fury X either. I just went with the "regular" Fury - Sapphire's Tri-X R9 Fury. Unfortunately I've had to return it to Newegg due to artifacting issues. Hopefully I get my replacement this week.
> 
> Oh, and good luck finding the Asus Fury for $250. Probably by the time that happens we'll be on the second going on third generation of Fury


I bought the R9 290 a year after release for less ;-)

People usually underestimate the depreciation rate on things.


----------



## ozyo

@mRYellow
I'm very lucky








Stable @ 1150 (artifacts start @ 1152MHz).
Power limit @ +0; it makes no difference if I increase it.
Memory OC increases min FPS.


----------



## mRYellow

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Cheers I just saw that as I was researching, I think they said something about this coming fall ?


It should be soon. My guess is around October.
Quote:


> Originally Posted by *ozyo*
> 
> @mRYellow
> I'm very lucky
> 
> 
> 
> 
> 
> 
> 
> 
> stable @ 1150 "artifacts start @ 1152mhz"
> power limit @ +0 it does not make any difference if i increase it
> +
> memory OC increase min fps


I don't get artifacting either, but I would get random driver resets.


----------



## Greenland

Any news on unlocking voltage?


----------



## sygnus21

Quote:


> Originally Posted by *looncraz*
> 
> I bought the R9 290 a year after release for less ;-)
> 
> People usually underestimate the depreciation rate on things.


Yeah, someone will sell you a "used" one for that price.









Anyway good luck


----------



## Scorpion49

Well, I think the Fury just died. I was playing a game and it crashed; when I booted back up I just get artifacts after 10-15 seconds. I also noticed it was extremely hot, so I think the fans were off while the game was running. It was running at stock speeds, no modded BIOS or even Afterburner installed. Looked like this (someone else's pic):


----------



## sygnus21

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I think the Fury just died. I was playing a game and it crashed, when I booted back up I just get artifacts after 10-15 seconds. I also notice it was extremely hot, so I think the fans were off while the game was running. It was running at stock speeds, no modded BIOS or even afterburner installed. Looked like this (someone elses pic):


Oh god, don't let this be a trend







Here's mine...





Sapphire Tri-X R9 Fury. No mods, no tweaks, no overclocks. Not even my system is overclocked. Incidentally, my card never acted up during gaming or benchmarking, strictly during desktop use or internet browsing.







.

BTW, the popup red alert at the end of the video is just my AV complaining about finding tracking cookies. It was in the middle of a routine scan when the artifacting issue occurred... again.

Anyway, the card was sent back to Newegg for replacement.


----------



## Alastair

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So um. Something happened. There was a package on my bed. It arrived a week earlier than expected. Inside. Pure awesome!
> 
> 
> 
> 
> 
> 
> 
> Wow, that is so frikkin nice, can you please let me know what you think of the cards, first impressions, sound levels, performance, any coil whine?
> 
> Much jealous now, but happy for you, fine fine GPUs!

I shall let you know exactly how they both perform! It's just gonna be a little while because I have to drain and redo my loop. So yeah.


----------



## Alastair

Hey, aren't we supposed to get Never Settle bundles with our GPUs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I think the Fury just died. I was playing a game and it crashed, when I booted back up I just get artifacts after 10-15 seconds. I also notice it was extremely hot, so I think the fans were off while the game was running. It was running at stock speeds, no modded BIOS or even afterburner installed. Looked like this (someone elses pic):


Wait... didn't you have artifacting issues with Hawaii as well, if memory serves?









Quote:


> Originally Posted by *Alastair*
> 
> Hey aren't we supposed to get never settle bundles with our GPU's?


Not that I know of; they haven't announced the new Never Settle bundle yet AFAIK.

I know XFX had Dirt Rally codes with their cards at Best Buy, but that's all I can think of.


----------



## Alastair

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Well, I think the Fury just died. I was playing a game and it crashed, when I booted back up I just get artifacts after 10-15 seconds. I also notice it was extremely hot, so I think the fans were off while the game was running. It was running at stock speeds, no modded BIOS or even afterburner installed. Looked like this (someone elses pic):
> 
> 
> 
> 
> 
> Wait....didn't you have artifacting issues with hawaii as well if memory serves?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Hey aren't we supposed to get never settle bundles with our GPU's?
> 
> 
> Not that i know of, they haven't announced the new Never Settle bundle yet afaik.
> 
> I know XFX had Dirt Rally codes with their cards at Best Buy but thats all i can think of

oh. DAMN. Sad face.


----------



## Scorpion49

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wait....didn't you have artifacting issues with hawaii as well if memory serves?


Maybe at one point; I've had probably 12-15 Hawaii cards since they were new.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Wait....didn't you have artifacting issues with hawaii as well if memory serves?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe at one point, I had probably 12-15 Hawaii cards since they were new.

Yeah, this was quite some time ago, first-gen cards I believe... that's straining my memory a bit.


----------



## Scorpion49

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah this was quite some time ago, first gen cards i believe......that's straining my memory a bit


I go through an inordinate amount of video cards and hardware; it's basically my hobby, so I end up reselling quickly and re-using the same bit of money to try other things. One thing I have noticed is AMD has gotten a lot better with DOA hardware; during the Tahiti days it wasn't uncommon for me to come across several dead cards in a row straight out of the box, and reference-board 7950s were especially terrible.


----------



## mRYellow

I've set a custom fan profile. Temp never goes above 57°C and fan speed is about 38%. Card is still very quiet.
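For anyone curious what a custom fan profile like this actually does between the points you set: it's just a list of (temperature, fan %) breakpoints with linear interpolation between them. A minimal sketch in Python; the breakpoints below are made up for illustration, not mRYellow's actual curve:

```python
# Illustrative fan curve: (temp °C, fan %) breakpoints, linearly interpolated.
# These breakpoints are invented for the example, not any card's defaults.
CURVE = [(30, 20), (50, 30), (57, 38), (70, 60), (85, 100)]

def fan_speed(temp_c):
    """Return fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between adjacent breakpoints.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last breakpoint
```

With these example breakpoints, 57°C maps to 38% fan, which is roughly the steady state described above.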


----------



## xer0h0ur

Are you guys experiencing artifacting by chance on Windows 10? I have mostly seen those issues popping up with Windows 10 users. Hell, Nvidia's Windows 10 driver was burning out Alienware laptop displays.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you guys experiencing artifacting by chance on Windows 10? I have mostly seen those issues popping up with Windows 10 users. Hell, Nvidia's Windows 10 driver was burning out Alienware laptop displays.


Any specific games? I've noticed some weird issues with Warframe, but most of the games I've tried since swapping over appear to be fine.


----------



## xer0h0ur

I was referring to them showing artifacting outside of gameplay.


----------



## WheelZ0713

Hey All.

Got my new toy, Sapphire R9 Fury this week.

This may be a stupid question but:
Obviously I've been playing with some OCing and got some good results up to 1055 MHz. As soon as I go to 1060 MHz I seem to start shedding points in benchmarking. It drops about 100 points off both Heaven and Firestrike benches, though it doesn't seem to get to a temp worthy of throttling.

Any idea on why this might be?


----------



## kayan

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you guys experiencing artifacting by chance on Windows 10? I have mostly seen those issues popping up with Windows 10 users. Hell, Nvidia's Windows 10 driver was burning out Alienware laptop displays.


I've actually had fewer issues with artifacts and such on desktop in W10 than I did on W8.1.

About the same issues in games.


----------



## xer0h0ur

I was talking about Fury / Fury X users. You had mentioned before that you were experiencing issues with your 295X2.


----------



## sygnus21

Quote:


> Originally Posted by *xer0h0ur*
> 
> Are you guys experiencing artifacting by chance on Windows 10? I have mostly seen those issues popping up with Windows 10 users. Hell, Nvidia's Windows 10 driver was burning out Alienware laptop displays.


Yes. My posted video was on Windows 10. Yes, Sapphire R9 Fury.


----------



## mRYellow

No artifacting here in Win10.


----------



## Alastair

You guys seen the AMD product slides for the Fury Nano? Apparently it's a full-on XT die, all 4096 SPs and 256 TMUs. Yet: 1) how do they get it to 175 W, and 2) think some of us normal Fury owners are gonna suddenly get an onset of buyer's remorse?
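On question 1, the usual trick is binning plus aggressive clock/voltage management: CMOS dynamic power scales roughly with frequency times voltage squared, so a modest downclock and undervolt cut power fast. A back-of-envelope sketch; the ~275 W Fury X board power and the Nano clock/voltage point are illustrative guesses, not AMD's numbers:

```python
def scaled_power(p0, f0, v0, f1, v1):
    """Rough dynamic-power scaling: P proportional to f * V^2 (ignores static leakage)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Illustrative numbers only: ~275 W Fury X at 1050 MHz / 1.20 V,
# guessed Nano operating point of 900 MHz / 1.05 V.
nano_estimate = scaled_power(275.0, f0=1050, v0=1.20, f1=900, v1=1.05)
print(round(nano_estimate))  # about 180 W under these guesses
```

Under those guessed numbers a full Fiji die lands in 175 W territory without cutting any shaders, which is consistent with what the slides claim.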


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alastair*
> 
> You guys seen the AMD product slides for Fury Nano? Apparently it's a full on XT die. All 4096 and 256 TMU's. Yet 1, how do the get it to 175w and 2, think some of us normal Fury owners are gonna suddenly get an onset if buyers remorse?


Even if it is the full-on 4096 SPs, it will be lower clocked for one, and second, AIB Fury cards will have better cooling, allowing for OC headroom...

There is always a trade off


----------



## BlackyMeow

Quote:


> Originally Posted by *Alastair*
> 
> You guys seen the AMD product slides for Fury Nano? Apparently it's a full on XT die. All 4096 and 256 TMU's. Yet 1, how do the get it to 175w and 2, think some of us normal Fury owners are gonna suddenly get an onset if buyers remorse?


I think it's going to be downclocked quite a bit. It won't have any OC headroom (only one 6- or 8-pin power connector). Also, it won't be as silent as the Sapphire Fury. No remorse here.

Also, I have a big case with very good airflow, so the Nano makes no sense for me.
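The single-connector point also caps things on paper: the PCIe CEM spec allows roughly 75 W from the slot, 75 W from a 6-pin, and 150 W from an 8-pin. A quick sketch of what each configuration permits (the per-connector limits are the spec values; the function itself is just illustrative):

```python
# Spec-compliant power limits (PCIe CEM): slot 75 W, 6-pin 75 W, 8-pin 150 W.
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}
SLOT_W = 75

def board_power_budget(connectors):
    """Max spec-compliant board power for a card with the given aux power connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["6-pin"]))           # 150 W total
print(board_power_budget(["8-pin"]))           # 225 W total, enough for a 175 W card
print(board_power_budget(["8-pin", "8-pin"]))  # 375 W total (the Fury X's two 8-pins)
```

So with one connector the power limit slider has very little room to go, which lines up with the no-headroom expectation.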


----------



## antonis21

New Fury owner


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> I was referring to them showing artifacting outside of gameplay.


Yes, I have, every so often. Going in and changing the display resolution, then applying it back, fixes it.


----------



## p4inkill3r

Quote:


> Originally Posted by *antonis21*
> 
> New Fury owner


Welcome aboard!


----------



## antonis21

Some more photos!


----------



## BlackyMeow

Quote:


> Originally Posted by *antonis21*
> 
> New Fury owner


Why did you buy the Asus one? The Sapphire is quieter, cheaper, and runs cooler...


----------



## Jflisk

Quote:


> Originally Posted by *antonis21*
> 
> Some more photos!


That's a nice-looking card.


----------



## p4inkill3r

Quote:


> Originally Posted by *BlackyMeow*
> 
> Why did you buy the Asus one ? Sapphire's quieter, cheaper and runs cooler...


Maybe he likes ASUS?


----------



## Sgt Bilko

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BlackyMeow*
> 
> Why did you buy the Asus one ? Sapphire's quieter, cheaper and runs cooler...
> 
> 
> 
> Maybe he likes ASUS?

Or maybe he likes red/black?

Lots of different reasons... I just wish we had an XFX version already.


----------



## MiladEd

Quote:


> Originally Posted by *BlackyMeow*
> 
> Why did you buy the Asus one ? Sapphire's quieter, cheaper and runs cooler...


Maybe he needed a DVI?


----------



## BlackyMeow

Quote:


> Originally Posted by *MiladEd*
> 
> Maybe he needed a DVI?


FYI, Sapphire includes a DP to DVI adapter.


----------



## antonis21

Because I prefer ASUS quality/design. First 3DMark run with the stock Fury and a 3770K at 4.6 GHz: 12,500.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BlackyMeow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MiladEd*
> 
> Maybe he needed a DVI?
> 
> 
> 
> FYI, Sapphire includes a DP to DVI adapter.

It doesn't always matter that much. For example, here in Aus there are a number of reasons to go for an Asus card over a Sapphire one:

RMA'ing here is pretty painless with Asus, Sapphire not so much, and it's a 3-year warranty with Asus vs. 2 years with Sapphire.

That said, it's always down to an individual's choices. Looking at their sig rig I don't think they do, but if they did have a Qnix monitor like myself, then the DP-DVI adaptor won't allow you to overclock the monitor.









As before, there are many different reasons to choose one AIB over another, and considering there are only 2 AIB Fury cards on the market atm, I don't see how going with the Strix would be a "bad" choice.


----------



## en9dmp

Can someone in the know drop w1zzard a pm and find out what the frig is happening with the voltage control?


----------



## dir_d

Quote:


> Originally Posted by *en9dmp*
> 
> Can someone in the know drop w1zzard a pm and find out what the frig is happening with the voltage control?


Go to his site and ask him.


----------



## BusterOddo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Doesn't always matter that much, for example here in Aus there are a number of reasons to go for an Asus card over a Sapphire one
> 
> RMA'ing here is pretty painless with Asus, Sapphire not so much, 3 year warranty with Asus, 2 years with Sapphire.
> 
> That said it's always down to an individual's choices and looking at their sig rig i don't think they do but if they did *have a Qnix monitor like myself then the DP-DVI adaptor won't allow you to overclock the monitor
> 
> 
> 
> 
> 
> 
> 
> *
> 
> as before there are many different reasons to choose one AIB over another and considering there are only 2 AIB Fury cards out on the market atm i don't see how going with the strix would be a "bad" choice


I just received my Overlord 2795QHD 2 days ago. I had not heard this before. I had heard of potential frame skipping with an adapter, but not "you can't overclock the monitor with an adapter".


----------



## rv8000

Quote:


> Originally Posted by *en9dmp*
> 
> Can someone in the know drop w1zzard a pm and find out what the frig is happening with the voltage control?


Last I asked, about 5 days ago: not much progress since the last beta.


----------



## Scorpion49

Does anyone know if it's possible to make FreeSync stop freesyncing when you alt-tab out of a game? Really annoying behavior that I didn't see with G-Sync: if my game is at 47 fps and I alt-tab out, my desktop is now stuck at 47 fps and all jerky/stuttery.


----------



## Thoth420

Quote:


> Originally Posted by *Scorpion49*
> 
> Does anyone know if its possible to make Freesync stop freesyncing when you alt-tab out of a game? Really annoying behavior that I didn't see with Gsync, if my game is at 47fps and I alt-tab out my desktop is now stuck at 47fps and all jerky/stuttery.


Stuff like this makes me want to avoid FreeSync, at least for now. I have plenty of G-Sync experience over various panels; overall a nice feature, but pricey. Hoping AMD works out the issues and limitations over time. For now my single Fury X won't be pushing far past 60 fps anyway, leaving me wondering what panel to buy: a cheap Korean with a monitor arm, or maybe the Dell 8 ms 60 Hz until something really interesting comes out. I know I can't go back to 1080p or TN, and I see no need for 4K in a monitor unless it's the size of a TV.


----------



## Ceadderman

Oh geez, just shut down the game for now. It'll get worked out, I'm sure. I rarely if ever Alt+Tab out of a game on my system; I simply save and quit.

Thank you for letting the rest of us know your issue, but that shouldn't stop anyone from pulling the trigger.










~Ceadder


----------



## Scorpion49

Quote:


> Originally Posted by *Ceadderman*
> 
> Oh geez, just shut down the game for now. It'll get worked out I am sure. I rarely if ever Alt+tab out of a game on my system. I simply save and quit.
> 
> Thank you for letting the rest of us know your issue. But that shouldn't stop anyone from pulling the trigger.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


What if I don't want to shut down my game? Why should I have to? One game in particular I play takes a long time to start up, and I'm frequently alt-tabbing to change my TS channel or do other things related to the game itself, like looking at map strats in a quick window of time before a team battle starts so the caller can give instructions. You're saying I should accommodate AMD by shutting down my game, when they should be accommodating me so I want to use their products.


----------



## ENTERPRISE

Impressive Fury cards from the images I have seen. I'm itching for one, but I really must hold out and wait for the Fury X2 lol.


----------



## hyp36rmax

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Impressing Fury cards from the images I have seen, I itching for one but I really must hold out and wait for the Fury X2 lol.


Same.


----------



## ENTERPRISE

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Impressing Fury cards from the images I have seen, I itching for one but I really must hold out and wait for the Fury X2 lol.
> 
> 
> 
> Same.

Glad I am not alone. Fall is not long to wait really, and getting any other card would give me remorse when the X2 is released. I have an ITX-based motherboard build (love it); however, as it comes with one PCI-E slot, if one wants XFIRE-like performance from one card then you have to wait for the dual-GPU cards. I am confident, however, that once I get my hands on the Fury X2, upgrading is unlikely to happen in the near future; I'm not too fussed about FinFET and HBM2, as IMO they're not going to give a quantum leap in my gaming performance.


----------



## xer0h0ur

Quote:


> Originally Posted by *Scorpion49*
> 
> What if I don't want to shut down my game? Why should I have to? One in particular I play takes a long time to start up, and I'm frequently alt-tabbing to change my TS channel or other things related to the game itself, like looking at map strats in a quick window of time before a team battle starts so the caller can give instructions and such. You're saying I should accommodate AMD by shutting down my game, when they should be accommodating me so I want to use their products.


I think he's leaning more towards saying don't expect things games/drivers aren't designed to do to just always work. I don't expect to be able to Alt+Tab out of every game without ever running into issues. I Alt+Tab frequently out of CS:GO, but occasionally it causes weird resolution issues like changing the font-size scale on my Steam menus and windows; exiting and opening again makes the issues disappear. Either way, if you're that bothered by it then submit a bug report about the issue you're experiencing with FreeSync. Considering Intel is backing FreeSync and AMD is pushing it, I wouldn't be surprised if they took the issue seriously and/or were already working on a fix.


----------



## Jflisk

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Glad I am not alone, Fall is not long to wait really and getting any other card will give me remorse when it is released. I have an ITX Based motherboard build (Love it) however as it comes with one PCI-E slot then if one wants XFIRE like performance from one card then you have to wait for the dual GPU's. I am confident however that once I get my hands on the Fury X2 that upgrading is unlikely to happen in the near future, not too fussed by Finfet and HBM2 as it IMO is not going to give a quantum leap in my gaming performance.


Makes you wonder if they're going to go with one 240 mm rad for 2 GPUs instead of the one 120 mm for 1.

Anyone know if anything is being done to Afterburner to get the second Fury X's RAM to overclock without powering down the first one? Thanks in advance.


----------



## ENTERPRISE

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Glad I am not alone, Fall is not long to wait really and getting any other card will give me remorse when it is released. I have an ITX Based motherboard build (Love it) however as it comes with one PCI-E slot then if one wants XFIRE like performance from one card then you have to wait for the dual GPU's. I am confident however that once I get my hands on the Fury X2 that upgrading is unlikely to happen in the near future, not too fussed by Finfet and HBM2 as it IMO is not going to give a quantum leap in my gaming performance.
> 
> 
> 
> Makes you wonder if there going to go with 1x 240mm rad for 2 GPUS instead of the 1x120mm for 1 .
> 
> Anyone know if there is anything being done to afterburner to get the second FURY X to Ram over clock without powering down the first one. Thanks in advance

I hope not, otherwise I will need a different case, but I find it unlikely. The 295X2 runs hotter than a Fury X2 will, and it had a slim 120 mm rad which, in a push-pull configuration, cooled it well even with a fairly decent OC.


----------



## xer0h0ur

Quote:


> Originally Posted by *ENTERPRISE*
> 
> I hope not otherwise I will need a different case but I find it unlikely. The 295X2 will run hotter than an Fury X2 and it had a slim 120 RAD which with a push pull configuration cooled it well even with a fairly decent OC.


Yeah, but the 295X2's Asetek AIO cooler was only cooling the GPUs. This one would have the GPUs + HBM generating heat, and that is assuming they skip actively cooling the VRMs; otherwise you're looking at two GPUs, each GPU's HBM, and the VRMs.


----------



## Jflisk

Quote:


> Originally Posted by *ENTERPRISE*
> 
> I hope not otherwise I will need a different case but I find it unlikely. The 295X2 will run hotter than an Fury X2 and it had a slim 120 RAD which with a push pull configuration cooled it well even with a fairly decent OC.


I would have to go mess with my fan profile in CCC to give a good guesstimate. I think before I changed it I was seeing 58°C after an hour of BFH. Then I tweaked the fan profile to 100% at 52°C and have not seen above 51°C since. I will say the fans are extremely quiet. This is with the Fury X; the second one hits 49°C with the same setup as the first card. Both cards are set at 1100 MHz on the GPU. I hope you're right about the single 120 mm rad; I would like to go back to trifire.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> What if I don't want to shut down my game? Why should I have to? One in particular I play takes a long time to start up, and I'm frequently alt-tabbing to change my TS channel or other things related to the game itself, like looking at map strats in a quick window of time before a team battle starts so the caller can give instructions and such. You're saying I should accommodate AMD by shutting down my game, when they should be accommodating me so I want to use their products.
> 
> 
> 
> I think he's leaning more towards trying to say don't expect things games/drivers aren't designed to do to just always work. I don't expect to be able to alt+tab out of every game without ever running into issues. I alt+tab frequently out of CS:GO but occasionally it causes weird resolution issues like changing font size scale on my steam menus and windows. Exiting and opening again makes the issues disappear. Either way if you're that bothered by it then submit a bug report about the issue you're experiencing with Freesync. Considering how Intel is backing Freesync and AMD is pushing Freesync I wouldn't be surprised if they took the issue seriously and/or were already working on such a fix.

This.

I wasn't being mean about it. Just saying that it's something that will likely get fixed.

Although I believe the issue is OS-related and not FreeSync-related. I have friends with nVidia cards who complain about their Alt+Tab issues. My bro's system will keep the X-Ray engine running even though he's closed Stalker.

The best advice I can give is to just close down your game, go into Task Manager, find the offender (usually near the bottom of the list), and shut it down manually.

Now Skorp's issue complicates things. He wants to change his channel and do other game-related things. For that I have no answer short of what I'd already suggested; shutting it down in TM would affect visuals when he gets back to the game. So really AMD needs to get with M$oft to figure out a solution to that.









~Ceadder


----------



## hyp36rmax

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Glad I am not alone, Fall is not long to wait really and getting any other card will give me remorse when it is released. I have an ITX Based motherboard build (Love it) however as it comes with one PCI-E slot then if one wants XFIRE like performance from one card then you have to wait for the dual GPU's. I am confident however that once I get my hands on the Fury X2 that upgrading is unlikely to happen in the near future, not too fussed by Finfet and HBM2 as it IMO is not going to give a quantum leap in my gaming performance.


I've been wanting two of the Fury X2s once they are released for my MATX build. Ultimate 4K! Haha. We'll see; I'm very interested in what the price point for these will be. I'm taking a wild guess of about $1499 for one.


----------



## Wage

Patiently awaiting the Fury X2 so I can buy two for quadfire -_-


----------



## Orthello

The Nano does look nice. Couple that with AMD doing great in DX12 so far (early days) and it looks like Fiji will be a late bloomer; I think they will sell all the Nanos they can make.

See that Witcher 3 performance review at HardOCP too, showing the Fury outclassing the GTX 980 by 9% with the Nvidia GameWorks features (HairWorks etc.) on, and also off by a good amount. It wasn't long ago that Nvidia dominated that game. OC vs. OC may reverse that a bit, but still. Speaking of that, where is voltage control at? Any updates?

Edit: just saw the price (WCCFtech) on the Nano, $650 USD... jeez, they missed the mark there. I don't want to be negative, but if that's true it tells me they have not got many to sell, so if the price hurts sales it doesn't matter.


----------



## WheelZ0713

Quote:


> Originally Posted by *Orthello*
> 
> The nano does look nice .. couple that with amd doing great in dx12 so far (early days) looks like fiji will be a late bloomer , think they will sell all the nanos they can make.
> 
> See that witcher 3 performance review at hardocp too , showing the Fury out classing the gtx980 by 9% with the nvidia gameworks features hairworks etc on and also off by a good amount. Wasn't long ago nvidia dominated that game. OC vs OC may reverse that a bit but still .. speaking of that where is voltage control at any updates ?


I too am curious about this. I'm wondering when I'm going to be able to squeeze some more out of my Fury; 1050 MHz is pretty much topping it out at this point.


----------



## ozyo




----------



## p4inkill3r

Quote:


> Originally Posted by *ozyo*


Beautiful, powerful, and completely overpriced.


----------



## xer0h0ur

Dear lord AMD is still full of potatoes. $650 for the Nano? *facepalm*


----------



## Sgt Bilko

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Glad I am not alone, Fall is not long to wait really and getting any other card will give me remorse when it is released. I have an ITX Based motherboard build (Love it) however as it comes with one PCI-E slot then if one wants XFIRE like performance from one card then you have to wait for the dual GPU's. I am confident however that once I get my hands on the Fury X2 that upgrading is unlikely to happen in the near future, not too fussed by Finfet and HBM2 as it IMO is not going to give a quantum leap in my gaming performance.
> 
> 
> 
> I've been wanting two of the Fury X2's once they are released for my MATX build. Ultimate 4k! haha. We'll see I'm very interested to see what the price point for these will be. I'm taking a wild guess of about $1499 for one.

I don't think you will be far off the mark.
Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ozyo*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Beautiful, powerful, and completely overpriced.

^ Perfect... short, simple, to the point, and truthful.


----------



## xer0h0ur

Missed the mark by $150 imo.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xer0h0ur*
> 
> Missed the mark by $150 imo.


If they made it the same price as the Fury then it would be sitting pretty. The Fury has fewer SPs but beefier VRMs and coolers, while the Nano has more SPs plus form factor on its side, with weaker VRMs and maybe cooling as the downsides?

I mean, I'm waiting till I see what the little thing is capable of before passing final judgement on it, but I do agree that it should not have been priced that high, niche product or not.


----------



## Jflisk

I am just reading up on the Nano now. So they removed the water block and downclocked the Fury X, at the same price as the Fury X. Let's see how well this goes.









And they called it a nano.


----------



## xer0h0ur

I really believe they would have sold a boatload of them at the $499.99 price point.


----------



## royfrosty

Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury Xs.

Probably gonna finish my project GeneXis 2.0 next week.

A sneak peek. Probably the first to get the Bitspower water blocks from Taiwan.



Comes with a nice matte back plate.


----------



## Sgt Bilko

Quote:


> Originally Posted by *royfrosty*
> 
> Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury X.
> 
> Probably gonna finish my project GeneXis 2.0 next week.
> 
> A sneak peak. Probably the first to get the Bitspower water blocks from Taiwan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Comes with a nice matte back plate.
> 
> 


Now those are some sexy-looking GPUs you got there.


----------



## Alastair

Quote:


> Originally Posted by *royfrosty*
> 
> Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury X.
> 
> Probably gonna finish my project GeneXis 2.0 next week.
> 
> A sneak peak. Probably the first to get the Bitspower water blocks from Taiwan.
> 
> 
> 
> Comes with a nice matte back plate.


Hot DAMN, those are some smexy-looking blocks you got there.


----------



## Jflisk

Quote:


> Originally Posted by *royfrosty*
> 
> Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury X.
> 
> Probably gonna finish my project GeneXis 2.0 next week.
> 
> A sneak peak. Probably the first to get the Bitspower water blocks from Taiwan.
> 
> 
> 
> Comes with a nice matte back plate.


They're nice looking, and they come with back plates.


----------



## Neon Lights

If anyone wants to use the BIOS(es) that were used to achieve the LN2 overclocking results on an ASUS Fury Strix, they can be downloaded here: http://forum.hwbot.org/showthread.php?t=142320. I believe ASUS GPU Tweak has to be used for it to work.

I would be happy if someone who has a Strix tried it and reported how far they can overclock.

I personally tried to flash it to my reference Fury X, but the flash always failed (I tried both the normal and LN2 versions, the ones with no shaders unlocked and the ones with all shaders unlocked).


----------



## criminal

Quote:


> Originally Posted by *royfrosty*
> 
> Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury X.
> 
> Probably gonna finish my project GeneXis 2.0 next week.
> 
> A sneak peak. Probably the first to get the Bitspower water blocks from Taiwan.
> 
> 
> 
> Comes with a nice matte back plate.


Beautiful! Very jelly right now.


----------



## en9dmp

Quote:


> Originally Posted by *royfrosty*
> 
> Damn happy with Bitspower. Managed to get my water blocks for my 2 Fury X.
> 
> Probably gonna finish my project GeneXis 2.0 next week.
> 
> A sneak peak. Probably the first to get the Bitspower water blocks from Taiwan.
> 
> 
> 
> Comes with a nice matte back plate.


Nice, but it's crying out for a single-slot PCI bracket tho...


----------



## jase78

Supply will likely be a major problem once again for the next Fiji card, the Nano; even at $650 they will be sold out constantly. If you can't sell in high quantity to make more profit, but can offset that by getting top dollar, you're gonna do just that.

AMD is a business after all and is trying to stay afloat until the HBM2 cards can hopefully be produced much faster and actually rake in some dough.

If the 980 Ti hadn't been created, the Fury X would have been $850, and $650 for the Nano would probably be considered a bargain. Lol


----------



## xer0h0ur

They have been ramping up production since a few weeks ago but you're likely correct about a new Fiji XT card making the Fiji line scarce again.


----------



## WheelZ0713

Hope this isn't out of line, but my thread isn't getting a whole bunch of love out in the big bad world by itself.

Would love some input from you all, as I'm pretty keen to have some answers before I get home and play some more...

http://www.overclock.net/t/1571383/fury-tri-x-oc-advice

Thanks!


----------



## littlestereo

The perfect Fury X monitor is finally here! Now for $455 you can get the first 4k IPS FreeSync monitor (LG 27MU67) over at NCIXUS on a flash sale (Normally $600). This makes it $200 cheaper than the closest G-sync TN panel (XB280HK) and $300+ over the unreleased IPS version (XB271HK).

http://www.ncixus.com/products/?sku=111161&vpn=27MU67-B&manufacture=LG%20Electronics&promoid=1292&ir_clickid=WTe0wf3WTU6z3TMR7xSo60y3UkX3EoxnuSUxzE0&ir_cid=3092&ir_affid=10451


----------



## ozyo

Quote:


> Originally Posted by *littlestereo*
> 
> The perfect Fury X monitor is finally here! Now for $455 you can get the first 4k IPS FreeSync monitor (LG 27MU67) over at NCIXUS on a flash sale (Normally $600). This makes it $200 cheaper than the closest G-sync TN panel (XB280HK) and $300+ over the unreleased IPS version (XB271HK).
> 
> http://www.ncixus.com/products/?sku=111161&vpn=27MU67-B&manufacture=LG%20Electronics&promoid=1292&ir_clickid=WTe0wf3WTU6z3TMR7xSo60y3UkX3EoxnuSUxzE0&ir_cid=3092&ir_affid=10451


freesync range ?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ozyo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *littlestereo*
> 
> The perfect Fury X monitor is finally here! Now for $455 you can get the first 4k IPS FreeSync monitor (LG 27MU67) over at NCIXUS on a flash sale (Normally $600). This makes it $200 cheaper than the closest G-sync TN panel (XB280HK) and $300+ over the unreleased IPS version (XB271HK).
> 
> http://www.ncixus.com/products/?sku=111161&vpn=27MU67-B&manufacture=LG%20Electronics&promoid=1292&ir_clickid=WTe0wf3WTU6z3TMR7xSo60y3UkX3EoxnuSUxzE0&ir_cid=3092&ir_affid=10451
> 
> 
> 
> freesync range ?
Click to expand...

It's 40-60Hz. It's the same one I'm looking at picking up; it's been on sale in Aus for a few weeks now.


----------



## royfrosty

Too bad. I'm still looking for a 32-inch 4K VA or IPS panel with FreeSync.


----------



## royfrosty

Quote:


> Originally Posted by *en9dmp*
> 
> Nice - crying out for a single slot PCI bracket tho...


Yeap. Trying to get it. Probably getting aqc.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *littlestereo*
> 
> The perfect Fury X monitor is finally here! ...IPS FreeSync monitor (LG 27MU67)


I certainly don't want to derail this thread, but you may want to read 4K's review of this monitor.

http://4k.com/monitor/a-review-of-the-lg-27mu67-b-27-class-4k-uhd-ips-led-monitor/

Of course, the age-old adage about what "opinions are like" applies here, but it still might be worth reading up on before committing to a purchase.

With my 2 Fury X's on order, I was considering sending back my 3 ASUS MG279Qs and picking up these LGs instead.
I figured with the LG's being 4K, and the Fury X able to _do_ 4K, _why not_?
But, after reading some of the reviews, including this one, I'm not so sure and I might just stick with my current panels.


----------



## localh85

If you were planning on getting two of those LG panels, consider getting this single 32" Samsung freesync 4k PLS monitor.

I have it and it is pretty great. I am playing BF4 on ultra @ 4k with just 1 Fury X just fine.

Here is the official spec page http://www.samsung.com/uk/consumer/it/monitor/professional-monitor/LU32E85KRS/EN

I don't know why Samsung US does not have this listed. You can buy it from many retailers; I found mine through Google Shopping:

https://www.google.com/shopping/product/10930188326354791273?sclient=psy-ab&hl=en&authuser=0&q=samsung+u32e850r&oq=samsung+U32E850R&pbx=1&bav=on.2,or.r_cp.&bvm=bv.101525188,d.cWw&biw=1921&bih=1014&pf=p&gs_rn=64&gs_ri=psy-ab&tok=fmMZBRuMmoSy6Nogkf4fvQ&ds=sh&pq=samsung+ue32e850&cp=16&gs_id=6&xhr=t&tch=1&ech=1&psi=MJ_gVcXjGMK3-AHH06a4BQ.1440784176770.1&prds=paur:ClkAsKraXx13vJXp0VcK494iD-uA7xME6ATvA2QRBxYd1qyysgdamkAh2mDJpJqxHQKF7lXEwpT0WO1qTvcOJjqVs8O7yCeqDySLk3LGKDPZmG-Fe77lBbmRExIZAFPVH72tB4gXOMy4szZ24OrEP2WtVMXhwQ&sa=X&sqi=2&ved=0CIUEEMQVMABqFQoTCPW5ya-rzMcCFcjOgAodcLcOmg

I bought mine through Immedia Systems; they were very responsive and fast at shipping, despite my initial apprehension at never having heard of them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *littlestereo*
> 
> The perfect Fury X monitor is finally here! ...IPS FreeSync monitor (LG 27MU67)
> 
> 
> 
> I certainly don't want to derail this thread, but you may want to read 4K's review of this monitor.
> 
> http://4k.com/monitor/a-review-of-the-lg-27mu67-b-27-class-4k-uhd-ips-led-monitor/
> 
> Of course, the age old adage of what "opinions are like" applies here, but still might be worth reading up on before committing to a purchasing decision.
> 
> With my 2 Fury X's on order, I was considering sending back my 3 ASUS MG279Qs and picking up these LGs instead.
> I figured with the LG's being 4K, and the Fury X able to _do_ 4K, _why not_?
> But, after reading some of the reviews, including this one, I'm not so sure and I might just stick with my current panels.
Click to expand...

Really?

After reading that, the lack of USB does not concern me in the slightest, and I can overlook the brightness.
Quote:


> The 27MU67-B 27 from LG is of course also far from perfect and a few notable deficiencies are worth mentioning, though none of them are serious enough to warrant ignoring this monitor.
> 
> First of all, there are no USB ports on the 27MU67-B 27. This is a bizarre and gross oversight on the part of LG and while yes you do get dual HDMI, Display Port 1.2 and mini Display Port 1.2 ports, USB is a crucial extra that a lot of users wouldn't want to go without.
> 
> Secondly, the IPS panel, while excellent for visual quality does also mean a slightly slower response time when it comes to PC gaming controls. Furthermore, it adds to the price of this monitor just enough to make it slightly more expensive than other models with the same specs that we've seen on the market.
> 
> Finally, the 300cd /m2 brightness level of the screen isn't actually quite that high in practice, registering somewhat less at 290 cd/m2 or below in some cases. While this will be fine for most users, there are cheaper 4K PC monitors that come with superior 320 or 350 cd/m2 brightness ratings.


^ that's the bad......

I'll still be grabbing it at some point


----------



## Sgt Bilko

Quote:


> Originally Posted by *localh85*
> 
> If you were planning on getting two of those LG panels, consider getting this single 32" Samsung freesync 4k PLS monitor.
> 
> I have it and it is pretty great. I am playing BF4 on ultra @ 4k with just 1 Fury X just fine.
> 
> Here is the official spec page http://www.samsung.com/uk/consumer/it/monitor/professional-monitor/LU32E85KRS/EN
> 
> I don't know why Samsung US does not have this listed. you can buy it from many retailers, I found mine through google shopping:
> 
> https://www.google.com/shopping/product/10930188326354791273?sclient=psy-ab&hl=en&authuser=0&q=samsung+u32e850r&oq=samsung+U32E850R&pbx=1&bav=on.2,or.r_cp.&bvm=bv.101525188,d.cWw&biw=1921&bih=1014&pf=p&gs_rn=64&gs_ri=psy-ab&tok=fmMZBRuMmoSy6Nogkf4fvQ&ds=sh&pq=samsung+ue32e850&cp=16&gs_id=6&xhr=t&tch=1&ech=1&psi=MJ_gVcXjGMK3-AHH06a4BQ.1440784176770.1&prds=paur:ClkAsKraXx13vJXp0VcK494iD-uA7xME6ATvA2QRBxYd1qyysgdamkAh2mDJpJqxHQKF7lXEwpT0WO1qTvcOJjqVs8O7yCeqDySLk3LGKDPZmG-Fe77lBbmRExIZAFPVH72tB4gXOMy4szZ24OrEP2WtVMXhwQ&sa=X&sqi=2&ved=0CIUEEMQVMABqFQoTCPW5ya-rzMcCFcjOgAodcLcOmg
> 
> I bought mine through Immedia Systems, they were very responsive and fast at shipping despite my initial apprehensions that I never heard of them.


That really is not an option for me: $900 for the LG and $1459 for the Samsung.









It is a nice monitor though


----------



## Ceadderman

I will stick with my 40" until prices come down. Would love to have 4K, but my system will run a 40" with no issues for now, and when I upgrade GPUs in 2016 (come on, tax day! Papa needs a new...) I will consider what's available.

~Ceadder


----------



## MalsBrownCoat

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'll still be grabbing it at some point


Were you going to use a triple stand? Because if the intention is using the native stands, take note that "while pivot and height adjustability is there, the display can't be rotated or tilted".

Disappointing, but if you're mounting the panels on another stand, you should be ok. Anyway, food for thought before getting everything all set up and then having an "aw crap" moment.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'll still be grabbing it at some point
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Were you going to use a triple stand? Because if the intention is using the native stands, take note that "while pivot and height adjustability is there, the display can't be rotated or tilted".
> 
> Disappointing, but if you're mounting the panels on another stand, you should be ok. Anyway, food for thought before getting everything all set up and then having an "aw crap" moment.
Click to expand...

No, I don't run Eyefinity. I game on a single monitor and have two peripherals for other junk.


----------



## BlackyMeow

Did you guys know that we can run VSR AND Eyefinity at the same time ?

Here's Dirt Rally, high settings, no AA. Single Sapphire R9 Fury.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BlackyMeow*
> 
> Did you guys know that we can run VSR AND Eyefinity at the same time ?
> 
> Here's Dirt Rally, high settings, no AA. Single Sapphire R9 Fury.


Maybe.....









http://www.overclock.net/t/1552448/2x2-eyefinity-on-a-single-display-no-bezels-4gpu-crossfirex-amd-vsr-gaming-benching-6400x3600-60hz/0_50
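Out of curiosity, the pixel workload of the 6400x3600 setup in that linked thread can be sanity-checked in a few lines (the resolutions here are just the ones mentioned; nothing is measured):

```python
# Pixel counts for the 2x2 Eyefinity + VSR setup vs. plain 4K UHD.
vsr_w, vsr_h = 6400, 3600
uhd_w, uhd_h = 3840, 2160

vsr_pixels = vsr_w * vsr_h   # pixels rendered per frame at 6400x3600
uhd_pixels = uhd_w * uhd_h   # pixels rendered per frame at 3840x2160

ratio = vsr_pixels / uhd_pixels
print(f"{vsr_pixels:,} px per frame, {ratio:.2f}x the work of 4K")
```

So the quad-CrossFire rig in that thread is pushing roughly 2.8 times the pixels of a single 4K display.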


----------



## You Mirin

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> http://4k.com/monitor/a-review-of-the-lg-27mu67-b-27-class-4k-uhd-ips-led-monitor/


Eh, I don't really see any real negatives in that review, but it's a pretty bad review though, lol. Hopefully TFTCentral does a "real" review on it soon.


----------



## xer0h0ur

It appears that the rumor of a higher tier Asus Fury card had legs after all: http://wccftech.com/asus-teases-rog-matrix-gtx-980-ti-rog-matrix-r9-fury-graphics-cards-unveiling-2nd-september/



Perhaps this is the card that will actually take advantage of the Strix's custom PCB.


----------



## xPliZit

According to this:
http://www.forum-3dcenter.org/vbulletin/showthread.php?p=10737166#post10737166
*the FreeSync range can be brought down to 33Hz, for a total range of 33-61Hz, using a modded display driver!*

My monitor is on its way as we speak
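For what it's worth, even the widened 33-61Hz range isn't enough for AMD's Low Framerate Compensation, which by AMD's commonly cited rule of thumb needs the maximum refresh to be at least 2.5x the minimum so frames can be repeated inside the window. A quick check, assuming that 2.5x rule:

```python
def lfc_capable(min_hz: float, max_hz: float, factor: float = 2.5) -> bool:
    """LFC needs max refresh >= factor * min refresh (AMD's rule of thumb)."""
    return max_hz >= factor * min_hz

print(lfc_capable(40, 60))   # stock 27MU67 range
print(lfc_capable(33, 61))   # modded range: 61 < 2.5 * 33, still no LFC
print(lfc_capable(30, 144))  # e.g. a 30-144Hz FreeSync panel
```

Below the minimum of the range you would still fall back to V-Sync on/off behavior, just at 33fps instead of 40.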


----------



## Ceadderman

Oh please please please please tell me this is true! An R9 RoG card?
















~Ceadder


----------



## Randomdude

Just got my Fury X today. It's a Sapphire card. I don't have anywhere to test it yet, but first impressions based on looks alone aren't that good. The radiator has a couple of small dents on the side opposite the fan. There were no plugs to protect the HDMI/DP connectors, nor a protective cap for the PCB. The card comes with two huge stickers on the front and back plates, which aren't very eye-pleasing and which it could definitely do without. But those can be peeled off, and it comes down to preference, so there's that. I have high hopes that my card won't suffer from the noise-related issues I see people complain about, as silence is a great bonus to any build.

I'm curious as to what memory modules to purchase. Does the Fury series benefit from low-CAS, high-clocked RAM? Would I see a benefit going from 16 GB of 1866 (or was that 1600?) CAS 9 or 8 white HyperX Fury to 16 GB of Dominator Platinum 2400 CAS 10? If the difference is small I would rather go with the HyperX modules, as they look good and are cheaper.


----------



## Ceadderman

You get that as open box? Cause by your description, that's what it sounds like, especially the dents. Damage like that should never happen in new, undamaged packaging. The missing protectors too.









~Ceadder


----------



## Randomdude

Apologies for the late reply.

Nope, it was not open box. It had a couple of Sapphire stickers sealing the box as well. I am wondering whether to return it or not. There doesn't seem to be any damage to the card. One curious thing is that the older boxes had 450 GB/s written on them for the bandwidth specification; this one has a sticker right on top with the correct 512, and with the 4096-bit bus info added.

Seeing as how the card itself seems to be in good condition, I wonder if I should test my luck and see if it has any issues... or request a new one because I paid for a mint card.

Packaging was not that great. The nylon bags for the card and the cooler weren't sealed in any way; the rad and card were covered by them, but not airtight.


----------



## xPliZit

Test your luck, you still have 30 days of replacement guarantee.


----------



## xer0h0ur

Sounds stupid to me to return an item you haven't even tried in a PC.


----------



## fewness

Haven't been following the thread for several days.....have we unlocked Fury X voltage yet ? Have we?


----------



## ozyo

Quote:


> Originally Posted by *fewness*
> 
> Haven't been following the thread for several days.....have we unlocked Fury X voltage yet ? Have we?


no


----------



## fewness

Quote:


> Originally Posted by *ozyo*
> 
> no


Damn !


----------



## BlackyMeow

Quote:


> Originally Posted by *Randomdude*
> 
> Apologies for the late reply.
> 
> Nope, it was not open box. Had a couple Sapphire stickers sealing the box as well. I am wondering whether to return it or not. There doesn't seem to be any damage to the card. One curious thing is that the older boxes had 450Gbps written on them for bandwidth specification. This one has a sticker right on top with the correct 512 and also with the 4096bit bus added info.
> 
> Seeing as how the card itself seems to be in good condition, I wonder if I should test my luck and see if it has any issues... or request a new one because I paid for a mint card.
> 
> Packaging was not that great. The nylon bag the card comes with, along with the one for the cooler, they weren't sealed in any way. The rad and card were covered by them, but not so that air is sealed out.


It was the same with the Fury X I got from Sapphire. Box was sealed, rad had dents in it, bags weren't sealed.

This card made an awful lot of noise (pump noise, coil whine, fan rattle) so I returned it.


----------



## Medusa666

So a question to the owners of Sapphire Fury Tri-X.

Do you experience any coil whine? I'm asking because I have made an effort to build a truly silent rig: only Noctua fans and coolers, fans turned off at idle and low load, etc. I'm sitting on a 295X2 today, but I'm not happy with the sound of the pump and the fans revving up and down.


----------



## BlackyMeow

Quote:


> Originally Posted by *Medusa666*
> 
> So a question to the owners of Sapphire Fury Tri-X.
> 
> Do you experience any coilwhine? I'm asking because I have made an effort to have a truly silent rig, only Noctua fans and cooler, fans turned off while idle and low load etc. I'm sitting on a 295X2 today but I'm not happy with the sound of the pump and fans revving up and down.


Yes. Coil whine is definitely there.

There is high pitched coil whine when frame rates are super high (300-2000 fps), and I can hear it very well.

The coil whine is much less noticeable when frame rates are "normal". Check JayzTwoCents' video to hear how it sounds. It doesn't bother me...


----------



## Sgt Bilko

Quote:


> Originally Posted by *BlackyMeow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Medusa666*
> 
> So a question to the owners of Sapphire Fury Tri-X.
> 
> Do you experience any coilwhine? I'm asking because I have made an effort to have a truly silent rig, only Noctua fans and cooler, fans turned off while idle and low load etc. I'm sitting on a 295X2 today but I'm not happy with the sound of the pump and fans revving up and down.
> 
> 
> 
> Yes. Coil whine is definitely there.
> 
> There is high pitched coil whine when frame rates are super high (300-2000 fps), and I can hear it very well.
> 
> The coil whine is much less noticeable when frame rates are "normal". Check Jayztwocents' video to hear how it sounds. It doesn't bother me...
Click to expand...

Jay's video is Pump Noise, not coil whine......

If you have issues with Coil Whine then cap the fps in Catalyst with FRTC...


----------



## BlackyMeow

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Jay's video is Pump Noise, not coil whine......


Can't you just check your facts before saying I'm wrong ?






Also, FRTC gives me tearing even when I set it to 60fps, because it caps at 59.3...

AND, the coil whine is still there even at 60 fps, so that won't change anything.
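That tearing cadence is easy to estimate: with the cap at 59.3fps on a 60Hz panel, the frame clock and the refresh clock drift apart and realign at the difference frequency, so the tear line walks across the screen and wraps roughly every 1/(60 - 59.3) seconds. A quick sketch, using only the numbers from this post:

```python
# Beat period between a 59.3fps frame cap and a 60Hz refresh:
# the two clocks drift by (1/59.3 - 1/60) s per frame and realign
# every 1/(60 - 59.3) seconds, which is when the tear line wraps.
refresh_hz = 60.0
cap_fps = 59.3

drift_per_frame = 1 / cap_fps - 1 / refresh_hz  # seconds of drift per frame
beat_period = 1 / (refresh_hz - cap_fps)        # seconds between wraps

print(f"drift: {drift_per_frame * 1000:.3f} ms/frame, "
      f"tear wraps every {beat_period:.2f} s")
```

So a 0.7fps mismatch means a visible tear cycling about every second and a half, which matches the complaint that a 59.3 cap on a 60Hz screen still tears.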


----------



## Sgt Bilko

Quote:


> Originally Posted by *BlackyMeow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Jay's video is Pump Noise, not coil whine......
> 
> 
> 
> Can't you just check your facts before saying I'm wrong ?
Click to expand...

Sorry, I assumed you were talking about the Fury X video he did; I skimmed over the Fury Tri-X part of the quote and only saw Fury X for some reason.

I know he complained a lot about the noise with the Fury X though.


----------



## mRYellow

Guys, maybe you can help me. For some reason I can't OC the memory on my Fury.
It was working fine before; not sure if it's related to the new Windows 10 Insider build.

Anyone have ideas or suggestions?


----------



## p4inkill3r

Using Afterburner? If so, make sure that 'extend official overclocking limits' is enabled in settings.


----------



## Scorpion49

Quote:


> Originally Posted by *Medusa666*
> 
> So a question to the owners of Sapphire Fury Tri-X.
> 
> Do you experience any coilwhine? I'm asking because I have made an effort to have a truly silent rig, only Noctua fans and cooler, fans turned off while idle and low load etc. I'm sitting on a 295X2 today but I'm not happy with the sound of the pump and fans revving up and down.


Mine does. It's in for RMA now though; hopefully I get a quiet one back:


----------



## Alastair

Quote:


> Originally Posted by *Randomdude*
> 
> Just got my Fury X today. It's a Sapphire card, I don't have anywhere to test it yet but first impressions based on just looks aren't that good. The radiator has a couple small dents on the other side of where the fan sits. There were no plugs to protect the hdmi/dp connectors nor a protection cap for the PCB. The card comes with two huge stickers on the front and back plates, which aren't very eye pleasing and could definitely do without. But those can be peeled off and it comes down to preference so there's that. I have high hopes that my card won't suffer from those noise-related issues I see people complain about, as silence is a great bonus to any build.
> 
> I'm curious as to what memory modules to purchase. Does the Fury series benefit from low CAS, high-clocked RAM? Would I see a benefit going from 16 gigs of 1866 (or was that 1600) cas 9 or 8 white Hyperx Fury to 16 gigs of Dominator Platinum 2400 cas 10? If the difference is small I would rather go with the Hyperx modules as they look good and are cheaper.


Both of my Sapphire Fury Tri-X cards arrived in nearly perfect condition. They both had the dust caps in the ports, and the boxes were sealed. My only gripe was that on one of my cards the metallic sticker with the yellow trim that says Sapphire was not glued on properly and was peeling off, but I just added some glue to help it on its way.


----------



## BlackyMeow

Quote:


> Originally Posted by *Alastair*
> 
> both of my Sapphire Fury Tri-x cards arrived in nearly perfect condition. They all had the dust caps in the ports, boxes were sealed. My only grip was on one of my cards the great metallic sticker with the yellow trim that says Sapphire on it was not glued on properly and was pealing off. But I just added some sticky glue to help it on its way.


He's talking about a Fury X, not a Fury Tri-X


----------



## Alastair

Quote:


> Originally Posted by *BlackyMeow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> both of my Sapphire Fury Tri-x cards arrived in nearly perfect condition. They all had the dust caps in the ports, boxes were sealed. My only grip was on one of my cards the great metallic sticker with the yellow trim that says Sapphire on it was not glued on properly and was pealing off. But I just added some sticky glue to help it on its way.
> 
> 
> 
> He's talking about a Fury X, not a Fury Tri-X
Click to expand...

Oh, didn't catch that, sorry. But still, since they are a Sapphire product I would expect the same quality. I mean, the Fury X is more premium than the normal Fury; you would expect all the niceties to be in place.


----------



## BlackyMeow

Quote:


> Originally Posted by *Alastair*
> 
> oh. Didn't catch that sorry. But still. Since they are a Sapphire product I would expect it to be of the same quality. I mean Fury X is more premium than normal Fury you would expect all the niceties in place.


Well, all Fury X's are the same though...


----------



## Ceadderman

First of all... not all Fury X cards are the same. That's an assumption based on the misinformation that AMD builds them rather than their AIB partners. So far the only claims about ugly unboxings have been made about one AIB partner, and that was Sapphire. Nobody has reported any other partner being guilty of missing protectors, open bags or dented fins. So nope, not identical, even though the reference standards are being strictly adhered to.

So second I have to agree. Being the premium card you would expect all the niceties to be in place when you unbox one.









~Ceadder


----------



## BlackyMeow

Quote:


> Originally Posted by *Ceadderman*
> 
> First of all... not all Fury X are the same. That's an assumption based on misinformation that AMD is building them and not their AIB partners. So far the only claims about how ugly they unbox have been made about one of their AIB partners and that was Sapphire. Nobody has posted any kind of alternate partner as being guilty of lack of protectors, open bags or dented fins. So nope, not identical. Reference standards are being strictly adhered to.
> 
> So second I have to agree. Being the premium card you would expect all the niceties to be in place when you unbox one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I had one Fury X from MSI and one from Sapphire. The packaging was identical: the foam was the same, the inner cardboard box was the same... Are you implying that the AIB partners are building the Fury X and that AMD even tells them how the cards must be packaged? I don't think so... The cards were identical: same coil whine, same fan rattle, and both had dents in the rad. The plastic bags were the same too. The Sapphire model had the new pump, so the pump noise was reduced, but it was still there. The only difference between the two cards was the stickers on the fan and the card itself.


----------



## mRYellow

Quote:


> Originally Posted by *p4inkill3r*
> 
> Using Afterburner? If so, make sure that 'extend official overclocking limits' is enabled in settings.


Thanks, got it working again. It was already enabled, but unticking it and re-enabling it did the trick.


----------



## xer0h0ur

AMD has no fabrication plants. The assembly of the HBM/die/interposer is done by Hynix, but after that no one knows exactly who takes care of the rest of the assembly. It's presumed the AIB handles it.


----------



## You Mirin

Quote:


> Originally Posted by *fewness*
> 
> Damn !


I wonder what the holdup is on Wizzard's program. I know Unwinder is still on vacation...


----------



## xer0h0ur

Nvidia is paying to have it buried. It didn't take this long for the 9XX series to gain Afterburner/Trixx support.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nvidia paying to have it buried. It didn't take this long for them to gain Afterburner/Trixx support on the 9XX series.


Probably. I heard Nvidia was paying Hynix to slow-roll HBM production also. They have their fingers in a lot of pies.


----------



## ozyo

Quote:


> Originally Posted by *Forceman*
> 
> Probably. I heard Nvidia was paying Hynix to slow-roll HBM production also. They have their fingers in a lot of pies.


----------



## Alastair

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Nvidia paying to have it buried. It didn't take this long for them to gain Afterburner/Trixx support on the 9XX series.
> 
> 
> 
> Probably. I heard Nvidia was paying Hynix to slow-roll HBM production also. They have their fingers in a lot of pies.
Click to expand...

Maybe Unwinder got his vacation from Nvidia, terms and conditions being: "delay Fury voltage control".


----------



## xer0h0ur

Laugh it up if you like. When you come back months later and it still doesn't exist, feel free to come up with another logical reason for it.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Laugh it up if you like. When you come back months later and it still doesn't exist, feel free to come up with another logical reason for it.


Well considering Trixx also doesn't have it, and Sapphire is AMD exclusive, I doubt Nvidia is paying them to not provide it.


----------



## mRYellow

Unwinder (Alex) said he was still waiting for a sample card.


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> Well considering Trixx also doesn't have it, and Sapphire is AMD exclusive, I doubt Nvidia is paying them to not provide it.


Except the application itself is not made by Sapphire... nor is he an employee of Sapphire, as far as I know.

Frankly, if it takes him this long to add support for one card while having a sample in hand, he may want to consider a new line of work.

If MSI is really that stupid to still not provide Unwinder with a card, then that is entirely on them. You can't blame Unwinder for that.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Except the application itself is not made by Sapphire....nor is he an employee of Sapphire far as I know.


Hmm, I thought he was. I thought they had hired someone to develop Trixx for them a while back, maybe I'm thinking of GPU Tweak? Which also doesn't have voltage control. Either there is a single person who is doing this for everyone (which seems strange) or there's something else preventing it. Didn't Unwinder say they had changed something in the Fury that made it more difficult? Or maybe that was just in regards to Nvidia's versus AMD's implementation. I didn't pay that much attention.


----------



## You Mirin

Quote:


> Originally Posted by *mRYellow*
> 
> Unwinder (Alex) said he was still waiting for a sample card.


Hopefully it's there waiting for him from MSI when he's back... still surprised AMD, or even amdmatt (with his 4, lol), did not just send one to Unwinder from the start.

http://forums.guru3d.com/showpost.php?p=5136403&postcount=6

Well, I actually did some googling trying to find the post where Wizzard mentioned that he had already given the code to Sapphire, and found this.

http://forums.overclockers.co.uk/showpost.php?p=28464898&postcount=6183

so Soon™?
Quote:


> Originally Posted by *Forceman*
> 
> Hmm, I thought he was. I thought they had hired someone to develop Trixx for them a while back, maybe I'm thinking of GPU Tweak? Which also doesn't have voltage control. Either there is a single person who is doing this for everyone (which seems strange) or there's something else preventing it. Didn't Unwinder say they had changed something in the Fury that made it more difficult? Or maybe that was just in regards to Nvidia's versus AMD's implementation. I didn't pay that much attention.


Wizzard does Trixx and Unwinder does Afterburner


----------



## Shatun-Bear

It seems very odd that no one has come up with full voltage control for the Fury X over two months after release.

And Unwinder being on vacation until September? I'm sure he hasn't been away since the end of June.


----------



## Neon Lights

Quote:


> Originally Posted by *Shatun-Bear*
> 
> It seems very odd that no-one has come up with full voltage control for the Fury X in over two months after release.


Well, if you have a Fury Strix you can tweak the voltage with ASUS GPU Tweak. I wrote a post about it in this thread, but nobody has replied yet.

I'll repeat it here: flash the BIOS(es) that I linked to in my previous post.


----------



## MiladEd

So I'm planning to upgrade my monitor from a 22" 1680x1050 panel to a BenQ QHD 144Hz FreeSync monitor, and was wondering if a single R9 Fury would suffice to run games like The Witcher 3 and GTA V at that resolution and refresh rate. Thanks!


----------



## Neon Lights

Quote:


> Originally Posted by *MiladEd*
> 
> So I'm planning to upgrade my monitor from a 22" 1680x1050 panel to BenQ QHD 144 Hz FreeSync monitor and was wondering if a single R9 Fury would suffice to run games like Witcher 3 and GTA V on that resolution and refresh rate. Thanks!


Yes, but not with maximum details. You need a minimum of two Furys (or Fury Xs) in order to not run into a GPU limit. Even then, you will often run into a CPU limit, especially in GTA V (in fact, in GTA V the CPU limit occurs so often that I personally play it with only one Fury X, also because there is quite a lot of microstuttering and of course higher input lag with two Fury Xs). In The Witcher 3 I would recommend using two Furys, because there is quite a bit less CPU limitation than in GTA V and a second Fury X really helps the framerates.
By the way, I personally use an Eizo FG2421 ([email protected]), so you may have a bit less CPU limitation and more GPU limitation than me.
Also, if the CPU in your signature is the one you actually use to play those games, getting a high-end Intel CPU (6700K, 4770K, 5820K) will reduce your CPU limitation in those games quite a bit. I know it sounds annoying (I also used an FX-8150), but unfortunately most games hit a much harder CPU limit with an AMD CPU than with an Intel one. I am using a [email protected] at the moment, and have to say that for high framerates such as 120FPS and 144FPS, an overclocked high-end Intel CPU is very important.


----------



## Horcsch

Hey guys, I also got a Fury Tri-X and it's running well so far. My only gripe is that it looks really bent, and I think it was this way when it arrived. Is this normal, or could it be a sign that it was used? Do I have to worry about it? Based on the reviews, the shop has shipped used hardware before...
Sadly I don't really remember how it was packaged (I was too excited), but I think it didn't even come in a bag, it was just in the box... should I RMA it?


----------



## p4inkill3r

Quote:


> Originally Posted by *Horcsch*
> 
> Hey guys, I also got a Fury Tri-X and it's running well so far. My only gripe is that it looks really bent, and I think it was this way when it arrived. Is this normal, or may it be a sign that it was used? Do I have to worry about that? Based on the reviews, the shop has shipped used hardware before...
> Sadly I don't really remember how it was packaged, I was too excited, but I think it didn't even come in a bag, it was just in the box... should I RMA it?


Can you post a picture of it?


----------



## Horcsch

Yes, of course. Also there is loud coil whine, and a sticker on the package that someone here mentioned (saying 4096 / 512GB/s) put over the wrong specifications, so it's an old box too.
It looks really bad in the pic.








I could get a new one right now for 25€ less


----------



## p4inkill3r

Yeah, I think you should RMA.


----------



## Shatun-Bear

Horcsch that's shocking service, send it back.
Quote:


> Originally Posted by *Neon Lights*
> 
> Well, if one has a Fury Strix one can tweak the voltage with ASUS GPU Tweak. I wrote a post about it in this thread but nobody wrote a reply yet.
> 
> I repeat here: Flash the BIOS(es) that I linked to in my previous post.


Nice I had a look at your posts. What benchmark scores are you getting with those overclocks?


----------



## fjordiales

Quote:


> Originally Posted by *Neon Lights*
> 
> Well, if one has a Fury Strix one can tweak the voltage with ASUS GPU Tweak. I wrote a post about it in this thread but nobody wrote a reply yet.
> 
> I repeat here: Flash the BIOS(es) that I linked to in my previous post.


I would like to thank you for sharing the link to the BIOS. I was able to do the 4low BIOS unlock with it; I had been stuck at the unlock/back-up BIOS step.

The link also has the stock Fury Strix BIOS, just in case I want to reflash it. I haven't played around with it much yet, but I was able to push my cards to 1060 so far; I was stuck at 1040 before the new BIOS.


----------



## Neon Lights

Quote:


> Originally Posted by *Shatun-Bear*
> 
> Horcsch that's shocking service, send it back.
> Nice I had a look at your posts. What benchmark scores are you getting with those overclocks?


If you mean the overclocks I linked to, those were done by an extreme overclocker. I only linked to the BIOSes because I am interested in what clocks can be achieved with them without hard-volt modding the graphics cards. I also do not have a Fury Strix; I have two Fury Xs (though the existence of a mod BIOS for the Fury Strix, which for some reason is not being made for the reference cards, where we cannot even control the voltage yet, has made me consider exchanging my Fury Xs for Fury Strixes). I myself can overclock my Fury Xs to a little more than 1100MHz.


----------



## antonis21

My 3dmark score with fury 1050/500 and 3770k at 4.6ghz http://www.3dmark.com/3dm/8407742


----------



## Shatun-Bear

Quote:


> Originally Posted by *antonis21*
> 
> My 3dmark score with fury 1050/500 and 3770k at 4.6ghz http://www.3dmark.com/3dm/8407742


Nice. I've compared your scores with my system with a single EVGA 980. Seems the Fury is helping your system keep up with the 6-core proc:


----------



## Jflisk

15.8 beta driver is up
http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## You Mirin

Quote:


> Originally Posted by *Shatun-Bear*
> 
> And Unwinder being on vacation until September - I am sure he hasn't been away since the end of June.


Yeah he got paid that nvidia money.









If only amd had sent it to him from the start.


----------



## WheelZ0713

Quote:


> Originally Posted by *antonis21*
> 
> My 3dmark score with fury 1050/500 and 3770k at 4.6ghz http://www.3dmark.com/3dm/8407742


Nice. I managed to nudge a little over 14000 with a 4770K @ 4.5GHz and 1090/560 after unlocking 4 CUs with the BIOS flash.

I'll grab a screenshot when i get home from work.


----------



## Kana-Maru

The highest I've been able to get with my Fury X, with stock HBM and the core OC'd to 1125MHz, was 15,750 in Fire Strike. A stock Fury X gets me 15,073. I'll try overclocking the memory soon to see what I can get.


----------



## Jflisk

WOOF, the 15.8 drivers are a mess with the Fury X. BFH does the black-screen thing after playing 10 minutes, and Batman: Arkham Knight is artifacting so badly I'm about ready to have a stroke. There are not many drivers I have been disappointed with; just venting. If anyone else decides to try them, let me know how it goes. Thanks.

Reverted to 15.7.1, no problems whatsoever.


----------



## Kaapstad

Here is mine now it is up and running

*Kaapstad (The Red Machine)*


Next job will be to give it a good clean now it is running.


----------



## battleaxe

Quote:


> Originally Posted by *Kaapstad*
> 
> Here is mine now it is up and running
> 
> *Kaapstad (The Red Machine)*
> 
> 
> Next job will be to give it a good clean now it is running.


Four Furys?


----------



## p4inkill3r

Quote:


> Originally Posted by *Kaapstad*
> 
> Here is mine now it is up and running


----------



## Jflisk

Quote:


> Originally Posted by *Kaapstad*
> 
> Here is mine now it is up and running
> Next job will be to give it a good clean now it is running.


Nice machine


----------



## battleaxe

I think I know the answer but how does four FuryX compare to Four 980ti ?

I'm betting scaling is a lot better for AMD...


----------



## ozyo

Quote:


> Originally Posted by *battleaxe*
> 
> I think I know the answer but how does four FuryX compare to Four 980ti ?
> 
> I'm betting scaling is a lot better for AMD...






BTW, Maingear removed the original video from their channel.


----------



## xer0h0ur

Four Fury Xs murder anything Nvidia has. They still win when comparing three cards. Scaling only favors Nvidia at two cards.


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Four Fury X's murder anything Nvidia has. Still beats it when comparing three cards. Scaling is only favorable for Nvidia at two cards.


Technically, the scaling is better for AMD with two cards too.

But it doesn't make up for the fact that the Ti is just outright faster, so SLI scales well enough to still slightly edge the Fury X. With Tri/Quadfire, though, AMD really takes the cake and solidly beats the Ti, despite being inferior one on one.


----------



## Arniebomba

Could anyone with a fury x, measure the distance from the center of the screw holes to the outer sides of the radiator?


----------



## Ceadderman

If this guy can get FOUR Fury Xs, I don't think there is really much of a wait for them. Sure, he was probably one of the lucky few able to get more than one, but dude has FOUR.









~Ceadder


----------



## Ceadderman

Quote:


> Originally Posted by *Arniebomba*
> 
> Could anyone with a fury x, measure the distance from the center of the screw holes to the outer sides of the radiator?


IIRC, the radiator itself is 120mm side to side.

120mm fans' screw holes are 105mm center to center. That leaves 15mm of space; halve it and your answer *should* be 7.5mm from hole center to edge.
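That arithmetic can be sketched as a quick check (the 105mm hole spacing is the standard 120mm fan pattern; treating the holes as centered in the width is an assumption):

```python
# Offset from a mounting-hole center to the nearest edge, assuming the
# hole pattern is centered within the given width (an assumption; the
# real bracket may be offset).
def hole_to_edge(width_mm, hole_spacing_mm):
    return (width_mm - hole_spacing_mm) / 2

# Standard 120mm fan pattern: screw holes 105mm apart.
print(hole_to_edge(120, 105))  # 7.5 (mm)
```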









~Ceadder


----------



## Arniebomba

Quote:


> Originally Posted by *Ceadderman*
> 
> iirc, the Radiatior itself is 120mm side to side.
> 
> 120mm screw holes are 105mm center to center. So that leaves 15mm of space cut by half so your answer *should* be 7.5mm from center to edge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Thanks for the info!
Though i think the sides are different in length, taken from the screw holes.
See:

Oh and the radiator is 154mm in length


----------



## Kaapstad

Quote:


> Originally Posted by *battleaxe*
> 
> I think I know the answer but how does four FuryX compare to Four 980ti ?
> 
> I'm betting scaling is a lot better for AMD...


4 Fury Xs vs. 4 Titan Xs is about even for scaling.

That Maingear video above is total rubbish and should not be taken seriously.

With scaling on 4-way systems a lot comes down to whether the drivers work, but when they do, as in Tomb Raider, scaling performance between AMD and NVidia is almost identical.


----------



## Wage

Quote:


> Originally Posted by *Kaapstad*
> 
> 4 Fury Xs vs. 4 Titan Xs is about even for scaling.
> 
> That Maingear video above is total rubbish and should not be taken seriously.
> 
> With scaling on 4-way systems a lot comes down to whether the drivers work, but when they do, as in Tomb Raider, scaling performance between AMD and NVidia is almost identical.


Since when? Quadfire 295X2 user here, and the above is a total crock. Tri/Quad-SLI scaling has been abysmal for nVidia since the 7XX series, and the gains on AMD's side have become even more apparent with driver optimization and frametime fixes since the 7XXX series. From memory, isn't this mainly due to SLI still relying on the bridge ribbon and its lower bandwidth?

Having a third or fourth Titan X is like throwing money in the trash due to little/no scaling, and the same can be said for the 980 Ti.


----------



## mRYellow

Quote:


> Originally Posted by *Kaapstad*
> 
> Here is mine now it is up and running
> 
> *Kaapstad (The Red Machine)*
> 
> 
> Next job will be to give it a good clean now it is running.


Awesome goodness from the Cape!


----------



## battleaxe

Quote:


> Originally Posted by *ozyo*
> 
> 
> 
> 
> 
> btw maingear remove original video from their channel


Quote:


> Originally Posted by *xer0h0ur*
> 
> Four Fury X's murder anything Nvidia has. Still beats it when comparing three cards. Scaling is only favorable for Nvidia at two cards.


Okay, so a single 980 Ti is great, but with more than one the difference starts to swing over to AMD. Once the drivers and voltage are all figured out, these things are going to be everything we hoped they were, I think. The performance is there; we just have to wait a bit to see it all unfold. In the meanwhile, savings are building...


----------



## xer0h0ur

http://iyd.kr/753

Mind you, that was at the launch of the Fury X; drivers have come a long way since then. Either way it doesn't paint a pretty picture for Titan X or 980 Ti scaling at triple and quad SLI, given that we know those are the stronger single cards.

Nothing earth-shattering in the testing. We already knew AMD's cards are quite gimped at low resolution, be it from hardware limitations or simply sub-par DX11 drivers with a lot of overhead. Either way, no one is buying 2, 3 or 4 of these cards to play at bloody 1080p. If they are, then lord help their stupidity.
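As a rough way to read such reviews, multi-GPU scaling is usually quoted as the n-card framerate divided by n times the single-card framerate. A minimal sketch; the FPS figures below are made up for illustration, not taken from any review:

```python
# Multi-GPU scaling efficiency: measured n-card FPS as a fraction of
# the ideal n-times-single-card FPS.
def scaling_efficiency(single_fps, multi_fps, n_cards):
    return multi_fps / (n_cards * single_fps)

# Hypothetical figures: one card at 40 FPS, four cards at 120 FPS.
print(f"{scaling_efficiency(40, 120, 4):.0%}")  # 75%
```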


----------



## Thoth420

You guys are really making me consider my first dual-GPU rig. I have one Fury X now.
So... for 1440p gaming (preferably above 60Hz), is the scaling good enough to justify another Fury X (and the extra block)? In both scenarios they will be cooled in a custom loop, not the stock water cooler.


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*
> 
> You guys are really making me consider my first dual gpu rig. I have one Fury X now.
> So....for 1440 gaming (preferably above 60hz) is the scaling good enough to justify another fury x(and the extra block)? Both scenarios they will be cooled in a custom loop not the stock water cooler.


I say go for it.


----------



## xer0h0ur

I suppose that really depends on how long you plan on keeping the setup. If you're going to jump ship to the Arctic Islands generation, with HBM2, the node shrink, extra power and a rumored new iteration of GCN, then I would say don't do it. If you plan on keeping it longer than that, it could be worth it, assuming whatever game(s) you play have CrossFire support.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> I suppose that really depends on how long you plan on keeping that setup? If you're going to jump ship to the Arctic Islands generation with HBM2, the node shrink, extra power and rumored new iteration of GCN then I would say don't do it. If you plan on keeping it for longer than that then it could be worth it for you assuming whatever game(s) you play have crossfire support.


Sound advice for sure, but it can be applied to most questions regarding GPUs specifically and technology in general.


----------



## Kaapstad

Quote:


> Originally Posted by *Wage*
> 
> Since when? Quadfire 295X2 user here, and the above is a total crock. Tri/Quad-SLI scaling has been abysmal for nVidia since 7XX series, and the gains on AMD's side have been even more apparent with driver optimization and frametime fixes since 7XXX series. From memory, isn't this mainly due to SLI still relying on the bridge ribbon and its lower bandwidth?
> 
> Having a third or fourth Titan X is like throwing money in the trash due to little/no scaling, and the same can be said for 980ti.


I have both Titan Xs and Fury Xs in identical PCs; the only difference is the AMD setup is in a red case and the NVidia one is in a green case. So when I say the scaling is about the same for both camps, I do know from practical experience.

Having 3 or 4 Titan Xs also opens up options you cannot get with 4 Fury Xs or 4 980 Ti cards: there are several games @2160p that are both very GPU-grunt-bound and use over 8GB of VRAM, where both SLI scaling and 12GB of memory are important. So, putting it bluntly, if you are going to use Titan Xs the smart thing to do is get 3 or 4 of them and make use of the extra memory. It is a waste of money to use just a single TX, as there are other cards that can do just as well.

You also mentioned Quadfired Hawaii GPUs. I also own 4 x 290Xs, know exactly how these cards run together, and know where the skeletons are buried when using Quadfire with these GPUs. No 4-way setup from either vendor is perfect, and it is totally wrong to cherry-pick games and benches to suit your side of the argument when it comes to scaling.

Take it from someone who uses both every day: the scaling is very similar for both vendors. All the cards in my signature I still have and use, plus some I have not included, like HD 5970s for example.

As for bridges, don't be surprised if you see AMD go back to using them or something similar, as it will be needed if games appear that start using all the multi-GPU features that DX12 can offer.


----------



## battleaxe

Quote:


> Originally Posted by *Kaapstad*
> 
> I have both Titan Xs and Fury Xs in identical PCs; the only difference is the AMD setup is in a red case and the NVidia one is in a green case. So when I say the scaling is about the same for both camps, I do know from practical experience.
> 
> Having 3 or 4 Titan Xs also opens up options you cannot get with 4 Fury Xs or 4 980 Ti cards: there are several games @2160p that are both very GPU-grunt-bound and use over 8GB of VRAM, where both SLI scaling and 12GB of memory are important. So, putting it bluntly, if you are going to use Titan Xs the smart thing to do is get 3 or 4 of them and make use of the extra memory. It is a waste of money to use just a single TX, as there are other cards that can do just as well.
> 
> You also mentioned Quadfired Hawaii GPUs. I also own 4 x 290Xs, know exactly how these cards run together, and know where the skeletons are buried when using Quadfire with these GPUs. No 4-way setup from either vendor is perfect, and it is totally wrong to cherry-pick games and benches to suit your side of the argument when it comes to scaling.
> 
> Take it from someone who uses both every day: the scaling is very similar for both vendors. All the cards in my signature I still have and use, plus some I have not included, like HD 5970s for example.
> 
> As for bridges, don't be surprised if you see AMD go back to using them or something similar, as it will be needed if games appear that start using all the multi-GPU features that DX12 can offer.


May I beg for your salary please? Pretty please?


----------



## Jflisk

Quote:


> Originally Posted by *Thoth420*
> 
> You guys are really making me consider my first dual gpu rig. I have one Fury X now.
> So....for 1440 gaming (preferably above 60hz) is the scaling good enough to justify another fury x(and the extra block)? Both scenarios they will be cooled in a custom loop not the stock water cooler.


Grab the second one, it's worth it. I couldn't deal with just one. I'm waiting for the X2 and might do 3 total.


----------



## Kaapstad

Quote:


> Originally Posted by *battleaxe*
> 
> May I beg for your salary please? Pretty please?


I am just a factory worker.









Computers are my main hobby and I have very few other interests and commitments.


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> Sound advice for sure, but it can be applied to most questions regarding GPUs in specific and technology in general.


See, my reasoning for that response was purely that he was thinking of taking on the cost of the added card plus a waterblock and all the associated costs and extra work. If it were purely plug-and-play with no extra costs, then it's really just about whether you can afford it.


----------



## xer0h0ur

Quote:


> Originally Posted by *Kaapstad*
> 
> I have both Titan Xs and Fury Xs in identical PCs; the only difference is the AMD setup is in a red case and the NVidia one is in a green case. So when I say the scaling is about the same for both camps, I do know from practical experience.
> 
> Having 3 or 4 Titan Xs also opens up options you cannot get with 4 Fury Xs or 4 980 Ti cards: there are several games @2160p that are both very GPU-grunt-bound and use over 8GB of VRAM, where both SLI scaling and 12GB of memory are important. So, putting it bluntly, if you are going to use Titan Xs the smart thing to do is get 3 or 4 of them and make use of the extra memory. It is a waste of money to use just a single TX, as there are other cards that can do just as well.
> 
> You also mentioned Quadfired Hawaii GPUs. I also own 4 x 290Xs, know exactly how these cards run together, and know where the skeletons are buried when using Quadfire with these GPUs. No 4-way setup from either vendor is perfect, and it is totally wrong to cherry-pick games and benches to suit your side of the argument when it comes to scaling.
> 
> Take it from someone who uses both every day: the scaling is very similar for both vendors. All the cards in my signature I still have and use, plus some I have not included, like HD 5970s for example.
> 
> As for bridges, don't be surprised if you see AMD go back to using them or something similar, as it will be needed if games appear that start using all the multi-GPU features that DX12 can offer.


Or you can drop the platitudes and just click on some reviews showing multiple games on dual, triple and quad Titan X, 980 Ti and Fury X setups. I am sure you will excuse me if I trust a reviewer over your statement. I would also hope people aren't blindly believing that just because AMD's XDMA CrossFire solution scales better than Nvidia's SLI, it is somehow without its demons. It is simply not possible to have 4 of anything without running into headaches; that has been proven time and again with cards from either side. However, that still has nothing to do with the scaling itself, which still favors AMD's XDMA solution far more than it ever does SLI.


----------



## Ceadderman

Quote:


> Originally Posted by *Arniebomba*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> iirc, the Radiatior itself is 120mm side to side.
> 
> 120mm screw holes are 105mm center to center. So that leaves 15mm of space cut by half so your answer *should* be 7.5mm from center to edge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the info!
> Though i think the sides are different in length, taken from the screw holes.
> See:
> 
> Oh and the radiator is 154mm in length

Okay, then the math tends to be the same, but with an overall length of 154mm instead of 120mm. I was thinking you wanted side to side, not end to end.

120mm holes are still 105mm center to center, whether sideways or lengthwise. Square fan bodies are still 120mm as well. Since the rest of the rad extends further with the fan squarely mounted, you have 34mm of difference to account for; halve that for an even 17mm, +/- a couple mm if the body isn't evenly balanced.

How does that sound?
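Putting those figures together (assuming, as above, a 120mm fan body centered along the 154mm radiator with the hole pattern centered under it, which the photo posted earlier suggests may not be exact):

```python
# Lengthwise hole-to-end distance for the radiator, assuming a 120mm
# fan body centered on a 154mm-long radiator (assumed, not measured).
FAN_BODY_MM = 120
HOLE_SPACING_MM = 105
RAD_LENGTH_MM = 154

overhang_per_end = (RAD_LENGTH_MM - FAN_BODY_MM) / 2    # 17.0mm past the fan
hole_to_fan_edge = (FAN_BODY_MM - HOLE_SPACING_MM) / 2  # 7.5mm within the fan
print(overhang_per_end + hole_to_fan_edge)              # 24.5 (mm) to the end
```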









~Ceadder


----------



## Kaapstad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Or you can drop the platitudes and just click on some reviews of people showing multiple games on dual, triple and quad Titan X, 980 Ti and Fury X's. I am sure you will excuse me if I trust a reviewer over your statement. I would also hope people aren't blindly believing that just because AMD's XDMA crossfire solution scales better than Nvidia's SLI that somehow its without its demons. Its simply not possible to have 4 of anything without running into headaches. That has been proven time and time again with cards from either side. However, that still has nothing to do with the scaling itself which is still favoring AMD's XDMA solution far more than it ever does SLI.


As this is the AMD side of the forum, I am not here to post a load of negative benchmarks to prove a point.

There is very little difference between C/F and SLI; you can take it or leave it, or do as I did and take the time to find out.

Even if I did post the screenshots to prove my point, some people would probably have some excuse or other as to why the C/F scores were so bad.


----------



## Malamute3511

I always thought CF scaled better than SLI once you got into 3-way and 4-way. Once you hit 3 and 4 way it's a moot point anyway, as you don't get enough of a bump in performance to make the 3rd or 4th GPU worth it, IMO. A small performance boost for the headaches it brings is just not worth it. That's why I run 2 cards in SLI, and when I had my old AMD cards I ran 2-way CF. I could be wrong, though.


----------



## Silent Scone

Quote:


> Originally Posted by *Malamute3511*
> 
> I always thought CF scaled better than SLI once you got into 3 way and 4 way.


In terms of raw frame scaling, yes


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> I suppose that really depends on how long you plan on keeping that setup? If you're going to jump ship to the Arctic Islands generation with HBM2, the node shrink, extra power and rumored new iteration of GCN then I would say don't do it. If you plan on keeping it for longer than that then it could be worth it for you assuming whatever game(s) you play have crossfire support.


3 to 5 years, as I don't game as much as I used to. Also the tubing is going to be PETC acrylic, so it's pretty locked in for a while once it's finished.


----------



## Ceadderman

You mean PETG right?









~Ceadder


----------



## Neon Lights

Quote:


> Originally Posted by *Kaapstad*
> 
> There is very little difference between C/F and SLI, you can take it or leave it or do the same as me and take the time to find out.


Yes, but judging from the games that are commonly benchmarked, it appears more of them support Tri/Quadfire than Triple/Quad SLI.

I used 4 7970s myself, and I can only say that when 3 or 4 GPUs were supported it worked, and it was also still playable because the input lag was not too high.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> 3 to 5 years as I don't game as much as I used to. Also the tubing is going to be PETC Acrylic so it's pretty locked in once it's finished for a while.


In that case go for it.


----------



## Thoth420

Quote:


> Originally Posted by *Ceadderman*
> 
> You mean PETG right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah. The hard tubing. Was replying on my phone.

Quote:


> Originally Posted by *xer0h0ur*
> 
> In that case go for it.


I think I will... I just hope I either get two GPUs that don't coil whine, or that the sound dampening keeps me from hearing it.


----------



## Otterfluff

Quote:


> Originally Posted by *Neon Lights*
> 
> Well, if one has a Fury Strix one can tweak the voltage with ASUS GPU Tweak. I wrote a post about it in this thread but nobody wrote a reply yet.
> 
> I repeat here: Flash the BIOS(es) that I linked to in my previous post.


I tried the unlocked ASUS BIOS on my Gigabyte Fury X, but it wouldn't boot up; I reverted using the backup BIOS. I am planning on getting a second Fury X and will grab an ASUS Fury X to test with, which will hopefully work. I'd grab a Strix if there were a waterblock compatible with it. Know of any blocks currently compatible with the ASUS Strix?

I already have the EK Fury X blocks on back order, but I'd change that if a Strix-compatible one were around.


----------



## Alastair

Quote:


> Originally Posted by *Otterfluff*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Neon Lights*
> 
> Well, if one has a Fury Strix one can tweak the voltage with ASUS GPU Tweak. I wrote a post about it in this thread but nobody wrote a reply yet.
> 
> I repeat here: Flash the BIOS(es) that I linked to in my previous post.
> 
> 
> 
> I tried the unlocked asus bios on my gigabyte fury X but it wouldent boot up. Reverted it using the backup bios. I am planning on getting a second fury X and will grab a Asus fury X to test with, which will hopefully work. Id grab a strix if there was a waterblock compatible with it. Know if any blocks currently compatible with the Asus Strix?
> 
> I already have the ek fury x blocks on back order but id change that if a strix compatible one was around.

No blocks for Strix sorry man. EK has no plans at this time to produce a block for the Fury Strix.


----------



## localh85

Quote:


> Originally Posted by *Kaapstad*
> 
> As this is the AMD side of the forum I am not here to post a load of negative benchmarks to prove a point.
> 
> There is very little difference between C/F and SLI, you can take it or leave it or do the same as me and take the time to find out.
> 
> Even if I did post the screenshots to prove my point some people would probably have some excuse or another as to why the C/F scores were so bad.


There is a noticeable difference, to the effect that Crossfire outperforms SLI in every instance.

gpuscale.jpg 245k .jpg file


----------



## Kaapstad

Quote:


> Originally Posted by *localh85*
> 
> There is noticeable difference to the effect that Crossfire outperforms SLI in every instance.
> 
> gpuscale.jpg 245k .jpg file


It is very easy to cherry-pick a few games to prove a point, lol.

It would be very easy for me to bench a load of mainstream games and have the summary give it to the NVidia cards; all I would have to do is include The Witcher 3 and a few of the Total War games, where 4-way support on AMD cards is practically non-existent.


----------



## localh85

Quote:


> Originally Posted by *Kaapstad*
> 
> It is very easy to cherry pick a few games to prove a point lol.
> 
> It would be very easy for me to bench a load of mainstream games and the summary would give it to the NVidia cards. All I would have to do is include The Witcher 3 and a few of the Total War games where 4 way support on AMD cards is practically non existent.


[source required]


----------



## Kaapstad

Quote:


> Originally Posted by *localh85*
> 
> [source required]


Me lol

On my own cards.


----------



## AC1White1Glint

Quote:


> Originally Posted by *Kaapstad*
> It is very easy to cherry pick a few games to prove a point lol.


True. It is very easy to cherry pick games to prove your point. However...
Quote:


> Originally Posted by *Kaapstad*
> It would be very easy for me to bench a load of mainstream games and the summary would give it to the NVidia cards.


Without running benchmarks and comparing them to other cards on the market there is no way to determine which is better under all/what circumstances. All that's happened here today is that you tried to throw mud on another member who provided evidence without providing your own proof.
Quote:


> Originally Posted by *Kaapstad*
> 
> Me lol
> 
> On my own cards.


That's not evidence; that's you expecting us to take your word at face value and as the truth. That won't do. So how do we resolve this, Mr. Kaapstad?


----------



## xer0h0ur

It's quite easy: you ignore the person providing no proof of his claims and believe the reviews showing actual results across a spectrum of games, such as the review I have linked more times than I can count.


----------



## Kaapstad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its quite easy. You ignore the person providing no proof of his claims and believe the reviews showing actual results across a spectrum of games. Such as the review I have linked more times than I can count.


That Maingear video contained mostly synthetic benches from Unigine and Futuremark, along with the Tomb Raider bench using overclocked cards.

It also contained the MLL bench, which should never be used for more than 2 cards from either camp, as there is no support.

If I post the benches from Futuremark and Unigine, along with the Tomb Raider one, run on my overclocked cards in identical PCs, will you admit that the Maingear video was garbage?


----------



## Mega Man

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its quite easy. You ignore the person providing no proof of his claims and believe the reviews showing actual results across a spectrum of games. Such as the review I have linked more times than I can count.


Or you can do this.
Quote:


> Originally Posted by *Kaapstad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *localh85*
> 
> [source required]
> 
> 
> 
> Me lol
> 
> On my own cards.

See my 8 way. Cfx owns all Nvidia can do. Source:"me lol"


----------



## xer0h0ur

Quote:


> Originally Posted by *Kaapstad*
> 
> That maingear Video contained mostly synthetic benches from Unigine and Futuremark along with the Tomb Raider bench using overclocked cards.
> 
> It also contained the MLL bench which should never be used for more than 2 cards from either camp as there is no support.
> 
> If I post the benches from Futuremark and Unigine along with the Tomb Raider one run on my overclocked cards using identical PCs will you admit that the Maingear video was garbage ?


Obviously you have no idea that I am talking about DGLee's benchmarking, which was done on Titan X, 980 Ti and Fury X across a lot of games and resolutions...


----------



## Ceadderman

Quote:


> Originally Posted by *Kaapstad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *localh85*
> 
> There is noticeable difference to the effect that Crossfire outperforms SLI in every instance.
> 
> gpuscale.jpg 245k .jpg file
> 
> 
> 
> 
> It is very easy to cherry pick a few games to prove a point lol.
> 
> It would be very easy for me to bench a load of mainstream games and the summary would give it to the NVidia cards. All I would have to do is include The Witcher 3 and a few of the Total War games where 4 way support on AMD cards is practically non existent.

Isn't that what reviewers do? Cherry-pick benches to show the hardware being reviewed in a favorable light? They are, after all, for the most part "sponsored" reviewers. And you see A LOT more of it coming from nVidia reviewers, ftmp.









~Ceadder


----------



## Kaapstad

Quote:


> Originally Posted by *xer0h0ur*
> 
> Obviously you have no idea I am talking about DGLee's benchmarking which was done on Titan X, 980 Ti and Fury X across a lot of games and resolutions.....


The Maingear video was posted up as proof, whereas in fact they got just about everything wrong, and I just offered to prove it.

This is why I don't like posting benchmarks in owners' threads.

I have also done a lot of testing, along with other people on other forums, in a wide variety of games.

I won't be posting any benchmark scores here, as it would be pointless.

The other thing I find laughable is that all this started over a simple question someone asked about which brand scaled better 4-up. I gave an honest and accurate answer, and when people found it did not suit them they made a huge fuss about it.

If you are not prepared for the answer, don't ask the question.

Quote:


> Originally Posted by *Ceadderman*
> 
> isn't that what reviewers do? Cherrypick benches to show their hardware being reviewed in a favorable light? They are after all for the most part "sponsored" reviewers. And you see A LOT more of it coming from nVidia reviewers ftmp.
> 
> ~Ceadder

+1

With the wide variety of possible games and benches to use, it is almost impossible to figure out which is better. With the swings and roundabouts you get, it appears very close between the brands when it comes to 4-way scaling, and all you can say is that it is about even.

Even when one brand comes out on top in a game, it does not mean that it is worth using that setup for that game. For example, in Middle-earth: Shadow of Mordor (using max settings at 4K) the Fury Xs do scale better and get higher fps than the Titan Xs, but it is a total stutterfest and dreadful to look at; the Titan Xs get slightly lower fps, but the minimums are far better and the game runs very smooth.


----------



## Randomdude

But Kaapstad, look at it like this, even though you say you've done this and that, the facts people see are - there are reviews with numbers, there is you saying things without any such "numbers" (i.e. factual proof) to back it up. Therefore you have no answers (numbers) to give...


----------



## Kaapstad

Quote:


> Originally Posted by *Randomdude*
> 
> But Kaapstad, look at it like this, even though you say you've done this and that, the facts people see are - there are reviews with numbers, there is you saying things without any such "numbers" (i.e. factual proof) to back it up. Therefore you have no answers (numbers) to give...


I offered to post some proof above to show where the Maingear video was garbage but then people avoided the issue.


----------



## POOTYTANGASAUR

I am a 980 Ti owner. I'm not here to crash the thread or anything. I check up constantly to see if AMD has unlocked voltage control; no luck yet, hahaha. I want to see how the Fury X compares when it is at its best, not gimped with stock volts. That isn't the reason I am posting, though. I am posting because of Kaapstad. I mean, it's not like people lie on the internet or anything, but I wouldn't mind if he at least provided some pics of his rigs, just so nobody can doubt his experience lol. Also, like others have commented recently in this thread, from the benchmarks I have seen online, AMD scaling seems far better past 2 GPUs. I have never CrossFired or SLIed though, so I am not speaking from experience.


----------



## Kana-Maru

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> I am a 980ti owner. I'm not here to crash the thread or anything. I check up constantly to see if amd has unlocked voltage control, no luck yet hahaha. I want to see how the fury X compares when it is at its best, not gimped with stock volts.


Smh. The card isn't gimped. What are you talking about? From my tests the card performs very well. Well enough to justify my purchase over the 980 Ti Hybrid I wanted. I'm not using the most up-to-date equipment, but overall the 1440p and 4K scores are impressive.

I know and understand that you can overclock the GTX 980 "Baby Titan" Ti well over stock settings, but I have no need to increase my power usage, increase heat and pound my card into the ground with high OCs. I don't need an e-peen on the same super-old benchmark programs. I'm surprised how well stock settings are performing in DX11 games. Nvidia's drivers are much more optimized for DX11 than AMD's. AMD, on the other hand, might have the upper hand with the parallel structure/GCN/low-level API work that helped craft DX12 and Vulkan.


----------



## POOTYTANGASAUR

Disregard unless you are Kana-Maru









Spoiler: Warning: Spoiler!



@Kana-Maru - Whoa bud. I wasn't bashing anyone's purchase of the Fury X. But the performance delta between the cards atm is undeniable. IMO the Fury X could close the gap with overclocking, which isn't possible atm because there is no real voltage control. TBH I will potentially be purchasing a Fury X; early benchmarks are showing AMD crushing it in DX12. If more games/benchmarks indicate that in the future, then I will most definitely be getting one. Also, I don't OC for the e-peen; it's nice I guess, but I play @1440p so I want all the perf I can get from a single card. When I built this system the 980 Ti was the clear performance leader; if the Fury X eats that gap I will gladly return the more expensive green-team card for one.


----------



## Kana-Maru

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> @Kana-Maru - Woah bud. I wasn't bashing anyones purchase of the Fury X.


" I check up constantly to see if amd has unlocked voltage control, no luck yet hahaha."

That seemed like trolling to me haha.
Quote:


> But the performance delta between the cards atm is undeniable.


If you are comparing 3rd-party OVERCLOCKED cards to a STOCK Fury X, then yeah, you are comparing apples to oranges. I've mentioned that the 980 Ti overclocks better. Yes, AMD is doing better with DX12 draw calls and the AoS benchmark at the moment.

Getting the most performance is important, but not when you have to spend an arm and a leg to get it, and then once you get it you have to increase the heat and power consumption. I was playing Batman: AK and my Fury X never went above 32°C. I was getting upwards of 100-140 fps @ 1080p. I beat the game. Great game, just buggy. The fps can dip to 70 and jump to 140 in the same area. I hope they fix the game soon. Anyway, @ 1440p and 4K the Fury X has great performance.


----------



## rdr09

Quote:


> Originally Posted by *Kaapstad*
> 
> I offered to post some proof above to show where the Maingear video was garbage but then people avoided the issue.


Post away already.

Including oc values.


----------



## POOTYTANGASAUR

Disregard unless you are Kana-Maru









Spoiler: Warning: Spoiler!



I don't think you understand what I mean. I will overclock any card I purchase, simply because I will gain performance. If I were to purchase a Fury X I would want to overclock it, which atm it really can't do. So I really cannot compare OC to OC; I am forced to compare a heavily OCed card to a barely OCed card. That is why I am checking constantly to see if voltage control is unlocked, and watching people's results with OCed Fury Xs. Also, as a note, the 980 Ti isn't a very hot card; the air-cooled ones are no hotter than my R9 290 Vapor-X (heavily OCed, it rarely hit 85°C with a good fan curve). Especially since I have the Hybrid, heat is a non-issue, and power consumption is also a non-issue to me because electricity is cheap lol. I just want maximum performance. Also, TBH, if you are already spending $550+ for a graphics card, I don't think it would hurt many people to spend another $200 to get the absolute best performer on the market. If the Fury X clocks well with voltage, the story will change. Lastly, no trolling; I just check up weekly to see if they released drivers for voltage control, just like I waited for 290 drivers. It is just a thing with new graphics cards; it just seems it is taking forever with the Fury and Fury X.

TL;DR - I am not trolling. I just want the best card around within reason (the Titan X is stupid IMO). At the time I purchased the 980 Ti it was the king. That has the possibility of changing with voltage control on the Fury X. I don't think it will overtake the 980 Ti (in DX11), but it should be able to match it, at which point the Fury X would be the better buy, because Nvidia's DX12 performance looks disappointing so far.

EDIT: A Hybrid 980 Ti is a stock board just with an AIO; I would say it is the best apples-to-apples for 980 Ti vs Fury X. Both water-cooled, both stock board. I would almost say air coolers on the 980 Ti don't make a good comparison, because liquid can generally get 25+ more MHz. Most air-cooled 980 Tis don't often break 1500 MHz; AIO 980 Tis almost always can, unless it's a bad chip. This is my last comment so I don't push this thread off topic more than I already have lol.


----------



## Orthello

Has anyone seen AoS benchmarked with a Fury or Fury X? I have only seen a 980 (getting beaten by a 390X in heavy scenes) and a 980 Ti (sometimes being beaten by a 290X at 4K). No overclocks in there, but it still looks promising for AMD in this game.

I want to see what happens with a Fury X in this game. Does it then beat a 980 Ti by a wide margin, or, seeing as the ACEs are only mildly improved from prior cards, is it only a modest increase from the 290X/390X level?

Guru3D is going to run this bench on the Nano soon. Going to be interesting for sure.


----------



## Kaapstad

Quote:


> Originally Posted by *rdr09*
> 
> Post away already.
> 
> Including oc values.


I will post some more later when I get time, providing people don't get upset in this thread.

Here is one Maingear got very wrong. I am using max settings @2160p, which is higher than they used, yet I still get a higher score than they did with the Titan Xs.

4 x Fury X @1140/500 stock volts
5960X @4.5
2160p
15.7.1 drivers



4 x TitanX @1441/1977
5960X @4.5
2160p
352.86 drivers



What is a bit unfair is that the Fury X does not have voltage control, but even if it did, it would not be enough to close the gap in most of the synthetic benches. Fortunately the AMD cards do a bit better in games.

Tomb Raider Max settings
5960X @4.0
4x Fury X @1130/500
15.7.1
2160P










2160p
4 x TitanX @1405/2002
5960X @4.0
347.88










I will post some more later.


----------



## rdr09

Quote:


> Originally Posted by *Kaapstad*
> 
> I will post some more later when I get time, providing people don't get upset in this thread.
> 
> Here is one Maingear got very wrong. I am using max settings @2160p, which is higher than they used, yet I still get a higher score than they did with the Titan Xs.
> 
> 4 x Fury X @1140/500 stock volts
> 5960X @4.5
> 2160p
> 15.7.1 drivers
> 
> 
> 
> 4 x TitanX @1441/1977
> 5960X @4.5
> 2160p
> 352.86 drivers
> 
> 
> 
> What is a bit unfair is that the Fury X does not have voltage control, but even if it did, it would not be enough to close the gap in most of the synthetic benches. Fortunately the AMD cards do a bit better in games.
> 
> Tomb Raider Max settings
> 5960X @4.0
> 4x Fury X @1130/500
> 15.7.1
> 2160P
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2160p
> 4 x TitanX @1405/2002
> 5960X @4.0
> 347.88
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will post some more later.


Thanks. +rep.

1600 boost lol. I read that 347 does not suffer TDRs.


----------



## mRYellow

These benches are invalid. You need to use 15.8








Jokes!


----------



## Kaapstad

Quote:


> Originally Posted by *mRYellow*
> 
> These benches are invalid. You need to use 15.8
> 
> 
> 
> 
> 
> 
> 
> 
> Jokes!


Just removed 15.8 from my system, as it is not as good as 15.7.1 for quadfire.

Just got into a benching match on another forum with my mate about whose Fury Xs are faster in Tomb Raider, and just managed to edge him out.









Here is my updated score using 15.7.1


----------



## Alastair

@hyp36rmax How come we don't have a comprehensive list of owners on the OP yet?


----------



## Kana-Maru

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> EDIT: A hybrid 980ti is stock board just with AIO, I would say it is best apples to apples for 980ti VS Fury X. Both water cooled, both stock board. I would almost say air coolers on the 980ti doesn't make a good comparison because liquid can get generally 25+ more mhz. Most 980ti air coolers don't often break 1500mhz, AIO 980tis almost always can unless its a bad chip. This is my last comment so I don't push this thread off topic more than I already have lol.


No, it wouldn't be apples to apples. The 980 Ti reference is an air cooler, and heat was one of the biggest issues that Nvidia fanboys cried about with nearly every AMD GPU over the past couple of years. The 980 Ti Hybrid is still a 3rd-party aftermarket GPU with a higher price [$749.99-$800.00 from what I've seen]. I've compared my Fury X vs a reference 980 Ti @ stock and overclocked settings using synthetic benchmarks for starters. At stock the Fury X won 3 out of the 5 benchmarks [DX11]. Overclocked, the 980 Ti led the tests. I still have more tests to complete, but a high overclock doesn't necessarily mean a ton of performance over the Fury X's lower OC.

For instance, the higher-clocked reference 980 Ti @ 1477 MHz vs the reference Fury X @ 1125 MHz showed a 10.6% difference in Fire Strike [Performance] total score, and the graphics score was only 8.4% in favor of Nvidia. So even though the 980 Ti is overclocked roughly 31% higher than the Fury X OC, it doesn't mean the 980 Ti crushes the Fury X. A 10.6% lead will easily be lowered with better drivers.
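A quick arithmetic sketch of the comparison above (the clocks and the 10.6% figure are the ones quoted in this post; the point is only how a large clock lead shrinks to a small score lead):

```python
def pct_gain(new, base):
    """Relative gain of `new` over `base`, in percent."""
    return (new - base) / base * 100.0

# Core clocks quoted above (MHz).
fury_x_oc = 1125
ti_oc = 1477

clock_lead = pct_gain(ti_oc, fury_x_oc)  # roughly the "31% higher" figure
score_lead = 10.6                        # Fire Strike total-score gap quoted above

print(f"clock lead: {clock_lead:.1f}%, score lead: {score_lead:.1f}%")
```

So a roughly 31% clock advantage buys only an ~11% total-score advantage, which is the asymmetry being pointed out.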

Now if you want to read my review you can check it out. Here are two pages for Fury X vs 980 Ti stock and overclock:
Stock:
http://www.overclock-and-game.com/hardware/computer-tech-reviews/40-amd-fury-x-review?showall=&start=8

Overclock:
http://www.overclock-and-game.com/hardware/computer-tech-reviews/40-amd-fury-x-review?showall=&start=9

I'll be updating the overclock section soon. It takes some time to get the data and type everything up. Just remember I'm using an X58 + 6-core + DDR3-1675MHz + PCIe 2.0 against an X99 + i7 [email protected] GHz 8-core + DDR3-2133MHz + PCIe 3.0.


----------



## POOTYTANGASAUR

Disregard unless you are Kana-Maru









Spoiler: Warning: Spoiler!



Well, now I have to reply lol. Couldn't resist. A non-reference 980 Ti can be had for $650 without a sale, so why compare the reference cooler? I would never have compared a stock 290X/290 to a stock 780 Ti. Most people buy cards with aftermarket coolers, unless they are going to water-cool their card. You could also get a stock 980 Ti and buy the Hybrid cooler, and the price would end up ~$680 ($630 for a reference 980 Ti and $50 for the cooler). I am not a fanboy, so I don't care what fanboys cried about. I am discussing the correct way to compare two graphics cards while trying to attain peak performance. It would make sense for both to be AIO-cooled, as it is a cheap and great solution for near-custom-water-cooling performance. The price difference is so small that I would call it a non-issue. All this is about is performance, which atm the 980 Ti leads in generally by about 10%; a lot of games show even more. It is a shame there is no voltage adjustment for the Fury X yet, but until then we have to make purchases based on current performance. Once voltage control comes around, we will see how they stack up with both cards 100% balls to the wall. This is my last comment, for realsies this time lol. I want to repeat: I am not shaming anyone for purchasing a Fury X; I was really just saying I am excited for the Fury X to get voltage control so I can see how these cards stack up.


----------



## xKrNMBoYx

I've always thought I'd go straight to Arctic Islands/Greenland, but I'm now thinking of getting two Furys. The only dilemma is that $100-200 more gets me 980 Ti SLI. But if both Furys unlock to full Fiji (a dream) then... well, you know.


----------



## Kana-Maru

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> Well now I have to reply lol. Couldn't resist.


Haha I knew you would.








Quote:


> A non ref 980ti can be had for $650 without a sale so why compare the reference cooler?


Well, AMD has nothing but reference designs for all of their cards at the moment. I used Nvidia's reference card, which doesn't include a water-cooled solution. If you want a looped water cooler, then you'll have to spend well over $650.00 plus tax, depending on where you live. Smart move by AMD to include the water-cooling solution they carried over from the 295X2.

Quote:


> I would never have compared a stock 290x/290 to a stock 780ti. Most people buy cards with aftermarket coolers, unless they are going to watercool their card. You could also get a stock 980ti and buy the hybrid cooler and the price would end up ~$680


That's you and that's your choice. You are entitled to your opinions on how you would benchmark GPUs. Great. The Hybrid cooler is $100 and the 980 Ti is $649.99 and higher. You'll be spending AT LEAST $749.99, which is the price of the 980 Ti Hybrid. So I'm not sure where you are pulling those prices from, unless you are trying to include various rebates or something.

$100
http://www.newegg.com/Product/Product.aspx?Item=N82E16814998105

$749.99
http://www.newegg.com/Product/Product.aspx?Item=N82E16814487144

I'm not sure where you are getting $630 from either. The 980 Ti released @ $649.99, and most of the cards are near $700 now [$680 - Newegg].

Quote:


> I am not a fanboy so I don't care what fanboys cried about. I am discussing the correct way to compare two graphics cards while trying to attain peak performance. It would make sense for both to be AIO cooled as it is a cheap and great solution for near custom watercooling performance. The price difference is so small that I would call it a non-issue. All this is about is performance, which atm the 980ti leads in generally by about 10% alot of games show even more.


I never called you a fanboy; I simply stated what fanboys cried about. I see you took that personally. People have been unfair to AMD GPUs over the past few years, and this is coming from a person who rocked dual 670s for years, and earlier Nvidia technology before that. Seeing people compare the Fury X to highly overclocked 980 Ti aftermarket cards was laughable. I chose to compare things my way, using both reference designs from the actual companies [AMD/Nvidia]. The price difference for the water-cooled solution is 15%. OF COURSE THE 980 TI LEADS in DX11. We all know Nvidia has the best-optimized drivers. Also, OF COURSE NVIDIA LEADS with the higher overclock settings. The 980 Ti's 30%+ overclocks over the stock core clock only translate to 5%-10% performance increases, and even less at higher resolutions. The Fury X holds its ground well, and with DX12 games coming from AAA companies I expect AMD's GCN to perform even better.

Quote:


> It is a shame there is no voltage adjustment for the fury X yet but until then we have to make purchases based off of current performance. Once voltage control comes around we will see how they stack up with both cards 100% balls to the walls. This is my last comment for realsies this time lol. I want to repeat I am not shaming anyone for purchasing a fury X I was really just saying I am excited for the fury x to get voltage control so I can see how these cards stack up.


There's nothing to shame IMO. The performance is what matters to me, and the card is performing very well. Someone did indeed hack the voltage limitations on a Fury [hacked to Fury X] and overclocked the heck out of it with LN2. Got 1 TB/s out of the HBM as well. You can look it up if you haven't heard about it already. I see that voltage is make or break for you. You made your choice and went with the 980 Ti. I personally couldn't care less if the Fury X's voltage is ever unlocked. As long as AMD continues to improve performance and keep the drivers optimized for the latest titles, I'll be fine.


----------



## Jflisk

WOW, just WOW. It does not matter if you have 980 Tis or Fury Xs; if you have at least 2, they're both going to chew through games or anything else for that matter. It's not like we are discussing anything running below 60 FPS with either option. It's a close race either way, from everything I have read at least. If you want cool and quiet out of the box, Fury X; if you want to do a water block or air, 980 Ti. Either way, if you add a water block they're about the same price by the time you're done, for those of us with the water-cooling addiction. They're both great cards any way you look at it. Just my 2c.


----------



## battleaxe

Quote:


> Originally Posted by *Jflisk*
> 
> WOW, just WOW. It does not matter if you have 980 Tis or Fury Xs; if you have at least 2, they're both going to chew through games or anything else for that matter. It's not like we are discussing anything running below 60 FPS with either option. It's a close race either way, from everything I have read at least. If you want cool and quiet out of the box, Fury X; if you want to do a water block or air, 980 Ti. Either way, if you add a water block they're about the same price by the time you're done, for those of us with the water-cooling addiction. They're both great cards any way you look at it. Just my 2c.


Wow. Someone with some sense. Freaking awesome.


----------



## POOTYTANGASAUR

Disregard unless you are Kana-Maru









Spoiler: Warning: Spoiler!



I don't want to quote-spam, cuz it will make the post enormous, but I will reply in order to what you listed.

1. It is pretty difficult to not want to get the last word in an argument or discussion (for me at least) lol.

2. The Fury X is AIO-cooled; IMO it is only fair for both cards to be AIO-cooled then. Especially if AIO is all that AMD offers (it's all they really need to offer; why run air if stock is water lol).

3. On Newegg and other websites, a stock 980 Ti can be had for $620-640 without a sale (and if one happens to be on sale, why not pick it up). I misprinted the Hybrid cooler price; it is $100. So the end result would reasonably cost $730: $80 more, but it performs better in a majority of cases and runs just as cool. The only edge the Fury X has at that point is size.
Currently on Newegg alone there are an Asus, a Zotac, and an MSI card listed for $650, then another MSI and a Windforce for $660. So you get aftermarket air coolers, some custom PCBs, and more performance for the same price atm.

4. I didn't think you called me a fanboy; I was just replying to this: "No it wouldn't be apples to apples. 980 Ti reference is a air cooler and heat was one of the biggest issues that Nvidia fanboys cried about with nearly every AMD GPU over the past couple of years.." I was just saying I don't care what that group of people thinks, because I am not one of them haha. I had no issues whatsoever with the heat of the Hawaii cards. Aftermarket coolers like Sapphire's could easily tame the beasts.

5. "There's nothing to shame IMO. The performance is what matters to me and the card is performing very well." (you were replying to me saying it's a shame there is no voltage control yet)
Voltage control would allow for better performance, so you should care at least a little lol. Also, if you look at a majority of game benchmarks, the 980 Ti beats the Fury X by around 10% clock for clock. The 980 Ti being the only one able to OC atm gives it even more of an edge. Once the Fury X gets that ability, that will be one thing the 980 Ti loses, and hopefully the performance gap will close.

I am not arguing that the Fury X isn't a good purchase. I am simply saying that for a lot of people shopping in this price range, the 980 Ti makes more sense atm. Voltage control and driver improvements will help the Fury X's case in the future, and that is what I look forward to. But atm, as far as real-world performance goes, the 980 Ti is king. DX12 will probably be different in the future, but there are barely any instances of DX12 in the wild atm, and DX11 will take a long time to phase out as well, so for a lot of people a DX11 card is all they need. If the Fury X can catch up to the 980 Ti with OCs and drivers, then one day it will be king lol. We will see.
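To make the "clock for clock" framing concrete, a per-clock normalization can be sketched like this (the fps and clock figures are hypothetical placeholders, not measurements from this thread):

```python
def per_clock(fps, core_mhz):
    """Naive fps-per-MHz metric; ignores memory bandwidth, IPC and drivers."""
    return fps / core_mhz

# Hypothetical illustration only: 980 Ti at 1190 MHz, Fury X at 1050 MHz.
ti = per_clock(fps=62.0, core_mhz=1190)
fury = per_clock(fps=50.0, core_mhz=1050)

# Comparing the normalized figures separates per-clock efficiency
# from raw clock headroom, which is the distinction being argued above.
print(f"980 Ti: {ti:.4f} fps/MHz, Fury X: {fury:.4f} fps/MHz")
```

With these made-up numbers the 980 Ti would lead by roughly 9% per clock, on top of whatever extra headroom overclocking adds.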


----------



## Scorpion49

Welp. I sent my Fury to athlon micro for Sapphire RMA a week ago; it arrived there on Monday and I still haven't heard anything back. I can't decide if that is a good thing or a bad thing.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Welp. I sent my Fury to athlon micro for Sapphire RMA a week ago, it arrived there on Monday and I still haven't heard anything back from it. I can't decide if that is a good thing or a bad thing.


RMA always takes too long. I don't think I've ever gotten one back within two weeks, or even heard anything within a week. Maybe an acknowledgment that they received it, at most. I wouldn't worry yet.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Welp. I sent my Fury to athlon micro for Sapphire RMA a week ago, it arrived there on Monday and I still haven't heard anything back from it. I can't decide if that is a good thing or a bad thing.


Send an email to [email protected] and inquire on your status.


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> RMA always takes too long. I don't think I've ever gotten one back within two weeks or even heard anything within a week. Maybe recognition that they received it max.. I wouldn't worry yet.


Well, I've almost always had a notification that they received it, which is all I wanted really. Heck, with MSI I've had a new card by the end of the week several times. Asus is the only one I know that will sit on a product for 6 months before deciding you can have it back.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I've almost always had a notification that they received it which is all I wanted really, heck with MSI I've had a new card by the end of the week several times. Asus is the only one I know that will sit on a product for 6 months before deciding you can have it back.


No joke. And it was two ASUS boards I was talking about too. LOL

MSI was a lot faster getting me a replacement 290X about a year ago, as memory serves.


----------



## Randomdude

I've had nothing BUT the best experience with Sapphire RMA throughout the 9 years I've used them. I wouldn't be too worried.

@kana-maru Nice to see someone with similar stance to mine around this thread.


----------



## Kana-Maru

Quote:


> Originally Posted by *Jflisk*
> 
> WOW just WOW - Does not matter if you have a 980 ti or Fury X if you have at least 2 there both going to chew through games or any thing else for that matter.


Everyone understands that both cards are great, top-of-the-line GPUs to own. That's not the issue. I was actually looking at a few 980 Tis before switching to the Fury X.

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I dont want to quote spam cuz it will make the post enormous. But I will reply in order to what you listed.
> 
> 1. It is pretty difficult to not want to get the last word in an argument or discussion (for me at least) lol.
> 
> 2. The Fury X is AIO cooled, IMO it is only fair for both cards to be AIO cooled then. Especially if AIO is all that AMD offers (its all they really need to offer, why run air if the stock is water lol)
> 
> 3. On newegg and other websites stock 980ti can be had for 620-640 without a sale (if one happens to be on sale why not pick it up). I misprinted the hybrid cooler it is 100. So the end result would reasonably cost 730. $80 more but performs better in a majority of cases and runs just as cool. Only edge the Fury X has is size at that point.
> Currently on newegg alone there are an asus, zotac, and an msi card listed for $650. Then there is another msi, and windforce for $660. So you get aftermarket aircoolers, some custom pcb, and more performance for same price atm.
> 
> 4. I didn't think you called me a fanboy, I was just replying to this "No it wouldn't be apples to apples. 980 Ti reference is a air cooler and heat was one of the biggest issues that Nvidia fanboys cried about with nearly every AMD GPU over the past couple of years.. " I was just saying I don't care what that group of people thinks because I am not one of them haha. I had no issues whatsoever with the heat of the hawaii cards. Aftermarket coolers like the sapphire could easily tame the beasts.
> 
> 5. "There's nothing to shame IMO. The performance is what matters to me and the card is performing very well." (you were replying to me saying its a shame there is no voltage control yet)
> Voltage control would allow for better performance so you should care at least a little lol. Also if you look at a majority of game benchmarks the 980ti beats the fury x by around 10% clock for clock. The 980ti being the only one able to OC atm gives it even more of an edge. Once Fury X gets that ability that will be one thing the 980ti will lose and hopefully the performance gap will close.
> 
> I am not arguing that the Fury X isn't a good purchase. I am simply saying for alot of people shopping in the price range the 980ti makes more sense atm. Voltage control and driver improvements will help support the Fury X's case in the future and that is what I look forward to. But atm as far as real world performance the 980ti is king, DX12 will probably be different in the future but there are barely any instances of DX12 in the wild atm. DX11 will take a long time to phase out aswell so for alot of people a DX11 card is all they need. If the Fury X can catch up to 980ti with OCs and drivers then one day it will be king lol. We will see.


1. It's a discussion. An argument isn't worth it. Last words don't matter to me as long as the discussion is respectful.

2. What don't you understand about the word "reference"? The 980 Ti reference is an air cooler, and that has always been Nvidia's choice. AMD has moved towards H2O for various reasons.

3. Every 980 Ti I saw on Newegg this morning was listed above MSRP @ $679.99. I'll believe Amazon and Newegg. I was in the market for a 980 Ti, and they have all been higher from 3rd parties, as expected.

4. Just stating facts as to why AMD started using water coolers. People and reviewers complained about everything. I'm glad to see AMD tackle most of those issues.

5. Sigh. Nvidia has better-optimized DX11 drivers. We will see how the 980 Ti and the Fury X battle over time as more DX12/Vulkan titles release. Even in DX11, AMD is known to improve its drivers over time. Don't be so short-sighted. DX12 is being used by many AAA studios, and games are on the way.

I understand what you are saying, and I am saying that I was in the same boat as you. I was looking towards the 980 Ti and waited for the Fury X results. Every gamer is different. I wanted something that ran cooler on the GPU and would perform well @ 1440p/1600p/4K. I'm sorry, but playing Batman: Arkham Knight with Nvidia GameWorks enabled while seeing 100-140 fps [1080p - the game is broken at the moment] and staying below 35°C is simply amazing. So yes, the Fury X will benefit some gamers, as will the 980 Ti. My point is that not everyone cares about punishing their cards with high overclocks and tons of heat. AMD has a history of supporting their GPUs longer and increasing their performance with mature drivers. I expect nothing less this time around.

Quote:


> Originally Posted by *Randomdude*
> 
> I've had nothing BUT the best experience with Sapphire RMA, throughout the 9 years I've used them. I wouldn't be too worried.
> 
> @kana-maru Nice to see someone with similar stance to mine around this thread.


Great. Good to see that there are others out there who don't focus solely, or even mostly, on overclocking potential.


----------



## Scorpion49

Quote:


> Originally Posted by *Kana-Maru*
> 
> Great. Good to see that there are others out there who don't focus solely, or even mostly, on overclocking potential.


A good FreeSync monitor means I'm more interested in silence from the card than max OC. I couldn't care less if I get 79 fps or 71; it all looks the same to me.


----------



## POOTYTANGASAUR

Disregard unless you are Kana-Maru









Spoiler: Warning: Spoiler!



I am not asking you to trust me over newegg lol.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127891&cm_re=980ti-_-14-127-891-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127902&cm_re=980ti-_-14-127-902-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127895&cm_re=980ti-_-14-127-895-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121979&cm_re=980ti-_-14-121-979-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125803&cm_re=980ti-_-14-125-803-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814500382&cm_re=980ti-_-14-500-382-_-Product

A total of six 980 Tis (some reference board, some not) are currently listed on Newegg for $660 or less (without sales). There are 2-3 more on sale for even less than that, and two stock 980 Tis listed on sale for $620 and $630 atm.
Not sure what time you checked, but it couldn't have been within 24 hours, as those have been listed at those prices since at least yesterday morning. Amazon is a different story: all of its non-reference cards are around $680, but there is also a reference card listed for $620 there.

I understand what reference means lolol. But I do not believe stock coolers make for a fair comparison, for reasons I have already stated. Just as a refresher, I will restate a few. First, a lot of people that buy reference/stock cards watercool their graphics card, add an AIO, or fit aftermarket coolers (many did all of the above with the 290/290X). Second, why buy reference when better coolers are offered for $10-50 more, especially if you plan on overclocking and overvolting? Third, I would not compare the stock 980 Ti to the stock Fury X, because the Fury X is watercooled. In no way is that fair, especially when there are AIO coolers like the EVGA Hybrid out there. If a year ago someone had compared a hybrid-cooled 780 Ti to a 290X, everyone would **** themselves. Finally, since these are very high-end cards, a lot of people will be paying that extra money for better cooling, so why show a comparison with one card gimped by air vs AIO cooling? I never paid attention to 290 vs 780 benchmarks because they were generally stock coolers (Nvidia's cooling solution being better at the time, so it wasn't fair); I would really only look at things like the Sapphire Tri-X vs Asus or EVGA, because both cards and both chips are able to stretch their legs with the cooling.

Another thing: nowadays cards are a lot more reliable, and OCing is a lot more prevalent and safer than it used to be. I don't know what you are talking about with this "not everyone cares about punishing their cards with high overclocks and tons of heat". Generally you only need to add around +200mV for high OCs, which is completely safe. Overclocks alone can't damage the hardware; it is the voltage that can, and kept under +300mV it is pretty safe on pretty much all graphics cards nowadays, and that kind of voltage is totally fine with AIO and watercooling. I ran 1225/1675 on my R9 290 with +200mV for over a year and it chugs along just fine. GPU Boost also plays a big role, as you don't even have to touch anything and it will achieve higher clocks; set a temp or power target and let it work. A lot of non-reference 980 Tis can boost over 1300 stock lol.


----------



## Kana-Maru

Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> I am not asking you to trust me over newegg lol.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127891&cm_re=980ti-_-14-127-891-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127902&cm_re=980ti-_-14-127-902-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127895&cm_re=980ti-_-14-127-895-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121979&cm_re=980ti-_-14-121-979-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125803&cm_re=980ti-_-14-125-803-_-Product
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814500382&cm_re=980ti-_-14-500-382-_-Product
> 
> A total of 6 980tis (some reference board some not) currently listed on newegg for $660 or less (without sales). There are 2-3 more on sale for even less than that and 2 stock 980tis listed on sale for 620 and 630 atm.
> Not sure what time you checked but it couldn't have been within 24 hours as those have been listed at those prices since at least yesterday morning. Amazon is a different story, all of its non ref are around $680 but there is also a reference listed for $620 there.


Dude, every link you posted was $649.99 to $659.99. The only card that was $620 required a $20.00 rebate [MSI]. I knew you meant rebate, which is why I mentioned rebates a few posts ago. You claimed that you could get a GTX 980 Ti for $620-$630 or whatever you said earlier. Then you stated that the water-cooled GTX 980 Ti could be purchased for under $700.00 [$680]. Wrong on both counts. As I said, the 980 Ti recently released and the MSRP is $649.99-$1049.99, possibly higher from re-sellers.

Alright, here is a Newegg search I performed *30 seconds ago*.
I'm not asking you to believe me either.

*GTX 980 Ti search*
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=GTX+980+Ti&N=-1&isNodeId=1

A simple search shows $679.99 for a lot of the Tis, or $649.99-$699.99. Far from what you claimed earlier. Just in case that link wasn't enough, here is a screenshot from the Newegg search:

-Zoom Level Version
http://s26.postimg.org/9naeu7auh/Newegg_980_Ti_price.jpg

Quote:


> If a year ago someone compared a hybrid cooled 780ti to a 290x everyone would **** themselves.


Both the 780 Ti and the 290X were air-cooled reference designs. You are once again including a 3rd-party manufacturer, smh.

For the rest of your post, I'd just like to say I understand overclocking. If that's what you are focused on, that's great. Not everyone cares; get over it. I don't care about how reliable overclocking is, because that's not the point of this discussion, and all of that extra stuff can stay on the sidelines. I've made my point.

Quote:


> Originally Posted by *Scorpion49*
> 
> A good FreeSync monitor means I'm more interested in silence from the card than max OC. I couldn't care less if I get 79fps or 71; it all looks the same to me.


I agree. When it gets down to 3fps-7fps, who cares? Apparently many people do. I guess some people have sensitive eyes.


----------



## POOTYTANGASAUR

"As I said the 980 Ti recently released and MSRP is $649.99- $1049.99" I loled.

This will be my concrete final reply. I will only be debunking your oddly angry reply.


Spoiler: Warning: Spoiler!



I listed six 980 Tis which I could easily find on Newegg by scrolling. If you aren't willing to scroll down a web page to save yourself 50 bucks, don't yell at me lololololololol.
You confirmed what I had already confirmed, which was that the six 980 Tis I listed could be had for $650-660.
The one that is listed for $620 on Newegg does have a rebate, but it is $630 without the rebate, so who cares LOL.
You pitched a fit over my misprint saying the Hybrid was $50 more, not $100, which I had already apologized for.
Then you whined about my example, saying that if someone compared an AIO-cooled 780 vs a stock 290, people would cry because it wouldn't be fair lol. It was simply an example; I was saying comparing an AIO to an air-cooled card is stupid, and I am sure a lot of people agree with that.
Pretty much the entire point of the discussion was overvolting and overclocking from the start. It started when you were offended by me saying I was periodically checking in on the Fury X getting voltage control.
Also, that screenshot you posted was funny as hell, since six of the cards shown are for more than $660, but the other six are for $660 or less LOLOLOLOLOLOL. Are you trolling me? I said that non-reference 980 Tis could be had for $660 or less, you are trying to prove me wrong, and then you post a picture proving me right hahahahahahahahaha.
Thanks for the mind-numbing waste of time. This degraded from me saying I was excited about the future of the Fury X into a discussion with... I don't know what to consider you; I would say you are a fanboy that doesn't realize he is a fanboy. I buy the card that offers me maximum performance. At the time of purchase the 980 Ti was it; in the future I may pick up a Fury X or Arctic Islands. But you straight-up argued with me over nothing, which makes me feel dumb for feeding you lolololol. Oh well, at least at the end I had a good laugh. Have a good day, and have fun with the Fury X. I may join this club at some point, depending on what the future holds. To everyone else, sorry this took up so much space in the thread; just read around it.



EDIT: Modified this and my previous posts to minimize distractions within the thread. Content unmodified.


----------



## Kana-Maru

I'm not sure why you think I'm angry. I guess you feel that you know everything. Also, no one is yelling, but the fact that you think I am means you are probably sensitive in life. I'm sorry I can't help you with that.
Quote:


> Originally Posted by *POOTYTANGASAUR*
> 
> "As I said the 980 Ti recently released and MSRP is $649.99- $1049.99" I loled.


OK, now I'm just going to say that you are trolling at this point. Everyone knows that 980 Tis get expensive. EVGA is literally charging based on ASIC quality now.

http://www.evga.com/articles/00944/EVGA-GeForce-GTX-980-Ti-KINGPIN/

GTX 980 Ti KingPin = *$849.99 - $1049.99*

Keep lol'ing. You are making yourself look and sound silly. I guess you really were out to troll since your first post.

Long story short, not everyone cares about overclocking the heck out of their cards [35%] for a 5%-10% performance increase.


----------



## Nunzi

Quote:


> Originally Posted by *Kana-Maru*
> 
> Long story short, not everyone cares about overclocking the heck out of their cards [35%] for a 5%-10% performance increase.


especially when you can't...........


----------



## Randomdude

I wouldn't bother with "lol'ing" people, honestly. It's too disrespectful to hold a discussion with. Starting a conversation with someone is like getting into another person's car.

@Nunzi Especially if you had read it: he said he did not choose the Ti over the Fury because HE did not deem overclocking important for HIS needs. Otherwise he'd have picked the Ti, as he said, which I assume you didn't read; but then you shouldn't butt in uninformed either way. Again: those are your needs, thus you picked a card that can overclock more easily. Good for you. Not the same needs as every other person on the planet, however. Don't project your ideals onto others.

If you had a specific different reason to post that comment please by all means. I hope it's not the usual "Oh but I would have loved AMD to succeed but they failed because the card doesn't fulfill my specific needs, hence I bought nVidia out of lack of options" bull****.


----------



## Kana-Maru

Quote:


> Originally Posted by *Nunzi*
> 
> Quote:
> 
> 
> 
> Long story short, not everyone cares about overclocking the heck out of their cards [35%] for a 5%-10% performance increase.
> 
> 
> 
> especially when you can't...........
Click to expand...

Ohhhhh I see what you did there. Well think of it like this:
Since gamers know they can't unlock the voltage, and gamers KNOW that the Fury X doesn't overclock that well, why are they still buying the Fury X? The card doesn't stay in stock very long. If you don't understand by now that gamers already know the Fury X voltage can't be unlocked, then I don't know what to tell you.


----------



## rdr09

Quote:


> Originally Posted by *Kaapstad*
> 
> It is very easy to cherry pick a few games to prove a point lol.
> 
> It would be very easy for me to bench a load of mainstream games and the summary would give it to the NVidia cards. All I would have to do is include The Witcher 3 and a few of the Total War games where 4 way support on AMD cards is practically non existent.


Told someone in the Titan X club to seek your help with 4-way SLI.


----------



## WheelZ0713

Just wanted to share my best benchmark so far.

I unlocked 4 CUs and installed the new beta drivers, then got this score at 1100 MHz core / 560 MHz HBM with the CPU at 4.5 GHz.

1100.560.PNG 243k .PNG file
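For anyone wondering what unlocking 4 CUs works out to in shader terms, here's a quick sketch. The per-CU figure comes from Fiji's public specs (64 stream processors per GCN compute unit); the function name is just for illustration:

```python
# Fiji GCN: each Compute Unit (CU) carries 64 stream processors ("shaders").
SHADERS_PER_CU = 64

def shader_count(active_cus: int) -> int:
    """Total stream processors for a Fiji part with the given number of active CUs."""
    return active_cus * SHADERS_PER_CU

print(shader_count(64))      # full Fiji / Fury X: 4096
print(shader_count(56))      # stock Fury: 3584
print(shader_count(56 + 4))  # Fury with 4 extra CUs unlocked: 3840
```

So a 4-CU unlock takes a stock Fury from 3584 to 3840 shaders, roughly a 7% bump in raw shader count.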


----------



## mRYellow

What's a good OC on the HBM?


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> What's a good OC on the HBM?


Well, 100 MHz on the HBM is a 20% overclock over the stock 500 MHz, so I'd rate anything from 50-100 MHz as decent.
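For context, stock Fiji HBM runs at 500 MHz on a 4096-bit bus, so a 100 MHz bump is a 20% memory overclock. A rough sketch of what that means for peak bandwidth (double data rate assumed, per HBM's design; the function is illustrative):

```python
BUS_WIDTH_BITS = 4096  # Fiji's HBM1 interface width
DDR_FACTOR = 2         # HBM transfers data on both clock edges

def hbm_bandwidth_gbs(clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a given HBM clock."""
    bytes_per_cycle = BUS_WIDTH_BITS / 8 * DDR_FACTOR
    return clock_mhz * 1e6 * bytes_per_cycle / 1e9

print(hbm_bandwidth_gbs(500))  # stock: 512.0 GB/s
print(hbm_bandwidth_gbs(600))  # +100 MHz: 614.4 GB/s, a 20% gain
```

Bandwidth scales linearly with clock, which is why even a modest-sounding HBM bump moves the needle.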


----------



## Kana-Maru

*Fury X Review Update:*

Since Batman: Arkham Knight was finally patched properly on the 3rd, I decided to benchmark the game again. The initial benchmarks weren't worth uploading, since the game had serious issues on a ton of GPUs. Thankfully the game is working much better for me now. After the patch I was seeing up to 130fps-140fps @ 1080p.

You can view my benchmark here:
http://www.overclock-and-game.com/hardware/computer-tech-reviews/40-amd-fury-x-review?showall=&start=6


----------



## xer0h0ur

I heard people complaining about flashing textures in CrossFire with the most recent game patch / AMD driver, though. With Arkham Knight, of course.


----------



## Otterfluff

Quote:


> Originally Posted by *Alastair*
> 
> No blocks for Strix sorry man. EK has no plans at this time to produce a block for the Fury Strix.


I've changed my plan: I canceled the EK Fury X blocks and am going to get an Asus Strix. I will test-fit my Heatkiller IV block from my CPU on the Strix, switching the CPU over to the Intel stock cooler. If it all fits well, I'll use a cheap quick-disconnect and start some testing with the custom BIOS.

The long-term plan would be to do the volt mods and cut the stock Strix cooler in half with a Dremel to cool the VRM actively. It should still fit one of the fans over it. I'm planning to have the pots wired into a breadboard with a DIP switch to select between the four voltage readings and an LCD voltmeter.

I think the HBM overclocking shows far more potential than the core overclocks, and the only way I am going to be able to tackle that is by being able to volt-mod the memory voltage.


----------



## xer0h0ur

Quote:


> Originally Posted by *Otterfluff*
> 
> Ive changed my plan, canceled the ek fury x blocks and going to get a Asus Strix. I will test my heatkiller IV block on the Strix for size from my cpu while switching to intel stock cooler on the cpu. If all fits well ill be using a cheap quick disconnect and start some testing with the custom bios.
> 
> Long term plan would to do the volt mods and cut the stock Strix cooler in half with a dremel to cool the vrm actively. It should still fit one of the fans over it. Planning to have the pots wired into a breadboard with a dip switch to select between the four voltage readings and a lcd volt meter.
> 
> I think the HBM overclocking shows far more potential than the core overclocks and the only way i am going to be able to tackle that is by having the ability to volt mod the memory voltage.


Do come back with pictures and your results. I have only seen one other person actually mod their Strix similarly.


----------



## Otterfluff

Quote:


> Originally Posted by *xer0h0ur*
> 
> Do come back with pictures and your results. I have only seen one other person actually mod their Strix similarly.


I will definitely post back with pictures, but it will likely be two weeks until I get the Strix in the mail and have free days off work.


----------



## battleaxe

Quote:


> Originally Posted by *Otterfluff*
> 
> I will definitely post back with pictures, but it will likely be two weeks until I get the Strix in the mail and have free days off work.


Are there any posts showing this process, from cutting the cooler apart to putting the card under extreme cooling, start to finish? I find this whole thing fascinating and I'd like to try my hand at it sometime once I learn more. It would be cool to see a write-up showing how someone got their results and set the card up, step by step. Seems so much like weird science to me...


----------



## aznguyen316

Hello, got a Sapphire R9 Tri-X Fury, unlocked all the shaders, and have been gaming fine with no issues the past day. Very happy. Detailed some stuff here:

http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool/500_50#post_24380910

Anyway, glad to be an owner.

I haven't bothered touching the memory yet for OC, any real gaming performance gains with it?


----------



## Otterfluff

Quote:


> Originally Posted by *battleaxe*
> 
> Are there any posts showing this process of cutting the board apart and putting them under ice from start to finish? I find this whole thing fascinating, I'd like to try my hand at it sometime once I learn more. Would be cool to see a write up and process for how someone got their results and created the card to run showing their steps. Seems so much like weird science to me...


The guy here, "Xtreme Addict", does it with LN2: http://forum.hwbot.org/showthread.php?t=142320

He also provided the modified BIOS for the Asus Strix.

I think the Asus Strix design did not change much from their other models (i.e. the 290X and GTX versions), so the volt-mod process was transferable.

LN2 does not really interest me; I am more interested in what a strong custom loop can achieve.

The Alphacool GPX Solo looks like a nice option for the Strix - cheap, with flexible mounting for a water block: http://modmymods.com/alphacool-nexxxos-gpx-solo-xbox-compatible-black.html

They might even come out with a plate for the Strix later?


----------



## Otterfluff

Quote:


> Originally Posted by *aznguyen316*
> 
> Hello, Got a Sapphire R9 Tri-X Fury and unlocked all the shaders and have been gaming fine, no issues the past day. Very happy. Detailed some stuff here:
> 
> http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool/500_50#post_24380910
> 
> Anyway, glad to be an owner.
> 
> I haven't bothered touching the memory yet for OC, any real gaming performance gains with it?


The HBM overclocks have shown a stronger relative performance gain than GPU core increases. A large OC on the HBM has not really been achieved outside of LN2, but then again, no one with the volt mods to raise the memory voltage has done any testing with the GPU under water yet.


----------



## battleaxe

Quote:


> Originally Posted by *Otterfluff*
> 
> The guy here "Xtreme Addict" does it with ln2. http://forum.hwbot.org/showthread.php?t=142320
> 
> He also provided the modified bios for the Asus strix.
> 
> I think that the asus strix design did not change much from their other models ie 290x and GTX versions so the volt mod process was transferable.
> 
> ln2 does not really interest me but I am more interested in what a strong custom loop can achieve.
> 
> Alphacool GPX solo looks like a nice option for the strix, cheap and flexible mounting for a water block: http://modmymods.com/alphacool-nexxxos-gpx-solo-xbox-compatible-black.html
> 
> They might even come out with a plate for the strix later?


Awesome! +1 sir


----------



## Jflisk

Quote:


> Originally Posted by *Otterfluff*
> 
> The guy here "Xtreme Addict" does it with ln2. http://forum.hwbot.org/showthread.php?t=142320
> 
> He also provided the modified bios for the Asus strix.
> 
> I think that the asus strix design did not change much from their other models ie 290x and GTX versions so the volt mod process was transferable.
> 
> ln2 does not really interest me but I am more interested in what a strong custom loop can achieve.
> 
> Alphacool GPX solo looks like a nice option for the strix, cheap and flexible mounting for a water block: http://modmymods.com/alphacool-nexxxos-gpx-solo-xbox-compatible-black.html
> 
> They might even come out with a plate for the strix later?


You might want to take a look at that block: it looks like it will cool a diamond-shaped GPU die, not a square one. It looks like all the cooling goes to the middle of the diamond shape.


----------



## Otterfluff

Quote:


> Originally Posted by *Jflisk*
> 
> You might want to take a look at that die. looks like it will cool a diamond shape GPU die not a square GPU die. Looks like all cooling goes to the middle of the diamond shape.


I agree, you're right. I will see if this Heatkiller IV fits first and go from there.


----------



## aznguyen316

Quote:


> Originally Posted by *Otterfluff*
> 
> The HBM overclocks have shown to have a stronger relative performance gain compared to gpu core increases. Getting a large OC on the HBM has not really been achieved outside of ln2, but then again no one else who has had the mods to increase the voltage for memory via a volt modification has done any testing with the gpu under water yet.


So I went ahead and OCed the memory; the effective clock was around 1135 MHz max, I think. I also OCed the core a tad more. 1095 MHz seems to crash, so I kept testing and 1093 MHz seems okay lol. So I think I got my highest achievable clocks and score for now, until we get voltage. I'm not bothering to game at these clocks, just using them for benchmark runs.

Sapphire Tri-X Fury with fully unlocked shaders, 3DMark at 1080p - graphics score 17,052. The goal was to break 17k on graphics, and I did it, yay.

1093 MHz core with 568 MHz memory clock
http://www.3dmark.com/fs/5923385

I may test with the Valley benchmark too for comparisons.


----------



## weinstein888

Any news on voltage unlocking the Fury X? I'm sure I would have heard about it had there been, but I'm just checking in...


----------



## ozyo

Quote:


> Originally Posted by *weinstein888*
> 
> Any news on voltage unlocking the Fury X? I'm sure I would have heard about it had there been, but I'm just checking in...


nothing yet


----------



## Scorpion49

Man, I can't wait for the Fury to come back from RMA. Trying to play DA:I on a 380 at 1440p, even with FreeSync I'm dropping out of the 40Hz lower end of the window during combat on medium settings. It still blows my mind sometimes how fast GPUs are these days; this 380 that is barely cutting it for me is about as fast as the top-end 580 SLI setup I was running not that long ago.


----------



## You Mirin

So is there just one revision of the Fury X? Mine is an updated one, and it sure does make a noise...........

Was also curious what it should sound like, but all the videos of the Fury X are just pump-whine ones lol.


----------



## drm8627

Hey guys, do you know if the Fury X/Fury/Nano will scale as well under DX12 as the 290X has with its new drivers, given their use of asynchronous compute? Has there been any word about it? Have they not gotten around to the new DX12 drivers for the Fury X/Fury/Nano yet?


----------



## kitg90

Has anyone removed the heatsink from the Tri-X Fury?

I'm having issues because the heatsink just won't come off.


----------



## Cool Mike

Anyone have any idea when the Fury X2 (dual GPU) will be released?

I have seen indications of October or November, but nothing official.


----------



## Alastair

Quote:


> Originally Posted by *kitg90*
> 
> Has anyone removed the heatsink from the tri x fury?
> 
> I'm having issues cause the heatsink just won't come off


Haven't tried yet. Waiting for my EK blocks first.


----------



## Ceadderman

Probability of errant screw?









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Cool Mike*
> 
> Anyone have any ideas on when the Fury X2 (Dual GPU) will be released?
> 
> Have seen indications of October or November but nothing offical.


The only official statement I saw named a season; they have never given a month for it.


----------



## Hemicrusher

New member.....Sapphire Fury Tri-X

Great card!


----------



## xer0h0ur

I'll just leave this R9 Nano porn here:













http://wccftech.com/custom-sff-radeon-r9-nano-rig-packs-10-tflops-performance-xeon-e52699-v3-18-core-processor/

Also something which shouldn't be much of a surprise but its good to know:

http://wccftech.com/amd-radeon-r9-nano-crossfired-radeon-r9-fury-fiji-powered-cards-air-liquid-cooling-combo/


----------



## ENTERPRISE

Quote:


> Originally Posted by *xer0h0ur*
> 
> I'll just leave this R9 Nano porn here:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://wccftech.com/custom-sff-radeon-r9-nano-rig-packs-10-tflops-performance-xeon-e52699-v3-18-core-processor/
> 
> Also something which shouldn't be much of a surprise but its good to know:
> 
> http://wccftech.com/amd-radeon-r9-nano-crossfired-radeon-r9-fury-fiji-powered-cards-air-liquid-cooling-combo/


Nice !

I really really want to get to October as I am hoping for news on the Fury X2....Time just goes too slow when you are waiting for more news lol.


----------



## GorillaSceptre

Where the heck is voltage control.. Slackers









With the massive cooling headroom and beefy power delivery, I don't see why it wouldn't clock well.

From what I've read the unlocked voltage had some kind of bug that caused throttling, so we still haven't seen what this thing can do.

"Overclockers' dream" might still be a reality.









But over 2 months is getting a bit ridiculous..


----------



## Orthello

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Where the heck is voltage control.. Slackers
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the massive cooling headroom and beefy power delivery, i don't see why it wouldn't clock well.
> 
> From what I've read the unlocked voltage had some kind of bug that caused throttling, so we still haven't seen what this thing can do.
> 
> "Overclockers dream" might still be a reality
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But over 2 months is getting a bit ridiculous..


Yeah, it seems really odd - unless it is just difficult to implement. Possibly a new BIOS with increased power limits would be required, etc. - that's just a guess. Still, it's disappointing that there is still no voltage control.


----------



## aznguyen316

Ended up modding a Tri-X OC Fury BIOS for all shaders and flashing it to my stock Fury, which allowed my load voltage to increase to the Fury X voltage of 1.21 V. I haven't tested whether it OCs further than what I had before. Still stable!


----------



## drm8627

has anyone found any information on the fury x asynchronous compute performance?


----------



## mRYellow

Quote:


> Originally Posted by *aznguyen316*
> 
> Ended up modding a Tri-X OC Fury BIOS for all shaders and flashing it to my stock Fury, which allowed my load voltage to increase to the Fury X voltage of 1.21 V. I haven't tested whether it OCs further than what I had before. Still stable!


Would you mind sharing how you modified the bios to increase the voltage?

Unwinder still doesn't have a Fury card. No card, no voltage control on the horizon.

Official word from Unwinder
Quote:


> Nope. I've already posted a few times that I cannot do anything without a card here. Other vendors still cannot provide working software with voltage control for Fiji even after a few months of work with card, it is pointless to expect me to magically get it working distantly without seeing hardware.
> Truth is sad: MSI don't see Fiji cards as good selling product at all. So maybe I'll get a card one day when Fiji sales will be important for them. Sample arrival date is unknown. And even when it arrive, it will take a while to get voltage control implemented.
> 
> Alexey Nicolaychuk aka Unwinder, RivaTuner creator


http://forums.guru3d.com/showpost.php?p=5156101&postcount=47

Anyone care to sponsor him a card?


----------



## You Mirin

Quote:


> Truth is sad: MSI don't see Fiji cards as good selling product at all.












I wonder why AMD doesn't just send him one already.......


----------



## Semel

Quote:


> Originally Posted by *GorillaSceptre*
> 
> With the massive cooling headroom and beefy power delivery, i don't see why it wouldn't clock well. ..


Really?
https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/

Sadly it's not the case.


----------



## SpeedyVT

Quote:


> Originally Posted by *Semel*
> 
> Really?
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/
> 
> Sadly it's not the case.


It's an unofficial overvoltage. Some people have even used the unofficial LN2 BIOS.

Speculation is that the previous and current revisions of the Fury have physical headroom issues; that speculation is driven by the Nano.

Power consumption also rises non-linearly, though. It's plausible that this is just how the power scaling works with this design. Using different types of VRMs can alter this too - that was also speculated.
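The non-linear power rise is what you'd expect from the usual first-order dynamic-power model for CMOS logic, P ≈ C·V²·f. A sketch with hypothetical stock values (1.2 V, 1050 MHz are placeholders for illustration, not measured Fury X figures):

```python
def relative_power(volts: float, clock_mhz: float,
                   stock_volts: float = 1.2, stock_mhz: float = 1050.0) -> float:
    """Dynamic power relative to stock, using P ~ V^2 * f (capacitance cancels out)."""
    return (volts / stock_volts) ** 2 * (clock_mhz / stock_mhz)

# +200 mV with a 10% core overclock costs roughly 50% more dynamic power:
print(round(relative_power(1.4, 1155.0), 2))  # ~1.5
```

That quadratic voltage term is why a modest clock gain can blow the power budget even when cooling headroom is plentiful.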


----------



## Alastair

Quote:


> Originally Posted by *Semel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GorillaSceptre*
> 
> With the massive cooling headroom and beefy power delivery, i don't see why it wouldn't clock well. ..
> 
> 
> 
> Really?
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/
> 
> Sadly it's not the case.
Click to expand...

It's a bogus post. Something was clearly wrong with the developer's software; properly designed software would not cause the severe negative scaling seen in that write-up. Anybody who didn't take that article with a bucketload of salt is clearly gullible.

Let's believe one man's write-up, from someone who clearly rushed to be the first with "overvoltage" support and only briefly tested the effects in one game. Sounds legit to me.


----------



## Semel

A bogus post from Wizard? LOL..

Well, whatever makes you happy


----------



## mRYellow

Quote:


> Originally Posted by *Semel*
> 
> A bogus post from Wizard? LOL..
> 
> Well, whatever makes you happy


Wizzard is very credible, but that post is old and nothing has filtered down into any software.
So I too have my doubts about why this hasn't been implemented yet.


----------



## SpeedyVT

Quote:


> Originally Posted by *mRYellow*
> 
> Wizzard is very credible but that post is old and nothing has filtered down into any software.
> So i too have my doubts why this has been implemented yet.


Quote:


> Originally Posted by *Semel*
> 
> A bogus post from Wizard? LOL..
> 
> Well, whatever makes you happy


Wizzard is pretty legit, but he himself said it was done with unofficial methods. He did a great job documenting his own personal exploration of the hardware, which is fantastic, but reverse-engineering the tech takes time, and we won't see something proper from Wizzard for a while. The key word is proper.


----------



## You Mirin

Wizzard mentioned somewhere that he submitted it to Sapphire a while ago.

AMDMatt said there was a small bug in TriXX that should be fixed and released soon™, about two weeks (or more?) ago.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> Really?
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/
> 
> Sadly it's not the case.


You do realize W1zzard is the founder of TechPowerUp, right? A site that now straight up publishes Nvidia propaganda along the lines of "It's Now Been Over 160 Days Since a Catalyst WHQL Release", or, you know, publishes the results of "testing" his unfinished, bugged overvoltage software as if they were definitive. I really trust those guys.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> You do realize W1zzard is the founder of techpowerup right? A site that now straight up publishes Nvidia propaganda along the lines of "It's Now Been Over 160 Days Since a Catalyst WHQL Release", or you know, publishes the results of "testing" his unfinished, bugged, overvoltage software as if definitive. I really trust those guys


They also just posted up a big rant about not getting a NANO to test out....

Hmmm, wonder why?


----------



## You Mirin

Yeah, the guy that does stuff for TriXX is obviously biased for Nvidia.

AMD being selective on review samples.......
AMD sending the Nano to RTP member(s?) with social media.......
AMD not putting any effort into releasing unlocked voltages for Fiji.......

Hmmm, wonder why?


----------



## aznguyen316

Quote:


> Originally Posted by *mRYellow*
> 
> Would you mind sharing how you modified the bios to increase the voltage?


Hey, sorry to mislead you - I did not modify it for voltage; it's just how the BIOS comes. The Tri-X Fury and the Tri-X Fury OC have slightly different load voltages in their BIOSes, which I learned after reading through AnandTech's review again after getting the card. Sadly the extra 24mV didn't help me get any higher clocks lol, but it did give me a stock 1040 MHz so I don't have to set anything.

They mention it in the review as well, noting that Sapphire sent them the stock BIOS regarding voltage.

http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17


----------



## Alastair

Quote:


> Originally Posted by *You Mirin*
> 
> Yeah guy that does stuff for trixx is obvious biased for nvidia.
> 
> AMD being selective on review samples.......
> AMD sending RTP member(s?) with social media the nano......
> AMD not putting any effort to release unlock voltages for fiji.......
> 
> Hmmm, wonder why?


Are you gonna be productive in the thread, or are you just here to troll?

Do you even own a Fury? There's the door. Bye


----------



## xer0h0ur

Quote:


> Originally Posted by *You Mirin*
> 
> Yeah guy that does stuff for trixx is obvious biased for nvidia.
> 
> AMD being selective on review samples.......
> AMD sending RTP member(s?) with social media the nano......
> AMD not putting any effort to release unlock voltages for fiji.......
> 
> Hmmm, wonder why?


AMD is well within their rights to refuse samples to whomever they want. Particularly to Nvidia shill sites. They also baited people they sent Fury X samples to by using different batches of radiator fans on their review samples, so that they knew who was leaking pictures and information. You're likely to not get any more samples if you leaked info or pictures of the Fury X before release. Or if you, say, dropped and damaged your review sample before even reviewing it.

As far as I can tell, neither AMD nor Nvidia actively put any effort into voltage modification software so I have no idea what you're going on about there.


----------



## You Mirin

Quote:


> Originally Posted by *Alastair*
> 
> Are you gonna be productive to the thread. Or are you just here to troll?
> 
> Do you even own a Fury? There is the door. Bye


Yeah I'm just trolling, because I "reused" another comment against AMD.









And no, I do not even own a Fury... but I do own a Fury X that seems to have the pump noise though (EEEEE, sup Kaapstad). Yeah, it was a recent purchase and it is the updated revision.

Now, do you even own a Fury? There is the door. Bye








Quote:


> Originally Posted by *xer0h0ur*
> 
> AMD is well within their right to refuse samples to whomever they want. Particularly to Nvidia shill sites. They also baited people they sent Fury X samples to by using different batches of radiator fans on their review samples so that they knew who was leaking pictures and information. You're likely to not get any more samples if you leaked info or pictures of Fury X before release. Or if you say, dropped and damaged your review sample before even reviewing it.
> 
> As far as I can tell, neither AMD nor Nvidia actively put any effort into voltage modification software so I have no idea what you're going on about there.


I don't see TPU as biased at all, but maybe I'm just blind to it?









Why doesn't AMD just send one to Unwinder already, so that the e3 quote can finally be put to rest. :shrug:


----------



## xer0h0ur

I am with you on that one. I don't know if there is a liability issue that stops AMD, or Nvidia for that matter, from providing these guys with cards instead of having an AIB provide them.


----------



## Ceadderman

It's not liability, it's cost effectiveness that keeps nVidia and AMD out of the final manufacturing and mass production of cards.

They send the reference design to the AIBs and the AIBs mass produce the cards for the end user. If either company went around this process it would impact their bottom line, at least in the short term. Yes, their bottom line would eventually stabilize, but it would likely cost too much to implement mass production on a global scale. Not to mention it would impact the shelf price enough to limit purchases by end users, unless they were willing to take a loss.

Things are fine the way they are, other than limited availability for Furies.









~Ceadder


----------



## xer0h0ur

Cost effectiveness is the reason why AMD and Nvidia don't send one video card each to the two people that make the popular voltage modification software out there? You say a lot of nonsensical things, but this one may take the cake.


----------



## Huntcraft

Any word on fury nano waterblock?


----------






## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Cost effectiveness is the reason why AMD and Nvidia don't send one video card to two people that make the popular voltage modification softwares out there? You say a lot of nonsensical things but this one may take the cake.


First of all, this was not in reference to how many cards are being sent to reviewers. That's a whole other matter in and of itself.

Second, what I was saying was not "nonsensical". It would cost some big bucks to build a factory large enough to take over what the AIBs do, considering how many factories just one of the AIB partners has. Say it's just one factory for each AIB. That's a large-scale project in its own right. How in the heck is this a nonsensical point of view?









What spurred this view was someone suggesting that AMD take over the manufacturing process from the AIB partners.









~Ceadder


----------



## xer0h0ur

Then why even respond to what I said which has nothing to do with what you're talking about?


----------



## xer0h0ur

Quote:


> Originally Posted by *Huntcraft*
> 
> Any word on fury nano waterblock?


There will inevitably be a Nano waterblock. Who will have one first is the only question other than when will it be available to purchase.


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Then why even respond to what I said which has nothing to do with what you're talking about?


I wasn't referring to your post per se. Maybe the AIB reference, but it was a ways back and the post was so long that I didn't quote it since I was on my phone. Sorry if you got caught up in it, but that was not the intent. I try not to pick bones with people. That's not me.









~Ceadder


----------



## hyp36rmax

Anyone picking up a Nano? I'm pretty impressed, as it will make a great addition to an mITX build. One article hit it on the dot, pointing out the performance per watt compared to the FURY X: "What happened to that extra 100 watts?"


----------



## Agent Smith1984

Quote:


> Originally Posted by *hyp36rmax*
> 
> Anyone picking up a Nano? I'm pretty impressed as it will make a great addition to an MITX build. One article hit it on the dot compared to the FURY X pointing out the performance per watt. "What happened to that extra 100 Watts" referring to the FURY X.


Sadly, the Nano may be proof that Fiji is at the end of its rope on clock speed, even with voltage......


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sadly, nano may be proof that fiji is at the end of its rope on clock speed, even with voltage......


I agree with this, especially with the power limitation.


----------



## rt123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sadly, nano may be proof that fiji is at the end of its rope on clock speed, even with voltage......


http://forum.hwbot.org/showthread.php?t=142320


----------



## rdr09

Quote:


> Originally Posted by *rt123*
> 
> http://forum.hwbot.org/showthread.php?t=142320


45% faster than my 290 clocked at 1330 with a G score of 6300.


----------



## rt123

Quote:


> Originally Posted by *rdr09*
> 
> 45% faster than my 290 clocked at 1330 with a G score of 6300.


And not maxed out yet.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rt123*
> 
> http://forum.hwbot.org/showthread.php?t=142320


I understand that hard mods can open it up some, but I'm talking standard voltage offsets that we are likely to see if voltage control becomes a reality....

Btw...

Really, ASUS? Using the same damn direct-contact cooler as the 390 on the Fury Strix? Guess that's why they built a big board.

Note to ASUS: heat spreaders work better!

Regardless, I like the Fury a lot, I really do! But the Nano says to me "I can do 1000MHz on very little voltage, but need tons of voltage to break 1150MHz...."

Just my opinion though...


----------



## rt123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I understand that hard mods can open it up some, but I'm talking standard voltage offsets that we are likely to see if voltage control becomes a reality....


Well you just said voltage at first, that's why I posted it.


----------



## rdr09

Quote:


> Originally Posted by *rt123*
> 
> And not maxed out yet.


It actually matched my 290s in crossfire - around an 11.5K G score.


----------



## 00riddler

Quote:


> Originally Posted by *Huntcraft*
> 
> Any word on fury nano waterblock?


Yes.


----------



## By-Tor

Nano's on Newegg..

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709 600566293


----------



## mRYellow

Quote:


> Originally Posted by *00riddler*
> 
> Yes.


That's beautiful


----------



## mRYellow

Unwinder just posted that MSI has shipped the Nano for testing.

http://forums.guru3d.com/showpost.php?p=5157193&postcount=49

Code:


Some progress. MSI just sent me Fiji based R9 Nano, so I hope that it will be enough to get Fury support as well.

Alexey Nicolaychuk aka Unwinder, RivaTuner creator


----------



## ozyo

Quote:


> Originally Posted by *00riddler*
> 
> Yes.


ho boy


----------



## drm8627

Quote:


> Originally Posted by *ozyo*
> 
> ho boy


wow that was quick!!


----------



## GorillaSceptre

Quote:


> Originally Posted by *mRYellow*
> 
> Unwinder just posted that MSI has shipped the Nano for testing.
> 
> http://forums.guru3d.com/showpost.php?p=5157193&postcount=49
> 
> Code:
> 
> Some progress. MSI just sent me Fiji based R9 Nano, so I hope that it will be enough to get Fury support as well.
> 
> Alexey Nicolaychuk aka Unwinder, RivaTuner creator


So nobody sends him a Fury/X, but they send him the card that will probably have the most limited supply?

Makes sense.


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Unwinder just posted that MSI has shipped the Nano for testing.
> 
> http://forums.guru3d.com/showpost.php?p=5157193&postcount=49
> 
> Code:
> 
> Some progress. MSI just sent me Fiji based R9 Nano, so I hope that it will be enough to get Fury support as well.
> 
> Alexey Nicolaychuk aka Unwinder, RivaTuner creator


Praise JEZUZ! Something is finally happening. I find this whole lack of Afterburner support a poor show from MSI and all those involved in Afterburner's development.


----------



## the9quad

Quote:


> Originally Posted by *Alastair*
> 
> Praise JEZUZ! Somethign is finally happening, I find this whole lack of Afterburner support a poor show from MSI and all those involved in Afterburners development.


Dude was on vacation all of last month, give him a break. it's just one dude.


----------



## Otterfluff

Buildzoid has some new info up on volt modding the reference PCB Furys.

http://cxzoid.blogspot.com.au/search?updated-min=2015-01-01T00:00:00-08:00&updated-max=2016-01-01T00:00:00-08:00&max-results=50

This is good news, as I would rather be modding the Fury X I already have, plus they support a full GPU waterblock plate.

I measured my Heatkiller IV block and it can probably be hacked onto my Fury X; all the holes line up, but only if you get an M3 precise mount kit from EK, as the M4 mounting kit from Watercool is just too big and the mounting holes won't line up.

I was about to buy the Strix Fury but I'll hold off while I look into volt modding my Fury X; having a full waterblock would be worth it.


----------



## Alastair

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Praise JEZUZ! Somethign is finally happening, I find this whole lack of Afterburner support a poor show from MSI and all those involved in Afterburners development.
> 
> 
> 
> Dude was on vacation all of last month, give him a break. it's just one dude.
Click to expand...

If MSI wants their name all over Afterburner then maybe they should give him a couple more hands to work with, so that things like this don't get delayed.


----------



## Otterfluff

I find it amusing that Afterburner got a nano when several review sites did not.









I am happy MSI have their priorities right this time around.

Better late than never!


----------



## You Mirin

Quote:


> Originally Posted by *Alastair*
> 
> If MSI wants their name all over afterburner then maybe they should give him a couple more hands to work with. So that things like this don't get delayed.


Yeah, it's obviously MSI's fault.








Quote:


> Originally Posted by *Otterfluff*
> 
> I find it amusing that Afterburner got a nano when several review sites did not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am happy MSI have their priorities right this time around.
> 
> Better late than never!


I wonder if they want to see what he's able to do with it before they decide on sending a Fury.


----------



## Ceadderman

Quote:


> Originally Posted by *Otterfluff*
> 
> I find it amusing that Afterburner got a nano when several review sites did not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am happy MSI have their priorities right this time around.
> 
> Better late than never!












Of course they did. MSi builds them. You think they're gonna simply share their hardware with reviewers before building the software to support it?









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *You Mirin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> If MSI wants their name all over afterburner then maybe they should give him a couple more hands to work with. So that things like this don't get delayed.
> 
> 
> 
> Yeah its obviously MSI's fault.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Otterfluff*
> 
> I find it amusing that Afterburner got a nano when several review sites did not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am happy MSI have their priorities right this time around.
> 
> Better late than never!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I wonder if they want to see what he's able to do with it before they decide on producing the fury.
Click to expand...

Yes it is MSI's fault. If they want their name on a software product, they should provide the developers of said software with the tools, equipment, and manpower necessary to keep it up to date. Anything less is shoddy support. The fact that they have not provided Unwinder a Fury for this long is just poor. MSI has their name on the product, so they should supply the hardware that needs updated support, even if only on a short-term loan basis.


----------



## xer0h0ur

Quote:


> Originally Posted by *GorillaSceptre*
> 
> So nobody sends him a Fury/X, but they send him the card that will probably have the most limited supply?
> 
> Makes sense.


Someone give this man a cookie. Exactly what I thought when I read that.
Quote:


> Originally Posted by *Alastair*
> 
> Yes it is Msi's fault. If they want their name on a a software product. They should then provide the developers of said software with the tools and equipment and the manpower necessary to keep it up to date. Anything less is shoddy support. The fact that they have not provided unwinder a Fury to update MSI for this long is just poor. MSI has the name on the products. So they should supply the hardware that needs updated support even if it on a short term loan basis.


^ A thousand times this


----------



## Scorpion49

So I picked up a second R9 380 for crossfire while I wait for my Fury to get back, and it's interesting how closely they perform. Since Fiji took a lot from Tonga, two 380s have the same number of shaders, and they seem to perform nearly identically to the Fury in DA:I.
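For what it's worth, the shader arithmetic behind that comparison checks out, assuming the commonly cited counts (1792 stream processors per R9 380, 3584 on the Fury; these numbers are assumptions, not stated in this thread):

```python
# Commonly cited stream-processor counts (assumed, not from this thread)
R9_380_SHADERS = 1792   # Tonga Pro
FURY_SHADERS = 3584     # Fiji Pro

# Two 380s in crossfire expose the same total shader count as one Fury
total = 2 * R9_380_SHADERS
print(total, total == FURY_SHADERS)  # 3584 True
```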


----------



## Skinnered

Quote:


> Originally Posted by *xer0h0ur*
> 
> Please do tell if this ends up working for you. I had thought of this before when people were having trouble overclocking in crossfire but I never spoke up and forgot the suggestion altogether till now. Would be useful info if it works.


Maybe a bit late, but I now can overclock the second GPU in CCC after disabling ULPS. I also have disabled CF to try overclocking the GPUs individually via MSI AB, but the OC settings don't stick.
Last but not least, I can't overclock the memory of the second card in CCC as I don't have a visible slider.

Is there anybody with Fury CF who has successfully overclocked core and mem for both their cards?


----------



## PePoX

I want to build a PC around October/December and I want a couple of Fury Xs, but HBM2 is (I think) almost around the corner and I don't want to spend money and then have to upgrade mid-year. So, should I buy those, or should I stick with GDDR5 until HBM2 arrives?


----------



## fjordiales

Quote:


> Originally Posted by *Skinnered*
> 
> Maye a bit late but, I now can overclock the second GPU in CCC after disabling UPS. I also have disabled CF to try overclocking the GPU's individually via msi ab, but the oc-settings don't stick.
> Last but not least, I cann't overclock the memory of the second in CCC as I don't have a visiable slider.
> 
> Is there anybody with Fury CF who have succesfully overclock core and mem for both their cards?


I have a Strix Fury and OC all three, BUT it was acting up before. I will post or PM my Afterburner settings when I get home from the gym. Also, I have all three with 4 more CUs unlocked.


----------



## Skinnered

Quote:


> Originally Posted by *fjordiales*
> 
> I have Strix fury and OC all 3 BUT it was acting up before. I will post or PM my afterburner settings when I get home from gym. Also, I have all 3 with 4 more CU unlocked.


Thanx, will look at it when you post them


----------



## RaduZ

Quote:


> Originally Posted by *PePoX*
> 
> I want to build a pc around october / december and i want a couple fury x but hbm2 it's almost (i think) around the corner and i dont want to spend money and have too upgrade mis year so! Should i buy those or should i stick around with gddr5 until hbm2 arrives


It depends, something new is always just around the corner. Can you last 6-7 months without any upgrade? I don't think the new cards will come out sooner than that.


----------



## fjordiales

Quote:


> Originally Posted by *Skinnered*
> 
> Thanx, will look at it when you post them


Sent a PM.


----------



## xer0h0ur

Quote:


> Originally Posted by *PePoX*
> 
> I want to build a pc around october / december and i want a couple fury x but hbm2 it's almost (i think) around the corner and i dont want to spend money and have too upgrade mis year so! Should i buy those or should i stick around with gddr5 until hbm2 arrives


From what I can remember HBM2 won't be ready for mass production for quite some time. Availability is currently at late Q2 if not beginning of Q3 2016. There will not be anything using HBM2 until AMD's Arctic Islands and Nvidia's Pascal.


----------



## Lu(ky

Hey guys, just picked up an Asus R9 Fury X card and I'm wondering if anyone here is using an EK or XSPC full block on their card? I will be doing a build log soon and I cannot make up my mind on the waterblocks. I already own both CPU blocks, the EK and the XSPC Raystorm, but I just can't make up my mind. I think the XSPC looks better if you like the lighting with it. So which one?


----------



## Gumbi

How are you finding the cooling on the Strix? Cores and VRM after benching? From what I've read it's a very capable cooler.


----------



## MalsBrownCoat

Well, after quite the fiascos with 290X's, followed by 390X's, I've now moved on to a pair of Fury X's.

(feel free to peruse my build log/updates in sig below)







Looking forward to seeing what these things can do.


----------



## MrKoala

Why do people want 3rd party waterblocks for the Fury X instead of disconnecting the stock rad and putting the tubes into the loop?


----------



## rv8000

Quote:


> Originally Posted by *MrKoala*
> 
> Why do people want 3rd party water blocks for the Fury X instead of disconnecting the stock rad and put the tubes into the loop?


Superior cooling, better design, less restriction than the components in the stock Fury X loop; lots of reasons really. Just because it would be possible to mod it into an existing loop doesn't mean it's the best or smartest option.


----------



## Ceadderman

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrKoala*
> 
> Why do people want 3rd party water blocks for the Fury X instead of disconnecting the stock rad and put the tubes into the loop?
> 
> 
> 
> Superior cooling, better design, less restriction of components existing in the fury x loop, lots of reasons really. Just because it would be possible to mod it into an existing loop doesnt mean it is the best or smartest option.
Click to expand...

This!

If you already have an existing loop, why not?

~Ceadder


----------



## Scorpion49

Quote:


> Originally Posted by *MrKoala*
> 
> Why do people want 3rd party water blocks for the Fury X instead of disconnecting the stock rad and put the tubes into the loop?


Because comparing a weak AIO pump with tiny, restrictive tubing and an extremely mediocre VRM cooling path to a true dedicated block isn't favorable to the stock cooling solution.


----------



## PePoX

Quote:


> Originally Posted by *RaduZ*
> 
> It depends, something new is always just around the corner. Can you last 6-7 months without any upgrade? I don't think the new cards will come out sooner than that.


I was thinking of getting a placeholder good enough for 1080p, like a cheap 290, and yes, I could wait that long.


----------



## Otterfluff

XFX will honor your warranty if you install a waterblock, as long as you don't cut up your AIO. So if you still have your AIO intact you can still claim the warranty. Not all brands will do this, but I do not think they would be the exception.

So using a custom waterblock is a strong choice.

Watercool replied to an email I sent exactly 14 days ago:
Quote:


> Hello,
> 
> we plan to release the R9 Fury X waterblock(s) in three weeks. The nickel plated versions will be probably available one week later.


These are all images of their new Titan IV blocks, and the new Fury X blocks should look very similar.






Link to the new Titan IV blocks they do sell:
http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Categories/Wasserkühler/GPU_Kuehler/"Geforce%20GTX%20TITAN%20X"

They are definitely worth considering, as they incorporate the new IV waterblock structure from their CPU block into the design for the IV GPU range. I am really looking forward to their Fury blocks; it should only be a week until release, according to their email.


----------



## Ceadderman

Afaik all major card manufacturers will honor the warranty so long as you affix the stock cooler back on before shipping it back to them. I know XFX and Sapphire will, and from my experience RMAing with ASUS, they do too.









I don't know about MSi or PowerColor however.









~Ceadder


----------



## Shatun-Bear

Quote:


> Originally Posted by *RaduZ*
> 
> It depends, something new is always just around the corner. Can you last *6-7 months* without any upgrade? I don't think the new cards will come out sooner than that.


Doubt anything is coming as soon as March or April 2016. Looking at HBM2 production timescale, I reckon July at the very earliest for Arctic Islands and Pascal.


----------



## royfrosty

Yay! Just completed assembling the waterblocks for my twin fury X.

Removed the Stock Fury X PCIE bracket.



Attached the link bridges.



And finally added into my GeneXis 2.0


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *royfrosty*
> 
> Yay! Just completed assembling the waterblocks for my twin fury X.
> 
> Removed the Stock Fury X PCIE bracket.
> 
> 
> 
> Attached the link bridges.
> 
> 
> 
> And finally added into my GeneXis 2.0






That looks awesome!!


----------



## jackalopeater

I've been playing with this little puppy all weekend... temps/noise are so much better in a case, lol. There is a touch of coil whine, but only in a few games' menus (mostly in Hitman Absolution), and it's really not noticeable inside a case with the side panel on... I've also been yelled at for saying this, but I'm just glad mine doesn't scream like most of the reviewers' cards do!


----------



## Gumbi

Coil whine in high-end cards is unavoidable in high-FPS menus. The Crysis 1 menu ran at 3000 FPS or so for me and I had whine in the menu, but absolutely none while gaming; same with the Heaven exit screen.


----------



## SpeedyVT

Quote:


> Originally Posted by *Gumbi*
> 
> Coil whine in high end cards ia unavoidable in high FPS menus. Crysis 1 menu rand at 3000fps or so for me I had whine in the menu, absolutely none while gaming though, same with the heaven exit screen.


Enable AMD's frame rate control and menus will be locked at the specified max framerate, regardless of vsync. Because the excess frames are dropped, it improves the latency to render a frame, also giving a smoother experience. Still, it's just a sugar pill for some of the issues people experience.


----------



## fewness

Are we there yet ......about voltage unlock ?


----------



## Kana-Maru

Google it. The same question is getting annoying.


----------



## en9dmp

To be fair, this thread is more in the know than Google... and this is the only question left to answer that anyone really cares about.


----------



## xer0h0ur

Unwinder has an R9 Nano now so he is working on it for Afterburner. I have no idea if W1zzard has finally submitted a fixed version of Trixx since the last one was not approved.


----------



## rv8000

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unwinder has an R9 Nano now so he is working on it for Afterburner. I have no idea if W1zzard has finally submitted a fixed version of Trixx since the last one was not approved.


W1zzard normally replies to my PMs within 1-3 days; I haven't heard anything back from him in almost 2 weeks now, so who knows.


----------



## TK421

So on the reference Fury X PCB the voltage control is unlocked, but what about the TDP control that would allow higher power consumption compared to the stock BIOS?


----------



## Thoth420

Yep, Absolution has an insanely high frame cap in menus; it might be 3000 as well with vsync off.


----------



## xer0h0ur

Quote:


> Originally Posted by *SpeedyVT*
> 
> Enable AMD's frame rate control and menus regardless of vsync or not will be locked in at a max of the specified frames. Because a series of frames are dropped it improves the latency to render the frame giving also a smoother experience. Still just a sugar pill to some of issues people experience.


For what it's worth, I tried using CCC's framerate control and all that did was introduce input lag as my frametimes went up. This was observed in CS:GO, so that would imply the exact opposite of what you're saying.

I was getting pissed that it felt like people were getting the drop on me before I had a chance to even shoot. I checked my frametimes and sure as **** they were hovering steady at 16.9ms. I disabled crossfire, left vsync off and used nothing to control framerate. Frametimes dropped to between 2 and 4ms.
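For context, frametime and framerate are reciprocals, so those numbers line up: a steady 16.9 ms frametime is roughly a 59 FPS cap, while 2-4 ms corresponds to 250-500 FPS. A quick sketch of the conversion (hypothetical helper, just for illustration):

```python
def frametime_to_fps(frametime_ms: float) -> float:
    """Convert a frametime in milliseconds to frames per second."""
    return 1000.0 / frametime_ms

print(round(frametime_to_fps(16.9)))  # 59 - effectively a ~60 FPS cap
print(round(frametime_to_fps(4.0)))   # 250
print(round(frametime_to_fps(2.0)))   # 500
```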


----------



## TK421

Quote:


> Originally Posted by *TK421*
> 
> So on the reference Fury X pcb the voltage control is unlocked, but how about the tdp control which allows higher power consumption compared to stock bios?


anyone?


----------



## Ceadderman

It may or may not be coming. Only AMD knows for sure at this point.









Since the Fury X2 has yet to launch, I've a mind that they may be waiting until that launches sometime this fall.









~Ceadder


----------



## TK421

Quote:


> Originally Posted by *Ceadderman*
> 
> It may or may not be coming. Only AMD knows for sure at this point.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since Fury x2 has yet to launch, so I've a mind that they may be waiting until that launches sometime this fall.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


No user bios mods?


----------



## SpeedyVT

Quote:


> Originally Posted by *xer0h0ur*
> 
> For what its worth I tried using the CCC's framerate control and all that did was introduce input lag as my frametimes went up. This was observed in CS:GO. So that would imply the exact opposite of what you're saying.
> 
> I was getting pissed that it felt like people were getting the drop on me before I had a chance to even shoot. I checked my frametimes and sure as **** it was hovering steady at 16.9ms. I disabled crossfire, left vsync off and used nothing to control framerate. Frametimes dropped between 2 and 4ms.


Depends on the game. Guess CS:GO doesn't work well with it.


----------



## flopper

Quote:


> Originally Posted by *SpeedyVT*
> 
> Depends on the game. Guess CS:GO doesn't work well with it.


I wouldn't use crossfire in CS:GO, as you would want latency as low as possible.


----------



## Vlada011

Weird: on the net and YouTube it's not possible to find a single video clip of a waterblock installation on a Fury X.
At the same time there are many of them for the GTX 980 Ti, Titan X, GTX 980, GTX 780 Ti, and GTX 780 from different brands.
People completely forget how nice a build could be on a small mATX motherboard: Fury X crossfire with EKWB Hydro Copper blocks.
They are almost the size of Phoebus sound cards, just a little bigger. They don't even need a bridge.
Especially because with a waterblock people get rid of the only problematic part of the Fury X: the noisy pump.


----------



## Thoth420

Quote:


> Originally Posted by *Vlada011*
> 
> Weird on net, you tube are not possible to find single video clip with installation waterblock on Fury X.
> In same time there are many of them for GTX980Ti, TITAN X, GTX980, GTX780Ti, GTX780 different brands.
> People completely forgot how nice could be build on small mATX motherboards Fury X crossfire with EKWB Hydro Copper block.
> They are almost as Phoebus Sound Cards, little bigger. Don't need even bridge.
> Special because on with waterblock people get rid of only problematic part on Fury X, noisy pump.


Contact EK, I am sure they will be happy to assist. I am very new to custom cooling and they spent hours explaining all types of things to me.
I also plan on blocking my Fury(s) (still undecided how many I want to use; I've never run multi-GPU before) to avoid pump sound issues and to have a legit pump, since the one they shipped with the card is laughable compared to a DDC. Oh, and of course to have a legendary-sized res, because they just look so Awesome and Cooooooool!


----------



## Vlada011

I haven't tried multi-GPU and I will stay with a single graphics card in the future.
That's best for gaming.


----------



## Thoth420

Quote:


> Originally Posted by *Vlada011*
> 
> I'm didn't tried multi GPU and I will stay with single graphic card in future.
> That's best for gaming.


I found the same to be true so far. I tend to stick with the strongest single GPU out that plays well with my vast library of games (not just the top 10 most popular that month... *cough Nvidia cough*).


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> I found the same to be true so far. I tend to stick with the strongest single GPU out that plays with my vast library of games(not just the top 10 most popular that month...*cough Nvidia cough*)


You mean you like to keep your GPU for longer than 6 months and actually enjoy it when your card gets stronger instead of weaker after driver optimizations? Yeah, I guess that is kinda cool.









Personally, I like the 3.5GB + 0.5GB of RAM I got on my 970... boy, was that ever a bargain. Shoulda just got another 290X... what a waste.


----------



## Vlada011

Me too, only cards with full potential.
Maybe I will go back to AMD; I'm not sure at this moment, but maybe.


----------



## battleaxe

Quote:


> Originally Posted by *Vlada011*
> 
> Me too, only cards with full potential.
> Maybe I will even go back to AMD; I'm not sure at this moment, but maybe.


I still like to have a catalog of GPUs from both camps. Truthfully, both make great GPUs; right now Nvidia has a slight, tiny, microscopic advantage. AMD always comes from behind with driver improvements, while Nvidia abandons each generation and moves on to the next. Also, as a past miner, I have to say AMD really pulled its weight where Nvidia didn't help much. AMD makes a solid product. I plan to get a Fury X when the price comes down and when we have HBM2. Looking forward to that day.


----------



## fjordiales

Anyone seen this yet? It has Nano CrossFire results with Fury X comparisons too.

http://www.techpowerup.com/reviews/AMD/R9_Nano_CrossFire/


----------



## Medusa666

Does anyone here own an Asus Fury Strix who would like to give some comments about the performance, and especially the noise levels during idle and load, and coil whine?

It would be greatly appreciated.

: )

Thanks.


----------



## Scorpion49

Well Athlon Micro eventually came through and sent me a brand new Fury Tri-X. This one has a little bit of coil whine, but nowhere near as bad as the old one. I can barely hear it with the case closed and not at all with headphones on while the old card sounded like a cicada sitting on your shoulder.


----------



## p4inkill3r

Quote:


> Originally Posted by *fjordiales*
> 
> Anyone seen this yet? It has Nano CrossFire results with Fury X comparisons too.
> 
> http://www.techpowerup.com/reviews/AMD/R9_Nano_CrossFire/


The dual-Nano look is pretty sweet.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Well Athlon Micro eventually came through and sent me a brand new Fury Tri-X. This one has a little bit of coil whine, but nowhere near as bad as the old one. I can barely hear it with the case closed and not at all with headphones on while the old card sounded like a cicada sitting on your shoulder.


I knew they'd come through.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Does anyone here own an Asus Fury Strix who would like to give some comments about the performance, and especially the noise levels during idle and load, and coil whine?
> 
> It would be greatly appreciated.
> 
> : )
> 
> Thanks.


AFAIK, none of my three have coil whine. I will try to record trifire 3DMark, especially with the custom fan profile I have.

But short answer: the fans need 40-50% to keep temps at 65-75°C. Trifire temps are 70/80/65°C in Witcher 3. Above 60% the fans are whiny. On the default fan profile, the fans start at about 65°C.


----------



## Vlada011

Quote:


> Originally Posted by *Medusa666*
> 
> Does anyone here own an Asus Fury Strix who would like to give some comments about the performance, and especially the noise levels during idle and load, and coil whine?
> 
> It would be greatly appreciated.
> 
> : )
> 
> Thanks.


If I decide to go with a Fury X, I will buy the ASUS.
But the pump goes straight into the garbage. AMD should look at the Corsair tubing on the MSI GTX 980 Ti Hawk as an example and remember how such a premium product should look. Especially because the Fury X doesn't have many custom models; the Fury X has to represent AMD no matter who makes it, and they shouldn't allow such cheap, messy products.
The tubing and sleeving on the Fury X are rock bottom. It can't be worse. But with the EKWB Predator the situation is far better. Especially on an mATX motherboard.
Because my opinion is that Mini-ITX is nice but can't hold a hardcore gaming rig with all functions.
I mean, if a customer only wants a graphics card, a CPU, SATA III and onboard sound, OK.
But if they want to use 2 cards, a dedicated sound card, M.2, or a graphics card plus M.2 plus a PCIe SSD and a good sound card... the platform can't be smaller than mATX.
ASUS really disappointed me when they built the X99-M WS with transfer up to 10Gbps... From the moment someone confirmed that, I didn't open the topic any more.


----------



## Scorpion49

Well, as soon as I put the new Fury in my games started hard-locking again. I'm getting about ready to just go to Nvidia if I can't have a decently working AMD solution.


----------



## nickcnse

Well guys, I have the R9 Fury X with 3x Asus VG236H monitors. Now I know we've visited this issue a bit, but would the general consensus be that I should sell these 3 monitors for either a 1440p or 4K single-monitor solution instead of trying to get three separate active converters working? With three of the converters I'm looking at almost $240 from what I've seen, and I've heard they don't necessarily work that well. If I do upgrade monitors, what do you guys think would be the best option? Thank you everyone.


----------



## battleaxe

Quote:


> Originally Posted by *nickcnse*
> 
> Well guys, I have the R9 Fury X with 3x Asus VG236H monitors. Now I know we've visited this issue a bit, but would the general consensus be that I should sell these 3 monitors for either a 1440p or 4K single-monitor solution instead of trying to get three separate active converters working? With three of the converters I'm looking at almost $240 from what I've seen, and I've heard they don't necessarily work that well. If I do upgrade monitors, what do you guys think would be the best option? Thank you everyone.


4K all the way. If you look at it for a couple of days you will never, ever want to go back. The 1080p displays my wife and kids use look like they have peanut butter smeared all over them in comparison. It's that fuzzy. Seriously huge upgrade.


----------



## nickcnse

Did you by chance happen to come from a 120Hz monitor beforehand? I play some FPS games but mostly Dota 2, so I probably don't really need the top refresh rate, but I did get used to it.


----------



## battleaxe

Quote:


> Originally Posted by *nickcnse*
> 
> Did you by chance happen to come from a 120Hz monitor beforehand? I play some FPS games but mostly Dota 2, so I probably don't really need the top refresh rate, but I did get used to it.


I was referring to the pixel density.


----------



## nickcnse

I understand the pixel density; I was just wondering if you were experienced with 120Hz monitors as well. Thank you for the advice = )


----------



## TK421

Any user vBIOS mods yet that allow higher power consumption than stock?


----------



## Scorpion49

Quote:


> Originally Posted by *nickcnse*
> 
> Well guys, I have the R9 Fury X with 3x Asus VG236H monitors. Now I know we've visited this issue a bit, but would the general consensus be that I should sell these 3 monitors for either a 1440p or 4K single-monitor solution instead of trying to get three separate active converters working? With three of the converters I'm looking at almost $240 from what I've seen, and I've heard they don't necessarily work that well. If I do upgrade monitors, what do you guys think would be the best option? Thank you everyone.


I've used several 4K monitors; it's nice but not something I would want all the time. I think the sweet spot right now for price, performance and the ability to run on one Fury card is 1440p 144Hz. I really like my BenQ XL2730Z.


----------



## mRYellow

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, as soon as I put the new Fury in my games started hard-locking again. I'm getting about ready to just go to Nvidia if I can't have a decently working AMD solution.


Strange. My RIG is rock solid.


----------



## Alastair

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, as soon as I put the new Fury in my games started hard-locking again. I'm getting about ready to just go to Nvidia if I can't have a decently working AMD solution.


Then it's not the GPU that's the problem.


----------



## looncraz

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, as soon as I put the new Fury in my games started hard-locking again. I'm getting about ready to just go to Nvidia if I can't have a decently working AMD solution.


You have a problem elsewhere, then; you need to start troubleshooting the rest of your system.

Far too often people blame the wrong hardware (or software).

If the Fury is the only changing variable, then you have to look at power delivery, the PCIe slot itself, and Windows driver configuration corruption.

My system was rock-stable with my 7870 XT for years; when I installed an R9 290 it became unstable in quite a few games. I thought it was the video card, so I swapped back to my 7870 XT, which resolved the problems, and I installed the R9 290 in one of my other systems - and it was rock solid there! I wiped the drivers, cleaned up everything I could, and the problem still persisted on my Windows 7 x64 install. So I considered the installation borked, made a backup of my user profile, reinstalled with all fresh drivers and the R9 290 in from the very first moment, and have never had an issue since.

Hope you don't have to go through all that... it sucks!


----------



## xer0h0ur

Quote:


> Originally Posted by *flopper*
> 
> I wouldnt use crossfire on cs:go as you would want as latency low as possible.


Yeah, I already knew that, but here is the kicker: I had already disabled CrossFire altogether using an application profile. Despite this I was still experiencing CrossFire latency, so I went into the CCC and also disabled CrossFire there, which, coupled with removing the framerate limiter, dropped frametimes from 16.9ms to 2-4ms. I also observed that uncapping the framerate altogether with the fps_max 0 command further reduced frametimes in certain situations. CS:GO is just one oddball DX9 game. I wonder when they will port it over to the Source 2 engine.

Edit: Just to be clear, any external frame limiter introduced frametime latency. So whether you're using RivaTuner's frame limiter, RadeonPro, or the CCC's frame limiter, it doesn't matter; any of them will cause this in CS:GO. It's just one of those games where you don't want V-Sync or any other bloody thing controlling the framerate at all.

I was getting really pissed off that I suddenly was getting rekt in that game. I was near de-ranking from DMG back to MGE because of the frametime issues. Things seem to be okay now.
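For anyone following the numbers: frametime and framerate are just reciprocals of each other (FPS = 1000 / frametime in ms), so a steady 16.9ms frametime is exactly what a ~59 FPS limiter produces. A quick sketch of the arithmetic, using the figures from this post:

```python
def frametime_to_fps(ms: float) -> float:
    """Convert a frametime in milliseconds to frames per second."""
    return 1000.0 / ms

def fps_to_frametime(fps: float) -> float:
    """Convert frames per second to a frametime in milliseconds."""
    return 1000.0 / fps

# A steady 16.9 ms frametime is what a ~59 FPS limiter produces:
capped = frametime_to_fps(16.9)   # ~59.2 FPS
# Uncapped frametimes of 2-4 ms correspond to 250-500 FPS:
fast = frametime_to_fps(4.0)      # 250 FPS
faster = frametime_to_fps(2.0)    # 500 FPS
```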


----------



## Medusa666

Quote:


> Originally Posted by *fjordiales*
> 
> AFAIK, all 3 don't have coil whine. I will try to record trifire 3dmark especially the custom fan I have.
> 
> But short answer, needs 40-50% to keep temps at 65-75. Trifire temps are 70/80/65 on witcher 3. Above 60% fan is whiny. On default fan profile, fan starts at about 65 degrees.




Hi Sir, thank you for your informative reply.

I'm considering picking one of these cards up. I was initially looking at the Fury X, and bought two of them, but returned both due to pump noise and coil whine. I'm very sound-sensitive, so I only use Noctua stuff and extremely silent graphics cards.

I got a 295X2 today. I love the card to bits, but sometimes CrossFire is not supported, and the small fan cooling the VRM together with the pump makes some unwanted noise; I have replaced the radiator fan with a Noctua one though.

The Sapphire Fury would be my go-to, but it seems to have a high amount of coil whine on nearly every example; what seems to be good about the ASUS is the concrete chokes, which are supposed to reduce coil whine.

Anyone else care to pitch in here?

If the card is virtually coil-whine free, I think I'll go for it.

: )


----------



## Thoth420

Quote:


> Originally Posted by *battleaxe*
> 
> 4K all the way. If you look at it for a couple of days you will never, ever want to go back. The 1080p displays my wife and kids use look like they have peanut butter smeared all over them in comparison. It's that fuzzy. Seriously huge upgrade.


What size panel are you looking at? ...because I cannot tell any difference between 4K at 28 inches and 2560x1440 at 27 inches at all...

Essentially this:
Quote:


> Originally Posted by *Scorpion49*
> 
> I've used several 4K monitors; it's nice but not something I would want all the time. I think the sweet spot right now for price, performance and the ability to run on one Fury card is 1440p 144Hz. I really like my BenQ XL2730Z.


I have tried almost every high-end panel, so I think I have enough experience... high-refresh 1440p destroys 60Hz 4K all day.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Hi Sir, thank you for your informative reply.
> 
> I'm considering picking one of these cards up. I was initially looking at the Fury X, and bought two of them, but returned both due to pump noise and coil whine. I'm very sound-sensitive, so I only use Noctua stuff and extremely silent graphics cards.
> 
> I got a 295X2 today. I love the card to bits, but sometimes CrossFire is not supported, and the small fan cooling the VRM together with the pump makes some unwanted noise; I have replaced the radiator fan with a Noctua one though.
> 
> The Sapphire Fury would be my go-to, but it seems to have a high amount of coil whine on nearly every example; what seems to be good about the ASUS is the concrete chokes, which are supposed to reduce coil whine.
> 
> Anyone else care to pitch in here?
> 
> If the card is virtually coil-whine free, I think I'll go for it.
> 
> : )


I'm gonna get some heat for this, but as good as the Strix is, the Tri-X is just better. I've made comments that if the Tri-X were red and black, I would get it. The advantage of the Strix is the custom PCB, BUT it's NERFED on voltage. I get 1.7v on all 3 under a full-load benchmark.

Also, the Strix has BIOSes that are voltage-unlocked. I flashed them from here.

http://forum.hwbot.org/showthread.php?t=142320

Basically, the Tri-X has a more efficient cooler and better water block support. Luckily for me, all 3 of my Strix cards have no coil whine.


----------



## Medusa666

Quote:


> Originally Posted by *fjordiales*
> 
> I'm gonna get some heat for this, but as good as the Strix is, the Tri-X is just better. I've made comments that if the Tri-X were red and black, I would get it. The advantage of the Strix is the custom PCB, BUT it's NERFED on voltage. I get 1.7v on all 3 under a full-load benchmark.
> 
> Also, the Strix has BIOSes that are voltage-unlocked. I flashed them from here.
> 
> http://forum.hwbot.org/showthread.php?t=142320
> 
> Basically, the Tri-X has a more efficient cooler and better water block support. Luckily for me, all 3 of my Strix cards have no coil whine.


Thank you for yet another clarification.

To me, the fact that you got zero coil whine out of three cards is very positive, and it helps me out in my decision here.

I think I will read a few more reviews, but if nothing major happens I will most likely sell my 295X2 tomorrow and buy a ASUS Fury Strix.

Thanks for your input, it has been valuable.

: )


----------



## xer0h0ur

I vaguely remember the rumor of another Asus card coming soon being confirmed. A Fury variant which will likely actually take advantage of the custom PCB that the Strix isn't really doing anything with.


----------



## Medusa666

A wet dream would be an MSI Fury Lightning, but hey, that is never going to happen; and if it does, I will throw my money at MSI.


----------



## xer0h0ur

There is nothing stopping MSI from making a Fury Lightning. They just can't make a Fury X Lightning.


----------



## Neon Lights

Quote:


> Originally Posted by *fjordiales*
> 
> I'm gonna get some heat for this, but as good as the Strix is, the Tri-X is just better. I've made comments that if the Tri-X were red and black, I would get it. The advantage of the Strix is the custom PCB, BUT it's NERFED on voltage. I get 1.7v on all 3 under a full-load benchmark.
> 
> Also, the Strix has BIOSes that are voltage-unlocked. I flashed them from here.
> 
> http://forum.hwbot.org/showthread.php?t=142320
> 
> Basically, the Tri-X has a more efficient cooler and better water block support. Luckily for me, all 3 of my Strix cards have no coil whine.


Hey, could you please say something about your overclocking with the unlocked-voltage BIOSes? I asked earlier; I am very interested to know how overclocking works on the Fury Strix without hard-volt modding it.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Thank you for yet another clarification.
> 
> To me, the fact that you got zero coil whine out of three cards is very positive, and it helps me out in my decision here.
> 
> I think I will read a few more reviews, but if nothing major happens I will most likely sell my 295X2 tomorrow and buy a ASUS Fury Strix.
> 
> Thanks for your input, it has been valuable.
> 
> : )


Quote:


> Originally Posted by *Neon Lights*
> 
> Hey, could you please say something about your overclocking with the unlocked-voltage BIOSes? I asked earlier; I am very interested to know how overclocking works on the Fury Strix without hard-volt modding it.


I'm not sure how multi-quote works on my phone, but my info is for you both. I believe it was Neon who posted the link to the voltage-unlocked Strix BIOS, so thanks for that.

I have tried the voltage-unlocked BIOS on air, BUT it only works with Asus GPU Tweak for now. The problem with that is, for some reason, I can't OC the memory on that BIOS and GPU Tweak combo. Also, the BIOS was based off the 1st one, not the latest BIOS, so I just went back to the original BIOS. Then unlocked it.

I got a little frustrated, so I went back to Afterburner and unlocked 4low on my own.

Also, my idle temps on that BIOS with the vGPU unlock are high. I made sure I installed the correct BIOS too.

Short story: I didn't do a complete benchmark and got frustrated, so I went back to stock and unlocked 3840 CUs with the latest BIOS instead. And that BIOS kinda acts up in xfire and trifire.

Also, I think an Amazon US customer said he had a little coil whine on the Strix. I'm just lucky I don't have it.


----------



## Neon Lights

Quote:


> Originally Posted by *fjordiales*
> 
> I'm not sure how multi-quote works on my phone, but my info is for you both. I believe it was Neon who posted the link to the voltage-unlocked Strix BIOS, so thanks for that.
> 
> I have tried the voltage-unlocked BIOS on air, BUT it only works with Asus GPU Tweak for now. The problem with that is, for some reason, I can't OC the memory on that BIOS and GPU Tweak combo. Also, the BIOS was based off the 1st one, not the latest BIOS, so I just went back to the original BIOS. Then unlocked it.
> 
> I got a little frustrated, so I went back to Afterburner and unlocked 4low on my own.
> 
> Also, my idle temps on that BIOS with the vGPU unlock are high. I made sure I installed the correct BIOS too.
> 
> Short story: I didn't do a complete benchmark and got frustrated, so I went back to stock and unlocked 3840 CUs with the latest BIOS instead. And that BIOS kinda acts up in xfire and trifire.
> 
> Also, I think an Amazon US customer said he had a little coil whine on the Strix. I'm just lucky I don't have it.


But what I wanted to know is (if you did do so) how the overclocking works (even if only on the GPU): whether you can use ASUS GPU Tweak to adjust the voltage and the core frequency, and if so, whether you could overclock it further than without changing the voltage.


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> What size panel are you looking at? ...because I cannot tell any difference between 4K at 28 inches and 2560x1440 at 27 inches at all...
> 
> Essentially this:
> I have tried almost every high-end panel, so I think I have enough experience... high-refresh 1440p destroys 60Hz 4K all day.


28" and that's funny as I couldn't tell much difference between 1080p and 1440. But with 4k I saw a huge difference. I guess we are all different is what it really comes down to. I've not had the experience of playing on a 120hz or 144hz display. Pretty happy with my 4k at 60hz.


----------



## Thoth420

Quote:


> Originally Posted by *battleaxe*
> 
> 28" and that's funny as I couldn't tell much difference between 1080p and 1440. But with 4k I saw a huge difference. I guess we are all different is what it really comes down to. I've not had the experience of playing on a 120hz or 144hz display. Pretty happy with my 4k at 60hz.


Indeed as long as you are happy that is all that really matters.


----------



## Wage

Quote:


> Originally Posted by *Ceadderman*
> 
> It may or may not be coming. Only AMD knows for sure at this point.
> 
> Since the Fury X2 has yet to launch, I've a mind that they may be waiting until that launches sometime this fall.
> 
> ~Ceadder


More like Q2 2016 at this rate, at least if it follows 295X2's example.

A Q4 2015 press release would be nice for the holidays though!

EDIT: Oh wait, I didn't know PCB pics were leaked in June. Guess there's hope for Q4 2015 after all?


----------



## Ceadderman

They have been saying this fall, so anything is possible at this point.

It would not surprise me if that's where the HBM has been going. Considering how in demand the R9 290X was in the past, it makes a lot of sense for AMD to put lower-end HBM cards on hold to fit their high-end cards into the schedule and get them out on time.

~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Wage*
> 
> More like Q2 2016 at this rate, at least if it follows 295X2's example.
> 
> A Q4 2015 press release would be nice for the holidays though!
> 
> EDIT: Oh wait, I didn't know PCB pics were leaked in June. Guess there's hope for Q4 2015 after all?


LOL no. Q2 or Q3 is when they will be launching Arctic Islands, as around that time frame is when HBM2 will finally be in mass production. The Fury X2 will launch this year. How much of it will be available is another story altogether. With every launch those fully assembled interposers keep getting scarcer; it's all due to one part of the assembly only recently going into mass production. Dies are in good supply, as is HBM.


----------



## TK421

Quote:


> Originally Posted by *TK421*
> 
> Any user vBIOS mods yet that allow higher power consumption than stock?


Stock PCB, Fury X.

Guys?


----------



## mRYellow

Quote:


> Originally Posted by *TK421*
> 
> Stock PCB, Fury X.
> 
> Guys?


No, not that I'm aware of.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Thank you for yet another clarification.
> 
> To me, the fact that you got zero coil whine out of three cards is very positive, and it helps me out in my decision here.
> 
> I think I will read a few more reviews, but if nothing major happens I will most likely sell my 295X2 tomorrow and buy a ASUS Fury Strix.
> 
> Thanks for your input, it has been valuable.
> 
> : )


I have the fan profile I use for the Strix here:


Here's the vid; it's kinda sloppy, but it gives you an idea of the fan noise. And sorry for the dryer noise, I was doing laundry.




Quote:


> Originally Posted by *Neon Lights*
> 
> But what I wanted to know is (if you did do so) how the overclocking works (even if only on the GPU): whether you can use ASUS GPU Tweak to adjust the voltage and the core frequency, and if so, whether you could overclock it further than without changing the voltage.


Sorry, I was not able to do that one. I was only able to add voltage, but I got frustrated with the memory OC that won't sync. Plus my idle temps on that BIOS were kinda high. I think that BIOS was more for a single GPU being pushed to its limit.


----------



## Gumbi

How hot does she get under load? Say at 50% fan speed? Core/VRMs?


----------



## fjordiales

Quote:


> Originally Posted by *Gumbi*
> 
> How hot does she get under load? Say at 50% fan speed? Core/VRMs?


Top/Middle/Bottom (these are just observations while playing Witcher 3):

75/80/65°C - Peak

70/75/60°C - Average

84/84/80°C - Peak on the default fan profile.

77/80/70°C - Average on the default fan profile.

Also, on the default fan profile it keeps the fans at 45% (60% on the middle card) on average to "balance" silence and performance. It really depends on the game, though. There are times after I'm done playing (mostly after 4 hours) when the fan speed stays at 50-60% until the card drops to 40°C, then the fans shut off. Also, the fans don't start to spin until I hit 65°C on the default profile. I asked for custom fan profiles from some members here and just did trial and error from what they have. I only got replies from Tri-X owners, though.
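For context, a custom fan profile like the one described is just a set of (temperature, fan %) points that the control software interpolates between; that's why the fans sit at 0% until a threshold and ramp up from there. A minimal sketch in Python - the curve points below are hypothetical for illustration, not anyone's actual profile:

```python
# Linearly interpolate a fan duty cycle from a list of (temp C, fan %) points.
# These curve points are made up for illustration; tune them to your own card.
CURVE = [(40, 0), (65, 45), (75, 60), (85, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Return the fan duty (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: fans off / minimum
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

With this hypothetical curve, 70°C lands halfway between the 65°C/45% and 75°C/60% points, so the fans would run at 52.5%.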


----------



## Otterfluff

So I plumbed my Fury X into a 420 Monsta rad. Just the Fury X, 15cm of neoprene tubing (10mm ID, 16mm OD) and two 10mm barb fittings plus cable ties.













More details in my build log: http://forums.overclockers.com.au/showthread.php?p=16874111#post16874111


----------



## aznguyen316

^ Haha, wow, that is nuts. You should put your CPU under water too.


----------



## Otterfluff

Quote:


> Originally Posted by *aznguyen316*
> 
> ^ Haha, wow, that is nuts. You should put your CPU under water too.


The CPU is in the works; I'm just waiting on parts in the mail.


----------



## aznguyen316

Quote:


> Originally Posted by *Otterfluff*
> 
> The CPU is in the works; I'm just waiting on parts in the mail.


Awesome, yeah; I realized that after I went to your build log. =D


----------



## Otterfluff

Here are two clearer images I took with a better camera.





Any idea how I could clean up that loose cable sleeving?


----------



## Scorpion49

Quote:


> Originally Posted by *Otterfluff*
> 
> Here are two clearer images I took with a better camera.
> 
> Any idea how I could clean up that loose cable sleeving?


Burn it off with a lighter; the ends will melt together and stop it from fraying afterwards.


----------



## Otterfluff

I tried some more benching with FurMark and Afterburner.

Not much luck with core overclocking, but the HBM memory is getting better results than on the stock cooler.

I've benched the HBM at 620MHz in FurMark for about an hour now with +50% power. My average core temperature is 41°C.

I did try 650MHz but it gets artifacts. I will try to work the range upwards from 620MHz tomorrow.


----------



## fjordiales

Anyone seen these yet?

http://www.techpowerup.com/216195/xfx-also-readies-its-radeon-r9-fury-air-cooled-graphics-card.html

http://www.techpowerup.com/216194/xfx-readies-a-liquid-cooled-radeon-r9-fury.html


----------



## Scorpion49

Quote:


> Originally Posted by *fjordiales*
> 
> Anyone seen these yet?
> 
> http://www.techpowerup.com/216195/xfx-also-readies-its-radeon-r9-fury-air-cooled-graphics-card.html
> 
> http://www.techpowerup.com/216194/xfx-readies-a-liquid-cooled-radeon-r9-fury.html


That air cooled card is a carbon copy of the Tri-X.


----------



## aznguyen316

And the PowerColor one too!


----------



## Semel

*Unreal Engine 4 INFILTRATOR Tech Demo DX11 Vs DX12 GTX 980 TI Vs AMD Fury X FPS Comparison. Disappointing*






And the 980 Ti is a reference card here... so, considering most Tis can be OCed to at least 1400MHz+ without even touching voltage control... oh well. I'm still waiting patiently for my Sapphire Tri-X Fury to arrive.


----------



## Neon Lights

Quote:


> Originally Posted by *Semel*
> 
> *Unreal Engine 4 INFILTRATOR Tech Demo DX11 Vs DX12 GTX 980 TI Vs AMD Fury X FPS Comparison. Disappointing*
> 
> 
> 
> 
> 
> 
> And the 980 Ti is a reference card here... so, considering most Tis can be OCed to at least 1400MHz+ without even touching voltage control... oh well. I'm still waiting patiently for my Sapphire Tri-X Fury to arrive.


I also ran the Elemental tech demo (with a Fury X) and noticed that the performance was not great at all, perhaps even worse than with DirectX 11, which is also the case with the Infiltrator tech demo. So I would say that even though these programs support DirectX 12 as a rendering path, they are so badly optimized for it that they make very poor DirectX 12 benchmarks.


----------



## SpeedyVT

Quote:


> Originally Posted by *Semel*
> 
> *Unreal Engine 4 INFILTRATOR Tech Demo DX11 Vs DX12 GTX 980 TI Vs AMD Fury X FPS Comparison. Disappointing*
> 
> 
> 
> 
> 
> 
> And the 980 Ti is a reference card here... so, considering most Tis can be OCed to at least 1400MHz+ without even touching voltage control... oh well. I'm still waiting patiently for my Sapphire Tri-X Fury to arrive.


Looking at that video, there was literally no difference between the 980 Ti's and Fury X's frames in intense scenes; in any of the similar, less complex scenes the 980 Ti easily did 20% more. Probably because of its ROP configuration rather than its sheer power. This is why the Fury X doesn't top out in simplistic scenes.

If this were indeed properly utilizing DX12, the frametimes would have been much lower; I don't see this benchmark getting much credibility.

Most people determine a video card's absolute performance by its average or highest FPS, but the best way to determine absolute performance is by the card's lowest average FPS, or how well it avoids losing frames in a suddenly complex scene. Judged the usual way, benchmarks tend to be completely biased toward the card with the greater ROP count rather than the card with the most power and performance.

Not saying the Fury is better, but going by this benchmark it clearly handles intense scenes better.
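The "lowest average FPS" idea above is essentially what reviews report as 1% lows: the average framerate over only the slowest frames in a capture. A minimal sketch of both metrics from a list of frametimes (the sample numbers are made up for illustration):

```python
def avg_fps(frametimes_ms):
    """Average FPS over a capture: total frames / total time."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    """Average FPS over the slowest 1% of frames (the '1% low')."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # at least one frame
    slowest = worst[:n]
    return 1000.0 * len(slowest) / sum(slowest)

# e.g. 99 smooth frames at 10 ms plus one 50 ms stutter:
sample = [10.0] * 99 + [50.0]
smooth = avg_fps(sample)            # ~96.2 FPS average looks great...
low = one_percent_low_fps(sample)   # ...but the stutter frame ran at 20 FPS
```

This is why two cards with the same average FPS can feel completely different: the average hides the sudden complex-scene drops the post is talking about.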


----------



## GorillaSceptre

Yeah, I don't think a benchmark running 20fps lower in DX12 should be taken seriously.


----------



## jase78

Yeah, but that video is comparing the Fury X and the 980 Ti, which is not even close to an apples-to-apples comparison. Remember, the 980 is the Nvidia equivalent of the Fury X, not the Ti.


----------



## EpicOtis13

Quote:


> Originally Posted by *jase78*
> 
> Yeah, but that video is comparing the Fury X and the 980 Ti, which is not even close to an apples-to-apples comparison. Remember, the 980 is the Nvidia equivalent of the Fury X, not the Ti.


Lol, what? The Fury non-X beats the 980, and the Fury X beats it by quite a bit; the Fury X and 980 Ti are very, very close, especially at higher resolutions.


----------



## WheelZ0713

Quote:


> Originally Posted by *EpicOtis13*
> 
> Lol, what? The Fury non-X beats the 980, and the Fury X beats it by quite a bit; the Fury X and 980 Ti are very, very close, especially at higher resolutions.


This^

The Fury smashes the 980, let alone putting it up against the Fury X.

The Fury lines up against the 980, and the Fury X lines up against the 980 Ti.


----------



## Kana-Maru

The Fury X was initially aimed at the GTX 980. The Titan X released in the spring. Apparently the 980 Ti was supposed to release in June, but Nvidia released it earlier. I wonder what else was supposed to release in June [the Fury X was revealed at E3 and released on June 24th]. I guess AMD did all they could and decided to keep things moving regardless of the 980 Ti "surprise".


----------



## WheelZ0713

That's interesting; I wasn't aware it was intended that way. It gives a whole new perspective on how well the AMD cards performed against Nvidia.

The Fury X is well above the 980, its intended competitor, and only falls short of the Ti, the unexpected competitor, by around 5%.


----------



## MadRabbit

Hey guys and girls.

Since I've got some cash flow going, I was thinking about upgrading from my 280X setup to Fury (non-X) CFX.

Do you think it's a solid move, or should I just wait for the new gen?

My 280X CFX is not bad for what I do, but the upgrade itch is real.


----------



## WheelZ0713

Quote:


> Originally Posted by *MadRabbit*
> 
> Hey guys and girls.
> 
> Since I've got some cash flow going, I was thinking about upgrading from my 280X setup to Fury (non-X) CFX.
> 
> Do you think it's a solid move, or should I just wait for the new gen?
> 
> My 280X CFX is not bad for what I do, but the upgrade itch is real.


Personally, I couldn't wait and bought the Tri-X. Really happy with the performance, especially after unlocking a few of the CUs. The only thing I find vaguely annoying is that there isn't much OC headroom, at least not until someone can tell us how to unlock the voltage.

Overall a really solid card, in my opinion. I can't see myself upgrading for a while now.


----------



## Kana-Maru

Quote:


> Originally Posted by *WheelZ0713*
> 
> That's interesting, I wasn't aware it was intended that way. It gives a whole new perspective on how well the AMD cards performed against Nvidia.
> 
> The Fury X is well above the 980, its intended competitor, and only falls short of the Ti, the unexpected competitor, by around 5%.


Yeah, that's the part people always leave out. Nvidia couldn't allow the Fury X to release against their then-current consumer flagship, the GTX 980. I feel bad for GTX 980 purchasers, and I would be PO'd if I had purchased a GTX 970 or a 980 [non Ti].

The biggest difference between the Fury X and the 980 Ti is that the 980 Ti has a lot of overclocking headroom, which adds stress, power usage and heat to the 980 Ti's components. Even then, the jump in performance is no more than 6%-10% for a highly overclocked 980 Ti vs a mildly OC'd Fury X.


----------



## EpicOtis13

Quote:


> Originally Posted by *MadRabbit*
> 
> Hey guys and girls.
> 
> Since I got a cash flow going I was thinking about upgrading from my 280x to Fury (non X) CFX.
> 
> Do you think its a solid thing to do or just wait out for the new gen?
> 
> My 280x CFX is not bad for what I do but the upgrade itch is real


If you are at 1080p or 1440p just wait, and upgrade to AMD Greenland cards next summer, which will blow the Furies and Titans out of the water.


----------



## WheelZ0713

Quote:


> Originally Posted by *EpicOtis13*
> 
> If you are at 1080p or 1440p just wait, and upgrade to AMD Greenland cards next summer, which will blow the Furies and Titans out of the water.


If you're playing at 1080 then there's no real need to wait, the fury will be more than powerful enough.


----------



## Forceman

Quote:


> Originally Posted by *WheelZ0713*
> 
> If you're playing at 1080 then there's no real need to wait, the fury will be more than powerful enough.


If you are at 1080p I think you would be better off getting a 290X/390/390X to hold you over until the new cards drop. Fury performance at 1080p is not worth the extra cost over those cards.


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> If you are at 1080p I think you would be better off getting a 290X/390/390X to hold you over until the new cards drop. Fury performance at 1080p is not worth the extra cost over those cards.


This


----------



## Medusa666

I currently own a 295X2. I love this card, but I have been considering other options lately. My CPU and motherboard died, and it made me think in terms of components simply continuing to function for a long time. What is the lifespan of this card, three years? I mean, looking at the AIO solution and the VRM, it is hard to find info about the details of the components.

I have had my eyes on the Asus Fury Strix for a month now. I like its oversized VRM and power delivery and the zero-dB mode; the overall quality feels superior to the 295X2.

I also had problems with crossfire support being weak or nonexistent in a few titles when I began playing them: Witcher 3, Heroes of the Storm and 7 Days to Die. This bothered me a bit because the performance was then only that of a single 290X.

I bought two Fury X but was disappointed since both had the pump noise issue.


----------



## netman

Do any of the Sapphire Fury Tri-X owners have UEFI working? Whenever I switch on Ultra Fast Boot on my ASRock Z77 Extreme6 mobo I get error 97 (no graphics card found); without UEFI the card works normally on the same board. It's the same on both BIOS switch positions.

hmm


----------



## BlackyMeow

Still no voltage control ? This is getting ridiculous..


----------



## en9dmp

So Fury X2 looks like it only has 2 8-pin power connectors... Surely this means performance will be more like Nano CF rather than Fury X CF?


----------



## Forceman

Quote:


> Originally Posted by *en9dmp*
> 
> So Fury X2 looks like it only has 2 8-pin power connectors... Surely this means performance will be more like Nano CF rather than Fury X CF?


I don't think that's necessarily the case. The 295X2 only has 2 8-pin connectors also, and it is two full Hawaii chips.


----------



## en9dmp

Quote:


> Originally Posted by *Forceman*
> 
> I don't think that's necessarily the case. The 295X2 only has 2 8-pin connectors also, and it is two full Hawaii chips.


True, but the 290X was only a 6-pin and an 8-pin, so 2x 8-pin is still more... I'm just intrigued how it can pull enough power to run two fully clocked Fiji XTs with the same power connectors used to run one. The Fury X has 2x 8-pin...


----------



## Thoth420

Quote:


> Originally Posted by *BlackyMeow*
> 
> Still no voltage control ? This is getting ridiculous..


Is it due to a lack of the dude still having a fury x to play with?


----------



## Scorpion49

Quote:


> Originally Posted by *netman*
> 
> Do any of the Sapphire Fury Tri-X owners have UEFI working? Whenever I switch on Ultra Fast Boot on my ASRock Z77 Extreme6 mobo I get error 97 (no graphics card found); without UEFI the card works normally on the same board. It's the same on both BIOS switch positions.
> 
> hmm


Z77 is probably a hybrid UEFI/BIOS which is giving you the problems. Works fine on every Z97/Z170/X99 board I've used. Even works on FM2+.


----------



## netman

Hmm, I don't know, but with the CPU's integrated GPU (3770K and 2600K) UEFI Ultra Fast Boot on my ASRock Z77 Extreme6 works like a charm, and my old MSI 7970 Lightning also worked perfectly with UEFI Ultra Fast Boot. Only the Sapphire Fury Tri-X gives me an error and won't boot with UEFI on.

The only way to get the machine working again is a CMOS reset so Ultra Fast Boot is no longer set...


----------



## nadja92

Hoping to get a Fury X soon; do any of the reference cards fit the Aqua Computer water blocks?


----------



## Scorpion49

Quote:


> Originally Posted by *netman*
> 
> hmm i don't know but with the internal gpu of the cpu (3770K and 2600K) uefi ultra fast boot on my Z77 Asrock Xtreme 6 works like a charm - also my old msi 7970 lightning worked perfect with uefi ultra fast boot - just the Sapphire fury trix gives me an error and won't boot with uefi on
> 
> only way to get machine working again is to make a cmos reset so ultra fast boot is not set anymore ...


Well there is your problem already, the 7970 Lightning is NOT a UEFI vBIOS, so your board is using a hybrid or legacy mode for fast-boot with that.


----------



## netman

Well, you are right, the 7970 Lightning has no UEFI BIOS at stock,

but MSI support gave out real UEFI GOP BIOSes for the 7970 Lightning and 7970 Lightning Boost Edition when you requested them within their official forums, which is what I did.

So my Lightning definitely had a UEFI GOP BIOS, and it worked perfectly with Win 8.1 and this Z77 mobo in Ultra Fast Boot mode, as I said.


----------



## By-Tor

Quote:


> Originally Posted by *Forceman*
> 
> I don't think that's necessarily the case. The 295X2 only has 2 8-pin connectors also, and it is two full Hawaii chips.


Look at the PowerColor 290X Devil 13, an air-cooled dual-GPU card with four 8-pin connectors.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584


----------



## Gumbi

How hot does she get under load? Say at 50% fan speed
Quote:


> Originally Posted by *By-Tor*
> 
> Look at the PowerColor 290X Devil 13, an air-cooled dual-GPU card with four 8-pin connectors.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584


Non-binned chips, plus they might just be playing it safe by adhering to the PCIe connector specs.


----------



## richie_2010

Does any kind person with a Nano care to assist me with some PCB dimensions and screw locations??

Just shoot me a PM.

Many thanks!


----------



## wdpir32k3

Just got my Sapphire Fury in today and I love it. I also got my Acer 144Hz with FreeSync, and I'll tell you, this card kicks butt.


----------



## xer0h0ur

Quote:


> Originally Posted by *en9dmp*
> 
> True, but the 290X was only a 6-pin and an 8-pin, so 2x 8-pin is still more... I'm just intrigued how it can pull enough power to run two fully clocked Fiji XTs with the same power connectors used to run one. The Fury X has 2x 8-pin...


Because AMD didn't care about drawing more power through the 8-pin cables than the specification calls for, and the Fury X doesn't use all of the power available to it either.


----------



## Gdourado

Anyone has a fury X on a mini ITX build?
If so, what case did you use?

Cheers!


----------



## looncraz

Quote:


> Originally Posted by *xer0h0ur*
> 
> Because AMD didn't care about drawing more power through the 8-pin cables than the specification calls for and well the Fury X doesn't use all of the power available to it either.


PCIe power specifications say an 8-pin power cable can deliver 150 watts, and you can pull 75 watts from the slot.

So the Fury X2's power configuration maxes out at 375W.

So, yeah, AMD may skirt the line with an X2, and maybe gently step over it









EDIT:

/sarc


----------



## xer0h0ur

Quote:


> Originally Posted by *looncraz*
> 
> PCi-E power specifications say an 8-pin power cable can deliver 150 Watts and you can pull 75 watts from the board.
> 
> So Fury X[2]'s power configuration maxes at 375W.
> 
> So, yeah, AMD may skirt the line with an X2, and maybe gently step over it


What part of "they didn't care about drawing more power through the 8-pins than the specification calls for" did you not understand? The 295X2 doesn't care about the power draw spec for 8-pin cables.


----------



## Agent Smith1984

Isn't it more likely that the Fury X2 is going to be two 175W Nanos?


----------



## Forceman

Quote:


> Originally Posted by *looncraz*
> 
> PCi-E power specifications say an 8-pin power cable can deliver 150 Watts and you can pull 75 watts from the board.
> 
> So Fury X[2]'s power configuration maxes at 375W.
> 
> So, yeah, AMD may skirt the line with an X2, and maybe gently step over it


Tell that to the 500W 295X2.


----------



## looncraz

Quote:


> Originally Posted by *xer0h0ur*
> 
> What part of they didn't care about drawing more power through the 8-pins than the specification calls for did you not understand? The 295X2 doesn't care about the power draw spec for 8-pin cables


Ugh, I was trying to be sarcastic









I even meant to put a /sarc tag... my bad


----------



## en9dmp

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Isn't it more likely that fury x2 is going to (2) 175w nano's?


That was the point of my original post, but it seems that cards are capable of drawing more power than the spec allows. I wasn't aware of this...

If that's the case, what is preventing the Nano from drawing 250W through its single 8-pin connector and basically performing like a Fury X? I assume the BIOS is what determines the power draw, but can this not be modified?


----------



## xer0h0ur

Well, without going over spec you can draw 150W on the 8-pin + 75W from the PCI-E slot for 225W total. Going a meager 25W over that isn't really a stretch at all.
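The connector arithmetic in this exchange is easy to sanity-check. A minimal sketch, assuming the nominal PCIe figures quoted above (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin auxiliary connector); the function name is just illustrative:

```python
# Nominal PCIe power limits (watts): slot, 6-pin aux, 8-pin aux.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def in_spec_budget(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Total board power available without exceeding the nominal spec."""
    return SLOT_W + SIX_PIN_W * six_pin + EIGHT_PIN_W * eight_pin

print(in_spec_budget(eight_pin=1))  # Nano, 1x 8-pin: 225
print(in_spec_budget(eight_pin=2))  # Fury X / Fury X2, 2x 8-pin: 375
```

As the 295X2 (~500 W through the same 2x 8-pin layout) shows, these are connector ratings with safety margin rather than hard electrical limits, which is the whole point of the disagreement above.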


----------



## Neon Lights

Quote:


> Originally Posted by *Neon Lights*
> 
> Aqua Computer water blocks got delivered to me today; a few hours ago I mounted them and integrated them into my loop (which is shared with my mainboard water blocks). Here is how it looks:
> 
> 
> 
> 
> I have to admit, though, that I totally stripped part of the threads in the Aqua Computer kryoConnect. The set screw I screwed into it for correct water flow just would not thread in past the gap where the opening of the water outlet is. I could not get it back out by turning it backwards either, so I drove it as far as possible into the thread. It is now screwed in completely wrong, but there does not seem to be a problem with the water flow.


Quote:


> Originally Posted by *nadja92*
> 
> Hoping to get a Fury X soon; do any of the reference cards fit the Aqua Computer water blocks?


You mean these? All the Fury X reference cards have the same PCB layout; no matter which brand you buy them from, they are all made by the same manufacturer (AMD themselves, as far as I know).


----------



## nadja92

Quote:


> Originally Posted by *Neon Lights*
> 
> You mean these? All the Fury X reference cards have the same PCB layout, no matter from which manufacturer you buy them they are all made by the same manufacturer (AMD themselves as far as I know).


That's good to know, I was looking at the sapphire card. Now it's just staring at prices waiting for money to hit my bank.


----------



## Sickened1

Quote:


> Originally Posted by *Forceman*
> 
> If you are at 1080p I think you would be better off getting a 290X/390/390X to hold you over until the new cards drop. Fury performance at 1080p is not worth the extra cost over those cards.


What about 1440p? I'm at 1080p and looking to pick up a new Fury Tri-X this weekend. I am looking to move up to 1440p, perhaps higher, around Black Friday.


----------



## WheelZ0713

Quote:


> Originally Posted by *Sickened1*
> 
> What about 1440p? I'm at 1080p and looking to pick up a new Fury Tri-X this weekend. I am looking to move up to 1440p, perhaps higher, around Black Friday.


I have a Tri-X and currently run at 1080p, averaging 160fps with everything maxed. As such I will be upgrading to 144Hz @ 1440p sometime next week.


----------



## Silent Scone

Quote:


> Originally Posted by *WheelZ0713*
> 
> I have a Tri-X and currently run at 1080p, averaging 160fps with everything maxed. As such I will be upgrading to 144Hz @ 1440p sometime next week.


What a waste


----------



## Sickened1

Quote:


> Originally Posted by *WheelZ0713*
> 
> I have a Tri-X and currently run at 1080p, averaging 160fps with everything maxed. As such I will be upgrading to 144Hz @ 1440p sometime next week.


Perfect. Just ordered one this morning. It will be here tomorrow! Thank you Amazon and your $9 next-day shipping!


----------



## WheelZ0713

Quote:


> Originally Posted by *Sickened1*
> 
> Perfect. Just ordered one this morning. It will be here tomorrow! Thank you Amazon and your $9 next-day shipping!


Nice! You'll be stoked mate.

I'm thinking i might have to pick this bad boy up while i'm out and about today.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=558_1213&products_id=31957


----------



## richie_2010

Quote:


> Originally Posted by *richie_2010*
> 
> does any kind person with a nano care to assist me with some pcb dimensions and screw locations??
> 
> just shoot me a pm
> 
> many thanks


Is anyone able to assist? I have read that people are not liking them, given how close they are to the Fury X price and the nasty coil whine, but I would have thought someone here would have one.


----------



## xer0h0ur

Quote:


> Originally Posted by *Sickened1*
> 
> What about 1440p? I'm at 1080p and looking to pick up a new Fury Tri-X this weekend. I am looking to move up to 1440p, perhaps higher, around Black Friday.


As for modern DX11/DX12 games, I am sure there will still be plenty that are taxing on the video card @ 1440p. I know Arkham Knight was still taxing @ 1440p when I tried playing it below 4K.


----------



## Sonikku13

I am currently on a laptop with an A10-4600M with no dedicated graphics card. I have a spare desktop in the basement I can use for gaming too, but since I took out the SSD, it's not useful to me at the moment. It does have an A10-7850K, and I used to pair it with a 290X, so I'm not concerned about potential bottlenecking.

Assuming I buy an SSD, is a Radeon R9 Nano worth it? My timeframe to buy it is between now and Black Friday 2015. I am getting sick of "standard (laptop) settings" in FFXIV, and I would love to max FFXIV out. Either way, whether I buy the Radeon R9 Nano or not, my rig will be overhauled completely with either NVIDIA Pascal or AMD Arctic Islands in 2016.


----------



## nadja92

Quote:


> Originally Posted by *Neon Lights*
> 
> You mean these? All the Fury X reference cards have the same PCB layout, no matter from which manufacturer you buy them they are all made by the same manufacturer (AMD themselves as far as I know).


Sorry to ask again, but does the Fury have the same PCB as the Fury X? So an X waterblock would fit a normal reference Fury?


----------



## Neon Lights

Quote:


> Originally Posted by *nadja92*
> 
> Sorry to ask again, but does the Fury have the same PCB as the Fury X? So an X waterblock would fit a normal reference Fury?


The Furys from Sapphire do, the ASUS Strix doesn't; I am not sure about the XFX ones.


----------



## nadja92

Quote:


> Originally Posted by *Neon Lights*
> 
> The Furys from Sapphire do, the ASUS Strix doesn't; I am not sure about the XFX ones.


That's great as I was looking at a Sapphire Fury Tri-X instead and then slapping on a Aquacomputer waterblock. Thank you. I hope they make the active backplate soon though.


----------



## xer0h0ur

My memory is a bit shot right now, but I believe only the Asus Strix uses a non-reference PCB, so the rest of the cards out there should be using the same PCB as the Fury X. Asus was supposed to be releasing an ROG Matrix Fury, but I haven't seen it yet, so I don't know whether they are finally going to make actual use of that custom PCB design on that card.


----------



## fjordiales

per TPU.

Fury X:

http://www.techpowerup.com/reviews/AMD/R9_Fury_X/4.html



Tri-X:

http://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/3.html



Strix:

http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/4.html


----------



## Thoth420

To crossfire EK Blocked Fury X's for 1440 144hz gaming or not...that is the question. I plan on buying a FreeSync Panel as well. Is there ever going to be FreeSync Xfire support?


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> To crossfire EK Blocked Fury X's for 1440 144hz gaming or not...that is the question. I plan on buying a FreeSync Panel as well. Is there ever going to be FreeSync Xfire support?


FreeSync crossfire works fine?? It's only in DX9 that it doesn't, but AMD doesn't give a crap about DX9 anyhow.


----------



## Ceadderman

Does anybody give two craps for DX9 anymore? Most games are DX11 these days.









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Ceadderman*
> 
> Does anybody give two craps for DX9 anymore? Most games are DX11 these days.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Skyrim ring any bells?


----------



## Scorpion49

I have a ton of DX9 games I like to go back and play, many of my friends who game on PC feel the same way. Not everybody only plays the newest AAA titles.


----------



## Thoth420

Nice, I care about DX9 but those games should be plenty fine with xfire disabled.


----------



## Sickened1

Okay, so I have my build together. Has anyone else had MAJOR issues with FPS and stuttering in W10? Games are unplayable; FurMark will randomly settle on a set FPS and stay there till I close and reopen it. Sometimes it will be 28 fps, sometimes 62, sometimes 98, give or take 5 fps on each.

The GPU says it's at 100% load, and temps don't go much higher than the mid 50s, so I'm a little confused. Anyone have any input on this?


----------



## Ceadderman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Does anybody give two craps for DX9 anymore? Most games are DX11 these days.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Skyrim ring any bells?

Touche.









But let's be honest, I was meaning developers, not gamers. DX11 is being implemented more now, and Skyrim is old enough that anyone with DX11-capable graphics can still play it.

But I sincerely was referring to the developer side of things.









You still get +Rep for your quick wit.









~Ceadder


----------



## Semel

Is it possible that memory overclocking "capacity" has changed? A few days ago I could run firestrike at 580 artifacts free and now I'm limited to 560+
I've been playing some games these days at 1080/550-560

PS As for the core the maximum stable overclock I've managed to get on my unlocked fury is 1080.

PS Sorry guys.;((

I just rechecked my previous benchmark results and it looks like I was mistaken. Memory was overclocked to 570 not 580.

I've just tried running 1080/570 and everything works OK.


----------



## Neon Lights

Quote:


> Originally Posted by *xer0h0ur*
> 
> My memory is a bit shot right now but I believe only the Asus Strix uses a non-reference PCB so the rest of the cards out there should be using the same PCB as the Fury X. Asus was supposed to be releasing an ROG Matrix Fury but I haven't seen it yet to know if they are finally going to make actual use of that custom PCB design on that card or not.


The XFX ones are also supposed to have a custom PCB.

Before buying them and water blocks, I would just compare pictures of the PCB to make sure.


----------



## Medusa666

After long and careful consideration I sold my 295X2 and ordered an ASUS R9 Fury Strix this morning. I have high hopes that this card will perform well at 1440p; excited to see once I get it.







Will be back with a mini-review!

Thanks to the fellow OCN members who gave me opinions about their cards in this thread; it helped out a lot in making my decision.


----------



## Semel

Pardon my asking, but why did you sell it? The 295X2 is a beast of a card, and the Fury Strix won't perform at 295X2 level. Not even close.


----------



## Medusa666

Quote:


> Originally Posted by *Semel*
> 
> Pardon my asking, but why did you sell it? 295x2 is a beast of a card. And fury strix won't perform at 295x2 level.Not even close.


The 295X2 is truly a beast, and I do love that card, but some of the games I enjoy the most don't support crossfire. While the performance is great in games where it works, I prefer steady and predictable performance overall over potentially higher peaks.

Noise is the second reason. I use a Noctua cooler and Noctua fans only, and the Strix is a quieter card, especially at low or idle loads.

I will miss this beauty of a card, but hopefully the new owner will take care of it : )


----------



## Kana-Maru

^Yup that sounds like money was a factor.


----------



## Medusa666

Quote:


> Originally Posted by *Kana-Maru*
> 
> ^Yup that sounds like money was a factor.


Haha, are you being ironic? : )


----------



## newls1

Guys, can you please help me and NOT yell at me for not reading 450 pages here to find this answer!

.... I should have a Sapphire Tri-X Fury coming. What is the OC program you all are using to unlock the voltage and increase core/mem clocks on these cards?? Thanks a million


----------



## p4inkill3r

MSI Afterburner is my choice, but voltage unlock is not available (yet?).


----------



## huzzug

If you are gonna use the Sapphire, I'd use their own utility, TriXX, as it seems to be more compatible with their cards in terms of voltage and overclock limits. This is based on their previous-gen cards, which did not allow me to alter volts or overclock past a certain limit unless I used TriXX.


----------



## fjordiales

Quote:


> Originally Posted by *Medusa666*
> 
> Haha you being ironic? : )


FYI...

http://www.hardocp.com/article/2015/09/28/asus_strix_radeon_r9_fury_dc3_crossfire_at_4k_review#.VgmajH7n_qA


----------



## cnckane

Today I got a similar artifacting/flickering issue on the desktop with my Sapphire Fury, like this one:




My card is 2 weeks old and I haven't had any issues in-game. Simply disabling/re-enabling the card in Device Manager solves it.

I don't think it's a faulty card; it wouldn't be so easy to fix once it happens, and as I said, it happens very rarely and never in-game.


----------



## Jflisk

Quote:


> Originally Posted by *cnckane*
> 
> Today I've got a similar artifacting/flickering issue on desktop with my Sapphire Fury like this one:
> 
> 
> 
> 
> My card is 2 weeks old and I didn't have any issues in-game. Simply enabling/disabling the card from Device Manager solves it.
> 
> I don't think it's a faulty card it wouldn't be so easy to fix once it happens and as I said it happens very rarely and never in-game.


Just changing the screen resolution and back will clear it too. Mine does it as well; I am not even sure if AMD is aware of the problem.


----------



## wdpir32k3

Quote:


> Originally Posted by *Sickened1*
> 
> What about 1440p? I'm at 1080p and looking to pick up a new Fury Tri-X this weekend. I am looking to move up to 1440p, perhaps higher, around Black Friday.


I just got my Fury last week, and 1440p is nothing to it; it also never gets hot.


----------



## Semel

Quote:


> Originally Posted by *cnckane*
> 
> Today I've got a similar artifacting/flickering issue on desktop with my Sapphire Fury like this one:
> 
> 
> 
> .


Same here. It happens sometimes when I browse the internet; other than that the card works perfectly, and changing resolution helps. Today I disabled everything related to EnableUlps in the registry, just in case.


----------



## Sickened1

Do any other owners have issues with moving windows on the desktop being laggy, and just poor FPS in general?

I have 15.7.1 installed with driver package 15.20.1062.1004, which is the newest non-beta. In the FurMark 1080p stress test I will either get ~30 fps or ~95; which one I get is completely random at launch. Playing games is the same way. Anyone else have these types of problems?


----------



## DerkaDerka

Didn't want to start a whole new thread on this so figured I'd ask in here. Have they fixed the noise issues with these cards yet or is it still luck of the draw? I saw an article that claimed AMD knew what the problem was and had found a fix but didn't know if they've implemented it yet?

I would like to stick with AMD as I don't really care for Nvidia, but if they still haven't definitively fixed the noise issues then I will probably end up getting a 980TI until next years cards come out.


----------



## cnckane

I'm not sure about the pump noise (I think if you get one of the new ones it won't be an issue), but my R9 Fury has very loud coil noise at high load (GTA V, [email protected] FPS).


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> Is it possible that memory overclocking "capacity" has changed? A few days ago I could run firestrike at 580 artifacts free and now I'm limited to 560+
> I've been playing some games these days at 1080/550-560
> 
> PS As for the core the maximum stable overclock I've managed to get on my unlocked fury is 1080.
> 
> PS Sorry guys.;((
> 
> I just rechecked my previous benchmark results and it looks like I was mistaken. Memory was overclocked to 570 not 580.
> 
> I've just tried running 1080/570 and everything works OK.


Did you by chance make a driver version change? One driver to the next does affect your overclocking.


----------



## Thoth420

Quote:


> Originally Posted by *DerkaDerka*
> 
> Didn't want to start a whole new thread on this so figured I'd ask in here. Have they fixed the noise issues with these cards yet or is it still luck of the draw? I saw an article that claimed AMD knew what the problem was and had found a fix but didn't know if they've implemented it yet?
> 
> I would like to stick with AMD as I don't really care for Nvidia, but if they still haven't definitively fixed the noise issues then I will probably end up getting a 980TI until next years cards come out.


Mine was from release day US. I just opted to water block it and call it a day. Pump noise here...coil whine can be caused by too many things to blame the GPU solely.


----------



## nickcnse

Just threw an EK waterblock with the nickel backplate on my r9 fury x and it is amazingly cool. No matter what I do it only ever runs at 39C. Average room temp is 26C.


----------



## p4inkill3r

Quote:


> Originally Posted by *nickcnse*
> 
> Just threw an EK waterblock with the nickel backplate on my r9 fury x and it is amazingly cool. No matter what I do it only ever runs at 39C. Average room temp is 26C.


Firestrike?


----------



## Jflisk

Furmark ?


----------



## Mega Man

Beer?


----------



## Arizonian

Quote:


> Originally Posted by *p4inkill3r*
> 
> Firestrike?


Quote:


> Originally Posted by *Jflisk*
> 
> Furmark ?


Quote:


> Originally Posted by *Mega Man*
> 
> Beer?


The exact order I test a new GPU


----------



## HagbardCeline

Finally was able to get one of these things. Hopefully it's one of the new ones. Ordered it from B&H but it shipped straight from the distributor in California (which was fine with me, since I'm on the West Coast).


----------



## p4inkill3r

Welcome aboard the good ship Lollipop.


----------



## Medusa666

Quote:


> Originally Posted by *HagbardCeline*
> 
> Finally was able to get one of these things. Hopefully it's one of the new ones. Ordered it from B&H but it shipped straight from the distributor in California (which was fine with me, since I'm on the West Coast).


Hi man, welcome to the club.

Can you take some shots of the card once it's unboxed? I think Sapphire made some changes to the backplate, and it would be nice to confirm.


----------



## HagbardCeline

Quote:


> Originally Posted by *Medusa666*
> 
> Hi man, welcome to the club.
> 
> Can you take some shots of the card once unboxed I think Sapphire made some changes to the backplate and it would be nice to confirm.


Is this the angle you want?


----------



## Medusa666

Quote:


> Originally Posted by *HagbardCeline*
> 
> Is this the angle you want?


Perfect, thank you!


----------



## Medusa666

Finally got my *ASUS Fury Strix*









I first bought two Fury X but wasn't satisfied with the noise levels, so I have high hopes for this card now.





Installed and ready for some benching, the chicken gives a 10% performance boost


----------



## battleaxe

Quote:


> Originally Posted by *Medusa666*
> 
> Finally got my *ASUS Fury Strix*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I first bought two Fury X but wasn't satisfied with the noise levels, so I have high hopes for this card now.
> 
> 
> 
> 
> 
> Installed and ready for some benching, the chicken gives a 10% performance boost


Jeez, the backplate is almost touching the CPU cooler...

Chickens are always a big help. Especially round dinner time!


----------



## Mega Man

Another reason I prefer water over big air


----------



## Medusa666

Quote:


> Originally Posted by *battleaxe*
> 
> Jeez, the backplate is almost touching the CPU cooler...
> 
> Chicken's are always a big help. Especially round dinner time!


Yeah, there is about 2-3 mm between the Noctua fan clips and the backplate. The backplate is not conductive though, and the exposed GPU isn't close to the clip at all, but who knows.


----------



## GMcDougal

I have a Sapphire Fury Tri-X incoming. Sold my 980 to go to this. I'm hoping to find the smoothness I had with my 290X. I had two 970s and a 980, and none were as smooth as my 290X on a 120Hz monitor.


----------



## swiftypoison

Quote:


> Originally Posted by *GMcDougal*
> 
> I have a Sapphire Fury Tri-X in coming. Sold my 980 to go to this. Im hoping to find the smoothness i had with my 290x. I had two 970's and a 980 and all were not as smooth as my 290x on a 120hz monitor.


How much did you sell your 980 for? I am thinking of selling mine and picking up a Fury X.


----------



## Sickened1

After getting a new motherboard (either the old one was bad or it had driver-level issues with the AMD card), I can truly see this card shine now. Wow, glad I did that instead of just going green.


----------



## GMcDougal

$395 shipped. I paid $380 for it 3 months ago.


----------



## Dominican

After reading some of your comments, I am moving from a SAPPHIRE VAPOR-X Radeon R9 290 to an ASUS R9 FURY. Need it for 4K.


----------



## Cool Mike

Does the Sapphire version now have a Sapphire label on the fan?
Is the word "Sapphire" on the front plate also?

Thanks


----------



## Medusa666

Quote:


> Originally Posted by *Cool Mike*
> 
> Does the Sapphire version now have a Sapphire label on the fan?
> Is the word "Sapphire" on the front plate also?
> 
> Thanks


Yeah, indeed it does.


----------



## Randomdude

I'd like to join the club! After (north of) a month of gathering the parts and other stumbles, the PC finally booted into the BIOS for the first time today. Haven't installed an OS yet, but I'm definitely itching.

The best part to me is the contrast between the systems. Went from 2x1GB RAM, an E6300, a stock cooler, a 1950 Pro, an nForce 630i mobo, and some 120 mV-ripple 350W PSU to 16GB of Dominator Platinums at 2133 CAS 9, a 4790K, an H110i, a Fury X, a GB Z97X G1 WiFi-BK, a Seasonic 760 Platinum, a 240GB SanDisk Extreme Pro, and a Stryker case xD


----------



## Medusa666

Quote:


> Originally Posted by *Randomdude*
> 
> I'd like to join the club! After (north of) a month of getting the parts and other stumbles the PC finally booted into bios for the first time today. Haven't installed an OS yet, but I'm definitely itching.
> 
> Best part to me is the contrast between the systems. Went from a 2x1GB, E6300, stock cooler, 1950 Pro, nforce 630i mobo, some 120Mv ripple 350w PSU to 16GB Dom platinums 2133 cas 9, 4790k, h110i, fury x, gb z97x g1 wifi bk, seasonic 760 plat, sandisk 240 extreme pro, stryker case xD


Congratulations on your shiny new system, and welcome to the club : )


----------



## Randomdude

Thank you! I'll put up pictures when I have nothing better to do.


----------



## nickcnse

Here's Firestrike; I ended up hitting 41C after multiple runs: http://www.3dmark.com/fs/6123057

Only have the core overclocked; need to get the memory OC'd too, as well as download FurMark lol.

Edit: FurMark score: 6292 points (104 FPS, 60000 ms)

Didn't see a way to link it officially. FurMark hit 45C, but my fans didn't even start moving on my rads because my CPU was cruising at around 32C.


----------



## Semel

*nickcnse*

noice







Could you please run Firestrike (not Extreme/Ultra)?

Thanx.

Quote:


> Furmark score:6292 points (104 FPS, 60000 ms)


What preset?

PS: Btw, what was your driver tessellation setting?


----------



## Thoth420

Quote:


> Originally Posted by *Randomdude*
> 
> I'd like to join the club! After (north of) a month of getting the parts and other stumbles the PC finally booted into bios for the first time today. Haven't installed an OS yet, but I'm definitely itching.
> 
> Best part to me is the contrast between the systems. Went from a 2x1GB, E6300, stock cooler, 1950 Pro, nforce 630i mobo, some 120Mv ripple 350w PSU to 16GB Dom platinums 2133 cas 9, 4790k, h110i, fury x, gb z97x g1 wifi bk, seasonic 760 plat, sandisk 240 extreme pro, stryker case xD


Major upgrade! Welcome and enjoy the new build!


----------



## Dominican

I just got it.


----------



## Medusa666

Quote:


> Originally Posted by *Dominican*
> 
> I just got it.


Congratulations, it is an amazing card!


----------



## GMcDougal

Just got my Sapphire Fury Tri-X. The cooler on this thing does a ridiculous job of keeping it cool. Loving it so far.


----------



## Semel

Hey guys.
I unlocked my Fury to 3840 stream processors and OCed it to 1080/570. I use a custom Afterburner fan profile when I play, and the card is perfectly stable in benchmarks and games, even Witcher 3 (the GPU hovers around 45-50C at 45-50% fan speed, and it is not loud at all).

However, I was curious and decided to disable my custom Afterburner fan profile and let the card's BIOS control the fan speed.

I launched Witcher 3 and started playing it. When the GPU reached 65C, the AMD driver crashed.

So I then decided to disable the overclock and test again without the Afterburner profile active. As expected it got to 74-75C with the fans at ~27-28+%. So the card was maintaining its default 74-75C working temperature using as little fan power as possible to keep it really quiet.

I wonder why the driver crashed when I tested the OCed Fury without "proper" cooling. I mean, it didn't even reach 75C; it crashed at 65C. Did the VRMs get real HOT? =)

Any idea what happened there?
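For what it's worth, a custom Afterburner profile like the one described is just a temperature-to-fan-duty curve interpolated between breakpoints. A minimal sketch of the idea in Python (the breakpoints here are illustrative placeholders, not Semel's actual settings):

```python
# Piecewise-linear fan curve: (GPU temperature C, fan duty %) breakpoints.
# These points are made up for illustration -- tune them to your own card.
CURVE = [(30, 20), (50, 50), (65, 75), (75, 100)]

def fan_duty(temp_c):
    """Return fan duty (%) for a GPU temperature via linear interpolation."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # below the curve: minimum duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # above the curve: pegged at max
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:      # interpolate within this segment
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

At 65C this illustrative curve is already at 75% duty, which is roughly why a custom profile holds the card so much cooler than the stock behavior that sat at 75C on ~28% fan.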


----------



## p4inkill3r

The card was probably hunting for more voltage.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> Hmm..interesting..
> 
> I unlocked my fury to 3840 stream processors and OCed it to 1080/570. I use a custom afterburner fan profile when I play and the card is stable.(gpu hovers around 45-50C at 45-50% fan speed and it is not loud at all).
> 
> However, I was curious and decided to disable my custom afterburner fan profile and let the card auto control fans.
> 
> I launched witcher 3 and started playing it.When GPU reached 65C driver crashed ..
> 
> So I decided then to disable overclock and test it again without afterburner profile active. As expected it got to 74-75C and fan was at ~27-30%. So the card was maintaining a working 74-75C using as little fan power as possible to keep it real quite.
> 
> I wonder why the driver crashed when I tested OCed fury without "proper" cooling.. I mean it didn't even reach 75 temp and crashed at 65C. VRMs got real HOT? =)


A card will always be more stable at a given lower temperature. The card needs less voltage at a lower temp for the same clock speed. Your clock is stable at 50C but is not at 70C.


----------



## Semel

*battleaxe*
Quote:


> The card will need less volts at lower temp for the same clock speed


Ah..I see. I didn't know that


----------



## nickcnse

Here's the firestrike score, non-extreme, non-ultra: http://www.3dmark.com/fs/6129402


----------



## Semel

Quote:


> Originally Posted by *nickcnse*
> 
> Here's the firestrike score, non-extreme, non-ultra: http://www.3dmark.com/fs/6129402


Thank you. My graphics score is pretty close to yours









http://www.3dmark.com/3dm/8729351 (3840 stream processors unlocked, 1080/570)

For some reason 3dmark doesn't detect my core\memory speed. ;(


----------



## royfrosty

Some benchies on my rig


Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/royfrosty/media/20151004_011407_zpsdbjov6hw.jpg.html





Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/royfrosty/media/GPU 2_zpsjfis4zoc.png.html



GPU 2 is my main GPU and GPU 1 is my slave. Ambient: 28°C.


Spoiler: Warning: Spoiler!



http://s995.photobucket.com/user/royfrosty/media/GPU_zpsuvvnveqo.png.html


----------



## aznguyen316

nice build!! I just re-did my loop some to better fit my mITX case. You have nice temps too. I'm going for a near silent build, but here's my best Fury unlocked benchmark from a few weeks back on air.

http://www.3dmark.com/fs/5923385

I haven't re-run with it under water to check temps though.


----------



## Dominican

How far has anyone been able to OC the ASUS one so far?


----------



## xer0h0ur

The Strix card actually runs slightly lower voltage than the other Fury cards. You're not going to OC better without the physical modification of the card, which a couple of people have done.

It really doesn't make sense, either, for them to have created a non-reference design only to gimp it. We're still waiting on the ASUS ROG Matrix version of the Fury, which is due to come out; that will presumably finally take advantage of the non-reference design. Or so it's believed.


----------



## fjordiales

Quote:


> Originally Posted by *Dominican*
> 
> how far has anyone been able to oc asus one so far ??


1080/500 in CrossFire in Witcher 3, 1040-1050/500 in TriFire. We're stuck at 1.69v, compared to 1.8-1.212v on the Tri-X.


----------



## wdpir32k3

Quote:


> Originally Posted by *cnckane*
> 
> I'm not sure about the pump noise (I think if you get one of the new ones it won't be an issue), but my R9 Fury has very loud coil noise at high load (GTA V, [email protected] FPS).


what fury are you running?


----------



## wdpir32k3

Got my fury in last week


----------



## Semel

Quote:


> Originally Posted by *fjordiales*
> 
> 1080/500 in crossfire witcher 3. 1040-1050/500 in trifire. We're stuck at 1.69v compared to 1.8-1.212v on tri-x.


Well, you get 1080 at 1.69v and I have the same 1080 at 1.2v. 1090 is not stable in some games... =)

Btw, Unwinder still hasn't received a Nano ;(( Unbelievable. I wonder, though: when voltage control becomes available, what will the safe voltage increase be for our Furys? I reckon 1.25v, not higher, so as to keep temps in check.


----------



## GMcDougal

I'm just trying to figure out two things: why did AMD lie to us, and why aren't they allowing us to modify these cards? I watched the reveal, and phrases like "overclocker's dream" keep popping into my head. I just got this Fury and love it; I think it's a great card, but I just don't get it.


----------



## xer0h0ur

BTW, the "overclocker's dream" comment was made about the Fury X, but the same disappointment still applies. Without legitimate voltage control we still don't have a clear picture.


----------



## Thoth420

Ahhh marketing puffery...my favorite is still "gaming".


----------



## MrKoala

You forgot the professional bucket at $5000... Don't get me started on the certified industrial bucket and the complete bucketing solution with an extra ladle showing ground-breaking poor ergonomics that once again changes everything.


----------



## Thoth420

Quote:


> Originally Posted by *MrKoala*
> 
> You forgot the professional bucket at $5000... Don't get me started on the certified industrial bucket and the complete bucketing solution with an extra ladle showing ground-breaking poor ergonomics that once again changes everything.


Indeed I did


----------



## Randomdude

First impressions of this new PC: it's doing everything I want it to do without breaking a sweat. Very quiet, and all the functionality features are in working order. The Stryker case is a looker for sure, but for my next build I will go with a more streamlined case, something classy in black with a bigger side window. I wanted to buy a sound card and put it off for later, and now I'm against it; sound is good as-is with the motherboard alone.

In all honesty, I don't think I needed a GPU as powerful as the Fury X. In hindsight, I should've gone with a top-tier Gigabyte X99 board, a REV or X99-WS, a 5960X, and a weaker GPU. The best part of my build, in my opinion, is the motherboard. This Gigabyte Z97X-Gaming G1 WiFi-BK is really, really killing it with features and stability. I can't imagine what the enthusiast boards are like.


----------



## fewness

Managed to stuff my Nano into a Jonsbo V2, which doesn't have a graphics card slot to begin with. I basically cut a hole in its back; not artisan quality, but it works well. The case is 170 mm (W) x 200 mm (H) x 200 mm (D) = 6.8 L. I'm going to have the best fps/liter...
















Original V2 back, for comparison
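The 6.8 L figure checks out; a quick sanity check of the fps-per-liter metric (the fps number below is a made-up placeholder, not a benchmark result):

```python
# Case volume from the quoted Jonsbo V2 dimensions (mm -> liters).
width_mm, height_mm, depth_mm = 170, 200, 200
liters = width_mm * height_mm * depth_mm / 1_000_000  # 1 L = 1,000,000 mm^3

# Hypothetical benchmark figure, just to show the metric.
fps = 60
print(f"{liters:.1f} L, {fps / liters:.1f} fps/L")  # prints "6.8 L, 8.8 fps/L"
```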


----------



## Randomdude

That looks absolutely sick.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fewness*
> 
> managed to stuff my Nano into Jonsbo V2, which doesn't have a graphics card slot to begin with. I basically cut a hole at its back, not artisan quality but works well. The case is 170 mm (W) x 200 mm (H) x 200 mm (D) = 6.8 L. I'm going to have the best fps/liter.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Original V2 back, for compare


Love it dude, ready to crack vtec for sure!


----------



## Jflisk

Need someone to conduct an experiment: Battlefield Hardline with VSR turned on, running Mantle or DX11 (it seems to do it quicker with Mantle) at 2550x1440. Play for a while and see if your system black-screen crashes (intermittently) with 15.9.1 Beta. It doesn't crash with 15.7.1 Beta. Thanks.


----------



## Alastair

So I just got my EKWB kit for my Furys, and while taking off the Tri-X cooler I noticed something a bit worrying. I'm not sure if it's the interposer or just a sticky tape that Sapphire put there, but whatever it is, the thermal paste has squeezed underneath it. Any thoughts on this? I'm using the toothpick just to gently point the thing out. I think it's a tape or something used to protect the HBM stacks from the heatsink? Am I right? I just want to make sure this chip isn't DOA before I go any further.


----------



## cnckane

No idea what that is, but Techpowerup also mentioned that it's not on the Fury X: TPU review - page 4

"_In the second picture, you can see some kind of transparent sticky foil that covers the interposer and has cutouts for the GPU die and the HBM stacks. I'm not sure what it does, but it's not present on the AMD Fury X or the ASUS Fury Strix._"


----------



## Alastair

Quote:


> Originally Posted by *cnckane*
> 
> No ide what's that but Techpowerup also mentioned that it's not on the Fury X: TPU review - page 4
> 
> "_In the second picture, you can see some kind of transparent sticky foil that covers the interposer and has cutouts for the GPU die and the HBM stacks. I'm not sure what it does, but it's not present on the AMD Fury X or the ASUS Fury Strix._"


So then the question is: can I safely remove it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> Need someone to conduct an experiment. Battlefield hardline VSR turned on running mantle or dx11 seems to do it quicker with mantle at 2550 X 1440 play for awhile see if you system black screen crashes (intermittent) with 15.9.1 Beta. Doesn't crash with 15.7.1 B. Thanks


Any memory leaks on 15.9 with you guys?

The 390 club is having a lot...


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Any memory leeks on 15.9 with you guys?
> 
> 390 club is having a lot.....


15.9.1 Beta is supposed to fix a memory leak problem, but the driver numbers have been the same since 15.8 Beta. 15.7.1 uses a different driver but introduces an intermittent texture problem on Windows 10 where the screen shrinks and shows weird lines in it; changing the resolution fixes it. It also doesn't happen as frequently with ULPS off. So if "memory leak" means the screen going black every so often when playing games, then yes, I have one. See my last post: it hasn't happened yet with VSR off at 1920x1080. Turn VSR on at 2550x1440 and the problem starts, on anything from 15.8 Beta up to 15.9.1 Beta (same driver number throughout). All the problems listed have been reported on the AMD forum and through their help page.


----------



## Alastair

What's the best way to apply TIM to the beast? Small dots for the HBM stacks, and then what for this massive die?


----------



## Randomdude

Hmm. Weird problem. I left my PC for a while and when I came back, the monitor was essentially in "sleep", but it was flickering between black and grey-ish as if it was about to turn on every few seconds. The GPU was at the same time going between the green light on the indicator which I assume is the zero core power state and suddenly ramping up to full red and back again in rhythm with the screen transitions. I don't believe this is normal behavior. I got into desktop and restarted it. Usually the restart goes smooth, this time the computer shut off as if it went into hibernation before it powered on on its own. Any ideas what this is?

There also seems to be a faint, high-pitched noise coming from my monitor whenever I hover my mouse over something clickable. What gives?

I'm really at a loss. Disabled power saving modes for the time being.

Specifications are the following: Seasonic 760xp2, 4790k, Fury X, Dom Plats 4x4 2133, Z97X-Gaming G1, 240 Sandisk Ext Pro

The Zero Core state I've never used before and I'm not too sure about it. The PSU supports C6/C7 Haswell power states if that's of any help.


----------



## Tivan

Memory leak means that the VRAM fills up indefinitely (in this case, VRAM usage would go up slightly when resizing some windows and never go down). That issue is supposedly fixed with 15.9.1.

I had the issue with 15.9 but not anymore with 15.9.1, so that's something.


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> What's the best way to apply TIM to the beast? Small dots for HBM II machine imagine and then what for this massive die?


I think I would do the cross-star pattern on the die on these to cover it: a + and an X put together.


----------



## Sonikku13

I am tempted to buy a Radeon R9 Nano. I was wondering at what price would it be worth grabbing one at? I was thinking $400 on Black Friday.


----------



## p4inkill3r

Quote:


> Originally Posted by *Sonikku13*
> 
> I am tempted to buy a Radeon R9 Nano. I was wondering at what price would it be worth grabbing one at? I was thinking $400 on Black Friday.


I highly doubt you'll see Nano @ $400. I'd grab one at $500-$550, however.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> I highly doubt you'll see Nano @ $400. I'd grab one at $500-$550, however.


I'm selling my spare XFX Fury X for $500. It's never been used, so I can't comment on pump noise, and my rig is in the shop getting its loop done and some other stuff, so I only have my Sager at the moment. No way to test her, and local sales have been a bust because I live in an area where people really don't buy flagship GPUs.


----------



## p4inkill3r

I bet you could unload it at that price on the OCN marketplace.


----------



## xer0h0ur

You don't make your own loops? -100 e-cred. LOL just playin. If I wasn't waiting for the Arctic Islands generation I would jump on that offer.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> I bet you could unload it at that price on the OCN marketplace.


That's probably the plan soon.
Quote:


> Originally Posted by *xer0h0ur*
> 
> You don't make your own loops? -100 e-cred. LOL just playin. If I wasn't waiting for the Arctic Islands generation I would jump on that offer.


If I had any experience I would give it a try, but this is a lot of expensive hardware, and I learn more by reverse engineering for some odd reason. I feel like I would definitely screw something up.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> That's probably the plan soon.
> If I had any experience I would give it a try but this is alot of expensive hardware and I learn more by reverse engineering for some odd reason. I feel like I would def screw something up.


I took the risk with about $5000 worth of hardware and took it as a learning experience. I was crappin myself though on the 2nd loop re-vamp since I had sprung a leak from a rotary fitting which was directly above the power supply. Just lucky that there is a metal shroud surrounding the PSU or else it would have gotten real interesting.


----------



## swiftypoison

Quote:


> Originally Posted by *Thoth420*
> 
> I'm selling my spare XFX Fury X for 500. Never used so can't comment on pump noise and my rig is in the shop getting its loop done and some other stuffs so I only have my Sager at the moment. No way to test her and local sales have been a bust because I live in an area where people really don't buy flagship gpus.


Put it on Amazon and I'll bite. I have an Amazon credit card I need to use.


----------



## Alastair

Funny thing, this: took apart my second Sapphire Fury, and there's no sticky tape stuff under the die.

DAMN this die is such a work of art. I could stare at it all day.













MSI HD6850 power edition with Fury. Barts and Fiji.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> I took the risk with about $5000 worth of hardware and took it as a learning experience. I was crappin myself though on the 2nd loop re-vamp since I had sprung a leak from a rotary fitting which was directly above the power supply. Just lucky that there is a metal shroud surrounding the PSU or else it would have gotten real interesting.


I have heard rotary fittings are the most prone to leaks. Any truth to that, or was it just coincidence that that was the type in your case?


----------



## Jflisk

Quote:


> Originally Posted by *Thoth420*
> 
> I have heard rotary fittings are most prone to leaks. Any truth to that or just coincidence that was the type in your case?


I have heard stories of rotary fittings leaking too, but I have used Bitspower rotary fittings numerous times with no problems.


----------



## Thoth420

Quote:


> Originally Posted by *Jflisk*
> 
> I have heard stories of rotary fittings leaking also. I have used bits power rotary fittings with no problems numerous times.


Thanks for the feedback.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I have heard rotary fittings are most prone to leaks. Any truth to that or just coincidence that was the type in your case?


Rotary fittings are the only type of fitting I have ever had leak on me. Odd thing is that I have several of those same rotary fittings in use with no issue. Must have just been one bad fitting out of that lot.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Rotary fittings are the only type of fitting I have ever had leak on me. Odd thing is that I have several of those same rotary fittings in use with no issue. Must have just been one bad fitting out of that lot.


Cheers for the help with some real user feedback!


----------



## ht_addict

New to the club with a Sapphire Fury X. Huge thread, with lots of reading. Just wondering if anyone in the thread has repasted with Coollaboratory Liquid Ultra?


----------



## Randomdude

Quote:


> Originally Posted by *ht_addict*
> 
> New to the club with a Sapphire FuryX. Huge thread, with lots of reading. Just wondering if anyone in the thread has repasted with Coolaboratory Liquid Ultra?


That paste is conductive, I believe, and I wouldn't use it on a GPU. I could be wrong; someone else chime in if I am.

Welcome to the club! Hope you like the card.


----------



## p4inkill3r

Quote:


> Originally Posted by *ht_addict*
> 
> New to the club with a Sapphire FuryX. Huge thread, with lots of reading. Just wondering if anyone in the thread has repasted with Coolaboratory Liquid Ultra?


Temps are so great that I haven't even considered it, tbh.


----------



## xer0h0ur

Isn't the problem with that Liquid Ultra that you need to get the core up to like 90C or somewhere thereabouts before the stuff melts and becomes effective? Other than the obvious fact that it's electrically conductive.

Edit: Never mind, that only applies to the Liquid MetalPad they sell. The other stuff, like CLU and CLP, has a much lower melting point at which it becomes effective.


----------



## WheelZ0713

So I haven't dropped in on this thread in a while. Decided to pop by and do some catching up, and from what I am reading, people are playing with voltages!?

Did someone finally unlock it? This would make me happy in my pants!


----------



## p4inkill3r

Quote:


> Originally Posted by *WheelZ0713*
> 
> So i haven't dropped in on this thread in a while. Decided to pop by and do some catching up and from what i am reading people are playing with Voltages!?
> 
> Did someone finally unlock it? This would make me happy in my pants!


Nope.


----------



## WheelZ0713

Quote:


> Originally Posted by *p4inkill3r*
> 
> Nope.


BAH!!!!

I got uncomfortably excited.


----------



## ht_addict

Quote:


> Originally Posted by *Randomdude*
> 
> That paste is conductive I believe and I wouldn't use it on a GPU. I could be wrong, someone else chime in if I am.
> 
> Welcome to the club! Hope you like the card.


I used it on my 7970M in my Alienware. Worked like a charm; you just have to make sure you spread it carefully.


----------



## rdr09

who needs skylake?

http://www.overclock.net/t/1576152/hardocp-r9-fury-x-crossfire-vs-gtx-980-ti-sli-vs-titan-x-sli-4k-resolution

Article is superbly done . . .

for noobs.


----------



## Scorpion49

So I happened to snag one of the XFX R9 Fury cards to test out; they seem to have a slightly different power delivery system, so I was hoping for something with less coil whine than either of my reference-PCB Sapphire cards. Nope. This card is now the coil whine champion, louder than both of the others put together. I couldn't even tolerate a single run of 3DMark before I removed it from my system and put it back in the box.


----------



## Ceadderman

Lock it in another room for 24 hrs of 3DMark. It should help fix that. If not, it may be that your GPU doesn't play well with your PSU.

~Ceadder


----------



## Vlada011

Quote:


> Originally Posted by *Alastair*
> 
> Funny thing this. Took apart my second Sapphire Fury. And no sticky tape stuff under the die.
> 
> DAMN this die is such a work of art. I could stare at it all day.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI HD6850 power edition with Fury. Barts and Fiji.


Yes, the Fury X size is perfect for mATX boards.
I can't think of a better configuration for smaller rigs than Fury X CF with waterblocks; they become single-slot cards.
Even a Sound Blaster Z is probably the same thickness as a Fury X with an EKWB waterblock.


----------



## Scorpion49

Quote:


> Originally Posted by *Ceadderman*
> 
> Lock it into anot her room for 24hrs of 3DMark. It should help fix that. If not it may be that your GPU doesn't play well with your PSU.
> 
> ~Ceadder


Nope, not doing it. I've had enough, man. I tried, I really did. I don't give a crap about trying to work it out of the card; it's going back, and I'm done with Fiji forever. This is the third card with the same issues. I have 5 systems at my disposal with all different makes and models of power supply and motherboard, and it doesn't make a lick of difference which one I put them in. The VRM design on these cards is garbage and lends itself to excessive whine, and I won't tolerate it just to support AMD. Maybe I'll get a 390X; performance is close enough, and it won't have to be Nvidia.


----------



## Ceadderman

The above pic explains a lot about why you cannot simply drop a GPU cooler on these new cards. Look at the size of that die.









~Ceadder


----------



## xer0h0ur

Quote:


> Originally Posted by *Scorpion49*
> 
> Nope, not doing it. I've had enough man, I tried. I really did. I don't give a crap about trying to work it out of the card, its going back and I'm done with Fiji forever. This is the third card with the same issues. I have 5 systems at my disposal with all different makes and models of power supply and motherboard, it doesn't make a lick of difference which one I put them in. The VRM design on these cards is garbage and lends itself to excessive whine, and I won't tolerate it just to support AMD. Maybe I'll get a 390X, performance is close enough and it won't have to be Nvidia.


I don't know what to tell you man. Either you have the worst luck with getting whiny cards or there is something else causing that coil whine for you like dirty power or the PSU not cooperating with you.


----------



## Scorpion49

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know what to tell you man. Either you have the worst luck with getting whiny cards or there is something else causing that coil whine for you like dirty power or the PSU not cooperating with you.


I think the few people here that don't have whine are the lucky ones.


----------



## xer0h0ur

Well I can't blame you if you're wiping your hands of it. You gave it several tries.


----------



## GMcDougal

This Sapphire is 99% coil-whine free. The only time I hear any is when exiting the Valley benchmark. Gaming, running 3DMark 2013, and the Valley benchmark itself produce zero coil whine.


----------



## rv8000

Not to be a jerk, but trying to escape coil whine is almost impossible in the high-end and enthusiast tiers of GPUs. My 780s, 670s, 7950, 7970, 290s, 290Xs, Fury/Fury X, 970s, and single 980 have all had some form of coil whine. Sure, some are worse than others, but as I've explained in this thread and one or two others numerous times, there are WAY too many factors to solely blame one thing or another.

Continuously buying cards in the hope of getting a brand or make without coil whine isn't a good idea. If it really bothers someone that much, take the time to research why the phenomenon occurs and see which factors you can change to help reduce the issue: sound proofing; checking for dirty power at both the PSU and the house; making sure the apartment/house is properly grounded; keeping large magnets away from your PC; learning about the quality and make of the chokes specific GPU models come with; and learning how heat and voltage affect the frequency and pitch of the noise, so you can find optimal voltages within thermal tolerances that help reduce it; and so on...

I personally got really hung up over coil whine on one of my cards (can't remember specifically which one) and did ridiculous things. I even found that the one socket in the whole damn house that wasn't grounded properly was the one my PC was running from, and, even worse, that when the front foundation was exposed for waterproofing, the workers never replaced the grounding rod for the entire house after digging out the old one. Once I really invested myself in understanding why it was happening, it became clear there are so many things you cannot control that can influence coil whine. Long story short, don't burn yourself out over one thing.


----------



## Mega Man

@rv8000 You're the first I've heard in a while state the truth; alas, it's like beating your head against a wall. Most don't seem to care about the truth; they are of the mentality "it makes noise, make it stop."

IMO coil whine will only get worse as we make cards that can push out more and more pixels.


----------



## BaddParrot

I have had my Sapphire Fury X about 12 weeks now. I did hear it "Ticking" once. That sound went away as fast as it started.
No matter how hard I push this pc, I have never heard the Fury X card/pump/fan over the Corsair 760D case fans or the H100i fans/pump.

I think my Sapphire Fury X was the 2nd batch shipped out. Reading these threads makes me feel lucky I guess.


----------



## xer0h0ur

Quote:


> Originally Posted by *BaddParrot*
> 
> I have had my Sapphire Fury X about 12 weeks now. I did hear it "Ticking" once. That sound went away as fast as it started.
> No matter how hard I push this pc, I have never heard the Fury X card/pump/fan over the Corsair 760D case fans or the H100i fans/pump.
> 
> I think my Sapphire Fury X was the 2nd batch shipped out. Reading these threads makes me feel lucky I guess.


Yeah man. The coil whine on my 295X2 and 290X is only audible, and barely at that, when I reduce the speed of my radiator fans to a near stall. I do wonder, though, if my EK waterblocks are providing some measure of insulation from the coil whine.


----------



## Kana-Maru

I haven't heard any coil whine or any noise from my Asus R9 Fury X. I turned off everything [fans etc] just to see if I could hear something. Nothing.


----------



## Alastair

Guys, after installing my Furys my screen is so dark, like the contrast has been turned way down or something. But my settings have not changed and the display driver is at defaults. Is there any help for this?


----------



## Vlada011

That pump on the Fury X is a patent stolen by an Asian company from a Western company; I would throw it in the garbage and install a waterblock.
At the same time I would get rid of those disgusting SLEEVED tubes.


----------



## Alastair

Well, no coil whine on either of my Furys. Welp. Guess I'm one of the lucky ones.


----------



## p4inkill3r

Quote:


> Originally Posted by *Vlada011*
> 
> That pump on the Fury X is a patent stolen by an Asian company from a Western company; I would throw it in the garbage and install a waterblock.
> At the same time I would get rid of those disgusting SLEEVED tubes.


Why even buy a Fury X in that case?


----------



## Vlada011

The Fury X is good with an EK waterblock.
Where is the Fury X with an air cooler?


----------



## Alastair

So guys, 3 problems.

1. The EK FC Bridge has a terribly placed out-port; it gets in the way of the power connectors. Solution?


2. What sort of scores should I expect my two Furys to post in standard Fire Strike 1.1 (not Ultra or Extreme) with my 8370? I'm getting around 14,500 for two Sapphire Furys at stock, and that seems a bit low, does it not?

3. Ever since I plugged in my new Furys, the display has been dark, like the gamma or something is turned down. My screen settings haven't changed and my driver settings are at default. I went from using DVI with my 6850s to HDMI on my Furys (the screen has both plugs). Funnily enough, my screen also seemed dim when I tried HDMI on my 6850s. Any solution to that?


----------



## Gumbi

Quote:


> Originally Posted by *Scorpion49*
> 
> So I happened to snag one of the XFX R9 Fury cards to test out, they seem to have a slightly different power delivery system so I was hoping for something with less coil whine than either of my reference PCB Sapphire cards. Nope. This card is now the coil whine champion, louder than both of the others put together. I couldn't even tolerate a single run of 3Dmark before I removed it from my system and put it back in the box.


It's clear that there is an issue on your end considering all the configurations you've tried and still have coil whine.

These cards might have a tendency towards it, but not necessarily a propensity.


----------



## Scorpion49

Quote:


> Originally Posted by *Gumbi*
> 
> It's clear that there is an issue on your end considering all the configurations you've tried and still have coil whine.
> 
> These cards might have a tendency towards it, but not necessarily a propensity.


It's clear that some people can't accept that AMD has a poor design. I had thought the XFX had a slightly different VRM section, which it does not, so that is my mistake; I would never have bought it if I had known it was identical to the Sapphire cards. I have three GTX 670s, an R9 270, a GTX 580, two reference R9 290s and a pair of R9 380s, and NONE of those have any whine whatsoever. Zip. Zero. Zilch. In any of my machines.

If you want to hear it, this is a run of 3DMark (mind you, this is after 16 hours of running Valley to try to burn out the whine, and the phone does not pick up the volume of the sound very well):






And here is me trying to play Armored Warfare at ~55-70 FPS:


----------



## p4inkill3r

yeah, that's loud.


----------



## Jflisk

Holy moly, did all your Furys sound like that? I can't hear either one of my X's. I hear the pumps on my custom loop more than I hear anything coming out of the Fury Xs.


----------



## Vlada011

Wait, what is the problem with the Fury X? The whine, the pump or something else?
If the only problem is whine from the pump, then if a customer replaces the CM pump with a waterblock and connects it into a loop, is the problem gone completely?


----------



## Alastair

Guys, I am not getting the sort of performance I would expect from a pair of Furys.

FX-8370 @ 4.95GHz
16GB RAM
catalyst 15.1.7


----------



## p4inkill3r

Quote:


> Originally Posted by *Alastair*
> 
> Guys I am not getting the sort of performance I would expect from a pair of Fury's.
> 
> FX-8370 @ 4.95GHz
> 16gb ram
> catalyst 15.1.7


Performance in what?


----------



## xer0h0ur

Is that a typo on the driver version you're using?


----------



## Alastair

Quote:


> Originally Posted by *p4inkill3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys I am not getting the sort of performance I would expect from a pair of Fury's.
> 
> FX-8370 @ 4.95GHz
> 16gb ram
> catalyst 15.1.7
> 
> 
> 
> Performance in what?

Games and the like. The GPUs are not maxing out, framerates are low, that sort of thing.

Quote:


> Originally Posted by *xer0h0ur*
> 
> Is that a typo on the driver version you're using?


Yeah, sorry, I'm using 15.7.1.


----------



## Scorpion49

Quote:


> Originally Posted by *Jflisk*
> 
> Holy moly, did all your Furys sound like that? I can't hear either one of my X's. I hear the pumps on my custom loop more than I hear anything coming out of the Fury Xs.


My first Sapphire was the same. The second Sapphire was much quieter but still irritating, though not bad enough to make me want the hassle of replacing it. I sold it because I thought it wouldn't fit in my new case, but then realized there is a ton of room and the max graphics-card length recommendation was basically BS. I wanted the fastest air-cooled single AMD card I could get for my 1440p FreeSync screen, so I was torn between the XFX and Asus Fury cards; the XFX just looks nicer, and I don't trust Asus as far as I can throw them for warranty, so I took a third chance and here we are.


----------



## Jflisk

Quote:


> Originally Posted by *Scorpion49*
> 
> My first Sapphire was the same. The second Sapphire was much quieter but still irritating, though not bad enough to make me want the hassle of replacing it. I sold it because I thought it wouldn't fit in my new case, but then realized there is a ton of room and the max graphics-card length recommendation was basically BS. I wanted the fastest air-cooled single AMD card I could get for my 1440p FreeSync screen, so I was torn between the XFX and Asus Fury cards; the XFX just looks nicer, and I don't trust Asus as far as I can throw them for warranty, so I took a third chance and here we are.


Have you called XFX and sent the card in? Not sure if you are in the US or not, but their warranty claims department is outstanding in the US. You can call them first, explain the problem and what your experience is, and they should get it fixed for you. This is their direct US number: (909) 230-9800. I know they're only there M-F.


----------



## Scorpion49

Quote:


> Originally Posted by *Jflisk*
> 
> Have you called XFX and sent the card in? Not sure if you are in the US or not, but their warranty claims department is outstanding in the US. You can call them first, explain the problem and what your experience is, and they should get it fixed for you. This is their direct US number: (909) 230-9800. I know they're only there M-F.


Nope. It's already on its way back to Newegg for a gift card. I'm 100% done with Fiji. I'm considering either a GTX 980 or a 390X right now.


----------



## Jflisk

Quote:


> Originally Posted by *Scorpion49*
> 
> Nope. It's already on its way back to Newegg for a gift card. I'm 100% done with Fiji. I'm considering either a GTX 980 or a 390X right now.


You might want to have a look at the AMD forums before you buy a 390X; it seems they are having problems with black screens in DX11 games. I can't find a link to the thread, but the problem was definitely acknowledged by AMD on the 390X.

This is a link to the forum; it's hard to search for stuff there, but it is in there somewhere.
https://community.amd.com/community/support-forums


----------



## Scorpion49

Quote:


> Originally Posted by *Jflisk*
> 
> You might want to have a look at the AMD forums before you buy a 390X; it seems they are having problems with black screens in DX11 games. I can't find a link to the thread, but the problem was definitely acknowledged by AMD on the 390X.
> 
> This is a link to the forum; it's hard to search for stuff there, but it is in there somewhere.
> https://community.amd.com/community/support-forums


Thanks, I'll check carefully. If my pair of 290s weren't going elsewhere for now I would just use one of them, but now I'm going to have a ~$600 Newegg gift card that I need to use anyway, so I'd prefer to have something new. I wanted to stick with AMD because of my XL2730Z, since I like FreeSync, but I'm using the 670s I have in SLI right now and it's not too bad.


----------



## jase78

Maybe just grab a Ti.


----------



## xer0h0ur

Maxwell is probably one of Nvidia's most gimped architectures ever. I skipped the Fiji generation because I don't want the HBM teething generation, and I don't trust the gaming industry one bit, so I am not going from 4GB Hawaii cards to 4GB HBM cards. Particularly with how god-awful developers are at optimizing vRAM usage, and how irresponsible they are with so-called ULTRA textures, along with the rest of the bells and whistles taking up ridiculous amounts of vRAM.

All the DX12 benchmarks are thoroughly beginning to show why GCN's compute and parallel processing design shines. You couldn't convince me to throw my money away on a Maxwell card in any way, shape or form. It's not the ideal DX12 architecture. If you're a DX9/11 gamer then Maxwell will serve you well; Nvidia still holds a lead there in terms of driver overhead.


----------



## Semel

And yet a reference 980 Ti beats the AMD Fury X in those DX12 benchmarks even without async compute... and we all know how well the 980 Ti overclocks.

..


----------



## GorillaSceptre

Quote:


> Originally Posted by *Semel*
> 
> And yet a reference 980 Ti beats the AMD Fury X in those DX12 benchmarks even without async compute... and we all know how well the 980 Ti overclocks.
> 
> ..


What do you mean "even without async compute"?

You do know that's the best case for Maxwell, right? If games used heavy async, Nvidia would be steamrolled.

Fiji will take a while to show its true potential (and I don't mean better drivers, I mean games built with compute and async in mind), and by that time there will be far more powerful products out, and Maxwell (even v2) is a dead end.

Neither is a wise choice at their current pricing.


----------



## flopper

Quote:


> Originally Posted by *GorillaSceptre*
> 
> What do you mean "even without async compute"?
> 
> You do know that's the best case for Maxwell, right? If games used heavy async, Nvidia would be steamrolled.
> 
> Fiji will take a while to show its true potential (and I don't mean better drivers, I mean games built with compute and async in mind), and by that time there will be far more powerful products out, and Maxwell (even v2) is a dead end.
> 
> Neither is a wise choice at their current pricing.


Nvidia sold people old tech with the 980 series,
and is now delaying DX12 patches for the architecture along the way because of that old tech.

AMD has better tech and is simply the better choice.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> And yet a reference 980 Ti beats the AMD Fury X in those DX12 benchmarks even without async compute... and we all know how well the 980 Ti overclocks.
> 
> ..


Since there aren't any released DX12 games yet, I am referencing the few beta games that have been tested, Battlefront being the most recent. The only other one I can remember off the top of my head was Ashes of the Singularity.

http://www.guru3d.com/articles_pages/star_wars_battlefront_beta_vga_graphics_performance_benchmarks,1.html


----------



## Forceman

Battlefront is DX11.


----------



## Semel

Just some info for those who experience this issue:
https://community.amd.com/thread/188642

I had some instances where it happened (only when browsing). Disabling ULPS didn't help. However, several days ago I did two things:

- https://support.microsoft.com/en-us/kb/2665946 (Method 1)
- plugged my DisplayPort cable into another GPU port.

and so far it hasn't happened YET. For some reason I kept getting this bug almost always when I opened the above AMD community link in Firefox (not Chrome), and now it's not happening anymore. Maybe I just got lucky and this bug will show its ugly face once more, but I think it's worth trying if nothing else helps you.

Cheers.


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> Battlefront is DX11.


Oh crap, you're right. Faux pas on my end. So then the only DX12 game that has been tested is AotS, and what else? Either way, nothing is making heavy use of async yet.


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh crap, you're right. Faux pas on my end. So then the only DX12 games which have been tested are AotS and what else? Either way nothing is making heavy use of async yet.


Fable 3?


----------



## Thoth420

Any word if Deus Ex: Mankind Divided will be DX12?


----------



## xer0h0ur

Deus Ex is on the short list of games that will be DX12. There is a possibility Battlefront will become DX12 after its release, but there are no guarantees.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Deus Ex is among the short list of games that will be DX12. Battlefront is a possibility that it will become DX12 after its release but there are no guarantees.


Awesome! It's my favorite game series. It should work well on my new system, and I assume it will be an AMD Gaming Evolved title.


----------



## Luftdruck

Hey guys!

I'm new to overclock.net and would like to say hi. Yesterday I received my AMD R9 Fury X from Sapphire. It looks like the new batch, with the stickers on the card itself; sadly, I got the noisy pump and even the rattling fan on the radiator. I have read easily half of the posts in this thread. Still, I'm curious to ask: is there a way of fixing the pump noise with a new BIOS or anything like that? An RMA would take some time, and I would keep the card if possible, as I already sold my HD 7950.


----------



## Vlada011

Quote:


> Originally Posted by *Luftdruck*
> 
> Hey guys!
> 
> I'm new to overclock.net and would like to say hi. Yesterday I received my AMD R9 Fury X from Sapphire. It looks like the new batch, with the stickers on the card itself; sadly, I got the noisy pump and even the rattling fan on the radiator. I have read easily half of the posts in this thread. Still, I'm curious to ask: is there a way of fixing the pump noise with a new BIOS or anything like that? An RMA would take some time, and I would keep the card if possible, as I already sold my HD 7950.


You know, that is not the nicest solution.
A nice water-cooling setup for $200-250 is the best option:
an EKWB Predator 240 and one of their blocks for the Fury X.


----------



## Otterfluff

If you're keeping the AIO unit, I wouldn't put up with the fan rattle, personally. That alone would be a solid reason for me to RMA one. Of course, if you're into custom loops then clicking fans and noisy pumps wouldn't matter, since you would invest in a custom block. Neither of my Fury Xs (a Gigabyte and an XFX) had pump or fan issues. Seems a waste that I won't use the AIOs.


----------



## Medusa666

My ASUS Fury Strix has a fan that rattles loudly when it hits 45% or more. I don't really know if I should return it or just put up with it; I'm so tired of sending stuff back and forth, waiting for RMAs, etc., and there are no real options. I bought and tried two Fury Xs and wasn't satisfied with the noise levels, and I've been looking at the MSI R9 390X, but it doesn't feel like a perfect match.

I'm considering just selling the FreeSync monitor, buying a G-Sync one together with a 980 Ti, and being done with it.


----------



## xer0h0ur

The last time I had a fan rattling on me, I used a DuPont Teflon lubricant spray on it and it has never rattled since. That stuff is godly at cleaning and lubricating. Just be careful when applying it, as it seemingly lasts forever, so trying to remove overspray is a massive PITA.


----------



## Luftdruck

The fan doesn't bother me much; it's the pump that is annoying as hell and drives me crazy.
I'm not really interested in a custom loop, and my case wouldn't even fit one (NZXT S340).








I requested a replacement card at my retailer (Amazon Germany). Hopefully it will be a silent one.


----------



## Medusa666

Quote:


> Originally Posted by *xer0h0ur*
> 
> The last time I had a fan rattling on me I used a Dupont teflon lubricant spray on it. Never had it rattle again since. That stuff is godly at cleaning and lubricating. Just be careful when applying it as the stuff seemingly lasts forever so trying to remove overspray is a massive PITA.


Thanks for the reply. Thing is, the card is not even two weeks old, so I think I'll just return it, either by return of sale or RMA. Either way, I'll go with a new Fury Strix, or an MSI 390X, or give the Fury X a third try : )


----------



## Randomdude

My Fury X seems to boost to 1150MHz with the power limit at +50% and clocks at +9%. Absolutely zero coil whine or any other noise coming from it. It stays below 30C in most of the games I play. Extremely pleased with the card so far. The ZeroCore power state is an interesting feature as well.


----------



## Gumbi

Quote:


> Originally Posted by *Randomdude*
> 
> My Fury X seems to boost to 1150MHz with the power limit at +50% and clocks at +9%. Absolutely zero coil whine or any other noise coming from it. It stays below 30C in most of the games I play. Extremely pleased with the card so far. The ZeroCore power state is an interesting feature as well.


That's impossible unless your ambient temps are freezing...


----------



## Randomdude

Quote:


> Originally Posted by *Gumbi*
> 
> That's impossible unless your ambients are freezing...


19-22C ambient; I don't know if that's freezing to you.


----------



## Gumbi

Quote:


> Originally Posted by *Randomdude*
> 
> 19-22C ambients, I don't know if that's freezing to you.


You said the card stays under 30C while gaming. That's impossible even with top-notch water cooling; even with a chiller that's pushing it.

What you are reading is likely the idle temp. Fury Xs range from 45-60 degrees under load.


----------



## Decade

Drunk-purchased a Fury X after having both DVI outputs on my R9 290 fail; the DisplayPort output has been dead for months... I think I've basically run that card into the ground. It was an open-box special, so I can't complain much.

Looking forward to my first BNIB flagship product arriving this week. I won't be working the week of the Fallout 4 release, so I'm definitely gonna put the card through its paces with the one game I've been anticipating for years.
Deus Ex: Mankind Divided being the next AAA title I'll be getting, I imagine I'll probably be sitting pretty comfy in that game at 1440p with the Fury X and my 4.2GHz 4670K.


----------



## Medusa666

Quote:


> Originally Posted by *Decade*
> 
> Drunk-purchased a Fury X after having both DVI outputs on my R9 290 fail; the DisplayPort output has been dead for months... I think I've basically run that card into the ground. It was an open-box special, so I can't complain much.
> 
> Looking forward to my first BNIB flagship product arriving this week. I won't be working the week of the Fallout 4 release, so I'm definitely gonna put the card through its paces with the one game I've been anticipating for years.
> Deus Ex: Mankind Divided being the next AAA title I'll be getting, I imagine I'll probably be sitting pretty comfy in that game at 1440p with the Fury X and my 4.2GHz 4670K.


Yeah the Fury X is a compelling card, congratulations and hopefully you will be happy with your purchase : )


----------



## Medusa666

Is there any information regarding the expected lifespan of the Fury X's AIO, considering the coolant in the tubes, corrosion of the tubing, etc.?

There should be some kind of datasheet or spec sheet out there.


----------



## Decade

Quote:


> Originally Posted by *Medusa666*
> 
> Yeah the Fury X is a compelling card, congratulations and hopefully you will be happy with your purchase : )


Performance is definitely there. I was honestly debating a 980 Ti (while sober), but I love how quiet my R9 290X is with a Corsair HG10 + H80i, so I went ahead with the lesser card to keep things quiet in my box.
I already know it'll kill at 1440p, especially since I tend to only use 2x AA or less.

All in all, I could have made a worse drunk purchase. Like moving to an i5 Skylake when my 4670K is still a very relevant and current CPU.


----------



## Semel

Guys, I've got a question.

I checked this video (



) and I was kind of surprised that this guy was getting 110-120 FPS on his system (i7 4790K @ 4.7GHz, R9 Fury Tri-X OC 1050/500, DDR3-2400 Dominator Platinum, Samsung 250 Evo, Windows 8.1 x64 Pro) at 1080p. I've got a 3770K @ 4.2GHz, 16GB RAM and a Sapphire Tri-X Fury unlocked to 3840 and "OCed" to 1050/550; however, when I was playing the game at 1920x1200 I wasn't getting as many FPS as he did, and switching to 1920x1080 didn't improve FPS much. Mostly 85-100 depending on the situation. I have it installed on an HDD, but I don't think that matters much except for loading times.

What gives? I can't believe an extra 500MHz on the CPU gives such a boost. Am I missing something here?


----------



## Randomdude

Quote:


> Originally Posted by *Gumbi*
> 
> You said the card stays under 30c while gaming. That's impossible, even with top notch water cooling. Even with a chiller that's pushing it.
> 
> What your are reading is likely the idle temps. Fury Xs range from 45-60 degrees under load.


I'm not playing heavy games, and I've set the FPS cap to 60 in games that shoot up to 500-600, so I don't see what's so odd. Of course, if I played The Witcher 3 the temperatures would shoot up, but I don't. Idle is idle, Tmax is Tmax. The card is pretty much stuck at 25C in the old WoW expansion I play PvP on.


----------



## ff0000T34M

Hello, I may join this club after I figure out what I am going to settle on. I currently have 3x 390X cards and 2x Fury Xs on the way. I usually game at very high resolutions and may change things up and go for PLP Eyefinity with the Fury Xs. Right now I am trying to compare these against each other for vRAM capacity and performance. Anyway, I noticed not many people have posted vRAM usage (as best we can measure with the tools available, which may not be entirely accurate) or even benchmarks. I realize the voltage issue and/or lack of overclocking on Fiji might be the reason, so instead of pestering people I will just try these against each other and see what wins for my usage.

That all said, does anyone here use PLP Eyefinity by chance? Just curious what the take on it is from other users' impressions.


----------



## Neon Lights

Quote:


> Originally Posted by *Semel*
> 
> Guys I got a question..
> 
> I checked this video (
> 
> 
> 
> ) and I was kind of surprised that this guy was getting 110-120 FPS on his system (i7 4790K @ 4.7GHz, R9 Fury Tri-X OC 1050/500, DDR3-2400 Dominator Platinum, Samsung 250 Evo, Windows 8.1 x64 Pro) at 1080p. I've got a 3770K @ 4.2GHz, 16GB RAM and a Sapphire Tri-X Fury unlocked to 3840 and "OCed" to 1050/550; however, when I was playing the game at 1920x1200 I wasn't getting as many FPS as he did, and switching to 1920x1080 didn't improve FPS much. Mostly 85-100 depending on the situation. I have it installed on an HDD, but I don't think that matters much except for loading times.
> 
> What gives? I can't believe an extra 500MHz on the CPU gives such a boost. Am I missing something here?


Do you know for sure what game settings he used (and whether you are using the same)?
Apart from that, if the FPS are CPU-limited in that scenario, the CPU frequency could have a considerable impact, with the architecture difference coming second. The difference in your RAM (I am assuming there is one) could also cost a few FPS. The small difference in your resolutions would, if the FPS are CPU-limited, actually work in your favor by reducing the CPU limitation and shifting load to the GPU. All of that could, under normal circumstances, give you fewer FPS. If you have checked all that, the only other thing I can suggest is to look for something unusual that could reduce your performance, for example software causing load on the relevant hardware, or downclocks of your hardware (assuming you have not ruled them out already), which would mainly be caused by temperature problems.
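A rough way to sanity-check the CPU-limited idea above (my own rule of thumb, not something from this thread): if lowering the resolution barely raises the framerate, the GPU was not the bottleneck. A tiny sketch with Semel's approximate numbers plugged in; the helper name and the 5% threshold are my own assumptions:

```python
# Hypothetical helper, not a real tool: flags a probable CPU bottleneck
# when dropping the resolution yields almost no extra FPS.
def likely_cpu_bound(fps_high_res: float, fps_low_res: float,
                     threshold: float = 0.05) -> bool:
    """True if the relative FPS gain from lowering resolution is under `threshold`."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return gain < threshold

# Roughly 90 FPS at 1920x1200 and barely more at 1920x1080:
print(likely_cpu_bound(90, 93))  # True, consistent with a CPU limit
```

If the same test showed a large gain at the lower resolution, the GPU (or its clocks) would be the more likely suspect.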


----------



## Semel

*Neon Lights*

Yeah, I saw his video and the settings he used.









My RAM is slower, but I don't think it matters much these days.

No software causing load; the GPU temp is ~46-48C, same as the CPU.

I guess it's the CPU then.









PS: Well, it looks like we have AMD to blame for this, as in their DX11 driver...


----------



## jase78

What do you guys think is done with all the Fury cards that have been RMA'd? A safe estimate is that 40-60% of the total cards produced have been by this point. The question is, if the main complaint is coil whine etc., what do they do with them?


----------



## p4inkill3r

Quote:


> Originally Posted by *jase78*
> 
> What do you guys think is done with all the Fury cards that have been RMA'd? A safe estimate is that 40-60% of the total cards produced have been by this point. The question is, if the main complaint is coil whine etc., what do they do with them?


I think your "safe estimate" is wildly unsafe and completely made up, but the cards are refurbished and put back on the market.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> I think your safe estimate is wildly unsafe and completely made up, but the cards are refurbished and put back on the market.


I love the absolutely made-up statistic with a 20% variance. Seems legit...


----------



## Vlada011

Because I have definitely decided to go with the EKWB Predator, and I will not change that decision no matter what, the Fury X pump whine will not be a problem for me.
But I haven't definitely decided on Radeon; maybe a GeForce GTX 980 Ti with a waterblock will enter my PC case and become a Hydro Copper. That's only until AMD and NVIDIA launch their DX12 cards (Pascal, and I don't know what AMD is preparing); later I would upgrade from the Fury X to its successor, or from the 980 Ti to a full GeForce with more video memory.
A lot of people will probably build mini-ITX systems with the R9 Nano.
The EKWB full-cover block for the R9 Nano looks great. But maybe the price of the card is still high, and Radeon customers are waiting for it to drop a little.
The Fury X and R9 Nano with full-cover blocks become single-slot cards; they will not look big on an Impact or Gene motherboard.

This year there is really great stuff on the market, with affordable 6-core Intel chips, Skylake with a powerful chipset, great cards with HBM, and small motherboards with almost all the features of ATX.


----------



## p4inkill3r

For anyone that purchased via Amazon, I had this in my inbox this morning:
Quote:


> Dear Wade ********,
> 
> Your recent purchase of the "Sapphire Radeon R9 Fury X 4GB HBM HDMI/TRIPLE DP PCI-Express Gra..." comes with a 1 FREE year of Unlimited Everything cloud storage from Amazon Cloud Drive (a $59.99 value).
> 
> You can securely store unlimited files, auto-save photos to free up space on your phone, share large files like videos, and more.
> 
> To receive your free cloud storage, click the below link and log in with your Amazon account.
> 
> Click here
> 
> This offer is valid until November 4, 2015 in the US only and can't be shared with anyone else.
> 
> Thanks again for shopping at Amazon.


A nice surprise.


----------



## Alastair

So this is the state of affairs thus far. I have only tried one card so far.

Gpu 1 looks like this.
..............xx
..............xx
.x.............x
..............xx

The _all BIOS unlocks all CUs but causes artifacting. The computer will POST after a cold start without issues.

The 4_high BIOS unlocks only 3 CUs. It looks like this; POSTs fine from a cold start, no artifacting.
...............x
...............x
.x............x
...............x

The 4_low BIOS unlocks 4 CUs. No artifacting; POSTs fine from a reboot but won't POST from a cold start.
..............x.
..............x.
.x..............
..............x.

So why can't I get the card to boot from a cold start when using the 4_low BIOS?

Also, I had to take the EKWB backplate off my GPU in order to reach the BIOS switch.
So does anyone have any ideas why 4_low will unlock the CUs with no artifacts and appear perfectly stable (it can run benchmarks and play games), but when I do a cold start after shutting down the computer, the graphics card refuses to POST?
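The dot/x maps above can also be read programmatically. This is a minimal sketch under my own assumptions about the notation (4 rows of 16, one character per CU, 'x' a locked CU, '.' an active one); with 64 total CUs at 64 stream processors each, 4 locked CUs corresponds to the 3840-SP unlock, while a stock Fury ships with 8 locked (3584 SPs). The helper name is hypothetical:

```python
# Hypothetical helper: count active CUs in a 4x16 dot/x map as posted
# above. '.' = active compute unit, 'x' = locked compute unit.
def cu_status(grid: str, total_cus: int = 64, sp_per_cu: int = 64):
    """Return (active CUs, stream processors) for a dot/x map."""
    locked = sum(row.count("x") for row in grid.split())
    active = total_cus - locked
    return active, active * sp_per_cu

# The 4_low map from the post: 4 CUs marked as locked.
grid_4_low = """
..............x.
..............x.
.x..............
..............x.
"""
print(cu_status(grid_4_low))  # (60, 3840), i.e. the full "3840" unlock
```

Counting the maps this way also matches the 4_high result: 5 marked CUs leaves 59 active, which is the "only 3 CUs unlocked" outcome relative to the stock 56.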


----------



## Scorpion49

Quote:


> Originally Posted by *Alastair*
> 
> So this is the state of affairs thus far. I have only tried one card so far.
> 
> And I had to take the EKWB backplate off my GPU in order to reach the BIOS switch.
> So does anyone have any ideas why 4_low will unlock the CUs with no artifacts and appear perfectly stable (it can run benchmarks and play games), but when I do a cold start after shutting down the computer, the graphics card refuses to POST?


Did you use GPU-Z to extract the vBIOS?


----------



## xer0h0ur

I don't know about the Fury/Fury X Alastair but my 295X2 and 290X do funky things in my system with respect to startup. Let me give you an example. If I boot my rig without having the monitor turned on and active then it fails the POST and will not startup. If the monitor is on but sleeping I get the same result. I literally have to either immediately turn on the rig after powering on my monitor or wake it from sleep before powering on the rig. Any other way and the POST fails and rig never boots.

Note: I am using a 4K monitor through DisplayPort. Don't know if this is playing a part in this.


----------



## Alastair

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So this is the state of affairs thus far. I have only tried one card so far.
> 
> And I had to take the EKWB backplate off my GPU in order to reach the BIOS switch.
> So does anyone have any ideas why 4_low will unlock the CUs with no artifacts and appear perfectly stable (it can run benchmarks and play games), but when I do a cold start after shutting down the computer, the graphics card refuses to POST?
> 
> 
> 
> Did you use GPU-Z to extract the vBIOS?

No, I used atiflash.exe as per the guide. The ROM size appears to be the full 256KB.


----------



## Alastair

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know about the Fury/Fury X Alastair but my 295X2 and 290X do funky things in my system with respect to startup. Let me give you an example. If I boot my rig without having the monitor turned on and active then it fails the POST and will not startup. If the monitor is on but sleeping I get the same result. I literally have to either immediately turn on the rig after powering on my monitor or wake it from sleep before powering on the rig. Any other way and the POST fails and rig never boots.
> 
> Note: I am using a 4K monitor through DisplayPort. Don't know if this is playing a part in this.


I am using HDMI to my monitor. Could this be having a strange effect? Should I go back to DVI and use the provided adapter? For the time being I am on my 1080p screen, but I will shortly be getting a new screen to better match what my Furys can provide. Any issues with using a passive DVI-HDMI adapter?


----------



## Scorpion49

Quote:


> Originally Posted by *Alastair*
> 
> No I used Atiflash.exe as per the guide. Rom size appears to be the full 256KB.


No idea, man; when I tried it on mine both worked, but one gave artifacts. Either would cold-boot just fine though.


----------



## Alastair

Got card 1 fully working at 3840.

Card 2 I am having less luck with.
The _all BIOS artifacts.
4_low and 4_high only unlock 3 CUs.

Why are they only unlocking 3 CUs?


----------



## Scorpion49

Quote:


> Originally Posted by *Alastair*
> 
> Got card 1 fully working at 3840.
> 
> Card 2 I am having less luck with.
> All artifacts
> 4low and 4high only unlock 3cu's.
> 
> Why are they only unlocking 3 CU's?


Because two are bad? It won't unlock completely bad CUs.


----------



## Alastair

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Got card 1 fully working at 3840.
> 
> Card 2 I am having less luck with.
> All artifacts
> 4low and 4high only unlock 3cu's.
> 
> Why are they only unlocking 3 CU's?
> 
> 
> 
> Because two are bad? It won't unlock completely bad CU's.

Yeah, judging from their positions they were both bad. Just my luck that I lose the silicon lottery.









..............xx
..............xx
..............xx
x.x.............

Can I run my cards at 3840 and 3584 crossfired, sort of like you used to crossfire 7970's and 7950's together?
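For reference, counting the active CUs in a mask like the one above is simple arithmetic (an illustrative sketch, assuming '.' marks an active CU, 'x' a locked or bad one, and that each Fiji CU carries 64 stream processors, so 56 CUs = 3584 SPs):

```python
# Tally active compute units from a Fiji unlock map.
# '.' = active CU, 'x' = locked/defective CU; a full Fiji die has
# 64 CUs at 64 stream processors each (64 x 64 = 4096 SP on Fury X).
rows = [
    "..............xx",
    "..............xx",
    "..............xx",
    "x.x.............",
]
mask = "".join(rows)
active_cus = mask.count(".")
shaders = active_cus * 64
print(active_cus, shaders)  # 56 CUs -> 3584 stream processors
```

So the map above works out to a stock Fury's 3584 shaders, matching the numbers in the posts.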


----------



## mRYellow

Quote:


> Originally Posted by *Alastair*
> 
> Yeah, judging from their positions they were both bad. Just my luck, I lose the silicon lottery.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ..............xx
> ..............xx
> ..............xx
> x.x.............
> 
> Can I run my cards at 3840 and 3584 crossfired, sort of like you used to crossfire 7970's and 7950's together?


Unlucky, bud.
I don't see why not. The card is still being seen as a Fury. Run the better card in slot one and the other in slot two.


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Yeah, judging from their positions they were both bad. Just my luck, I lose the silicon lottery.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ..............xx
> ..............xx
> ..............xx
> x.x.............
> 
> Can I run my cards at 3840 and 3584 crossfired, sort of like you used to crossfire 7970's and 7950's together?
> 
> 
> 
> Unlucky bud.
> I don't see why not. Card is still being seen as a Fury. Run the better card in slot one and the other in two.
Click to expand...

Well, guess I'll do just that. Oh well. I'm actually very disappointed. I at least hoped I would get two cards at 3840.









I know there is no guarantee. But still quite disappointed.


----------



## ff0000T34M

Just got my 2 Furys in, about to install them and take them for a spin. My first impression: why did Sapphire put a cover sticker on both the front and back of the card? It looks so damn cheesy, nothing like what I've seen in the bazillion pictures of the Fury X. I hope this doesn't mean I void my warranty if I remove it. It's blocking all the screws on the top, and on the bottom they couldn't even be bothered to put it on flat. Nothing like some half-butt sticker placement.


----------



## Alastair

Quote:


> Originally Posted by *ff0000T34M*
> 
> Just got my 2 Furys in, about to install them and take them for a spin. My first impression: why did Sapphire put a cover sticker on both the front and back of the card? It looks so damn cheesy, nothing like what I've seen in the bazillion pictures of the Fury X. I hope this doesn't mean I void my warranty if I remove it. It's blocking all the screws on the top, and on the bottom they couldn't even be bothered to put it on flat. Nothing like some half-butt sticker placement.


Well, AMD did encourage custom backplate designs etc., so I imagine it won't be a problem.

If they ask why the stickers are off, if you ever need to RMA, tell them they weren't stuck properly and came off with age.


----------



## ff0000T34M

Quote:


> Originally Posted by *Alastair*
> 
> Well AMD did encourage custom backplate designs etc. etc. So I imagine it won't be a problem.
> 
> If they ask why the stickers are off if you ever need to RMA tell them they weren't stuck properly and came off with age.


Well, good news so far: both of these are quiet as a mouse. I haven't heard any coil whine or pump noise. Matter of fact, since my whole system is freshly put together for testing until I'm ready to go under water, I hear my CPU fan over the whole machine. This rivals my custom water cooling pretty much right now. I'm in 4K, so I'll try 1080p as well.


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Yeah, judging from their positions they were both bad. Just my luck, I lose the silicon lottery.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ..............xx
> ..............xx
> ..............xx
> x.x.............
> 
> Can I run my cards at 3840 and 3584 crossfired, sort of like you used to crossfire 7970's and 7950's together?
> 
> 
> 
> Unlucky bud.
> I don't see why not. Card is still being seen as a Fury. Run the better card in slot one and the other in two.
Click to expand...

Running one card at 3840 and the other at 3584 nets me the exact same performance figures in Heaven as 3584/3584.

Heaven 4.0 @ 1920x1080
DX11
Quality = Ultra
Tessellation = extreme
AA = 4x
Full screen

3584/3584 @1000MHz/500MHz
=3058 points @ 121.4fps

3840/3584 @ 1000/500
=3056 points at 121.2fps.

So basically, no improvement at all.
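For scale, the gap between those two runs is far below typical run-to-run variance, which is what you'd expect when the slower card paces the pair in CrossFire:

```python
# Compare the two Heaven 4.0 scores quoted above: the delta is a
# rounding error, so the extra CUs on one card contribute nothing
# while both cards render alternate frames at the slower card's pace.
score_3584 = 3058   # both cards at 3584 SP
score_mixed = 3056  # one card at 3840 SP, one at 3584 SP
delta_pct = (score_3584 - score_mixed) / score_3584 * 100
print(f"{delta_pct:.2f}% difference")  # ~0.07%
```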


----------



## ff0000T34M

Quote:


> Originally Posted by *Alastair*
> 
> running one card at 3840 and the other at 3584 nets me the exact same performance figures in heaven as 3584/3584.
> 
> Heaven 4.0 @ 1920x1080
> DX11
> Quality = Ultra
> Tessellation = extreme
> AA = 4x
> Full screen
> 
> 3584/3584 @1000MHz/500MHz
> =3058 points @ 121.4fps
> 
> 3840/3584 @ 1000/500
> =3056 points at 121.2fps.
> 
> So basically. No improvement at all.


Yeah, I have run into similar spots as well with 290X/290... except my testing used various clock speeds, and it seemed to match up with the lowest-clocked GPU across the board. So I gave up on the idea.

On a side note, these Fury Xs are absolutely smashing my 290X/390X cards. Insane.


----------



## xer0h0ur

Well yeah. Dual Fury Xs are equal to triple 290Xs. It's already been tested and compared several times over.


----------



## EpicOtis13

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well yeah. Dual Fury X's are = triple 290X's. Its already been tested and compared several times over.


I just wish that I could afford furies, but for now I will have to be content with my xfire 290's that I picked up for $410 with blocks.


----------



## Alastair

Quote:


> Originally Posted by *EpicOtis13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Well yeah. Dual Fury X's are = triple 290X's. Its already been tested and compared several times over.
> 
> 
> 
> I just wish that I could afford furies, but for now I will have to be content with my xfire 290's that I picked up for $410 with blocks.
Click to expand...

ain't nothing wrong with 290's mate.


----------



## EpicOtis13

Quote:


> Originally Posted by *Alastair*
> 
> ain't nothing wrong with 290's mate.


true, plus i really just want greenland


----------



## xer0h0ur

There is actually even a possibility I skip Arctic Islands as well. There is a rumor that HBM2 won't even be making an appearance on that gen. If that is the case then I am dropping that idea altogether and extending the use of my 295X2 / 290X.


----------



## EpicOtis13

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is actually even a possibility I skip Arctic Islands as well. There is a rumor that HBM2 won't even be making an appearance on that gen. If that is the case then I am dropping that idea altogether and extending the use of my 295X2 / 290X.


If that is true I think I will keep my 290s a bit longer. Right now I have realized that my 290s just aren't quite enough for 4K, so I'm thinking about moving down to 1440p 144Hz FreeSync.


----------



## ff0000T34M

I still haven't tested 1080p much, but I get no noise from my Sapphires. The only thing I hear now and then is a water-trickle sound. Nothing I haven't heard from water cooling anyways.

I am actually digging these coming from 290X quadfire; it looks like I barely hit 800 watts peak from what I have seen. Not sure, aside from VRAM size, why these cards are getting so much hate. I guess I need to give it a week and then see how I feel. I am hoping the dual-Fiji card comes out soon; I want to see it in action and what it will look like.

Edit: Also, the 290X is still a beastly, great card; I think they have been underrated since they released. Outside of the reference cooling they have done so well.


----------



## xer0h0ur

Quote:


> Originally Posted by *EpicOtis13*
> 
> If that is true I think that I will keep my 290s a bit longer. Right now I have realized that my 290s just aren't quite enough for 4k, so im thinking about moving down to 1440p 144fps freesync.


Yup, 4K is the reason I am running triple Hawaii XTs.

Edit: I almost forgot probably the biggest deciding factor for me on whether or not I buy Greenland: DisplayPort 1.3. I will be damned if I upgrade to a new generation of video card only to not get DP 1.3 in it. I am not about to lock myself to 60Hz on my next-gen upgrade with 120Hz looming around the corner.


----------



## ff0000T34M

Nothing wrong with picking your battles and reasons not to upgrade. On the other hand there is always something to wait for. Trying to future proof is very difficult and it doesn't help when there is almost no support for it. Higher refresh rates are great to play on when games aren't tied to 30fps world physics.


----------



## Decade

Just unboxed and installed my Fury X... at work... currently playing with 1080p benchmarks. Absolute overkill, expecting WONDERFUL results when I get home and back to my 1440p monitor.
LOVE the little details on the Sapphire card... soft-touch matte plastic, backlit logo, GPU load LEDs. I guess this is what it's like to own a flagship GPU?

Coil whine seems reasonable and comparable to my R9 290.


----------



## xer0h0ur

Quote:


> Originally Posted by *ff0000T34M*
> 
> Nothing wrong with picking your battles and reasons not to upgrade. On the other hand there is always something to wait for. Trying to future proof is very difficult and it doesn't help when there is almost no support for it. Higher refresh rates are great to play on when games aren't tied to 30fps world physics.


I'm not sure if you realize this, but DP standards last for quite some time. DP 1.3's 120Hz refresh rate is not going to get another bump for a long time. There isn't a single FPS gamer that doesn't appreciate high-refresh-rate monitors. I am a CS:GO junkie, so while I tolerate playing @ 60Hz on a 4K screen, I would certainly jump at the opportunity to move on to 120Hz. It's the only reason I haven't bothered with getting the FreeSync equivalent of my current monitor. It would merely be a side-grade.


----------



## Medusa666

Quote:


> Originally Posted by *Decade*
> 
> Just unboxed and installed my Fury X... at work... currently playing with 1080p benchmarks. Absolute overkill, expecting WONDERFUL results when I get home and back to my 1440p monitor.
> LOVE the little details on the Sapphire card... soft touch matte plastic, backlit logo, GPU load LEDs. I guess this is what it's like to own a flagship GPU?
> 
> Coil whine seems reasonable and comparable to my R9 290.


Hi, how are your initial impressions? Does the fan rattle?


----------



## Decade

Quote:


> Originally Posted by *Medusa666*
> 
> Hi, how are your initial impressions? Does the fan rattle?


Loving the 1440p performance... have some odd issues with Metro Last Light where I'm getting significant (single-digit) FPS drops regardless of resolution and graphics fidelity options. All other games and benchmarks have been great; I'm able to run a slight 5% overclock on the core.
No issues with the fan yet, absolutely no rattles at this time. Pump noise is completely masked by other fan noise as well. If I could, I'd get a second fan like the one used with the AIO to replace the stock SP120 fan on my CPU's H80i.
Frame rate targeting of 65fps along with a custom fan profile in MSI Afterburner has it running ~1200rpm with 45°C or lower temps in most games so far.


----------



## clubber_lang

Well, I'll be joining this club here real soon. Just bought myself the Sapphire Radeon R9 Fury 4GB card. I'll be dumping the 7970s I have in crossfire for this. Kind of had no choice since my race sims are not coded well for dual GPUs (especially rFactor 2), so this card should give me a great boost in performance AND... I get to take advantage of the FreeSync on my new 34" monitor! Win-win for me!


----------



## HagbardCeline

Quote:


> Originally Posted by *clubber_lang*
> 
> Well, I'll be joining this club here real soon. Just bought myself the Sapphire Radeon R9 Fury 4GB card. I'll be dumping the 7970s I have in crossfire for this. Kind of had no choice since my race sims are not coded well for dual GPUs (especially rFactor 2), so this card should give me a great boost in performance AND... I get to take advantage of the FreeSync on my new 34" monitor! Win-win for me!


Which monitor did you buy, btw?


----------



## clubber_lang

Quote:


> Originally Posted by *HagbardCeline*
> 
> Which monitor did you buy, btw?


I got the Acer 34" XR341CK with FreeSync. And man oh man, was it worth every penny! Coming from a 27" 1080p monitor is pretty dang cool! I play a few different race sims, and probably my favorite is Game Stock Car Extreme. The physics in the game are known to be the best in the biz, along with rFactor and iRacing as well. GSCE actually stays at an even 75fps basically all the time. RF2 taxes my single 7970 pretty bad, and iRacing does too in heavy traffic.

I was in a weird place really. The FPS games I play every now and then really liked the crossfire, but these damn driving sims don't. Going from 1080p to 3440x1440 there was a big difference, and I just needed more power on the single GPU. I have no idea how well FreeSync works, so I'm hoping that it helps me even more.


----------



## HagbardCeline

Quote:


> Originally Posted by *clubber_lang*
> 
> I got the Acer 34" XR341CK with FreeSync. And man oh man, was it worth every penny! Coming from a 27" 1080p monitor is pretty dang cool!


This is going to be a strange question, but did you just buy that monitor at Fry's this afternoon? I was pulling into my parking spot right when someone carrying an Acer gaming monitor was walking towards their car. I was almost going to ask why that monitor, ha ha, but headed into the store instead.


----------



## bonami2

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am no expert here but I know that cards with higher amount of power phases tend to have less to no coil whine. I presume its simply because there are more chokes to spread it across. Perhaps its due to them using higher quality chokes. I can't pinpoint it. However someone here claims that coil whine never goes away, that the frequency of the sound simply changes to being inaudible. Again, I can't confirm that either.


Quote:


> Originally Posted by *clubber_lang*
> 
> I got the Acer 34" XR341CK with FreeSync. And man oh man, was it worth every penny! Coming from a 27" 1080p monitor is pretty dang cool! I play a few different race sims, and probably my favorite is Game Stock Car Extreme. The physics in the game are known to be the best in the biz, along with rFactor and iRacing as well. GSCE actually stays at an even 75fps basically all the time. RF2 taxes my single 7970 pretty bad, and iRacing does too in heavy traffic.
> 
> I was in a weird place really. The FPS games I play every now and then really liked the crossfire , but these damn driving sims don't. Going from 1080p to 3440 x 1440 there was a big difference there and I just needed more power on the single GPU. I have no idea how well freesync works , so I'm hoping that it helps me even more.


Try BeamNG if you want the best physics in the world (not the most realistic, but they are trying). At least you can break the car and make it hard to drive.


----------



## clubber_lang

Quote:


> Originally Posted by *HagbardCeline*
> 
> This is going to be a strange question, but did you just buy that monitor at Fry's this afternoon? I was pulling into my parking spot right when someone carrying an ACER gaming monitor was walking towards their car. I almost was going to ask why that monitor ha ha, but headed into the store instead.


HA! No, that wasn't me, but I damn near did buy my monitor from them a little over a week ago! I went in there to see first hand what these 34" screens looked like (physically look at one). The kid that was helping me out was really cool, a fellow gamer as well, and knew quite a bit about everything I was talking to him about. I told him I would think on it while I grabbed some lunch, and then when I got back he himself was out to lunch. Anyways, he got back and I told him "yeah, I want the Acer for sure!". He checked the stock, and the only one they had was the display unit. His manager said he would knock $30.00 off, and I said no way, I want a new one. He said it could be 2-3 weeks before they might see another one in. So I got home, and a guy on here referred me over to Ncixus, and they had some at $200.00 less than Fry's!! So I bought one and got 2-day shipping because I couldn't wait haha.

So you're a fellow NW guy too?


----------



## clubber_lang

Quote:


> Originally Posted by *bonami2*
> 
> Try BeamNG if you want the best physics in the world (not the most realistic, but they are trying). At least you can break the car and make it hard to drive.


" Beaming "....I'm not sure what that is actually. Can you fill me in?


----------



## bonami2

Quote:


> Originally Posted by *clubber_lang*
> 
> " Beaming "....I'm not sure what that is actually. Can you fill me in?


Steam.

BeamNG.

It's the game's name.

Well, it's more of a simulator than a game.

Look on YouTube.


----------



## bonami2




----------



## HagbardCeline

Quote:


> Originally Posted by *clubber_lang*
> 
> So you're a fellow NW guy too?


Yup, SW of PDX. When I was putting my system together the only thing I bought at Fry's was the CPU (Which I got them to price-match hehe), they never seemed to have the other stuff I wanted in stock. They still haven't gotten the X99-A 3.1 boards, for example, and they never had the Fury X cards. (I did almost do a ship to store on the XFX version of the Fury X to Best Buy, but they didn't seem to get them in stock either!) In the end it was 95% mail order. My monitor is probably the next thing I will upgrade. I'm doing 2k and 4k video on the new machine and there's only so much the 1080p monitor can do, hehe.


----------



## Thoth420

Quote:


> Originally Posted by *clubber_lang*
> 
> I got the Acer 34' XR34CK with FreeSync. And man oh man , was it worth every penny! Coming from a 27" 1080p monitor is pretty dang cool! I play a few different race sims and probably my favorite is Game stock car Extreme. The physics in the game are known to be the best in the biz , along with rFactor and iracing as well. GSCE actually stays at an even 75mz basically all the time . RF2 taxes my single 7970 pretty bad. And iracing does too in heavy traffic.
> 
> I was in a weird place really. The FPS games I play every now and then really liked the crossfire , but these damn driving sims don't. Going from 1080p to 3440 x 1440 there was a big difference there and I just needed more power on the single GPU. I have no idea how well freesync works , so I'm hoping that it helps me even more.


Any IPS glow on the edges? Dead Pixels? Weird flickering or signal dropping? Basically anything bad about it?


----------



## clubber_lang

Ahhh, very cool! I'm like you too, as I don't usually buy anything from Fry's. I think I've bought about 95% of my stuff from either Newegg or TigerDirect, and now my monitor from Ncixus. I actually wanted to buy the monitor locally in case there were any issues with it. With a total black screen I have a little backlight bleeding, but honestly it doesn't bug me at all. I seem to have very little, and there are ways to get rid of most of it, but it doesn't affect the sims/games that I'm playing. The thing is huge compared to my 27", and running it at 3440x1440 looks freakin' incredible.

Thoth... not one dead pixel! Slight backlight bleeding with a totally black screen and the lights off, and honestly I haven't noticed anything in any of the games/sims I'm playing. I wish I could have gotten my hands on something like this sooner. For racing sims it's a damn good option if a guy can't do triple screens.


----------



## Thoth420

Thanks for the feedback. One last question: Does it wobble at all? Very wide panel and a bit over my budget so a monitor arm on top would really sting. I am trying to decide on a display for my sig rig.


----------



## clubber_lang

No wobble, but right now it's firmly planted on a very big and heavy desk. I just ordered an extended monitor mount so I can put it on my sim rig. I'm hoping I don't get too much movement with it being mounted there, but we'll see.


----------



## Thoth420

Quote:


> Originally Posted by *clubber_lang*
> 
> No wobble, but right now it's firmly planted on a very big and heavy desk. I just ordered an extended monitor mount so I can put it on my sim rig. I'm hoping I don't get too much movement with it being mounted there, but we'll see.


Thanks, I will certainly consider it. I really want a FreeSync display, preferably something above 1080p but not 4K.


----------



## fjordiales

Anyone seen this yet?

http://wccftech.com/gigabyte-enters-custom-radeon-r9-fury-race-windforce-3x-cooled-graphics-card-clocked-1010-mhz/


----------



## Thoth420

Quote:


> Originally Posted by *fjordiales*
> 
> Anyone seen this yet?
> 
> http://wccftech.com/gigabyte-enters-custom-radeon-r9-fury-race-windforce-3x-cooled-graphics-card-clocked-1010-mhz/


A fury with 4GB HBM. Interesting addition.


----------



## drm8627

Quote:


> Originally Posted by *Thoth420*
> 
> A fury with 4GB HBM. Interesting addition.


Don't all the Fiji chips have 4GB of HBM?


----------



## Scorpion49

Quote:


> Originally Posted by *Thoth420*
> 
> A fury with 4GB HBM. Interesting addition.


It might be interesting if they use a full custom PCB, Gigabyte makes some nice cards.


----------



## Thoth420

Quote:


> Originally Posted by *drm8627*
> 
> dont all the fiji chips have 4 gb of hbm?


It's a Fury not a Fury X.


----------



## drm8627

Quote:


> Originally Posted by *Thoth420*
> 
> It's a Fury not a Fury X.


Fury has HBM, if I'm not mistaken. The only difference between the two is the X is overclocked and watercooled.


----------



## battleaxe

Quote:


> Originally Posted by *drm8627*
> 
> fury has hbm, if im not mistaken. only difference between two is x is overclocked and watercooled


ROP count is different too.


----------



## drm8627

Quote:


> Originally Posted by *battleaxe*
> 
> ROP count is different too.


Yeah... overall performance is quite similar, though.


----------



## Thoth420

Quote:


> Originally Posted by *drm8627*
> 
> fury has hbm, if im not mistaken. only difference between two is x is overclocked and watercooled


You are right... long day. Although I wouldn't say it's overclocked, just a higher base clock.


----------



## drm8627

Quote:


> Originally Posted by *Thoth420*
> 
> You are right..long day. Although I wouldn't say it's overclocked just higher base clock.


Right on, no worries. The X is probably higher binned as well; it was also mentioned it has a higher ROP count.

But overall the two cards have VERY similar performance.


----------



## Forceman

Quote:


> Originally Posted by *battleaxe*
> 
> ROP count is different too.


Same ROP count, different shader and TMU count.


----------



## ff0000T34M

After going back and forth on the 390X and Fury X, I am sending my 390Xs back. The Fury gives me enough of a boost despite the 4GB of VRAM. When I am over 4GB on the 390X it's struggling hard and is not usually playable, making the extra VRAM almost useless (not talking below 4K). I'd rather be careful with the Fury's VRAM limits than be able to use 8GB of VRAM and have it not be playable. Also, while I am not technical on VRAM usage and limits, I have found that the Fury always seems to allocate less, and I am able to up settings based on raw performance over the 390X. When using MSI AB on the 390X in DAI I see around 4.3GB usage, and in cut scenes it's gone as high as 5+GB, while the Fury with the exact same settings shows 3.2-3.8GB usage. I can find the limits of the Fury's VRAM if I try, which is possibly a killer for people looking to future proof.

TL;DR: 390X 4GB+ usage FPS is too low to care (4K+). Fury X stays sub-4GB usage unlike the 390X, and I am able to use higher settings due to raw power. I can easily hit the limits of the Fury's VRAM, which can be a turn-off for people looking to future proof.

The information is mostly my opinion for my usage, but I thought I would share in case people are wondering.


----------



## rdr09

Quote:


> Originally Posted by *ff0000T34M*
> 
> After going back and forth on 390x and furyX i am sending my 390x's back. Fury gives me enough boost despite 4gb vram. When i am over 4gb on 390x its struggling hard and is not usually playable making the extra vram almost useless.(not talking below 4k). I'd rather be careful with Fury's vram limits then be able to use 8gb vram and it not be playable. Also, while i am not technical on vram usage and limits in knowledge. I have found that fury always seems to allocate less and i am able to up settings based on raw performance over the 390x. When using MSI ab on 390x in DAI i see around 4.3gb usage and in cut scenes its gone as high as 5+gb. and fury with exact same settings shows 3.2 - 3.8gb usage. I can find limits to Fury's vram if i try which is possibly a killer for people looking to future proof.
> 
> TDLR: 390x 4gb+ usage fps is too low to care(4k+). FuryX sub 4gb usage unlike 390x and i am able to use higher settings due to raw power. Can easily hit limits of fury vram which can be a turn off for people looking to future proof.
> 
> The information is mostly my opinion for my usage, but thought i would share in case people are wondering.


A minimum of two 390Xs is needed for 4K, as you found out; then the VRAM becomes a non-issue, at least with the games of today.

Edit: Not sure if they updated AB in the latest version, but with my Hawaiis it measures VRAM usage using global output, meaning it adds it up. So, for 2 GPUs, you need to divide the figure by 2. I use HWiNFO and that too adds up the VRAM usage.

Also, the Fury X is easily 30% faster than the 390X, so you can't use the same settings in games. If you are using the same settings, I mean.
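The divide-by-GPU-count correction above is worth spelling out, since globally-summed readings overstate real usage. A trivial sketch (the 7168MB reading is a made-up example, and this assumes identical cards mirroring the same frame data in CrossFire):

```python
# Some monitoring builds report VRAM usage summed across all GPUs in
# CrossFire; since each card holds a mirror of the same data, divide
# the global figure by the GPU count to get per-card usage.
def per_gpu_vram(reported_mb: float, gpu_count: int) -> float:
    """Split a globally-summed VRAM reading across identical mirrored GPUs."""
    return reported_mb / gpu_count

print(per_gpu_vram(7168, 2))  # 3584.0 MB actually used per card
```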


----------



## ff0000T34M

Quote:


> Originally Posted by *rdr09*
> 
> a minimum of 2 390Xs is needed for 4K as you find out, then the vram becomes a non issue. at least with the games today.


I was trying to keep from posting a long post, so I might have left some info out. I have 3 390Xs and 2 Fury Xs, and I am also running above 4K resolution. The 390Xs don't have enough power unless I run 3 or 4 of them. When you include the heat, power, and CF scaling, it's harder for the 390X to achieve the FPS the Fury X can at the same settings in most of my testing. This is all subjective to the user; in my case I'm finding I am getting more mileage from the faster card with less VRAM. I didn't even factor in costs either, but 3 390Xs roughly equal 2 Fury Xs.

If I were running 4K or less, I'm sure the 390X would be plenty. In fact there are some games I will run in 4K because they are horribly optimized, i.e. GameWorks titles.


----------



## rdr09

Quote:


> Originally Posted by *ff0000T34M*
> 
> I was trying to keep from posting a long post so i might have left some info out. I have 3 390x, and 2 furyX - I am also running above 4k resolution. The 390x don't have enough power unless i run 3 /4 of them. When you include the heat, power, and Cf scaling it's harder for the 390x to achieve the FPS the FuryX can at the same settings in most of my testing. This is all subjective to the user, in my case i'm finding i am getting more mileage from the faster card with less Vram. I didn't even factor costs either, but 3 x390x roughly = 2 furyX .
> 
> If i were running 4k, or less i'm sure 390x would be plenty. In fact there are some games i will run in 4k because they are horribly optimized = i.e.Gameworks


My bad. Yah, I don't game... works.


----------



## Alastair

How does this stack up? This was done on my 1080p screen for now.
Furys in crossfire (56 CUs, 64/224/3584) @ 1000/500 with an AMD FX-8370 @ 4.95GHz.



Also guys where does one go for the USR or ultimate super resolution or whatever the downscaling feature is called?


----------



## huzzug

Quote:


> Originally Posted by *Alastair*
> 
> Also guys where does one go for the USR or ultimate super resolution or whatever the downscaling feature is called?


Go to Catalyst Control Center -> Display Properties -> check the "Enable Virtual Super Resolution" box. Then right-click on the desktop, and you'll find the higher resolution options in display properties.
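For a sense of the GPU cost involved: VSR renders at a higher internal resolution and downscales to the panel's native one, so frame time scales roughly with the pixel count. A quick look at the multipliers for the VSR modes a 1080p panel typically exposes (the mode list here is illustrative):

```python
# Pixel-count cost of rendering at common VSR target resolutions
# relative to a 1920x1080 native panel.
native = (1920, 1080)
vsr_modes = [(2560, 1440), (3200, 1800), (3840, 2160)]

factors = {}
for w, h in vsr_modes:
    factors[(w, h)] = (w * h) / (native[0] * native[1])
    print(f"{w}x{h}: {factors[(w, h)]:.2f}x the pixels of 1080p")
```

So 4K-via-VSR on a 1080p panel is roughly a 4x pixel load, which is why it leans on cards like the Fury.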


----------



## mRYellow

I've noticed that the 4GB is a non-issue on the Fury. They seem to use less VRAM.


----------



## fewness

Quote:


> Originally Posted by *mRYellow*
> 
> I've noticed that the 4GB is a non-issue on the Fury. They seem to use less VRAM.


And I have data to support that. Here is the GDDR5 vs HBM VRAM usage I recorded for the Star Wars Battlefront beta. While the Titan X's GDDR5 topped ~4GB, the Fury X's HBM did just fine with only ~3GB used.


----------



## HagbardCeline

Has anyone replaced the fan on the Fury X water block?


----------



## Alastair

Quote:


> Originally Posted by *HagbardCeline*
> 
> Has anyone replaced the fan on the Fury X water block?


Why would you? GTs (Gentle Typhoons) are the best fans out there.


----------



## By-Tor

Quote:


> Originally Posted by *Alastair*
> 
> How does this stack up? This was done on my 1080P screen for now.
> Fury's in crossfire (56 CU's 64/224/3584) @ 1000/500 with AMD FX8370 @ 4.95GHz.
> 
> 
> 
> Also guys where does one go for the USR or ultimate super resolution or whatever the downscaling feature is called?


Was curious to see how my 290Xs performed against the Furys...

This is at the same 1080p res. and settings in game.

Couldn't get print screen to work, so I had to use my camera.

I was shocked that they matched up very well to the Furys. The Furys' max FPS was much higher, but their min FPS fell way off.


----------



## wdpir32k3

Do you guys think AMD will unlock the volts on the Fury series anytime soon? I know HBM is uncharted territory and it's not matured yet.


----------



## ff0000T34M

The Fury isn't well known for great performance @ 1080p, but for a data point I'll throw mine in as well.

2x Fury X @ 1080p, stock 5960X CPU (I'm lazy, just built it, haven't had time to OC yet).


Spoiler: Warning: Spoiler!


----------



## Kana-Maru

Tomb Raider -Ultimate- @ *1080p*: average is *166.5fps*.

That's from old data using the Catalyst 15.7.1 [7/29/2015] drivers.

I agree with the 1080p statement. 1440p and 4K are where Fury shines. The 4K average is 51fps using the older drivers.


----------



## Alastair

Quote:


> Originally Posted by *By-Tor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> How does this stack up? This was done on my 1080P screen for now.
> Fury's in crossfire (56 CU's 64/224/3584) @ 1000/500 with AMD FX8370 @ 4.95GHz.
> 
> 
> 
> Also guys where does one go for the USR or ultimate super resolution or whatever the downscaling feature is called?
> 
> 
> 
> Was curious to see how my 290x's performed against the fury's...
> 
> This is at the same 1080p res. and setting in game.
> 
> Couldn't get print screen to work so had to use my camera.
> 
> I was shocked that they did very well matched up to the fury's. Fury's max. FPS was much higher, but the min. FPS fell way off.
Click to expand...

You might find your i7 gives you a slight advantage in FPS over my FX. So yeah.


----------



## xer0h0ur

Quote:


> Originally Posted by *wdpir32k3*
> 
> Do you guys think AMD will unlock the volts on the fury series anytime soon I know HBM is uncharted territory and it's not matured yet


There is no lock. The problem lies with the software coders for Trixx and Afterburner not finishing, or even starting, work on updating their respective software to modify the voltage for the Fury/Fury X. Supposedly Unwinder (Afterburner's coder) had not even received his R9 Nano yet. I have no idea if he finally got it and is making progress. W1zzard (Trixx's coder) had made one version of his software that had some level of support for voltage modification. It was rejected by Sapphire (presumably still buggy), even though W1zzard had gone ahead and published half-baked testing on his unfinished software while claiming it was done and there wasn't anything else that could be done. Problem is, he is the founder of TechPowerUp, which is a notorious Nvidia shill site. So no surprise there. I still contend he's been paid off by Nvidia to not finish it, or to delay it as long as he can. Obviously I have no proof of this, though.
Quote:


> Originally Posted by *ff0000T34M*
> 
> Fury isn't well known for great performance @ 1080p, but for a data point I'll throw mine in as well.
> 
> Fury X x2 @ 1080p, CPU a stock 5960X (I'm lazy, just built it, haven't had time to OC yet)
> 
> 
> Spoiler: Warning: Spoiler!


This. Fury/Fury X was aimed squarely at higher-resolution gaming; it's not meant for 1080p. I would even argue that you wasted your money buying enthusiast-class GPU(s) to only play at 1080p.


----------



## fewness

Quote:


> Originally Posted by *xer0h0ur*
> 
> There is no lock. The problem is that the coders behind TriXX and Afterburner haven't finished (or even started) updating their respective software to modify the voltage on Fury/Fury X. Supposedly Unwinder (Afterburner's coder) had not even received his R9 Nano yet; I have no idea if he finally got it and is making progress. W1zzard (TriXX's coder) made one version of his software with some level of support for voltage modification. It was rejected by Sapphire (presumably still buggy), even though W1zzard had gone ahead and published half-baked testing of his unfinished software while claiming it was done and there wasn't anything else that could be done. Problem is, he's the founder of TechPowerUp, which is a notorious Nvidia shill site. So no surprise there. I still contend he's been paid off by Nvidia to not finish it, or to delay it as long as he can. Obviously I have no proof of this, though.


Damn Nvidia! I knew it was you!


Spoiler: Warning: Spoiler!



Seriously?


----------



## xer0h0ur

Fury X released June 24th. W1zzard posted the results of his "testing" on his "finished" TriXX software on July 24th (it was submitted to Sapphire and subsequently rejected). Here we sit on October 17th with not so much as a peep about it since the rejection. You do the math.


----------



## By-Tor

Quote:


> Originally Posted by *Alastair*
> 
> You might find your i7 gives you a slight advantage in FPS over my FX. So yeah.


Nice.... I'm sure at higher res. the Fury would walk away...


----------



## xer0h0ur

LOL, yeah. Must be why AMD cut them off from the test card supply chain.


----------



## Forceman

Quote:


> Originally Posted by *xer0h0ur*
> 
> Fury X released June 24th. W1zzard posted the results of his "testing" on his "finished" TriXX software on July 24th (it was submitted to Sapphire and subsequently rejected). Here we sit on October 17th with not so much as a peep about it since the rejection. You do the math.


So rather than assume that the voltage control really is buggy and Sapphire (which only sells AMD cards) decided not to release it, you jumped straight to "Nvidia paid them off"? Yeah, that makes much more sense. I assume Nvidia also paid off Unwinder (RivaTuner) and Asus (GPUTweak) in this scenario?


----------



## xer0h0ur

Unwinder hadn't even received a card. He showed no interest in going out of pocket for a Fiji-based card, and he shouldn't have to; that was Sapphire dropping the ball altogether. I don't even have a clue who does Asus GPUTweak. It's not a matter of the other guys not getting anything done; it's a matter of someone who has had hardware in hand and has been working on his software for months already.


----------



## Forceman

And the point is, if it were so easy, someone else would have done it by now. It's much more likely that it just isn't as easy as it used to be, rather than some kind of Nvidia-led conspiracy.

Not that the why matters anyway; the result is the same. Just one more thing for AMD owners to "wait for".


----------



## Medusa666

So I returned my Asus Fury Strix to the reseller due to one faulty fan on the cooler. I got a new one as a replacement, but I also got the option to buy a Fury X instead.

Is the Fury X's added performance worth the extra buck, in your opinion? And how long is the life expectancy of the AIO? It looks kinda cheap.


----------



## Semel

Quote:


> 1440p and 4K is were Fury shines


4K? Please... There is not a single card that can play modern games at 4K *maxed out* at decent fps (>40-45, because anything less than 40-45 is *noticeably* worse than 60).

Some games, like The Witcher 3, are extremely unfriendly to AMD GPUs even with HairWorks disabled. My fps sometimes drops to 48-50 (woods, lots of trees) at 1080p(!) in W3, although The Witcher 3 might be a bad example because 1) it's an Nvidia title and 2) CD Projekt has always been bad at coding.

And Nvidia having an 80% market share doesn't help things...

Fury X (and especially Fury, for obvious reasons) is kinda bad OC-wise. You don't need W1zzard's voltage review to know this. Just volt-mod it, use a "special" BIOS, say, from hwbot, and see for yourself. Yeah, you could push it to a pretty good core speed, but only with a ridiculous amount of voltage and LN2 cooling...
Quote:


> Techpowerup which is a notorious Nvidia shill site.


Yet most of their AMD GPU reviews are among the best out there and pretty unbiased.


----------



## rdr09

Quote:


> Originally Posted by *Semel*
> 
> 4K? Please... There is not a single card that can play modern games at 4K *maxed out* at decent fps (>40-45, because anything less than 40-45 is *noticeably* worse than 60).
> 
> Some games, like The Witcher 3, are extremely unfriendly to AMD GPUs even with HairWorks disabled. My fps sometimes drops to 48-50 (woods, lots of trees) at 1080p(!) in W3, although The Witcher 3 might be a bad example because 1) it's an Nvidia title and 2) CD Projekt has always been bad at coding.
> 
> And Nvidia having an 80% market share doesn't help things...
> 
> Fury X (and especially Fury, for obvious reasons) is kinda bad OC-wise. You don't need W1zzard's voltage review to know this. Just volt-mod it, use a "special" BIOS, say, from hwbot, and see for yourself. Yeah, you could push it to a pretty good core speed, but only with a ridiculous amount of voltage and LN2 cooling...
> Yet most of their AMD GPU reviews are among the best out there and pretty unbiased.


Why would you assume that statement refers to using a single card for 4K? I didn't.


----------



## Alastair

Here are some 1440P results in Heaven from my two Furys.

Heaven 4.0 @ 2560x1440 downscaled to 1080P
DX11
Quality = Ultra
Tessellation = extreme
AA = 4x
Full screen

2112 @ 83.8fps ave


AA 8X
1886 @ 74.9 fps ave


----------



## Alastair

Tomb Raider. Ultimate preset. 2x Sapphire Fury Tri-X @ 1000/500, FX-8370 @ 4.95GHz

For whatever reason the min FPS seemed to be glitching, because nowhere did I see anything that low. Maybe it was just as the bench finished loading?
1920x1080 = 171.4 fps ave


2560x1440 Downscaled = 123.7 fps ave


3840x2160 Downscaled = 65.9fps ave


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Unwinder hadn't even received a card. He showed no interest in going out of pocket for a Fiji-based card, and he shouldn't have to; that was Sapphire dropping the ball altogether. I don't even have a clue who does Asus GPUTweak. It's not a matter of the other guys not getting anything done; it's a matter of someone who has had hardware in hand and has been working on his software for months already.


He can borrow my extra one. It's literally sitting here doing nothing.

I suspect, however, the same as what you said in an earlier post. Nvidia shills everywhere pushing antiquated notions like superior drivers... psh, that died completely when GeForce Experience dropped. Haven't seen a beta driver listed on the GeForce site since... "Game SOMETIMES Ready" on any major AAA release, though, amirite?


----------



## xer0h0ur

Quote:


> Originally Posted by *Forceman*
> 
> And the point is, if it were so easy, someone else would have done it by now. It's much more likely that it just isn't as easy as it used to be, rather than some kind of Nvidia-led conspiracy.
> 
> Not that the why matters anyway; the result is the same. Just one more thing for AMD owners to "wait for".


That much is a given. Either way you slice it, it's still a bad result, since here we sit waiting for any software to give voltage control over Fiji.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Unwinder hadn't even received a card. He showed no interest in going out of pocket for a Fiji-based card, and he shouldn't have to; that was Sapphire dropping the ball altogether. I don't even have a clue who does Asus GPUTweak. It's not a matter of the other guys not getting anything done; it's a matter of someone who has had hardware in hand and has been working on his software for months already.
> 
> 
> 
> He can borrow my extra one. It's literally sitting here doing nothing.
> 
> I suspect, however, the same as what you said in an earlier post. Nvidia shills everywhere pushing antiquated notions like superior drivers... psh, that died completely when GeForce Experience dropped. Haven't seen a beta driver listed on the GeForce site since... "Game SOMETIMES Ready" on any major AAA release, though, amirite?
Click to expand...

Let's swap!







You can take my Tri-X that doesn't want to unlock CUs!


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Let's swap!
> 
> 
> 
> 
> 
> 
> 
> You can take my Tri-X that doesn't want to unlock CUs!


I plan on selling it. I just find it hilarious that I, a nobody, could manage to afford and get two, while this guy can't get one to do his magic. Sounds like a poor excuse to me. But in the end, since I am using one Fury X at 1440 to game casually lately, how much this card could OC was not part of why I bought it. I would have grabbed a 980 Ti or two for that, since they have been proven in that regard.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I plan on selling it. I just find it hilarious that I, a nobody, could manage to afford and get two, while this guy can't get one to do his magic. Sounds like a poor excuse to me. But in the end, since I am using one Fury X at 1440 to game casually lately, how much this card could OC was not part of why I bought it. I would have grabbed a 980 Ti or two for that, since they have been proven in that regard.


Honestly, it's a bit ridiculous to expect some random guy who is making software for an AIB to have to buy his own cards EVERY SINGLE GEN to make the software for them. It's solely the responsibility of the AIB to get him a card. I blame the AIB completely on this one.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Honestly, it's a bit ridiculous to expect some random guy who is making software for an AIB to have to buy his own cards EVERY SINGLE GEN to make the software for them. It's solely the responsibility of the AIB to get him a card. I blame the AIB completely on this one.


Which is why I said he could borrow mine, to illustrate how ridiculous it is for this dude to say he hasn't been able to get his hands on one. Does he have 0 friends and 0 fans as well? I don't know him from a hole in the wall, but I would still let him borrow it, no joke, to solve this stupid issue.


----------



## Semel

*rdr09*
Quote:


> why would you assume that statement is referring to using a single card for 4K?


Maybe because of the way it was written?
*
Thoth420*
Quote:


> Does he have 0 friends and 0 fans as well?


He lives in Russia, and not in the most populated city. Considering the USD-RUB exchange rate, I don't think he could find someone to give him a card even in Moscow. The cards are really expensive there, shops charge much more than the official Russian price, and there's the economic crisis and all that...

And I reckon there aren't many who would be willing to risk their card by mailing it to another country.


----------



## Vlada011

I don't know what to do.
The Fury X has 4GB, the R9 390X has 8GB.
I will use the card for the next 2 years and install a nice waterblock with the EKWB Predator, but I'm not sure what to do.
The 4GB on the Fury X worries me. Why didn't AMD launch the card with 8GB?


----------



## drm8627

Quote:


> Originally Posted by *Vlada011*
> 
> I don't know what to do.
> The Fury X has 4GB, the R9 390X has 8GB.
> I will use the card for the next 2 years and install a nice waterblock with the EKWB Predator, but I'm not sure what to do.
> The 4GB on the Fury X worries me. Why didn't AMD launch the card with 8GB?


The RAM on the Fury and Fury X is still better than the 8GB on the 390X. It has much wider bandwidth, and HBM outperforms GDDR5 to the point where 4GB of HBM is still superior to 8GB of GDDR5.


----------



## Vlada011

Yeah, people try to convince me that 4GB of HBM is similar to 6GB of GDDR5, but somehow I'm not sure about that.
Anyway, the Fury X is a much stronger and better card, and it would cost me about 200€ more.
I don't need to fear pump coil whine because I will install a waterblock.
Maybe I made a mistake... I bought 2 white-LED Corsair fans. If I'm back in the Radeon club, maybe I should use all red fans.
Red or Dead!







Anyway, the light from the SBZ sound card is red. OK, I could easily sell the fans and order a red twin pack.
The real Radeon DX12 monsters arrive with HBM2 and 8GB in 2016.


----------



## drm8627

Quote:


> Originally Posted by *Vlada011*
> 
> Yeah, people try to convince me that 4GB of HBM is similar to 6GB of GDDR5, but somehow I'm not sure about that.
> Anyway, the Fury X is a much stronger and better card, and it would cost me about 200€ more.
> I don't need to fear pump coil whine because I will install a waterblock.
> Maybe I made a mistake... I bought 2 white-LED Corsair fans. If I'm back in the Radeon club, maybe I should use all red fans.
> Red or Dead!
> 
> 
> 
> 
> 
> 
> 
> Anyway, the light from the SBZ sound card is red. OK, I could easily sell the fans and order a red twin pack.


You could also get the non-X version, save a bit, and get similar performance.


----------



## Vlada011

No crippled cards. If I pay a similar price, I want to work with 100% of the chip's potential.
It's not AMD that cripples them; the brands overclock them again later. When I have a crippled card, I'd rather save the money and upgrade as soon as possible to a completely unlocked chip.
Whenever some situation shows up where I need a few more fps, for example when I can only set 2x AA instead of 4x, I would think a full chip could give me better fps.
With a completely unlocked chip, you at least know that's the maximum from that architecture.


----------



## drm8627

Quote:


> Originally Posted by *Vlada011*
> 
> No crippled cards. If I pay a similar price, I want to work with 100% of the chip's potential.
> It's not AMD that cripples them; the brands overclock them again later. When I have a crippled card, I'd rather save the money and upgrade as soon as possible to a completely unlocked chip.
> Whenever some situation shows up where I need a few more fps, for example when I can only set 2x AA instead of 4x, I would think a full chip could give me better fps.
> With a completely unlocked chip, you at least know that's the maximum from that architecture.


It's not a crippled card; it's a card with a few fewer ROPs and no watercooling. Your choice, though.


----------



## Vlada011

Does Fury have about 500 fewer shaders than Fury X?


----------



## Alastair

Quote:


> Originally Posted by *Vlada011*
> 
> Does Fury have about 500 fewer shaders than Fury X?


Firstly, while 512 disabled shaders sounds like a lot, in reality it's the ROPs and TMUs that count, as far as I am aware. Looked at as a whole, Fury on paper should be around 15% slower than Fury X. However, real-world testing has shown that the difference is more in the range of 5-7%.

Then, a normal Fury still has the potential to unlock. You can try unlocking all 8 disabled compute units into a full-fat Fury X, but in reality that kind of luck is hard to come by. You have a much better chance of unlocking 4 of the 8 disabled units, closing that 15% on-paper gap to around 7%; the real-world difference then becomes around 3-5% with 3840 shaders.

But look at it from a price perspective: do you want to spend 15% more for something that is only around 5% faster? Your choice. I made mine. I got two Tri-X's. Let me tell you, these things are fast!
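Those on-paper numbers can be sanity-checked with a quick back-of-the-envelope script. A rough sketch, assuming Fiji's 64 shaders per CU (consistent with the 56-CU 64/224/3584 config quoted earlier in the thread) and the stock 1000/1050MHz Fury/Fury X core clocks:

```python
# Back-of-the-envelope Fiji shader math (my own arithmetic, under the
# assumptions above: 64 shaders per CU, 64 CUs on a full die).
SHADERS_PER_CU = 64
FULL_DIE_CUS = 64  # Fury X: 64 CUs -> 4096 shaders

def shaders(cus: int) -> int:
    """Shader count for a Fiji card with the given number of active CUs."""
    return cus * SHADERS_PER_CU

def shader_deficit_pct(cus: int) -> float:
    """On-paper shader deficit versus the full die, in percent."""
    full = shaders(FULL_DIE_CUS)
    return 100 * (full - shaders(cus)) / full

print(shaders(56), round(shader_deficit_pct(56), 1))  # stock Fury: 3584 12.5
print(shaders(60), round(shader_deficit_pct(60), 1))  # 4 of 8 CUs unlocked: 3840 6.2

# Folding in the stock clock difference gives the rough "on paper" gap:
gap = 1 - (shaders(56) * 1000) / (shaders(64) * 1050)
print(round(100 * gap, 1))  # ~16.7
```

Shaders alone put a stock Fury 12.5% behind; the clock difference stretches that to roughly 17%, which is where the "around 15% on paper" figure comes from.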


----------



## Vlada011

Some tests say an overclocked 970 is 5% weaker than a GTX 980, but the situation in reality is completely different.


----------



## clubber_lang

Quote:


> Originally Posted by *Alastair*
> 
> Firstly, while 512 disabled shaders sounds like a lot, in reality it's the ROPs and TMUs that count, as far as I am aware. Looked at as a whole, Fury on paper should be around 15% slower than Fury X. However, real-world testing has shown that the difference is more in the range of 5-7%.
> 
> Then, a normal Fury still has the potential to unlock. You can try unlocking all 8 disabled compute units into a full-fat Fury X, but in reality that kind of luck is hard to come by. You have a much better chance of unlocking 4 of the 8 disabled units, closing that 15% on-paper gap to around 7%; the real-world difference then becomes around 3-5% with 3840 shaders.
> 
> But look at it from a price perspective: do you want to spend 15% more for something that is only around 5% faster? Your choice. I made mine. I got two Tri-X's. Let me tell you, these things are fast!


Alastair....I have the exact same card as you on the way from Newegg. I'm only getting one card, though. I am coming from 2x 7970s (which couldn't do the FreeSync thing), and since I need more single-GPU power for my race sims, do you think I made the right call? And in FPS games, should I expect pretty close to the same performance as my two 7970s were giving me? I'm pretty excited to see how much FreeSync helps in the race sims and FPS games. And hopefully that new Sapphire Tri-X card will be here within the next couple of days.


----------



## littlestereo

Quote:


> Originally Posted by *Semel*
> 
> Some games like witcher 3 are extremely unfriendly to amd gpus even with hairworks disabled. My fps sometimes drops to 48-50(woods, lots of trees) at 1080(!) in W3 albeit witcher 3 might be a bad example coz 1) nvidia 2) cd projekt has always been bad at coding.


Not sure who told you that Witcher 3 stuff, but at 4K the Fury X consistently beats the 980 Ti in The Witcher 3, the Fury handily beats the 980, the 290/X beats the 970, and the 380 beats the 960 (at 1440p and 1080p). The Witcher 3 is one of the few games that actually leverages AMD's raw hardware advantage.

http://arstechnica.com/gadgets/2015/06/amd-fury-x-reviews-show-strong-4k-performance-but-doesnt-beat-980-ti-overall/

http://www.bit-tech.net/hardware/graphics/2015/06/24/amd-radeon-r9-fury-x-review/9

http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html

http://www.extremetech.com/extreme/209665-amd-radeon-r9-fury-review-chasing-the-gtx-980s-sweet-spot/2


----------



## Vlada011

Quote:


> Originally Posted by *Alastair*
> 
> Firstly, while 512 disabled shaders sounds like a lot, in reality it's the ROPs and TMUs that count, as far as I am aware. Looked at as a whole, Fury on paper should be around 15% slower than Fury X. However, real-world testing has shown that the difference is more in the range of 5-7%.
> 
> Then, a normal Fury still has the potential to unlock. You can try unlocking all 8 disabled compute units into a full-fat Fury X, but in reality that kind of luck is hard to come by. You have a much better chance of unlocking 4 of the 8 disabled units, closing that 15% on-paper gap to around 7%; the real-world difference then becomes around 3-5% with 3840 shaders.
> 
> But look at it from a price perspective: do you want to spend 15% more for something that is only around 5% faster? Your choice. I made mine. I got two Tri-X's. Let me tell you, these things are fast!


Now I checked prices: the best I can find is the ASUS Fury for 590€ and the ASUS Fury X for 650-660€...
I think the Fury X is worth that much of an investment...
I can't wait to jump on the GeForce community to show them image quality, in a topic where hundreds of people recognized the inferior picture on GeForce; they later figured out something wasn't exactly the same.
I noticed better sharpness and detail on Radeon 10 years ago... The situation is not as drastic now as before, NVIDIA has improved picture quality, but there is a visible difference: more blur, like someone dropped the details by 20% on default settings. It's best explained as AMD on High looking the same as NVIDIA on Ultra.


----------



## Alastair

Quote:


> Originally Posted by *clubber_lang*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Firstly, while 512 disabled shaders sounds like a lot, in reality it's the ROPs and TMUs that count, as far as I am aware. Looked at as a whole, Fury on paper should be around 15% slower than Fury X. However, real-world testing has shown that the difference is more in the range of 5-7%.
> 
> Then, a normal Fury still has the potential to unlock. You can try unlocking all 8 disabled compute units into a full-fat Fury X, but in reality that kind of luck is hard to come by. You have a much better chance of unlocking 4 of the 8 disabled units, closing that 15% on-paper gap to around 7%; the real-world difference then becomes around 3-5% with 3840 shaders.
> 
> But look at it from a price perspective: do you want to spend 15% more for something that is only around 5% faster? Your choice. I made mine. I got two Tri-X's. Let me tell you, these things are fast!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Alastair....I have the exact same card as you on the way from Newegg. I'm only getting one card, though. I am coming from 2x 7970s (which couldn't do the FreeSync thing), and since I need more single-GPU power for my race sims, do you think I made the right call? And in FPS games, should I expect pretty close to the same performance as my two 7970s were giving me? I'm pretty excited to see how much FreeSync helps in the race sims and FPS games. And hopefully that new Sapphire Tri-X card will be here within the next couple of days.
Click to expand...

I can't tell you if you made the right choice; you will have to experience it for yourself. However, coming from two ye olde 6850s @ 1050MHz core and 1250MHz memory, the performance blows me away.

I can play BF4 without any frame drops at 1440P downscaled on Ultra, compared to my 6850s, which would struggle at 1080P High, especially when getting shot from behind and trying to turn rapidly.
I can play Star Citizen maxed out at 1080P (I have not tried 1440P yet), whereas my 6850s could barely manage Medium.

I don't know whether you will get the same performance as two 7970s, since a Fury X is basically double a 7970 core.
But what you won't have is the micro-stutter caused by the old bridges that 7970s used. Dunno if micro-stutter affected you; I don't notice it.
And you also obviously have the option to add a second card whenever you want.

So when you receive the card, you tell us if you think it is worth it.


----------



## ff0000T34M

So I got PLP Eyefinity working, but it took a lot of work. I think my issue was that my monitors differ in their natively supported resolutions. Once I got it working, I had screen tearing on the side panels regardless of the type of connection; I suspect it was using an odd refresh rate like 59Hz or something. If anyone else gives this a shot, I'd be curious how well it works.

Since my side portrait monitors were TN, I didn't like them, so if I can find some good VA or IPS panels I will give it another go.


----------



## HagbardCeline

Sort of crossing threads here, but I'm trying to boot from an M.2 drive and I'm getting an error saying my Sapphire R9 Fury X is not UEFI compatible. Is this merely a driver issue, or a hardware issue? The ASUS version of the Fury X *is* apparently UEFI compatible. Am I S.O.L. when it comes to booting from an M.2 drive here?


----------



## Alastair

Nope. Scratch it all! It was all a lie. My computer lied to me; it just managed to spit out better results than some of my older results.


----------



## mRYellow

Can someone confirm whether there is a difference (hardware-wise) between the Sapphire Fury and the Fury OC edition?


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Can someone confirm whether there is a difference (hardware-wise) between the Sapphire Fury and the Fury OC edition?


As far as I know, it just has a slightly higher voltage to allow for slightly better overclocking.

Can anybody please test their pair of normal Furys with 3 CUs unlocked per card? I would really like to know if anyone else can repeat the improvements I have seen.


----------



## Otterfluff

Sorry for the lack of updates. A miswired custom power supply cable from eBay fried my mainboard and everything on the PCI bus, which included two Fury Xs. So I was up for a new mainboard, two Fury Xs, and a random backup HDD, all smoked. It really put a dent in my watercooling project and in getting around to volt modding. I am still waiting on the replacement Fury Xs.

I had all the potentiometers ready to go, but it just wasn't going to happen. At least I now have a smoked Fury X to practice my soldering on.


----------



## mRYellow

Quote:


> Originally Posted by *Otterfluff*
> 
> Sorry for the lack of updates. A miswired custom power supply cable from eBay fried my mainboard and everything on the PCI bus, which included two Fury Xs. So I was up for a new mainboard, two Fury Xs, and a random backup HDD, all smoked. It really put a dent in my watercooling project and in getting around to volt modding. I am still waiting on the replacement Fury Xs.
> 
> I had all the potentiometers ready to go, but it just wasn't going to happen. At least I now have a smoked Fury X to practice my soldering on.


Ouch!


----------



## Semel

Quote:


> Originally Posted by *littlestereo*
> 
> Not sure who told you that Witcher 3 stuff, but at 4K the Fury X consistently beats the 980 Ti in The Witcher 3,


Who cares if it beats it at 4K? The framerate is unplayable, unless you consider a glorious 25-35 fps (and you WILL drop to around 25 if you go deep into the woods, with weather on top) a cinematic experience, lol.

PS: Oh, and by the way, on most of the sites you linked The Witcher 3 is not played at max settings, and I'm not talking about HairWorks (a mix of High and Ultra, SSAO). Plus, don't forget the 980 Ti overclocks like crazy.


----------



## Luftdruck

Received my replacement Fury X this weekend. It still whines a bit; I'll get over it.

It then started to rattle (this time the graphics card's chassis) due to vibrations, so I tried to open it and stripped like 2 of the 4 screws on the front plate.
Does anyone know which screws they use? I want to replace them ASAP; they might be stuck. Could someone suggest a better-fitting screwdriver?


----------



## Otterfluff

Quote:


> Originally Posted by *Semel*
> 
> Who cares if it beats it at 4K? The framerate is unplayable, unless you consider a glorious 25-35 fps (and you WILL drop to around 25 if you go deep into the woods, with weather on top) a cinematic experience, lol.
> 
> PS: Oh, and by the way, on most of the sites you linked The Witcher 3 is not played at max settings, and I'm not talking about HairWorks (a mix of High and Ultra, SSAO). Plus, don't forget the 980 Ti overclocks like crazy.


To be honest, no single card does 4K well; you need two. A pair of 390s or 390Xs, or two Furys, is the setup to own if you're serious about high-end 4K gaming. The 980 Ti is only the better option as a single card; Crossfire just scales that much better at 4K. So for 4K gaming the 980 Ti is simply not the best solution. For anything else, sure, the 980 Ti is a great card to have, but if you're aiming for 4K I would not consider it a first choice for a multi-GPU setup.

If you're really pushing for a 4K monitor and high-end gaming, then the Fury is a much more attractive option at the high end.


----------



## Semel

Quote:


> Originally Posted by *Otterfluff*
> 
> The 980 Ti is only the better option for a single card. Crossfire just scales that much better at 4K.


Yeah, I agree. Single-GPU max performance -> 980 Ti (overclocked); multi-GPU -> Crossfire.


----------



## Kana-Maru

Quote:


> Originally Posted by *Semel*
> 
> Who cares if it beats it at 4K? The framerate is unplayable, unless you consider a glorious 25-35 fps (and you WILL drop to around 25 if you go deep into the woods, with weather on top) a cinematic experience, lol.
> 
> PS: Oh, and by the way, on most of the sites you linked The Witcher 3 is not played at max settings, and I'm not talking about HairWorks (a mix of High and Ultra, SSAO). Plus, don't forget the 980 Ti overclocks like crazy.


Where are you getting 25fps from? Every review he posted had the Fury X pushing nearly 40fps *average*. I have The Witcher 3, but I never got around to benchmarking it; I never got around to benchmarking Ryse: Son of Rome either.

My Fury X at stock speeds does very well at 4K. The only game that really hits it hard is Crysis 3. Yes, the average of 24fps felt as if I was watching a Blu-ray.


----------



## Alastair

When it comes to games that can scale their resolution internally, like Battlefield 4, what is better for performance?

Using VSR in the drivers and setting 3840x2160 in game?
Or using the game's resolution scaling function and setting it to 200%?


----------



## looncraz

Quote:


> Originally Posted by *Alastair*
> 
> When it comes to games that can scale their resolution internally, like Battlefield 4, what is better for performance?
> 
> Using VSR in the drivers and setting 3840x2160 in game?
> Or using the game's resolution scaling function and setting it to 200%?


I would imagine VSR is faster, as it is purely hardware-driven and has effectively no overhead.


----------



## xer0h0ur

Quote:


> Originally Posted by *Otterfluff*
> 
> Sorry for the lack of updates. A miswired custom power supply cable from eBay fried my mainboard and everything on the PCI bus, which included two Fury Xs. So I was up for a new mainboard, two Fury Xs, and a random backup HDD, all smoked. It really put a dent in my watercooling project and in getting around to volt modding. I am still waiting on the replacement Fury Xs.
> 
> I had all the potentiometers ready to go, but it just wasn't going to happen. At least I now have a smoked Fury X to practice my soldering on.


HOLY COW! I practically want to cry for you.


----------






## Semel

Quote:


> Originally Posted by *Kana-Maru*
> 
> Where are you getting 25fps from? Every review he posted had the Fury X pushing nearly 40fps *average*.


At 4K? 40 fps average? LOL, check their settings.

Here you go: everything set to Ultra, HairWorks disabled, 4K.






40 fps average? Riiight.

On my unlocked (3840) Fury overclocked to the Fury X core clock I get the same results. I'm saying this to stress that in games, surprisingly, it's not the locked units that make Fury slower than Fury X, but the core clock.

PS: Wanna hammer your GPU? Go to White Orchard, to the ford, and into the forest, or to the Kaer Morhen forests.


----------



## xer0h0ur

There is a reason I am using 2 or 3 GPUs for 4K gaming... as of yet there isn't a single GPU capable of running every game maxed out at 4K.


----------



## ff0000T34M

Quote:


> Originally Posted by *Semel*
> 
> At 4K? 40 fps average? LOL, check their settings.
> 
> Here you go: everything set to Ultra, HairWorks disabled, 4K.
> 
> 
> 
> 
> 
> 
> 40 fps average? Riiight.
> 
> On my unlocked (3840) Fury overclocked to the Fury X core clock I get the same results. I'm saying this to stress that in games, surprisingly, it's not the locked units that make Fury slower than Fury X, but the core clock.
> 
> PS: Wanna hammer your GPU? Go to White Orchard, to the ford, and into the forest, or to the Kaer Morhen forests.


So you said 25fps, he said 40fps on average, and then you link a video showing 30-35fps... has anyone won this yet?


----------



## Semel

I said 25-35 fps, so I reckon I won.







Besides, this video doesn't feature the most GPU-intensive scenes, but even without them I saw the fps dropping to 27-29 sometimes.


----------



## Kana-Maru

Quote:


> Originally Posted by *ff0000T34M*
> 
> So you said 25fps he said 40fps on avg. Then you link a video of 30-35fps.... did anyone win this yet?


I said nearly 40fps. I was going based on the list @littlestereo posted here:

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/4730#post_24521797

As I said previously, I have yet to run my benchmarks; I'll probably get around to it later. Based on the links that @littlestereo provided, it appears the Fury X does beat the 980 Ti at 4K and averages *nearly 40fps* in the benchmarks. What more is there to say? It's not a win-or-lose situation for me.
Quote:


> Originally Posted by *Semel*
> 
> I said 25-35 fps so I reckon I won.
> 
> 
> 
> 
> 
> 
> 
> Besides this video doesn't feature the most gpu intensive scenes but even without them I saw fps dropping to 27-29 sometimes


Cookie?

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/4730#post_24521797

As I said before, the links he posted show the Fury X pushing well over what you claimed.


----------



## HagbardCeline

So Sapphire advised that I flip the vBIOS switch to the other position (in this case, from left to right as you face the big glowing Radeon). Still got the "This VGA Card is not UEFI Compatible" message. What's funny is that when you submit a support ticket, they even call the card the R9 Fury X (UEFI). It's crazy.


----------



## euxoa

I just stupidly flashed over both BIOSes on my Sapphire Fury X. I made backups with GPU-Z, but I didn't know they weren't full backups. Now all I get when I try to boot with the Fury is a black screen.

Would anyone with a Sapphire Fury X (SKU#: 21246-00) be generous enough to take a full backup of their vBIOS with atiflash and upload it somewhere for me? Any other model would work as well; at this point it can't hurt.


----------



## Thoth420

I chose the Fury X for two reasons. First, its small size (the Nano wasn't out yet when I purchased my hardware), which is necessary if I want my res where I want it in the case I'm using. Second, I was looking for a solid single GPU to drive 2560x1440 for the next couple of years. It does both quite well, and while a single 980 Ti would have done the trick too, Nvidia got my money the last two go-rounds and I loved my 6970, so I decided it was time to spread some love back to AMD. I feel they earned it by being first out with HBM.









Still deciding on a monitor and would love some suggestions. I have been waiting on Eizo Foris FreeSync but still no word on price or availability.


----------



## ff0000T34M

Quote:


> Originally Posted by *Thoth420*
> 
> I chose the Fury X for two reasons: Small Size(nano wasn't out yet when I purchased hardware) which is necessary if I want my res well....where I want it in the case I am using. The second is I was looking for a solid single GPU to drive 2560 x 1440 for the next couple years. It does both quite well and while a single 980Ti would have done the trick as well .....Nvidia got my money the last two go rounds and I loved my 6970 so I decided time to spread some love back to AMD. I feel they earned it being first out with HBM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still deciding on a monitor and would love some suggestions. I have been waiting on Eizo Foris FreeSync but still no word on price or availability.


Don't worry, no need to justify your reasons. Hell, if you said it was because you wanted to, it's a free world. I'm about to be the poster child of stupidity, as I have two more Fury Xs coming. I had always planned on three, but I want to see how four will perform. I'm primarily using a 6400x3600 resolution, but until they get here I'm testing various Eyefinity setups and resolutions. In the end I'll probably only use three and send the fourth back.


----------



## Medusa666

Can't decide which card to get, the Fury or the Fury X. The extra performance of the X looks nice, but the card has coil whine and possibly pump noise. The Asus Fury Strix I had was extremely quiet most of the time until the fan broke. On the other hand, during heavy loads the Asus card's cooler became quite noisy, which leads me to suspect that the Fury X would be quieter overall.


----------



## p4inkill3r

Quote:


> Originally Posted by *Medusa666*
> 
> Can't decide on what card to get, Fury or the Fury X,the extra performance of the X looks nice but the card has coil whine and possibly pump noise. The Asus Fury Strix I had was extremely silent most of the time until the fan broke. Another thing is that during heavy loads the Asus card became quite noisy with the cooler, leading me to suspect that the Fury X would be more silent overall.


My release-day Fury X doesn't have coil noise, and the pump noise is no different from any other AIO, which is to say barely noticeable.


----------



## xer0h0ur

Quote:


> Originally Posted by *ff0000T34M*
> 
> Don't worry, no need to justify your reasons. Hell if you said it was because you wanted too, it is a free world. I am about to be the poster child of stupidity as i have 2 more fury x's coming. I had always planned on 3 but i want to see how 4 will perform. I am using primarily 6400x3600 resolution but until they get here i am testing various eyefinity setups and resolutions. In the end i probably will only use 3 and send the fourth on back.


In the end you're going to run into a vRAM wall running those ridiculously high Eyefinity resolutions. IMO the first generation capable of handling that will be Arctic Islands (with CrossFire setups, of course).


----------



## aznguyen316

Quote:


> Originally Posted by *Thoth420*
> 
> I chose the Fury X for two reasons: Small Size(nano wasn't out yet when I purchased hardware) which is necessary if I want my res well....where I want it in the case I am using. The second is I was looking for a solid single GPU to drive 2560 x 1440 for the next couple years. It does both quite well and while a single 980Ti would have done the trick as well .....Nvidia got my money the last two go rounds and I loved my 6970 so I decided time to spread some love back to AMD. I feel they earned it being first out with HBM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still deciding on a monitor and would love some suggestions. I have been waiting on Eizo Foris FreeSync but still no word on price or availability.


In my Phanteks ITX case, the small PCB of the Sapphire Fury + waterblock allowed me to put my res/pump combo where it is now, whereas a longer card would not. Love this size and GPU (it's been fully unlocked to Fury X as well).


----------



## ff0000T34M

Quote:


> Originally Posted by *xer0h0ur*
> 
> In the end you're going to run into a vRAM wall running those ridiculously high eyefinity resolutions. IMO the first generation that will be capable of handling that will be Arctic Islands (with crossfired setups of course).


I respect your view, but having spent some time with my cards, vRAM is not my issue. I don't intend to play games at these resolutions with everything maxed out; the vRAM issue is being blown out of proportion for most users. Could I make vRAM an issue for my usage? Sure I could. Also, it's obvious the next generation of GPUs will handle this better. They should, or we're in trouble.


----------



## HagbardCeline

Just curious whether anyone has gotten a Fury X to work with UEFI boot/CSM disabled. Sapphire's webpage even refers to it as a UEFI card, and their support people basically told me they had no idea and to contact ATI. Gee whiz. The ASUS flavor of the card has a UEFI BIOS available, but I was hesitant to flash my Sapphire card with the ASUS BIOS (even though they're the same card) without knowing for sure it was safe to do so.


----------



## Thoth420

Quote:


> Originally Posted by *aznguyen316*
> 
> In my Phanteks ITX case, the small PCB of the Sapphire Fury + waterblock allowed me to put my res/pump combo where it is now, where as a longer card would not allow that. Love this size and GPU (it's been fully unlocked to Fury X as well)


I had the same plan for the h440 and the small form factor allowed placement in the window as opposed to hidden.


----------



## ff0000T34M

Screwing around with PLP on the Fury, and while it looks ghetto (put together from what I had lying around), at least it works, I suppose. I don't know that I'd buy monitors just for this setup, but I'm going to try it out a bit with the Fury and see.

6000x2160 res: 40-inch middle and 2x 23-inch edges (de-bezeled). They're TN, so yeah, I would definitely replace them if I went this route.

Quick Valley run, ultra, no AA. All stock for now; haven't even gotten to overclocking yet.
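For anyone curious how that desktop size adds up, here's a quick sanity-check sketch. The panel resolutions are my assumption (a 3840x2160 40-inch centre with 1920x1080 23-inch sides in portrait), which the sizes above suggest:

```python
# Sanity-checking the PLP (portrait-landscape-portrait) desktop size above.
# Assumed panels: a 3840x2160 centre flanked by two 1920x1080 panels rotated
# to portrait, so each side contributes its native height (1080 px) as width.

def plp_resolution(centre=(3840, 2160), side=(1920, 1080)):
    cw, ch = centre
    sw, sh = side
    # Rotated sides add their native height to the desktop width; desktop
    # height stays that of the landscape centre panel.
    return (cw + 2 * sh, ch)

print(plp_resolution())  # (6000, 2160), matching the resolution above
```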


----------



## Lorem Ipsum

Quote:


> Originally Posted by *Medusa666*
> 
> Can't decide on what card to get, Fury or the Fury X,the extra performance of the X looks nice but the card has coil whine and possibly pump noise. The Asus Fury Strix I had was extremely silent most of the time until the fan broke. Another thing is that during heavy loads the Asus card became quite noisy with the cooler, leading me to suspect that the Fury X would be more silent overall.


Perhaps the next thing to try would be a Sapphire Fury?

I'm trying to pick a Fiji card as well, and I'm very picky about acoustics. So my concern with the Sapphire card is the coil whine mentioned. But from some sound recordings, like this one, I note the Asus card also has a coil whine:





And seeing how much quieter Sapphire's cooler is in this comparison:





...which leads me to wonder whether the Asus's louder fans simply mask its coil whine, though I imagine the whine is different since the Asus has a different 12-phase custom power delivery system.

I might well buy a Sapphire Fury in the next few days, if I do I'll let you know how it sounds.


----------



## diggiddi

Quote:


> Originally Posted by *ff0000T34M*
> 
> Screwing around with PLP with fury and while its ghetto(put together what i had lying around)looking at least it works i suppose. I don't know that i would buy monitors just for this setup but im going to try it out a bit with fury and see.
> 
> 6000x2160 res - 40inch middle and 2x23inch edges(de-bezzeled). they are TN so yeah i would definitely replace them if i went this route.
> 
> 
> quick valley run ultra, no


Noice setup


----------



## Alastair

Quote:


> Originally Posted by *Lorem Ipsum*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Medusa666*
> 
> Can't decide on what card to get, Fury or the Fury X,the extra performance of the X looks nice but the card has coil whine and possibly pump noise. The Asus Fury Strix I had was extremely silent most of the time until the fan broke. Another thing is that during heavy loads the Asus card became quite noisy with the cooler, leading me to suspect that the Fury X would be more silent overall.
> 
> 
> 
> Perhaps the next thing to try would be a Sapphire Fury?
> 
> I'm trying to pick a Fiji card as well, and I'm very picky about acoustics. So my concern with the Sapphire card is the coil whine mentioned. But from some sound recordings, like this one, I note the Asus card also has a coil whine:
> 
> 
> 
> 
> 
> And seeing how much quieter Sapphire's cooler is in this comparison:
> 
> 
> 
> 
> 
> ...Leads me to wonder if the Asus's booming fans don't just mask its coil whine, though I imagine the whine is different as the Asus has a different 12-phase custom power delivery system.
> 
> I might well buy a Sapphire Fury in the next few days, if I do I'll let you know how it sounds.

To be honest, as the owner of two Sapphire Tri-Xs, only my top card has any sort of coil whine. It isn't even whine; it's more like coil "buzz". The main thing, however, is that 99% of the time I don't hear anything from the card, and when I do, it doesn't bother me. My machine sits 1.5 meters away from me and I can barely hear the buzz even with my side panel off. With the panel on I don't hear a thing.


----------



## xer0h0ur

Quote:


> Originally Posted by *ff0000T34M*
> 
> I respect your view, having spent some time with my cards Vram is not my issue. I don't intend to play games at these resolutions with everything maxed out. The vram issue is being blown out of proportion for most users. Can i make Vram an issue for my usage? Sure i could. Also it's obvious the next generation gpu's can run something better. They should or were in trouble.


Oh no, I never said anything about GPU power. I was strictly talking about vRAM needs while gaming at those Eyefinity resolutions. Obviously you can lower a game's settings to avoid vRAM issues, but frankly I'm not the type of person who sacrifices visual fidelity for higher resolution. I want both, so if you want high game settings at those resolutions it's not a matter of GPU horsepower; it's strictly a matter of how much vRAM you have at your disposal, since it's already a given you'll need multiple GPUs to handle those Eyefinity resolutions anyway.


----------



## ff0000T34M

GPU power and vRAM go hand in hand. I was just using 4x 290X, and where I had vRAM issues, the Fury X doesn't. I also tried a 390X, and in spots where vRAM had been an issue for my 290X, the GPU power couldn't make up for it. So the 290X and 390X gave just about the same performance: even though I could turn up settings because I had more vRAM, the fps was still too low. The Fury is doing better because the GPU horsepower is there. Yeah, I can tap out the vRAM if I want by using, say, AA. Graphical fidelity isn't exactly straightforward either; just because a game offers all these settings doesn't mean they're good. Most games offer post-processing options, and all they add is motion blur, depth-of-field blur, more blur, oh, and more blur, we hope you don't notice.

It's really down to your usage; as you say, you want max settings. There are many games that run very poorly at max settings even at 1080p or 4K, so obviously going higher isn't going to work. Do you use AA in all games at 4K?
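To put rough numbers on why AA taps out vRAM at high resolutions, here's a back-of-envelope sketch (my own illustrative figures, not numbers from this thread): MSAA multiplies the size of every multisampled render target by its sample count.

```python
# Back-of-envelope render-target sizes. bytes_per_pixel=4 assumes a common
# 32-bit colour (or depth) format; real engines keep many more buffers than
# two, so these are floor estimates, not totals.

def target_mib(width, height, bytes_per_pixel=4, samples=1):
    return width * height * bytes_per_pixel * samples / 2**20

# One colour + one depth buffer at 4K, no AA vs 4x MSAA:
no_aa = 2 * target_mib(3840, 2160)              # ~63 MiB
msaa4 = 2 * target_mib(3840, 2160, samples=4)   # ~253 MiB, 4x the footprint
print(round(no_aa), round(msaa4))  # prints: 63 253
```

On a 4GB card, that multiplier is exactly the kind of headroom that disappears once textures and geometry are loaded on top.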


----------



## mRYellow

Quote:


> Originally Posted by *HagbardCeline*
> 
> So Sapphire advised that I switch the vBIOS switch to the other position. (In this case, from Left to Right as you face the big glowing Radeon) still got the "This VGA Card is not UEFI Compatible) message. What's funny is that when you submit a support ticket, they even call the card the R9 Fury X (UEFI). It's crazy.


Here are my backup files:
https://www.dropbox.com/sh/vriws0ur05oi47v/AABoHZXaf398MDiLI_LdQdhpa?dl=0


----------



## xer0h0ur

Quote:


> Originally Posted by *ff0000T34M*
> 
> Gpu power and vram go hand in hand. I have 4x290x i was just using, and where i had vram issues the furyx doesn't. I also tried 390x and while spots where vram had been an issue for my 290x the gpu power couldn't make up for it. So 290x and 390x gave the same performance just about and so even though i could turn up settings cause i had more vram the fps was still too low. Fury is doing better because the gpu horsepower is there. Yeah i can tap out the vram if i want by using say AA. Graphical fidelity is not exactly straight forward either, just because a game offers all these settings doesn't mean they are good. Most games offer post processing options and all it does is add, motion blur, depth of blur, more blur oh and more blur we hope you don't notice.
> 
> Its really up to your usage, as you say you want max settings. There are many games that run very poorly at max settings even at 1080 or 4k. Obviously going higher isn't going to work. Do you use AA in all games at 4k?


Hell to the no. I don't even touch AA in anything other than DX9 titles, where all that horsepower is sitting there idling. Running AA at 4K barely makes a discernible difference, as I'm sure you've already noticed running Eyefinity. I presume a panel's pixel density comes into play here too, but I can't claim that with certainty. As I understand it, Fiji and Hawaii manage vRAM differently, so I'm not surprised you're able to get better use of your vRAM on the Fiji cards than you did on either of your Hawaii-generation cards.


----------



## diggiddi

Quote:


> Originally Posted by *ff0000T34M*
> 
> Gpu power and vram go hand in hand. I have 4x290x i was just using, and where i had vram issues the furyx doesn't. I also tried 390x and while spots where vram had been an issue for my 290x the gpu power couldn't make up for it. *So 290x and 390x gave the same performance just about* and so even though i could turn up settings cause i had more vram the fps was still too low. Fury is doing better because the gpu horsepower is there. Yeah i can tap out the vram if i want by using say AA. Graphical fidelity is not exactly straight forward either, just because a game offers all these settings doesn't mean they are good. Most games offer post processing options and all it does is add, motion blur, depth of blur, more blur oh and more blur we hope you don't notice.
> 
> Its really up to your usage, as you say you want max settings. There are many games that run very poorly at max settings even at 1080 or 4k. Obviously going higher isn't going to work. Do you use AA in all games at 4k?


Cool, I'm relieved to hear this because I went with the 290X instead of the 390. Congrats on your first rep, BTW.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ff0000T34M*
> 
> Gpu power and vram go hand in hand. I have 4x290x i was just using, and where i had vram issues the furyx doesn't. I also tried 390x and while spots where vram had been an issue for my 290x the gpu power couldn't make up for it. So 290x and 390x gave the same performance just about and so even though i could turn up settings cause i had more vram the fps was still too low. Fury is doing better because the gpu horsepower is there. Yeah i can tap out the vram if i want by using say AA. Graphical fidelity is not exactly straight forward either, just because a game offers all these settings doesn't mean they are good. Most games offer post processing options and all it does is add, motion blur, depth of blur, more blur oh and more blur we hope you don't notice.
> 
> Its really up to your usage, as you say you want max settings. There are many games that run very poorly at max settings even at 1080 or 4k. Obviously going higher isn't going to work. Do you use AA in all games at 4k?


I have tested both the 290's 4GB of vRAM and the 390's 8GB, with a single GPU and with CrossFire for each series of cards, and I completely disagree with everything you have said here... and I have tested all of the above in 1080p, 1440p, 4K, and higher VSR scenarios.

The 8GB on the 390 series DOES indeed solve the vRAM issues associated with the Hawaii cards. Also, adding more Fiji GPUs will NOT help with HBM's vRAM limitations. I haven't tested that myself, but I have read it from legit sources and will post them when found.

There are, however, some good ways around vRAM limitations; one of them is to turn off your pagefile, or move it to a RAMDISK.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have tested both the 290's 4GB of VRAM and the 390's 8GB VRAM, with single GPU and with CF for each series of cards, and I completely disagree with everything you have said here..... and I have tested all of the above in 1080, 1440, 4k, and higher VSR scenarios.
> 
> The 8GB on the 390 series DOES indeed solve the VRAM issues associated with the Hawaii cards.
> Also, adding Fiji GPU's will NOT help with any VRAM limitations with HBM, which I have not tested myself, but have read this information from legit sources, and will post when found.
> 
> There are however some good ways around VRAM limitations..... one of them is to turn off your pagefile, or move it to a RAMDISK


I've had a RAMdisk for the last 18 months. Are you saying this is part of why we may never have issues with RAM on our GPUs?


----------



## ff0000T34M

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have tested both the 290's 4GB of VRAM and the 390's 8GB VRAM, with single GPU and with CF for each series of cards, and I completely disagree with everything you have said here..... and I have tested all of the above in 1080, 1440, 4k, and higher VSR scenarios.
> 
> The 8GB on the 390 series DOES indeed solve the VRAM issues associated with the Hawaii cards.
> Also, adding Fiji GPU's will NOT help with any VRAM limitations with HBM, which I have not tested myself, but have read this information from legit sources, and will post when found.
> 
> There are however some good ways around VRAM limitations..... one of them is to turn off your pagefile, or move it to a RAMDISK


I think maybe I'm being misunderstood here. I'm not saying the 390 series doesn't help with its extra vRAM. What I'm saying is that with the extra vRAM my settings didn't change much, because the GPU was already struggling. Since Fiji has more GPU power, depending on the game I was able to either raise a few settings or keep them the same and get much better fps. I can use higher settings than on my 290X in some titles where vRAM seemed to be a hard limit. For my usage I found Fiji gives me more headroom than the 390X or 290X. This isn't a blanket statement, an attempt to persuade anyone to buy something, or me trying to justify my purchase; what works for me doesn't work for others. When I look at the costs for my setup, upgrading from 290X to 390X gave me almost nothing in return, while going further up the ladder to the Fury X gave me more to work with. Again, this is my personal scenario and doesn't fit everyone else.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ff0000T34M*
> 
> I think maybe i am misunderstood on this. I am not saying 39x series doesn't help with the extra vram. What i am saying is with the extra vram my settings didn't change much because the gpu was already struggling. While fiji has more gpu power i was able to depending on the game to actually either up a few settings or keep them the same and get way better fps. I can use higher settings than my 290x on some titles where vram seemed to be a hard limit. For my usage i found the fiji gives me more room than 390x or 290x. This is not a blanket statement or to persuade someone to get something and i am not trying to justify my purchase. What works for me doesn't work for others. When i look at costs for my setup, the price for upgrading from *290x to 390x* gave me almost nothing in return. Going *further up the ladder furyx gave me more to work with*. This is again is my personal scenario, and doesn't fit everyone else.


Now I get you...


----------



## euxoa

Quote:


> Originally Posted by *HagbardCeline*
> 
> Just curious whether anyone has gotten a Fury X to work with a UEFI Boot/CSM disabled. Sapphire's webpage even refers to it as a UEFI card, and their support people basically told me they had no idea and to contact ATI. Gee whiz. The ASUS flavor of the card has UEFI bios available, but I was hesitant to flash my Sapphire card with the ASUS BIOS (even though they're the same card) without knowing for sure it was safe to do so.


I wouldn't recommend flashing the ASUS BIOS. I tried that in order to gain UEFI support and ended up with an unusable card.

I ended up contacting Sapphire customer support to get a copy of the original BIOS, and instead received a BIOS file that has UEFI support.

The files in this post also have UEFI support.
Quote:


> Originally Posted by *mRYellow*
> 
> Here's my backup files
> https://www.dropbox.com/sh/vriws0ur05oi47v/AABoHZXaf398MDiLI_LdQdhpa?dl=0


----------



## Mega Man

There's this thing called a BIOS switch that would help you if you do that:
1. Shut off the PC.
2. Flip the switch.
3. Restart the PC (it will boot from the good BIOS).
4. Flip the switch back and reflash the bad BIOS.


----------



## HagbardCeline

Quote:


> Originally Posted by *euxoa*
> 
> I wouldn't recommend flashing the ASUS BIOS. I tried that in order to gain UEFI support and ended up with an unusable card.
> 
> I ended up contacting Sapphire customer support in order to get a copy of the original BIOS, and instead I've received a BIOS file that has UEFI support.
> 
> The files in this post also have UEFI support.


After Sapphire shunted me off to ATI, I received a zip file with a new BIOS and ATIWINFLASH 2.71. I'm curious to see whether the BIOS files in that Dropbox link match what the AMD guy sent. I already flashed the card, and at least it wasn't bricked. I haven't had a chance to try booting with CSM disabled yet, though.

I noticed that with CSM off I can't see my M.2 drive, which has me worried, since that was the whole reason I was trying to UEFI boot.


----------



## Medusa666

Guys,

I really need some help today. If anyone knows the quality of the Fury X PCB and the components used for VRM power delivery etc., and can compare it to the Asus Super Alloy II used in their Fury Strix, please reply.

Also, what is the expected lifespan of an AIO solution such as the one on the Fury X? How is the quality compared to other similar AIOs?

The reason I'm asking is that today is the last day I can return my Fury Strix for another card, and I'm looking at the Fury X or the Sapphire Fury.

Acoustics are important to me. It has to be relatively silent, but quality and longevity are also up there, since I plan to keep the card for 3-5 years depending on the demands of games being released.

Edit: Another question: which card, the Strix or the Sapphire Fury, do you believe has the better-quality fans? Are they similar? The reason I returned my first Fury Strix was that the third fan started making a rattling noise after ten days of use.


----------



## Kana-Maru

Quote:


> Originally Posted by *Medusa666*
> 
> Guys,
> 
> I really need some help today, if anyone knows the quality of the Fury X PCB and components used for VRM power delivery etc and can give me an answer in comparison with Asus Super Alloy II used in their Fury Strix please reply.


Can't help you with that.
Quote:


> Also, what is the expected lifespan of an AIO solution such as the one on the Fury X? How is the quality compared to other similar AIO?


Well, if it's anything like my CPU AIO, you have nothing to worry about for many years. Cooler Master is a name brand and isn't new to AIO cooling solutions.
Quote:


> Reason I'm asking is because today is the last day I can return my Fury Strix for another card, and I'm looking at the Fury X or the Sapphire Fury.


I'm rocking the Asus Fury X. No issues, no coil whine, no noise. I purposely turned everything off and ran benchmarks; the only noise came from the nearly silent fan, and temps were well below 50C. Good luck finding one, though. It took forever to find mine in stock.
Quote:


> Acoustics are important to me, it has to be relative silent, but quality and longevity is also up there since I plan to keep the card for 3-5 years depending on the demands of games being released.


Same here. My PC is on my desk right next to me. It's silent, and I don't wear gaming headsets. You can control the fan speed; initially it's set to 70 if the GPU gets too hot [which is never!]. I turned the fan up to 100 just to see if it would annoy me, and it doesn't; you can barely hear it. I'm shooting for three years myself. My last GTX 670s lasted three years, but I couldn't deal with Nvidia's buggy drivers anymore. The picture quality seems to be much better with AMD as well; I can definitely see the difference.


----------



## ff0000T34M

Quadfire Fury X up and running. Haven't done much yet, but here are some Heaven benches.

4K used the top-30 thread settings (4K, 2x, tess: extreme/ultra):


Spoiler: Warning: Spoiler!






Accidental oddball resolution: 2048x1536, max/ultra:


Spoiler: Warning: Spoiler!






1080 max/ultra


Spoiler: Warning: Spoiler!







CPU is still stock because I'm on air ATM; haven't had time to revamp my watercooling setup since it had the 290s in with the CPU.

Edited for spoilers


----------



## Alastair

Quote:


> Originally Posted by *ff0000T34M*
> 
> quadfire furyx up and running, havent done much but here are some heaven benches
> 
> 4k used top 30 thread settings(4k, 2x tess:extreme/ultra
> 
> accidental obddball resolutions 2048x1536 max/ultra
> 
> 1080 max/ultra
> 
> 
> Cpu is still stock cause im on air ATM. havent had time to revamp my watercool setup since it had 290s in with cpu.


damn that's fast.


----------



## ff0000T34M

DAI 6400x3600 settings used:


Spoiler: Warning: Spoiler!









In-battle shots for fps snapshots:
47.9fps


Spoiler: Warning: Spoiler!






47.9fps


Spoiler: Warning: Spoiler!






44.9


Spoiler: Warning: Spoiler!






39.5fps


Spoiler: Warning: Spoiler!







Some random shots; quality is only as good as the JPEGs, sadly:


Spoiler: Warning: Spoiler!










Usually it's 50-60 fps outside, except in battle or very intense spots; rarely have I seen it go sub-40 yet. It plays pretty well so far, but this will be one of the few games that seem to scale out to four cards. The screenshots don't do it justice, of course; they're quality-reduced JPEGs and all that.

On the Crossover 44K I'm using 2x2 Eyefinity across the four-way split screen, so I get 1:1 pixel ratios at 1080p per quadrant. The scaling is really good above that, though I'm sure I'm limited by PPI past that point. I can say that, depending on the game, textures look quite a bit better than at 1080p, and marginally better than standard 4K. The biggest improvement is aliasing, thanks to downsampling. At any rate this isn't much of a demo, but the Fury does have some power behind it.
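The quadrant arithmetic above works out like this (a sketch; the 1920x1080-per-quadrant figure is my assumption, based on the 1:1 mapping described for a four-way-split 4K panel):

```python
# 2x2 Eyefinity over a 4K panel's four-way split: each quadrant is driven as
# its own 1920x1080 input, which tiles back to the panel's native 3840x2160.

def eyefinity_grid(cols, rows, panel=(1920, 1080)):
    w, h = panel
    return (cols * w, rows * h)

native = eyefinity_grid(2, 2)       # (3840, 2160): 1:1 pixel mapping
vsr_desktop = (6400, 3600)          # the higher desktop resolution mentioned above
scale = vsr_desktop[0] / native[0]  # ~1.67x supersampling per axis when downsampled
```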


----------



## 350mdk

Hi there, I own two of them and they seem to be fine. Mine are the XFX versions.


----------



## 350mdk

These are my scores.


----------



## HagbardCeline

Managed to get ATI to send me a UEFI-compatible vBIOS for the Fury X. Huzzah. (And thus I'm finally rocking the M.2 boot drive.)


----------



## Lorem Ipsum

I bought a Sapphire Fury today and so far am extremely impressed!



Quote:


> Originally Posted by *Medusa666*
> 
> Guys,
> 
> I really need some help today, if anyone knows the quality of the Fury X PCB and components used for VRM power delivery etc and can give me an answer in comparison with Asus Super Alloy II used in their Fury Strix please reply.
> 
> Also, what is the expected lifespan of an AIO solution such as the one on the Fury X? How is the quality compared to other similar AIO?
> 
> Reason I'm asking is because today is the last day I can return my Fury Strix for another card, and I'm looking at the Fury X or the Sapphire Fury.
> 
> Acoustics are important to me, it has to be relative silent, but quality and longevity is also up there since I plan to keep the card for 3-5 years depending on the demands of games being released.
> 
> Edit: Another question, what card of the Strix or the Sapphire Fury do you believe have the best quality fans on the gpu? Are they similar? Reason I returned my first fury strix was that the third fan started making a rattling noise after ten days of use.


I've spent about three hours with my ear to the Sapphire Fury, I really struggle with high pitched noises (hyperacusis) so component acoustics are extremely important to me.

The fan noise is extremely quiet. Previously, under full load, my GPU has always been the loudest thing in the case. But with this card, the Fractal Define case fans and BeQuiet cooler fans are louder than the fans on the Sapphire Fury. It's not an abrasive noise either, but a quiet hum. I can no longer easily tell if the GPU is under load or idle from how the computer sounds, which is excellent. There is no rattling to report yet.

From the reviews, what I was more worried about was VRM noise. At idle, there's no noise whatsoever. At load, if the case is open, I can hear what sounds like a fast clicking (it's not a whine, buzz is a closer word to describe it). If the case is closed I can still hear it, but only with my ear to the back vent.

I have spent quite a long time trying to get the best recording of the VRM noise. I put my mic right up against the PCB under full load, but recorded immediately after the load was applied, so the card was under full load but the fans had not spun up yet and the noise was at its most noticeable. Even then, I needed to use the 'Amplify' effect in Audacity to hear it:

http://vocaroo.com/i/s0UC62RSaEaz

This card has no more VRM noise than any of my previous cards (for reference, it sounds very much like the Asus GTX 760), so I'm OK with it. However, some people seem to have had coil whine with this card, so I suspect it's a bit of a lottery. If you're worried, then maybe you should buy from somewhere with an unwanted-goods return policy?


----------



## ff0000T34M

Quote:


> Originally Posted by *Lorem Ipsum*
> 
> I've spent about three hours with my ear to the Sapphire Fury, I really struggle with high pitched noises (hyperacusis) so component acoustics are extremely important to me.
> 
> The fan noise is extremely quiet. Previously, under full load, my GPU has always been the loudest thing in the case. But with this card, the Fractal Define case fans and BeQuiet cooler fans are louder than the fans on the Sapphire Fury. It's not an abrasive noise either, but a quiet hum. I can no longer easily tell if the GPU is under load or idle from how the computer sounds, which is excellent. There is no rattling to report yet.
> 
> From the reviews, what I was more worried about was VRM noise. At idle, there's no noise whatsoever. At load, if the case is open, I can hear what sounds like a fast clicking (it's not a whine, buzz is a closer word to describe it). If the case is closed I can still hear it, but only with my ear to the back vent.
> 
> I have spent quite a long time trying to get the best recording of the VRM noise. I put my mic right up against the PCB under full load, but recorded immediately after the load was applied, so the card was under full load but the fans had not spun up yet and the noise was at its most noticeable. Even then, I needed to use the 'Amplify' effect in Audacity to hear it:
> 
> http://vocaroo.com/i/s0UC62RSaEaz
> 
> This card has no more VRM noise than any of my previous cards (for reference, it sounds very much like the Asus GTX 760), so I'm OK with it. However, some people seem to have had coil whine with this card, so I suspect it's a bit of a lottery. If you're worried, then maybe you should buy from somewhere with an unwanted-goods return policy?


I have to say, after having four Sapphires in a row, these are very quiet indeed. When loaded I can barely hear the cards, and my case is open-air right next to me, about a foot and a half away at head level. If I were to complain about anything, it would be the slosh the cards make every so often from the water inside. They easily rival my custom water-cooling kit on both noise and temps. I rarely see them go over 45 or 50°C.

This has to be the best build quality I've seen to date on a GPU, granted it's my first pre-water-cooled GPU. I love the LED logos and usage meter on the card. It makes me want to keep my case side off just to look. I wonder if the coil noise some report really is related to the PSU. I thought for sure I'd run into one with whine or pump noise, given the crap storm people stirred up.


----------



## fewness

Quote:


> Originally Posted by *ff0000T34M*
> 
> DAI 6400x3600 settings used:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> in battle shots for fps snapshots
> 47.9fps
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 47.9fps
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 44.9
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 39.5fps
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> some random shots quality is only as good as the JPEGs sadly:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> usually it's 50-60fps outside, except in battle or very intense spots. Rarely have I seen it go sub-40s yet. It plays pretty well so far, but this will be one of the few games that seem to scale out to 4 cards. The screenshots don't do it any justice, of course; they are quality-reduced JPEGs and all that.
> 
> On the Crossover 44K I am using 2x2 Eyefinity on a 4-way split screen, so I get a 1:1 pixel ratio at 1080p per quadrant. The scaling is really good, though above that I am sure I'm limited by PPI. I can say that, depending on the game, textures do look quite a bit better than at, say, 1080p, and marginally better than standard 4K. The biggest improvement is aliasing, thanks to downsampling. At any rate this isn't much of a demo, but Fiji does have some power behind it.


Quote:


> Originally Posted by *ff0000T34M*
> 
> quadfire furyx up and running, havent done much but here are some heaven benches
> 
> 4k used top 30 thread settings(4k, 2x tess:extreme/ultra
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> accidental oddball resolution 2048x1536 max/ultra
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 1080 max/ultra
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> CPU is still stock because I'm on air ATM. Haven't had time to revamp my water-cooling setup since it had the 290s in with the CPU.
> 
> Edited for spoilers


Sir, I want to see your build. Share some pictures?


----------



## buildzoid

So has anyone here had success with voltage on the HBM? I pushed mine from 1.35V to 1.375V and that did absolutely nothing for clocks. I think you need to go below 0°C to get the HBM to scale.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> So has anyone here had success with voltage on the HBM? I pushed mine from 1.35V to 1.375V and that did absolutely nothing for clocks. I think you need to go below 0°C to get the HBM to scale.


How did you do that? New version of MSI AB or something similar available now?


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> How did you do that? New version of MSI AB or something similar available now?


I figured out how to hard-mod the reference-PCB Fury cards for VHBM and VCC. If the 290X is anything to go by, the Fury will never get support for HBM voltage control in software. VCC should be supported in software some time soon: W1zz has a version of Trixx that supports it, so it's just a matter of time before Sapphire releases it.

Here's the guide I made for volt-modding the reference Fury and Fury X PCB. I didn't want to post it, since someone had already posted it in the Voltmod subforum, so I figured everyone here knew about it.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> I figured out how to hard-mod the reference-PCB Fury cards for VHBM and VCC. If the 290X is anything to go by, the Fury will never get support for HBM voltage control in software. VCC should be supported in software some time soon: W1zz has a version of Trixx that supports it, so it's just a matter of time before Sapphire releases it.
> 
> Here's the guide I made for volt-modding the reference Fury and Fury X PCB. I didn't want to post it, since someone had already posted it in the Voltmod subforum, so I figured everyone here knew about it.


Wish I could do that....


----------



## Agent Smith1984

Guys,

HAS ANYONE USED THE XFX Triple D Fury??

I have a good opportunity for a great price on one and would love some input....


----------



## looncraz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> HAS ANYONE USED THE XFX Triple D Fury??
> 
> I have a good opportunity for a great price on one and would love some input....


http://www.newegg.com/Product/Product.aspx?Item=N82E16814150757

Seems pretty decent.


----------



## Agent Smith1984

Quote:


> Originally Posted by *looncraz*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150757
> 
> Seems pretty decent.


That's the one I bought.....

Didn't want to put the deal up before I purchased cause I had the feeling they'd sell out before I went to hit the damn button!!


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> HAS ANYONE USED THE XFX Triple D Fury??
> 
> I have a good opportunity for a great price on one and would love some input....


go for it and you tell us!


----------



## buildzoid

The XFX card is just a rebrand of the Power Color card. The heat sink should be pretty good and the PCB is the reference AMD design.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> go for it and you tell us!


DONE!!!

If it whines too bad I'll send 'er back and get back on the 390 train


----------



## Agent Smith1984

Quote:


> Originally Posted by *buildzoid*
> 
> The XFX card is just a rebrand of the Power Color card. The heat sink should be pretty good and the PCB is the reference AMD design.


Good to know there may be a chance of some partial unlocking then....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Guys,
> 
> HAS ANYONE USED THE XFX Triple D Fury??
> 
> I have a good opportunity for a great price on one and would love some input....


That looks like an awesome card.


----------



## Agent Smith1984

I am sitting at work right now, looking at my order confirmation, and literally ready to jump out of my skin in excitement.

I thought the $520 price on the Fury Triple D was pretty awesome considering that's what the GTX 980 sells for. With some clocking, possible unlocking, and the hope that some day we get voltage control, I might even get close to 980 Ti performance with this bad boy....


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> So has anyone here had success with voltage on the HBM? I pushed mine from 1.35V to 1.375V and that did absolutely nothing for clocks. I think you need to go below 0°C to get the HBM to scale.


In my limited experience the HBM overclocked a lot better if you can drop the temperature. When I routed my stock Fury X cooler into a 420mm Monsta radiator I was able to get 600MHz HBM clocks on stock voltage under FurMark, with 41°C temperatures under load. I was only able to get half that under the stock cooling, with a 67°C load temperature. I really think you need to look at a custom loop to bring the thermals down to push HBM.

In other good news, I got my two replacement Fury Xs yesterday (Asus this time), so maybe in the next week I can finish plumbing them in and look at volt-modding them.


----------



## Agent Smith1984

Look at this statement on the XFX website for the Triple D card:

"XFX Voltage Control Technology

Scale up to four GPUs with AMD CrossFire and amplify your system's graphics

Complete control over the power of your card. XFX knows the enthusiast gamer wants to squeeze every last ounce of performance out of the card, our voltage control technology allows you to fine tune your card to push it to the limit. Thanks to AMD Overdrive Technology, you can tweak the card right within the AMD Catalyst Control center, no extra software required

- See more at: http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-triple-dissipation-r9-fury-4tf9#sthash.132yFhW3.dpuf
"

WTH??? Voltage control through AMD Overdrive??? What is this nonsense they speak of?


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> In my limited experience the HBM overclocked a lot better if you can drop the temperature. When I routed my stock Fury X cooler into a 420mm Monsta radiator I was able to get 600MHz HBM clocks on stock voltage under FurMark, with 41°C temperatures under load. I was only able to get half that under the stock cooling, with a 67°C load temperature. I really think you need to look at a custom loop to bring the thermals down to push HBM.
> 
> In other good news, I got my two replacement Fury Xs yesterday (Asus this time), so maybe in the next week I can finish plumbing them in and look at volt-modding them.


So my suspicion was correct: HBM needs cold first and then voltage. The HBM chips are 2W each, right? Because if they're that low-power, it might be possible to make a custom heatsink that integrates TECs for each of the chips and keeps them at lower temps. I'll also try to see what happens if you run them on less voltage. They're specced for 1.2V, after all.


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Look at this statement on the XFX website for the Triple D card:
> 
> "XFX Voltage Control Technology
> 
> Scale up to four GPUs with AMD CrossFire and amplify your system's graphics
> 
> Complete control over the power of your card. XFX knows the enthusiast gamer wants to squeeze every last ounce of performance out of the card, our voltage control technology allows you to fine tune your card to push it to the limit. Thanks to AMD Overdrive Technology, you can tweak the card right within the AMD Catalyst Control center, no extra software required
> 
> - See more at: http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-triple-dissipation-r9-fury-4tf9#sthash.132yFhW3.dpuf
> "
> 
> WTH??? Voltage control through AMD Overdrive??? What is this nonsense they speak of?


Hmm...maybe someone needs to give XFX a call and ask them?


----------



## buildzoid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Look at this statement on the XFX website for the Triple D card:
> 
> "XFX Voltage Control Technology
> 
> Scale up to four GPUs with AMD CrossFire and amplify your system's graphics
> 
> Complete control over the power of your card. XFX knows the enthusiast gamer wants to squeeze every last ounce of performance out of the card, our voltage control technology allows you to fine tune your card to push it to the limit. Thanks to AMD Overdrive Technology, you can tweak the card right within the AMD Catalyst Control center, no extra software required
> 
> - See more at: http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-triple-dissipation-r9-fury-4tf9#sthash.132yFhW3.dpuf
> "
> 
> WTH??? Voltage control through AMD Overdrive??? What is this nonsense they speak of?


Define "voltage control."

If you underclock a Fury, the core voltage drops in proportion to how much you've underclocked the card. You can do that through CCC, so they're probably referring to that.
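As an illustration of that proportional behaviour, here's a quick sketch (the stock clock and voltage below are made-up example numbers for the illustration, not measured Fiji values):

```python
# Illustrative sketch of proportional clock/voltage scaling when
# underclocking through CCC. The stock figures below are assumptions
# for the example, not measured Fiji values.
STOCK_CLOCK_MHZ = 1050
STOCK_VCORE = 1.20  # volts

def estimated_vcore(target_clock_mhz):
    # Voltage assumed to scale linearly with the clock ratio.
    return STOCK_VCORE * (target_clock_mhz / STOCK_CLOCK_MHZ)

print(round(estimated_vcore(950), 3))  # underclock to 950 MHz -> 1.086 V
```

So dropping the clock ~10% would drop the core voltage ~10% under this model, which is "voltage control" only in the loosest marketing sense.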


----------



## p4inkill3r

Yeah, but nobody means that.


----------



## buildzoid

Sounds exactly like something a marketing department would come up with, IMO. AFAIK CCC doesn't support voltage control on any GPU, so why would it support it on the XFX Fury?


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> Sounds exactly like something a marketing department would come up with, IMO. AFAIK CCC doesn't support voltage control on any GPU, so why would it support it on the XFX Fury?


Curse you with your wretched logic!


----------



## Agent Smith1984

It seems as though all hope is lost for voltage control on Fury....

However, a Fury for $520 with a massive cooler was just too good to pass up on


----------



## buildzoid

You can always just do hard mods....

I'll be getting a 4 way Fury X setup soon. I'll be doing testing up to 1.45V core and I'll see if I can't do something about the HBM. Who would I go to for a custom waterblock?


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> You can always just do hard mods....
> 
> I'll be getting a 4 way Fury X setup soon. I'll be doing testing up to 1.45V core and I'll see if I can't do something about the HBM. Who would I go to for a custom waterblock?


EKWB, Swiftech, Alphacool and Aqua Computer all have waterblocks out for the Fury boards.

I have two of the EK blocks and they are very solid. There are no reviews comparing the blocks yet, but the Alphacool should perform like all their other blocks, since the main cooling block is interchangeable across their models.

I would honestly go for either the EK or the Aqua Computer. The EK block will be cheaper.

For both brands, go for the nickel-plated versions, as they tarnish less and, in previously reviewed blocks, perform a tiny bit better, maybe a degree cooler.

Currently the Aqua Computer block does not have a backplate; the EKWB block does. It's debatable whether it's worth getting one to help cool the other side of the core/VRM with the heat pads they provide, but the backplate on the EKWB looks really nice. I am using the nickel EK backplate to mount a veroboard with my potentiometers on the back side of my Fury X.


----------



## buildzoid

I need custom water blocks, as in one-of-a-kind blocks. I wanna try putting TECs on the HBM chips.


----------



## Otterfluff

Oh I see what you mean.

Why do you want to cool the HBM separate from the core?


----------



## Arizonian

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It seems as though all hope is lost for voltage control on fury....
> 
> However, a fury for $520 with a massive cooler was just too good to pass up on


I have a chance to sell a 780TI for $300 possibly and if I do I'm moving to the TRI X Fury or new Gigabyte Windforce Fury out soon.

After buying two 970's and that VRAM farce along with new GFE being forced down my throat debacle I'll be excited to have main rig (at least) back to AMD.









Congrats on your XFX DD Fury, naturally we'll want to know your thoughts on it and looking forward to it.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Oh I see what you mean.
> 
> The HBM is on the die right? Is it the small brown chips that surround the die? I know the four nobs from the core are the interposer's.


The HBM is the 4 small rectangles around the core. The interposer is the thing under them and the core. Each of the stacks should put out about 2W at stock, so it might be possible to put a small 5-10W TEC on each and keep them below 20°C, which should help them clock and possibly make them scale with voltage.

2 of these covering only the HBM stacks should be enough. However, I do see a real problem in that heat from the GPU core will travel through the interposer into the HBM stacks, so this idea might be completely stupid and pointless. It will also raise the GPU's overall power draw by about 40W, and the gains might be negligible, but hey, that's what overclocking is all about.

I guess I could do a mock-up where I cut a copper shim to fill the gap between the core and the stock water block. If that gives OK-ish results I'll upgrade it to the full-on custom block. I think I'll only do this to one card as a proof of concept.
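For what it's worth, the power figures above can be sanity-checked with some back-of-envelope arithmetic. The ~2W-per-stack number comes from the post itself; the TEC coefficient of performance is purely an assumption for the sketch, not a datasheet value:

```python
# Back-of-envelope TEC budget for cooling the four HBM stacks.
# HEAT_PER_STACK_W is the ~2W figure from the post; TEC_COP is an
# assumed coefficient of performance, not a datasheet value.
HBM_STACKS = 4
HEAT_PER_STACK_W = 2.0   # watts of heat each stack produces at stock
TEC_COP = 0.25           # assumed: watts of heat moved per watt of TEC input

tec_input_per_stack = HEAT_PER_STACK_W / TEC_COP        # 8 W of input each
total_extra_draw = HBM_STACKS * tec_input_per_stack     # 32 W of TEC input
# The hot side must dump the heat moved plus the TEC's own input power:
hot_side_load = HBM_STACKS * (HEAT_PER_STACK_W + tec_input_per_stack)

print(total_extra_draw, hot_side_load)  # 32.0 40.0
```

Under these assumptions the TECs themselves draw ~32W and the water block has to sink ~40W on the hot side, which is in the same ballpark as the rough 40W extra-draw figure mentioned above.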


----------



## xer0h0ur

Quote:


> Originally Posted by *HagbardCeline*
> 
> Managed to get ATI to send me UEFI compatible vBIOS for the Fury X. Huzzah. (and thus I'm finally rocking the M.2. boot drive)


Praise Gaben. Happy you finally have it working as it should have from the get go.


----------



## rdr09

Currently in Dubai and saw a sapphire fury for $440. i'm tempted.


----------



## looncraz

Quote:


> Originally Posted by *rdr09*
> 
> Currently in Dubai and saw a sapphire fury for $440. i'm tempted.


Make sure you won't get nailed with importation taxes!


----------



## xer0h0ur

Gentlemen. Hope is not lost. W1zzard may be sitting on a working version of Trixx but a Fiji card did finally get through Mother Russia and found its way into Unwinder's hands. He said both Fiji and the 980 Ti Lightning will get voltage support in Afterburner 4.2. *slow clap*


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> Gentlemen. Hope is not lost. W1zzard may be sitting on a working version of Trixx but a Fiji card did finally get through Mother Russia and found its way into Unwinder's hands. He said both Fiji and the 980 Ti Lightning will get voltage support in Afterburner 4.2. *slow clap*


If it's only +100mV like the 290X was, then it won't do much, since the Fury scales pretty badly with voltage. On stock voltage mine did 1100MHz; with extra voltage it does 1145MHz. Oh, and that's with the core below 60°C. If you run the core at 70 or 80°C the scaling is even worse.


----------



## rdr09

Quote:


> Originally Posted by *looncraz*
> 
> Make sure you won't get nailed with importation taxes!


thanks for the heads up. the computer stores here dwarf Microcenter. lol

can't find a Fury X, though.


----------



## p4inkill3r

Quote:


> Originally Posted by *buildzoid*
> 
> If it's only +100mV like the 290X was, then it won't do much, since the Fury scales pretty badly with voltage. On stock voltage mine did 1100MHz; with extra voltage it does 1145MHz. Oh, and that's with the core below 60°C. If you run the core at 70 or 80°C the scaling is even worse.


I don't care at this point if all we get is 100mv, I just want to move the slider!


----------



## ff0000T34M

Quote:


> Originally Posted by *p4inkill3r*
> 
> I don't care at this point if all we get is 100mv, I just want to move the slider!


Just think: if this were a game, you would have to pay for the voltage-slider-unlock DLC. Or in early access, the voltage slider would be "added later."


----------



## p4inkill3r

Quote:


> Originally Posted by *ff0000T34M*
> 
> Just think if this was a game, you would have to pay for the voltage slider unlock DLC. or Early access voltage slider will be added later.


Eh, not really an apt comparison IMO. When I think back to all the BIOS flashing of yesteryear and the associated headaches, we really have it easy now.









I have run Firestrike Extreme @ 1125/585 and have lots of headroom left thermally; I like to think that another 100mV could put me at 1200/600-625.


----------



## Otterfluff

This is a practice run at volt-modding a Fury X, using a dead Fury.



I can use this as a reference for my two new Asus Fury Xs. Planning to run the wires all to the top left, then solder them into a veroboard.

That's red Plasti Dip holding the wires down.


----------



## ozyo

so I installed my second GPU.
I don't know how, but it overclocked itself to 38755984 MHz:
http://www.techpowerup.com/gpuz/details.php?id=d2be2

now that's what I call a golden GPU


----------



## Mega Man

Quote:


> Originally Posted by *buildzoid*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Gentlemen. Hope is not lost. W1zzard may be sitting on a working version of Trixx but a Fiji card did finally get through Mother Russia and found its way into Unwinder's hands. He said both Fiji and the 980 Ti Lightning will get voltage support in Afterburner 4.2. *slow clap*
> 
> 
> 
> If it's only +100mV like the 290X was, then it won't do much, since the Fury scales pretty badly with voltage. On stock voltage mine did 1100MHz; with extra voltage it does 1145MHz. Oh, and that's with the core below 60°C. If you run the core at 70 or 80°C the scaling is even worse.

psst. it has always been like that. why do you think reference-only and water are always recommended?
i will add, i have unlimited control on my 290x (as much voltage as i want) and +300 on my 295s, so.... i dunno what you mean?
Quote:


> Originally Posted by *Otterfluff*
> 
> This is a practice run at volt-modding a Fury X, using a dead Fury.
> 
> I can use this as a reference for my two new Asus Fury Xs. Planning to run the wires all to the top left, then solder them into a veroboard.
> 
> That's red Plasti Dip holding the wires down.


but moreover, why did you kill it?!?!?!?!


----------



## Otterfluff

Quote:


> Originally Posted by *Mega Man*
> 
> but moreover, why did you kill it?!?!?!?!


I didn't kill it, the evil modular PSU cable from eBay did D=

Note to self: always check custom cables for correct wiring with a multimeter first


----------



## Mega Man

i once switched the 12v with the ground on a gpu, and it's one of the many reasons i love my seasonic. all my video cards are still functional

the seasonic cut out immediately: no damage, no short


----------



## skkane

Quote:


> Originally Posted by *Otterfluff*
> 
> I didn't kill it, the evil modular PSU cable from eBay did D=
> 
> Note to self: always check custom cables for correct wiring with a multimeter first


I tried to use my Corsair gen3 green cables on the new Sirtec PSU and was wondering why it would not start. Sounded like some protections were being tripped inside the PSU.







Luckily my 980 Tis and everything else survived (I had EVERYTHING wired up with the wrong cables). Sirtec cables have inverted 12V/ground pins compared to the Corsair ones. I was really lucky that everything survived and worked fine with the proper cables.

Yes, always measure with a multimeter, or straight up don't try if you can't. PSU cables don't mix from brand to brand (except in some rare cases). Unlucky to have the card die


----------



## skkane

Quote:


> Originally Posted by *Mega Man*
> 
> i once switched the 12v with the ground on a gpu, and it's one of the many reasons i love my seasonic. all my video cards are still functional
> 
> the seasonic cut out immediately: no damage, no short


Damn. So you think the Sirtec saved me as well? The first time, the fans started spinning and the thing wanted to start, but then I heard a "clank" sound from the PSU and everything turned off. It would not start again after that (no fans or anything other than the mobo LED light).

It's a cheapo PSU, but Sirtec is a medium-quality brand. I've got the Astro GD 1200W; at $160 new (thank you, JonnyGuru) I could not resist it, since my HX1000 was aging and couldn't hold my setup anymore.


----------



## xer0h0ur

Quote:


> Originally Posted by *buildzoid*
> 
> If it's only +100mV like the 290X was, then it won't do much, since the Fury scales pretty badly with voltage. On stock voltage mine did 1100MHz; with extra voltage it does 1145MHz. Oh, and that's with the core below 60°C. If you run the core at 70 or 80°C the scaling is even worse.


Actually, people routinely went past 100mV on the 2XX series, but I don't believe that was with Afterburner. I know that I can run more than 100mV on my 290X, but my 295X2 won't give me that option (within Afterburner).


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Gentlemen. Hope is not lost. W1zzard may be sitting on a working version of Trixx but a Fiji card did finally get through Mother Russia and found its way into Unwinder's hands. He said both Fiji and the 980 Ti Lightning will get voltage support in Afterburner 4.2. *slow clap*


Good news!

My Fury is arriving Tuesday. Hopefully we get that new AB soon so I can really push this thing!


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> Actually people routinely went past 100mV on the 2XX series but I don't believe that was with Afterburner. I know that I can run more than 100mV on my 290X but my 295X2 won't give me that option (within Afterburner).


You can use some code to go up to 300mv on AB.

Trixx allows up to 200mv by default.


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> You can use some code to go up to 300mv on AB.
> 
> Trixx allows up to 200mv by default.


*gets quarter chub*

Could you point me in the direction of this code? I wouldn't mind giving it another go to see how much more I can push my cards with more voltage. +100mV on the 295X2 and +150mV on the 290X got me decent overclocks, but I'm curious how much more I can push.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> *gets quarter chub*
> 
> May you point me in the direction of this code? I wouldn't mind giving it another go at how much more I can push my cards with more voltage. +100mV on the 295X2 and +150mV on the 290X got me decent overclocks but I am curious how much more I can push it then.


It's part of the OP in the 290/290X thread.

How to give more volts in MSI Afterburner, by OCN member sugarhell:

ADD more volts to MSI AB Guide
Source

Just use /wi4,30,8d,10 for 100mV. The offset step is 6.25mV and the value is given in hexadecimal, so 0x10 is 16 in decimal, and 16 × 6.25 = 100mV. For 50mV you need 8. For 200mV you need 20 (0x20 = 32 in decimal, and 32 × 6.25 = 200mV).

The easy way to make changes:

Create a txt file on the desktop. Write

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

and then save it as a .bat file. Every time you start this bat file, Afterburner will start with +100mV.

For 50mV: 8
For 100mV: 10
For 125mV: 14
For 150mV: 18
For 175mV: 1C
For 200mV: 20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency
2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
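The hex arithmetic in that guide can be sketched as a quick helper (the 6.25mV-per-step figure comes straight from the guide; the function name and the error check are mine):

```python
# Compute the hex offset argument for the Afterburner /wi command from a
# desired extra voltage, using the 6.25 mV-per-step figure from the guide.
STEP_MV = 6.25

def wi_offset(extra_mv):
    """Return the offset as a hex string, e.g. 100 mV -> '10'."""
    steps = extra_mv / STEP_MV
    if steps != int(steps):
        raise ValueError("extra_mv must be a multiple of 6.25")
    return format(int(steps), 'x')

for mv in (50, 100, 125, 150, 175, 200):
    print(mv, wi_offset(mv))
# 50 -> 8, 100 -> 10, 125 -> 14, 150 -> 18, 175 -> 1c, 200 -> 20
```

Handy for double-checking an offset before pasting it into the .bat file; the values it prints match the table in the guide.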


----------



## buildzoid

For the reference 290X PCB there were also the PT1 and PT3 BIOSes from Shamino that went up to 2V using a modified GPUTweak.

HBM doesn't seem to scale with voltage at all. Going from 1.2V core to 1.3V core on mine did nothing for HBM. Giving the HBM 25mV extra also did absolutely nothing.


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> It's part of the OP in the 290/290X thread.
> 
> How to give more volts in MSI Afterburner, by OCN member sugarhell:
> 
> ADD more volts to MSI AB Guide
> Source
> 
> Just use /wi4,30,8d,10 for 100mV. The offset step is 6.25mV and the value is given in hexadecimal, so 0x10 is 16 in decimal, and 16 × 6.25 = 100mV. For 50mV you need 8. For 200mV you need 20 (0x20 = 32 in decimal, and 32 × 6.25 = 200mV).
> 
> The easy way to make changes:
> 
> Create a txt file on the desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, Afterburner will start with +100mV.
> 
> For 50mV: 8
> For 100mV: 10
> For 125mV: 14
> For 150mV: 18
> For 175mV: 1C
> For 200mV: 20
> 
> I wouldn't go over this point because:
> 1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency
> 2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total
> 
> By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
> MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10


+rep, upboated. Kappa.


----------



## xer0h0ur

Quote:


> Originally Posted by *buildzoid*
> 
> For the reference 290X PCB there were also the PT1 and PT3 BIOSes from Shamino that went up to 2V using a modified GPUTweak.
> 
> HBM doesn't seem to scale with voltage at all. Going from 1.2V core to 1.3V core on mine did nothing for HBM. Giving the HBM 25mV extra also did absolutely nothing.


People have noted, though, that keeping the HBM cool enough allowed them to push higher clocks on it. I still agree with the opinion that overclocking HBM doesn't provide a noticeable enough result to make it worth going to that extreme.


----------



## Mega Man

Quote:


> Originally Posted by *buildzoid*
> 
> For the reference 290X PCB there was also the PT1 and PT3 BIOSs from Shamino that went up to 2V using a modified GPUTweak.
> 
> HBM doesn't seem to scale with voltage at all. Going from 1.2V core to 1.3V core on my did nothing for HBM. Giving HBM 25mv extra also did absolutely nothing.


he is correct


----------



## ff0000T34M

Ran a couple of 5K quad Fury X benchmarks. I want to do Fiji scaling tests, but I have to rebuild my Eyefinity setup every time I drop a GPU from CrossFire (probably some sort of bug), so it gets to be a pain in the butt.

5k 5120x2880 - avg 60.6fps
Valley ultra/4xAA


Spoiler: Warning: Spoiler!






5k 5120x2880 - avg 42fps
Heaven 4.0 ultra/tess:extreme 4xAA


Spoiler: Warning: Spoiler!







5k 5120x2880 - avg 60fps
Tomb Raider ultimate, vsync


Spoiler: Warning: Spoiler!







no vsync - avg 97.1fps


Spoiler: Warning: Spoiler!







Gonna take some time this weekend and get my build together as much as possible; I want to get an OC on my CPU soon. Quadfire is very temperamental as usual, so it's limiting benchmarks for sure, as many games get negative or bad CF scaling. I'm also impatient right now, as I'd rather play than benchmark. Just figured we don't have much data on Fiji in this thread, sadly, and reviews get outdated.


----------



## xer0h0ur

Did you never see DG Lee's comprehensive testing of Fury X single, crossfire, tri-fire and quadfire? He's also not the only one to do testing on quad Fury X's, but as far as I know he did the most in-depth testing of it. Your testing is a bit more niche than that, though. It's already a small niche to run quadfire. You're further slimming it down by testing quadfire in Eyefinity @ 5K.


----------



## ff0000T34M

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you never see DG Lee's comprehensive testing of Fury X single, crossfire, tri-fire and quadfire? He's also not the only one to do testing on quad Fury X's, but as far as I know he did the most in-depth testing of it. Your testing is a bit more niche than that, though. It's already a small niche to run quadfire. You're further slimming it down by testing quadfire in Eyefinity @ 5K.


I get it, dude; you seem to constantly have something to say when I post. Seeing as I have Fury X's and post in the Fury thread and you don't, it kind of doesn't make sense. Especially when you have nothing but negative or very condescending posts. Since it's that important to you, I will not post here anymore. Have a nice day.


----------



## Medusa666

Quote:


> Originally Posted by *ff0000T34M*
> 
> I get it, dude; you seem to constantly have something to say when I post. Seeing as I have Fury X's and post in the Fury thread and you don't, it kind of doesn't make sense. Especially when you have nothing but negative or very condescending posts. Since it's that important to you, I will not post here anymore. Have a nice day.


I for one really enjoyed your posts, please do not stop but continue : )


----------



## xer0h0ur

Can someone else tell me what I said that was offensive?


----------



## xer0h0ur

Quote:


> Originally Posted by *ff0000T34M*
> 
> I get it, dude; you seem to constantly have something to say when I post. Seeing as I have Fury X's and post in the Fury thread and you don't, it kind of doesn't make sense. Especially when you have nothing but negative or very condescending posts. Since it's that important to you, I will not post here anymore. Have a nice day.


I'm not sure why you thought I was being negative or condescending, but just to be clear, I was merely saying that quadfired Fury X's is a small niche, and even more so when testing at 5K Eyefinity. You would be, as far as I know, the only person testing that specific setup, so I wasn't discounting your testing or the information you're providing in the least. In fact, I was highlighting what was unique to your testing and mentioned someone else who had also done thorough quadfire testing. To me your testing in particular is interesting, since you would most likely be running newer drivers than the guys who had previously done quadfire testing. It's always interesting to see the performance difference from one driver to the next, particularly with a unique/expensive setup like yours. It's not like users like you grow on trees.


----------



## Otterfluff

Quote:


> Originally Posted by *ff0000T34M*
> 
> I get it, dude; you seem to constantly have something to say when I post. Seeing as I have Fury X's and post in the Fury thread and you don't, it kind of doesn't make sense. Especially when you have nothing but negative or very condescending posts. Since it's that important to you, I will not post here anymore. Have a nice day.


I was reading your stuff too. I do not think xer0 was trying to single you out but I understand how you felt that way. I hope you come back to join and contribute.


----------



## Medusa666

So after considering what to get I finally ordered another (and hopefully last) Fury X.

I contacted the retailer and made sure that this batch was new; they received it in October, so it can be 2-3 weeks old. Hopefully the card I receive won't have the pump whine. If it does, I'll send it back for a refund and consider my options.


----------



## Skinnered

Interesting, ff0000T34M. I have a Dell UP2715K (too?) and run both Fury X CF (dual) and Titan X SLI interchangeably, and I'm wondering how 3- or 4-way CF would scale at 5K. Please post more.

I also ran into what I think is a PCIe bandwidth problem. Some games all seem to run into a wall at 39.9 fps and perform very jittery, and those are mostly older DX9 games.
I am on a Z87 platform, so I only have 2x PCIe 3.0 at x8. Enough for 4K, but at 5K I think it isn't sufficient.
I have a 5930K and an Asus X99-E WS incoming though.
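As a rough sanity check on the PCIe hunch (assumed numbers: ~985 MB/s of usable bandwidth per PCIe 3.0 lane after encoding overhead, 32-bit pixels, and ignoring compression and other traffic), one can estimate how many raw 5K frames an x8 link could move per second:

```python
# One uncompressed 32-bit 5K frame, in bytes
frame_bytes = 5120 * 2880 * 4              # ~59 MB

# Usable PCIe 3.0 bandwidth: ~985 MB/s per lane
pcie_x8_bytes_per_s = 8 * 985e6

print(f"frame size: {frame_bytes / 1e6:.0f} MB")
print(f"max raw frames/s over x8: {pcie_x8_bytes_per_s / frame_bytes:.0f}")
```

That works out to roughly 130 raw frame copies per second, so bandwidth alone probably isn't a hard 39.9 fps wall; but add AFR frame transfers, texture streaming, and driver traffic on top, and an x8 link gets much tighter at 5K than at 4K.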


----------



## xer0h0ur

I believe crossfire performance in DX9 games is pretty well fubared for AMD cards.


----------



## Skinnered

Quote:


> Originally Posted by *xer0h0ur*
> 
> I believe crossfire performance in DX9 games is pretty well fubared for AMD cards.


Yes, in some spots in certain games single-GPU performance is better, Arcania and Dark Souls 2 for example, but the 39.9 fps issue may be something different. I'll have to see how it goes on an X99 platform.

Also, post-process shader injectors like ReShade seem to hit CF harder than SLI. I don't know if it's just the shader or the extra VRAM load on the already "tiny" 4 GB.


----------



## the9quad

Quote:


> Originally Posted by *xer0h0ur*
> 
> I believe crossfire performance in DX9 games is pretty well fubared for AMD cards.


Crossfire in a LOT of games is fubared for AMD cards. Probably the same for SLI as well.


----------



## Neon Lights

When a game (including a DirectX 9 one) that can actually use two or more GPUs gets lower FPS in certain areas, it is most of the time because of a CPU limit.


----------



## xer0h0ur

Reality is that AMD's DX9 driver overhead is less than stellar. They made strides in DX11 performance but, for all intents and purposes, seem to have stopped trying to improve DX9 performance. Now apparently they need to improve Linux/SteamOS driver performance too, since the SteamOS benchmarking showed Fury to be weaker than a GTX 950. Ouch.


----------



## Mega Man

Or they could not care about Linux. Sorry, but AMD needs to put their money where it really matters, and Linux isn't it.


----------



## Thoth420

My GPU is still waiting for a 2560x1440 FreeSync IPS with a 30 to 144 Hz range.

C'mon Eizo....


----------



## Agent Smith1984

So...

My XFX Triple D Fury is arriving tomorrow....

Anything I need to know about these?

I know I'm listening for coil whine and will probably try to "burn it out" with some high-FPS game screens if it's really bad... but is there anything else to look out for?

I know it won't OC very well, so I'm not expecting much more than 1050-1090 there until we get some voltage control, but are there any tips or tricks for overclocking the HBM?

Also, are there any drivers that seem to do better than others at the moment?

I was reluctant to get this card, but it was between this for $520 and a GTX 980 variant in the $500-520 range, and this card should be handily faster than the 980 in most cases.

The 980ti and Fury X were both out of my price range, so this seemed to be about the most performance I could manage for the money....


----------



## buildzoid

You can probably get something between 1080 and 1130 on the core. For the HBM, just keep it cool and you should hit between 540 and 600; the cooler it is, the more you can clock it. You should also try the core unlocks to try to get 3776 or 3840 stream processors.


----------



## Agent Smith1984

Quote:


> Originally Posted by *buildzoid*
> 
> You can probably get something between 1080 and 1130 on the core. For the HBM, just keep it cool and you should hit between 540 and 600; the cooler it is, the more you can clock it. You should also try the core unlocks to try to get 3776 or 3840 stream processors.


The core unlocking is definitely something I am going to try.

How does clock scaling with the stock 3584 shaders compare to having more shaders and possibly not clocking as well?

Trial and error on my end, I'm sure, but I didn't know if anyone had any good comparisons anywhere regarding 3584, 3776, and 3840 at different clock speeds.

I have read that some cards can't OC as high with unlocked shaders....


----------



## buildzoid

It differs from card to card. You'll have to test the performance of your max OC with as many cores unlocked as possible and your max OC with the normal core count.
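One rough way to frame that trade-off is theoretical shader throughput, stream processors times clock (the clock figures below are purely hypothetical examples, not measured limits):

```python
# Theoretical FP32 throughput for Fiji (GCN): shaders * 2 ops/clock
def gflops(shaders, mhz):
    return shaders * 2 * mhz / 1000

# Hypothetical max-stable clocks for each unlock level
configs = [(3584, 1100), (3776, 1070), (3840, 1050)]
for shaders, mhz in configs:
    print(f"{shaders} SP @ {mhz} MHz -> {gflops(shaders, mhz):.0f} GFLOPS")
```

By this yardstick 3584 @ 1100 (~7885 GFLOPS) and 3840 @ 1050 (~8064 GFLOPS) are only a couple of percent apart, which is why testing each card's actual max, as noted above, is the only way to know.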


----------



## Mega Man

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So...
> 
> My XFX Triple D Fury is arriving tomorrow....
> 
> Anything I need to know about these?
> 
> I know I'm listening for coil whine and will probably try to "burn it out" with some high-FPS game screens if it's really bad... but is there anything else to look out for?
> 
> I know it won't OC very well, so I'm not expecting much more than 1050-1090 there until we get some voltage control, but are there any tips or tricks for overclocking the HBM?
> 
> Also, are there any drivers that seem to do better than others at the moment?
> 
> I was reluctant to get this card, but it was between this for $520 and a GTX 980 variant in the $500-520 range, and this card should be handily faster than the 980 in most cases.
> 
> The 980ti and Fury X were both out of my price range, so this seemed to be about the most performance I could manage for the money....


I have bad news,

the list is long.

here is what you need to know





(I find it ironic that most of these kids look underage.)
then

http://siberian-crown.rosinter.com/about/how-drink/

Congrats, you know everything now!


----------



## buildzoid

Looks like the video was made in the EU, so they just have to be 18.


----------



## Mega Man

Yet IMO most look to be 16 or less....

The most important one, though, is Guinness. It must be Guinness or get out!!!


----------



## By-Tor

mmmm Guinness.....


----------



## Agent Smith1984

Quote:


> Originally Posted by *Mega Man*
> 
> Yet IMO most look to be 16 or less....
> 
> The most important one, though, is Guinness. It must be Guinness or get out!!!


I haven't had a Guinness in about ten years, but I'm so damn excited to get this new card, I may just get a six pack for the head!

If Fury is a headache, it wouldn't be the first time a piece of hardware offered up a challenge, but if the list is really that long, I'll can it and go 390 crossfire....


----------



## Gamedaz

* I'm soon gonna be picking up an XFX Triple D R9 Fury as well.

* I'm switching from NVIDIA because their drivers caused my GPU to brick... The memory clocks possibly got stuck and didn't throttle down properly, so it cooked the onboard memory. It was a Gainward GTX 780 Ti Phantom, with the best cooler on the market: three fans that kept it at a stable 72-78C max. This happened when I was testing a Star Wars beta game, which required me to update to the latest 358 drivers. The whole system crashed; after reboot I had to reinstall a new OS, only to find I still had green lines from bad drivers that had corrupted memory or something on the card.

* AMD drivers are compatible with all games and do not require driver updates for them to work properly. NVIDIA drivers are too complicated IMO; they cater to too many people and neglect older cards for stability, meaning a driver could be more stable on a 980 than on a 780 or under. That means I would have to purchase a 980 in order to play newer games without stability issues, which IMO is too inconsistent to keep upgrading to a new GPU every 2 years.

* NVIDIA cards also cater to the above-1080p-60-FPS crowd; people that use 1440p or 4K displays should benefit from an NVIDIA card and its higher clocks to better improve frame rates (which does not mean much for a 1080p set). Right now I just need 1080p 60 Hz. When 4K TVs become more affordable and practical (OLED only, no backlit tech), then I will switch back to NVIDIA.

* Until then I plan on installing it (XFX R9 Fury Triple D) into my Steam system and expect consistent results.

Agent Smith1984: Could you confirm whether the XFX card has the LED GPU tachometer as well? Newegg's reviews seem to state this feature for this card. I understand it's mostly a gimmick, but when I look into my case, if there are any issues I would like to see whether the card is working properly or just idling, so it has some substance to it.

Either way, this card's three fans should help keep temps low and push out stable 1080p 60 Hz frame rates with most titles I have.


----------



## rdr09

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Gamedaz*
> 
> * I'm soon gonna be picking up an XFX Triple D R9 Fury as well.
> 
> * I'm switching from NVIDIA because their drivers caused my GPU to brick... The memory clocks possibly got stuck and didn't throttle down properly, so it cooked the onboard memory. It was a Gainward GTX 780 Ti Phantom, with the best cooler on the market: three fans that kept it at a stable 72-78C max. This happened when I was testing a Star Wars beta game, which required me to update to the latest 358 drivers. The whole system crashed; after reboot I had to reinstall a new OS, only to find I still had green lines from bad drivers that had corrupted memory or something on the card.
> 
> * AMD drivers are compatible with all games and do not require driver updates for them to work properly. NVIDIA drivers are too complicated IMO; they cater to too many people and neglect older cards for stability, meaning a driver could be more stable on a 980 than on a 780 or under. That means I would have to purchase a 980 in order to play newer games without stability issues, which IMO is too inconsistent to keep upgrading to a new GPU every 2 years.
> 
> * NVIDIA cards also cater to the above-1080p-60-FPS crowd; people that use 1440p or 4K displays should benefit from an NVIDIA card and its higher clocks to better improve frame rates (which does not mean much for a 1080p set). Right now I just need 1080p 60 Hz. When 4K TVs become more affordable and practical (OLED only, no backlit tech), then I will switch back to NVIDIA.
> 
> * Until then I plan on installing it (XFX R9 Fury Triple D) into my Steam system and expect consistent results.
> 
> Agent Smith1984: Could you confirm whether the XFX card has the LED GPU tachometer as well? Newegg's reviews seem to state this feature for this card. I understand it's mostly a gimmick, but when I look into my case, if there are any issues I would like to see whether the card is working properly or just idling, so it has some substance to it.
> 
> Either way, this card's three fans should help keep temps low and push out stable 1080p 60 Hz frame rates with most titles I have.






The 780 Ti is getting there. It could very well be the age of the card. Also, NVIDIA shines at lower res, not the other way around.


----------



## Mega Man

I agree NVIDIA is far better at 1080p than AMD. Not saying your card can't do it, just that it will get better when you do go to a big monitor.


----------



## mRYellow

Just wanted to add in my experience with Fury.

I could easily game at 1080 on the core. I didn't notice any visual anomalies except in Far Cry 3.
I dropped the core by 10 and it went away. Weird, as it was the only game that picked something up.


----------



## Gamedaz

Quote:


> Originally Posted by *mRYellow*
> 
> Just wanted to add in my experience with Fury.
> 
> I could easily game at 1080 on the core. I didn't notice any visual anomalies except in Far Cry 3.
> I dropped the core by 10 and it went away. Weird, as it was the only game that picked something up.


* So your general experience with AMD @ 1080 is somewhat stable and consistent?

What is core?


----------



## mRYellow

Quote:


> Originally Posted by *Gamedaz*
> 
> * So your general experience with AMD @ 1080 is somewhat stable and consistent?
> 
> What is core?


Yes, but I've dropped my speed to 1070 on the GPU. Will try higher when we finally get voltage support.

By core I mean GPU.


----------



## Agent Smith1984

I have read that VRMs can get hot on the Fury X, but have people been noticing good temps on the air-cooled Fury cards?


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have read that VRMs can get hot on the Fury X, but have people been noticing good temps on the air-cooled Fury cards?


* I've heard that the Fury X was released with liquid cooling because air cooling is insufficient for the card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> * I've heard that the Fury X was released with liquid cooling because air cooling is insufficient for the card.


That's not true.....

Fury seems to do fine with 3rd-party air cooling.

What I am curious about: I saw an overvolted Fury X hitting 100C on the VRMs, but with some of the third-party air coolers on the 390 series, VRMs can be kept in the 70s-80s pretty easily. So I am wondering whether the air-cooling solutions are doing as good a job with VRM cooling as on the 390-series cards, and whether having much cooler VRMs on the Fury Pro may give it a better shot at overclocking with additional voltage when that becomes available to us.


----------



## buildzoid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have read that VRMs can get hot on the Fury X, but have people been noticing good temps on the air-cooled Fury cards?


I tried shoving a temp probe into the Vcore VRM on my Fury Tri-X when I was running 1.3V core and I couldn't get a reading above 70 or 80C IIRC. So I think the air cooled cards have better VRM temps than the water cooled cards. However I didn't really test properly. I'll be able to get proper results around Xmas when I go home.


----------



## Medusa666

Ok, I'm pretty angry right about now.

Just received my new Fury X, with pump noise/whine, and it had scratches on the backplate and on the fan itself, i.e. the XFX logo that is glued to the fan centre; also the protective plastic on the centre of the fan had obviously been removed.

Funny thing is that the retailer assured me they received this card from the manufacturer in early October (I wanted to make sure it was a late-production card) and the outer box was factory sealed. In other words, it seems that XFX has just repackaged this card; most likely it is a used one from the first batch.

I'm never going to buy anything from XFX again, nor will I recommend them to anyone. Quite the contrary: this is the rock bottom of what I have ever experienced as a customer.

The card is going back to the retailer, and yeah, not much more to it. I don't know what to do next: order another Fury X and risk the same thing? Maybe SAPPHIRE can be trusted to actually send NEW cards.


----------



## p4inkill3r

Where did you order it from?


----------



## Agent Smith1984

That would be my question also... it could be the fault of the retailer...

XFX has a pretty good track record for GPUs lately... just sayin'


----------



## Gamedaz

Quote:


> Originally Posted by *mRYellow*
> 
> Yes, but i've dropped my speed to 1070 on GPU. Wil try higher when we finally get voltage support.
> 
> By core i mean GPU.


* In my understanding AMD clocks should be just fine for 1080p 60. I usually lock the frame rate to 60; anything more is not too noticeable.


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> I tried shoving a temp probe into the Vcore VRM on my Fury Tri-X when I was running 1.3V core and I couldn't get a reading above 70 or 80C IIRC. So I think the air cooled cards have better VRM temps than the water cooled cards. However I didn't really test properly. I'll be able to get proper results around Xmas when I go home.


Yes, the third party coolers for the Fury cards seem to be stellar.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Yes, the third party coolers for the Fury cards seem to be stellar.


We'll know for sure when i get home tonight


----------



## Medusa666

Quote:


> Originally Posted by *p4inkill3r*
> 
> Where did you order it from?


The retailer is not important; the box was factory sealed, pointing to the fact that XFX received an early-batch card with pump noise and just repackaged it and sent it back to the retailer.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> That would be my question also..... it could be the fault of the r/etailer.....
> 
> XFX has a pretty good track record for GPU's lately.... just sayin'


Yeah, I'm surprised myself. I've never had anything like this happen to me either. It could be an honest mistake, but the likelihood of that is very low.


----------



## Medusa666

Quote:


> Originally Posted by *Agent Smith1984*
> 
> We'll know for sure when i get home tonight


I've been looking into what Fury card to buy for months, and I have concluded that the ASUS Fury Strix runs coolest; the VRM never went above 75C, though I do not know if that was due to the IR camera not being able to penetrate the backplate.

The Sapphire card's VRM runs at around 100-110C.

The Fury X: 100-110C.

These temps are during FurMark. I found the information on YouTube (Tom's Hardware IR camera) and in various reviews that use thermal imaging.

Hope it helps somewhat.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> My GPU is still waiting for a 2560 x 1440 Freesync IPS with a 30 to 144hz range.
> 
> Cmon Eizo....


I don't know about you, but I am keeping a laser lock on the Black Friday / Cyber Monday prices for the BenQ XL2730Z. This is far and away one of the best monitors recommended by FPS gamers, and it's the FreeSync version. Unfortunately it's not an IPS panel (it's TN), but I care more about the rest of the features it brings than about needing an IPS panel.


----------



## p4inkill3r

Quote:


> Originally Posted by *Medusa666*
> 
> The retailer is not important; the box was factory sealed, pointing to the fact that XFX received an early-batch card with pump noise and just repackaged it and sent it back to the retailer.


I disagree that the retailer is not important, especially since I haven't heard of other instances of that occurring.


----------



## Medusa666

Quote:


> Originally Posted by *p4inkill3r*
> 
> I disagree that the retailer is not important, especially since I haven't heard of other instances of that occurring.


Yeah it is unheard of, I'm getting a new card sent to me tomorrow anyway, the retailer helped me out.


----------



## Gumbi

Quote:


> Originally Posted by *Medusa666*
> 
> I've been looking into what Fury card to buy for months, and I have concluded that the ASUS Fury Strix runs coolest; the VRM never went above 75C, though I do not know if that was due to the IR camera not being able to penetrate the backplate.
> 
> The Sapphire card's VRM runs at around 100-110C.
> 
> The Fury X: 100-110C.
> 
> These temps are during FurMark. I found the information on YouTube (Tom's Hardware IR camera) and in various reviews that use thermal imaging.
> 
> Hope it helps somewhat.


The FurMark numbers can't be trusted, as it throttles some cards.

Also, the numbers I've seen for the Sapphire Fury VRMs were far lower.


----------



## Otterfluff

Quote:


> Originally Posted by *Medusa666*
> 
> I been looking into what Fury card to buy for months, and I have concluded that ASUS Fury Strix runs coolest, the VRM never went above 75c, however I do not know if it was due to the IR camera not being able to penetrate the backplate.
> 
> The Sapphire card VRM runs at around 100-110c.
> 
> The Fury X 100-110c.
> 
> These temps are during Furmark, I have found the information on youtube (Toms Hardware IR camera) and on various reviews that uses thermal imaging.
> 
> Hope it helps somewhat.


The only thing I don't like about the Strix is that it won't work/fit with any waterblocks, since it uses a different PCB.


----------



## Agent Smith1984

So for people getting the really bad coil whine with these cards....

Did it start from the beginning, or did it slowly develop after use?


----------



## xer0h0ur

From this thread I can only remember one person complaining about coil whine getting worse. The typical experience is coil whine from the get-go or next to none. There really doesn't seem to be any middle ground.


----------



## Semel

*buildzoid*

Is it possible to hex edit the Fury's BIOS to set a custom voltage for the "high performance" state?


----------



## buildzoid

Quote:


> Originally Posted by *Semel*
> 
> *buildzoid*
> 
> Is it possible to hex edit fury's bios to set custom voltage for "high performance" state?


I only do hard mods, no software modding of any kind. I'm guessing it should be possible to mod the BIOS for more voltage, however I have no idea how one would do that. What I'm more interested in is disabling some of the power-saving stuff, because I suspect it causes the black-screen crashes I get during daily usage when pushing 1150MHz+ on the core. I can run Unigine at 1180MHz core, yet the card black-screens in games like TERA or Toxikk. No artifacts of any kind, just straight-up crashes.


----------



## Semel

Quote:


> Originally Posted by *buildzoid*
> 
> I only do hard mods, no software modding of any kind.


1180 core? That's pretty impressive for a Fury... Do you think it would be possible to push it to 1180 without hard modding once voltage control becomes available? What would you consider the max "safe" voltage for an air-cooled Fury without hard modding? My card is only stable at 1080MHz (1050 in Witcher 3) at default voltage. Memory can be OCed to 570, but I really need to keep it cool, below 60C, otherwise I get artifacts here and there in Witcher 3. However, I don't keep the memory OCed because the performance increase in games is ridiculously small (~1-1.5 fps); it's not worth it.


----------



## buildzoid

Quote:


> Originally Posted by *Semel*
> 
> 1180 core? That's pretty impressive for a Fury... Do you think it would be possible to push it to 1180 without hard modding once voltage control becomes available? What would you consider the max "safe" voltage for an air-cooled Fury without hard modding? My card is only stable at 1080MHz (1050 in Witcher 3) at default voltage. Memory can be OCed to 570, but I really need to keep it cool, below 60C, otherwise I get artifacts here and there in Witcher 3. However, I don't keep the memory OCed because the performance increase in games is ridiculously small (~1-1.5 fps); it's not worth it.


The card only did 1140 in Unigine without the hard mods.


----------



## Semel

Quote:


> Originally Posted by *buildzoid*
> 
> The card only did 1140 in Unigine without the hard mods.


What was preventing it from getting higher than that? Not voltage, I reckon... Some "safety" measures in place?


----------



## buildzoid

Quote:


> Originally Posted by *Semel*
> 
> What was preventing it from getting higher than that? Not voltage I reckon.. Some "safety" measures in place?


It was voltage at that point. Above 1140MHz the card would artifact for a couple of seconds and then crash to a black screen.


----------



## Agent Smith1984

Tis time to unleash the Fury!


----------



## xer0h0ur

Hopefully the gaming gods blessed you with a good card.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know about you, but I am keeping a laser lock on the Black Friday / Cyber Monday prices for the BenQ XL2730Z. This is far and away one of the best monitors recommended by FPS gamers, and it's the FreeSync version. Unfortunately it's not an IPS panel (it's TN), but I care more about the rest of the features it brings than about needing an IPS panel.


I would consider it, but lots of dead units after 3 to 6 months has me worried.


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Tis time to unleash the Fury!


Unbox porn!


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I would consider it, but lots of dead units after 3 to 6 months has me worried.


Well FWIW I heard a lot of bad things about that monitor's first revision. The panel that was used initially has since been changed.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Hopefully the gaming gods blessed you with a good card.


I get the POST screen and then a black screen on boot... It doesn't even try to load Windows (that I can see or hear, anyway).

I seriously have no clue, wth?


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I get the POST screen and then a black screen on boot... It doesn't even try to load Windows (that I can see or hear, anyway).
> 
> I seriously have no clue, wth?


Try it in another machine.


----------



## BackwoodsNC

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I get the POST screen and then a black screen on boot... It doesn't even try to load Windows (that I can see or hear, anyway).
> 
> I seriously have no clue, wth?


Uninstall your display driver, then try to boot.


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I get the POST screen and then a black screen on boot... It doesn't even try to load Windows (that I can see or hear, anyway).
> 
> I seriously have no clue, wth?


Not sure what drivers you are using, but black screens on startup are a known issue with the Fury X. Mine does it from time to time. Try starting and giving it a few minutes; if that doesn't work, power off, then start again. Good luck.


----------



## buildzoid

I think I also had to boot into safe mode, uninstall whatever driver I was running, and get the latest ones when I got my Fury Tri-X. I also had a bunch of problems with the DP cables not making proper contact on the GPU end.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well FWIW I heard a lot of bad things about that monitor's first revision. The panel that was used initially has since been changed.


Hrm, well it's the only viable option for what I am looking for at the moment. I really would prefer an IPS, but Eizo is taking their sweet time, and I won't settle for the 90 Hz ceiling that the Asus has, nor am I going back to 1080.


----------



## xxela

You did uninstall the previous driver before switching cards, right? In any case, try DDU (a very good tool) in safe mode, then install the new drivers. If it still doesn't work, try a fresh Windows install.


----------



## Medusa666

I'm getting my replacement Fury X on Friday. Do I have to remove and re-install the drivers even though it is the same GPU, or can I just power off, turn off the PSU, and swap the XFX one for the Sapphire?


----------



## mRYellow

Quote:


> Originally Posted by *Medusa666*
> 
> I'm getting my replacement Fury X on Friday. Do I have to remove and re-install the drivers even though it is the same GPU, or can I just power off, turn off the PSU, and swap the XFX one for the Sapphire?


Not necessary, but I do recommend the 15.10 betas.

amd-catalyst-15.10beta-64bit-win10-win8.1-win7-oct12


----------



## BaddParrot

NM!


----------



## Alastair

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> We'll know for sure when i get home tonight
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been looking into what Fury card to buy for months, and I have concluded that the ASUS Fury Strix runs coolest; the VRM never went above 75C, though I do not know if that was due to the IR camera not being able to penetrate the backplate.
> 
> The Sapphire card's VRM runs at around 100-110C.
> 
> The Fury X: 100-110C.
> 
> These temps are during FurMark. I found the information on YouTube (Tom's Hardware IR camera) and in various reviews that use thermal imaging.
> 
> Hope it helps somewhat.

The Sapphire Fury Tri-X has the coolest VRMs of any of the current Fiji cards at the moment. My cards run really cool.

Although, can somebody tell me how they are measuring these temps? From what I can tell, my Sapphire Furys do not have a VRM temp probe installed. I really thought they would have included this with the cards, like the 290's. But it seems I was wrong?


----------



## buildzoid

I stuck a K type probe into the VRM area and tried to measure the highest temperature possible.


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> I stuck a K type probe into the VRM area and tried to measure the highest temperature possible.


What kind of temps did you get at stock with, say, 50% fan speed on the VRMs?


----------



## buildzoid

Quote:


> Originally Posted by *Gumbi*
> 
> What kind of temps did you get at stock with, say, 50% fan speed on the VRMs?


I don't think I could get a measurement above 80°C with 1.3 V core voltage.


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> I think I couldn't get a measurement above 80C with 1.3V core voltage


That's some good cooling then. What's the stock voltage? ~1.2 V?


----------



## buildzoid

Quote:


> Originally Posted by *Gumbi*
> 
> That's soke good cooling then. What's stock voltage? 1.2~?


I didn't check at stock voltage because at stock the VRM isn't at any risk.


----------



## Gumbi

As an aside, did you ascertain how accurate the VRM sensors were in determining the hottest area of the board? Or, in other words, was the value given by the VRM pretty close to the hottest value you determined using the probe?


----------



## buildzoid

Quote:


> Originally Posted by *Gumbi*
> 
> As an aside, did you ascertain how accurate the VRM sensors were in determining the hottest area of the board? Or, in other words, was the value given by the VRM pretty close to the hottest value you determined using the probe?


There's no VRM sensor built into the card. At least, there isn't one that GPU-Z is picking up.


----------



## Agent Smith1984

Okay, so... I can't even get into the BIOS with this card...

I literally get the Asus POST screen logo, and then it immediately loses signal.

I can't get into the BIOS, Safe Mode, nothing... This is so weird, because obviously the card is functioning enough to show the POST screen...

The mobo is not flashing the GPU light either, which would normally signify a bad card or that it is not communicating with the card.

This is going to drive me nuts


----------



## buildzoid

Try flipping the BIOS switch.


----------



## Medusa666

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay so.... I can't even get into the BIOS with this card...
> 
> I literally get the Asus POST screen logo, and it immediately loses signal.
> 
> I can't go into BIOS, Safemode, nothing... this is so weird, cause obviously the card is functioning enough to get the post screen......
> 
> The mobo is not flashing the GPU light either, which would normally signify a bad card, or that it is not communicating with the card.
> 
> This is going to drive me nuts


Have you updated your motherboard BIOS?


----------



## Agent Smith1984

Quote:


> Originally Posted by *buildzoid*
> 
> Try flip the BIOS switch.


Quote:


> Originally Posted by *Medusa666*
> 
> Have you updated your motherboard BIOS?


Will try both of these this evening..... thanks


----------



## Medusa666

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Will try both of these this evening..... thanks


Actually, I know a guy who had the exact same problem, though with a Z77 motherboard and an i5; same behaviour, and a BIOS update solved it.

Let us know how it goes!


----------



## Gamedaz

* Did you remove any drivers from the previous setup?

If you're using Nvidia drivers, you have to remove them, or Windows will reload the 355/358 drivers, which cause a black screen on boot.

* Try updating the BIOS and see if that helps.

* Have you tried turning off automatic updates? They're causing too many issues on Windows, including on my machine, which in turn damaged my card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> * Did you remove any Drivers from the previous setup?
> 
> If your using Nvidia Drivers you have to remove them or windows will reload the 355 358 drivers which cause black screen on boot.
> 
> * Try updating theBIOS and see if that might help.
> 
> * Have you tried Turning off AutoUpdates, they causing too many issues on Windows inclduing my Machine which in Turn damaged my Card.


I'm going to try updating the system BIOS with another GPU, and wiping everything, even though I don't see how wiping the driver will help, since it won't even get past POST to attempt to load Windows...

I am also going to give the BIOS switch a try. Maybe one is UEFI and the other is legacy?? This board has played nice with 3 other cards so far, two different 7970s and my previous Asus 390; not sure why it doesn't like this card.

On a positive note... it sure looks like a monster


----------



## xxela

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm going to try updating the system BIOS with another GPU, and wiping everything, even though I don't see how wiping the driver will help, since it won't even get past the post to even attempt to load windows.....
> 
> I am also going to give the BIOS switch a try. Maybe one is UEFI and the other is legacy?? This board has played nice with 3 other cards so far, two different 7970's and my previous Asus 390, not sure why it doesn't like this card.


I have the same motherboard, and there has been no update for the SABERTOOTH 990FX R2.0 since 2014, so you probably have the latest BIOS. If, after resetting the BIOS, you still can't get in, and another card works fine, I think the problem is with this card.


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm going to try updating the system BIOS with another GPU, and wiping everything, even though I don't see how wiping the driver will help, since it won't even get past the post to even attempt to load windows.....
> 
> I am also going to give the BIOS switch a try. Maybe one is UEFI and the other is legacy?? This board has played nice with 3 other cards so far, two different 7970's and my previous Asus 390, not sure why it doesn't like this card.
> 
> On a positive note... It sure looks like a monster


* This is unusual; I've had the same issue happen with my Gainward Phantom GTX 780 Ti.

I updated to the latest Nvidia drivers and a game froze (Star Wars beta). On reboot: green lines. I nuked the OS; still green lines, and it would no longer boot into Windows (black screen with cursor). Booting into the BIOS also showed green lines. A damaged card?

* Because of that, I am going with this XFX AMD card... so something is going on with the Windows 10 upgrade that is messing up the registry or something.

You have a brand-new card; this should not be happening in Windows at all.

I did a search and found that the PCI communication drivers possibly need OpenCL.dll files to work with this card.

There's software called ATI Driver Updater. It says it's for advanced users, which means it should know which drivers to use for your card... not sure why you would have to use it, though.

LINK: http://www.tweakbit.com/land/driver-updater/support?build=1ayy&content=brands&utm_source=sevenforums.com&utm_medium=link&utm_campaign=Sevenforums.2&kw=ATI

Maybe this will work?

As well, if you do a clean install, Windows will update the drivers to Nvidia's. I noticed this on my fresh install and had to go in and remove Nvidia 3D and all the other drivers (which happened to be the 355 drivers), so just be careful, because Windows will re-install Nvidia drivers after a clean install.

* NOTE: Someone with an AMD card posted about issues with their memory clocks and needing to speed the fans up:

"So I figured out the best way to fix this, and there are a couple of things you should do if you get this problem with the R9 290s. 1. Do not RMA your card; there's no point, you're going to get the same card with the same problem until AMD gets their act together. 2. Make sure that your card is getting enough power; do not use a cord that has a 6-pin and an 8-pin on the same line, that's not enough power. 3. This is the main problem: your VRAM threshold is 85°C, but your card can get up to 95°C, so you black-screen. One way to fix this is to downclock your VRAM; yes, it sucks, but it works. I have mine clocked at 1000 MHz or whatever, and normally it's 1250 or so. That's a big difference, but it really helps. 4. And last, speed your fan up. The max it will go with the latest drivers is 45%, I believe, but I have mine set to 65% all the time, and my card sits at 55°C all the time now. Doing all of these things, I do not black-screen! I just recorded myself playing Call of Duty: Ghosts on Ultra at a constant 60 FPS, so even with that downclock there's no lag, so don't worry about it. Until AMD fixes this, or tells everyone who bought this card that they screwed up, nothing is going to change. Hope this helps!"

NOTE: The UltraX theme installer could possibly cause issues with startup (the Start Menu) as well; it installs a .dll file that may not have security clearance from Microsoft.

NOTE: Make sure your integrated GPU is not trying to output via the Intel HDMI/DisplayPort output. Sometimes the system can pick that port instead of the GPU's port.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm going to try updating the system BIOS with another GPU, and wiping everything, even though I don't see how wiping the driver will help, since it won't even get past the post to even attempt to load windows.....
> 
> I am also going to give the BIOS switch a try. Maybe one is UEFI and the other is legacy?? This board has played nice with 3 other cards so far, two different 7970's and my previous Asus 390, not sure why it doesn't like this card.
> 
> On a positive note... It sure looks like a monster


Generally it's set to Legacy by default, at least that's my experience with several VaporX cards. On the off chance it's set to UEFI, flipping it might help.

I think a mobo BIOS update could help, as well as making sure it boots from the GPU (not the iGPU first).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Generally it's set to Legacy by default, at least that's my experience with several VaporX cards. On the off chance it's set to UEFI, flipping it might help.
> 
> I think a mobo BIOS update could help, as well as making sure it boots from GPU (not iGPU first).


I have no iGPU, and my BIOS is up to date.

The last thing to try is reseating the card (the fit was suspect in the rear case slot, but it did seat, so I'll check that).

I am not even able to enter the BIOS at the POST screen. I get the Asus bird logo and the instructions at the bottom to hit either DEL or F1 to enter Setup.
From there it goes to a black screen and the TV says "no signal".

There's no way it's a driver issue, but it could definitely be a BIOS issue... I mean, the card has power, the LED "tach" lights up, and all is go for POST, and then it just goes blank, which is so strange. I have read about a few cold-boot issues with Fury, but if I can't even get to Windows, then every boot is a cold boot, and my case does not have a reset button, so every start is a fresh one after I power off.

I am going to continue to research this....


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have no iGPU, and my BIOS is up to date.
> 
> The last thing to try is reseating the card (the fit was suspect in the rear case slot, but it did seat in, so I'll check that.
> 
> I am not even able to enter my BIOS at the post screen. I get he Asus bird logo, and the instructions at the bottom to either hit DEL or F1 to enter Setup.
> From there it goes to a black screen and the TV says "no signal"
> 
> There's no way it's a driver issue, but could definitely be a BIOS issue... I mean, the card has power, the LED "Tach" lights up, and all is go for post, and then it just goes blank, which is so strange. I have read a few cold boot issues with Fury, but if I can't even get to windows, then every boot is a cold boot, and my case does not use a reset botton, so every start is a fresh one after I power off.
> 
> I am going to continue to research this....


I tend to agree with you on the driver side. They aren't loaded until Windows boots up AFAIK. BIOS issue on either card side or mobo side would be my guess.


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have no iGPU, and my BIOS is up to date.
> 
> The last thing to try is reseating the card (the fit was suspect in the rear case slot, but it did seat in, so I'll check that.
> 
> I am not even able to enter my BIOS at the post screen. I get he Asus bird logo, and the instructions at the bottom to either hit DEL or F1 to enter Setup.
> From there it goes to a black screen and the TV says "no signal"
> 
> There's no way it's a driver issue, but could definitely be a BIOS issue... I mean, the card has power, the LED "Tach" lights up, and all is go for post, and then it just goes blank, which is so strange. I have read a few cold boot issues with Fury, but if I can't even get to windows, then every boot is a cold boot, and my case does not use a reset botton, so every start is a fresh one after I power off.
> 
> I am going to continue to research this....


Did you try it in another machine?

It sounds like a bum card to me, unfortunately, and I know that's not what you wanted to hear.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> Did you try it in another machine?
> 
> It sounds like a bum card to me, unfortunately, and I know that's not what you wanted to hear.


My son's box won't hold it, and the wife's isn't running yet....

I'd love to try it in something else, cause I find it really strange that it gets a display, and then it stops.

Can someone tell me what color the tach lights are supposed to be on boot?


----------



## xxela

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have no iGPU, and my BIOS is up to date.
> 
> The last thing to try is reseating the card (the fit was suspect in the rear case slot, but it did seat in, so I'll check that.
> 
> I am not even able to enter my BIOS at the post screen. I get he Asus bird logo, and the instructions at the bottom to either hit DEL or F1 to enter Setup.
> From there it goes to a black screen and the TV says "no signal"
> 
> There's no way it's a driver issue, but could definitely be a BIOS issue... I mean, the card has power, the LED "Tach" lights up, and all is go for post, and then it just goes blank, which is so strange. I have read a few cold boot issues with Fury, but if I can't even get to windows, then every boot is a cold boot, and my case does not use a reset botton, so every start is a fresh one after I power off.
> 
> I am going to continue to research this....


This has nothing to do with drivers. It's either a bad mount or the card itself, so you're right: try mounting it again, and if possible, try another card.


----------



## Thoth420

Just my random thoughts, but Agent Smith, have you tried more than one display, as well as different ports on the GPU itself? I assume so, but hey, sometimes we all forget the small things.
I have also read about a few people having issues with the DP ports on the Fury and Fury X, and it always seems to be the GPU I/O that is the problem in those cases.


----------



## dagget3450

Have you tried a different monitor? Try a different connection as well, say HDMI instead of DP. I have a similar issue with my Fury X and have had to use a different monitor.


----------



## Agent Smith1984

My 4K TV only has HDMI and that is what I have been trying.

I will try these things this evening:

1) BIOS switch
2) Reseat the GPU
3) Try a different display over HDMI (I have a 1080p Asus gaming monitor I can test)
4) Try using DP on the 1080p Asus monitor (I will have to buy a cable to test this, but even if it works, the card is worthless to me, as it is going to be used with my 4K TV)

Again, can someone tell me what color the LED lights over the power connectors are supposed to be during POST/boot/idle?

This system works fine with the 7970 in it, and the PSU is brand new, so it's not a power issue; I was just powering a much hungrier overclocked 390 with this unit, as well as two 7970s in CrossFire that I tested just a week ago.

*Update:*

Take a look at this:
http://linustechtips.com/main/topic/415438-sapphire-r9-fury-wont-post-please-help/

Same scenario... I am convinced it's a BIOS issue at this point.
I am thinking either a CMOS clear or the GPU BIOS switch is going to straighten this out... I sure hope so, anyway!!


----------



## buildzoid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Again, can someone tell me the color that the LED lights over the power connectors are supposed to be during post/boot/idle?


The LEDs run whatever color you select using a 2-channel DIP switch on the back of the card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *buildzoid*
> 
> The LEDs run what ever color you select using a 2 channel dip switch on the back of the card.


Gotcha, I assumed that maybe they change when the card is either idling or being used, so figured it may be something to look for when booting.


----------



## xer0h0ur

Have you already attempted a legacy boot?


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have you already attempted a legacy boot?


Well, I'm going to give the switch a shot this evening when I get home.


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Gotcha, I assumed that maybe they change when the card is either idling or being used, so figured it may be something to look for when booting.


The LEDs show when the card is clocking up or down... one LED means idle; more than one means full throttle, from what I understand.

Does the GPU have a dual-BIOS switch?

* Are you going to start with a clean install of Windows?


----------



## spyshagg

Try a normal monitor instead of your 4K TV.


----------



## xer0h0ur

In semi-related news, AMD is going to be releasing another Omega-esque driver that is supposedly going to introduce new things and hopefully add performance.


----------



## diggiddi

When, at the end of the year?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> The LEDS show when the card is throttling up or down...1 LED is IDLE. More than one = Full Throttle. From what I understand.
> 
> Does the GPU have a Dual BIOS Switch?
> 
> * Are you gonna start with a clean Install of windows?


No way I'm wiping Windows again; I just got 10 sorted out all nicely...

The card DOES have the switch, and I am going to give it a go this evening.
I have not heard confirmation on whether the two settings offer legacy and UEFI BIOS, but I am assuming they do.
Quote:


> Originally Posted by *spyshagg*
> 
> try a normal monitor instead of your 4k tv


I am going to try a monitor later, but again... if it doesn't work with the TV, this card is worthless to me anyway... I guess either way I need to verify it works, though.

It just doesn't make sense that it gets post screen and then dies. That tells me it's a BIOS thing.

I imagine it's set to UEFI by default, and may not be playing nicely with the board.

Thanks everyone for the help.... I will update as to how things go later.

Hopefully I'll be finding overclocks by this evening, and seeing if she's got 4 extra CUs I can unlock


----------



## xer0h0ur

Supposedly it will come out in November, but there isn't a given date for it. That is purely a guess.


----------



## diggiddi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Supposedly it will come out November but there isn't a given date for it. That is purely a guess.


----------



## Medusa666

Ordered a Sapphire R9 Fury today. I have owned three Fury Xs; all had pump whine, and one was used though sold to me as new. The ASUS R9 Fury Strix I got had a rattling fan after ten days.

I'm putting my hope in this card; it must be faith


----------



## BaddParrot

Quote:


> Originally Posted by *buildzoid*
> 
> The LEDs run what ever color you select using a 2 channel dip switch on the back of the card.


Let me ask here:
I have a Sapphire Fury X and have had no issues at all. I even flipped the switch to make sure the blue LEDs work fine too. I have read that when the card goes into a zero-power state, there is supposed to be one green LED lit. I have never seen the green LED.

I use Win 7. Sleep and hibernate won't do it. Any ideas how to get the green LED to light up? Thanks!


----------



## buildzoid

Quote:


> Originally Posted by *BaddParrot*
> 
> Let me ask here-
> I have a Sapphire Fury X & have had no issues at all. I even flipped the switch to make sure the Blue LED's work fine also. I have read that when the card goes into a 0 power state, There is suppose to be 1 Green LED light. I have never seen the green LED.
> 
> I use Win 7. Sleep & Hibernate won't do it. Any idea's how to get the Green LED to light up? Thanks!


I think the green LED only triggers in ULPS, and that's the state the non-primary GPUs go into when they aren't needed.


----------



## Gamedaz

* The switch at the top might not be a BIOS switch; it could be the LED color switch that goes from red to blue.

The Sapphire Triple-D Radeon has a dual-BIOS switch. What it does is switch to a different BIOS if you OC and corrupt the primary one; you just flip the switch to the other BIOS.

I've had a no-POST, reboot-to-black-screen issue with a brand-new EVGA card... It turned out the PCIe cables, or the PSU itself, were causing interference with the GPU. I confirmed it because when I moved the power connectors around, if they got too close to the chassis, the running PC would reboot: the fans would speed up and it would reboot again and again. I've never heard of a power cable causing a GPU to reboot the entire system. I sent it back for replacement, and it was a faulty card as well. I also had the same problem with my 780 Ti (in a completely different Micro-ATX system); as soon as I routed the wires away from the power source, the GPU worked normally. Try a different PCIe power socket on the PSU if any are available, and try another PSU if you can.

Keep us updated


----------



## Agent Smith1984

IT'S WORKING!!!!


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> IT'S WORKING!!!!


Awesome to hear! What was it out of curiosity?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Thoth420*
> 
> Awesome to hear! What was it out of curiosity?


I consider myself to have a pretty in-depth knowledge of PC hardware...

In this case, I have not the foggiest idea...

Tried the switch, nothing; tried reseating, nothing; tried DDU, nothing. Then I noticed the "BOOT_DEVICE" LED was red right after the POST screen, so I rebooted, and instead of hitting DEL for the BIOS, I tried hitting F1. It went into the BIOS and has worked ever since... It makes not one bit of sense at all...


----------



## Arizonian

Quote:


> Originally Posted by *Agent Smith1984*
> 
> IT'S WORKING!!!!


Glad to hear that; now if you can start playing with it, we can get some numbers.

I've been following your new Fury purchase. I may have a chance to sell my 780 Ti, and will get an air-cooled Fury myself if it pans out.


----------



## Gamedaz

* Post some gaming results!


----------



## Agent Smith1984

I'm going to start with the synthetics, of course, and will post shortly...

Trying to get CCC working so I can overclock this thing...

Tested Heaven on Ultra at 1080p with 8x AA and got around a 1600 score... That was with some clock fluctuation though, and yes, there is some coil noise, but nothing major...


----------



## buildzoid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> IT'S WORKING!!!!


What made it work again?


----------



## Himo5

So it wanted to be treated like a new CPU, only it wasn't set up to ask for F1?


----------



## Gamedaz

* What's the difference between F1 and Delete?


----------



## Himo5

DEL is an optional, issue-free BIOS entry. F1, among other things (such as after a failed overclock), is a "new hardware detected" trap.


----------



## Gamedaz

* So F1 is for new hardware detected, plus a failed CPU OC.


----------



## Otterfluff

I finished my volt mods on my first Fury X.

That's a voltmeter LCD from eBay: http://tinyurl.com/ozqazfq

Using the DIP switch, I can change between it reading core or HBM voltage.

I'm about to plumb in all my acrylic pipe over the next two days; I was waiting to finish the volt mod on one card before I started bending.

I like this setup because I can undo the volt mods by just unplugging the wires from the veroboard.


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I consider myself to have a pretty in depth knowledge of pc hardware...
> 
> In this case, i have not the foggiest idea...
> 
> Tried the switch, nothing, tried reseating, nothing, tried ddu, nothing, then noticed the "BOOT_ DEVICE" led was red right after post screen, so rebooted, and instead of hitting del for bios, i tried hitting f1 for bios, it went into the bios, and worked ever since... Its makes not one bit of sense at all....


That's odd but still it's working!


----------



## GorillaSceptre

Quote:


> Originally Posted by *Otterfluff*
> 
> I finished my volt mods on my first fury X
> 
> 
> Thats a Voltmeter LCD from ebay. http://tinyurl.com/ozqazfq
> 
> Using the Dip switch I can change between it reading core or HBM voltage.
> 
> Im about to plumb in all my acrylic pipe over the next two days. I was waiting for me to finish the volt mod on one card before I started bending.
> 
> I like this setup because I can undo the volt mods by just unplugging the wires from the veroboard.


Awesome job.

I'd never have the guts to do that...

Post back with results.


----------



## Agent Smith1984

Well, I didn't get much time with it last night, because it took me a few DDU runs and uninstalls to get CCC working (this happens to me every time I get a new GPU for some reason)...

I did finally get it working, and reinstalled MSI Afterburner...

What's weird is that I get 0 reported for memory clock and core clock in AB, and only a percentage offset for overclocking the core in CCC.

If I try to increase the power limit, I get this weird, almost artifact-like glowing hue on my desktop background; however, 3D applications work fine...

This has been a weird experience so far, but with all new hardware there can be kinks to work out. I'm not down on this thing yet, because the more I read on these cards, the more I see that at 4K (which I exclusively use) they are almost neck and neck with the 980 Ti with a slight OC, and are certainly beating the GTX 980 at the same price.

I won't say that I'm "happy" with this yet, but my initial performance results are pretty impressive. I will be testing more tonight and trying to figure out how to overclock the damn thing...


----------



## Gumbi

How does she cool?

Core/VRMs?


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> IT'S WORKING!!!!


Congrats on getting it working


----------



## Medusa666

Quote:


> Originally Posted by *Agent Smith1984*
> 
> IT'S WORKING!!!!


Congratulations!!!


----------



## Gamedaz

Does Afterburner support the new hardware for OC'ing? I thought AMD had their own OC software.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> Does After Burner support the new Hardware for O.C. I thought AMD has their own O.C software.


The only thing I can do in AB with this card is monitor temps, and move the power slider.

I mean, the core and HBM sliders move too, but they just show 0 by default, and I am able to increase them to 100, and I have NO CLUE what that will do.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The only thing I can do in AB with this card is monitor temps, and move the power slider.
> 
> I mean, the core and HBM sliders move too, but they just show 0 by default, and I am able to increase them to 100, and I have NO CLUE what that will do.


Does GPUz properly detect all the sensors?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Does GPUz properly detect all the sensors?


As far as I can tell.
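Since GPU-Z keeps coming up in this thread: its "Log to file" option dumps whatever sensors it does detect into a CSV, which is an easy way to check maximum temps after a run. A small sketch in Python, using a hypothetical log excerpt (real logs have many more columns, and the exact column names vary by card and GPU-Z version):

```python
import csv
import io

# Hypothetical GPU-Z sensor-log excerpt; a real log would be read from disk
# and contain many more columns (clocks, fan speed, VDDC, etc.).
log = io.StringIO(
    "Date , GPU Core Clock [MHz] , GPU Temperature [C] \n"
    "2015-10-21 18:00:01 , 1050.0 , 52.0 \n"
    "2015-10-21 18:00:02 , 1050.0 , 67.0 \n"
    "2015-10-21 18:00:03 , 300.0 , 48.0 \n"
)

reader = csv.reader(log)
# GPU-Z pads its column names with spaces, so strip them before matching.
header = [h.strip() for h in next(reader)]
temp_col = header.index("GPU Temperature [C]")

# Highest temperature seen across the whole log.
max_temp = max(float(row[temp_col]) for row in reader)
print(max_temp)
```

If a column such as a VRM temperature simply never appears in the header, that matches what's reported above: GPU-Z isn't picking up such a sensor on these cards.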


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> As far as I can tell.


I presume you're running the latest version of Afterburner then?


----------



## diggiddi

Reinstall AB; it is wonky like that. I was having the same issues after adding a 2nd card and had to reinstall.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The only thing I can do in AB with this card is monitor temps, and move the power slider.
> 
> I mean, the core and HBM sliders move too, but they just show 0 by default, and I am able to increase them to 100, and I have NO CLUE what that will do.


Make sure ULPS is disabled. I noticed a strange anomaly with AB and ULPS; I had to use TriXX to actually get it disabled. I had a similar problem to this too, with the voltage, core, and memory options not showing.
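For anyone curious what TriXX and AB are actually flipping: ULPS is typically controlled by an `EnableUlps` registry value under the display-adapter class key. A minimal sketch as a .reg fragment, with the caveat that the `0000` subkey here is an assumption; the four-digit subkey differs per machine, so check which subkey describes your Radeon before importing anything like this:

```reg
Windows Registry Editor Version 5.00

; Assumption: "0000" is the subkey for your Radeon. On your machine it may be
; 0001, 0002, etc.; look for the subkey whose entries name your card.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reverting the value to `dword:00000001` re-enables ULPS; either way, the change takes effect after a reboot.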


----------



## Agent Smith1984

I'm going to wipe AB and reinstall it tonight.

Using 4.1.1 which is the newest I believe.

I will also disable ULPS


----------



## Gumbi

How do you find the cooling? Does it jibe well with your high-airflow setup?


----------



## buildzoid

To overclock HBM, use TriXX 5.0.0; that's what I use.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm going to wipe AB and reinstall it tonight.
> 
> Using 4.1.1 which is the newest I believe.
> 
> I will also disable ULPS


Delete all previous settings too


----------



## Agent Smith1984

What are most people getting in Heaven at 1080p Ultra settings with 8x AA on their Fury Pro??

I'm getting just a tad more than my overclocked 390, at around 1590 points (the 390 was hitting 1545, I believe, @ 1200/1700).
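For what it's worth, the gap being described there can be put in percentage terms with quick arithmetic (scores taken from the post above):

```python
# Heaven scores quoted above: the Fury vs. the overclocked 390.
fury_score = 1590
r390_score = 1545

# Percentage uplift of the Fury over the overclocked 390.
uplift_pct = (fury_score - r390_score) / r390_score * 100
print(round(uplift_pct, 1))  # 2.9, i.e. roughly a 3% difference
```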


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are most people getting in Heaven at 1080P Ultra settings with 8x AA on their Fury pro??
> 
> I'm getting just a tad more than my overclocked 390 at around 1590 points (390 was hitting 1545 I believe @ 1200/1700)


My 290X was breaking 1700 at 1231/1642 clocks. I think my Intel CPU was helping the scores (a 4790K at 4.9 GHz), plus I now have Samsung "wonder RAM" that I bought from a friend, running highly clocked, which may help a tad too.


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are most people getting in Heaven at 1080P Ultra settings with 8x AA on their Fury pro??
> 
> I'm getting just a tad more than my overclocked 390 at around 1590 points (390 was hitting 1545 I believe @ 1200/1700)


Try the test with higher resolutions you should see a big difference. 2560X1440 or higher.


----------



## diggiddi

Quote:


> Originally Posted by *Gumbi*
> 
> My 290X was breaking 1700 at 1231/1642 clocks. I think my Intel CPU was helping scores (4790k at 4.9ghz) plus I now I havw Samsung wonder RAM that I bought from a friend running highly clocked which may help a tad too.


Those are mad clocks; what PSU are you using?


----------



## Gumbi

Quote:


> Originally Posted by *diggiddi*
> 
> Those are mad clocks what PSU r u using?


650 W Super Flower Amazon Bronze... soon to be a 650 W Leadex Super Flower Gold (modular).

I may have misquoted; my best scores were the above, but they were actually at ~1249/1640 with +200 mV, and not gaming stable. You can find the pic in my pics history.


----------



## Himo5

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are most people getting in Heaven at 1080P Ultra settings with 8x AA on their Fury pro??
> 
> I'm getting just a tad more than my overclocked 390 at around 1590 points (390 was hitting 1545 I believe @ 1200/1700)


For the moment my Fury X is in a Crossblade Ranger paired with an A10-7870K, 16GB of TridentX and a 1080p screen.

At stock frequencies (CPU: 3900; RAM: 2133; GPU1: 1050; GPU2: 918), I'm getting 1657 in Heaven (rising to only 1680 with the CPU at 4600), 8103 in Firestrike (rising to 8846), 3697 in LuxMark Sala GPU+CPU, and 9359 in Final Fantasy XIV Heavensward (rising to 9873).

I'll do the RAM and iGPU/dGPU standard OCs tomorrow and then see how far I can push it over the weekend.

Whatever the outcome, comparing results with higher-powered setups, this card and this system really like each other.

No coil whine, but for some reason I get pump noise if I leave Firefox on for any length of time.









----------



## Agent Smith1984

What does it all mean?
http://www.3dmark.com/3dm/9083182


----------



## xer0h0ur

Sounds like you need to follow BradleyW's manual driver removal instructions instead of using driver sweepers or just the AMD uninstaller. I used to only run DDU, until I began getting oddball problems that made no sense whatsoever. Now I do the Catalyst Install Manager uninstall, a manual registry wipe, then DDU, then install the new driver. I haven't had any problems with driver installations since. The only problem is that I don't believe his method has been tested on W10 installs yet.


----------



## Agent Smith1984

Temps look great on core, but no VRM reading... that sucks, but they gotta be pretty good....


----------



## Gamedaz

* Can you find what the VRM temps are? This can be critical: most cards have passive metal plates on the PCB that dissipate heat away from the VRMs, which is kind of important if you're OC'ing.


----------



## buildzoid

Quote:


> Originally Posted by *Gamedaz*
> 
> * Can you find what the VRM temps are? This can be critical: most cards have passive metal plates on the PCB that dissipate heat away from the VRMs, which is kind of important if you're OC'ing.


OK, let's go over some facts about the R9 Fury VRM. It runs IR6811 high-side MOSFETs and IR6894 low-side MOSFETs. At 125C the high side can provide up to 32A, or about 384W, to the GPU core. Without very heavy voltmods you aren't going to get over 300W power draw for the whole card (my card pulls 280W on the 8-pins with 1.3V core). So that's safe. The low-side 6894s can do 70A at 125C. There are six of them, so you get about 420A to the core. Again, without vmods you aren't going to get anywhere near that current draw.
Finally we have the IR 3567B. This incredibly smart and annoying voltage controller will not let the VRM die from heat. If you hit 125C MOSFET temps you will end up with the core clock throttling hard (I've tested this on an R9 290X).

Unless Agent Smith1984 wants to take the card apart and stick a thermal probe into the VRM (which will also mean getting some thermal pads, because AFAIK the cards use thermal "cement" for all heatsink contact) he isn't going to get a VRM temperature reading. Luckily this doesn't really matter.
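As an aside, the headroom arithmetic above is easy to sanity-check; a minimal sketch (the current limits are the 125C ratings quoted in the post, while the 12V high-side input rail is an assumption):

```python
# Quick check of the Fury VRM headroom figures quoted above.
HIGH_SIDE_LIMIT_A = 32   # IR6811 high-side MOSFET limit at 125C, per post
LOW_SIDE_LIMIT_A = 70    # IR6894 low-side MOSFET limit at 125C, per post
LOW_SIDE_COUNT = 6       # six low-side FETs, per post
INPUT_RAIL_V = 12.0      # assumed 8-pin/PCIe rail voltage

# Power budget flowing into the VRM through the high side
high_side_power_w = HIGH_SIDE_LIMIT_A * INPUT_RAIL_V
# Total current the low side can deliver to the core
total_core_current_a = LOW_SIDE_LIMIT_A * LOW_SIDE_COUNT

print(high_side_power_w)      # 384.0
print(total_core_current_a)   # 420
```

Both figures sit far above the ~280-300W the card actually draws, which is why the post concludes the VRM temps aren't a practical worry.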


----------



## Agent Smith1984

I'm finally starting to dig this card









Heaven 1080p with ultra settings and 8xAA @ 1075/550 (rock solid so far) looks like this (score was 67 FPS @ 1682 points on the first-run bench, no pre-loop).


This Crysis shot was @ 1075/550, true 4K @ 4096x2160, ALL MAX SETTINGS with no AA, and it never dropped below 30!!


This is the same settings with system spec dropped from "very high" to "high", and... I have to admit, these settings still looked beautiful and played just as well (Crysis is my go-to game for testing CPU, GPU, and CPU+GPU stability). Seriously? Crysis 3 at Very High/High settings @ true 4K and averaging well over 60FPS on a SINGLE CARD??? Yes! Count me in!


----------



## p4inkill3r

How about some FireStrike??


----------



## Gamedaz

*That's good to know that it handles Crysis 3 with no major drawbacks.


----------



## Gumbi

How hot does it get when playing Crysis at that overclock? Performance looks great.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> How hot does it get when playing Crysis at that overclock? Performance looks great.


60c at 45-48% fan speed


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 60c at 45-48% fan speed


That's what I'm talkin' about


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> How about some FireStrike??


Can't look them up cause it says the card is "Generic VGA adapter"....

As far as scores though, overall was around 10,800 (AMD CPU of course)

The graphics score itself is around 15,700 @ 1075/550, with 9600+ physics score and 3500+ combined score @ 5GHz CPU.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Can't look them up cause it says the card is "Generic VGA adapter"....
> 
> As far as scores though, overall was around 10,800 (AMD CPU of course)
> 
> The graphics score itself is around 15,600 @ 1075/550, with 9650+ physics score and 3570+ combined score @ 5GHz CPU.


From reading, despite the driver obviously not being 100% happy, this thing is performing top notch....









I gotta be honest..... it has been a little bit of a rodeo trying to get everything working properly, but damn if this little card doesn't pack a serious punch at 4k!!

Especially with this 9590 running at a full 5GHz (and reviews have shown that AMD CPUs LOVE 4K... so I don't feel like my CPU is a bottleneck at all for this card)

I need some damn voltage control for this thing like something fierce though....

60C core on air with not even hitting 50% fan speed? 200mv+ PLEASE!!!!


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> From reading, despite the driver obviously not being 100% happy, this thing is performing top notch....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I gotta be honest..... it has been a little bit of a rodeo trying to get everything working properly, but damn if this little card doesn't pack a serious punch at 4k!!
> 
> Especially with this 9590 running at a full 5GHz (and reviews have shown that AMD CPUs LOVE 4K... so I don't feel like my CPU is a bottleneck at all for this card)
> 
> I need some damn voltage control for this thing like something fierce though....
> 
> 60C core on air with not even hitting 50% fan speed? 200mv+ PLEASE!!!!


Heads up: if you get black screens playing games with the FURY X and FX-9590, you might need to up the voltage on your CPU. I went from 15.7.1 Beta to 15.10 Beta and for some reason the voltage requirements changed on the CPU. I was at stock 4.7 with 1.47V and was getting intermittent black screens in games with 15.10B. Went up to 1.48V and it seems to be okay now. If it starts doing it again I might jump it up to 1.54V. Probably around where your chip is at 5GHz.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> Heads up: if you get black screens playing games with the FURY X and FX-9590, you might need to up the voltage on your CPU. I went from 15.7.1 Beta to 15.10 Beta and for some reason the voltage requirements changed on the CPU. I was at stock 4.7 with 1.47V and was getting intermittent black screens in games with 15.10B. Went up to 1.48V and it seems to be okay now. If it starts doing it again I might jump it up to 1.54V. Probably around where your chip is at 5GHz.


Thanks for the tip, and funny you mention it...

My original test run in Crysis 3 at 5GHz was giving me black screens (running 25x200 @ 1.55v) and I couldn't figure out why, so I backed down to 23x217 for 5GHz and all was well.... I didn't have those issues before.

Runs great after that, but I did get some black screens initially....


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 60c at 45-48% fan speed


Those are good temps and fan speeds; this card should be the reference for how air coolers should be designed.

* I've tinkered with a GTX 780 Ti in my mini chassis; the Gainward Phantom series has incredible temps compared to other 780 Ti cooling methods. My 780 Ti would not even break 80C (and that was when it was placed inside my theatre cabinet); out in open air it never passed 76C!! * It's a 3-fan cooler that extends across the entire board, with 10 VRM phases, and 1250 clocks.

That's why I'm looking into this card: I think it can match the cooling performance of the 780 Ti Phantom cooler (3-fan cooling seems to dramatically improve temps compared to dual-fan solutions). As well, the XFX Triple D has an extended radiator; this design allows another fan to be mounted beside it (or, in my case, underneath the entire card so it can force air over the top of the extended heat-fin area). * I don't understand why there aren't more VRM phases on this card. From what I understand, VRMs help keep the voltages more accurate if you've got a sloppy PSU, etc. (but that's an entirely different subject).

Either way, this card seems to stay cooler than the Fury X??


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> Those are good temps and fan speeds; this card should be the reference for how air coolers should be designed.
> 
> * I've tinkered with a GTX 780 Ti in my mini chassis; the Gainward Phantom series has incredible temps compared to other 780 Ti cooling methods. My 780 Ti would not even break 80C (and that was when it was placed inside my theatre cabinet); out in open air it never passed 76C!! * It's a 3-fan cooler that extends across the entire board, with 10 VRM phases, and 1250 clocks.
> 
> That's why I'm looking into this card: I think it can match the cooling performance of the 780 Ti Phantom cooler (3-fan cooling seems to dramatically improve temps compared to dual-fan solutions). As well, the XFX Triple D has an extended radiator; this design allows another fan to be mounted beside it (or, in my case, underneath the entire card so it can force air over the top of the extended heat-fin area). * I don't understand why there aren't more VRM phases on this card. From what I understand, VRMs help keep the voltages more accurate if you've got a sloppy PSU, etc. (but that's an entirely different subject).
> 
> Either way, this card seems to stay cooler than the Fury X??


Well, it's not cooler than the X from my understanding, as many users see 50s on the core with those, but 60C is still excellent for air cooling, and I'm not even pushing the fans hard at all..... Maybe low 50s is possible with 80%+ fan though.... I'll test it out.... lol

Notice this card does have a bit of coil whine though..... luckily it's not louder than my case fans and TV, and doesn't bother me a bit, lol


----------



## Jflisk

You want to keep the temps down on the FURY X? Just dial in 52C max and 100% fan in Catalyst Control Center. Your GPU should not see 50C after an hour of gaming, and it's surprisingly not loud.


----------



## Gamedaz

* In my mini case build I just let the fans crank at 100%.

They are loud, but at 6 feet back it's not too distracting, as long as the temps are stable.


----------



## Gamedaz

*Can you find what the power draw from the GPU is?


----------



## Agent Smith1984

A year from now, we'll have some well-developed monitoring apps and overclocking apps with voltage control and some really nice drivers, along with some DX12 games, and we'll be glad we got these cards.... I just hope they aren't dirt cheap when that happens... Though I do feel good about getting my Fury new for $520.... The GTX 980 has been $500-550 for a while now, and the 390X still fetches $430, and while the 390 is a superb value at 1080p, this Fury is unbelievable at 4K!

This is my first over-$400 GPU purchase since 2005! Fury don't fail me now! lol


----------



## Gamedaz

* Valve should include a software temp monitor in Steam that you can reference somewhere; how difficult would it be to implement?

* These cards should play most new releases for another few years.

Then it's time to move to true photorealism with better DirectX 12 shaders that can use tons of memory bandwidth and draw up wonderful-looking images... I'm sure that's what HBM or GDDR5X will be meant for. It's a trick that requires at least 1TB of memory, but in the end it's better than waiting for ray-tracing compute to come to market.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> A year from now, we'll have some well-developed monitoring apps and overclocking apps with voltage control and some really nice drivers, along with some DX12 games, and we'll be glad we got these cards.... I just hope they aren't dirt cheap when that happens... Though I do feel good about getting my Fury new for $520.... The GTX 980 has been $500-550 for a while now, and the 390X still fetches $430, and while the 390 is a superb value at 1080p, this Fury is unbelievable at 4K!
> 
> This is my first over-$400 GPU purchase since 2005! Fury don't fail me now! lol


I just looked at the Sapphire Fury at Micro Center. Was thinking of your card when I did... so tempting... so tempting...


----------



## Arizonian

@Agent Smith1984

Thanks for sharing man, +1 rep for the info since install...congrats on the Fury. Awesome it can crunch Crysis 3 @ 4K res.









I'm hoping I get my 780Ti sold to someone I know locally that's undecided. I've been eyeing the *Gigabyte Fury* which isn't available on Newegg just yet.

I only see this card getting better with time, and I'll be at 1440p for the foreseeable future. By April I may make the jump to 4K myself; if not, at the very least IPS 120Hz @ 1440p with HBM1 on board.


----------



## buildzoid

Quote:


> Originally Posted by *Arizonian*
> 
> @Agent Smith1984
> 
> Thanks for sharing man, +1 rep for the info since install...congrats on the Fury. Awesome it can crunch Crysis 3 @ 4K res.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm hoping I get my 780Ti sold to someone I know locally that's undecided. I've been eyeing the *Gigabyte Fury* which isn't available on Newegg just yet.
> 
> I only see this card getting better with time, and I'll be at 1440p for the foreseeable future. By April I may make the jump to 4K myself; if not, at the very least IPS 120Hz @ 1440p with HBM1 on board.


From what I can tell the Gigabyte R9 Fury is just using a stretched reference PCB and it looks like the cooler won't be on par with the XFX or the Sapphire card because those both have a direct pass through the finstack while the GB needs to force the air around the PCB.


----------



## xxela

@Agent Smith1984
Hi man, glad to see that everything is working fine now! Can you test Crysis 3 at 1440p ultra with 2xMSAA and 4xMSAA? I want to know what FPS to expect. Thanks


----------



## Medusa666

Finally got a good Fury X: minimal pump noise, no rattling fan, and zero coil whine. Amazing.


----------



## Neon Lights

Quote:


> Originally Posted by *Agent Smith1984*
> 
> A year from now, we'll have some well developed monitoring apps and overclocking apps with voltage control and some really nice drivers, along with some dx12 games, and we'll be glad we got these cards.... I just hope they aren't dirt cheap when that happens... Though i do feel good about getting my fury new for $520.... Gtx 980 has been $500-550 for a while now, and 390x still fetches $430, and while the 390 is a superb value in 1080p, this fury is unbelievable at 4k!
> 
> This is my first over-$400 GPU purchase since 2005! Fury don't fail me now! lol


A year from now the Arctic Islands GPUs will have been released, unfortunately.

I personally just hope that the first games actually developed with DirectX 12 or Vulkan in mind, released prior to the Arctic Islands GPUs, will run best on Fiji GPUs, because then they can show their actual strength over the Nvidia GPUs.


----------



## clubber_lang

I am so freakin' impressed by this new Sapphire R9 fury Tri-X card!!

My old 7970s running on old drivers didn't even come close to what this card is doing in my machine right now. In rFactor 2 with my 7970s, I couldn't get above 16-25 FPS coming out of the pits (with settings turned down!), and maybe hit between 25 and 50 FPS on the track in low traffic (usually between 20-35 FPS). This new R9... pushing about 60-75 FPS coming out of the pits, with anywhere from 75-110 FPS on the track, in traffic! Silky smooth! I haven't seen it fall below 60 FPS, and most of the time it stays up around 90 FPS!

Coming from a race sim guy, this card is the bee's knees, man! Looks so damn good on my new Acer 34" 3440x1440 monitor too!

I'm in racing heaven right now!

Glad I took the time today to learn how to uninstall old drivers and install new ones correctly. I'm not going to fall that far behind again ever. Lesson learned.


----------



## Himo5

So even a little Spectre can make a difference. Or 72 GCN cores are better than 64.


----------



## SuperZan

Quote:


> Originally Posted by *Neon Lights*
> 
> A year from now the Arctic Islands GPUs will have been released, unfortunately.
> 
> I personally just hope that the first games actually developed with DirectX 12 or Vulkan in mind, released prior to the Arctic Islands GPUs, will run best on Fiji GPUs, because then they can show their actual strength over the Nvidia GPUs.


I've been a bit concerned over this as well. Given AMD's recent driver history, it seems the Fury series will only just be coming into its own when the Arctic Islands GPUs begin rollout. That said, I'm keen on Arctic Islands, though I may sit out the first iterations to see what sort of partner options AMD allows.

edit: spelling


----------



## xer0h0ur

The only reason that Fiji cards could come on strong well after their release is because DX12 games could finally begin using GCN's async compute engines more. Well that and we don't know what we have in store yet in the upcoming Omega-esque Catalyst driver due this month (reportedly).


----------



## SuperZan

Quote:


> Originally Posted by *xer0h0ur*
> 
> The only reason that Fiji cards could come on strong well after their release is because DX12 games could finally begin using GCN's async compute engines more. Well that and we don't know what we have in store yet in the upcoming Omega-esque Catalyst driver due this month (reportedly).


Yes, the purported Omega 2 driver update (as it were) is precisely what I was referring to with regard to AMD's recent driver history. Omega the First was quite the expansive bit of work.









We'll see what's actually delivered of course, but to live is to dream.


----------



## Thoth420

Couple more weeks and my cooling should be
Quote:


> Originally Posted by *clubber_lang*
> 
> I am so freakin' impressed by this new Sapphire R9 fury Tri-X card!!
> 
> My old 7970s running on old drivers didn't even come close to what this card is doing in my machine right now. In rFactor 2 with my 7970s, I couldn't get above 16-25 FPS coming out of the pits (with settings turned down!), and maybe hit between 25 and 50 FPS on the track in low traffic (usually between 20-35 FPS). This new R9... pushing about 60-75 FPS coming out of the pits, with anywhere from 75-110 FPS on the track, in traffic! Silky smooth! I haven't seen it fall below 60 FPS, and most of the time it stays up around 90 FPS!
> 
> Coming from a race sim guy, this card is the bee's knees, man! Looks so damn good on my new Acer 34" 3440x1440 monitor too!
> 
> I'm in racing heaven right now!
> 
> Glad I took the time today to learn how to uninstall old drivers and install new ones correctly. I'm not going to fall that far behind again ever. Lesson learned.


Another person happy with the Acer Predator.... I might have to go nuts and breach my budget. The 27-inch 2560x1440 FreeSync offerings all seem pretty craptastic for the price. Eizo still hasn't mentioned a price or date on theirs. I am not a racing-game guy, more into shooters and MMOs etc., and have never gamed on a widescreen. Does it fisheye in first-person games?


----------



## $k1||z_r0k

some Fury X2 news...

http://wccftech.com/amd-r9-fury-x2-specs-gemini/


----------



## battleaxe

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> some Fury X2 news...
> 
> http://wccftech.com/amd-r9-fury-x2-specs-gemini/


Something is wrong with that link


----------



## Agent Smith1984

After testing a little more, it's clear that the HBM is the star of the show on the Fury....

At 1080p, where frame buffer bandwidth matters less, the Fiji core shows only a marginal 3-10% gain at best over my "old" 390....

Considering the much higher shader count, that's not that impressive... I attribute this to AMD not increasing the ROP count...

However, as we get into higher resolutions, where frame buffer performance is key, the Fury comes alive and blatantly outperforms the Hawaii/Grenada cards... Mind you, my 390 was running 8GB at 1750MHz and, if I recall correctly, was putting up 420-something GB/s of bandwidth, which is absurd prior to HBM, and yet, in 4K testing, the 512GB/s HBM-equipped Fury absolutely dominates the 390....

In 1080p testing, my overclocked 390 was basically performing the same as the Fury.... (though the Fury was faster than the stock 390 by a slight bit)

At 4K in Crysis 3, though, using very high graphics settings and high system spec, the Fury is getting 15-20FPS more than the 390... It's outstanding!
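For anyone wondering where bandwidth figures like these come from, here is a minimal sketch of the peak-bandwidth arithmetic (the data rates and bus widths below are the commonly published specs for these cards, not figures from the post):

```python
def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective transfer rate (MT/s)
    times bus width in bytes, scaled to GB/s."""
    return transfer_rate_mts * (bus_width_bits // 8) / 1000

# R9 390 GDDR5 overclocked to 1750 MHz (7000 MT/s effective) on a 512-bit bus
print(peak_bandwidth_gbs(7000, 512))   # 448.0
# Fiji HBM1: 500 MHz DDR (1000 MT/s effective) on a 4096-bit bus
print(peak_bandwidth_gbs(1000, 4096))  # 512.0
```

The ultra-wide 4096-bit HBM bus is what lets Fiji clear 512GB/s at a tiny memory clock, which lines up with the 4K results described above.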


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> After testing a little more, it's clear that the HBM is the star of the show on the Fury....
> 
> At 1080p, where frame buffer bandwidth matters less, the Fiji core shows only a marginal 3-10% gain at best over my "old" 390....
> 
> Considering the much higher shader count, that's not that impressive... I attribute this to AMD not increasing the ROP count...
> 
> However, as we get into higher resolutions, where frame buffer performance is key, the Fury comes alive and blatantly outperforms the Hawaii/Grenada cards... Mind you, my 390 was running 8GB at 1750MHz and, if I recall correctly, was putting up 420-something GB/s of bandwidth, which is absurd prior to HBM, and yet, in 4K testing, the 512GB/s HBM-equipped Fury absolutely dominates the 390....
> 
> In 1080p testing, my overclocked 390 was basically performing the same as the Fury.... (though the Fury was faster than the stock 390 by a slight bit)
> 
> At 4K in Crysis 3, though, using very high graphics settings and high system spec, the Fury is getting 15-20FPS more than the 390... It's outstanding!


How is overclocking going? Are you using the latest GPU-Z for the VRM sensor? (I'm surprised it hasn't detected one, is all.)

Proportionally, how much faster is that? 30 vs 45 FPS? Because that's huge!


----------



## Wuest3nFuchs

Hello!

My very first machine was a Chiligreen (a brand only known here in Austria) AMD/ATI complete PC with a 9500 and then a 9700 Pro from ATI, hell yeah, eleven years ago!

The reason I went away from their GPUs was that they died after a few months.

But all that came down to that first machine and the tech guy who tried to repair it more than 7 times (the whole machine was never stable).

So after getting more than mad/bored, I finally got a refund.

So here I am, changing from the green to the red team, as I'm not willing to give Nvidia's practices and business any more money or even attention!

After spending nearly *3 months of testing and fixing instabilities* on a 980 Classified, a friend told me he really needed a better GPU, so I sold it to him and he is very happy with it!

A new GPU has been on my mind since this year started, but after months of research my thoughts came back to... read below.

Last week I wanted to order a 980 FTW, but as I read about the issues the FTW model has, I kindly asked them if they could give me the option of a Sapphire R9 Fury instead... which they accepted!

My main reason is that I've really had enough of the green team and what they've done over the past years.

I won't list it all, to prevent a war between the green and red camps.

*A few questions*

One thing the GTX 980 did well, which I noticed after switching back to my old GTX 670 FTW, was input lag and image quality... can I expect similarly good IQ on the Fury?

Is the Gaming Evolved app good for anything, or is it similar to GeForce Experience?

Will an i7 2700K be enough for her? It was enough for the 980, but that doesn't count anymore!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> How is overclocking going? Are you using the latest GPU-Z for the VRM sensor? (I'm surprised it hasn't detected one, is all.)
> 
> Proportionally, how much faster is that? 30 vs 45 FPS? Because that's huge!


I have found so far that the best clock for the HBM is actually around 530MHz.... It will bench higher, but it gets driver crashes past 550, and it scores lower past 530 so far....

As far as the core goes, I haven't tested as much as I'd like yet..... I have 1075 nailed down, but I don't think she's going to do much more. The core voltage only reports as 1.204V under load, which is lower than the ~1.24V that I believe the Tri-X cards run at.....

If I could dump some more juice through it, I'm sure I could nail down 1125-1150, which I believe would greatly impact performance....

Still no VRM sensor, but no biggie; I'd be shocked if it was breaking 80C, and it may be even lower than that.

This cooler is designed by PowerColor and will be the same as their PCS version of the Fury.

PowerColor has a great track record with their PCS line of coolers....

As far as performance versus the 390....

At 4k, my 390 @ 1200/1700 would average about 54FPS in Crysis 3 with Very High details, High System Spec settings.... this card is averaging around 68FPS, but the minimum FPS is 10FPS higher, and the max FPS is about 20 frames higher in some cases!
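Putting those two averages side by side, a quick sketch of the relative gain (both FPS figures are the ones quoted above):

```python
r390_avg_fps = 54.0   # overclocked 390 @ 1200/1700, Crysis 3 4K average, per post
fury_avg_fps = 68.0   # Fury @ 1075/550, same settings, per post

# Percentage improvement of the Fury over the overclocked 390
gain_pct = (fury_avg_fps - r390_avg_fps) / r390_avg_fps * 100
print(f"{gain_pct:.1f}% higher average")  # 25.9% higher average
```

Roughly a quarter more frames on average at 4K, on top of the higher minimums, is consistent with the bandwidth-bound picture described earlier in the thread.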


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> At 4k, my 390 @ 1200/1700 would average about 54FPS in Crysis 3 with Very High details, High System Spec settings.... this card is averaging around 68FPS, but the minimum FPS is 10FPS higher, and the max FPS is about 20 frames higher in some cases!


* From what I understand, you need a lot of VRM phases to OC, though it seems that AMD has a few tricks with how its VRMs process sine waves to reduce frame buffer times.

* I like the framerates it's getting in Crysis 3 @ 4K. I wonder if it can do VSR on a 1080p set and put out over 60 FPS; that would be interesting.

* My GTX 780 Ti averaged 30-40FPS @ 4K, and scaling was average @ 1080p, 40-55 FPS. Remarkably, the low framerates were not too noticeable; it played really smoothly even at 37-45 FPS. Really good code.


----------



## Gamedaz

Catalyst is EOL!

* Radeon has ditched the 12-year-old Catalyst driver control center and will be introducing Radeon Software, similar to Nvidia's GeForce Experience, with per-game overclocking settings and control.

http://www.pcgamer.com/amds-new-radeon-software-is-making-catalyst-a-thing-of-the-past/


----------



## Vlada011

The new software looks good, at least from the pictures.
I doubt they will implement anything revolutionary.
Even the most advanced users and gamers try to escape from NVIDIA Experience at any cost.
News that NVIDIA wants to force people to install it with the driver caused rage and revolt.


----------



## buildzoid

The game-specific OC is nice. I can run some really heavy games at 1170MHz core, but less GPU-heavy games like Toxikk and Tera crash for some odd reason, so I need to run those at 1145.


----------



## Thoth420

Quote:


> Originally Posted by *Vlada011*
> 
> The new software looks good, at least from the pictures.
> I doubt they will implement anything revolutionary.
> Even the most advanced users and gamers try to escape from NVIDIA Experience at any cost.
> News that NVIDIA wants to force people to install it with the driver caused rage and revolt.


GeForce Experience is purely worthless garbage. Forcing it is a terrible idea.


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> GeForce Experience is purely worthless garbage. Forcing it is a terrible idea.


Whatever do you mean? I love my Gfarce Experience. It goes perfectly with my 970 which has 4gb... ahem... I mean 3.5gb of RAM.


----------



## Thoth420

Quote:


> Originally Posted by *battleaxe*
> 
> Whatever do you mean? I love my Gfarce Experience. It goes perfectly with my 970 which has 4gb... ahem... I mean 3.5gb of RAM.


Gfarce. I like that....gonna use it from now on. Nvidia Gfarce: The way it's meant to degrade!


----------



## battleaxe

LOL...


----------



## Agent Smith1984

Got the graphics score to 15,800+ in Firestrike now @ 1080/530, stable so far... Nothing groundbreaking for 1080p, but haven't run Ultra yet (don't have it)


----------



## EpicOtis13

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Got the graphics score to 15,800+ in Firestrike now @ 1080/530, stable so far... Nothing groundbreaking for 1080p, but haven't run Ultra yet (don't have it)


What are you getting for graphics score?


----------



## Vlada011

Of course it's garbage, but people could always extract the driver and update from Device Manager.
Only then there's no NVIDIA Control Panel, and some people use a few of its settings.
The whole NVIDIA Experience is marketing, used only to push people to upgrade hardware earlier.
NVIDIA would rather recommend when people need to upgrade than have people decide on their own settings. That's why they push Experience and auto-optimization so hard.
You need 2x TITAN X for high settings without AA... if you ask NVIDIA Experience.


----------



## Gumbi

Quote:


> Originally Posted by *EpicOtis13*
> 
> What are you getting for graphics score?


Reread his post.


----------



## ht_addict

Anyone know if there is a difference in the BIOSes on the Sapphire Fury X cards? Has anyone flashed an updated BIOS to their card?


----------



## xer0h0ur

I thought the reason people were up in arms about the GFE thing was because of forcing registration, aka Nvidia farming e-mails to inevitably sell off to some rando company.


----------



## en9dmp

Hi guys, not been on for a while, but googling Fury X voltage control still just brings up old pages from June and July... Do we still not have it? If we don't have it by now then I guess it will never come... :'(


----------



## fewness

Quote:


> Originally Posted by *en9dmp*
> 
> Hi guys, not been on for a while, but googling Fury X voltage control still just brings up old pages from June and July... Do we still not have it? If we don't have it by now then I guess it will never come... :'(


It's hardware mod or never....I've accepted the fate...


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> I thought the reason people were up in arms about the GFE thing was because of forcing registration, aka Nvidia farming e-mails to inevitably sell off to some rando company.


I hate it because it is unnecessary bloatware, and since its release I have had to fix numerous driver corruption issues for almost all of my gamer friends due to the auto driver updates via GFE. It's touted as a great tool for those new to graphics settings but ends up being a pain. It also presets some terrible settings for games.

I hope the new Radeon software is not similar.


----------



## EpicOtis13

Quote:


> Originally Posted by *Gumbi*
> 
> Reread his post.


I thought that might have been his total score, just because it seems so low compared to my 290s.


----------



## Gumbi

Quote:


> Originally Posted by *EpicOtis13*
> 
> I thought that might have been his total score, just because it seems so low compared to my 290s.


My best 290X score is 14.8k. So 15.8k isn't bad by any stretch...


----------



## EpicOtis13

Quote:


> Originally Posted by *Gumbi*
> 
> My best 290X score is 14.8k. So 15.8k isn't bad by any stretch...


I was talking about my dual 290s. I wish I still had my results from my 7970s, since those are similar to a Fury.


----------



## Gamedaz

Does the XFX Fury have a dual-BIOS switch?


----------



## xer0h0ur

Quote:


> Originally Posted by *en9dmp*
> 
> Hi guys, not been on for a while, but googling Fury X voltage control still just brings up old pages from June and July... Do we still not have it? If we don't have it by now then I guess it will never come... :'(


Unwinder is currently working on Afterburner to include voltage control for Fiji and the MSI Lightning 980 Ti. He didn't have a Fiji card to do the work before but one finally made its way through Russia to him. I have no idea what is the hold up with W1zzard's Trixx software.


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> I have no idea what is the hold up with W1zzard's Trixx software.


Sapphire is terrible at pushing Trixx updates. It took them 4 or 5 months to add voltage control for the Vapor-X HD 7970 and 7950. I still have a beta version of Trixx that was never officially released, and as far as I know it's the only version with support for the Vapor-X cards.

Trixx 5.0.0 is a buggy, broken mess ATM. It works well enough for me to use, but it's really, really broken. 4.8.9 caused all kinds of crashes. 4.4.0 (the Vapor-X-compatible version I have) never launched. IMO W1zz is busy trying to fix the disaster that is 5.0.0; once everything works we'll see 5.1.0 with a working UI, maybe with voltage control for Fury, but I wouldn't count on it.


----------



## fewness

It's hard for me to accept the truth that there are a total of 2 (!) people in this whole world working on OC software for a flagship graphics card. Talk about a niche market.....


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> It's hard for me to accept the truth that there are a total of 2 (!) people in this whole world working for OC software of a flagship graphics card. Talking about niche market.....


Hopefully once I finish my Computer Systems Engineering degree I'll know enough to be the 3rd but for now I will just churn out hard mods for anything and everything I get my hands on and find a datasheet for.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> Hopefully once I finish my Computer Systems Engineering degree I'll know enough to be the 3rd but for now I will just churn out hard mods for anything and everything I get my hands on and find a datasheet for.


I'm so tempted to follow your instruction and try that hardware mod...but I've never done anything like that before...

Question: why do you need a variable resistor? Am I supposed to adjust it every time I OC? Can I just use a fixed-value resistor to get a fixed voltage? Like, if I just want to fix it at 1.4V core/1.6V HBM, what should I do?


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> I'm so tempted to follow your instruction and try that hardware mod...but I've never done anything like that before...
> 
> Question: why do you need a variable resistor? Am I supposed to adjust it every time I oc? Can I just use one fixed value resistor to get a fixed voltage? Like, I just want to fix it at 1.4V core/1.6V HBM, what should I do?


The reason you should use a variable resistor is that, while there is an equation which should give you the correct voltage-vs-resistance values, I have never set a resistance and then gotten the voltage I expected. The guide says I'm stuck at 1.3V, even though the DIP-switch configuration was calculated to provide 1.45V. So it's best to do it with a variable resistor, because then all you have to do to configure the mod is give the GPU a 3D load and adjust the voltage until it's what you want to use. I strongly recommend putting a 10-ohm resistor in series with the variable resistor so that you don't accidentally set the variable resistor to 0 ohms, which would give you something north of 2V.
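The behaviour buildzoid describes can be sketched with the usual feedback-divider math for an adjustable regulator. Everything below is an assumption for illustration: the topology (a mod resistor in parallel with the lower feedback resistor) and all component values are invented, not the actual Fiji VRM figures, which is exactly why he recommends dialing the voltage in under load instead of trusting the equation.

```python
# Illustrative feedback-divider model of a volt mod. All values are assumed
# for demonstration; they are NOT the real Fiji VRM component values.
#   Vout = Vref * (1 + R_up / (R_down || R_mod))

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def modded_vout(vref: float, r_up: float, r_down: float, r_mod: float) -> float:
    """Regulator output with a mod resistor across the lower feedback leg."""
    return vref * (1 + r_up / parallel(r_down, r_mod))

VREF, R_UP, R_DOWN = 0.6, 1000.0, 1000.0  # assumed: gives 1.2 V stock
SERIES_SAFETY = 10.0                      # the recommended 10 ohm series resistor

stock = VREF * (1 + R_UP / R_DOWN)
# Variable resistor at its maximum (the safe starting point): barely any change.
vr_max = modded_vout(VREF, R_UP, R_DOWN, 50_000.0 + SERIES_SAFETY)
# Variable resistor dialed down to 0 ohm: only the safety resistor remains and
# the output runs away, which is what the series resistor protects against.
vr_min = modded_vout(VREF, R_UP, R_DOWN, 0.0 + SERIES_SAFETY)

print(f"stock {stock:.3f} V, VR max {vr_max:.3f} V, VR min {vr_min:.1f} V")
```

The key property the sketch shows: turning the variable resistor's value down only ever raises the output, and the series resistor caps how far it can go.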


----------



## xer0h0ur

Quote:


> Originally Posted by *fewness*
> 
> It's hard for me to accept the truth that there are a total of 2 (!) people in this whole world working for OC software of a flagship graphics card. Talking about niche market.....


The days of individuals making software for free are mostly gone. It doesn't make sense for the guys making it to release it for free when they could get companies like MSI and Sapphire to pay them for it, and it's not exactly easy for another person to come out of the woodwork with their own software built from the ground up. Most people have lives and actual jobs, so projects like the one I am describing wouldn't be easy to pull off.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> The reason that you should use a variable resistor is that while there is an equation which should give you the correct voltage vs resistance values I have never successfully set a resistance and then gotten the voltage I expected. In the guide it says I'm stuck at 1.3V however the configuration of dip switch was calculated to provide 1.45V. So it's best to do it with a variable resistor because then all you have to do to configure the mod is give the GPU a 3D load and adjust the voltage until it's what you want to use. I strongly recommend putting a 10ohm resistor in series with the variable resistor so that you don't accidentally set the variable resistor to 0 ohms which would give you something north of 2V.


No shortcuts, I see...

So should I put the variable one at the highest-ohms end to begin with?

I have mine on an EK waterblock now. What's the safe voltage you'd recommend for core and HBM? Is it realistic to expect a stable 1250 core/600 HBM at the highest safe voltage? Maybe I'm expecting too much, but I just want to convince myself the gain will be worth the risk.


----------



## fewness

Quote:


> Originally Posted by *xer0h0ur*
> 
> The days of individual people making software for free are mostly gone. It doesn't make sense for these guys who are making it to release it for free if they could get companies like MSI and Sapphire to pay them for it and its not exactly easy for another person to come out of the woodworks with their own software made from the ground up. Most people have lives and actual jobs so projects like I am describing wouldn't be easy to pull off.


Still, it's a bit shocking to me that there are only 2 companies, each putting only 1 person on this OC software project. Doesn't that mean there is no demand, rather than it being technically challenging?


----------



## xer0h0ur

Quote:


> Originally Posted by *buildzoid*
> 
> The reason that you should use a variable resistor is that while there is an equation which should give you the correct voltage vs resistance values I have never successfully set a resistance and then gotten the voltage I expected. In the guide it says I'm stuck at 1.3V however the configuration of dip switch was calculated to provide 1.45V. So it's best to do it with a variable resistor because then all you have to do to configure the mod is give the GPU a 3D load and adjust the voltage until it's what you want to use. I strongly recommend putting a 10ohm resistor in series with the variable resistor so that you don't accidentally set the variable resistor to 0 ohms which would give you something north of 2V.


Right, but isn't each die also running a different voltage? Not every die can necessarily handle a given voltage.
Quote:


> Originally Posted by *fewness*
> 
> Still, it's a bit shocking to me that there are only 2 companies, each putting only 1 person on this OC software project. Doesn't that mean there is no demand, rather than it being technically challenging?


I don't think you realize that overclocking as a whole is a niche market. Out of the thousands upon thousands of video cards sold on every generation for AMD and Nvidia, only a mere fraction of those cards are ever overclocked.


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> No shortcut I see...
> 
> So should I put the variable one on the highest ohms end to begin with?
> 
> I have mine on EK waterblock now, what's the safe voltage you'd recommend for core and HBM? will it be realistic to expect a stable 1250 core/600 HBM at the highest safe voltage? Maybe I'm expecting too much but I just want to convince myself the gain will worth the risk


Yeah, with Vmods you always start with the VR at maximum ohms, then apply a 3D load (Unigine Heaven is good) and slowly turn the resistance down, which raises the voltage.

If you're getting 1120-1150 core on stock Vcore, 1.4V should give you 1200-1300MHz in theory. My HBM, however, doesn't seem to scale with voltage in the slightest. Maybe my HBM is running too hot or something.

For Vcore on water I wouldn't go over 1.4V. Many people will say that's high, but considering there's an 11% decrease in power draw just from lowering core temps from 80C to 55C, I'd say if you're sub-60C on the core you can run 1.4V.

The HBM is specified by SK Hynix to run at 1.2V. In practice it's run at 1.35V, so I don't think overvolting it above 1.4V is safe. I also doubt the HBM will scale with voltage, but your mileage may vary.


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> Right but isn't each die also running a different voltage and every die can't necessarily handle a random given voltage?


All my Hawaii cards had the same stock voltage. I can almost guarantee that all Fiji GPUs on the reference PCB have a Vcore of 1.22V. You are free to try to prove me wrong. I've so far only had 1 Fury in my hands, and it came stock at 1.22V. W1zz's reviews also show the Fury Tri-X and the Fury X at 1.22V.


----------



## xer0h0ur

Oh I am not here to challenge your observations. Only asking you since I believe someone else had mentioned the Fury X dies were getting varied voltages.


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh I am not here to challenge your observations. Only asking you since I believe someone else had mentioned the Fury X dies were getting varied voltages.


I'd be a pretty bad "scientist" if I didn't challenge people to prove me wrong.


----------



## battleaxe

Quote:


> Originally Posted by *buildzoid*
> 
> All my Hawaii cards had the same stock voltage. I can almost guarantee that all Fiji GPUs running the reference PCB have a Vcore voltage of 1.22V. You are free to try prove me wrong. I've so far only had 1 Fury in my hand which came stock at 1.22V. W1zz's reviews also show the Fury Tri-X and the Fury X at 1.22V.


Maybe, maybe not. My Hawaii card runs about 0.1V lower than almost everyone else's. The BIOS was different. It seems that depending on how the chips are binned, they could be shipping a slightly different BIOS that uses different voltages. My friend bought an MSI Gaming 290 at the exact same time and his was like everyone else's. So IDK. The Furys might be different, though. Hard to tell.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> Yeah with Vmods you always start with the VR at maximum ohms then apply a 3D load(Unigine Heaven is good) and slowly turn the resistance down which results in the voltage going up.
> 
> If you're getting 1120-1150 core on stock Vcore 1.4V should give you 1200-1300mhz in theory. My HBM however seem to not scale with voltage in the slightest. But maybe my HBM is running too hot or something.
> 
> For Vcore on water I wouldn't go over 1.4V and many people will say that's high but considering that there's an 11% decrease in power draw by just lowering core temps from 80C to 55C I'd say if you're sub 60C on core you can run 1.4V.
> 
> The HBM is speced by SK Hynix to run 1.2V. IRL it's run at 1.35V and so I don't think over volting it above 1.4V is safe. Also I doubt the HBM will scale with voltage but your mileage may vary.


Great. So maybe I will just skip the HBM for my first attempt and get the core to 1.4V.

I'm searching Amazon for all the things you mentioned in your blog now.

Just a couple more questions:

Are those LED voltmeters accurate enough? I'm afraid even 0.1V of inaccuracy would lead to either no OC at all or setting the chip on fire.... e.g. http://www.amazon.com/bayite-wires-Digital-Voltmeter-Display/dp/B00YALV0NG/ref=sr_1_12?ie=UTF8&qid=1446600791&sr=8-12&keywords=led+0-30V+voltmeter+3+wire

I'm running a 1250W power supply for a 5960X + Fury X CF now. If I overvolt both Fury Xs, will the PSU hold? The tools for the job seem cheap, and I'm not sure I want to invest in another 1600W-ish PSU for this...


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> Great. So maybe I will just skip HBM for my first work and get core to 1.4V.
> 
> I'm searching amazon for all things you mentioned in your blog now
> 
> 
> 
> 
> 
> 
> 
> Just couple of more questions:
> 
> Are those LED voltmeter accurate enough? I'm afraid even 0.1V inaccuracy would lead to either no oc at all or set the chip on fire.... e.g. http://www.amazon.com/bayite-wires-Digital-Voltmeter-Display/dp/B00YALV0NG/ref=sr_1_12?ie=UTF8&qid=1446600791&sr=8-12&keywords=led+0-30V+voltmeter+3+wire
> 
> I'm running a 1250W power supply for 5960x+Fury X CF now. If I over volt both Fury X, will the PSU hold? Tools to do the job seem cheap, I'm not sure if I want to invest another 1600W-ish PSU for this...


Do you not have a good DMM that you could reference those voltmeters against? I'm pretty sure that if they are off by 100mV, they will be 100mV off across all voltages; so if they read 1.12V instead of 1.22V, just assume it is 1.22V in reality and set the VR so that the voltmeters read 1.3V.
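The constant-offset calibration buildzoid suggests boils down to two subtractions. A minimal sketch, assuming (as he does) that the cheap panel meter's error really is a fixed offset; that assumption may not hold for every meter, so spot-check against the DMM at more than one voltage:

```python
# Calibrate a cheap panel voltmeter against a trusted DMM reading, assuming
# the meter's error is a constant offset across the whole range.

def meter_offset(meter_reading: float, dmm_reading: float) -> float:
    """Offset to add to the meter's display to get the true voltage."""
    return dmm_reading - meter_reading

def corrected(meter_reading: float, offset: float) -> float:
    """True voltage implied by a meter reading, given the calibrated offset."""
    return meter_reading + offset

# Meter shows 1.12 V while the DMM says stock Vcore is really 1.22 V:
off = meter_offset(1.12, 1.22)   # +0.10 V
# To actually land at 1.30 V real, stop adjusting when the meter displays:
target_display = 1.30 - off      # 1.20 V on the meter
```

Usage: take the one-time offset with the card at stock, then work entirely off the corrected display while turning the pot under load.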

For the PSU, yeah, 1250W should be enough. However, there is a chance the cards will ask for 400W or more, and that might be a problem. I suggest you pick up a socket-based watt meter so you can check the power the PSU draws. If it's a Gold-rated PSU, 1420W at the wall is the limit, since at that point you're pulling almost exactly 1250W from the PSU itself.
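The wall-meter arithmetic above can be sketched in a couple of lines. The efficiency figure is an assumption: 80 Plus Gold units are roughly 87-90% efficient near full load, and 0.88 is used here only to show how 1420W at the wall maps to ~1250W of DC load.

```python
# Rough wall-power sanity check for an overvolted system. The 0.88 efficiency
# is an assumed ballpark for an 80 Plus Gold PSU near full load.
GOLD_EFFICIENCY = 0.88

def dc_load_from_wall(wall_watts: float, efficiency: float = GOLD_EFFICIENCY) -> float:
    """Estimate the DC load the PSU is delivering from a wall-meter reading."""
    return wall_watts * efficiency

def wall_ceiling(psu_rating_watts: float, efficiency: float = GOLD_EFFICIENCY) -> float:
    """Wall-meter reading at which the PSU hits its rated DC output."""
    return psu_rating_watts / efficiency

print(round(dc_load_from_wall(1420)))  # ~1250 W of DC load
print(round(wall_ceiling(1250)))       # ~1420 W at the wall for a 1250 W unit
```

If the socket meter reads under that ceiling during a combined stress test, the PSU is inside its rating.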

Make sure you get a good soldering pen with a good fine chisel tip. The chisel tip is key: if you get a needle tip, your life will suck. If the chisel tip you buy is too big, your life will also suck.


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> Do you not have a good DMM that you could reference those voltmeters against? I'm pretty sure that if they are incorrect by 100mv they will be 100mv off across all voltages so if they read 1.12 instead of 1.22 just assume it is 1.22V IRL and set the VR such that the volt meters read 1.3V.
> 
> For the PSU yeah 1250W should be enough. However there is a chance the cards will ask for 400W or more and that might be a problem. I suggest you pick up a socket based watt meter so that you can check the power the PSU sucks in. If it's a Gold rated PSU 1420W at the wall is too much since you're pulling almost exactly 1250W from the PSU itself.
> 
> Make sure you get a good soldering pen with a good fine chisel tip. The chisel tip is key if you get a needle tip your life will suck. If the chisel tip you buy is too big your life will also suck.


Cool, so I'll pick up one of those LED toys, along with a DMM... It's good to see a number to ease my mind.

Just did a watt measurement... the system draws 800W in the Firestrike combined test, 750W in Valley... should have enough headroom~

Thank you so much!


----------



## mRYellow

Can someone here upload the Sapphire Fury Tri-X OC BIOS?
I'd like to flash my non-OC card with the OC BIOS.


----------



## buildzoid

Quote:


> Originally Posted by *mRYellow*
> 
> Can someone here upload the Sapphire Fury Tri X OC bios?
> Would like to flash by non OC with the OC bios.


Open Afterburner and set the core clock to 1040MHz. There, you just got the same result as flashing, without the risk of bricking your card.


----------



## mRYellow

Quote:


> Originally Posted by *buildzoid*
> 
> Open after burner. Set core clock to 1040mhz. There you just got the same results as flashing without running the risk of bricking your card.


Lol, I'm already running at 1070MHz.

I think the OC BIOS has more voltage; that's why I want it. Who knows when AB will get voltage support for the Fury.


----------



## buildzoid

Quote:


> Originally Posted by *mRYellow*
> 
> Lol, i'm already running at 1070mhz
> 
> 
> 
> 
> 
> 
> 
> 
> I think the OC has more voltage. That's why i want the bios. Who know when AB will get voltage support for the Fury.


I doubt it. You can measure the voltage pretty easily without any soldering using a DMM. Just check my volt mod guide: measure from any of the highlighted GND points to any of the highlighted VCC points.


----------



## gupsterg

Wanted to highlight this in case someone can test and report.

I was looking at Fury ROMs and noted that in the TPU database W1zzard has flagged the left ROM as having an increased power limit, while the right one is stock.
Quote:


> Originally Posted by *Neon Lights*
> 
> Does anyone know what would have to be done to edit a Fury X BIOS?


Even the very old AtomDis available on the web creates tables for these ROMs. Comparing the PowerPlay data table (which is where the power-limit values live in Hawaii cards), there are value changes between the two linked ROMs. I have not yet attempted to translate the values fully.

There is an increase in a hex value lower down the table, in a location similar to the values relating to power limit in Hawaii ROMs. There are 3 power-limit values in a Hawaii ROM, and I would assume that is the case for Fury as well.


----------



## Neon Lights

Quote:


> Originally Posted by *gupsterg*
> 
> Wanted to highlight this if someone can test and report.
> 
> Was looking at Fury rom and noted on TPU database W1zzard has highlighted left rom increased power limit and right is stock.
> Even the very old AtomDis available on web creates tables for these roms. Comparing PowerPlay data table (which is where in Hawaii cards powerlimit values are) there are value changes between the 2 linked roms. I have not yet made an attempt to translate values fully.
> 
> There is an increase on hex value lower down the table which is similar location for value which relate to powerlimit in Hawaii roms. There are 3 in Hawaii rom relating to powerlimit and would assume that would be the case for Fury as well.


I was primarily looking for a way to change the voltage. However, without a higher Power Limit overclocking is about as hard as without a higher voltage, so that is important as well. So, thank you for noting and looking into that.

I am planning to look into it.


----------



## gupsterg

Quote:


> Originally Posted by *battleaxe*
> 
> Maybe, maybe not. My Hawaii card runs about .1 lower volts than does almost everyone else. The BIOS was different. Seems that depending on how the chips are binned they could possibly be putting a slightly different BIOS to them to use different volts. My friend bought an MSI gaming 290 at the exact same time and his was like everyone else. So IDK. The Fury's might be different though. Hard to tell.


I'll have a go at explaining what I noticed with 290/X ROMs. I believe this is the same for other Hawaii GPU cards (I have had some results from other owners) _and_ may well be the same for newer cards, i.e. Fury.

Firstly, a ROM can have a GPU core voltage offset within the ROM (it can also be in the voltage-control chip, the same one used on Fury). If in MSI AB you see a preset GPU core voltage offset, your card has it. This offset applies to all DPM states; there are 8 in the ROM.

Next we must talk about ASIC profiling (Leakage ID (ASIC Quality)).
Quote:


> Originally Posted by *The Stilt*
> 
> High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due higher losses)
> Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient


So say an owner has ASIC quality x; they will have VID x for the GPU. Another owner with ASIC quality y will have VID y.

Next we must talk about EVV (Electronic Variable Voltage). In every stock ROM only the lowest GPU voltage state, DPM 0, is the same. EVV is used for DPM 1-7 so that, with all this profiling going on, each ROM does not have to be tailored to an exact GPU's properties.

One thing that also happens under EVV is that the default GPU clock affects the VID.

For example, when I flashed my card with a default GPU clock of 1100MHz I got a different VID than at 1000MHz. Everything in my testing was the same and the same ROM was used, only with an edited GPU clock. This behaviour applies under EVV; it stops only when a VID is manually set for a DPM state in the ROM. Now you can see why a manufacturer wouldn't use manually set voltages: they'd spend too many resources creating a tailored ROM per GPU used on a card.

So as the default GPU clock in a ROM goes higher, the set VID will be lower. An owner of a card with, say, 1000MHz as the default GPU clock will have a higher VID than an owner with 1100MHz. That's why OC-edition cards tend to have a GPU core voltage offset in the ROM or voltage chip; it's a quick, easy voltage fix.

So the 3 factors:

1) Leakage ID (ASIC quality)
2) Default GPU clock
3) The ROM or voltage chip having a global core voltage offset, which bumps voltage in all DPM states

There is more to this, as there are factors which deem an ASIC good or bad even with the same leakage; I cannot elaborate, as a) I don't have full info and b) I don't understand some of it.

Now you may be wondering how to assess VID (generally people note VDDC via software, a drooped value); well, The Stilt created an app, and it works for DPM7 on the EVV gen-1 cards listed in the linked post.
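The three factors can be pictured with a toy model. To be clear, the numbers and linear relationships below are invented purely to illustrate the directions gupsterg describes (higher leakage quality lowers VID, a higher default clock lowers the EVV-fused VID, and a ROM offset bumps every state equally); the real EVV calculation is not public.

```python
# Toy model of the three VID factors. All coefficients are invented for
# illustration only; this is NOT the real EVV algorithm.

def evv_vid(leakage_quality: float, default_clock_mhz: int) -> float:
    """Sketch: higher ASIC 'quality' (leakage) and a higher default clock
    both pull the fused VID down."""
    base = 1.30  # hypothetical ceiling voltage
    vid = base - 0.10 * leakage_quality - 0.00005 * (default_clock_mhz - 1000)
    return round(vid, 4)

def applied_voltage(leakage_quality: float, default_clock_mhz: int,
                    rom_offset_v: float = 0.0) -> float:
    """A ROM/voltage-chip offset bumps every DPM state by the same amount."""
    return evv_vid(leakage_quality, default_clock_mhz) + rom_offset_v

# Same silicon, 1000 MHz vs 1100 MHz default clock -> different VID, as
# observed when flashing a ROM with only the clock edited.
v1000 = applied_voltage(0.75, 1000)
v1100 = applied_voltage(0.75, 1100)
# An OC-edition card compensates with a global offset baked into the ROM.
v_oc = applied_voltage(0.75, 1100, rom_offset_v=0.025)
```

The point of the sketch is only the ordering: the 1100MHz ROM fuses a lower VID than the 1000MHz ROM on identical silicon, and the OC-edition offset is the quick fix that claws that voltage back.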


----------



## WheelZ0713

A brief scan suggests that there is still no voltage unlock for the Fury? Correct?


----------



## buildzoid

left BIOS is towards the display outputs right?

I run the left BIOS with unlock for 3840 cores.
Quote:


> Originally Posted by *WheelZ0713*
> 
> A brief scan suggests that there is still no voltage unlock for the Fury? Correct?


In software? No there isn't. However there are hard mods.


----------



## WheelZ0713

Quote:


> Originally Posted by *buildzoid*
> 
> left BIOS is towards the display outputs right?
> 
> I run the left BIOS with unlock for 3840 cores.
> In software? No there isn't. However there are hard mods.


Oooohhh. On the Sapphire or just the strix card still?


----------



## Agent Smith1984

Quote:


> Originally Posted by *EpicOtis13*
> 
> I was talking about my dual 290's. I wish I still had my results from my 7970's. Since those are similar to a Fury.


I tested dual 7970s at 1050/1500 days before going Fury... The Fury scores 400 fewer graphics points in regular Firestrike, but beats the 7970s at 4K in actual gameplay by 5-12fps, especially in minimum frames...


----------



## gupsterg

Quote:


> Originally Posted by *buildzoid*
> 
> left BIOS is towards the display outputs right?
> 
> I run the left BIOS with unlock for 3840 cores. .


Yes.


----------



## buildzoid

Quote:


> Originally Posted by *WheelZ0713*
> 
> Oooohhh. On the Sapphire or just the strix card still?


I made a guide to mod core voltage on all the reference PCB Fiji cards. That means the Fury X, Sapphire Fury, Powercolor Fury, XFX Fury.

Unfortunately I can't drop a link for it since it's on my blog and I already got one warning from the mods about advertising/external links/bla bla bla.... However if you google: "Actually Hardcore Overclocking Fury X volt mod guide" it should come right up.

I just remembered someone posted a link to it in the Vmod subforum and I can link to that. I hope...
Link to the link to the Vmod guide for reference PCB Furys and Fury X.


----------



## WheelZ0713

Quote:


> Originally Posted by *buildzoid*
> 
> I made a guide to mod core voltage on all the reference PCB Fiji cards. That means the Fury X, Sapphire Fury, Powercolor Fury, XFX Fury.
> 
> Unfortunately I can't drop a link for it since it's on my blog and I already got one warning from the mods about advertising/external links/bla bla bla.... However if you google: "Actually Hardcore Overclocking Fury X volt mod guide" it should come right up.


Found it. Thanks man, but it's a little out of my league.


----------



## EpicOtis13

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I tested dual 7970's at 1050/1500 days before going fury... The fury score 400 less graphic points in regular firestrike, but beats the 7970's at 4k in actual gameplay by 5-12fps, especially the min frames...


That's why I bought 290s for my 4K panel. However, my 7970s at 1300/1700 would probably trash a Fury at 1080p. I'm just waiting for some money to come in to decide whether or not I want to sell my second 7970 or put it in a backup rig.


----------



## Agent Smith1984

Quote:


> Originally Posted by *EpicOtis13*
> 
> that's why I bought 290's for my 4k panel. However, my 7970's at 1300/1700 would probably trash a fury at 1080p. I'm just waiting for some money to come in to decide wether or not I want to sell my second 7970 or put in in a backup rig.


Yeah, I ran these at 1150/1600 and it was pretty solid (no voltage control on the XFX stock BIOS)...

My two Tri-X 290s did really well, and I also ran two 390s at around 1175/1700, and that was murderous.... I am guessing adding a second Fury is going to really get it done at 4K... I want to play with voltage and tinker more with one card first, though... If we get voltage control soon, then I'll consider a second one...


----------



## Agent Smith1984

So.... Has anyone else gotten the XFX Fury yet?

I have not found any reviews or other owners... I think I'm the first on here... I really love the card, but I was hoping for a little better of a clocker... Also... I STILL can't get 3DMark to detect the card... It still says "generic VGA adapter", but GPU-Z and everything else detects it properly, and it functions perfectly.


----------



## Agent Smith1984

Go figure.....

All my X's are on one side (best chance to unlock all 8 CU's)....

But it's hard locked..


----------



## EpicOtis13

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, i ran these at 1150/1600 and it was pretty solid (no voltage control on xfx stock bios)...
> 
> My 2) tri-x 290's did really well, and i also ran 2) 390's at around 1175/1700 and that was murderous.... I am guessing adding a second fury is going to really get it done at 4k... I want to play with voltage and tinker more with one card first though... If we get voltage control soon, then I'll consider a second one...


I wish my 290 would do more than 1100/1350 fully stable. I really do need to try some benching with a higher power limit.


----------



## Otterfluff

I finished my pipe bending and I have my two Asus Fury X plumbed in!




I have to update my Catalyst drivers, but I was running 33-34C on the GPU core under load at stock settings and 27C when I under-volt 50%. I am playing around with a 630MHz HBM overclock and a -50% core under-volt, and I haven't had any crashes yet.

The GPU cores run 22C idle. This is mostly due to my MO-RA3, which sits right in the path of my AC, allowing me to get slightly sub-ambient temps on my water.



I have not connected my volt mods to the PCB yet ("spring clamp-in PCB terminal"), as I splashed a bit of water on it while filling the loop. I will likely try it later. I have only soldered the top Fury X so far, because I wanted to see what I could do with one before committing to the other. The PCB for the pots and control is already mounted for the second card, however.


----------



## Arizonian

@Otterfluff

That is one sweet looking rig. Bravo.


----------



## Otterfluff

Quote:


> Originally Posted by *Arizonian*
> 
> @Otterfluff
> 
> That is one sweet looking rig. Bravo.


Thanks! I plan to post better pictures with a better camera soon; my tablet is taking some pretty shoddy pictures, it seems.

Once I update my neglected build log, I'll make sure to post some good pictures in the Phanteks case thread!

I am getting HBM memory artifacts at 635MHz, but 630MHz does not seem to have any problems completing Fire Strike Ultra or running Furmark. I guess I can try raising the under-volt and see how that affects the HBM.


----------



## Thoth420

RADtastic!


----------



## mRYellow

That's a RAD case


----------



## SuperZan

That is some outstanding work! *Standing ovation*


----------



## gupsterg

Quote:


> Originally Posted by *Neon Lights*
> 
> I was primarily looking for a way to change the voltage. However, without a higher Power Limit overclocking is about as hard as without a higher voltage, so that is important as well. So, thank you for noting and looking into that.
> 
> I am planning to look into it.


Not being a Fury owner, I had to find some data regarding the 3 ROMs I was going to check: Sapphire Fury Tri-X OC (stock / upped PL) and Asus Strix Fury.
Quote:


> The dual-BIOS switch is present, and Sapphire ships the card with two different BIOSes. The default BIOS (switch right) uses the standard 300W ASIC power limit and 75C temperature target. Meanwhile the second BIOS (switch left) Increases the power and temperature limits to 350W and 80C respectively, for greater overclocking limits.


Link:- Above info for Sapphire card from
Quote:


> We're told that the card's default ASIC power limit is just 216W, and our testing largely concurs with this.


Link:- Above info for Asus card from

Then, using some experience from Hawaii ROM editing, I took the PowerPlay tables of each card and a) did comparisons, b) marked them up.


Spoiler: Warning: Spoiler!







The first 2 hex values of each table give the table's length; you can verify this by referencing the tables list AtomDis creates. The next 2 hex values are the revision; again, reference the tables list.
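The header layout described above can be read with a few lines of Python. The assumed layout (a little-endian uint16 size followed by format-revision and content-revision bytes) follows the common ATOM BIOS table-header convention; verify it against AtomDis output for your particular ROM, and the sample bytes below are purely hypothetical.

```python
import struct

def parse_atom_table_header(table: bytes) -> dict:
    """Parse a common ATOM table header: little-endian uint16 size, then
    format-revision and content-revision bytes (assumed layout; check
    against AtomDis output for your ROM)."""
    size, fmt_rev, content_rev = struct.unpack_from("<HBB", table, 0)
    return {"size": size, "format_rev": fmt_rev, "content_rev": content_rev}

# Hypothetical start of a PowerPlay table: 0x012C bytes long, revision 7.1
sample = bytes([0x2C, 0x01, 0x07, 0x01]) + b"\x00" * 10
print(parse_atom_table_header(sample))
```

Comparing the parsed `size` of two ROMs' PowerPlay tables is a quick first check before diffing their contents byte by byte.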

*This info is provided as is, for anyone to a) check b) use at their own risk*

*Note:* It may help to ref the Hawaii bios editing thread on OCN


----------



## Agent Smith1984

Firestrike with stock XFX BIOS...... It will not detect the card correctly at all for some reason...

This was at 1075/525

http://www.3dmark.com/fs/6381081

With the stock Sapphire Tri-X BIOS the card is detected properly in Firestrike; however, I cannot overclock as well with this BIOS, so the OC was done at 1060/500 (I think)

http://www.3dmark.com/fs/6399335


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Firestrike with stock XFX BIOS...... It will not detect the card correctly at all for some reason...
> 
> This was at 1075/525
> 
> http://www.3dmark.com/fs/6381081
> 
> With the stock Sapphire Tri-X BIOS the card is detected properly in firestrike, however I can not overclock as well with this BIOS so the OC was done at 1060/500 (I think
> 
> 
> 
> 
> 
> 
> 
> )
> 
> http://www.3dmark.com/fs/6399335


With the Sapphire BIOS, do your CUs still read as locked and not unlockable?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> With the Sapphire Bios does your CU's still read as locked and not unlockable?


Yep.

Doesn't clock all that well either....

I can bench at 1080/560 with no issues (1080p)..... but in-game stable (1 hour of Crysis 3) I can only nail down around 1060/560.... I believe that's because I'm running 4K, though....

I don't get any artifacts or hangs, I just get a driver crash.... which is a dead giveaway for lack of voltage...

Unwinder has said he is not making any more public comments on the progress of Fiji voltage control....... I'm not sure if that's a good sign or a bad sign, but I know he is working on it (his last update was on 10-24-15)....

I really hope I can shoot a solid 1.3V+ to this card soon, because it's definitely got the cooling to handle it.....


----------



## xer0h0ur

Quote:


> Originally Posted by *Otterfluff*
> 
> Thanks! I plan to post better pictures with a better camera soon; my tablet is taking some pretty shoddy pictures it seems.
> 
> Once I update my under-updated build log I'll make sure to post some good pictures on the Phanteks case thread!
> 
> I am getting HBM memory artifacts at 635MHz, but 630MHz does not seem to have any problems completing Fire Strike Ultra or running Furmark. I guess I can try raising the under-volt and see how that affects the HBM.


I salute you sir, that looks clean clean clean. One of these days I will learn how to do that solid piping bending and whatnot because that is really aesthetically pleasing versus your average tubing.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> I salute you sir, that looks clean clean clean. One of these days I will learn how to do that solid piping bending and whatnot because that is really aesthetically pleasing versus your average tubing.


I like hard tubing so much that I paid to have mine done professionally. I suffer from GAD so my hands shake, especially when nervous. I am also heavy-handed... have broken PCI slots and DIMM slots far too often. Suffice to say it wasn't cheap, but at least I am not stressed out due to a screw-up.

A Mockup of the cooling scheme (short GPU length for the win!):


----------



## fewness

Quote:


> Originally Posted by *Otterfluff*


I'll be satisfied if I can achieve half of this elegance.

Actually, I'll be totally satisfied if I manage to get it to work at all.


----------



## mRYellow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Firestrike with stock XFX BIOS...... It will not detect the card correctly at all for some reason...
> 
> This was at 1075/525
> 
> http://www.3dmark.com/fs/6381081
> 
> With the stock Sapphire Tri-X BIOS the card is detected properly in Firestrike; however, I can not overclock as well with this BIOS, so the OC was done at 1060/500 (I think).
> 
> http://www.3dmark.com/fs/6399335


3DMark not recognising your card is normal.

As for the OC, I think we need more voltage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mRYellow*
> 
> 3DMark not recognising your card is normal.
> 
> As for the OC, I think we need more voltage.


There's no doubt about it!

The XFX BIOS gets a VID of 1.204V under load at max OC.

The Sapphire BIOS gets a VID of 1.189V under load at max OC.

The stock XFX BIOS benches a full 20MHz higher.


----------



## buildzoid

Could you post a link to the XFX BIOS?


----------



## xer0h0ur

As an update to the rumor I had mentioned about an Omega-esque AMD driver coming out this month, it's been "confirmed" through a leak that the "AMD Radeon Software Crimson Edition" drivers will be launching to the public on November 24th:

http://wccftech.com/amd-radeon-crimson-drivers-officially-launching-24th-november-public/


----------



## Gamedaz

* Just received my XFX Fury R9 today.

Still testing it out; have it in a mini ATX Silverstone case (Milo 7).

* No OC yet, but will look into it when the new AMD control center is released.

GPU temps are better than my GTX 780 Ti: 52°C at 85% fans, which is better than the 80°C I was getting at 25% fan speed.

Tested Tomb Raider @ 1080p: no stuttering or tearing, plays really smoothly. Temps @ 62°C, fans 85%.

* I've also discovered that it enables my Sharp Elite panel's 120Hz refresh rate, which I had been trying to get working with my previous card with no success.

I expect the doubled refresh rate will eliminate motion blur; I think the card pushes 119 FPS. When I was testing Splinter Cell Blacklist with V-sync enabled it would sit at 119FPS, which I assume is due to the HDTV panel's refresh rate. Not sure how it exactly works yet.

* Does the card have a dual BIOS?


----------



## gupsterg

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Firestrike with stock XFX BIOS...... It will not detect the card correctly at all for some reason...
> 
> This was at 1075/525
> 
> http://www.3dmark.com/fs/6381081
> 
> With the stock Sapphire Tri-X BIOS the card is detected properly in firestrike, however I can not overclock as well with this BIOS so the OC was done at 1060/500 (I think
> 
> 
> 
> 
> 
> 
> 
> )
> 
> http://www.3dmark.com/fs/6399335


I was wondering if you could dump the XFX ROM and post it? I'd like to do a compare.


Spoiler: Bios Backup method from Fiji Unlock thread



DON'T rely on GPU-Z backups and TechPowerUp's database, since a BIOS read from Fiji with GPU-Z is not complete; it's only 128KB out of the real 256KB.
Although the main part of the BIOS is present in GPU-Z dumps, some of the data is not backed up.
A GPU-Z backup can be used as a last resort, but I strongly suggest using full backups produced by atiflash.

To use command-line atiflash and atomtool.py, you'll need to know how to work with the Windows command line (cmd.exe). If you're not sure what that is, google it first before you try.

- backup your current BIOS:
atiflash -s 0 bios_backup_xxx.rom
Here 0 is the number of the card in your system. Use other numbers if you're backing up more than one card in a CF configuration.
xxx is your current BIOS switch position: right (towards the power connectors) or left (towards the display connectors and face plate).
- flip the BIOS switch and backup the second BIOS:
atiflash -s 0 bios_backup_yyy.rom
For yyy use the current BIOS position.
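As a quick sanity check on a dump file, a full Fiji backup should be the entire 256KB flash and start with the standard 0x55AA option-ROM signature. A small sketch (not from the thread, just illustrative):

```python
FIJI_ROM_SIZE = 256 * 1024  # full flash size; GPU-Z only reads 128KB

def check_fiji_dump(rom: bytes) -> list:
    """Return a list of problems with a BIOS dump (empty list = looks OK)."""
    problems = []
    if len(rom) != FIJI_ROM_SIZE:
        problems.append(f"expected {FIJI_ROM_SIZE} bytes, got {len(rom)}"
                        " (128KB suggests a truncated GPU-Z dump)")
    if rom[:2] != b'\x55\xAA':
        problems.append("missing 0x55AA option-ROM signature")
    return problems

# A 128KB GPU-Z style dump gets flagged; a full atiflash dump passes.
short = b'\x55\xAA' + bytes(128 * 1024 - 2)
full  = b'\x55\xAA' + bytes(256 * 1024 - 2)
print(len(check_fiji_dump(short)), len(check_fiji_dump(full)))  # 1 0
```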



Did a compare of the 3 Fury X roms on TPU today and also compared Fury X PowerPlay table with Fury to firm up things I've marked up (just out of curiosity).

Another forum member has said digital signature check is not enforced for Fury at present so bios mods are possible IMO.


----------



## Gamedaz

This is a really nice and practical card. The heat fins at the rear allow air to pass through them instead of clumping against the PCB where it could collect heat.

My temps compared to the previous 3-fan Gainward are a substantial improvement.

I went into the AMD software and set the fans @ 85% max - temps 55°C max.

So far in any of the games I've recently tested I'm not reaching more than 54°C, where before I was getting 62°C. Does Catalyst throttle the GPU back somehow?

Core is @ 1000MHz.

All testing @ 1080p 60 (or possibly 120Hz).

What are people using to OC this card? I would prefer to use AMD software instead of MSI.

Which one would be suggested?

This card is really stable with Tomb Raider Ultra. No 24Hz; for some reason it defaults to 50Hz in the video menu, but it still looks solid with no tearing.


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> I was wondering if you can dump XFX rom and post? would like to do a compare
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Did a compare of the 3 Fury X roms on TPU today and also compared Fury X PowerPlay table with Fury to firm up things I've marked up (just out of curiosity).
> 
> Another forum member has said digital signature check is not enforced for Fury at present so bios mods are possible IMO.


I'm willing to do BIOS testing.

If you do figure out how to mod Vcore in BIOS, make the following voltage presets:
1.3V for regular daily usage with 400A/500W power limits
1.35V for aggressive fan profiles, 400A/500W power limits
1.4V for custom water loop cooling, 500A/600W power limits
1.45V for benching, 600A/999W power limits (I have my doubts about whether the Fury (X) VRM will stay cool enough to run more than 600A with the regular cooling)
1.5V for LN2, 999A/999W power limits
1.6V for LN2, 999A/999W power limits
1.7V for LN2, 999A/999W power limits

1.2V for hardmod users with 999A/999W power limits

That should cover everyone's needs on all the existing Fiji cards. Also, once you have the BIOS figured out, making all these versions shouldn't be too hard.


----------



## buildzoid

Quote:


> Originally Posted by *Gamedaz*
> 
> This card is really stable with Tomb Raider Ultra. No 24hz. For some reason it Defaults to 50HZ in the Video Menu. But still looks solid with no tearing.


Use Trixx 5.0.0 to OC if you don't want to use Afterburner.


----------



## Jflisk

Think the 15.11 drivers may have a memory leak: free memory went from 16 GB to 3.5 GB while watching The Flash on CW. Memoryleak.com


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> This is a really nice and practical card. The HeatFins at the rear allow air to pass through them without clumping against the PCB where it could collect heat.
> 
> My temps compared to the previous 3 Fan Gainward are something that are substantial improvments.
> 
> I went into the AMD software and set the Fans @ 85% Max - Temps 55c Max
> 
> So far in any the games I've recently tested I'm not reaching more than 54c. *Where before I was getting 62c? Does Catalyst throttle the GPU back somehow?
> 
> Core is @ 1000MHZ
> 
> All testing @ 1080p 60 (or 120hz possibly)
> 
> What are people using to O.C. this card? I would prefer to use AMD software instead of MSI.
> 
> Which one would be suggested?
> 
> This card is really stable with Tomb Raider Ultra. No 24hz. For some reason it Defaults to 50HZ in the Video Menu. But still looks solid with no tearing.


Just something i noticed with this card....

The cooling doesn't really improve after 55-60% fan speed....

That will likely change with increased voltage if we get the ability to increase it soon, because the heatsink will become more saturated, and at that point a higher fan curve may help dissipate more heat... Either way, these coolers are superb!


----------



## Gamedaz

* I was getting 82°C temps at one point in my tiny case, then I realized the fans were spinning @ 25%.

* Set them to 85% and temps dropped to 62°C.

Set the target temp in the Catalyst software to 55°C, and for some reason temps won't go beyond 54°C (even with fans ramping up to 91%). Will have to do some more testing.


----------



## gupsterg

Quote:


> Originally Posted by *buildzoid*
> 
> I'm willing to do BIOS testing.
> 
> 
> Spoiler: bios wish list
> 
> 
> 
> If you do figure out how to mod Vcore in BIOS make the following voltage presets:
> 1.3V for regular daily usage with 400A/500W power limits
> 1.35V for aggressive fan profiles 400A/500W power limits
> 1.4V for custom water loop cooling 500A/600W power limits
> 1.45V for benching 600A/999W power limits(I have my doubts about if the Fury (X) VRM will stay cool enough to run more than 600A with the regular cooling)
> 1.5V for LN2 999A/999W power limits
> 1.6V for LN2 999A/999W power limits
> 1.7V for LN2 999A/999W power limits
> 
> 1.2V for Hardmod users with 999A/999W power limits
> 
> 
> 
> That should cover everyone's needs on all the existing Fiji cards. Also once you have the BIOS figured out making all these versions shouldn't be too hard.


Right, I have the necessary info to fix the checksum on these ROMs, i.e. the atomtool posted by tx12. I didn't wish to use the "other tools" due to these ROMs being 256KB and not having the knowledge to know whether they'd work with it.
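For anyone curious what "fixing the checksum" involves: in a legacy option ROM the byte at offset 2 gives the image size in 512-byte blocks, and a checksum byte is chosen so that the bytes of the image sum to 0 mod 256 (ATOM ROMs keep it at offset 0x21, per OFFSET_TO_ATOM_ROM_CHECKSUM in the public atombios.h). A rough sketch of the arithmetic, not a substitute for atomtool, use at your own risk:

```python
ATOM_ROM_CHECKSUM_OFFSET = 0x21  # OFFSET_TO_ATOM_ROM_CHECKSUM in atombios.h

def fix_atom_checksum(rom: bytearray) -> bytearray:
    """Recompute the ATOM ROM checksum byte so the legacy image sums to 0 mod 256."""
    assert rom[:2] == b'\x55\xAA', "not an option ROM"
    image_size = rom[2] * 512          # size byte is in 512-byte blocks
    rom[ATOM_ROM_CHECKSUM_OFFSET] = 0  # zero it out before summing
    total = sum(rom[:image_size]) % 256
    rom[ATOM_ROM_CHECKSUM_OFFSET] = (256 - total) % 256
    return rom

# Synthetic 1KB image (2 blocks), just to show the arithmetic.
rom = bytearray(1024)
rom[0:2] = b'\x55\xAA'
rom[2] = 2
fix_atom_checksum(rom)
print(sum(rom[:1024]) % 256)  # 0
```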

Send me the ROM you want modified; we'll do some tests in private, and once preliminary testing is done I'll start a new thread regarding BIOS modding.

Some of your wish list may not be possible yet.

Why I say that is I have no manual for the ROM or proprietary tools. Shamino had access to things when creating the PT ROMs, which I don't; likewise the Stilt for his mining & MLU builds.

I will advise you on the first set of "safe" tests to do IMO, and then we'll try the "safe" voltage mod, *but* be aware that you and I are treading on new ground. Regardless of what I think is "safe" there is an element of *risk*; as long as you accept that, it's all good my end.

I will give you all my workings for you to judge whether you're happy to try a suggested mod, and anything I can answer I will try, *but* I'm not linked with AMD so I have no professional or inside knowledge.

I'm just an enthusiast, and given what I picked up with Hawaii BIOS modding I think it would be great if I can put that to use to make Fury BIOS modding happen.


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> Right I have necessary info to fixchecksum on these ROM, ie atomtool posted by tx12 , I didn't wish to use the "other tools" due to these ROMs being 256KB and not having knowledge to know if they'd work with it.
> 
> Send me ROM you want modified, we'll do some tests in private and then once preliminary testing is done I'll start new thread regarding bios modding.
> 
> Some of your wish list may not be possible yet.
> 
> Why I say that is I have no manual for rom or proprietary tools, Shamino had access to things when creating PT roms, which I don't; again so did the Stilt for his mining & MLU builds.
> 
> I will advise you on the first set of "safe" tests to do IMO, and then we'll try the "safe" voltage mod, *but* be aware that you and I are treading on new ground. Regardless of what I think is "safe" there is an element of *risk*; as long as you accept that, it's all good my end.
> 
> I will give you all my working outs for you to judge if your happy to try a suggested mod and anything I can answer I will try *but* I'm not linked with AMD to have professional or inside knowledge.
> 
> I'm just an enthusiast and due to what I picked up with the Hawaii bios modding think it would be great if I can put that to use to make Fury bios modding happen.


I'd be surprised if we manage to brick a card. Also which of my requests is unlikely to be possible?


----------



## gupsterg

Currently TDC, i.e. the xxx A limit.

That's assuming the Fury BIOS works like Hawaii and has it.

*** edit ***


Spoiler: Latest marked PowerPlay Table Fury


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> Currently TDC, i.e. the xxx A limit.
> 
> That's assuming the Fury BIOS works like Hawaii and has it.


And my interest in potentially sacrificing a Fury for science just died. The voltage is cool and all but I can do that with a hard mod. I can't do anything with the power/current limits because of the way the IR3567B works. Also I'm pretty sure we will see voltage control soon enough.


----------



## Randomdude

Quote:


> Originally Posted by *buildzoid*
> 
> And my interest in potentially sacrificing a Fury for science just died. The voltage is cool and all but I can do that with a hard mod. I can't do anything with the power/current limits because of the way the IR3567B works. Also I'm pretty sure we will see voltage control soon enough.


Your sig speaks the truth. +rep


----------



## gupsterg

Quote:


> Originally Posted by *buildzoid*
> 
> The voltage is cool and all but I can do that with a hard mod.


From what you described in PM regarding voltage drop.
Quote:


> The problem doesn't seem to be maximum power limit but the lower power levels. If I could do something to the card to have fewer voltage levels. ATM the card has 0.89V idle and then a bunch of voltages that top out at 1.22V. It seems that under lesser 3D loads the card runs 3D core clock but drops the core voltage a couple 10s of mv and that causes a hard crash.


This is what per-DPM-level voltage does in Hawaii: if you fix it manually then it doesn't droop/work as EVV, so your hard mod isn't working the way you think IMO. You may recall that table I PM'd.



If I replace EVV per DPM with manual voltage it will use that VID.
Quote:


> Originally Posted by *buildzoid*
> 
> I can't do anything with the power/current limits because of the way the IR3567B works. Also I'm pretty sure we will see voltage control soon enough.


Power I can mod, and TDP I think I have found; perhaps there is no TDC value and it's linked to the other 2 values.

Anyhow, the stuff I have marked so far is 99.9% accurate in my opinion.

I have asked someone who may be able to tell, yet no answer.


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> From what you described in PM regarding voltage drop.
> This is what per DPM level voltage does in Hawaii, if you fix it manually then it doesn't droop/ work as EVV, so your hard mod isn't working like you think IMO. You may recall that table I PM'd.
> 
> 
> 
> If I replace EVV per DPM with manual voltage it will use that VID.
> Power I can mod, TDP I think I have found, perhaps there is no TDC value and is linked to the other 2 values.
> 
> Anyhow the stuff I have marked so far is 99.9% accurate in my opinion.
> 
> I have asked someone who maybe able to tell, yet no answer.


Right then I'm back in.

IMO first we should test TDP. So give me a BIOS with a lowered TDP limit, say 200W, but don't change voltages. It will probably take until some time next week for me to get some cards here in the UK, so until then I don't have a BIOS to give you.


----------



## gupsterg

No worries; as I said before I'm waiting on one person to reply regarding TDC, but I'm also in the process of asking a 2nd. The only problem is that on the forum that person is more active on, I'm waiting for account activation to be able to ask.

In the meantime @Ized has kindly pointed me to source of some driver code he sent me so I'll be digging into that.

Thirdly, I'm gonna manually sift through the new atombios.h files which are on some repositories that AMD personnel are updating (just in case a mod thinks it's hacking: it isn't, they're freely viewable but not downloadable).

If I manually take the new updated atombios.h (i.e. copy the script manually) and mod it into the old atomdis, it doesn't work, as there are certain "things" the rest of the files used to make atomdis lack. As I'm no programmer I can't make sense of what I'd have to do to update it.

What I think is MPDL is marked a) due to info in reviews (checked about 2-3 against values I've marked) and b) because it's right next to MAX ASIC Temp, the same location as in Hawaii.

What I think is TDP may well be TDC; I don't know, as I can't find a figure for it online to cross-ref with.

Now, why I think TDP is TDP: in the stock Sapphire BIOS it matches what I think is MPDL, and in Hawaii ROMs TDP / MPDL are always the same while TDC is usually lower by a few percent.

Now I can't yet find a value that resembles TDC; in Hawaii you found TDP > TDC > MPDL > MAX ASIC Temp (grey boxed value).



Now look at the Fury section.



It could well be that there is no TDC value and it's linked. Now, when I test the effect of TDP / MPDL / TDC while modding a Hawaii ROM, I rely on a) monitoring GPU frequency and b) software readings for VIN, IIN, IOUT, POUT, PIN.

You're in a position to get real data via DMM, which is beyond my capabilities.

Now if you're wondering why TDP isn't the same as MPDL in the unlocked Sapphire Fury ROM: I think they "gimped" the ROM, and that's why reviewers see the same GPU frequency range when they use that ROM for testing. In Hawaii all 3 values needed to be upped to affect GPU frequency.

Even when voltage control comes out for Fiji it will be the same as Hawaii, i.e. software applies a global GPU voltage offset hitting all DPM state voltages.

With bios modding some people on Hawaii have lowered the idle voltage well below stock and ramped up highest state to what they require. Then some have also tested what they require for the in between states and set those as they require.

Now Fiji is EVV GEN 2 and so is Tonga (aka R9 285 / 380) I'll also look at those ROMs, looking at those PowerPlays will also get me some data / experience.


----------



## Otterfluff

Finally got a day off work. I am getting a max Vcore of 1.344V from my volt mods using the 220ohm pot +7 ohm resistor from buildzoid's guide.



I can keep a 1192MHz core overclock with it and maintain my earlier 630MHz HBM overclock at 33°C core temps. I am testing this under both Furmark and 3DMark. The HBM does not seem to mind the increased core voltage and core overclock at all. I still can't get past 630MHz on the HBM, as any higher results in artifacts.



I still have not tried mucking around with HBM voltage yet but maybe a little later.

What combo of resistors should I try to raise my core voltage to 1.4V+?


----------



## GorillaSceptre

Quote:


> Originally Posted by *Otterfluff*
> 
> Finally got a day off work. I am getting a max Vcore of 1.344V from my volt mods using the 220ohm pot +7 ohm resistor from buildzoid's guide.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> I can keep a 1192MHz core overclock with it and maintain my earlier 630MHz HBM overclock at 33°C core temps. I am testing this under both Furmark and 3DMark. The HBM does not seem to mind the increased core voltage and core overclock at all. I still can't get past 630MHz on the HBM, as any higher results in artifacts.
> 
> 
> 
> I still have not tried mucking around with HBM voltage yet but maybe a little later.
> 
> What combo of resistors should I try to raise my core voltage to 1.4V+?


Awesome! But stay off of FurMark; it's useless except for burning GPUs.

It would be cool to see some benches with some games at that clock.

Can you get a higher core clock without messing with the HBM?


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Finally got a day off work. I am getting a max Vcore of 1.344V from my volt mods using the 220ohm pot +7 ohm resistor from buildzoid's guide.
> 
> 
> 
> I can keep a 1192 Mhz core overclock with it and maintain my earlier 630Mhz HBM overclocks at 33C core temps. I am testing this under both Furmark and 3Dmark. The increased core voltage and core overclock dose not seem to mind the HBM at all. I still cant get past 630Mhz on the HBM as any higher results in artifacts.
> 
> 
> 
> I still have not tried mucking around with HBM voltage yet but maybe a little later.
> 
> What combo of resistors should I try to raise my core voltage to 1.4V+?


Damn man, nice clocks.

I'll explain some of the theory behind the volt mod. The VSENS pin on the IR3567B has a 3.4 ohm resistance from it to GND and a 2.1 ohm resistance from it to Vcore, so basically it's a voltage divider circuit. By changing the ratio of the pin-to-GND and pin-to-Vcore resistances you can change the voltage: lower the Vcore-to-pin resistance and you get less voltage; lower the GND-to-pin resistance and you get more voltage. In theory a 3.4 ohm resistor going from the pin to GND should give you a maximum Vcore of double the normal Vcore, however for some reason this calculation doesn't seem to work.

For the resistor I would next try 5 ohms, and if that still doesn't give you high enough voltage, go for 3 ohms. The main problem with the mod is that the resistance from the pin to GND is really, really low, so to get large voltage changes you need changes in resistance of a couple of ohms.
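The divider theory above can be put into numbers. This sketch just evaluates the simple divider model from the stock 3.4Ω/2.1Ω figures (one possible reading of it; as buildzoid notes, the real card doesn't track any simple calculation exactly, so treat the outputs as illustrative only):

```python
R_GND, R_VCORE = 3.4, 2.1  # stock VSENS pin resistances from the post (ohms)

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

def vcore_gain(mod_to_gnd=None):
    """Predicted Vcore multiplier vs. stock for a resistor added from the
    VSENS pin to GND. Simple divider model: the controller regulates the
    divided-down sensed voltage, so shrinking the GND leg forces a higher
    Vcore to produce the same sensed value."""
    r_gnd = parallel(R_GND, mod_to_gnd) if mod_to_gnd else R_GND
    gain = (r_gnd + R_VCORE) / r_gnd      # Vcore needed per volt sensed
    stock = (R_GND + R_VCORE) / R_GND     # same ratio with no mod
    return gain / stock

print(round(vcore_gain(), 3))     # 1.0 (no mod)
print(round(vcore_gain(3.4), 3))  # ~1.382, not the 2x a naive guess suggests
```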


----------



## Otterfluff

I can only get up to 1.3342V on the HBM using the 220 ohm pot and 50 ohm resistor, but it did not have any effect on my HBM clocks. I can get to 633MHz and it's crystal clear; jump up one more step and it all goes to hell with white pixel artifacts and crashing. Can't complain about a 26% overclock on the memory, I guess.

The core voltage sure likes to dip around a lot. For some tests it's fine, but for others it's all over the place; different tests even give different ranges of voltage. That makes it very hard to tune the voltage for every test in 3DMark when they all seem to pull something a little different.

I should probably try some games next. I have Witcher 3, Ashes of the Singularity, Civ BE, and Ark Survival; I do not have that many recent AAA games.
Quote:


> Originally Posted by *GorillaSceptre*
> 
> [/SPOILER]
> 
> Awesome! But stay off of FurMark, it's useless except for burning GPU's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It would be cool to see some benches with some games at that clock.
> 
> 
> 
> 
> 
> 
> 
> Can you get a higher core clock without messing with the HBM?


I do not think I can get a higher core clock than 1180-1190 but the HBM seems fairly independent of whatever the core clock is trying to do. I do not seem to get better core clocking with HBM at stock speeds or at 633Mhz. It seems to be set and forget for HBM so far.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> I can only get up to 1.3342V on the HBm using 220ohm pot and 50 ohm resistor. But it did not have any effect on my HBM clocks. I can get to 633Mhz and it's crystal. Jump up one more time and it all goes to hell with white pixel artifacts and crashing. Cant complain about 26% over-clock on the memory I guess.
> 
> The core voltage sure likes to dip around alot. For some tests it's fine but for others it's all over the place. Different tests even give different ranges of voltage. Makes it very hard to tune the voltage for every test in 3Dmark when they all seem to pull something a little different.
> 
> I should probably try some games next I have witcher 3, ashes of singularity, Civ BE, Ark Survival. I do not have that many AAA games that are recent.
> I do not think I can get a higher core clock than 1180-1190 but the HBM seems fairly independent of whatever the core clock is trying to do. I do not seem to get better core clocking with HBM at stock speeds or at 633Mhz. It seems to be set and forget for HBM so far.


Well, you just confirmed my suspicion that HBM clocking is completely temperature driven. I'll revise the resistor values in the guide depending on your experience. Just to check: you get 1.3342V with the pot set to 0 ohms, right?

It might be possible to get higher HBM clocks with less HBM voltage if temps scale better than voltage. I'll do some testing on that soon and add it to the guide if it works.

Hopefully we will have a BIOS that fixes the core voltage dipping.


----------



## Otterfluff

Yes, it's at 0 ohms; the pot starts clicking once I have rotated it all the way.

I was playing around with crossfire, and the main card's voltage is stable at whatever you set it at if you're running crossfire, during loading and when it's actually at work. No dips, same voltage between 3DMark tests. Now I really want to solder in the volt mods for the second card.

Maybe I should try over-clocking the main card under crossfire while leaving the slave at stock and testing that way?


----------



## Otterfluff

Ok so the crossfire made a huge difference with the stable voltage; it has let me hit 1180MHz core @ 1.32V and 630MHz HBM with no crashes and no artifacts. Crazy that a single card can't manage this, but if you crossfire it eliminates the voltage dip problems.

I may be able to hit 1190, but it's a bit hit and miss; at least with a stable voltage I might eventually find a sweet spot.

I wonder if the slave voltage would act similarly.


----------



## Gamedaz

* Will AMD release a BIOS flash that will OC their HBM GPUs? Or will it be available in their new software suite for OC?


----------



## buildzoid

Quote:


> Originally Posted by *Gamedaz*
> 
> * Will AMD release a BIOS flash that will OC their HBM GPUs? Or will it be available in their new software suite for OC?


For voltage we just need new software, no need for a special BIOS. It's not even the HBM's fault; it's the new power management on the Fiji cards that makes voltage control problematic.


----------



## gupsterg

@Buildzoid

Right, been pressing ahead with the R9 285 / 380 ROM PowerPlay compare, and saw something in them that made me go back to Fury / Fury X.

I present GPU Frequency per DPM.


Spoiler: Latest marked Fury PowerPlay Table







Now you may recall in post 5123 a table I posted regarding Hawaii. If you ref the frequencies I've marked in the image above, when you set those you will see the default EVV for that GPU frequency.

At present this is how the Hawaii BIOS modders find the DPM 1-6 voltages: they use the drooped VDDC value as a reference to set it manually as they require. Once set manually, you rerun the same app and assess whether you need to readjust.

The other benefit of these GPU frequencies for Hawaii is that you can set a differing GPU frequency per DPM to gain better performance. In factory-OC ROMs for Hawaii they are set as a % of the DPM 7 GPU frequency (ref the Hawaii BIOS modding thread, heading "Making OC bios like factory pre OC'd card/rom").
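A sketch of that "% of DPM 7" scaling, with made-up Hawaii-style clock numbers (the function and values are illustrative only, not from any real ROM):

```python
def scale_dpm_clocks(stock_clocks, new_dpm7):
    """stock_clocks: list of 8 DPM state clocks (MHz), index 7 = highest state.
    Returns new clocks with DPM 1-6 scaled to keep their % of the DPM 7 clock,
    the way factory-OC Hawaii ROMs set them; DPM 0 (idle) is left alone."""
    ratio = new_dpm7 / stock_clocks[7]
    mid_states = [round(c * ratio) for c in stock_clocks[1:7]]
    return [stock_clocks[0]] + mid_states + [new_dpm7]

# Hypothetical stock ladder, raising DPM 7 from 1000MHz to 1100MHz.
hypothetical_stock = [300, 516, 727, 840, 890, 936, 977, 1000]
print(scale_dpm_clocks(hypothetical_stock, 1100))
```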

One member even went as far as making them the same as highest state, due to his particular requirement.


Spoiler: MAX Clock per DPM mod



Quote:


> Originally Posted by *Unknownm*
> 
> Question!
> 
> Could I set the BIOS to always run 3d clocks even when idle on desktop?
> 
> Same idea like K-boost for Nvidia. Enable K-boost ran my 660 ti in full speed no matter what I was doing, Yes I don't care about temps and power saving... I would love to have both my AMD cards running 3d clocks all the time instead of downclocking
> 
> Thanks


Quote:


> Originally Posted by *Unknownm*
> 
> I wanted to share how awesome having stock clocks as idle.
> 
> Playing Counter Strike Source with a friend in Zombie Mod server. Normally the top GPU would run 500-800Mhz depending on frame rate which is capped at 120fps. However I would get dips down to 70-80fps when 30-40 people are shooting at once (unlimited ammo).
> 
> Now that idle clocks are full stock clocks, CSS is forced to run 1040Mhz on top GPU and I've never hit under 120fps even in the same place (40 people shooting at once).
> 
> Owning 660 Ti SLi I had the exact same issue with CSS. Core would downclock to the framerate and cause the game to run slower than it should and after enabling K-boost in EVGA I always had steady framerate.


----------



## xer0h0ur

Hard mods go flying way over my head, but it's interesting to read what you guys are doing. What a time to be alive!


----------



## gupsterg

Hard mods are out of my league as well; that's what got me into the soft mod for Hawaii. There are distinct benefits to both, but the soft mod is way more accessible to the masses once tested; both together also enhance OC'ing.

The soft mod can also be very powerful; for example, the Stilt's mining ROMs did:
Quote:


> - Elpida B-die performance issue fixed by rewriting the timings and the MC straps correctly.
> - Hynix Gemma-die performance improved by rewriting the timings and the MC straps correctly.
> - Enhanced the VRM configuration, yielding >5% improvement in VRM efficiency on the medium leaking test samples.
> - 50% of the RBs (ROP) shedded, one array (16) from each of the SHs (0/1) to improve the power and the thermal.


I do feel a little alone at it at present; originally, when the Hawaii BIOS modding thing started on Guru3D, there were 3 of us going at it hard IMO. There were also many people testing and reporting their results, some sharing their experience of a mod they tried that failed or worked, which helped things along a lot. It was a great thread to be part of.

I was hoping someone (besides buildzoid) would come forward to try things and help me looking at PP table as well.

The Hawaii BIOS reader, which @Oneb1t & @DDSZ created, was also such a benefit after its conception.

It allowed quick reading of various ROMs; there was more variation between some ROMs due to some cards being custom PCBs, and this variation also aided marking values. Marking/comparing them manually the way I'm doing now (and we were doing at first back then) is very time consuming. The Hawaii reader could IMO be updated to include Fiji support; it's open-source code (link to the repository).


----------



## Agent Smith1984

1190 at 1.35V huh?

That's a shame....

I haven't had an ATI (AMD) based GPU clock as badly as my Fury since my X1800 XL in '05 or so; that only did a 45MHz OC... THIS card does 1060MHz 4K game stable... Needs voltage for sure, but still...


----------



## Gamedaz

Quote:


> Originally Posted by *buildzoid*
> 
> For voltage we just need new software no need for a special BIOS. It's not even the HBM's fault it's the new power management on the Fiji cards that make voltage control problematic.


* Will AMD release voltage control in the new driver software's utility? That would make sense, since the software is OC-capable from what I've read.


----------



## Otterfluff

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 1190 at 1.35v huh?
> 
> That's a shame....
> 
> I haven't had an ati(amd) based GPU clock as badly as my fury since my x1800xl in 05 or so, that only did 45mhz OC... THIS card does 1060mhz 4k game stable... Needs voltage for sure, but still...


It is just one card; maybe my second will OC better. You'll have to wait for me to volt mod it and compare. I'm skeptical I'll do better on another card, but anything's possible with the silicon lottery.


----------



## buildzoid

Great thanks.

This is changing my build plans.
Quote:


> Originally Posted by *Gamedaz*
> 
> * Would AMD release the voltage limits in the new Driver software release within their utility. That would make sense since the software is O.C capable from what I've read etc.


No idea


----------



## xer0h0ur

Quote:


> Originally Posted by *Gamedaz*
> 
> * Would AMD release the voltage limits in the new Driver software release within their utility. That would make sense since the software is O.C capable from what I've read etc.


While I highly doubt it, I won't flat out say no. Anything is possible. The new driver software seems to be more of a facelift while introducing a feature here or there and a marginal gaming performance boost.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> While I highly doubt it, I won't flat out say no. Anything is possible. The new driver software seems to be more of a facelift while introducing a feature here or there and a marginal gaming performance boost.


How epic would it be if the final voltage control for Fiji came from AMD themselves? They could literally save themselves from the lackluster Fury OC results right now. I know not everyone overclocks, but it does make an impact on sales.... I know if they did do voltage control it would be limited, but still....


----------



## buildzoid

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How epic would it be that the final voltage control for Fiji come from AMD themselves? They could could literally save themselves on lack luster fury OC results right now, because i know not everyone overclocks, but it does make an impact on sales.... I know if they did do v control it would be limited, but still....


If they gave us 150mv in the software that would be awesome.

Can't we suggest this as a feature to the driver team through the AMD website?


----------



## Greenland

Quote:


> Originally Posted by *Otterfluff*
> 
> Finally got a day off work. I am getting a max Vcore of 1.344V from my volt mods using the 220ohm pot +7 ohm resistor from buildzoid's guide.
> 
> 
> 
> I can keep a 1192 Mhz core overclock with it and maintain my earlier 630Mhz HBM overclocks at 33C core temps. I am testing this under both Furmark and 3Dmark. The increased core voltage and core overclock dose not seem to mind the HBM at all. I still cant get past 630Mhz on the HBM as any higher results in artifacts.
> 
> 
> 
> I still have not tried mucking around with HBM voltage yet but maybe a little later.
> 
> What combo of resistors should I try to raise my core voltage to 1.4V+?


Are those ASUS Fury Strix? If so, where did you get the waterblocks?
Quote:


> Originally Posted by *buildzoid*
> 
> If they gave us 150mv in the software that would be awesome.
> 
> Can't we suggest this as a feature to the driver team through the AMD website?


Hi there, what's the performance increase going from stock clocks to yours? Also, is it possible to run 1.35V on air? I'm not doing any hard mods any time soon, and I'm still waiting for the official OC utility to overvolt my ASUS Fury card.


----------



## buildzoid

Quote:


> Originally Posted by *Greenland*
> 
> Are those ASUS Fury strix? If so, where do you get the waterblocks?
> Hi there,, what's the performance increase going from stock clock to yours? Also, is it possible to have 1.35v on air? I'm not doing any hard mod any time soon, and still waiting for the official OC ultility to overvolt my ASUS Fury card.


Otterfluff is running reference-PCB cards. The Strix doesn't have the load-indicator LEDs.

From my overclock I think I got something around 9% over the 1040MHz stock clock that the Tri-X OC comes with.

My recommendation for safe voltages: if your load temps are below 60C you can run 1.35V (the only reason I don't run 1.35V is that I don't have a 100ohm pot), and if your temps are sub-40C you can run 1.4V. Otherwise run 1.3V.
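That rule of thumb is easy to encode; a throwaway sketch of buildzoid's guidance above (the thresholds are forum experience on modded Fiji cards, not AMD specs, and the function name is made up):

```python
# buildzoid's rough safe-voltage guidance for volt-modded Fiji cards:
# sub-40C load temps -> 1.40V, sub-60C -> 1.35V, otherwise 1.30V.
# These numbers come from forum experience, not an AMD datasheet.

def safe_vcore(load_temp_c: float) -> float:
    """Suggested max core voltage for a given sustained load temperature."""
    if load_temp_c < 40:
        return 1.40
    if load_temp_c < 60:
        return 1.35
    return 1.30
```

So a card under water holding 35C load temps would get the 1.4V ceiling, while a typical air cooler in the 60s stays at 1.3V.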


----------



## Greenland

Thanks. I can't imagine what 1.3V on air would be like, probably 85C under load.


----------



## buildzoid

Quote:


> Originally Posted by *Greenland*
> 
> Thanks, I cant imagine what 1.3v on air would be like, probably 85C under load.


My Tri-X at 52% fan speed holds 1.3V at 55C


----------



## Greenland

At 50% fan speed my ASUS is a bit loud, and it hovers around 70C in Witcher 3; at 40% fan speed the temp rises significantly to 78C, so I'd likely see 85C at around 45% fan speed @ 1.3V. I hate compromises like this.


----------



## buildzoid

Quote:


> Originally Posted by *Greenland*
> 
> At 50% fan speed, my ASUS is a bit loud and it hovers around 70C in Witcher 3, @40% fan speed the temp rises significantly to 78C so it's likely for me to have a 85C at around 45% fan [email protected] I hate compromise like this.


Well, the ASUS has by far the worst cooler of all the Furys. Mind you, at 55% fan speed the Tri-X takes my computer from 52dB (GPU fans turned off) to 59dB, and that's with the sound meter 15cm away from the card.


----------



## Greenland

Is it safe to run 1.3V at 85C?


----------



## buildzoid

Quote:


> Originally Posted by *Greenland*
> 
> Is it safe to run 1.3V at 85C?


Yeah, it is, but the card might start to throttle a little.


----------



## Otterfluff

I have had a terrible time trying to OC my second Fury X slave. In Afterburner I can't even modify the memory clocks, and TriXX is very unstable; it crashes 90% of the time when I try to modify settings, and under CrossFire the whole PC locks up when TriXX stalls. It's very random when I'm trying to apply settings; only a few random times was I able to get it to work, let alone test.

Any suggestions for setting core and HBM clocks on my second card in CrossFire?


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> I have had a terrible time trying to OC my second Fury X slave. Afterburner I cant even modify the memory clocks and Trixx is very unstable and 90% crashes when trying to modify the settings, under crossfire the whole PC locks up when Trixx stalls. It's very random when i am trying to set the settings. Only a few random times I was able to get it to work let alone test.
> 
> Any suggestions for setting core and HBM for my second card in crossfire?


Try asking Pulse88 from team MLG. He has a bunch of 4-way Fury X CrossFire scores on HWBot. He's not on OCN, but I think team MLG has its own forums, so you could try asking him there.

Before you do that: do you have unofficial overclocking mode enabled in Afterburner? That should unlock the memory slider.
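For reference, unofficial overclocking can also be toggled by editing Afterburner's config file directly instead of the GUI checkbox. This is how the setting is commonly documented; double-check the section and key names against your Afterburner version before editing:

```ini
; MSIAfterburner.cfg (in the Afterburner install folder)
[ATIADLHAL]
; 0 = disabled
; 1 = unofficial overclocking with PowerPlay support
; 2 = unofficial overclocking without PowerPlay support (forces 3D clocks)
UnofficialOverclockingMode = 1
; Afterburner requires the confirmation string below before the mode takes effect
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Mode 2 is the "without PowerPlay support" option mentioned later in the thread; as noted there, it can misbehave badly enough that you end up reinstalling Afterburner, so set it back to 0 if things go sideways.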


----------



## clubber_lang

Quote:


> Originally Posted by *Thoth420*
> 
> Couple more weeks and my cooling should be
> Another person happy with the ACER Predator....I might have to go nuts and breach my budget. The 27 inch 2560 x 1440 freesync offerings all seem pretty craptastic for the price. Eizo still hasn't mentioned a price or date on theirs. I am not a racing game guy more into shooters and mmo's etc. and have never gamed on a widescreen. Does it fisheye in first person games?


I haven't actually played any FPS games yet. I've been so damned hooked on what my race sims look like right now, I can't seem to find the time to stop, haha.

What's weird is that in the driving sims, on paper I should have gotten about double the performance of one of my 7970s. My crossfired 7970s should have been on par with what the R9 Fury is doing, but they weren't even close. The "real" driving sims such as rFactor 2, iRacing, and Game Stock Car Extreme don't do very well in CrossFire or SLI.

Maybe it was the combination of using those older 7970s on old drivers, I'm not sure. But a single 7970 struggled to make my race sims look good with decent frame rates on this 3440x1440 monitor. This new R9, with new drivers (15.7), gets me real close to triple the performance BUT... with all the eye candy turned on! I have everything set to max in these sims now, and even though my monitor will only do 75Hz, the R9 stays steady anywhere between 60-120 FPS.

Example: coming out of the pits at Circuit of the Americas, I'd get like 16-25 FPS; it was like watching a slideshow. The R9 is around 65-75 FPS coming out of the pits, and jumps up to a steady 90-100 FPS as soon as I leave the pits.

I get more and more excited about this R9 card and monitor combo every time I drive. Good times for me right now.


----------



## Otterfluff

Ah yeah, I have unofficial overclocking mode enabled, but it only seems to work for the primary card and not the slave.

I will look into team MLG, thanks for the advice.


----------



## Otterfluff

ASUS GPU Tweak II 1.1.4.0 works really well. I can set the HBM and core clocks with no problems or crashes. Even better, you can set it to "Always 3D clock" and it runs the max core clock 24/7, which keeps the voltage stable even outside of 3D applications.

I haven't figured out how to overclock each card separately, but I guess for a CrossFire setup that's kinda moot, since all cards run at the lowest-clocked card's speeds.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Asus GPU tweak II 1.1.4.0 works really well. I can set the HBM and core clocks and no problems or crashes. Even better you can set it to Always 3D clock and it runs it at the max core clock 24/7 which makes the voltage stable even outside of running 3D applications.
> 
> I haven't figured out how to do separate overclocking for each card but I guess for a crossfire setup that's kinda moot since all cards run at the lowest clocked card's speeds.


Actually, you can run different core and memory clocks on each GPU in a CrossFire setup, and it does still scale.

Hopefully I'll figure something out when I get my 4-way Fury X setup so I can test asymmetric clock scaling.


----------



## Otterfluff

I think I figured it out: there's a dropdown at the top for selecting which card, and you can just untick "Sync with all cards". It seems to work very painlessly. Very nice.


----------



## xer0h0ur

You can also force non-stop 3D clocks in Afterburner by running unofficial overclocking mode "without PowerPlay support".


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> You can also force non stop 3d clocks in afterburner by running unofficial overclocking mode at "without powerplay support".


I think I tried that and it went really badly. I had to boot into safe mode and uninstall Afterburner (yeah, yeah, I know there's a config file I could've... bla bla).


----------



## DMatthewStewart

Quote:


> Originally Posted by *xer0h0ur*
> 
> You can also force non stop 3d clocks in afterburner by running unofficial overclocking mode at "without powerplay support".


You know, I have an early-alpha game that is throttling my clocks down below my regular 3D profile (and at times way below my 2D profile). I'm going to have to try that option out. If it works, I can't believe I found the answer in this thread; I've been trying to figure this out for weeks.

Really, I only came to this thread to figure out whether I should get Fury Xs and throw away the radiator, fan, stock block, and pump (because I have my own custom loop), or get an air-cooled R9 Fury (like the Strix). But the reviews so far have been pretty bad on the air-cooled Furys. Also, I'm trying to figure out if we are going to get a Lightning version. I saw that they just released the 980 Ti Lightning, and I'd hate to buy a Fury X, dismantle it, and then have a Lightning version pop up.


----------



## xer0h0ur

Quote:


> Originally Posted by *buildzoid*
> 
> I think I tried that and it went really badly. I had to boot safe mode and uninstall Afterburner(yeah yeah I know there's a config file that I could've bla bla...).


For what it's worth, I had no problems using it. It worked like a charm when I was having clock-throttling issues. However, setting it back to disabled (the default) didn't do a thing for me; it kept the non-stop 3D clocks, so I had to uninstall Afterburner without keeping settings to get back to "normal." That was the only wonky behavior it showed for me.


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> Well you just confirmed my suspicion on HBM being completely temperature driven. I'll revise the resistor values in the guide depending on your experience. Just to check you get 1.3342V with the pot set to 0ohms right?


If I want to try raising the HBM voltage some more for the sake of testing, what resistor should I replace the 50 ohm resistor with?

I had a lot of success with HBM overclocks on single cards, but CrossFire seems very temperamental about high HBM clocks. I still think there's room to explore the effect of voltage on the HBM to reduce the artifacts I get in CrossFire, so I'd like to explore it.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> If I want to try to raise the HBM voltage some more for the sake of testing, what resistor should I replace the 50 ohm resistor with?
> 
> I had alot of success with HBM overclocks on single cards but crossfire seems very temperamental about high HBM clocks. I still think there is room to explore the effect of voltage on the HBM to reduce the artifacts I get more easily during crossfire. So I would like to explore it.


Try 20 ohms. I guess the Vsense pin on IR chips is not linear, because if it were linear, 50 ohms would've been just right.
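For intuition, here is what an *ideal* feedback divider would predict: soldering a mod resistor in parallel with the bottom leg raises the regulated output smoothly as the resistance shrinks. All values below are illustrative, not the actual internal divider of the IR controller on Fiji cards, which (as noted above) doesn't behave this linearly:

```python
# Ideal feedback-divider model of a voltage-sense mod. The controller
# regulates so that the divider tap equals vref; adding r_mod in parallel
# with the bottom resistor tricks it into regulating higher. Resistor
# values here are placeholders for illustration only.
from typing import Optional

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def vout(vref: float, r_top: float, r_bottom: float,
         r_mod: Optional[float] = None) -> float:
    """Regulated output of an ideal divider; r_mod is the added mod resistor."""
    r_b = parallel(r_bottom, r_mod) if r_mod is not None else r_bottom
    return vref * (1 + r_top / r_b)
```

In this ideal model, halving the mod resistance gives a predictable jump in output; the fact that 20 ohms is needed where 50 ohms "should" have been enough is exactly the non-linearity of the real Vsense pin being discussed.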


----------



## fewness

Are you guys sure this can be done by a human hand? Damn these things look so tiny even through a magnifying glass!


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> Are you guys sure this can be done by a human hand? Damn these things look so tiny even through a magnifying glass!


In the guide I specifically state not to solder onto the pins themselves, but onto the SMD components on the trace coming from the pin.


----------



## Thoth420

Hats off to all you guys with surgeon's hands that can do this stuff! I shake like a leaf when nervous....


----------



## Otterfluff

Right tools for the right job: you need a decent soldering iron and maybe some technique. buildzoid's tips on soldering and gear were spot on and helped me a ton. It was the first time I ever soldered something that small, and I could barely see it.


----------



## Mega Man

JVC (haven't seen him around in forever) has an excellent guide. They are actually pretty easy.

I can't even find his user anymore...


----------



## Otterfluff

I got 3.3 ohm and 20 ohm resistors from the electronics shop and another 4L of demineralised water. Going to bleed my loop and try volt modding the second Fury X today. I'll post some more photos of the second card.


----------



## Gamedaz

* AMD should let us OC the card; we've got proof that a 1145MHz core clock @ 1.3V is reasonably stable. It probably helps the shaders as well, which IMO would help them stay in sync with the screen refresh. I'm playing some Ryse: Son of Rome on the XFX and I've got 120Hz refresh enabled on my monitor.

AMD uses x.v.Color technology with my monitor, so I even have Film mode enabled on my TV as well, something that produces film-like results in the game but still keeps that solid video feel (though with object stuttering and deep-background stuttering/flicker). It's a bit distracting; instead of that smooth video feel, I think the clocks themselves are contributing to the effect in Ryse. * I still don't know if it's safe to keep my screen's 120Hz mode enabled; this wasn't possible before with my GTX 780 Ti, which blocked that feature on my set.

* Is AMD attempting some sort of driver that will take advantage of the screen's higher refresh rate? I doubt it, since the TV itself only doubles/interpolates the existing frames and doesn't add new ones, but I still feel an OC will help smooth it out more. Does anyone agree? I still have to test my frame rates. I want to use FRAPS, but I don't want too much bloatware on my new Steam Machine install; I know Steam has an FPS feature, but I don't want a glitchy beta either.

Does anyone know when Steam will release the FPS counter without it being in beta?

* I hope AMD will release OC utilities for the newer 900-series cards. Even if they hit 78C, I'm sure the HBM will handle those temps by design. GDDR5X is for the budget GPUs that will be released; I think HBM is for the pricier GPUs coming to market.

How has the memory been holding up with everyone's modded OCs? Any memory temp/heat issues yet?

Please post. I'm sure higher clocks will improve the responsiveness of some on-screen data, but I'm not sure if AMD will update their drivers to improve DX11 performance in games like NVIDIA does; I'm sure that's what their new gaming GPU division is responsible for, at the least. There's surely a market for improving drivers for titles 3-4 years old; why would they ditch or DOA earlier releases?

* I expect drivers will possibly conform to which GPU you have, so you get driver updates specific to a GPU up to a certain previous series, if that makes sense; that way they can focus on 3 series of GPUs instead of 5-6.


----------



## Otterfluff

I do not expect AMD to unlock the voltage; in the past it has always been the efforts of individuals in the community that produced the tools for this.

AMD won't be putting any work into DirectX 11; they haven't for a long time, and I expect them to focus only on DirectX 12.

It's still very early for me to say how good an OC I can get, but I think cooling the card down makes a big difference, i.e. under water. Holding 1150 core clocks with higher voltage is very easy to do.

HBM is still relatively unexplored, and I'm not sure what to make of it yet. It definitely has room to move, but I need a lot more testing. On a single card 600MHz HBM seems stable, but in CrossFire it's very picky. I haven't really been able to add much voltage to the HBM yet, so I can't say whether it has any real effect. I'm hoping to change my mods to get a better HBM voltage; maybe I'll learn a little more, but I'm really only screwing around right now, trying to get a feel for things.

I am not sure what to expect yet.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> I do not expect amd to unlock the voltage, as in the past it has always been the efforts of individuals in the community that have made the tools that support this.
> 
> Amd will not be putting any work into directx11, they haven't for a long time and I expect them to continue work on only directX12.
> 
> It's still very early for me to say how good a OC I can get but I think cooling the card down makes a big difference, ie under water. Holding 1150 core clocks with higher voltage is very easy to do.
> 
> HBM is still relatively un-explored and Im not sure what to make of it yet. It defiantly has room to move around but I need alot more testing. On a single card getting 600Mhz HBM seems stable but when I crossfire it's very picky. I have not really been able to add much voltage to the HBM yet so I cant really say if it would have any real effect. I am hoping to change my mods to try a better HBM voltage, maybe I might learn a little more but I am really only screwing around right now, trying to get a feel for things.
> 
> I am not sure what to expect yet.


If I'm correct with my power-state theory, there's a good chance we'll have cards hitting 1130MHz stock, 1170MHz on 1.3V, and 1200MHz on 1.4V. My personal goal is to get 1250MHz core stable enough to finish Fire Strike Extreme, because I hate how the 980 Ti currently dominates those rankings on HWBot. With the HBM I'm hoping we find that it scales with voltage below 40C at the very worst, because sub-40C is doable with ambient cooling (waiting on those HBM voltage results, Otterfluff).


----------



## Gamedaz

* The entire purpose of having an OC feature is to let the hardware be pushed that way; it would be pointless to introduce a feature (which will not be an open-source mod) and then claim you need separate software to use voltage control with the AMD drivers.

What would be the purpose of bringing to market an overclocking feature that can't be used via the drivers/software?

As a casual user: to market a feature and not implement it properly is neglecting the product, which further shows inconsistent investment in a growing market.

I assume HBM has the GPU and memory on the same die, so both should share the same cooling through the heat spreader (liquid- or air-cooled).

Are 78C temps harmful to the memory or the GPU?


----------



## DMatthewStewart

Quote:


> Originally Posted by *Gamedaz*
> 
> Are 78c temps caustic to the memory or the GPUÉ.


Do you mean strictly for the Fury series, or GPUs in general? If it's GPUs in general, I've seen people run them hotter than that. I wouldn't; I liquid-cool my cards and I get weird about hitting 50C.

I'm curious what others say about the 78C mark for the Fury/Fury X. I've only heard that they run much cooler than previous cards, but I haven't heard what the thermal limits should be.


----------



## Gamedaz

* I have been checking other forums on the benefits of OCing the HBM; they say it will improve shader throughput, but it needs to be coupled to the ROPs and tuned with lower latency if possible.

AMD might rework their drivers so that texture rendering from the software (the game) is handled in memory, possibly improving frame times of textures or distant objects.

28nm GPUs IMO have a 78-82C threshold; 60C at 28nm should be well within their operating capabilities. OCing the GPU core should improve the rendering of incoming draw calls without bottlenecking the memory. The extra voltage should be used to OC the GPU first; 1175MHz, a 110MHz clock bump, could prevent the GPU from bottlenecking the HBM (AMD-tuned).

A marginal frame-rate increase of even 15 FPS @ 1080p is an ideal reason to OC the card in specific games such as Ryse. I'd like to see Ryse stay above 60 FPS; those small dips under 56 every few seconds can create stuttering of objects not being rendered in sync with the whole frame (if that makes sense). Maybe triple buffering will help, but I could still see AMD using the HBM to improve texture rendering for an overclocked GPU core, without temps exceeding the die's threshold.

OC would be useful to saturate the core clocks with low-latency (HBM) rendering data (textures) that the stream processors can take advantage of.

Although it's still unknown whether AMD is marketing HBM as a gimmick or as something that can be practically invested in and brought to market successfully.

I also heard the latency comes down to the memory tree or something, which was supposed to ship at a 645MHz memory clock. NVIDIA's memory tree (which is located in the BIOS itself) is QA'd to ensure smooth, low-latency performance. The extra work may be worth the time for the drivers in general.

* The practical use of HBM is for developers to utilize it to issue more draw calls to the GPU: the CPU processes those draw calls in DirectX 12, then they go to memory, then to the GPU's stream processors. So ultimately developers will use the available memory in the hardware as a source to feed the core with cached draw calls. Intel has algorithms that reduce cache errors; similarly, DirectX 12 has logic streams built in (logic streams prevent a crash when a command does not know its own direction), so it can be coded with fewer errors thanks to the logic tree built into DirectX 12, if that makes sense.


----------



## DMatthewStewart

Quote:


> Originally Posted by *Gamedaz*
> 
> Although it still unknown if AMD is marketing HBM as a Gimick or somthing that can be practically invested in and implemented to the Market successfully.


This is the other reason why I'm not sure if I should get Fury Xs or 390Xs this Xmas. While I don't think HBM is gimmicky, I get what you're saying as far as the market supporting it. After all, we had Blu-ray and HD DVD; both worked, but the market only supported one. I really hope HBM survives (and thrives) if it's workable.

I'm totally blown away that these guys are soldering new resistors in to OC these cards. That's way out of my league, and I don't think I could work up the nerve to do it to a perfectly good, brand-new, expensive card. Kudos to all of you. I'll be watching this thread closely. It might be worth making a separate thread just for this experiment (if one doesn't already exist).


----------



## Gamedaz

It's been shown in other forums that these cards can be OC'd to 1145MHz easily, and 550MHz stable @ 1.3V is fine. AMD needs to improve the latency in the BIOS; sure, it's a dirty job, but someone has to do it. It would be dismal to see GDDR5X RAM come to market.

* I've read that HBM will be used in the higher-price-range cards.


----------



## Gumbi

@Gamedaz Hawaii cards have been operating at 95C for two years now with no issues. I presume you pulled that 78-82C number from your rear?

Sure, it's desirable, but not necessary or critical.


----------



## Otterfluff

Replaced the 7 ohm with 3.3 ohm and the 50 ohm with 20 ohm on both veroboards. I also stuck in a PCB terminal to power the Radeon-logo LED flat panel.





Just waiting for the Plasti Dip to solidify before I reassemble everything.


----------



## Otterfluff

Ended up Plasti Dipping the LED logos on, so I'm going to let it dry overnight. I'll plumb them back in later in the morning.


----------



## Gamedaz

* My card is the XFX R9 Fury (Fiji). I'm sure it can run up to 78-80C with no issues as well, as posted in this thread by the modders.


----------



## pdasterly

any info on fury x2 yet?


----------



## Jflisk

Quote:


> Originally Posted by *pdasterly*
> 
> any info on fury x2 yet?


Try here
http://wccftech.com/amd-r9-fury-x2-dual-gpu/


----------



## pdasterly

That's two months old; the card should be upon us soon.


----------



## Jflisk

Quote:


> Originally Posted by *pdasterly*
> 
> any info on fury x2 yet?


Quote:


> Originally Posted by *pdasterly*
> 
> two months old, card should be upon us soon


There is a newer post somewhere; I believe it is in this forum. Let me see if I can find it.

Give this one a shot while I keep looking for the one on this forum:
http://www.overclock3d.net/articles/gpu_displays/amd_fury_x2_specifications_confirmed_launch_within_two_months/1

The post I was looking for:
http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/5020#post_2456256

Link in the post:
http://wccftech.com/amd-r9-fury-x2-specs-gemini/


----------



## Gamedaz

* There don't seem to be any concrete release dates for it yet. I'm sure they'd like it released soon, maybe with their new Catalyst software before the end of the month?


----------



## Otterfluff

All plumbed in now. My second voltmeter LCD seems to read 100mV lower than the first one; I'll have to compare it with my real multimeter to see if it's misreporting. The first one was very accurate when I compared it.


----------



## battleaxe

Quote:


> Originally Posted by *Otterfluff*
> 
> 
> 
> 
> All plumbed in now, my second volt metre lcd seems to be 100mv lower than the first one. I will have to compare with my real multimeter to see if it's mis-reporting. The first one was very accurate when I compared it.


Impressive. So can you dial in any voltage you want?


----------



## kayan

Hey guys, I've been following this thread off and on for months, but I have a couple of questions:

1) How are the drivers in W10, are they stable?
2) Any overclocking yet?
3) I currently have a 295X2 and I'm tired of dealing with CrossFire, as most of my current games favor NVIDIA anyway. Would moving to a single Fury/X/Nano work well for me @ 3440x1440 (maxing almost everything)?
4) If so, which would users here recommend: the Fury, the X, or the Nano?


----------



## Otterfluff

The core voltage capped at 1.34V on my old setup; I should be able to get higher now, hopefully 1.4V+.

I tested the LCDs with my more expensive multimeter, and they're accurately reporting the voltage on both cards. The core and HBM really are 100mV lower on the second card than on my first.

The main Fury starts at 1.24V core and 1.3V HBM.
The slave is getting 1.13V core and 1.2V HBM.

I know it's not my resistors, because I swapped the boards; the new volt mod is using the old control board. Either my second Fury X just runs 100mV lower, or is this a result of it being the slave?


----------



## Otterfluff

Quote:


> Originally Posted by *kayan*
> 
> Hey guys, I've been following this thread off and on again for months, but had a couple of questions:
> 
> 1) How are the drivers in W10, is it stable?
> 2) Any overclocking yet?
> 3) I've currently got a 295x2 and tired of dealing with x-fire, as most of my current games are Nvidia anyway. Would moving to a single Fury/X/Nano work well for me @ 3440x1440 (maxing almost everything)?
> 4) If so, what would users here recommend the Fury, the X, or the Nano?


A single Fury X is a pretty good deal for 4K, but it's still not powerful enough to max everything and hold 60FPS; there is no single-GPU solution out yet that can handle high settings at 4K.

If you've got the space for the Fury X radiator, then get it: cooler, quieter, and faster.
Get the regular air-cooled Fury if you can't fit the Fury X.
Skip the Nano unless you want a very small form factor.


----------



## kayan

Quote:


> Originally Posted by *Otterfluff*
> 
> A single Fury X is pretty good deal for 4k but it's still not powerful enough to max everything and get 60fps. There is no single gpu solution out that can handle the high settings of 4k gaming yet.
> 
> If you got the space for Fury X radiator then get it. Cooler quieter and faster.
> Get the reg air cooled fury if you cant fit the Fury X
> Skip the nano unless you want a very small form factor.


I've got a huge case at the moment, so space isn't an issue. I do have a custom loop that I may add my next GPU to. Could I get the same kind of performance from a watercooled Fury as from a Fury X? Fury plus waterblock = Fury X stock price. I know the specs are different, but what are users saying?


----------



## Otterfluff

Do not get the ASUS Fury Strix, as it uses a custom PCB and you won't find a waterblock for it.

You'll save $100 getting the regular Fury if you plan on a custom waterblock, and you might get lucky unlocking CUs to a full Fury X, but that's a lottery. Sometimes they bin the chips because some cores are no good; other times they just disable them, in which case you might be able to unlock them with a BIOS mod.


----------



## buildzoid

Another good reason to avoid the ASUS is that it has by far the worst cooler.


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> Another good reason to avoid the ASUS is that it has by far the worst cooler.


They have consistently dropped the ball with their high-end AMD cards the past few gens. Their 390(X)s are the worst of the bunch, with much worse VRM/core cooling compared to other brands.

The 290/290X Asus was appalling. They lifted the heatsink straight from the 780 and plopped it down on the 290, leaving one or two of the heatpipes not touching the core at all! VRM cooling was very poor too.


----------



## Otterfluff

OK, so my HBM voltage now maxes out at 1.42V. The good news is it definitely made things more stable: zero artifacts when clocked to 630MHz. The extra voltage made a difference. I can't scale past the 630MHz wall though; with extra voltage I get far fewer white-spot artifacts when going above 630 in Windows, but there's definitely a wall of some sort there.

Hopefully the extra volts into the HBM will help stability with HBM overclocks during crossfire, that would be really nice.

I got up to 1.46V on the core and decided I didn't need to go higher, so I'm not sure where the ceiling is with this resistor combo. More good news: I managed a 1200MHz core overclock at 1.36-1.37V core with no artifacts or lost black frames, and completed all tests in 3DMark. I think 1200 core stable is a significant milestone; it certainly feels very good.

For now I am going to start testing my other newly modded Fury X and explore why it has a 100mV lower stock voltage. I tried running it as the primary card by disconnecting the first one, and the voltage didn't change.

Maybe they set different stock voltages depending on how the chip responds at the factory? I should find out where its limits are compared to my first Fury X and compare them.


----------



## kayan

Quote:


> Originally Posted by *buildzoid*
> 
> Another good reason to avoid the ASUS is that it has by far the worst cooler.


Quote:


> Originally Posted by *Gumbi*
> 
> They have consistently dropped the ball with their high-end AMD cards the past few gens. Their 390(X)s are the worst of the bunch, with much worse VRM/core cooling compared to other brands.
> 
> The 290/290X Asus was appalling. They lifted the heatsink straight from the 780 and plopped it down on the 290, leaving one or two of the heatpipes not touching the core at all! VRM cooling was very poor too.


Yeah, I have a personal ban on most Asus stuff anyway. I won't buy them unless necessary. For AMD GPUs it's XFX, VisionTek, or Sapphire.

Anyway, are the XFX and Sapphire Furys stock PCBs or are they custom?


----------



## Wovermars1996

Quote:


> Originally Posted by *kayan*
> 
> Yeah, I have a personal ban against most Asus stuff anyway. I won't buy them unless necessary. For AMD GPUs it's XFX, Visiontek, or Sapphire.
> 
> Anyway, are the XFX and Sapphire Fury stock pcb or are they custom?


I'm sure Asus is the only one with a custom PCB.


----------



## Otterfluff

Figured out the 100mV difference in voltage between cards: the ground was not making a secure connection in my PCB terminal clamp. All seems good now.

I reproduced the 1200MHz core OC on the second card @ 1.33V core and 1.36V HBM. It looks like this card likes less voltage for the same OC. Tested in 3DMark with no artifacts or lost black frames.

It can do 630MHz HBM. I am in the process of testing 1200 core and 630 HBM combined for each card, and then I will experiment with CrossFire.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Figured out the 100mV difference in voltage between cards: the ground was not making a secure connection in my PCB terminal clamp. All seems good now.
> 
> I reproduced the 1200MHz core OC on the second card @ 1.33V core and 1.36V HBM. It looks like this card likes less voltage for the same OC. Tested in 3DMark with no artifacts or lost black frames.
> 
> It can do 630MHz HBM. I am in the process of testing 1200 core and 630 HBM combined for each card, and then I will experiment with CrossFire.


Sweet! Have you tried 1.4V on core? Also does HBM voltage actually help with anything?


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> Sweet! Have you tried 1.4V on core? Also does HBM voltage actually help with anything?


I haven't tested 1.4V yet, as I get crashes if I put the voltage too high for the core clock. After I finish getting both cards stable in CrossFire at 1200/630 I will try higher core clocks/voltages.

I am now trying 1.4V HBM and it seems to like it. It is especially good at keeping the particle effects in the last test of Fire Strike Ultra from deforming.

The second card is at 1.33V 1200MHz core and 630MHz 1.4V HBM right now and I think I have it stable. But this second card is a lot more fussy than the first, which likes 1.36-1.37V core. The first card takes more voltage but seems to have more room to move around to get it stable.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> I haven't tested 1.4V yet, as I get crashes if I put the voltage too high for the core clock. After I finish getting both cards stable in CrossFire at 1200/630 I will try higher core clocks/voltages.
> 
> I am now trying 1.4V HBM and it seems to like it. It is especially good at keeping the particle effects in the last test of Fire Strike Ultra from deforming.
> 
> The second card is at 1.33V 1200MHz core and 630MHz 1.4V HBM right now and I think I have it stable. But this second card is a lot more fussy than the first, which likes 1.36-1.37V core. The first card takes more voltage but seems to have more room to move around to get it stable.


Hmm, sounds like a cap mod might help get better Vcore scaling, though 6 phases really should be pretty clean.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kayan*
> 
> Yeah, I have a personal ban against most Asus stuff anyway. I won't buy them unless necessary. For AMD GPUs it's XFX, Visiontek, or Sapphire.
> 
> Anyway, are the XFX and Sapphire Fury stock pcb or are they custom?


My XFX is definitely reference, but the CUs are hard-locked...


----------



## battleaxe

Quote:


> Originally Posted by *Otterfluff*
> 
> I haven't tested 1.4V yet, as I get crashes if I put the voltage too high for the core clock. After I finish getting both cards stable in CrossFire at 1200/630 I will try higher core clocks/voltages.
> 
> I am now trying 1.4V HBM and it seems to like it. It is especially good at keeping the particle effects in the last test of Fire Strike Ultra from deforming.
> 
> The second card is at 1.33V 1200MHz core and 630MHz 1.4V HBM right now and I think I have it stable. But this second card is a lot more fussy than the first, which likes 1.36-1.37V core. The first card takes more voltage but seems to have more room to move around to get it stable.


Your results are looking really good. This is shaping up to be a nice card. It will be very interesting to see what kind of scores you get at these settings. Looking forward to your updates.

+1


----------



## Otterfluff

OK, so after a lot of testing, this is what I've got.

My first card can do 1200MHz at 1.36-1.37V core and 630MHz at 1.4V HBM, fully stable as a single card with no artifacts.

My second card can do 1150MHz at 1.28V core and 630MHz at 1.4V HBM, fully stable as a single card with no artifacts.

I can't keep the first card at 1200 under CrossFire. I have successfully been running CrossFire with both cards at 1150MHz (1.32V core on one, 1.28V on the other) and 630MHz 1.4V HBM for both.

This has been tested under 3DMark and The Witcher 3.

I found I could not CrossFire with stock HBM voltage; the OC was unstable and had severe artifacts. After raising the voltage those artifacts disappeared.

I did try voltages up to 1.42V on the first card, but I was unsuccessful at raising the clock above 1200MHz. Even 1220MHz was just plain unstable for me no matter what I tried. But I can get 1200MHz on it easily within a margin of 300mV.

I will try raising the core clocks on both under CrossFire, but I honestly doubt I can do better than, say, 1200 on the first card and 1170 on the second.

There is a wall for HBM overclocks at 630MHz; it does not seem to be affected by anything, it's just there. I did notice that with more voltage the artifacting above 630MHz diminishes but does not disappear completely, but you would need more than 1.4V, if that's even safe. The artifacts are white dots that slowly cover the screen; this happens in Windows as soon as you set the HBM above 630MHz, and Windows ends up locking up shortly after.

Right now I consider the 1150 core and 630 HBM overclock to be solid in CrossFire for playing games without any worries.
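For context, a quick sketch of what those CrossFire-stable clocks work out to relative to the Fury X's stock 1050MHz core / 500MHz HBM:

```python
# Percentage headroom of the CrossFire-stable overclock over Fury X stock clocks.
stock_core, stock_hbm = 1050, 500  # MHz, Fury X reference clocks
oc_core, oc_hbm = 1150, 630        # MHz, the CrossFire-stable settings above

core_gain = (oc_core / stock_core - 1) * 100
hbm_gain = (oc_hbm / stock_hbm - 1) * 100
print(f"core +{core_gain:.1f}%, HBM +{hbm_gain:.1f}%")  # core +9.5%, HBM +26.0%
```

Interesting that the HBM is taking a much larger relative overclock than the core before hitting its wall.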


----------



## Neon Lights

Quote:


> Originally Posted by *Otterfluff*
> 
> Right now I consider the 1150 core and 630 HBM overclock to be solid in CrossFire for playing games without any worries.


I can play at ~1140MHz and ~570MHz without a volt mod (however, I have not tested this in Crossfire).


----------



## buildzoid

Quote:


> Originally Posted by *Neon Lights*
> 
> I can play at ~1140MHz and ~570MHz without a volt mod (however, I have not tested this in Crossfire).


That card could probably do 1200MHz+ with more voltage.


----------



## Neon Lights

Quote:


> Originally Posted by *buildzoid*
> 
> That card could probably do 1200mhz+ on voltage.


Well, I hope it will.

I am not planning to do a hard voltage mod; I am really hoping for a soft one (via BIOS). What did I buy those waterblocks for if I have to keep the cards at stock voltage?


----------



## Alastair

How much power do you think your cards are pulling at ~1.35V and 1200 core?


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> How much power do you think your cards are pulling at 1.35+- and 1200 core?


Enough to take down Eskom.


----------



## buildzoid

Quote:


> Originally Posted by *Alastair*
> 
> How much power do you think your cards are pulling at 1.35+- and 1200 core?


Around 400W.


----------



## Gumbi

Quote:


> Originally Posted by *buildzoid*
> 
> Around 400W.


How does that compare to a 290X at 1.35V and 1200MHz? My 290X Vapor-X can bench 3DMark no problem at 1240MHz/1640MHz at +200mV, which is 1.4V (it actually peaks at 1.422V).


----------



## Alastair

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> How much power do you think your cards are pulling at 1.35+- and 1200 core?
> 
> 
> 
> Enough to take down Eskom.

Dude, my computer was shutting down Eskom back when it was still only running two 6850s.


----------



## Alastair

Quote:


> Originally Posted by *buildzoid*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> How much power do you think your cards are pulling at 1.35+- and 1200 core?
> 
> 
> 
> Around 400W.

I am assuming that that is 400W per card, obviously.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Alastair*
> 
> Dude, my computer was shutting down Eskom back when it was still only running two 6850s.


----------



## buildzoid

The 290X pulls a little more power in games.

However, Fiji running Furmark is much, much worse than Hawaii running Furmark. The reason Fiji uses so little power in games is that most of the 4096 SPs sit idle. If you load them all 100% (like running Furmark), the card uses 420W at stock clocks, whereas the 290X only pulls 320W at stock clocks.
Quote:


> Originally Posted by *Alastair*
> 
> I am assuming that that is 400w per card obviously.


Yep, per card. At stock the cards need 250W in game, and overclocked it should be 400W +/- 10% (yeah, that's a big error margin, but better safe than sorry).
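That ballpark lines up with the usual first-order dynamic-power rule of thumb (power scales linearly with frequency and with voltage squared); a sketch, assuming ~1.2V stock load voltage and ignoring static leakage:

```python
# First-order dynamic power estimate: P ~ P0 * (f/f0) * (V/V0)^2.
# Assumes ~1.2V stock load voltage and ignores static leakage, so this is
# a ballpark, not a measurement.
def scaled_power(p0_w, f0_mhz, v0, f_mhz, v):
    return p0_w * (f_mhz / f0_mhz) * (v / v0) ** 2

# 250W stock in game at 1050MHz/1.20V, scaled to 1200MHz at 1.35V:
est = scaled_power(250, 1050, 1.20, 1200, 1.35)
print(f"~{est:.0f} W per card")  # ~362 W
```

The leakage the formula ignores only grows with voltage, so the real figure sits above this estimate, consistent with the 400W +/- 10% guess.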


----------



## xer0h0ur

Quote:


> Originally Posted by *Gamedaz*
> 
> Although it is still unknown whether AMD is marketing HBM as a gimmick or as something that can be practically invested in and implemented in the market successfully.


HBM1 arrived just before GDDR reached the end of its efficiency life cycle. GDDR5X will more than likely be one of the last, if not the last, evolutions of GDDR, and it's comparable to HBM1 in terms of bandwidth, but that is where the comparison ends. The power requirements and the PCB space taken up by GDDR5/X and its components factor greatly into AMD's and Nvidia's decisions to move forward with HBM in subsequent generations. Pascal and Arctic Islands are unlikely to carry HBM2 throughout their entire lineups, merely due to the availability and cost of HBM2. It's not like this will be the case forever: it will eventually be manufactured by Samsung as well, which will drive the cost down significantly over time, as is always the case once there is a competitor.

HBM1 is currently viewed as a gimmick because the full bandwidth isn't in use yet. The upcoming generations (the high-end dies), though, are due to pack a hell of a lot more power and will outgrow GDDR5X's and HBM1's bandwidth, making HBM2 and beyond necessary. Remember, HBM2 is a far greater leap in terms of capacity and bandwidth; GDDR5, even in the X variety, can't hold a candle to HBM2.
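The bandwidth side of that comparison is easy to sanity-check: peak memory bandwidth is just bus width times effective transfer rate. A sketch using commonly quoted figures (HBM1 on Fiji: 4096-bit at 1 GT/s effective; GDDR5 on the 290X: 512-bit at 5 GT/s; a hypothetical 384-bit GDDR5X card at 10 GT/s):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bytes) * effective GT/s.
def bandwidth_gbs(bus_bits, gtps):
    return bus_bits / 8 * gtps

hbm1 = bandwidth_gbs(4096, 1.0)    # Fiji: 4096-bit, 500MHz DDR -> 512 GB/s
gddr5 = bandwidth_gbs(512, 5.0)    # 290X: 512-bit, 5 GT/s -> 320 GB/s
gddr5x = bandwidth_gbs(384, 10.0)  # hypothetical 384-bit GDDR5X -> 480 GB/s
print(hbm1, gddr5, gddr5x)  # 512.0 320.0 480.0
```

The point stands: doubling GDDR5's per-pin rate only just catches up to first-generation HBM, while HBM2 doubles per-stack bandwidth again on top of that.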


----------



## Thoth420

I just hope the next flagship single GPUs with HBM will still be about the same dimensions as the Fury X. My loop has just enough space because the card is so small.


----------



## DMatthewStewart

Quote:


> Originally Posted by *hyp36rmax*
> 
> 
> 
> 
> Are we getting closer...?


I'm asking the same thing about a Lightning version. Do we know anything about that? Or are we going to get a 390X Lightning? I saw the 980 Ti version just came out.

Also, do you think they will release a naked version, card only? Because I'm thinking it's a waste to pay for the pump, stock block, rad, fan, etc. I'm tearing all that stuff off when I get it.

*But my main question* (and concern), because you'll probably know this: do you know anyone running the Fury X on the Crosshair V Formula-Z mobo? I want to figure out whether they have experienced any problems. I want to build an i7 rig next year, so I'm wondering if I should wait to get Furys until I'm ready to build it.


----------



## Gamedaz

* That's a 380X @ $259; it's supposed to compete with the GTX 960 Ti, possibly.


----------



## Gamedaz

* So it seems that the current Fiji cards can be overclocked without much extra voltage?
Quote:


> Originally Posted by *Neon Lights*
> 
> I can play at ~1140MHz and ~570MHz without a volt mod (however, I have not tested this in Crossfire).


* So you've OC'd to 1140MHz / 570MHz on a Fury? What are the temps?

That would possibly help with Nvidia-sponsored games that might need some extra juice to get better frame rates, if that makes sense.

* I will definitely consider OC'ing for some games that need it.

* Has anyone heard when the new Catalyst (Crimson) software will be released? Supposedly soon, on Nov 24.

I like the idea of a better user interface, but more importantly the idea that AMD will now support new game releases with proper drivers before they come to market.


----------



## solariss

He might be using a Fury X; that would make more sense. It would be one hell of a Fury if it can do 1140/570 without a volt mod. I can get 1080/550 at most from my Sapphire R9 Fury OC.


----------



## dagget3450

I can confirm with the Fury X that CrossFire reduces overclocks and stability that previously weren't an issue. I'm not sure if it's simply that each GPU clocks differently. I guess I should take some time and test each one by itself.


----------



## fewness

So I hooked up wires to the voltage measuring points... an easier job for practice...

Glad I didn't destroy it, but now I'm only measuring 1.16V core and 1.25V HBM when running Valley and Heaven. Is my multimeter off, or am I missing some steps here?


----------



## buildzoid

Quote:


> Originally Posted by *fewness*
> 
> So I hooked up wires to the voltage measuring points....easier job for practice...
> 
> Glad I didn't destroy it, but now I'm only measuring 1.16 v core and 1.25 v HBM when running Valley and Heaven. Is my multimeter off or I'm missing some steps here?


Looks like your DMM is off by 0.05V. Try replacing its battery.


----------



## Gumbi

Bit of a noob with this kind of stuff. I have an analog voltmeter with two probes; if I wanted to read the voltage of my card (290X Vapor-X), how would I go about doing so?


----------



## Neon Lights

Quote:


> Originally Posted by *solariss*
> 
> He might be using a Fury X, that would make more sense. That would be one hell of a Fury if it can go 1140/570 without a volt mod. I can get 1080/550 at most from my Sapphire R9 Fury OC.


I never said I am using a Fury without the X. I am using two Fury Xs with Aqua Computer waterblocks. I have already posted the pictures in this thread: http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/3000_100#post_24214780 (By the way, those pictures are a little out of date; I have since exchanged the RAM for 4x4GB 2133MHz CL8.)


----------



## Agent Smith1984

So, question: is the Fury Pro using a lower Vcore than the Fury X, and if so, is that a result of BIOS differences or hardware differences?

I just don't understand how the Fury X OC results are so much higher than the Fury Pro's. I can't see them putting that much work into binning, and the temps on my Fury max out at around 60-64°C under load for an hour, so there isn't a drastic temp difference to explain it either.

Has anyone tested actual load voltage on each card to verify?

I am getting 1.168V core under load during 4K gaming, which is SO low (mind you, this is at a 1060MHz core clock). Are there any power differences between the first and second BIOS?

I have the Sapphire BIOS flashed on switch one, and the stock XFX BIOS left in place on switch two (away from the DisplayPorts).


----------



## Neon Lights

Quote:


> Originally Posted by *Gamedaz*
> 
> * So you've OC to 1140MHZ / 570MHZ on a Fury? What are the temps?


I am using Aqua Computer waterblocks with a 560mm radiator to cool two Fury Xs, and the temps with that overclock (though only using one card) are about 32°C. I could turn the fans up more and the temps would go down. (I am running 8x 140mm fans, 4 on each side of the radiator, PWM-controlled by the motherboard and set to level 1 of 10, which I believe is somewhere between 300 and 500 rpm; the fans go up to about 1400 rpm.)

The overclock settings, in Sapphire Trixx 5.0.0, are ~1140MHz core clock, ~570MHz HBM and +19% power limit.


----------



## Neon Lights

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So question.... is the Fury pro using a lower vcore than the Fury X, and if so, is that a result of BIOS differences, or hardware differences?
> 
> I just don't understand how the Fury X OC results are so much higher than the Fury Pro's..... I can't see them putting that much work into binning, and the temps on my Fury are maxing at around 60-64C at max load for an hour, so there's not a drastic temp difference to explain it either.
> 
> Anyone tested actual load voltage on each card to verify??
> 
> I am getting 1.168 vcore under load during 4k gaming, which is SO low..... Is there any power differences in the first and second BIOS?
> 
> I have the Sapphire BIOS flashed on switch one, and the stock XFX BIOS left in place on switch two (away from d-ports).


It seems, at least to me, to be because of the BIOS: it has been said in this thread that the Fury X gets up to about 1.3V under load, and if your card is really only getting 1.168V then that is very low and would explain the worse overclocking results.

I would not say it is down to the card or binning, because the PCB design is the same on the Fury and Fury X cards, and the ASUS Fury Strix has been overclocked to ~1400MHz core clock and ~1000MHz HBM.


----------



## gupsterg

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So question.... is the Fury pro using a lower vcore than the Fury X, and if so, is that a result of BIOS differences, or hardware differences?


Quote:


> Originally Posted by *Neon Lights*
> 
> It seems, at least to me, to be because of the BIOS: it has been said in this thread that the Fury X gets up to about 1.3V under load, and if your card is really only getting 1.168V then that is very low and would explain the worse overclocking results.


From what I recall when comparing the Fury PowerPlay table to the Fury X's, the Fury X has higher voltages between the GPU DPM voltage markers. Convert the pairs of hex bytes to decimal (doing the endian conversion beforehand) and you will see 3 values per DPM. On Hawaii there was also a section in the PowerPlay table with 3 voltages per GPU DPM state; those had no effect, and the way they sat in PowerPlay differed from Fury/X.

The way you guys are describing the difference between a card flashed with a Fury or X BIOS tells me it does have an effect on Fury. I did suspect this, as those 3 differing values sit between the GPU DPM voltage markers, which they didn't on Hawaii.
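For anyone poking at the table themselves, a minimal sketch of that byte-pair conversion: each value is a little-endian 16-bit integer (typically millivolts for voltage entries). The example bytes below are hypothetical, not from an actual dump:

```python
import struct

def le16(buf, offset):
    """Read a little-endian unsigned 16-bit value at the given byte offset."""
    return struct.unpack_from("<H", buf, offset)[0]

# Hypothetical PowerPlay fragment: three voltage values for one DPM state, in mV.
raw = bytes([0xB0, 0x04, 0xC4, 0x04, 0xD8, 0x04])
volts = [le16(raw, i) for i in range(0, len(raw), 2)]
print(volts)  # [1200, 1220, 1240]
```

So the byte pair `B0 04` in a hex editor reads back as 0x04B0 = 1200 once the two bytes are swapped, which is the endian step mentioned above.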


----------



## Decade

Wow... I just moved my Sapphire R9 Fury X from my Air 240 into a new Core 500 and it's suddenly developed severe coil whine... sounds like an angry wasp trapped in a can.
I also just switched to an RM650, which shows no unusual noise when loading just the CPU.


----------



## xer0h0ur

Quote:


> Originally Posted by *Decade*
> 
> Wow... just moved my Sapphire R9 Fury X from my Air 240 into a new Core 500 and it's suddenly developed severe coil whine... sounds like an angry wasp trapped in a can.
> Also just switched to an RM650 which appears to show no unusual noise when loading just the CPU.


Methinks the PSU change is at fault on this one.


----------



## Decade

Quote:


> Originally Posted by *xer0h0ur*
> 
> Methinks the PSU change is at fault on this one.


Yep, just tore this thing apart and can hear it from the PSU exhaust mesh.
Ugh. RMA to Corsair or return to Fry's? I'll be running my EVGA 750 externally for now.


----------



## xer0h0ur

Although manufacturers always say to return the item to them, I never bother with manufacturer warranties unless I absolutely have to. If it's within the return period at the retailer, I will always just have them exchange it; then they can deal with returning the defective item to the manufacturer to get credited for it.


----------



## Decade

Considering I purchased it about 4 hours ago, it's well within the return window.

It's just a hassle to completely pull a power supply from a small-form-factor system, but the noise is unbearably loud; it borders on "so loud I'd damage my hearing if I were using headphones" loud.


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> Although manufacturers always say to return the item to them I never bother with using manufacturer warranties unless I absolutely have to. If its within the buy period from a retailer I will always just have them change it out for me then they can deal with returning the defective item to the manufacturer to get credited for it.


I agree with this! You're also guaranteed to get a new one in the box if you go back to a legitimate retailer.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Methinks the PSU change is at fault on this one.


Edit: As soon as I saw Corsair RM... my spider sense started tingling.


----------



## xer0h0ur

Quote:


> Originally Posted by *hyp36rmax*
> 
> I agree with this! You're also guaranteed to get a new one in the box if you go back to a legitimate retailer.


I actually completely forgot about that. If you return the item to the manufacturer, you're far more likely to get a refurbished unit than a new power supply, while if you go back to the retailer, the worst-case scenario is a refund instead of a brand-new PSU.


----------



## Decade

Quote:


> Originally Posted by *xer0h0ur*
> 
> I actually completely forgot about that one. If you return the item to the manufacturer you're far more likely to get a refurbished unit than a new power supply in return while if you go back to the retailer the worst case scenario is a refund instead of a brand new PSU.


I'll do the refund option at Fry's tomorrow, considering I just picked it up today.
Then I'll drive 30 miles to Micro Center to grab an EVGA SuperNOVA 550 G2 for the same price.
My 750 G2 never gave me a single issue, aside from being too big for the Fractal Core 500.


----------



## Thoth420

Quote:


> Originally Posted by *Decade*
> 
> I'll do the refund option at Fry's tomorrow considering I just picked it up today.
> I'll then drive 30 miles to Micro Center to grab an EVGA SuperNova 550 G2 for the same price.
> My 750 G2 never gave me a single issue outside of just being too big for the Fractal Core 500.


Good choice!


----------



## Decade

Quote:


> Originally Posted by *Thoth420*
> 
> Good choice!


Gonna order one on the cheap at a later date. Fry's had an Antec HCG-620M in stock... hidden behind some 520W versions.
The EVGA is definitely a better unit, but all I can hear now is my Fury X's normal (and rather quiet) coil whine, which gets drowned out by audio. And it's unlikely the Antec unit is going to blow up my rig anyway.
We'll see how loud the PSU fan gets, but I can deal with fan noise better than insane whine.

Now that it's been tested... time to put my Core 500 back together.


----------



## Thoth420

Quote:


> Originally Posted by *Decade*
> 
> Gonna order one on the cheap at a later date. Fry's had an Antec HCG-620M in stock... hidden behind some 520w version.
> EVGA is definitely a better unit, but all I can hear now is my Fury X's normal (and rather quiet) coil whine that gets drowned out with audio. And it's unlikely the Antec unit is going to blow up my rig anyways.
> We'll see how loud the PSU fan gets, but I can deal with fan noise better than insane whine.
> 
> Now that it's been tested... time to put my Core 500 back together.


Yep, the G2 and P2 units are built by Super Flower and are fantastic. I've never had a coil-whine issue since using them.

Props to user Shilka (dude is like the PSU Yoda).


----------



## fewness

Quote:


> Originally Posted by *buildzoid*
> 
> Looks like your DMM is off by 0.05V. Try replacing its battery.


I knew this wouldn't be an easy job....thanks again. Hopefully I can come back with some progress...


----------



## Luftdruck

Hey guys!

Did someone already try to flash the Fury X's BIOS on a Nano?


----------



## fat4l

It has probably been answered already, but...
are all Fury Xs a reference PCB?


----------



## Ceadderman

Yes.

~Ceadder


----------



## buildzoid

Quote:


> Originally Posted by *Luftdruck*
> 
> Hey guys!
> 
> Did someone already try to flash the Fury X's BIOS on a Nano?


AFAIK no and I don't recommend trying it. The cards use completely different voltage controllers and power management systems.


----------



## Medusa666

Hey guys, can I change the DIP switch for the LED color on the Fury X and Sapphire Fury while the card is on? Or do I need to turn off the PC before I do it?

Edit: Oh, and what colors are available on the Sapphire Fury? I can only get blue or purple on my card; isn't there supposed to be red?

Thank you : )


----------



## Gundamnitpete

Hey guys!

A few days ago one of the 290s in my CrossFire 290 setup died on me. It was an old mining card, so I figured that might happen.

I decided to go back to a single card and bought an Asus Strix Fury. I was mightily impressed! I ran a few benchmarks in TW3 and the average frame rates were nearly the same as my CrossFire setup's. Minimum frame rates were higher, maximums were lower, but the average was within 2-3 fps. Also, I never noticed how much microstutter I was putting up with! A single card is so smooth.

Now onto the issues I'm having:

I'm thinking my card is dead. Artifacts appear even at stock clocks. Often there will be a small, bright red "orb"-looking thing that flashes for a second. Textures can sometimes get stretched all the way across the screen. And most importantly, games crash regularly. TW3, FO4, and even Titanfall have all locked up on me. Windows gives me a "display driver has crashed and recovered" message.

At first I thought this was a problem with my screen OC tool (I have a Qnix 2710). I was on the 15.11 drivers and the crashes kept happening. I uninstalled using AMD's method, and also used DDU later on, and reinstalled the drivers multiple times. It was still crashing on me. I reverted back a few driver revisions, but that didn't really seem to help.

Did I get a bad card? It's been like this since day one. I bought it at Micro Center, so I'm thinking I can take it back and exchange it for another, or get a refund and buy a new one. Would you all agree?


----------



## buildzoid

Quote:


> Originally Posted by *Gundamnitpete*
> 
> Hey guys!
> 
> A few days ago one of my 290's in my crossfire 290 setup died on me. It was an old mining card so I figured that might happen.
> 
> I decided to go back to single card and bought an Asus Strix Fury. I was mighty impressed! I ran a few benchmarks in TW3 and it average frame rates were nearly the same as my crossfire setup. Minimum frame rates were higher, maximum were lower, but average was within 2-3 fps. Also, I never noticed how much microstutter I was putting up with! Single card is so smooth.
> 
> Now onto the issues I'm having:
> 
> I'm thinking my card is dead. Artifacts appear even at stock clocks. Often there will be a small, bright red "orb"-looking thing that flashes for a second. Textures can sometimes get stretched all the way across the screen. And most importantly, games crash regularly. TW3, FO4, and even Titanfall have all locked up on me. Windows gives me a "display driver has crashed and recovered" message.
> 
> At first I thought this was a problem with my screen OC tool(I have a Qnix 2710). I was on 15.11 drivers and the crashes kept happening. I uninstalled using AMD's method, and also used DDu later on, and reinstalled drivers multiple times. It was still crashing on me. I reverted back a few driver revisions but that didn't really seem to help.
> 
> Did I get a bad card? It's been like this since day one. I bought it at Micro Center, so I'm thinking I can take it back and exchange it for another, or get a refund and buy a new one. Would you all agree?


Yeah sounds like a bad card. I'd return it.


----------



## xer0h0ur

I would recommend trying BradleyW's manual driver-removal instructions to clear out the registry keys left behind that even DDU doesn't remove. If that doesn't do it, then more than likely you are dealing with a hardware issue.


----------



## Gundamnitpete

Quote:


> Originally Posted by *buildzoid*
> 
> Yeah sounds like a bad card. I'd return it.


Thanks for the input!
Quote:


> Originally Posted by *xer0h0ur*
> 
> I would recommend trying BradleyW's manual driver removal instructions to manually clear registry keys left behind that even DDU doesn't remove. If that doesn't do it then more than likely you are dealing with a hardware issue.


Thanks! I actually saw you post about that earlier in the thread (yes, I read all 500+ pages over the last few days, haha. Dat new hardware hype doe).

Did you mean this one specifically? http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

If so, I'll try that when I get off work tomorrow morning around 7am. Thanks for the input!


----------



## Gamedaz

Quote:


> Originally Posted by *Medusa666*
> 
> Hey guys, can I change the dip switch for the LED color on the Fury X and Sapphire Fury while the card is on? Or do I need to turn off the PC before I do it?
> 
> Edit: Oh, and what colors are available on the Sapphire Fury? I can only get blue or purple on my card, isn't there supposed to be red?
> 
> Thank you : )


There's a manual for the Fury X which shows what each DIP switch does.

DIP# 1 & 2

1=on 2=on = RED + BLUE (BOTH ON)
1=on 2=off = BLUE ONLY

* I find it a convenient feature when you're trying to remove variables while evaluating your hardware.
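For reference, the two switch positions listed above can be encoded as a simple lookup table. This is just a sketch: only the two combinations quoted from the manual are documented here, so anything else is treated as unknown.

```python
# The two DIP combinations documented above for the Fury X load-indicator
# LEDs. Other combinations aren't covered in the post, so they fall through
# to a placeholder rather than a guessed color.
LED_DIP_TABLE = {
    (True, True): "red + blue",   # 1=on, 2=on
    (True, False): "blue only",   # 1=on, 2=off
}

def led_color(dip1: bool, dip2: bool) -> str:
    """Return the load-indicator LED color for a given DIP switch setting."""
    return LED_DIP_TABLE.get((dip1, dip2), "undocumented combination")
```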


----------



## dagget3450

It appears the LED color switches only apply to the load indicator LEDs and not the Radeon logo?


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> It appears the LED color switches only apply to the load indicator LEDs and not the Radeon logo?


That's correct.


----------



## solariss

Had to RMA my Sapphire Fury. My display would randomly become illegible a few times a day. Contacted Sapphire support and they said it was a hardware issue. I had this exact same problem with my old EVGA GTX 670 when I first bought it; it ended up being a bad batch of silicon that made it through QA. Makes me wonder if there was a batch of bad silicon for the Furys, because I see a bunch of other people having the same issue.


----------



## Agent Smith1984

Oh my....

http://www.newegg.com/Product/Product.aspx?Item=N82E16814161475&cm_re=r9_fury-_-14-161-475-_-Product


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh my....
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814161475&cm_re=r9_fury-_-14-161-475-_-Product


Not bad, but still room to fall IMO.


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> Not bad, but still room to fall IMO.


I scored my XFX Fury at $520.... which is still too much for this card in my opinion, but with the GTX 980 still selling for $500, and this thing trumping it.... it was a no-brainer....
I couldn't resist the urge...... I actually have this card for sale right now at a good price, but that's only because it's kind of boring with no tweakability


----------



## p4inkill3r

Selling the Fury already?


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> Selling the Fury already?


Yeah... unless we get some voltage control soon










$485 in the market section right now.... open to trades + cash also.


----------



## Gamedaz

* I hope AMD releases a driver that can OC all Fury Cards.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> * I hope AMD releases a driver that can OC all Fury Cards.


We can already OC all of the Fury cards....

It's the lack of voltage control through 3rd-party apps that is killing the OC ability of these cards....

I mean.... my card will get through firestrike right now at 1085MHz with the vcore reporting 1.168v!! If I could get that in the 1.25-1.3 range, I'd probably be able to hit a minimum of 1150MHz with more voltage, which isn't ground breaking, but overclocking the core seems to offer up a good bit of performance increase on these cards.....

I have never seen voltage control take this long for any GPU.... at this point, I am ALMOST ready to just assume it won't be happening.
Last I read, unwinder won't even comment on it anymore.... yet we know it IS possible, since it has been done....


----------



## fat4l

Quote:


> Originally Posted by *Agent Smith1984*
> 
> We can already OC all of the Fury cards....
> 
> It's the lack of voltage control through 3rd-party apps that is killing the OC ability of these cards....
> 
> I mean.... my card will get through firestrike right now at 1085MHz with the vcore reporting 1.168v!! If I could get that in the 1.25-1.3 range, I'd probably be able to hit a minimum of 1150MHz with more voltage, which isn't ground breaking, but overclocking the core seems to offer up a good bit of performance increase on these cards.....
> 
> I have never seen voltage control take this long for any GPU.... at this point, I am ALMOST ready to just assume it won't be happening.
> Last I read, unwinder won't even comment on it anymore.... yet we know it IS possible, since it has been done....


And what programs have you tried?
No success with iTurbo?


----------



## xer0h0ur

Quote:


> Originally Posted by *Gundamnitpete*
> 
> Thanks for the input!
> Thanks! I actually saw you post about that earlier in the thread (Yes, I read all +500 pages over the last few days, haha. Dat new hardware hype doe).
> 
> Did you mean this one specifically? http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
> 
> If so, I'll try that when I get off work tomorrow morning around 7am. Thanks for the input!


Yup, that is the one.


----------



## Gamedaz

* The Fury Triple D has excellent heat-spreading capability. Are they underestimating their new reference design? Aside from the circuit board and VRM, if these cards can take up to 1.25v @ 1150MHz, that's as close as you're going to get to a 980 Ti @ 82c. If temps stay below 78c when well cooled, will it really cook the card over time? Wouldn't the memory surface act as a buffer to distribute heat evenly across the GPU/memory die?
A shop told me the Fury X was liquid cooled because the temps were too hot @ 82c.

I'm sure AMD can play with the VRMs and actually allow a voltage increase, but we won't see it in the panel; we'll see some sort of oscilloscope-style EQ inside the voltage area that takes 1.1v and makes 1.25v free? How does my card stay at 52c when it was running 62c before I set the temps? Maybe the temp control is actually voltage control as interpreted by AMD Catalyst. I'm sure these cards can throttle up (like Nvidia's) to 1150MHz without reducing the life of the card. Memory is hugely neglected when it comes to cooling; sure, the VRMs need heat damping, but memory is more critical for cooling, am I correct?


----------



## Alastair

Quote:


> Originally Posted by *Gamedaz*
> 
> * The Fury Triple D has excellent heat-spreading capability. Are they underestimating their new reference design? Aside from the circuit board and VRM, if these cards can take up to 1.25v @ 1150MHz, that's as close as you're going to get to a 980 Ti @ 82c. If temps stay below 78c when well cooled, will it really cook the card over time? Wouldn't the memory surface act as a buffer to distribute heat evenly across the GPU/memory die?
> A shop told me the Fury X was liquid cooled because the temps were too hot @ 82c.
> 
> I'm sure AMD can play with the VRMs and actually allow a voltage increase, but we won't see it in the panel; we'll see some sort of oscilloscope-style EQ inside the voltage area that takes 1.1v and makes 1.25v free? How does my card stay at 52c when it was running 62c before I set the temps? Maybe the temp control is actually voltage control as interpreted by AMD Catalyst. I'm sure these cards can throttle up (like Nvidia's) to 1150MHz without reducing the life of the card. Memory is hugely neglected when it comes to cooling; sure, the VRMs need heat damping, but memory is more critical for cooling, am I correct?


Shop clearly has no idea what they are talking about. Fury X is liquid cooled because AMD wanted it to run cooler and quieter than the competition. Too bad that backfired because they got noisy pumps instead. They should have let Sapphire put their Tri-X on a full phat Fury X, because the extra cores would not have added much heat into the mix and the cooler would still handle it with miles to spare.


----------



## dagget3450

I realize there is a lot of hate for the Fury X. The thing is, I love the card myself. The AIO may not be the best when compared to a full custom water loop, but it's damn close. My pumps are quiet, and after handling the card it feels like the best-constructed GPU I have ever bought. I guess I am just either easy to please or cannot find fault where others have. The two biggest things I wish for are voltage control and fewer GameWorks titles.
Looks like those are pipe dreams.


----------



## Otterfluff

Ok, so I am fed up with where my trim pots are; they are a pain to access and turn inside of my loop. I have lost too many fine-tip screwdrivers for turning them in my case, and there is one in a radiator under my fans I still can't find.









So I was thinking of just terminating my wires into an RJ45 socket, then running Cat5 network cables from the card itself to outside the case, then terminating into a box where the trimpots, LCD, and whatever else would be housed. My question is: would running the wires that long from the card be a bad idea?

I also might lower the resistor combo on the HBM memory so I can hit a little higher than 1.4V, just to test it for higher clock stability above 630MHz. Would you try 10ohm?

Another thing I want to get done is to hot-glue the coil whine. It is not loud, but it's definitely louder than before I took off the stock Fury X box. The waterblock does not muffle it like the stock cooler did.


----------



## p4inkill3r

Quote:


> Originally Posted by *dagget3450*
> 
> I realize there is a lot of hate for the Fury X. The thing is, I love the card myself. The AIO may not be the best when compared to a full custom water loop, but it's damn close. My pumps are quiet, and after handling the card it feels like the best-constructed GPU I have ever bought. I guess I am just either easy to please or cannot find fault where others have. The two biggest things I wish for are voltage control and fewer GameWorks titles.
> Looks like those are pipe dreams.


The only people hating on the Fury X are nvidiots, pay them no mind.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Ok, so I am fed up with where my trim pots are; they are a pain to access and turn inside of my loop. I have lost too many fine-tip screwdrivers for turning them in my case, and there is one in a radiator under my fans I still can't find.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I was thinking of just terminating my wires into an RJ45 socket, then running Cat5 network cables from the card itself to outside the case, then terminating into a box where the trimpots, LCD, and whatever else would be housed. My question is: would running the wires that long from the card be a bad idea?
> 
> I also might lower the resistor combo on the HBM memory so I can hit a little higher than 1.4V, just to test it for higher clock stability above 630MHz. Would you try 10ohm?
> 
> Another thing I want to get done is to hot-glue the coil whine. It is not loud, but it's definitely louder than before I took off the stock Fury X box. The waterblock does not muffle it like the stock cooler did.


I considered designing something like that, and there should be no issues unless the wire adds more than 2.5 ohms of resistance in both directions. Also, I strongly recommend switching from trimmers to potentiometers; pots are adjustable by hand.

I really wouldn't go over 1.4V on HBM for 24/7 use. I can't find any detailed datasheets from Hynix about what the limits of HBM are, but I suspect it will die above 1.5V. The resistor you would use to reach that would indeed be 10ohm.
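As a rough sanity check on that 2.5-ohm budget: Cat5 conductors are 24 AWG, roughly 0.084 ohm per metre per conductor (a typical datasheet figure, assumed here rather than taken from the thread), and the signal has to travel out and back over a pair. A quick sketch:

```python
# Rough check of whether a Cat5 extension stays under the 2.5-ohm
# round-trip budget mentioned above. The per-metre resistance for
# 24 AWG solid copper is an assumed typical value, not from the thread.
OHMS_PER_METER_24AWG = 0.084  # one conductor, solid copper, around 20 C

def round_trip_resistance(length_m: float) -> float:
    """Resistance added by running the signal out and back over one pair."""
    return 2.0 * length_m * OHMS_PER_METER_24AWG

def within_budget(length_m: float, budget_ohms: float = 2.5) -> bool:
    """True if a cable of this length keeps the added resistance in budget."""
    return round_trip_resistance(length_m) <= budget_ohms
```

By this estimate a 2 m run adds only about a third of an ohm round trip, so a cable long enough to reach an external control box should be comfortably inside the budget.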


----------



## xer0h0ur

Quote:


> Originally Posted by *Alastair*
> 
> Shop clearly has no idea what they are talking about. Fury x is liquid cooled because AMD wanted it to run cooler and quieter than the competition. Too bad that backfired cause they got noisy pumps instead. They should of let Sapphire put their Tri-X on full phat Fury X, because the extra cores would not of added much heat into the mix and the cooler would still handle it with miles to spare.


This ^

Even Nvidia allowed AIBs to put a different cooler and add a backplate on the Titan X, which was reference-locked before. IMO this is one of the dumbest things AMD did this generation, that and getting rid of the dual-link DVI port.


----------



## Gamedaz

Quote:


> Originally Posted by *xer0h0ur*
> 
> This ^
> 
> Even Nvidia allowed AIBs to put a different cooler and add a backplate on the Titan X, which was reference-locked before. IMO this is one of the dumbest things AMD did this generation, that and getting rid of the dual-link DVI port.


* I actually intended to say: *The shop stated there was no air-cooled Fury X because it would run too hot with all the cores.*

* But I think my Fury can OC to 1150 with decent temps @ 78c, which is normal for a 28nm die.

If AMD can allow OCing based on temps in their Radeon software, then maybe 1125MHz is a decent OC with stable temps under 76c, possibly?


----------



## xer0h0ur

There are plenty of air coolers with a cooling capacity well beyond the heat output of a Fury X. Hell, an AIB just debuted an air cooler capable of handling up to 700W of heat. It's a load of crap to not allow an air-cooled Fury X.


----------



## Gamedaz

* I feel GPUs in general should move away from dated air-cooling solutions. * It's time to move forward and start using liquid cooling. I don't think I'll ever buy an air-cooled card again; too much unnecessary heat inside the case, which requires TONS of fans to expel. A simple liquid-cooled CPU/GPU solution improves heat handling.

The only reason I haven't gone liquid cooled is because my mini-ITX case will not fit the liquid-cooled Fury X (it actually will fit with a small mod, but I preferred to stay with the air-cooled XFX, which has this 12.5" long finned heatpipe that is an engineering accomplishment IMO). Triple-fan cooling provides the best air-cooling distribution efficiency, etc.


----------



## Otterfluff

Quote:


> Originally Posted by *xer0h0ur*
> 
> This ^
> 
> Even Nvidia allowed AIBs to put a different cooler and add a backplate on the Titan X, which was reference-locked before. IMO this is one of the dumbest things AMD did this generation, that and getting rid of the dual-link DVI port.


I actually have a Qnix 1440p 120Hz monitor on my desk next to my 40" 4K Philips, and the only reason I am not using it right now is because AMD left the DVI port off the Fury X.

I would love a DisplayPort adapter for it, but they are expensive. Any updates on an adapter that works and does not cost $200+?


----------



## xer0h0ur

Well, the problem is that it wasn't an oversight by AMD. They had planned on phasing out the DVI port for a long time. No one knew when it would happen, but they decided on this generation. The head-scratcher for me was not including HDMI 2.0.


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, the problem is that it wasn't an oversight by AMD. They had planned on phasing out the DVI port for a long time. No one knew when it would happen, but they decided on this generation. The head-scratcher for me was not including HDMI 2.0.


Agreed about the HDMI 2.0. The only people I can see who take issue with lack of a DVI-D are owners of Korean OC'd 1440 PLS panels etc.
Anyone on 1080 doesn't need a Fury X(not sure what ports are on the Fury(s) or the Nano).


----------



## Otterfluff

I did try to buy a Monoprice DP-to-DVI adapter, and it works, but it has terrible screen flickering, which makes me not use it. This adapter was only $35, but it's a 4K @ 60Hz adapter.

I have no idea if spending more money on a better adapter will even work right; it's a pain, and I still do not know what I should do with it.


----------



## xer0h0ur

If Arctic Islands doesn't have HDMI 2.0 I will personally fling poop in AMD's direction, but I really would love to see DisplayPort 1.3 finally make its way into video cards that generation, or else I may very well skip Arctic Islands as well. Daddy wants 120Hz @ 4K


----------



## Thoth420

Quote:


> Originally Posted by *Otterfluff*
> 
> I did try to buy a Monoprice DP-to-DVI adapter, and it works, but it has terrible screen flickering, which makes me not use it. This adapter was only $35, but it's a 4K @ 60Hz adapter.
> 
> I have no idea if spending more money on a better adapter will even work right; it's a pain, and I still do not know what I should do with it.


An Acer 2560 x 1440 IPS FreeSync monitor is right around the corner: the XF270HU, I believe, using the same panel as the XB270HU and Asus MG279Q. Still no word on the FreeSync range though.


----------



## Otterfluff

I do not intend to go back to 1440p, I am hooked on 40" 4k @ 60Hz.

It's just sad that I can't use my old 27" 1440p @ 120Hz for anything; it seems like such a waste.


----------



## Thoth420

Quote:


> Originally Posted by *Otterfluff*
> 
> I do not intend to go back to 1440p, I am hooked on 40" 4k @ 60Hz.
> 
> It's just sad that I can't use my old 27" 1440p @ 120Hz for anything; it seems like such a waste.


Ah, gotcha. I haven't bothered to mess with 4K yet because I am diehard single GPU.
On the TV I am still at 1080p 60Hz.


----------



## Gamedaz

* Does anyone here know how the AMD Fury GPUs work with a 120Hz refresh HDTV?

* It seems that AMD has unlocked that capability in my HDTV set; it was not available with the 780 Ti GPU I had. The AMD card also activates x.v.Color on my display, which I assume is similar to the deep color my display is capable of as well.

* I like the idea of doubling the frame rate to get a smoother experience during gameplay, with no soap-opera effect, as it mimics higher frame rates than there actually are, which is why I've decided to keep that feature enabled.

Did AMD enable this feature to improve and create smoother rendering of images, instead of relying on OCs?

There is a latency limit, if the term exists, that prevents certain latencies under 40ms from being perceived as stutter or flicker, but I notice stutter (background stutter mostly) in Sniper Ghost Warrior. I'm not sure if it's the engine and AMD, or just low frame rates, but the 120Hz refresh still isn't enough to reduce that much latency or stutter. It is subtle; maybe an OC of the core and memory will reduce it. This is only 1080p as well, but it can still be a big push in some games.

* I would suspect AMD's unrefined DX11 drivers, possibly, although my GTX 780 Ti had similar issues.


----------



## Gundamnitpete

Quote:


> Originally Posted by *Gundamnitpete*
> 
> Thanks for the input!
> Thanks! I actually saw you post about that earlier in the thread (Yes, I read all +500 pages over the last few days, haha. Dat new hardware hype doe).
> 
> Did you mean this one specifically? http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
> 
> If so, I'll try that when I get off work tomorrow morning around 7am. Thanks for the input!


Update!

I tried the complete registry driver removal, and that didn't solve the problem. So today I went to Microcenter and exchanged for a new one. It looks like my issues have been solved!

And, my new Fury OCs, whereas my last one wouldn't (of course, it wasn't stable at all). I seem to be able to run 1078MHz on stock voltages, which gives me 47FPS in Unigine Heaven at 1440p with maxed settings, up from 44.6 on stock clocks. An improvement of around 5%!
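For what it's worth, the gain from those Heaven numbers works out to a bit over 5%:

```python
# Quick check of the Heaven numbers quoted above: 44.6 FPS at stock
# clocks vs 47 FPS at 1078 MHz.
def percent_gain(stock_fps: float, oc_fps: float) -> float:
    """Percentage improvement of oc_fps over stock_fps."""
    return (oc_fps - stock_fps) / stock_fps * 100.0

gain = percent_gain(44.6, 47.0)
print(f"{gain:.1f}%")  # about 5.4%
```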


----------



## MerkageTurk

Hi fellow members, I just sold my 980 Ti for £559 and was thinking of a Fury, Fury X, Nano, or 390X?

What do you recommend?


----------



## MrKoala

Rest of the hardware? What games to play?


----------



## hyp36rmax

Quote:


> Originally Posted by *Otterfluff*
> 
> I did try to buy a Monoprice DP to DVI adapter and it works but it has terrible screen flickering which makes me not use it. This adapter was only $35 but it's a 4k @ 60Hz adapter.
> 
> I have no idea if spending more money on a better adapter will even work right, it's a pain and I still do not know what I should do with it.


Accel will have a DP 1.2 to HDMI 2.0 adapter soon; at least it should be in production in the next couple of months.

*Source:*

https://www.reddit.com/r/3sqdj5/accel_dp12_to_hdmi_20_adapter_soon_for_amd_4k/cwzq8k8


----------



## MerkageTurk

i7 5820K
Rampage V
16GB DDR4 3200 Dominator

Gta v 30MINS
Bf4 20mins
Civ v hour
Rome 2 and Atilla 45min
FSX 5mins
MORTAL KOMBAT 10mins


----------



## MerkageTurk

Just ordered a fury x


----------



## Cool Mike

Waiting on the Fury X2. Hoping for a release within 30 days.


----------



## HagbardCeline

So the main reason I built my new computer was video editing/rendering, and I picked the Radeon card because of its OpenCL support. The software I use is Sony Vegas. A couple of versions ago they released a test project that you could download and benchmark. My old computer took 5.5 minutes to render it. New computer: 20 seconds! Now if I set Vegas to render without the GPU helping, it goes from 20 seconds to 2 minutes 20 seconds. The Fury X basically makes it render 7x faster. Phew.


----------



## Elmy

Quote:


> Originally Posted by *Cool Mike*
> 
> Waiting on The fury X2. Hoping for a release within 30 days.


I got 2 of them coming when they come out....but I can't say when....


----------



## xer0h0ur

Quote:


> Originally Posted by *Elmy*
> 
> I got 2 of them coming when they come out....but I can't say when....


I hate you. The Jelly is real.


----------



## WheelZ0713

Any word on voltage unlocks yet? Software only, I don't have the balls for any physical modding.


----------



## Gamedaz

* A software update could be out in 10 days, with a possible OC utility etc. Maybe they'll support OCing to 1100MHz. Some people have posted stable clocks with 1.2v+; this could be implemented in the utility.


----------



## ozyo

Quote:


> Originally Posted by *Gamedaz*
> 
> * A software update could be out in 10 days, *without* a possible OC utility etc. Maybe they'll support OCing to 1100MHz. Some people have posted stable clocks with 1.2v+; this could be implemented in the utility.


fixit








BTW my card's stable @ 1,145 MHz, stock voltage & power limit
http://www.3dmark.com/fs/6498661
Not sure about games


----------



## Gamedaz

*Impressive! I'd like to see my XFX Fury R9 do those clocks, if they're released!


----------



## Agent Smith1984

Anybody looking for a Fury....

XFX Triple D - SPECIAL PRICE OF $470 if you buy today...
http://www.overclock.net/t/1580010/xfx-triple-dissipation-r9-fury-3-weeks-old-with-everything-included

You won't find that kind of price anywhere


----------



## kayan

I've got a Fury X on its way to me. I am so looking forward to going back to a single GPU again. Now I just have to sell my 295, heh.

I'll test it and see what it can do.


----------



## MerkageTurk

None of the cables supplied work with my XL2420T, even using the DisplayPort-to-DVI converter.

No display on screen; HDMI is not working either.

Just ordered a DisplayPort > DisplayPort cable.


----------



## Greenland

Quote:


> Originally Posted by *Gamedaz*
> 
> * Software update could be out in 10 days, with possible O.C Utility etc. Maybe they'll support O.C to 1100MHZ Some people have posted stable clocks with 1.2v+, this could be possible to implement in the Utility.


Got any sources on that?


----------



## Agent Smith1984

Latest news on Fury voltage control:

There is none









Seriously, that's the only reason why mine is for sale.... and I don't know why it even bothers me that bad, it just does....

I like to push stuff to its limits..... Sometimes I like benching more than the actual gaming.... just me I guess


----------



## Gamedaz

* I see how GPU makers have neglected voltage control, but Fury cards have it in the BIOS!! The Sapphire Fury Tri-X has dual BIOS, for what reason?????>>>>To OC the CARD!!! Brick your BIOS voltage specs, then switch to the other BIOS; issues with memory clocks ~ switch BIOS. It's a real feature that isn't politically correct to sell due to the misuse and damage it can cause the card.

* To implement a feature in a card that is specific to voltage control and then hide it is like releasing a vehicle without a steering wheel and saying you have two mounting options in the car to use a steering wheel with.

* And judging by all the OCers in this thread, it seems almost impossible to brick the cards because they're well cooled!!! The Fury X is liquid cooled; it will take the doors off a 980 Ti and still last 3-5 years without any issues at all (IMO). Testing shows the cards can handle voltages well into the 1.24v range, possibly 1.3v, with temps staying well under the spec for possibly any 28nm die (which I believe is 90c), so unless the OCs are sending temps shooting above 82c, they should run all day with no issues OCed.

That's why I would be satisfied if AMD just simply implemented an OC based on temps, because more voltage = more temps etc. Set your temp threshold and let an AMD-programmed OC utility set the clocks. The XFX is a well-cooled card, especially if you have dual 120mm fans pushing air onto the card like I have; my temps reached AT WORST CASE, 25% FANS, 82c???? I might as well turn the fans off, that's how well the card cools the GPU!!


----------



## Clockster

If anyone is interested..









http://www.overclock.net/t/1580719/catalyst-15-11-1-beta-driver-for-windows-with-star-wars-battlefront-optimizations


----------



## fewness

Absolutely the best of the best of the best I can do on my Fury X Crossfire....



That, with the help of a 4.625G 5960X, I'm getting:



on FSU (http://www.3dmark.com/3dm/9308347)

Results of my trial and error runs:


----------



## Gamedaz

1190MHz is a decent clock; my 780 Ti throttled up to 1200MHz. 1300 could be sporadic and useless (IMO)


----------



## Otterfluff

Quote:


> Originally Posted by *fewness*
> 
> Absolutely the best of the best of the best I can do on my Fury X Crossfire....
> 
> 
> 
> That, with the help of a 4.625G 5960X, I'm getting:
> 
> 
> 
> on FSU (http://www.3dmark.com/3dm/9308347)
> 
> Results of my trial and error runs:


Very nice! My best FSU score was 8415 with card 1 at 1200/630 and card 2 at 1150/630 on an i7 4790K @ 4.8GHz.

I am jelly of your Haswell-E!









I've been playing Fallout 4 on a single Fury @ 4K maxed and it seems pretty decent, but there is a lot of tweaking you can do to make it run better. The new beta driver added some more frames; borderless window, enabling multithreading on all four cores, CPU priority, god rays off, disabling Vsync -> frame control via RivaTuner, and I overclocked my DDR3 1600 to 2000, which added another 10 min/max frames in itself. I am pulling 55-60 outdoors with drops to 46FPS at times, which is around 20FPS higher on average than on release before tweaking.

You really need to dig around with tweaking to get it running well, but it's very playable. If crossfire worked it would be a dream.


----------



## xer0h0ur

Honestly, that is one of the biggest issues that the Radeon Technologies Group needs to tackle head on. They need to start pumping out crossfire profiles fast and furious at, if not before, the launch of game titles. This is the only way they are going to carve into Nvidia's market share to any real extent.


----------



## Gamedaz

* Do they have the resources to implement this? The new graphics division seems like it could expand on that.


----------



## sugarhell

Quote:


> Originally Posted by *xer0h0ur*
> 
> Honestly that is one of the biggest issues that the Radeon Technologies Group needs to tackle head on. They need to start pumping out crossfire profiles fast and furious at the launch if not before launch of game titles. This is the only way they are going to carve into Nvidia's market share to any real extent.


Yeah, let's improve a feature that only 1% use to gain market share vs Nvidia. This is wrong on so many levels.

They need better low-end and mid-range GPUs with new features. Rebranding is not going to increase any market share.

And better drivers. Crossfire profiles are irrelevant.


----------



## DMatthewStewart

*Is anyone here running a Fury X or Fury X crossfire with the FX 8350?* Everyone I have seen so far is running an Intel cpu.


----------



## Alastair

Quote:


> Originally Posted by *DMatthewStewart*
> 
> *Is anyone here running a Fury X or Fury X crossfire with the FX 8350?* Everyone I have seen so far is running an Intel cpu.


Sapphire Fury Tri-X crossfire with 8370 @ 4.95GHz.


----------



## Agent Smith1984

Quote:


> Originally Posted by *DMatthewStewart*
> 
> *Is anyone here running a Fury X or Fury X crossfire with the FX 8350?* Everyone I have seen so far is running an Intel cpu.


I'm running an overclocked Fury @ 1080/560 on a 9590 @ 5Ghz on the nose. Runs beautifully...


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah, let's improve a feature that only 1% use to gain market share vs Nvidia. This is wrong on so many levels.
> 
> They need better low-end and mid-range GPUs with new features. Rebranding is not going to increase any market share.
> 
> And better drivers. Crossfire profiles are irrelevant.


I disagree with the last part, because we see all the time where people buy two cheaper cards to run in crossfire in the hopes of outperforming the more-expensive-than-both flagship models.

Improving crossfire support provides potential for even more sales of the mid-range cards.....

Just my opinion though....


----------



## xer0h0ur

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah, let's improve a feature that only 1% use to gain market share vs Nvidia. This is wrong on so many levels.
> 
> They need better low-end and mid-range GPUs with new features. Rebranding is not going to increase any market share.
> 
> And better drivers. Crossfire profiles are irrelevant.


You're quite out of the loop if you seriously believe only 1% of AMD's dGPUs are crossfired. It's more than that. Hell, the only way you can get away with saying 1% is if you're talking about trifire/quadfire.

You're also ignoring what AMD is working towards with future generations. They are trying to get APUs and dGPUs to work in tandem, so they are banking on crossfire quite a bit, and that is without even bringing VR into the picture, another initiative in which they are banking hard on crossfire.


----------



## buildzoid

AMD needs to fix the DX11 single-GPU CPU overhead.
4-way crossfire has similar overhead to a 4-way SLI setup, but AMD's single-GPU overhead is about 30% worse than Nvidia's. So if for any reason your CPU gets heavily loaded, you will suffer greater performance issues than on an Nvidia card.


----------



## xer0h0ur

Quote:


> Originally Posted by *buildzoid*
> 
> AMD needs to fix the DX11 single-GPU CPU overhead.
> 4-way crossfire has similar overhead to a 4-way SLI setup, but AMD's single-GPU overhead is about 30% worse than Nvidia's. So if for any reason your CPU gets heavily loaded, you will suffer greater performance issues than on an Nvidia card.


Where are you pulling this 30% number from? The worst single-card performance numbers you can manage to get out of a Fury X versus a Titan X / 980 Ti are at low resolutions, and even then it's not the 30% you're quoting. Nvidia's performance advantage also gets literally wiped out once it's two GPUs. At three GPUs AMD pulls ahead, and at 4 GPUs AMD pulls further ahead. Either way, you will see this for yourself once AMD and Nvidia each bring their pending dual-GPU cards to market.


----------



## Semel

*Sapphire TriXX v5.2.1 is out! AMD Fury Voltage Control*

Quote:


> New features:
> 
> New look and interface
> 
> Now supports over-volting on Radeon R300 series
> 
> Now supports HBM memory overclock on FURY cards
> 
> Now supports over-volt on FURY cards
> 
> Minimise TriXX to task bar


----------



## Agent Smith1984

Quote:


> Originally Posted by *Semel*
> 
> *Sapphire TriXX v5.2.1 is out! AMD Fury Voltage Control*


YANKING FURY SALE THREAD DOWN NOW


----------



## josephimports

Quote:


> Originally Posted by *Semel*
> 
> *Sapphire TriXX v5.2.1 is out! AMD Fury Voltage Control*


Thanks for the update. I'll post my results asap.


----------



## Agent Smith1984

WTH IS IT?????

I see nothing on Sapphire site or anywhere else on this!!


----------



## Semel

http://asia.dl.sapphiretech.com/archive/gm/drivers/SAPPHIRE_TRIXX_installer_5.2.1.exe

Damn.. the interface is fugly.. lol..

I'd say it's a very.. simplistic tool compared to Afterburner, features-wise.. no offense to the developer..


----------



## buildzoid

The CPU bottleneck shows up when you hammer the CPU and the GPU at the same time (like the Combined test in 3DMark, where a Fury X will consistently be 30% behind a GTX 980 Ti even if they get the same Graphics Test and Physics scores).

Also look at the GRID Autosport performance difference here

When using the 290X, the 7870K gets fewer FPS than when running the GTX 770. The drop between those two is huge; the Pentium, however, loses significantly less FPS going from the 770 to the 290X. If the drop going from the 770 to the 290X were the same for both CPUs, you could just blame driver optimization. But when the CPU with less IPC loses 62% of its FPS while the other CPU only loses 33%, you know it's a GPU driver + CPU problem and not a GPU problem.

If you run 4 Nvidia or 4 AMD cards, the performance in CPU-impacted tests looks identical, because Nvidia's driver gets more and more CPU-hungry as you add GPUs. AMD's driver starts out CPU-hungry at 1 GPU, but the increase in CPU overhead with more GPUs is nowhere near as bad.

ALSO HOORAY FOR SOFTWARE VOLTAGE CONTROL!
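buildzoid's reasoning above boils down to comparing the relative FPS drop each CPU suffers when swapping a GTX 770 for a 290X. A quick sketch of that comparison; the FPS values here are hypothetical placeholders picked only to mirror the ~62% vs ~33% drops he cites, not numbers from the chart:

```python
def relative_drop(fps_before: float, fps_after: float) -> float:
    """Fraction of FPS lost when swapping GPUs on the same CPU."""
    return (fps_before - fps_after) / fps_before

# Hypothetical placeholder numbers, chosen only to mirror the drops cited above.
a7870k_770, a7870k_290x = 100.0, 38.0   # lower-IPC CPU: loses ~62%
pentium_770, pentium_290x = 90.0, 60.0  # higher-IPC CPU: loses ~33%

print(round(relative_drop(a7870k_770, a7870k_290x), 2))   # 0.62
print(round(relative_drop(pentium_770, pentium_290x), 2))  # 0.33
```

If driver optimization alone were to blame, the two fractions would come out roughly equal; the gap between them is what points at driver CPU overhead rather than raw GPU performance.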


----------



## xer0h0ur

Right, so since it performs as such in one quite specific scenario for one game, let's go ahead and make the sweeping generalization that it's 30% across the board. Seems legit.

Note that nowhere on there do I even find what drivers they were using for this testing. That makes a huge difference: if you test with, say, the Omega driver versus a current driver, you will in fact see that there has been DX11 driver overhead reduction between then and now.


----------



## Semel

How do I monitor my temps in games using TriXX? Its monitoring tool seems to work only as a standalone one.


----------



## p4inkill3r

Quote:


> Originally Posted by *Semel*
> 
> How do I monitor my temps in games using TriXX? Its monitoring tool seems to work only as a standalone one.


Download HWiNFO64; you can log temps with it.


----------



## Agent Smith1984

This thread will be dead for the next 6 hours while people juice the crap out of their cards and test max OC's!!!


----------



## xer0h0ur

Holy crap. Anyone watching the news? Ambulance with explosives found outside stadium in Hanover Germany.


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> This thread will be dead for the next 6 hours while people juice the crap out of their cards and test max OC's!!!


I'm stuck at work :/

Where are some Fury X results with that sketchy-looking TriXX file?


----------



## buildzoid

Quote:


> Originally Posted by *xer0h0ur*
> 
> Right, so since it performs as such in one quite specific scenario for one game, let's go ahead and make the sweeping generalization that it's 30% across the board. Seems legit.
> 
> Note that nowhere on there do I even find what drivers they were using for this testing. That makes a huge difference: if you test with, say, the Omega driver versus a current driver, you will in fact see that there has been DX11 driver overhead reduction between then and now.


I'll be honest, I'm not 100% sure it's 30%, but it is not insignificant. I do have an idea for how to test it, but I'd need an Nvidia GPU to do it. The last Nvidia card I've had is the GTX 590, and I don't plan to change that any time soon, so I can't really do the testing myself.


----------



## Otterfluff

Quote:


> Originally Posted by *Semel*
> 
> http://asia.dl.sapphiretech.com/archive/gm/drivers/SAPPHIRE_TRIXX_installer_5.2.1.exe
> 
> Damn.. the interface is fugly..lol..
> 
> I'd say it's a very..simplistic tool compared to afterburner features wise.. no offense to the developer..


In my experience with hard voltage control, TriXX was more stable than Afterburner and worked well when it worked. It seems they've sorted out the bugs and it's working flawlessly.
Quote:


> Originally Posted by *Semel*
> 
> *Sapphire TriXX v5.2.1 is out! AMD Fury Voltage Control*


Ahaha, this is great. I replicated my core OC of 1150 on both cards with +75mV from TriXX under CrossFire, and it's stable under Firestrike Extreme. This confirms that the voltage was unstable from throttling on the hard volt mods.

Going to see how high I can take this, after that try mixing hard mods + software.

This is great though, because people are going to get results as good as I got using hard mods, just with more stable software. It means this card really has a lot further we can push it; we just haven't gotten there yet.


----------



## Semel

Quote:


> Originally Posted by *p4inkill3r*
> 
> Download HWiNFO64; you can log temps with it.


I did.. Heaven crashes within 15 seconds of launch but works fine without HWiNFO (and worked fine with Afterburner).. Am I missing something?

PS: Hmm, now it works fine.. I guess changing some RivaTuner settings helped.. or switching to fullscreen =)


----------



## xer0h0ur

I take back all the trash I was talking about W1zzard holding back TriXX. Great to see some software finally supporting voltage tweaking on Fiji.


----------



## buildzoid

Quote:


> Originally Posted by *Otterfluff*
> 
> Ahaha, this is great. I replicated my core OC of 1150 on both cards with +75mV from TriXX under CrossFire, and it's stable under Firestrike Extreme. This confirms that the voltage was unstable from throttling on the hard volt mods.
> 
> Going to see how high I can take this, after that try mixing hard mods + software.
> 
> This is great though, because people are going to get results as good as I got using hard mods, just with more stable software. It means this card really has a lot further we can push it; we just haven't gotten there yet.


What are your voltage readouts showing? If you still have the mods hooked up, +75mV will be more than 1.275V. Also, the stock voltage for the Fury X is 1.22V IIRC, which means +75mV would be 1.295V, which would be more or less the same as the hard mods, no?

The hard mods shouldn't throttle, because what they do is skew the voltage-sensing circuit. If anything, the hard mods should give a linear increase in power limit with voltage. However, the IR3567B is a horribly documented monstrosity, so that might not be entirely the case.
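A minimal sketch of the arithmetic above, assuming the 1.22 V stock Vcore buildzoid recalls (the number is his "IIRC", not a datasheet value):

```python
STOCK_VCORE = 1.22  # V, Fury X stock per buildzoid's recollection above ("IIRC")

def offset_voltage(offset_mv: float, stock_v: float = STOCK_VCORE) -> float:
    """Core voltage after applying a software offset given in millivolts."""
    return stock_v + offset_mv / 1000.0

print(round(offset_voltage(75), 3))  # 1.295
```

Any extra skew from hard mods still wired in would sit on top of this, which is why the actual readouts matter.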


----------



## Agent Smith1984

So, this is working for everyone then?

I'm still at work.

Can't wait to test!!!


----------



## Semel

Meh.. I couldn't get my unlocked Fury to pass Heaven at +45mV, 1150.. When it crashed, the GPU was at 50-52C at 50% fan speed.

I could run it stable at 1080 without voltage control in most games except The Witcher 3 (1050).

Is it "safe" to use the max +75mV?

I wish we could see VRM temps.. neither Afterburner nor TriXX shows them. Perhaps it gets hotter than hell in there lol


----------



## p4inkill3r

Quote:


> Originally Posted by *Semel*
> 
> Meh.. I couldn't get my unlocked Fury to pass Heaven at +45mV, 1150.. When it crashed, the GPU was at 50-52C at 50% fan speed.
> 
> I could run it stable at 1080 without voltage control in most games except The Witcher 3 (1050).
> 
> Is it "safe" to use the max +75mV?


I've never _not_ jacked the power up to max when I'm initially OCing a GPU.
Walking the voltage back is easier to me than moving it forward incrementally.


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> What are your voltage readouts showing? If you still have the mods hooked up, +75mV will be more than 1.275V. Also, the stock voltage for the Fury X is 1.22V IIRC, which means +75mV would be 1.295V, which would be more or less the same as the hard mods, no?
> 
> The hard mods shouldn't throttle, because what they do is skew the voltage-sensing circuit. If anything, the hard mods should give a linear increase in power limit with voltage. However, the IR3567B is a horribly documented monstrosity, so that might not be entirely the case.


I haven't plugged in the volt LCDs yet, but I'm about to; I'll tell you what I find out. I'm about to start fiddling with the volt mods again, because I can get 1150 on software +75mV but it crashes at 1170.


----------



## battleaxe

Quote:


> Originally Posted by *Otterfluff*
> 
> I haven't plugged in the volt LCDs yet, but I'm about to; I'll tell you what I find out. I'm about to start fiddling with the volt mods again, because I can get 1150 on software +75mV but it crashes at 1170.


Is +75mv all it gives you?


----------



## Semel

Quote:


> Originally Posted by *p4inkill3r*
> 
> I've never _not_ jacked the power up to max when I'm initially OCing a GPU.
> Walking the voltage back is easier to me than moving it forward incrementally.


Heaven crashes the moment it loads up if I have +75mV. I guess my card is just bad overclocking-wise, even by Fury standards lol
Quote:


> Originally Posted by *Otterfluff*
> 
> I haven't plugged in the volt LCDs yet, but I'm about to; I'll tell you what I find out. I'm about to start fiddling with the volt mods again, because I can get 1150 on software +75mV but it crashes at 1170.


Have you unlocked your Fury?

I'm thinking maybe I should switch to the default BIOS.. maybe that would help


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> Heaven crashes the moment it loads up if I have +75mV. I guess my card is just bad overclocking-wise, even by Fury standards lol


With stock clocks?


----------



## buildzoid

Ok here's some basic data so you don't blow up your cards.

Max out the power slider. Even if the VRM for Vcore hits 125C, you can shove 500W through it, and it will throttle the card if it gets too hot. The IR3567B running the default BIOS won't let you blow up the VRM.

+100mV is safe across the board.

+200mV is safe if you keep the core below 60C. I would not expect clock scaling past 150-175mV, depending on your card.

Yeah, that's basically the least "safe"-looking overclocking advice you will ever read, but the fact is that the Fury/Fury X PCBs are ridiculously overbuilt.


----------



## Semel

Quote:


> Originally Posted by *battleaxe*
> 
> With stock clocks?


No, of course not =) . I tried 1150 @ +75mV (the max available in TriXX) -> insta-crash.

It looks like I've managed to pass the Heaven benchmark at 1140 +48mV. GPU temp was sitting at 51-52C at the end. But I'm pretty sure that if temps get closer to 60C it will crash, and I don't think The Witcher 3 would be stable at these settings.


----------



## Otterfluff

Ok, so at stock my two Furys are at 1.2412V and 1.2238V.

If I add the software +75mV, they jump to 1.3163V and 1.2984V.

Adjusting the power limit slider seems to have no effect. What does that actually do under software voltage?


----------



## Semel

For some reason TriXX changes the voltage setting if you enter a specific number. For instance, I set 70, but it automatically changed it to 66.


----------



## Otterfluff

Under testing, both cards hit 1.312V under load with +75mV; when CrossFire is not in use, the voltage is a bit different between cards until they get put to use.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> No, of course not =) . I tried 1150 @ +75mV (the max available in TriXX) -> insta-crash.
> 
> It looks like I've managed to pass the Heaven benchmark at 1140 +48mV. GPU temp was sitting at 51-52C at the end. But I'm pretty sure that if temps get closer to 60C it will crash, and I don't think The Witcher 3 would be stable at these settings.


My point was that it didn't crash as a result of the voltage, but because the clock was too high. Try setting a voltage amount, then finding the max clock at that voltage. Say +75mV, for example.
Quote:


> Originally Posted by *Otterfluff*
> 
> Under testing, both cards hit 1.312V under load with +75mV; when CrossFire is not in use, the voltage is a bit different between cards until they get put to use.


I'm hoping you can get to 1200 at 1.312V. Does that sound feasible?


----------



## Gumbi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Where are you pulling this 30% number from? The worst single-card performance numbers you can manage to get out of a Fury X versus a Titan X / 980 Ti are at low resolutions, and even then it's nowhere near the 30% you're quoting. The performance advantage also gets wiped out once it's two GPUs. At three GPUs AMD pulls ahead, and at four GPUs AMD pulls further ahead. Either way, you will see this for yourself once AMD and Nvidia each bring their pending dual-GPU cards to market.


This is well documented, and you are living in la-la land if you deny it. It's not a nice scenario, even if it doesn't happen all the time. Why do you think a 770 beats a 290X in WoW?


----------



## Otterfluff

I can get two cards in CrossFire to 1190 by adjusting the power limit down, but I can't seem to break 1200.

On a single card I am running 1200 just fine with 0% power limit. Now testing 1210, and it just completed FSU. Will keep increasing the core in 10MHz increments.


----------



## Semel

Quote:


> the voltage controller's minimum step-size is 6 mV


That's why TriXX kept changing the voltage if it was different.. so it needs to be, say, 24, 30, 36, 42, 48, etc.
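The snapping Semel describes (70 being changed to 66) is consistent with the controller rounding a requested offset down to its 6 mV grid. A sketch of that assumption; the flooring behaviour is inferred from the 70 -> 66 observation, not documented:

```python
STEP_MV = 6  # minimum step-size of the voltage controller, per the quote above

def snap_offset(requested_mv: int, step: int = STEP_MV) -> int:
    """Round a requested offset (in mV) down to the controller's step grid."""
    return (requested_mv // step) * step

print(snap_offset(70))  # 66, matching what TriXX did
print(snap_offset(48))  # 48, already on the grid
print(snap_offset(75))  # 72
```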
Quote:


> Originally Posted by *Otterfluff*
> 
> I can get two cards in CrossFire to 1190 by adjusting the power limit down, but I can't seem to break 1200.
> 
> On a single card I am running 1200 just fine with 0% power limit. Now testing 1210, and it just completed FSU. Will keep increasing the core in 10MHz increments.


Power limit down? And your card didn't throttle? I had to increase the power limit to get a stable overclock even without voltage control.


----------



## Kaapstad

Fury Xs are very temp-sensitive.

Custom waterblocks can get them to 1150 on stock volts, and they will run at 35C.

If people are going to use extra volts, they need to be very careful of the running temps.


----------



## Otterfluff

Quote:


> Originally Posted by *Semel*
> 
> That's why TriXX kept changing the voltage if it was different.. so it needs to be, say, 24, 30, 36, 42, 48, etc.
> Power limit down? And your card didn't throttle? I had to increase the power limit to get a stable overclock even without voltage control.


My card was not throttling until I went below -25% power limit; somewhere between -25% and -30% it starts throttling the voltage. My water cools down to 24C before it hits my card. Maybe keeping it cool helps?


----------



## Semel

During my Heaven run I saw the core speed sometimes dropping from 1140 to 760, although I had +50 power limit set (without the +50 it dropped more often).. Maybe my 900W Antec HCG-900 is not enough for 1140/+xx voltage or something.


----------



## p4inkill3r

Quote:


> Originally Posted by *Semel*
> 
> During my Heaven run I saw the core speed sometimes dropping from 1140 to 760, although I had +50 power limit set (without the +50 it dropped more often).. Maybe my 900W Antec HCG-900 is not enough for 1140/+xx voltage or something.


Throttling due to hot VRMs would be my guess.


----------



## Semel

Yet the card was running at 50C.. It's a real bummer we can't monitor VRM temps..


----------



## joeh4384

Anyone rig a sensor or point an IR gun at the VRMs?


----------



## Gamedaz

* The VRMs seem to be the part most sensitive to voltage fluctuation. In my understanding, they're there to filter voltage spikes that can occur on a millisecond scale. PSUs don't always provide clean baseline voltages without peaking; the peaks create extra voltage = more heat = possible GPU core throttling, etc. More VRMs = more stable, filtered voltages.

* AMD's version of voltage control in the VRMs is a different approach to filtering those voltage anomalies, since they're only using 6 VRMs with 3500+ stream processors.

The tech is called XFX Voltage Control Technology. * So I assume this will be available now or in the next version of their GPU drivers. * Not sure if it is only implementable in CrossFire setups, which is what the XFX website advertises; * I'm sure voltages can be controlled for single XFX Fury cards as well, if they decide to implement it for single GPUs.

* Why are Fury owners unable to increase their voltage with AMD software, and required to use TriXX to do it? My XFX card specifically has that voltage control feature, but I'm not sure if it's only for CrossFire setups and memory clocks.


----------



## Gamedaz

Quote:


> Originally Posted by *Semel*
> 
> No, of course not =) . I tried 1150 @ +75mV (the max available in TriXX) -> insta-crash.
> 
> It looks like I've managed to pass the Heaven benchmark at 1140 +48mV. GPU temp was sitting at 51-52C at the end. But I'm pretty sure that if temps get closer to 60C it will crash, and I don't think The Witcher 3 would be stable at these settings.


* You got the same clock with lower voltage settings; that's decent.


----------



## Semel

Quote:


> Originally Posted by *Gamedaz*
> 
> * You got the same clock with lower voltage settings; that's decent.


Not the same.

Anyways, it's not relevant anymore. I didn't pay attention back then, and it seems anything past 1100 leads to my card sometimes throttling when running Heaven, regardless of the power limit or voltage set. Not much, just here and there, but it's disappointing. The GPU was sitting at 50-52C at 1140, or 47C+ with a lower core clock.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gumbi*
> 
> This is well documented, and you are living in la-la land if you deny it. It's not a nice scenario, even if it doesn't happen all the time. Why do you think a 770 beats a 290X in WoW?


Isn't WoW a heavily CPU-bound game? I don't play MMORPGs anyway, so I can't know this with any certainty. Regardless, the average person who harps about AMD DX11 CPU overhead keeps pulling up old benchmarks with old drivers. Show me something modern using the current driver set, and even better if it's being tested on Windows 10. I bet you anything it's not nearly as exaggerated as this 30% figure. Even with ******ed testing like 720p, because we all know people with high-end and enthusiast-class video cards game at 720p *rolleyes*


----------



## Otterfluff

Ok, so my cards are responding to the software voltage much the same as the hard mods, except I can play with the power slider, which is useful. I can get both cards to 1200, but it requires a lot of fiddling around to get them nice. CrossFire needs something sub-1200MHz to play nice.

I've gone over to using software voltage, unhooking my hard core-voltage pots and keeping my HBM voltage at 1.4V with the hard mods. Without the HBM voltage mods, HBM doesn't clock very nicely. I am currently running both cards as:

First card at 1150, software +75mV, which reads 1.31V; HBM 1.4V, 630MHz

Second card at 1150, software +48mV, reading 1.28V; HBM 1.4V, 630MHz

Both cards are playing very nicely with each other at these settings. I also noticed it gave me the extra FPS to walk around at 60fps in Fallout 4; drops have gone up from 46fps to 51fps when OC'd.

I am much happier keeping the software mods 24/7 than my hard ones. The software voltage certainly seems to have more room to move around to find sweet spots. It's the first time I ever completed Firestrike on the second card at 1200MHz (albeit with some artifacts), which I couldn't manage using the hard volt mods.


----------



## The Mac

deleted


----------



## p4inkill3r

Quick Firestrike Extreme run: http://www.3dmark.com/3dm/9321563?

I artifact heavily at 1180MHz and insta-crash the driver at 1200MHz. 600MHz on the HBM dialed in with no problem at all; I wonder how much legroom is left in it.


----------



## josephimports

Quote:


> Originally Posted by *josephimports*
> 
> Thanks for the update. I'll post my results asap.


After some quick testing using FSE, the cards reached a max core clock of 1200MHz. Increasing HBM speed caused artifacts. Vcore maxed at 1.3125V on both cards; stock voltage was 1.2437V. I was expecting a bit more than 50MHz from voltage control, but hey, it's something. Let's see what Afterburner brings.

http://www.3dmark.com/3dm/9321160


----------



## Mega Man

Quote:


> Originally Posted by *buildzoid*
> 
> Yeah that's basically the least "safe" looking overclocking advice you will ever read however the fact is that the Fury/Fury X AMD ref PCBs are ridiculously over built.


Quote:


> Originally Posted by *Kaapstad*
> 
> Fury Xs AMD cards are very temp sensitive


fixed both for you


----------



## Alastair

Finally, some voltage control. That surely means Afterburner will follow shortly.


----------



## Semel

Damn.. guys.. I'm reading how you OC your Furys to 1150 and I feel depressed.. My card throttles sometimes even at 1100(!!) at a very nice GPU temp of 50C, even with power limit +50... Just my luck.

Throttling seems to happen the same at 1100 and 1140 (the absolute stable maximum the card can push), although it depends on the benchmark/game; sometimes it happens more often.


----------



## Gumbi

Quote:


> Originally Posted by *Semel*
> 
> Damn.. guys.. I'm reading how you OC your Furys to 1150 and I feel depressed.. My card throttles sometimes even at 1100(!!) at a very nice GPU temp of 50C, even with power limit +50... Just my luck.
> 
> Throttling seems to happen the same at 1100 and 1140 (the absolute stable maximum the card can push), although it depends on the benchmark/game; sometimes it happens more often.


That's... not normal at all. Are you sure you're throttling?


----------



## Semel

Quote:


> Originally Posted by *Gumbi*
> 
> That's... not normal at all. Are you sure you're throttling?


Well, when my card runs at 1100-1140 I see that sometimes (not often, but at least once every 1-2 minutes) my core clock drops to ~740-750.

Does it look like throttling?

PS I've got
Intel Core i7-3770K @4.5 (~1.32 at 100% load voltage, offset)
Sapphire Fury Tri-X (3840 Stream Processors)
SSD 128GB + 3 HDDs
ASUS SABERTOOTH Z77
4x4 DDR3 1600, Hynix
Thermalright Silver Arrow
ATX 900W Antec HCG-900


----------



## Gumbi

Quote:


> Originally Posted by *Semel*
> 
> Well, when my card runs at 1100-1140 I see that sometimes (not often, but at least once every 1-2 minutes) my core clock drops to ~740-750.
> 
> Does it look like throttling?
> 
> PS I've got
> Intel Core i7-3770K @4.5 (~1.32 at 100% load voltage, offset)
> Sapphire Fury Tri-X (3840 Stream Processors)
> SSD 128GB + 3 HDDs
> ASUS SABERTOOTH Z77
> 4x4 DDR3 1600, Hynix
> Thermalright Silver Arrow
> ATX 900W Antec HCG-900


How are you stressing the card? You could well be downclocking due to being CPU bound.


----------



## Semel

Quote:


> Originally Posted by *Gumbi*
> 
> How are you stressing the card? You could well be downclocking due to being CPU bound.


Firestrike (normal/extreme), modern games. It doesn't get downclocked when running at 1050 without any +voltage. And it didn't get downclocked when it was running at 1080 (I reduced it to 1050 because it was stable in all games, even The Witcher 3). But I used Afterburner for that, although it probably doesn't matter.

CPU bound? At 4.5GHz? I don't think so.


----------



## flopper

Quote:


> Originally Posted by *xer0h0ur*
> 
> Isn't WoW a heavily CPU-bound game? I don't play MMORPGs anyway, so I can't know this with any certainty. Regardless, the average person who harps about AMD DX11 CPU overhead keeps pulling up old benchmarks with old drivers. Show me something modern using the current driver set, and even better if it's being tested on Windows 10. I bet you anything it's not nearly as exaggerated as this 30% figure. Even with ******ed testing like 720p, because we all know people with high-end and enthusiast-class video cards game at 720p *rolleyes*


I like to ask them: so how many FPS does the 980 Ti do in Mantle?


----------



## Alastair

A question for those with hardmod knowledge:

How far do you guys reckon these cards will go with 1.4V fed through them? That's assuming software control will eventually let us go that high.

Also, how much power would two Furys at 1.4V pull? Reckon 1200W will be enough for my two cards overclocked, plus my 8370?
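A rough first-order answer to the power question: at fixed clocks, dynamic power scales roughly with the square of core voltage. The board-power and stock-voltage figures below are assumptions for illustration (roughly typical Fury X numbers from this thread), not measurements:

```python
def scaled_power(stock_w: float, stock_v: float, new_v: float) -> float:
    """First-order estimate: dynamic power scales with the square of core voltage."""
    return stock_w * (new_v / stock_v) ** 2

# Assumptions: ~275 W typical board power per Fury X at a ~1.21 V stock Vcore.
per_card = scaled_power(275, 1.21, 1.40)
print(round(per_card))            # 368
print(round(2 * per_card + 250))  # 986, adding ~250 W for an overclocked FX-8370 system
```

By this estimate, two cards at 1.4V plus the CPU land under 1200W, but clock increases and VRM losses eat into that margin, so the headroom is thinner than it looks.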


----------



## Gumbi

Quote:


> Originally Posted by *Semel*
> 
> Firestrike (normal/extreme), modern games. It doesn't get downclocked when running at 1050 without any +voltage. And it didn't get downclocked when it was running at 1080 (I reduced it to 1050 because it was stable in all games, even The Witcher 3). But I used Afterburner for that, although it probably doesn't matter.
> 
> CPU bound? At 4.5GHz? I don't think so.


Many games can be CPU-bound. StarCraft II is heavily CPU-bound much of the time, and I play with a 4790K at 4.9GHz.

It seems like you are having throttling issues when you apply voltage, then. I dunno. Try the other BIOS?


----------



## Agent Smith1984

Man, I didn't get to test yet; I'm in the process of moving, and my rig is 50 miles away!

So is everybody pretty much getting into the 1120-1150 range pretty easily on Fiji Pro with just some voltage?

I plan on using something in the 100-150mV range to keep the VRMs in check, not knowing what temps they are actually hitting, but the core should be fine at sub-70C even with that kind of voltage.

Anybody have any Firestrike runs at 1100+ MHz on an air-cooled Fury (3584 shaders)?


----------



## SuperZan

My Fury X managed 1130 at +26 on voltage in TriXX. 1150 is proving more elusive without resorting to double that or more. I wouldn't call 1150 a wall, but it certainly seems to be the first real hurdle.

Edit: I still need to test in some gaming (we'll try FO4 and some TSW PvP), but no artifacting or errors at a 1150 clock with +42 voltage in TriXX. I feel like there is a decent amount of headroom yet.


----------



## Agent Smith1984

I'm hoping to get 1150 on the core; anything else will be icing on the cake.....

I am getting 1080 stable now with load voltage reporting 1.169V, so I am pretty sure this thing will have some MHz left in it with some juice.... I probably won't run anything over 1.3V for daily use, but it should handle it, considering how great this Triple D/PCS+ cooler does..... These coolers turned a tiny little GPU into an over-length monster, which isn't the most desirable thing in the world, but man, it sure does cool well!!


----------



## p4inkill3r

Quote:


> Originally Posted by *SuperZan*
> 
> My Fury X managed 1130 at +26 on voltage in TriXX. 1150 is proving more elusive without resorting to double that or more. I wouldn't call 1150 a wall, but it certainly seems to be the first real hurdle.


Dump all 75mv into it, it isn't going to hurt it.


----------



## Otterfluff

Quote:


> Originally Posted by *Alastair*
> 
> A question for those with hardmod knowledge:
> 
> How far do you guys reckon these cards will go with 1.4V fed through them? That's assuming software control will eventually let us go that high.
> 
> Also, how much power would two Furys at 1.4V pull? Reckon 1200W will be enough for my two cards overclocked, plus my 8370?


I haven't been getting anything past 1200 core that's stable. I have two Fury Xs, and I have been fine on a 1200W PSU even when really raising the voltage.


----------



## josephimports

Broke 16,000 graphics score in FSE using 1180/550. Artifacts with 600 on the HBM. More testing tonight.

http://www.3dmark.com/3dm/9326276


----------



## Agent Smith1984

Phew, if I could hit close to 1200 on my Fury I'd be thrilled....

I imagine some of the folks with Fury X cards that were getting 1130-1150 on stock voltage have a real good chance of getting to 1200-plus....

Sadly though, there is nothing super stellar about any of the OC results.... Tahiti was really the last great-clocking silicon for AMD GPUs.... 800MHz introductory clock..... people getting 1200-1400MHz depending on BIOS/voltage/cooling.... Hell, my 280X core would hit 1260MHz on 1.3V with air cooling!!!

Then again, it didn't have 3500+ shaders either..... but I digress

Looking forward to seeing more results from Fury Pro owners.....


----------



## Gamedaz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Phew, if I could hit close to 1200 on my Fury I'd be thrilled....
> 
> I imagine some of the folks with Fury X cards that were getting 1130-1150 on stock voltage have a real good chance of getting to 1200-plus....
> 
> Sadly though, there is nothing super stellar about any of the OC results.... Tahiti was really the last great-clocking silicon for AMD GPUs.... 800MHz introductory clock..... people getting 1200-1400MHz depending on BIOS/voltage/cooling.... Hell, my 280X core would hit 1260MHz on 1.3V with air cooling!!!
> 
> Then again, it didn't have 3500+ shaders either..... but I digress
> 
> Looking forward to seeing more results from Fury Pro owners.....


The Fury series cards were going to be released at 800MHz core clocks; AMD then decided to release them at 1000MHz to compete with Nvidia's 980-series cards, etc.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> The Fury series cards were going to be released at 800MHz core clocks; AMD then decided to release them at 1000MHz to compete with Nvidia's 980-series cards, etc.


At 800MHz this card wouldn't even beat my 390, and they had that comparison to make in-house, so I doubt 800 was the original target, unless they set the target clock speed before realizing how underwhelming their plan to stick with 64 ROPs and just add shaders would actually turn out.


----------



## sugarhell

Quote:


> Originally Posted by *Gamedaz*
> 
> The Fury series cards were going to be released at 800MHz core clocks; AMD then decided to release them at 1000MHz to compete with Nvidia's 980-series cards, etc.


No. The Nano is the 800MHz card. For months, Fury was supposed to ship with 1GHz clocks.

Unless you are a member of AMD.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> Damn..guys..I'm reading how you OC your furys to 1150 and I feel depressed.. My card throttles sometimes even at 1100(!!) at a very nice GPU temp 50C even with power limit +50...Just my luck
> 
> 
> 
> 
> 
> 
> 
> throttling seems to happen the same at 1100 and 1140(the absolute stable maximum the card can push) although it depends on a benchmark'\game as in sometimes it happens more often


Are you using Afterburner to monitor? If so, uninstall Afterburner without keeping your settings and try testing again using something else, like GPU-Z, to monitor for you. Most people don't realize that Afterburner causes clock throttling, or at the very least reports clock throttling even when it's not actually happening. Plenty of people have realized this after removing Afterburner.


----------



## Gamedaz

* This was posted in the AMD forums discussions, possibly with an AMD rep mentioning it, so it is a realistic assumption that they originally intended these as 800MHz cards; they seem to be trying to differentiate themselves into a specific market, and possibly realized the GTX 900 series would be in stiff competition with their original clocks.

* The Nano would not be in the same class to compete with the GTX 980 Ti, so 800MHz is its intended clock.

* AMD's cards can offer the same performance as Nvidia's cards if clocks are adjusted, and if developers transfer their CUDA code to something AMD can interpret properly; it seems that some games are written in CUDA, which inhibits AMD's cards from reaching the same performance levels.

* Due to this limitation, AMD has embarked on developing a CUDA compiler to interpret CUDA code and allow its GPUs to work with it.

* This means that C++ can be used by the majority who want to go in and tweak existing CUDA code in games or industrial applications. * It comes down to simplifying access to code by using a universal language, C++, which most GPUs are designed to interpret more efficiently and easily.

LINK:

http://wccftech.com/amd-cuda-compilercompatibility-layer-announced-with-the-boltzmann-initiative/


----------



## AliNT77

Is undervolting doable now with TriXX?

If yes, everyone who has a Nano should do it.


----------



## Jflisk

Okay, so this is what I am up to. I used GPU-Z to pull the newer BIOS off of my newer PowerColor Fury X to apply it to my older Fury X. The boards are the same by their IDs, and I don't think there are any manufacturer-customized boards for these. So I used ATIWinflash in a command prompt run as admin and issued the command atiwinflash -f -p 0 (biosnamehere). When I restarted the computer, no good: no boot, black screen. I changed the BIOS switch and the computer started; the second BIOS position shows the new BIOS. Turned the computer off, restarted, and put the old BIOS back on by putting the switch in the first position. Tried the second position again, and it now shows the older BIOS. Am I missing something here, or is there another way I am supposed to be flashing this card? Thanks in advance.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gamedaz*
> 
> * This was posted in an AMD forums discussion, possibly with an AMD rep who mentioned it, so it is a reasonable assumption that 800MHz was the intended original clock. They seem to be trying to differentiate the card into a specific market, and possibly realized the GTX 900 series would be stiff competition at the original clocks.
> 
> * The Nano was never meant to compete in the same class as the GTX 980 Ti, so 800MHz is its intended clock.
> 
> * AMD's cards can offer the same performance as NVIDIA's cards if clocks are adjusted, and if developers port their CUDA code to something AMD can interpret properly; it seems some games are written against CUDA, which keeps AMD's cards from reaching the same performance levels.
> 
> * Due to this limitation, AMD has embarked on developing a CUDA compiler to translate CUDA code and allow its GPUs to work with CUDA source.
> 
> * This means C++ can be used by the majority who want to go in and tweak existing CUDA code in games or industrial applications. It comes down to simplifying access to code by using a universal language, C++, which most GPUs are designed to interpret more efficiently and easily.
> 
> LINK:
> 
> http://wccftech.com/amd-cuda-compilercompatibility-layer-announced-with-the-boltzmann-initiative/


You just posted a whole lot of nonsense. I won't even bother explaining why.


----------



## Randomdude

I'm running 1175/575 with +72mV, I haven't tried anything higher but this is rock-solid, stable in a few games, most notably Witcher 3. Card is definitely faster to heat up however. Doesn't seem to throttle at all.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> Okay, so this is what I am up to. I used GPU-Z to pull the newer BIOS off of my newer PowerColor Fury X to apply it to my older Fury X. The boards are the same by their IDs, and I don't think there are any manufacturer-customized boards for these. So I used ATIWinflash in a command prompt run as admin and issued the command atiwinflash -f -p 0 (biosnamehere). When I restarted the computer, no good: no boot, black screen. I changed the BIOS switch and the computer started; the second BIOS position shows the new BIOS. Turned the computer off, restarted, and put the old BIOS back on by putting the switch in the first position. Tried the second position again, and it now shows the older BIOS. Am I missing something here, or is there another way I am supposed to be flashing this card? Thanks in advance.


WinFlash is a no-no on these. Use atiflash in a command prompt.


----------



## Jflisk

Quote:


> Originally Posted by *Agent Smith1984*
> 
> WinFlash is a no-no on these. Use atiflash in a command prompt.


Then atiflash -f -p 0 (bios name) at the command prompt, is that correct?


----------



## Alastair

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> WinFlash is a no-no on these. Use atiflash in a command prompt.
> 
> 
> 
> Then atiflash -f -p 0 (bios name) at the command prompt, is that correct?
Click to expand...

correct.


----------



## Jflisk

Thanks all


----------



## Gumbi

These overclock numbers are encouraging! Initial results were very depressing, but over 1150 generally seems easily achievable, which is quite good.


----------



## Agent Smith1984

Check this guy out....
http://www.3dmark.com/fs/6104547


----------



## Kana-Maru

I hit 1150MHz on the core with only +6mV GPU voltage. I completed 3DMark Fire Strike, but I haven't played any games with the OC yet.

I'm running a Fury X.


----------



## battleaxe

Quote:


> Originally Posted by *Kana-Maru*
> 
> I hit 1150MHz on the core with only +6mV GPU voltage. I completed 3DMark Fire Strike, but I haven't played any games with the OC yet.
> 
> I'm running a Fury X.


bring the rain.


----------



## Gumbi

Quote:


> Originally Posted by *Kana-Maru*
> 
> I hit 1150MHz on the core with only +6mV GPU voltage. I completed 3DMark Fire Strike, but I haven't played any games with the OC yet.
> 
> I'm running a Fury X.


You scumbag!


----------



## rv8000

Quote:


> Originally Posted by *Gumbi*
> 
> You scumbag!


Wish I still had my original Fury X, I could bench @ 1150 with no voltage increase


----------



## Neon Lights

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Check this guy out....
> http://www.3dmark.com/fs/6104547


I can do it with the same GPU clock (although I used 1231MHz).

I am able to run the "Furry and Tessy (GL4)" test (settings: 1920x1080, 4xMSAA, fullscreen) in MSI Kombustor 2.5.0 without it crashing.
At +10MHz, so 1241MHz, I get slight artifacts, and at another +10MHz, so 1251MHz, I get a few more artifacts and, after a few seconds, a crash.


----------



## Semel

*Jflisk*
Quote:


> gpu-z


It doesn't properly save the Fury BIOS.
Quote:


> atiwinflash


You can't use atiwinflash with Fury cards. Use atiflash.


----------



## Neon Lights

How do you save the BIOS with atiflash again?


----------



## Semel

Quote:


> Originally Posted by *Neon Lights*
> 
> How do you save the BIOS with atiflash again?


atiflash -s 0 FuryBIOS.rom

or you can use sapphire trixx to save it.


----------



## Neon Lights

Thanks.

Also, now that the voltage of Fiji GPUs can be increased, I hope there will be a way to raise it by a lot more than 75mV. As far as I know these GPUs have a lot of overclocking potential left in them, especially when properly cooled, which is the case for me since I am using water blocks.
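For a rough sense of what that +75mV ceiling means in absolute terms, here is a minimal sketch. Stock voltage varies from card to card with ASIC quality, so the 1.212V figure below is only an illustrative value, not a spec:

```python
# Maximum core voltage under a +75 mV offset cap (as in Trixx).
# stock_v is illustrative; cards in this thread report roughly 1.2-1.25 V stock.

def max_vcore(stock_v, offset_mv=75):
    """Stock voltage plus a millivolt offset, returned in volts."""
    return stock_v + offset_mv / 1000

print(max_vcore(1.212))  # ~1.287 V, before any droop or boost behavior
```

In other words, the cap only buys roughly a 6% voltage bump over stock, which is why people hoping for big clock headroom want a larger offset.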


----------



## Gamedaz

* I would prefer it if AMD released an OC profile in their newer Radeon Software; that would simplify overclocking the card. If it can clock up to 1180MHz then...


----------



## sugarhell

Quote:


> Originally Posted by *Gamedaz*
> 
> * I would prefer it if AMD released an OC profile in their newer Radeon Software; that would simplify overclocking the card. If it can clock up to 1180MHz then...


They can't. That's why there isn't voltage support either. Imagine shipping a feature that can possibly burn your GPU or affect your stability.

AMD and NVIDIA both leave voltage control to third-party apps.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gamedaz*
> 
> * I would prefer it if AMD released an OC profile in their newer Radeon Software; that would simplify overclocking the card. If it can clock up to 1180MHz then...


I mean, AMD has had overclocking in the CCC for a long time.... if you are referring to voltage control; yes it would be awesome, but I HIGHLY HIGHLY doubt you'll ever see it from AMD themselves.

Two words: "LOSS PREVENTION"


----------



## hyp36rmax

I'm waiting on a Fury X2, but with all this talk of voltage control I may jump on two Fury X's soon, haha. The OP will finally join the ranks.


----------



## Alastair

Well, I am testing one card at a time. I have reached 1150MHz core so far on my top Sapphire Fury, and I have yet to touch the voltage control. Temps are well under control thanks to the EKWB block: 32C.

Using Heaven at 1440p max settings, and I'll just keep going up till she crashes.

Edit: Oh, and it also seems the stock voltage on my Fury no. 1 sits around the 1.2-1.2125V mark.


----------



## mRYellow

I can get the mem close to 600, but as soon as I hit 600 my PC freezes.
What are you guys getting?


----------



## Alastair

Haven't tried the mem yet on my cards, but the core crashed at 1170MHz in Heaven at stock voltage and power limits. Not bad, methinks. Temps at 32C on the core, so I'd guess temperature has a big part to play with these chips.


----------



## Alastair

I will also add I am at 50% on my fan controller.


----------



## Asus11

How do these Fury X's compare to the 980 Ti / Titan X? I heard recent updates made the card better, and voltage control is now allowed.

Let me know, I'm interested!


----------



## xer0h0ur

Quote:


> Originally Posted by *Asus11*
> 
> How do these Fury X's compare to the 980 Ti / Titan X? I heard recent updates made the card better, and voltage control is now allowed.
> 
> Let me know, I'm interested!


Current driver performance is supposed to be good in Windows 10 and compare favorably versus a 980 Ti / Titan X but I can't comment as I am not on that OS yet.


----------



## Mr-Dark

Hello guys

Sapphire dropped a new version of Sapphire Trixx with new OC options for Fury cards and, most importantly, voltage control.

http://www.sapphiretech.com/catapage_tech.asp?cataid=291&lang=eng

GL


----------



## Otterfluff

Quote:


> Originally Posted by *Asus11*
> 
> How do these Fury X's compare to the 980 Ti / Titan X? I heard recent updates made the card better, and voltage control is now allowed.
> 
> Let me know, I'm interested!


They are about the same at stock; the 980 Ti will overclock more and will beat an overclocked Fury X.

I can't remember exactly, but I think the 980 Ti can get a real-world 20% performance increase from overclocking, maybe more, while the Fury can only manage about 10%.

For a single card the 980 Ti still wins out, unless you get into multi-GPU configs, where the Fury is just more efficient in CrossFire.


----------



## DMatthewStewart

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm running an overclocked Fury @ 1080/560 on a 9590 @ 5Ghz on the nose. Runs beautifully...


Quote:


> Originally Posted by *Alastair*
> 
> Sapphire Fury Tri-X crossfire with 8370 @ 4.95GHz.


Awesome! You guys are great, thanks for that. Now I'm more inclined to wait before doing an Intel build. I really didn't want to pay $400 for the 6700K right now; I'd like to see it come down just a touch. I just have more fun OC'ing AMD chips. Plus, the 8350 that I'm running now does 5GHz, and I think I can get it even higher. We'll see. I was stuck between getting a Fury X or a 980 Ti Lightning. If I already had the 6700K and the mobo I might lean towards the 980 Ti Lightning; not sure. That card (the 980 Ti Lightning) is really pretty beastly, and I haven't owned an NVIDIA card in a while. I hate to ask, but out of curiosity, what are you hitting in Fire Strike? I'm a little concerned that the 980 Ti Lightning beats the Fury X (at least when Guru3D tested it). That's also why I might wait for a Lightning-style Fury X.

Hey Alastair, do you mean you're running Tri-Fire/3 cards, or the Sapphire Tri-X model (fan design)?


----------



## DMatthewStewart

Quote:


> Originally Posted by *Alastair*
> 
> Sapphire Fury Tri-X crossfire with 8370 @ 4.95GHz.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm running an overclocked Fury @ 1080/560 on a 9590 @ 5Ghz on the nose. Runs beautifully...


Thanks. I may hold off on my Intel build until the i7-6700K comes down a little in price. I'm so undecided right now on what to do, which way to go. I'm eyeballing the 980 Ti Lightning against the Fury X. Decisions, decisions...


----------



## Neon Lights

Quote:


> Originally Posted by *Gamedaz*
> 
> The tech is called XFX Voltage Control Technology. So I assume this will be available now or in the next version of their GPU drivers. Not sure if it is only implementable in CrossFire setups, which is what the XFX website advertises; I'm sure voltages can be controlled for single XFX Fury cards as well, if they decide to implement it for single GPUs.
> 
> * Why are Fury owners unable to increase their voltages with AMD software, and need Trixx to do it? My XFX card specifically advertises that voltage control feature, but I'm not sure if it's only for CrossFire setups and memory clocks only.


You can raise your card's voltage in the drivers? Are you sure?


----------



## Davehillbo

I'm running 2x Sapphire Fury Tri-X and can only get 1070MHz with max volts.








What speed are people running the HBM at?


----------



## sugarhell

Quote:


> Originally Posted by *Neon Lights*
> 
> You can raise your card's voltage in the drivers? Are you sure?


It's BS. It's probably a third-party app. AMD can't provide voltage control via their drivers; they can't even add LOD settings.

Oh yeah this is totally misleading

http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-liquid-cooled-r9-fury-4wfa
Quote:


> Complete control over the power of your card. XFX knows the enthusiast gamer wants to squeeze every last ounce of performance out of the card, our voltage control technology allows you to fine tune your card to push it to the limit. Thanks to AMD Overdrive Technology, you can tweak the card right within the AMD Catalyst Control center, no extra software required.
> - See more at: http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-liquid-cooled-r9-fury-4wfa#sthash.w0dCvKWW.dpuf


Nothing special about this, PR material only.


----------



## battleaxe

Quote:


> Originally Posted by *DMatthewStewart*
> 
> Thanks. I may hold off on my Intel build until the i7-6700K comes down a little in price. I'm so undecided right now on what to do, which way to go. I'm eyeballing the 980 Ti Lightning against the Fury X. Decisions, decisions...


Depends on whether you plan on 4K or not, I think. If 4K, the Fury is only going to get stronger with drivers; happens every time. 1440p is less definitive, and at 1080p you may as well go NVIDIA, the Fury is just too far behind IMO. Personally, I think 1080p is yesterday's news and not worth NVIDIA claiming any bragging rights over.

I think the Fury/X at 4K is a winner, especially now with volts unlocked. We should see in the next few weeks what these cards can really do. Remember when the 290X first came out? It wasn't long before it was owning the Titan. I'm hoping for a repeat.


----------



## SuperZan

Quote:


> Originally Posted by *Davehillbo*
> 
> I'm running 2x Sapphire Fury Tri-x, can only get 1070Mhz with max volts
> 
> 
> 
> 
> 
> 
> 
> 
> What speed are people running the HBM at?


With a Fury X @ 1100MHz core I found 525MHz on the HBM improved stability for me, especially in FO4; likewise 550MHz HBM for 1150. Working past 1170 on the core at the moment before I touch memory clocks again, but that's what worked for me. YMMV.
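For context on these memory numbers: Fiji's HBM sits on a 4096-bit bus, so even small clock bumps move a lot of bandwidth. A quick sketch of the theoretical figures, assuming the usual two transfers per clock (which matches the advertised 512 GB/s at the stock 500MHz):

```python
# Theoretical HBM bandwidth for Fiji's 4096-bit memory interface.

def hbm_bandwidth_gbs(clock_mhz, bus_width_bits=4096, transfers_per_clock=2):
    """Theoretical bandwidth in GB/s for a given memory clock in MHz."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * clock_mhz * 1e6 * transfers_per_clock / 1e9

print(hbm_bandwidth_gbs(500))  # stock 500 MHz -> 512.0 GB/s
print(hbm_bandwidth_gbs(550))  # 550 MHz -> 563.2 GB/s
print(hbm_bandwidth_gbs(600))  # 600 MHz -> 614.4 GB/s
```

So the 500 to 550MHz bump people are finding stable is worth about 51 GB/s of theoretical bandwidth on its own.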


----------



## Neon Lights

Quote:


> Originally Posted by *sugarhell*
> 
> It's BS. It's probably a third-party app. AMD can't provide voltage control via their drivers; they can't even add LOD settings.
> 
> Oh yeah this is totally misleading
> 
> http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-liquid-cooled-r9-fury-4wfa
> Nothing special about this, PR material only.


Is there actually software from XFX to control voltage, or do they just mean that they have better power delivery?


----------



## sugarhell

Quote:


> Originally Posted by *Neon Lights*
> 
> Is there actually a software from XFX to control voltage or do they just mean that they have better power delivery?


I bet nothing. Just a fancy name. The VRMs look like rebranded OEM parts and the voltage controller is the same as on the reference cards. So...


----------



## Jflisk

Anyone know if the first BIOS position is locked on the Fury X cards? I believe they are all the same card regardless of manufacturer. I'm trying to take the BIOS off my second, newer card with the .64 BIOS and put it on my older card with .63 on it; both cards are from the same manufacturer. I am using atiflash with the -p switch, dropping the -f (force), and I keep getting the ELO error.


----------



## MerkageTurk

My Fury X does stock volts @ 1100.

Not sure how the power limit works and so on, though, as I came from a 980 Ti.


----------



## Gamedaz

Quote:


> Originally Posted by *sugarhell*
> 
> I bet nothing. Just a fancy name. The VRMs look like rebranded OEM parts and the voltage controller is the same as on the reference cards. So...


*Seems it's a feature intended for the OC community only (using third-party apps, etc.).

* It would be nice to OC the card using the Catalyst software, get it to the edge for everyday use, and let the extreme OC crowd use other software that can squeeze out that extra bit of juice; that's mostly what it comes down to. If I could get an 8FPS gain plus better drivers that add more frame rate, that would be worth it. I'm not into OC'ing; it's possible, but I would rather use an OEM-recommended clock than one I'd have to guess at through trial and error. (IMO)


----------



## THUMPer1

Quote:


> Originally Posted by *Mr-Dark*
> 
> Hello guys
> 
> Sapphire dropped a new version of Sapphire Trixx with new OC options for Fury cards and, most importantly, voltage control.
> 
> http://www.sapphiretech.com/catapage_tech.asp?cataid=291&lang=eng
> 
> GL


Did you guys miss this post? *Trixx now has Voltage Control*


----------



## Alastair

Quote:


> Originally Posted by *DMatthewStewart*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I'm running an overclocked Fury @ 1080/560 on a 9590 @ 5Ghz on the nose. Runs beautifully...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Sapphire Fury Tri-X crossfire with 8370 @ 4.95GHz.
> 
> Click to expand...
> 
> Awesome! You guys are great, thanks for that. Now I'm more inclined to wait before doing an Intel build. I really didn't want to pay $400 for the 6700K right now; I'd like to see it come down just a touch. I just have more fun OC'ing AMD chips. Plus, the 8350 that I'm running now does 5GHz, and I think I can get it even higher. We'll see. I was stuck between getting a Fury X or a 980 Ti Lightning. If I already had the 6700K and the mobo I might lean towards the 980 Ti Lightning; not sure. That card (the 980 Ti Lightning) is really pretty beastly, and I haven't owned an NVIDIA card in a while. I hate to ask, but out of curiosity, what are you hitting in Fire Strike? I'm a little concerned that the 980 Ti Lightning beats the Fury X (at least when Guru3D tested it). That's also why I might wait for a Lightning-style Fury X.
> 
> Hey Alastair, do you mean you're running Tri-Fire/3 cards, or the Sapphire Tri-X model (fan design)?
Click to expand...

I am using TWO Sapphire Radeon Fury Tri-X cards. I have two cards, and they are plumbed into my water-cooling loop.


----------



## Clockster

http://www.3dmark.com/fs/6523611

http://www.3dmark.com/3dm/9334373?

http://www.3dmark.com/3dm/9334390?

Just some quick and dirty runs. I suspect this driver isn't as good for FS as the previous driver, so I'm sure I'll get more out of the card once I sit down and play with it properly.


----------



## mRYellow

Quote:


> Originally Posted by *Clockster*
> 
> http://www.3dmark.com/fs/6523611
> 
> http://www.3dmark.com/3dm/9334373?
> 
> http://www.3dmark.com/3dm/9334390?
> 
> Just some quick and dirty runs. I suspect this driver isn't as good for FS as the previous driver, so I'm sure I'll get more out of the card once I sit down and play with it properly.


I can confirm this. The 15.11.1 drivers aren't the fastest for 3DMark FS; about 200 points slower on average.


----------



## p4inkill3r

Heaven 4.0 1440p
1175/600
Quote:


> FPS:
> 47.1
> 
> Score:
> 1187
> 
> Min FPS:
> 24.7
> 
> Max FPS:
> 61.8
> 
> System
> 
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> 
> CPU model:
> Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz (4007MHz) x4
> 
> GPU model:
> AMD Radeon (TM) R9 Fury Series 15.201.1151.1010 (4095MB) x1
> 
> Settings
> 
> Render:
> Direct3D11
> 
> Mode:
> 2560x1440 8xAA fullscreen
> 
> Preset
> Custom
> 
> Quality
> Ultra
> 
> Tessellation:
> Extreme


----------



## Clockster

Quote:


> Originally Posted by *mRYellow*
> 
> I can confirm this. The 15.11.1 drivers aren't the fastest for 3DMark FS; about 200 points slower on average.


Yeah, that's about right; I reckon around 150-200 points lower than before, so it should be really good with the older driver.


----------



## mRYellow

Quote:


> Originally Posted by *p4inkill3r*
> 
> Heaven 4.0 1440p
> 1175/600


What voltage are you running to get the mem to 600?


----------



## Alastair

Quote:


> Originally Posted by *p4inkill3r*
> 
> Heaven 4.0 1440p
> 1175/600
> Quote:
> 
> 
> 
> FPS:
> 47.1
> 
> Score:
> 1187
> 
> Min FPS:
> 24.7
> 
> Max FPS:
> 61.8
> 
> System
> 
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> 
> CPU model:
> Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz (4007MHz) x4
> 
> GPU model:
> AMD Radeon (TM) R9 Fury Series 15.201.1151.1010 (4095MB) x1
> 
> Settings
> 
> Render:
> Direct3D11
> 
> Mode:
> 2560x1440 8xAA fullscreen
> 
> Preset
> Custom
> 
> Quality
> Ultra
> 
> Tessellation:
> Extreme
Click to expand...

I got a score of 1100 and 42.7 FPS at 1160/500 with identical settings, up from 39.2 when running 1000/500.

The difference there is about 10% between our cards. Given that I am using a Fury and you a full Fury X, I guess that makes sense. You're probably pulling away a bit more with the memory OC.


----------



## Greenland

Weird, didn't expect my result to be this close to yours, Fury 1100/570:

FPS: 46.2119
Min FPS: 23.1191
Max FPS: 90.3435
Score: 1164.08


----------



## kayan

Ugh, my Fury X has some very noticeable coil whine while playing any games (Dota 2 and Far Cry 4 are all I've tried so far). Does anyone else have this issue on a recent Fury X?

Also, every time I try to overclock, my PC locks up.









The coil whine though, soooo annoying. I've never had a GPU that had it before. I think it's bothering me more than it is my dog, and I think it's going back due to this.


----------



## mRYellow

Quote:


> Originally Posted by *kayan*
> 
> Ugh, my Fury X has some very noticeable coil whine while playing any games (Dota 2 and Far Cry 4 is all I've tried so far). Does anyone else have this issue on recent Fury X?
> 
> Also, every time I try to overclock, my PC locks up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The coil whine though, soooo annoying. I've never had a GPU that had it before. I think it's bothering me more than it is my dog, and I think it's going back due to this.


Coil whine is common these days; it just depends how audible it is.
If it's loud, RMA the card.

I have no coil whine.


----------



## kayan

Quote:


> Originally Posted by *mRYellow*
> 
> Coil whine is common these days. Just depends how audible.
> If it's loud, rma the card.
> 
> I have no coil whine.


What brand do you have? Mine is XFX. Starting to lose a little bit of faith in them (not just because of this, but other things too), sadly. I am happy to report that I have 0 pump noise.


----------



## carajean

I bought the Sapphire Fury Tri-X and, funnily enough, I ONLY have coil whine at 96% GPU usage in WoW's character selection screen. I have only had this card for maybe two days now.


----------



## mRYellow

Quote:


> Originally Posted by *kayan*
> 
> What brand do you have? Mine is XFX. Starting to lose a little bit of faith in them (not just because of this, but other things too), sadly. I am happy to report that I have 0 pump noise.


Sapphire Tri x.
Just standard Fury.


----------



## Alastair

Coil whine is pretty normal these days. It just depends on how much. I only get any sort of major coil buzz out of my two Fury's when in the CSGO main menu. That seems to be the only time I ever hear them.


----------



## Agent Smith1984

I'm pretty sure almost all of these cards have a little whine...

Most of the users on here, and several reviews, have reported coil whine on the X, the Pro, and the Nano; it's just a "thing" with these cards, I suppose.

The little cricket in my case doesn't really bother me. I sit across the living room from a 55" screen and it's drowned out by the ridiculous case fans that I run... lol

I could see it maybe being a bit obnoxious on top of a desk 2-4' away, though...


----------



## malitze

Quote:


> Originally Posted by *Alastair*
> 
> I got a score of 1100 and 42.7 FPS at 1160/500 with identical settings. Up from 39.2 when running 1000/500.
> 
> The difference there is about 10% between our cards. The fact that I am using a Fury and you a full Fury X I guess that makes sense. Your probably pulling away a bit more with the memory OC.


Can't get my Fury X as far as p4inkill3r. Going beyond 570 on the memory and 1160 on the core gets me a crashing Heaven benchmark.
Now at 1140/570 and will try to lower the voltage again (currently @ +48mV).

FPS: 47.4
Score: 1194
Min FPS: 27.6
Max FPS: 94.7

Platform: Windows NT 6.2 (build 9200) 64bit
CPU model: Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz (3598MHz) x4
GPU model: AMD Radeon (TM) R9 Fury Series 15.201.1151.1007 (4095MB) x1


----------



## mRYellow

Quote:


> Originally Posted by *malitze*
> 
> Can't get my Fury X as far as p4inkill3r. Going beyond 570 on the memory and 1160 on the core gets me a crashing Heaven benchmark.
> Now at 1140/570 and will try to lower the voltage again (currently @ +48mV).
> 
> FPS: 47.4
> Score: 1194
> Min FPS: 27.6
> Max FPS: 94.7
> 
> Platform: Windows NT 6.2 (build 9200) 64bit
> CPU model: Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz (3598MHz) x4
> GPU model: AMD Radeon (TM) R9 Fury Series 15.201.1151.1007 (4095MB) x1


Have you given it more volts?


----------



## Kana-Maru

Quote:


> Originally Posted by *battleaxe*
> 
> bring the rain.


I'm trying, but these Fury X's are pretty bogged down. This card has everything going for it, so I don't know what the heck AMD is doing. They strapped a closed-loop water cooler to the dang GPU, so they were clearly targeting enthusiasts [while trying to take care of the heat problem]. I still think they are afraid that people are going to OC the heck out of these cards and return them once they crap out, so they aren't letting us go wild with them. Performance-wise I have no complaints about the frame rate, frame times, or GPU temps; even when overclocked the GPU stays below 47C.

Quote:


> Originally Posted by *Gumbi*
> 
> You scumbag!


Haha ouch.

OK, so I maxed out the GPU voltage. The VDDC read around 1.29-1.32V. Somewhere above 1160MHz [maybe 1165MHz+] there are issues with the graphics looking weird; 1170 and 1180 weren't happening. Apparently I need more voltage or something. I've updated to the latest GPU drivers and ran some quick tests this morning. Here are some comparisons of stock vs. OC [HBM at stock].

*3DMARK FIRE STRIKE - Performance*

*Stock*
Score: 15,111
Graphics Score: 17,776
Physics Score: 16,344
Combined Score: 6,679

*Core: 1160Mhz Overclock*
Score: 15,776
Graphics Score: 18,517
Physics Score: 16,448
Combined Score: 7,266

*3DMARK11 - Performance*

*Stock*
Score: 17,894
Graphics Score: 22,786
Physics Score: 11,339
Combined Score: 10,266

*Core: 1160Mhz Overclock*
Score: 18,764
Graphics Score: 23,603
Physics Score: 12,238
Combined Score: 10,800
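As a sanity check on the numbers above, here is a quick sketch comparing the size of the clock bump against the size of the graphics-score bump (assuming the Fury X's stock 1050MHz core clock):

```python
# Percent-gain helper, applied to the Fire Strike graphics scores above.

def pct_gain(after, before):
    """Percent improvement of `after` over `before`."""
    return (after - before) / before * 100

print(f"core clock: +{pct_gain(1160, 1050):.1f}%")        # +10.5%
print(f"graphics score: +{pct_gain(18517, 17776):.1f}%")  # +4.2%
```

A ~10.5% core overclock bought about a 4.2% graphics-score gain, so scaling is well short of linear; the stock-clocked HBM is one plausible reason.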


----------



## Alastair

It would appear that the first big hurdle is the 1150 mark. I reached 1160 on the core without touching volts, but now I'm all the way up on the Trixx voltage slider trying to get 1200MHz dialed in.


----------



## Alastair

In closing: I maxed out that voltage slider, using all +75mV available to me, and I am running 1200 core. Just leaving Heaven bench running to see how stable it is. Temps are staying in the 35C region.


----------



## malitze

Quote:


> Originally Posted by *mRYellow*
> 
> Have you given it more volts?


Yep, the attempt at 1160 was with +75mV. Might give it another shot


----------



## THUMPer1

Quote:


> Originally Posted by *Alastair*
> 
> In closing: I maxed out that voltage slider, using all +75mV available to me, and I am running 1200 core. Just leaving Heaven bench running to see how stable it is. Temps are staying in the 35C region.


----------



## Alastair

Quote:


> Originally Posted by *THUMPer1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> In closing: I maxed out that voltage slider, using all +75mV available to me, and I am running 1200 core. Just leaving Heaven bench running to see how stable it is. Temps are staying in the 35C region.
Click to expand...


She can't do it captain. She just don't have the power!


----------



## p4inkill3r

It isn't, really.
Having 200mV to mess with would be better, of course, especially after this long a wait, but I'll take whatever I can get.


----------



## Alastair

I pass a few loops of Heaven, but it eventually black-screens.

I'll try clean drivers, as these are old. But yeah.


----------



## Neon Lights

Why would you use the Heaven benchmark to test a stable overclock?
An optimal stability test would be one where GPU usage is at 100% over the whole run, and unless Heaven is run at a UHD 4K resolution there will not be 100% GPU usage for the whole run.


----------



## Gumbi

Quote:


> Originally Posted by *Neon Lights*
> 
> Why would you use the Heaven benchmark to test a stable overclock?
> An optimal stability test would be one where GPU usage is at 100% over the whole run, and unless Heaven is run at a UHD 4K resolution there will not be 100% GPU usage for the whole run.


It might not be as intense as, say, 3DMark, but Heaven at 1080p with AA and Tess on pegs my gpu at 100% all the time (290X Vaporx and a 4790k at 4.9ghz).


----------



## joeh4384

Heaven and Fire Strike are pretty good stability testers. Heaven is more intense than Valley.


----------



## Neon Lights

Quote:


> Originally Posted by *Gumbi*
> 
> It might not be as intense as, say, 3DMark, but Heaven at 1080p with AA and Tess on pegs my gpu at 100% all the time (290X Vaporx and a 4790k at 4.9ghz).


If it utilizes your GPU 100%, then there is no problem. But I could imagine that a Fiji GPU might not get 100% usage (at least at 1080p).


----------



## Gumbi

Quote:


> Originally Posted by *Neon Lights*
> 
> If it utilizes your GPU 100%, then there is no problem. But I could imagine that a Fiji GPU might not get 100% usage (at least at 1080p).


Well, even if something is loaded 100% it doesn't necessarily mean it's as intense. Crysis 3 heats up my GPU a lot more than Heaven, yet the GPU usage is generally the same at 100%.


----------



## sugarhell

Quote:


> Originally Posted by *Gumbi*
> 
> Well, even if something is loaded 100% it doesn't necessarily mean it's as intense. Crysis 3 heats up my GPU a lot more than Heaven, yet the GPU usage is generally the same at 100%.


Yeah. It is just a general percentage; you never max out the ROPs, shaders, etc. in games.


----------



## Alastair

Quote:


> Originally Posted by *Neon Lights*
> 
> Why would you use the Heaven benchmark to test a stable overclock?
> An optimal stability test would be one where GPU usage is at 100% over the whole run, and unless Heaven is run at a UHD 4K resolution there will not be 100% GPU usage for the whole run.


I am not. I am just running benches with Heaven. Fact is, if it can't pass a few runs of Heaven it won't be stable in anything else.


----------



## littlestereo

HOLY ****. Crossfire scaling in firestrike for the Fury X is over 100%!!!!!!!! http://www.3dmark.com/compare/fs/6525252/fs/6525107

With voltage control I was able to hit 1200/500 (Fury X) http://www.3dmark.com/fs/6525203

I've messed around with it, and the voltage control has allowed me to run +100MHz core and +100MHz mem for a total of 1150/600 stable in games. I'll post some game framerates later tonight after work. The highest I was able to push the core clock was 1200, but at that core clock the mem won't reach 600 without dying on me. Best performance I got overall was from +130 core and +100 mem for 1180/600 stable. These cards are in EK waterblocks with their own loop of 480mm of 30mm-thick rads and a Swiftech MCP50X pump driving the loop.

Here are the Fire Strike results from the important runs: http://www.3dmark.com/compare/fs/6525252/fs/6525156/fs/6525107/fs/6524667/fs/6525203


----------



## malitze

Quote:


> Originally Posted by *littlestereo*
> 
> HOLY ****. Crossfire scaling in firestrike for the Fury X is over 100%!!!!!!!! http://www.3dmark.com/compare/fs/6525252/fs/6525107
> 
> With voltage control I was able to hit 1200/500 (Fury X) http://www.3dmark.com/fs/6525203
> 
> I've messed around with it and the voltage control has allowed me to run +100 MHz core and +100MHz mem for a total of 1150/600 stable in games. I'll post some game framerates later tonight after work. The highest I was able to push core clock was up to 1200, but at that core clock the mem won't reach 600 without dying on me. Best performance I got overall was from +130 core and +100 mem for 1180/600 stable. These cards are in EK waterblocks with their own loop of 480mm of 30mm thick rads and Swiftech MCP50X pump driving the loop/
> 
> Here's the firestrikes from the important runs: http://www.3dmark.com/compare/fs/6525252/fs/6525156/fs/6525107/fs/6524667/fs/6525203


Woah nice results! Looking forward to game fps.

Tested again with 1150/570 @ max volts; black screened in Heaven. Guess there's still some room for optimization on the cooling front.


----------



## xer0h0ur

Quote:


> Originally Posted by *littlestereo*
> 
> HOLY ****. Crossfire scaling in firestrike for the Fury X is over 100%!!!!!!!! http://www.3dmark.com/compare/fs/6525252/fs/6525107
> 
> With voltage control I was able to hit 1200/500 (Fury X) http://www.3dmark.com/fs/6525203
> 
> I've messed around with it and the voltage control has allowed me to run +100 MHz core and +100MHz mem for a total of 1150/600 stable in games. I'll post some game framerates later tonight after work. The highest I was able to push core clock was up to 1200, but at that core clock the mem won't reach 600 without dying on me. Best performance I got overall was from +130 core and +100 mem for 1180/600 stable. These cards are in EK waterblocks with their own loop of 480mm of 30mm thick rads and Swiftech MCP50X pump driving the loop/
> 
> Here's the firestrikes from the important runs: http://www.3dmark.com/compare/fs/6525252/fs/6525156/fs/6525107/fs/6524667/fs/6525203


If the above result is gaming stable, then it's the first time I have seen dual Fury Xs surpass my overclocked 295X2 + 290X by a decent margin. That is a nice result.


----------



## igrease

Couple questions.

Are VRM temps still an issue with this card?

Did the 295x have the VRM temp issues as well?

Why is the Nano $650?


----------



## Agent Smith1984

Quote:


> Originally Posted by *igrease*
> 
> Couple questions.
> 
> Is the VRM temps still an issue with this card?


_"Ok here's some basic data so you don't blow up your cards. Max out the power slider. Even if the VRM for Vcore hits 125C you can shove 500W through it. Also it will throttle the card if it gets too hot. The IR 3567B running the default BIOS won't let you blow the VRM. +100mV is safe across the board. +200mV is safe if you keep the core below 60C. I would not expect clock scaling going past 150/175mV depending on your card. Yeah, that's basically the least "safe" looking overclocking advice you will ever read; however, the fact is that the Fury/Fury X PCBs are ridiculously overbuilt." - Buildzoid_
Quote:


> Did the 295x have the VRM temp issues as well?


Yes, to a certain degree....
Quote:


> Why is the Nano $650?


Cause it's smaller than the $650 Fury X, runs at the same clock speed with the same amount of shaders, and only uses 175W (cough*cough cough)
Most reviewers found the card would hover around 800-950MHz core clock unless the power limit was increased, which pushed power draw to around 200-220W; still pretty impressive.

Did I mention it's tiny? It should definitely not be a $650 card in my opinion though....


----------



## Semel

Quote:


> Originally Posted by *p4inkill3r*
> 
> It isn't, really.
> Having 200mv to mess with would be better,of course, especially after this long of a wait, but I'll take whatever I can get.


Quote:


> Near +96 mV, the power limiter will start to kick in from time to time during games, when set to default, which is why we set it to +50% for all these tests.
> 
> Once we reach +144 mV, which results in a scorching 1.35 V on the GPU, the maximum stable frequency reaches its peak. At this point, the VRMs are running temperatures above 95°C even though they are cooled by the watercooling loop via a nearby copper pipe. That much heat on the VRMs is definitely not good for long-term use. I would say a safe permanent voltage increase on an unmodded card is around 40 mV or so.
> 
> Going beyond 144 mV, to 168 mV, just causes the card to get massively unstable, with maximum stable clocks nearly down to stock voltage levels.


That's from Trixx developer.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> That's from Trixx developer.


Yeah, but on a full cover block all that's a non-issue. So what about full block users? Still need 200mv.


----------



## Bartouille

It's a shame, but it's so easy to unlock the Trixx max voltage offset with a hex editor lol. But at the same time they probably put +75mV there for a reason, and knowing how poor the Fury X VRM cooling is, I don't want anyone to blow up their card with crap like FurMark.


----------



## Agent Smith1984

I'd dump 150mv to mine right now without question....

I'm 60c core on 50% fan.... I'll take my chances at sub 70c with a slightly higher fan speed...

I'm going to chase 1200mhz like a ride on the white pony









Note: that was totally metaphorical, I'm 31 with 4 kids


----------



## Semel

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd dump 150mv to mine right now without question....


Here is a hint for you
Quote:


> 0.075


I hope you have a beautiful coffin prepared for your card


----------



## xer0h0ur

The 295X2's VRMs were only cooled by a fan you aren't given control over, so if your case temperature was less than stellar those VRMs would get hot and GPU clock throttling was a possibility. Some people disconnected the fan from the PCB and controlled it manually so this wouldn't be an issue. Some slapped Fujipoly thermal pads on the VRMs and kept the fan connected to the PCB, some did both, and others like myself went the waterblock route from the get-go.

TL;DR The VRMs can be a problem under less than ideal case temperatures or poor airflow to the VRM fan.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> Here is a hint for you
> I hope you have a beautiful coffin prepared for your card


Why? With a proper block these cards can easily handle way over 75mV extra. Easy. Most of us here don't care about killing our cards after a few years' use anyway. Not that I ever have; I've never seen one of my cards die from OCing. It happens, but not often enough to justify not having some serious fun with them first.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Semel*
> 
> Here is a hint for you
> I hope you have a beautiful coffin prepared for your card


Your original post is in my email









This card should handle 100mv easily!


----------



## Otterfluff

With my hard mods my first card could do 1200 stable @ 1.36V, which is above the voltage that +75mV can provide.

+75mV puts you at 1.31V.

However, my second card could never make 1200 stable no matter how much voltage I threw at it. Some cards will just OC better than others.

Mind you, what you can do stable on a single card is not what is stable under CrossFire; I downclock to 1150 so that both cards play nice in CrossFire.

As people have noticed, HBM easily gets to 600 under a custom waterblock.

You can even get to 630 on the HBM if you do voltage mods, but it's probably not worth it just for the extra 30MHz.

Personally, if they allowed more than 75mV in software I would probably run my first card at 1200 under CrossFire, because the software voltage has been a lot more stable. I think 125mV would be the sweet spot for software control with watercooling.
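For reference, here is the voltage arithmetic those numbers imply, as a small sketch. The ~1.235 V base is inferred from the "+75mV puts you at 1.31V" figure above; actual stock voltage varies card to card, so treat it as illustrative rather than a spec:

```python
BASE_V = 1.235  # inferred from the +75 mV -> 1.31 V figure; varies per card


def offset_to_volts(offset_mv: int) -> float:
    """Absolute GPU voltage (in volts) for a given software offset in mV."""
    return BASE_V + offset_mv / 1000.0
```

On those assumed numbers, +125 mV works out to the 1.36 V hard-mod figure mentioned above.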


----------



## fat4l

Have you tried the HIS iTurbo program for OC? No voltage control?
https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view
(Thx to Gupsterg)


----------



## mRYellow

Quote:


> Originally Posted by *fat4l*
> 
> Have you tried His iTurbo program for oc? No voltage control ?
> https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view
> (Thx to Gupsterg)


Why use this?


----------



## Alastair

How does one edit trixx? I want to edit trixx. I would probably only need +85 to get 1200 stable. Which still leaves me with a ton of room!


----------



## Semel

Meh... I still can't get even 1100 not to throttle sometimes in games/benchmarks. I tried changing the power limit, I tried increasing and decreasing +mV. My core temps are in check (50-55C max). I even checked how much power my PC was using and it was lower than my PSU can deliver.


----------



## Gumbi

Quote:


> Originally Posted by *Semel*
> 
> Meh.. I still can't get even 1100 not to throttle sometimes in games\benchmarks.....I tried changing power limit, I tried increasing and decreasing +mV..My core temps are in check..(50-55C max)... I even checked how much power my PC was using and it was lower than my PSU can deliver..


What PSU do you have, and what are your other components?


----------



## Semel

Quote:


> Originally Posted by *Gumbi*
> 
> What PSU do you have, and what are your other components?


i7-3770K @ 4.5
Sapphire Fury Tri-X (3840 stream processors unlocked; I tried to OC with the default number of SPs but there was no difference)
SSD 128GB+ 3HDD
ASUS SABERTOOTH Z77
4x4GB DDR3 1600Mhz, Hynix
Thermalright Silver Arrow
ATX 900W Antec HCG-900
Cooler Master Storm Trooper (air flow seems to be Ok)


----------



## fat4l

Quote:


> Originally Posted by *mRYellow*
> 
> Why use this?


Because normally this program allows +400mV on R9 cards while other software allows much less...


----------



## mRYellow

Quote:


> Originally Posted by *fat4l*
> 
> because normally, this program allows +400mV on R9 cards while other software allow much less...


It might for regular cards but it doesn't support voltage control for Fury.


----------



## fat4l

Quote:


> Originally Posted by *mRYellow*
> 
> It might for regular cards but it doesn't support voltage control for Fury.


Yes, this is what I was asking: whether it supports Fury cards or not.


----------



## Skinnered

Haven't read everything here yet, but is the HBM memory of the second card in CF now overclockable? I always see 500MHz in MSI AB for the second GPU when overclocking the HBM memory.


----------



## Semel

Could someone show me on a photo (Sapphire Tri-X Fury) where the VRM is?

Thanks.

PS: Btw, could throttling be related to some TDP limit set in the BIOS? I think I should test the power limit setting very carefully. I mean, it's a double-edged sword:

Higher power limit -> more stable OC
Higher power limit -> might cause throttling


----------



## Alastair

Quote:


> Originally Posted by *Semel*
> 
> Could someone show me on a photo(sapphire tri-x fury) where VRM is?
> 
> Thanx.
> 
> PS Btw could throttling be related to some TDP limit set in bios? I think I should test very carefully power limit setting.. I mean it's a double edged sword..
> 
> + power limit-> more stable OC
> + power limit-> might cause throttling


----------



## buildzoid

Quote:


> Originally Posted by *Semel*
> 
> Could someone show me on a photo(sapphire tri-x fury) where VRM is?
> 
> Thanx.
> 
> PS Btw could throttling be related to some TDP limit set in bios? I think I should test very carefully power limit setting.. I mean it's a double edged sword..
> 
> + power limit-> more stable OC
> + power limit-> might cause throttling


If you hit the power limit you get throttling. Power limit doesn't really impact stability. Just max it and forget about it. The VRM on the air-cooled cards runs much cooler than on the Fury X (copper pipe vs. aluminum base plate with direct airflow).


----------



## Semel

Quote:


> Originally Posted by *buildzoid*
> 
> If you hit the power limit you get throttling. Power limit doesn't really impact stability. Just max it and forget about it. The VRM on the air cooled cards runs much cooler than on the Fury-X(copper pipe vs aluminum base plate with direct airflow).


How do you explain this?
http://semiaccurate.com/2015/06/24/overclocking-amds-fury-x/
Quote:


> Thus we came out with a maximum GPU clock setting of 10 percent. Given that the Fury X is clocked a 1050 Mhz by default this gives us an overclock of 105 Mhz for a final clock speed of 1155 Mhz. We then began working our now overclocked Fury X through our benchmarking suite when things got weird. Only one game showed a performance increase and all of the other games in our suite actually took performance hits. Our theory is that by raising the power limit to 50 we triggered PowerTune's thermal throttling which caused the performance hit. So we lowered our power limit settings back to zero percent and retested our ten percent overclock. Then we consistently saw performance gains in all of the titles we tested.
> 
> From there we built this graph of the impact of different power limit settings on the performance of our overclocked Fury X. As you can see performance trends downwards as we increase our power limit setting. We achieved the best performance with a five percent boost to the power limit setting.


----------



## buildzoid

Quote:


> Originally Posted by *Semel*
> 
> How do you explain this?
> http://semiaccurate.com/2015/06/24/overclocking-amds-fury-x/


Interesting. That's the first time I've heard about that. Guess I'll test it when I get my own Fury X, because in theory that shouldn't happen at all.


----------



## Agent Smith1984

This: http://wccftech.com/amd-r9-fury-x-performance-ahead-nvidia-980-ti-latest-drivers/

*+*

Voltage control and additional overclocking

*=*


----------



## Agent Smith1984

Vote please....

Buying a new game today to try and really stretch the legs on this Fury

1) Fallout 4

2) Witcher 3

3) GTA V

4) Shadow of Mordor

Numbers or names, or both, are fine


----------



## p4inkill3r

gta 5! Awesome graphics, awesome gameplay, tons of mods.


----------



## joeh4384

Witcher 3 was a great game.


----------



## Agent Smith1984

VOTES GET REPS


----------



## Nunzi

GTA 5


----------



## josephimports

Quote:


> Originally Posted by *p4inkill3r*
> 
> gta 5! Awesome graphics, awesome gameplay, tons of mods.


+1. Also, runs great in crossfire
 








Currently at 1150/550 +50mv. 1.2938 max vcore. Averaging ~1.250v in-game.


----------



## Semel

Yesss!! I finally got it working!


Core Clock: 1140MHz
Voltage: +72mV
Power Limit: 50%
Memory Clock: 550MHz
Custom Fan Profile: 0-50C -> 0-50% fan speed / 50C+ -> 60% fan speed
3DMark Extreme stable (+demo), Witcher 3 is stable, no throttling or anything.

Sadly it seems my card doesn't really like going higher than 1140MHz. I tried 1150MHz, but even +100mV didn't help and I don't want to push the voltage any higher.
Still, a 14% core OC over the base 1000MHz is pretty nice.
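That fan profile, sketched as a function for clarity (purely illustrative; the real curve lives in the Afterburner/Trixx fan editor, not code):

```python
def fan_speed(temp_c: float) -> float:
    """Fan duty for the custom profile above: 0-50C maps linearly to
    0-50% (1% per degree C), anything hotter runs a flat 60%."""
    if temp_c <= 0:
        return 0.0
    if temp_c <= 50:
        return float(temp_c)
    return 60.0
```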


----------



## Mads1

Quote:


> Originally Posted by *Agent Smith1984*
> 
> VOTES GET REPS


Def gta v.


----------



## Alastair

I would say either Witcher 3 or GTAV


----------



## GorillaSceptre

Yeah, Witcher 3 or GTA.


----------



## SuperZan

My heart says FO4, but my brain says GTA V. You'll just get more mileage out of it.


----------



## littlestereo

GTA V, although Star Wars Battlefront is truly the best game to demonstrate the Fury's ability paired with a top-of-the-line graphics engine and AMD's full driver optimization at launch: 45 FPS @ 4K Ultra with a single Fury X, 80+ FPS @ 4K Ultra with Fury X CF (2x).

Here are some benchmarks I've run along with some 4K images demoing it:

__
https://www.reddit.com/r/3tldqc/fury_x_crossfire_2x_overclocking_swbf4k_max/


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> I would say either Witcher 3 or GTAV


Quote:


> Originally Posted by *GorillaSceptre*
> 
> Yeah, Witcher 3 or GTA.


Should be giving you two half reps!!







Quote:


> Originally Posted by *littlestereo*
> 
> GTA V, although Starwars Battlefront is truly the best game to demonstrate the Fury's ability paired with a top-of-the-line graphics engine and AMD's full driver optimization at launch. 45 FPS @4k ultra with a single Fury X, 80+ FPS @4k Ultra with Fury X CF (2x).
> 
> Here's some benchmarks I've ran along with some 4K images demoing it:
> 
> __
> https://www.reddit.com/r/3tldqc/fury_x_crossfire_2x_overclocking_swbf4k_max/


Thanks for the suggestion, forgot all about Battlefront.... It really looks great..... HMMMMM.....

May add that to the vote list, and am now considering this one too.

On votes alone, looks like GTA V though


----------



## the9quad

GTA V is a great game. The Witcher is a great game. GTA V is more fun to me though.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Should be giving you two half reps!!


.5 for each suggestion, a whole rep seems fine to me.









I prefer GTA but it depends what type of game you like, can't go wrong with any of them, Fallout included. It's been a pretty great year so far.


----------



## Agent Smith1984

Quote:


> Originally Posted by *GorillaSceptre*
> 
> .5 for each suggestion, a whole rep seems fine to me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I prefer GTA but it depends what type of game you like, can't go wrong with any of them, Fallout included. It's been a pretty great year so far.


Awesome.....

I think GTA V it is then...... seems like the kind of game I can just pick up and get stuck on, then pick it up a few days later, and get plenty more mileage out of it.

I remember playing GTA II on PS2 like... FOREVER.... lol


----------



## Gamedaz

GTA seems to be a very replayable game. Star Wars does look tempting.


----------



## AliNT77

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Vote please....
> 
> Buying a new game today to try and really stretch the legs on this Fury
> 
> 1) Fallout 4
> 
> 2) Witcher 3
> 
> 3) GTA V
> 
> 4) Shadow of Mordor
> 
> Numbers or names, or both, are fine


Witcher 3

Already spent 260 hours on it


----------



## Agent Smith1984

My lil bro just got his 8-core Xeon rig going with 32GB and a 980 Ti Classified he bought on Craigslist for $480... That kid always has to one-up me, damn it! LOL

He's boosting over 1500mhz on core, what a beast...


----------



## joeh4384

Quote:


> Originally Posted by *josephimports*
> 
> +1. Also, runs great in crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Currently at 1150/550 +50mv. 1.2938 max vcore. Averaging ~1.250v in-game.


CrossFire was great in GTA and mediocre at best in Witcher 3


----------



## Semel

I've managed to push my Fury's core clock to 1150MHz; however, it required a whopping +108mV

- Core Clock: 1150Mhz

- Voltage: +108 mV

- Power Limit: 50%

- Memory Clock: 550Mhz

- Custom Fan Profile: 0-50C->0-50% fan speed / 50C++ -> 60% fan speed

Temps were real nice in Witcher 3 (1920x1200, all maxed out): 52-53C @ 60% fan speed. Thanks to Sapphire's superior fan design it wasn't loud.

However I think it's not worth it.

I had :

- Core Clock: 1140Mhz

- Voltage: +72 mV

- Power Limit: 50%

- Memory Clock: 550Mhz

- Custom Fan Profile: 0-50C->0-50% fan speed / 50C++ -> 60% fan speed

I mean... +36mV just for 10MHz?? Nah. Although it does look nice...
1150/550


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Vote please....
> 
> Buying a new game today to try and really stretch the legs on this Fury
> 
> 1) Fallout 4
> 
> 2) Witcher 3
> 
> 3) GTA V
> 
> 4) Shadow of Mordor
> 
> Numbers or names, or both, are fine


Witcher 3 for Fun
GTA5 to test the cards limitations


----------



## Agent Smith1984

Quote:


> Originally Posted by *Thoth420*
> 
> Witcher 3 for Fun
> GTA5 to test the cards limitations


Hmmm, I'd have thought the opposite.


----------



## Arizonian

@Agent Smith1984

Sorry if I missed it in the thread, have you tried the new Trixx which now supports over voltage on Fury cards? Curious if you tried this and if it helped bump your OC?

I know you weren't satisfied with your card, wondering if this helped any?

http://www.overclock.net/t/1580885/sapphiretrixx-trixx-now-supports-over-voltage-on-fury-cards


----------



## Agent Smith1984

Quote:


> Originally Posted by *Arizonian*
> 
> @Agent Smith1984
> 
> Sorry if I missed it in the thread, have you tried the new Trixx which now supports over voltage on Fury cards? Curious if you tried this and if it helped bump your OC?
> 
> I know you weren't satisfied with your card, wondering if this helped any?
> 
> http://www.overclock.net/t/1580885/sapphiretrixx-trixx-now-supports-over-voltage-on-fury-cards


To be honest, I have been so slammed with trying to move that I haven't even turned my box on since last Friday









I am however, firing her up tonight, and pushing it to the limit. I am even going to go as far as attempting to hex edit Trixx to get over 75mv if I'm still not happy.

I will post results, and throw you in a mention also, the minute I have her settled in.


----------



## The Mac

Quote:


> Originally Posted by *Agent Smith1984*
> 
> hmmm, I'd of thought the opposite.


I have all of those; Witcher 3 by far has the richest story, characters, quests and immersion.

It's one of those games where you think you're just going to finish a quick quest you started, and then 10 hours later you realize it's 5 AM.

The visuals are stunning.

It is by far the best game I have ever played.

And I've been gaming since the days of the Atari 2600.


----------



## mRYellow

W3 and GTA V are the best games this year. You can't go wrong with either.
If you like RPGs and fantasy/magic then I'd say W3, but it's so hard, almost unfair, to choose.


----------



## The Mac

DAI is also great if you dont have that.


----------



## sugarhell

I think that Witcher 3 is not that good. It feels big and empty. And the characters are cliché. Also some animations look bad.

For me GTA, especially if you have friends to play online, is one of the best games. Almost nothing wrong with the game.


----------



## The Mac

Quote:


> Originally Posted by *sugarhell*
> 
> I think that witcher 3 is not that good. It feels big and empty. And the characters are cliche. Also some animations looks bad.












You're cray-cray...


----------



## p4inkill3r

Quote:


> Originally Posted by *sugarhell*
> 
> For me gta especially if you have friends to play online is one of the best games. Almost nothing wrong with the game


Even offline, though, it's great.
I'll sometimes get on and just cruise the highways with the radio on.


----------



## sugarhell

Quote:


> Originally Posted by *The Mac*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> youre cray-cray...


It is a good game but far from perfect or really good.


----------



## SuperZan

Quote:


> Originally Posted by *sugarhell*
> 
> It is a good game but far from perfect or really good.


Ya, I enjoyed my playthrough and it was certainly a nice game but GTA V keeps me consistently entertained. Sorting the graphics options and watching your VRAM consumption rise astronomically is also quite fun.


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My lil bro just got his 8core xeon rig going with 32gb and a 980ti classified he bought on craigslist for $480... That kid always always has to one-up me damn it! LOL
> 
> He's boosting over 1500mhz on core, what a beast...


I am sure you can give him a run. We can use Trixx, and we can modify it through a hex editor to give us as many volts as we want. I am up to +102 trying to get 1200 core to pass at least one run of Fire Strike.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> I am sure you can give him a run. We can use trixx and we can modify through a hex editor to give us as many volts as we want. I am up to +102 trying to get 1200 core to pass at least one run on Firestrike.


Well, I guess I'm the only person who can't use the new Trixx. No matter what settings I try to change, I get hard locks....

With both the XFX and Sapphire BIOSes...

Anyone else unable to make adjustments without locking up?

AB works fine...

I try to even add 1MHz with Trixx, and the screen flashes and the system locks.


----------



## The Mac

Quote:


> Originally Posted by *sugarhell*
> 
> It is a good game but far from perfect or really good.


I guess I have the same problem with gtav.

If I wanted to run around Los Angeles and interact with thugs, I could do that irl.

Lol


----------



## p4inkill3r

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, i guess I'm the only person who can't use the new trixx. No matter what settings i try to change, i get hard locks....
> 
> With both xfx and sapphire bios...
> 
> Anyone else unable to make adjustments without locking up?
> 
> Ab works fine...
> 
> I try to even add 1mhz with trixx, and the screen flashes and system locks.


Are you using a custom BIOS or something?


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> Are you using a custom BIOS or something?


Tried stock xfx bios, and stock sapphire bios... Nothing custom at all


----------



## Otterfluff

Damn that hex edit for trixx was easy. I now have 150mv to play with.


----------



## The Mac

Watch your VRMs; they will start to throttle after 95C.


----------



## Otterfluff

Custom loop; I won't ever worry about my VRMs. I have reached 1.4V using my hard mods already; this just makes it far easier and more stable.


----------



## Jflisk

Thanks Agent Smith and Alastair for helping me get my BIOS flashed. I used the Sapphire BIOS ending in .66 on my Fury X and it took it. I pulled the BIOS from my other Fury X ending in .64, and my other card didn't take it with BIOS .63. Also UEFI compliant now; might have to do the second one.


----------



## kayan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Vote please....
> 
> Buying a new game today to try and really stretch the legs on this Fury
> 
> 1) Fallout 4
> 
> 2) Witcher 3
> 
> 3) GTA V
> 
> 4) Shadow of Mordor
> 
> Numbers or names, or both, are fine


I'm kind of late to the party, but W3 runs beautifully at 3440x1440 maxed on the Fury X. GTA does not, only like 20fps, hah. Fallout is easy to run on a 290X. Battlefront is a gorgeous game, but not as demanding as W3 or GTA V.

TL;DR: Get all three! That's my vote. (But if I only have one vote, it's Witcher 3.)


----------



## Agent Smith1984

Soooo...

Anyone with XFX Fury Triple D or PowerCOlor PCS Fury tried Trixx yet?

I get hard locks sometimes just from opening the program, and other times when I hit the apply button, regardless of what I am applying.... this is driving me CRRAAAZZYYYY


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Soooo...
> 
> Anyone with XFX Fury Triple D or PowerCOlor PCS Fury tried Trixx yet?
> 
> I get hard locks sometimes just from opening the program, and other times when I hit the apply button, regardless of what I am applying.... this is driving me CRRAAAZZYYYY


Have you redone your drivers? That surely does not sound normal at all.


----------



## Semel

Guys, is it normal that the real performance boost is less than the stock/overclocked clock difference?

For instance

Tomb Raider, 1920x1080, ultimate preset, motion blur off

stock(1000\500)
72
92.4
120

oc (1140\550)
82
104.8
130

3dmark firestrike

stock http://www.3dmark.com/3dm/9362658
oc http://www.3dmark.com/3dm/9362845

I've overclocked my core by 14% but I only get an 8-10% performance increase in games/benchmarks (it depends on the game/benchmark).


----------



## Fastvedub

That is a nice increase in FPS for that little bit of overclock, IMO. I don't own a Fury, but from all my years of GPU overclocking I would say that's a nice performance gain for such a clock increase.


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> Guys, is it normal that real performance boost is less than stock\overclocked difference?
> 
> For instance
> 
> tomb raider 1920х1080 ultimate preset, motion blur off
> 
> stock(1000\500)
> 72
> 92.4
> 120
> 
> oc (1140\550)
> 82
> 104.8
> 130
> 
> 3dmark firestrike
> 
> stock http://www.3dmark.com/3dm/9362658
> oc http://www.3dmark.com/3dm/9362845
> 
> I've overclocked my core by 14% but I get only 8-10% performance increase in games\benchmarks.(it depends on a game\benchmark)


You overclocked your core by 14% but your memory overclock is lower. You need to keep the same ratio between core and memory to get a linear increase. An app that leans more heavily on the shaders will probably scale better than 10%.
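As a quick sanity check of those percentages, here is the arithmetic using the Tomb Raider clocks and average FPS quoted above (nothing assumed beyond those numbers):

```python
# Percent gains for the overclock discussed above: 1000 -> 1140 core,
# 500 -> 550 memory, and the reported 92.4 -> 104.8 average FPS.
def pct_gain(stock: float, oc: float) -> float:
    return (oc - stock) / stock * 100


core_gain = pct_gain(1000, 1140)   # 14.0% core overclock
mem_gain = pct_gain(500, 550)      # 10.0% memory overclock
fps_gain = pct_gain(92.4, 104.8)   # average-FPS gain, roughly 13%

# The FPS gain lands between the memory and core gains, which is what
# you'd expect if the workload is partly bandwidth-bound.
assert mem_gain < fps_gain < core_gain
```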


----------



## mRYellow

Quote:


> Originally Posted by *Alastair*
> 
> I am sure you can give him a run. We can use trixx and we can modify through a hex editor to give us as many volts as we want. I am up to +102 trying to get 1200 core to pass at least one run on Firestrike.


How do you increase trixx voltage limits?


----------



## Alastair

Quote:


> Originally Posted by *mRYellow*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I am sure you can give him a run. We can use trixx and we can modify through a hex editor to give us as many volts as we want. I am up to +102 trying to get 1200 core to pass at least one run on Firestrike.
> 
> 
> 
> How do you increase trixx voltage limits?

This particular edit works for Fury cards. AFAIK there are other limits (max voltage allowed) set in Trixx for other cards that need to be edited.

- Make a backup of your trixx.exe, just in case.

- Download Notepad++ and install the hex plugin.

- Open trixx.exe in Notepad++.

- Press Ctrl+Alt+Shift+H to switch to hex view.

- Press Ctrl+F and select "Data Type" Unicode.

- Copy/paste VDDCOffsetMax=0.075 into the search field and change 0.075 to what you need.

- Save & exit.
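For anyone who would rather script that than click through Notepad++, here is a minimal Python sketch of the same edit. The file path and the new 0.150 limit are assumptions; the key detail is that the value is stored as a UTF-16 string inside the executable, so the replacement must be exactly the same length as the original or the file gets corrupted:

```python
# Hypothetical sketch of the manual hex edit described above: patch the
# UTF-16LE string "VDDCOffsetMax=0.075" inside a copy of trixx.exe.
import shutil

OLD = "VDDCOffsetMax=0.075".encode("utf-16-le")
NEW = "VDDCOffsetMax=0.150".encode("utf-16-le")  # same length as OLD
assert len(OLD) == len(NEW), "replacement must not change the file size"


def patch(path: str) -> bool:
    """Back up the file, then replace the offset-limit string in place.

    Returns True if the string was found and patched, False otherwise.
    """
    shutil.copy2(path, path + ".bak")  # keep a backup, as the steps advise
    with open(path, "rb") as f:
        data = f.read()
    if OLD not in data:
        return False  # string not found; file left unchanged
    with open(path, "wb") as f:
        f.write(data.replace(OLD, NEW))
    return True
```

Run it against a copy first; a .bak backup is written next to the original either way.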


----------



## sugarhell

Quote:


> Originally Posted by *mRYellow*
> 
> How do you increase trixx voltage limits?


Open trixx.exe with Notepad++ and the hex editor plugin. Press Ctrl+Alt+Shift+H to go to hex view.

Then Ctrl+F, choose Unicode as the data type, and search for VDDCOffsetMax=0.075. Change the 0.075 to whatever you want.


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> You overclock your core by 14% but your memory overclock is lower. You need to keep the same ratio between core and memory so you can have a linear increase. Probably in some other app that it needs more your shaders to perform better than 10%


I've overclocked the memory to 570MHz but I still get less than a 14% performance boost.

http://www.3dmark.com/3dm/9363523

I guess my Fury doesn't scale well past a certain core/memory clock. Seeing as I needed a whopping +36mV just to make it work at 1150MHz (that is, +10MHz, lol), it is quite possible.


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> I've overclocked memory to 570 MHz but I still get less than a 14% performance boost:
> 
> http://www.3dmark.com/3dm/9363523
> 
> I guess my fury doesn't scale well past XXX core/memory clock. Seeing as I needed a whopping +36 mV just to make it work at 1150 MHz (that is, +10 MHz more), lol, it is quite possible.


I have seen better scores than yours at around the same clocks, so maybe you are not 100% stable or your system is slowing you down.

What are your VRM temps?


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> I have seen better scores than yours around the same clocks so maybe you are not 100% stable or your system slows you down.
> 
> Your vrm's temps?


I dunno. Software can't detect it, and I don't have tools to monitor them. My GPU temp never exceeds 55C; usually it's 52-53C.

I kept logging while running benchmarks and I didn't see any throttling. Core/memory clocks were stable.


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> I dunno. Software can't detect it, and I don't have tools to monitor them. My GPU temp never exceeds 55C; usually it's 52-53C.
> 
> I kept logging while running benchmarks and I didn't see any throttling. Core/memory clocks were stable.


Just because you don't crash doesn't mean it is stable.


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Just because you don't crash doesn't mean it is stable.


No crashing, no artifacts, no throttling. What else do I need to look for, huh? You tell me and I'll check it.

PS: If the VRMs were getting too hot, the card would start throttling.

PPS: The results I got on my system seem to be perfectly OK (I looked at CPU clock, GPU core/memory clock and graphics score):

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1419/1051/500000?minScore=0&gpuName=AMD%20Radeon%20R9%20Fury&cpuName=Intel%20Core%20i7-3770K%20Processor

So it's not a stability "issue". And I have no doubt that I would get a moderate increase in my score if I had a newer CPU, for instance.

PPPS
Quote:


> As you can see, Fiji scales nearly linearly with voltage, and *performance follows at roughly half the clock increase rate*.


https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/2.html

Yeah, I was getting the same 8-10% performance increase. It's roughly half the clock increase.

*Fury is the real issue*. Its performance scaling sucks.
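The "half the clock increase" rule of thumb from the TechPowerUp quote is easy to sanity-check with arithmetic. A quick sketch (the 1000 MHz stock clock and the 0.5 factor are assumptions taken from the quote, not measurements):

```python
def clock_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Fractional clock increase, e.g. 1000 -> 1140 MHz is 0.14 (14%)."""
    return (oc_mhz - stock_mhz) / stock_mhz


def expected_perf_gain(stock_mhz: float, oc_mhz: float,
                       scaling: float = 0.5) -> float:
    """Expected performance gain under the 'half the clock rate' rule.

    scaling=0.5 reproduces the TechPowerUp observation; scaling=1.0
    would be perfectly linear scaling.
    """
    return scaling * clock_gain(stock_mhz, oc_mhz)


# A 14% core overclock (1000 -> 1140 MHz) then predicts roughly 7% more
# performance, in the same ballpark as the 8-10% reported above.
print(f"{clock_gain(1000, 1140):.0%} clock -> "
      f"~{expected_perf_gain(1000, 1140):.0%} expected performance")
```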


----------



## Alastair

Quote:


> Originally Posted by *Semel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Just because you don't crash doesn't mean it is stable.
> 
> 
> 
> No crashing, no artifacts, no throttling. What else do I need to look for, huh? You tell me and I'll check it.
> 
> PS: If the VRMs were getting too hot, the card would start throttling.
> 
> PPS: The results I got on my system seem to be perfectly OK (I looked at CPU clock, GPU core/memory clock and graphics score):
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1419/1051/500000?minScore=0&gpuName=AMD%20Radeon%20R9%20Fury&cpuName=Intel%20Core%20i7-3770K%20Processor
> 
> So it's not a stability "issue". And I have no doubt that I would get a moderate increase in my score if I had a newer CPU, for instance.
> 
> PPPS
> Quote:
> 
> 
> 
> As you can see, Fiji scales nearly linearly with voltage, and *performance follows at roughly half the clock increase rate*.
> 
> Click to expand...
> 
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/2.html
> 
> Yeah, I was getting same 8-10% performance increase. It's roughly half the clock increase.
> 
> *Fury is the real issue*. it's performance scaling sucks.
Click to expand...

Getting 10%-12% scaling from a 14% overclock is normal with any card. I overclocked my 6850's by over 35%, but the most I managed out of them was about a 30% performance boost. It is normal. Stop looking a gift horse in the mouth and actually be thankful you are getting as close to linear scaling as you are. The near-linear scaling you are getting is some of the best I have seen in a long while, so stop complaining. Jeezuz.


----------



## Semel

Quote:


> Originally Posted by *Alastair*
> 
> be thankful .


I'm gonna be thankful when AMD gets its act together and releases something like the 980 Ti in terms of performance and overclocking capabilities.







Winter 2016 is coming so let's hope for the best..


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> No crashing, no artifacts, no throttling. What else do I need to look for, huh? You tell me and I'll check it.
> 
> PS: If the VRMs were getting too hot, the card would start throttling.
> 
> PPS: The results I got on my system seem to be perfectly OK (I looked at CPU clock, GPU core/memory clock and graphics score):
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1419/1051/500000?minScore=0&gpuName=AMD%20Radeon%20R9%20Fury&cpuName=Intel%20Core%20i7-3770K%20Processor
> 
> So it's not a stability "issue". And I have no doubt that I would get a moderate increase in my score if I had a newer CPU, for instance.
> 
> PPPS
> https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/2.html
> 
> Yeah, I was getting the same 8-10% performance increase. It's roughly half the clock increase.
> 
> *Fury is the real issue*. Its performance scaling sucks.


Ok, now you need a proper explanation after posting that article again.

This article is BS. The guy is trying to overclock with 90C+ VRMs, and then he increases the volts and complains about the efficiency.

The VRMs lose efficiency after 70C. With stock volts you will not get throttling or lose efficiency even if you pass 70C. But when you overclock and are not 100% stable with VRMs over 70C, you lose efficiency.

If the VRMs are losing efficiency, that means your core or your memory gets the wrong VDDC.

Here's a result from a Fury Tri-X with a lot better VRM cooling than the Fury X: 1190/570.



Don't judge a situation when you don't have the proper knowledge. Fury scales with clocks the same as the rest of the GCN GPUs.


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Here a result from a fury tri-x with a lot better vrm cooling than fury x. 1190/570.


1190 vs my 1140. Yeah, I would get the same graphics score if I could push it to 1190. What's your point?

So you are saying that I got this performance scaling (8-10% per 14% overclock) because of my VRMs? Could you please provide a source where I could see better performance scaling than mine on a Fury?


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> 1190 vs mine 1140. Yeah I would get the same graphics score results if I could push it to 1190. Whats your point?


I thought you had a Fury X, lol.


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> I thought you had a Fury X, lol.


No worries. It happens


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> No worries. It happens


Still.

Bad scaling vs good

http://www.3dmark.com/fs/5749459

http://www.3dmark.com/fs/6104547


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Still.
> 
> Bad scaling vs good
> 
> http://www.3dmark.com/fs/5749459
> 
> http://www.3dmark.com/fs/6104547


Do you think I should try lowering the core clock/voltage to see if it performs better, or if the performance/overclocking scaling improves?


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> Do you think I should try lowering the core clock/voltage to see if it performs better, or if the performance/overclocking scaling improves?


Try the old trick: 100% fan speed @ idle, then run 3DMark again.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> Have you redone your drivers? That surely does not sound normal at all.


I'm beginning to think that XFX/PC has done something to hard-lock access to my card's voltage control.

I've uninstalled CCC, wiped drivers, uninstalled Trixx, uninstalled AB/RivaTuner, reinstalled drivers, then installed a fresh Trixx 5.2.1, and tested with both the stock Sapphire and the stock XFX BIOS.

I can open Trixx and see the voltage slider (as well as all the others), but if I move it, or the power limit, fans, or any clocks, and click apply, I get a quick flash of the screen and everything freezes.


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Try the old trick: 100% fan speed @ idle, then run 3DMark again.


Here you go

OCed : 100% fan speed (Maximum GPU temp recorded was 41C)
http://www.3dmark.com/3dm/9365256

OCed : 0-50C->0-50%, 50C+-> 60% fan speed fan curve
http://www.3dmark.com/3dm/9362845

Stock: 0-50C->0-50%, 50C+-> 60% fan speed fan curve
http://www.3dmark.com/3dm/9362658

I think the difference is within the margin of error.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> power limit, or fans. Or any clocks...


Can you change them in afterburner?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Semel*
> 
> Here you go
> 
> OCed : 100% fan speed (Maximum GPU temp recorded was 41C)
> http://www.3dmark.com/3dm/9365256
> 
> OCed : 0-50C->0-50%, 50C+-> 60% fan speed fan curve
> http://www.3dmark.com/3dm/9362845
> 
> Stock: 0-50C->0-50%, 50C+-> 60% fan speed fan curve
> http://www.3dmark.com/3dm/9362658
> 
> I think the difference is within the margin of error.
> Can you change them in afterburner?


AB works PERFECTLY


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm beginning to think that XFX/PC has done something to hard-lock access to my card's voltage control.
> 
> I've uninstalled CCC, wiped drivers, uninstalled Trixx, uninstalled AB/RivaTuner, reinstalled drivers, then installed a fresh Trixx 5.2.1, and tested with both the stock Sapphire and the stock XFX BIOS.
> 
> I can open Trixx and see the voltage slider (as well as all the others), but if I move it, or the power limit, fans, or any clocks, and click apply, I get a quick flash of the screen and everything freezes.


Are you sure ULPS is off? I've noticed a bug with AB; I had to use Trixx to actually get it to stay off. That can cause the issue you describe.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Are you sure ULPS is off? I've noticed a bug with AB; I had to use Trixx to actually get it to stay off. That can cause the issue you describe.


If I try to use Trixx to turn it off, it locks up though, lol.

I did try to disable it in AB and it says it's disabled, but who knows?


----------



## Semel

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If I try to use Trixx to turn it off, it locks up though, lol.
> 
> I did try to disable it in AB and it says it's disabled, but who knows?


Open regedit as admin, search for EnableUlps, and set every instance you find to 0.
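For anyone who prefers to script that registry edit, here is a hedged Python sketch. The pure helper holds the logic; the Windows-only part walks the standard display-adapter class key (the `{4d36e968-...}` GUID) with the stdlib `winreg` module and must be run from an elevated prompt, just like regedit. The extra `EnableUlps_NA` value name is an assumption (some driver versions reportedly use it), so verify against your own registry.

```python
# Case-insensitive names of the ULPS flags to zero. "EnableUlps" is the one
# the post above refers to; "EnableUlps_NA" is an assumption (reportedly
# present on some driver versions).
ULPS_NAMES = {"enableulps", "enableulps_na"}


def zero_ulps_values(values: dict) -> dict:
    """Return a copy of a {value_name: data} mapping with every ULPS flag
    forced to 0; all other values are left untouched."""
    return {name: (0 if name.lower() in ULPS_NAMES else data)
            for name, data in values.items()}


def disable_ulps_windows() -> None:
    """Windows-only: zero EnableUlps under every display-class subkey.
    Needs an elevated (admin) prompt, same as editing HKLM in regedit."""
    import winreg  # stdlib, Windows only
    base = (r"SYSTEM\CurrentControlSet\Control\Class"
            r"\{4d36e968-e325-11ce-bfc1-08002be10318}")  # display adapters
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as cls:
        index = 0
        while True:
            try:
                sub = winreg.EnumKey(cls, index)
            except OSError:
                break  # no more subkeys
            index += 1
            try:
                with winreg.OpenKey(
                        cls, sub, 0,
                        winreg.KEY_QUERY_VALUE | winreg.KEY_SET_VALUE) as key:
                    for name in ("EnableUlps", "EnableUlps_NA"):
                        try:
                            winreg.QueryValueEx(key, name)  # present here?
                            winreg.SetValueEx(key, name, 0,
                                              winreg.REG_DWORD, 0)
                        except OSError:
                            pass  # value absent in this subkey
            except OSError:
                continue  # access denied or transient subkey


# The pure logic, demonstrated without touching the registry:
assert zero_ulps_values({"EnableUlps": 1, "DriverDesc": "x"}) == \
       {"EnableUlps": 0, "DriverDesc": "x"}
```

Reboot (or at least restart the driver) after the change, since ULPS is read when the driver initializes.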


----------



## Jflisk

ULPS configuration utility

Information found here - Run as admin
http://www.overclock.net/t/1088266/ulps-gui-config-utility-enable-disable


----------



## josephimports

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Soooo...
> 
> Anyone with XFX Fury Triple D or PowerColor PCS Fury tried Trixx yet?
> 
> I get hard locks sometimes just from opening the program, and other times when I hit the apply button, regardless of what I am applying.... this is driving me CRRAAAZZYYYY


Jeez, first the locked shaders and now this?









Have you tried placing it in a different PCIe slot? Maybe in another PC? I wish you luck in getting it sorted out.


----------



## Semel

Hmm, this is strange. I was playing a game for a while and running HWiNFO monitoring. It showed that I had drops to 760 MHz (although I didn't notice it fps-wise at all).

However, the Trixx log revealed that I didn't have any drops. It was a stable 1140 MHz all the way. I wonder which tool was "lying".









I wish Afterburner had voltage support. I really want to see what Afterburner would report.

PS: The GPU-Z log doesn't show any core clock drops either. I guess it's HWiNFO's fault.


----------



## kayan

A run of both Firestrike Extreme and Ultra comparing a 295x2 and a Fury X, both at stock clocks. Same exact system in both runs.

Extreme:
http://www.3dmark.com/compare/fs/6523084/fs/6548220
Ultra:
http://www.3dmark.com/compare/fs/6523108/fs/6548653

Everything is at stock, except my CPU, which is at 4.5ghz. Also posted in 295x2 owner's club.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> Hmm.. this is strange..I was playing a game for a while and running hwinfo monitoring. It showed that I had drops to 760Mhz(although I didn't notice it fps wise at all)
> 
> However, the Trixx log revealed that I didn't have any drops. It was a stable 1140 MHz all the way. I wonder which tool was "lying".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wish afterburner had voltage support. I really want to see what afterburner would report
> 
> PS GPU-Z log doesn't show any core clock drops either.. I guess it's hwinfo's fault


GPU-Z has never falsely reported for me but I can in fact confirm that Afterburner is certified garbage at reporting GPU clocks throttling even when they aren't and in some cases its Afterburner actually causing the clock throttling when it is happening. I never use HWinfo so I can't say with any certainty there.


----------



## Semel

Quote:


> Originally Posted by *xer0h0ur*
> 
> in some cases its Afterburner actually causing the clock throttling when it is happening.


lol yeah, this sounds nasty. Has it ever been reported?


----------



## Gamedaz

Quote:


> Originally Posted by *sugarhell*
> 
> Ok, now you need a proper explanation after posting that article again.
> 
> This article is BS. The guy is trying to overclock with 90C+ VRMs, and then he increases the volts and complains about the efficiency.
> 
> The VRMs lose efficiency after 70C. With stock volts you will not get throttling or lose efficiency even if you pass 70C. But when you overclock and are not 100% stable with VRMs over 70C, you lose efficiency.
> 
> If the VRMs are losing efficiency, that means your core or your memory gets the wrong VDDC.
> 
> Here's a result from a Fury Tri-X with a lot better VRM cooling than the Fury X: 1190/570.
> 
> 
> 
> Don't judge a situation when you don't have the proper knowledge. Fury scales with clocks the same as the rest of the GCN GPUs.


* Those are some decent numbers; they should improve a lot of issues at those clocks. If those VRMs are being cooled properly by the Fury's heat fins, then I congratulate AMD for finding a way to keep the VRMs efficient and cool on a standard PCB. My Gainward Phantom 780 Ti had a molded aluminum plate flush-mounted to the PCB that helped draw heat away from its 10 VRMs, but the Fury's triple-fan design seems to hold up at higher clocks. I'm still waiting to see if AMD's new drivers will allow proper core clocking. It basically comes down to fan speeds: AMD could in theory allow the card to clock to 1150-1190 as long as the fan speeds are set properly (i.e. 100%), with the BIOS rewritten to allow voltages to exceed 1.2 V via finely written code that references fan speeds and temps before overvolting. It could be worth the time to allow it on these cards; they can easily keep temps in the 70C range without breaking a sweat, assuming you have the right fan support to keep them that cool.

* How difficult would it be to write code where the BIOS can adjust voltage according to the AMD driver's request of the GPU BIOS, i.e. an OC?

They sell OC as a brand, so why not invest in something that improves the brand?


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> lol yeah, this sounds nasty. Has it it ever been reported?


I can only speak for myself, and the times it has happened to me I never reported it to MSI. I just uninstall Afterburner without remembering settings, then re-install it. I have lost count of the number of people who have had problems with Afterburner's clock reporting. I have had versions of Afterburner that were rock solid and never gave me issues, but the last two iterations have been buggy for me.


----------



## Semel

I've been playing with my Sapphire Fury Tri-X for the last 45 or so minutes, testing whether I can push it any further. Here is my summary:

1)
Core clock: 1140 MHz
HBM: 570 MHz
Voltage: +72 mV
Power limit: 50%

Everything is stable: no throttling, no artifacts, a positive performance gain.

2)
Core clock: 1150 MHz
HBM: 570 MHz
Voltage: +108 mV
Power limit: 50%

Everything is stable: no throttling, no artifacts, STILL a positive performance gain.

However, seeing that to get to 1150 I had to add a whopping +36 mV for just a 10 MHz core clock increase is probably a sign that going any further might be unwise.

And, as it turns out, it is:

1160 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).
1170 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).
1180 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).

Everything in the 1160-1180 MHz range (the absolute maximum I could get without going above 1.35 V) yielded the SAME (in numbers) negative performance gain. Changing the voltage/power limit didn't have any effect.

VRM temps seemed to be fine. I used the trick with 100% fan speed at idle; Sapphire's cooling system is a beast.

Btw, in my case it didn't matter overclocking-wise whether I had 3584 (default) or 3840 (unlocked) stream processors.

I hope these observations prove useful/interesting to someone.
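The knee in that voltage/frequency curve can be put in numbers. A small sketch of the marginal-voltage arithmetic from the summary above:

```python
def mv_per_mhz(clk1_mhz: float, mv1: float,
               clk2_mhz: float, mv2: float) -> float:
    """Marginal voltage cost (mV per MHz) between two stable points."""
    return (mv2 - mv1) / (clk2_mhz - clk1_mhz)


# From the summary: 1140 MHz needed +72 mV and 1150 MHz needed +108 mV,
# so the last 10 MHz cost 3.6 mV each -- a steep knee that usually marks
# the end of useful scaling on a given chip.
cost = mv_per_mhz(1140, 72, 1150, 108)
print(f"{cost:.1f} mV per extra MHz")
```

When that number jumps by an order of magnitude between steps, further clock gains tend to cost more heat and power than they return in performance, which matches the negative gains seen at 1160-1180 MHz.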


----------



## Gamedaz

1140 MHz stable is what is needed. Peaking above 1200 might not yield consistent enough results @ 1080p (and possibly in a lot of games). If it can lock onto the frequency when it throttles up (for whatever game needs it), then that is a fine OC, given that its 28nm die is IMO already a major heat point. Triple-fan solutions seem to have the best cooling performance IMO, although it would be interesting to see some real benchmarks somewhere; the Sapphire has been tested, but the XFX and PowerColor triple-fan solutions have yet to be tested online anywhere.

* I can't wait to OC my XFX and see how it responds. Although I've heard that the new AMD drivers might require removing the drivers in iGPU mode, then reinstalling after; that's what I've heard with the AMD beta 15.11 drivers. Pain in the **** to pull the card out and reinstall it into my case.


----------



## battleaxe

Quote:


> Originally Posted by *Gamedaz*
> 
> 1140 MHz stable is what is needed. Peaking above 1200 might not yield consistent enough results @ 1080p (and possibly in a lot of games). If it can lock onto the frequency when it throttles up (for whatever game needs it), then that is a fine OC, given that its 28nm die is IMO already a major heat point. Triple-fan solutions seem to have the best cooling performance IMO, although it would be interesting to see some real benchmarks somewhere; the Sapphire has been tested, but the XFX and PowerColor triple-fan solutions have yet to be tested online anywhere.
> 
> * I can't wait to OC my XFX and see how it responds. Although I've heard that the new AMD drivers might require removing the drivers in iGPU mode, then reinstalling after; that's what I've heard with the AMD beta 15.11 drivers. Pain in the **** to pull the card out and reinstall it into my case.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Semel*
> 
> I've been playing with my Sapphire Fury Tri-X for the last 45 or so minutes, testing whether I can push it any further. Here is my summary:
> 
> 1)
> Core clock: 1140 MHz
> HBM: 570 MHz
> Voltage: +72 mV
> Power limit: 50%
> 
> Everything is stable: no throttling, no artifacts, a positive performance gain.
> 
> 2)
> Core clock: 1150 MHz
> HBM: 570 MHz
> Voltage: +108 mV
> Power limit: 50%
> 
> Everything is stable: no throttling, no artifacts, STILL a positive performance gain.
> 
> However, seeing that to get to 1150 I had to add a whopping +36 mV for just a 10 MHz core clock increase is probably a sign that going any further might be unwise.
> 
> And, as it turns out, it is:
> 
> 1160 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).
> 1170 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).
> 1180 MHz - everything stable, no throttling or artifacts, NEGATIVE performance gain (stock-clock performance).
> 
> Everything in the 1160-1180 MHz range (the absolute maximum I could get without going above 1.35 V) yielded the SAME (in numbers) negative performance gain. Changing the voltage/power limit didn't have any effect.
> 
> VRM temps seemed to be fine. I used the trick with 100% fan speed at idle; Sapphire's cooling system is a beast.
> 
> Btw, in my case it didn't matter overclocking-wise whether I had 3584 (default) or 3840 (unlocked) stream processors.
> 
> I hope these observations prove useful/interesting to someone.


'Tis the nature of current AMD silicon, brother... the 390's "Grenada" and Vishera "E-series" behave the same way!

They'll let you know real quick when they're done!


----------



## Unwinder

Quote:


> Originally Posted by *xer0h0ur*
> 
> GPU-Z has never falsely reported for me but I can in fact confirm that Afterburner is certified garbage at reporting GPU clocks throttling even when they aren't and in some cases its Afterburner actually causing the clock throttling when it is happening. I never use HWinfo so I can't say with any certainty there.


It's such a rare case that I have a chance to speak to a certified GPU programming guru in person. You seem to have rich experience developing GPU monitoring tools, and to know enough about the internals of each of them to detect "certified garbage" applications. I'm really wondering why no vendor has hired you yet to create software for them to replace those terribly broken tools.
Now, can I please ask you to provide sample source code reading the "right" GPU clocks in realtime, to prove that the AB or HwInfo mentioned above show certified garbage? From my side, I see no problem creating and showing here an open-source application reading the same ADL GPU clock activity sensor as displayed on the GPU clock graphs in AB, CCC, HwInfo or virtually any other tool relying on ADL. I have no doubt that such an experienced person as you knows what ADL is. I'm afraid you have to do it to prove that you're not a certified liar, sir.


----------



## MerkageTurk

Hi fellow fury x users

My Fury X overclocks to 1140 with no voltage increase. Should I increase the core clock a bit more?

Power limit: +50

Plus, the image quality is far superior to my 980 Ti; you have to play it to see how it is.

Firestrike score is around 15k


----------



## Gumbi

Quote:


> Originally Posted by *Semel*
> 
> Meh... I still can't get even 1100 not to throttle sometimes in games/benchmarks. I tried changing the power limit, I tried increasing and decreasing +mV. My core temps are in check (50-55C max). I even checked how much power my PC was using, and it was lower than what my PSU can deliver.


What PSU do you have, and what are your other components?
Quote:


> Originally Posted by *MerkageTurk*
> 
> Hi fellow fury x users
> 
> My Fury X overclocks to 1140 with no voltage increase. Should I increase the core clock a bit more?
> 
> Power limit: +50
> 
> Plus, the image quality is far superior to my 980 Ti; you have to play it to see how it is.
> 
> Firestrike score is around 15k


Yes! That's an excellent stock volts overclock.


----------



## waltercaorle

Hello guys,
Fury Tri-X:
1130/570 @ stock voltage
1180/570 +72 mV: very good performance; whether PL is 0 or +50, nothing changes
1190/570 or 1200/570: bad performance
1210: crash at any voltage


----------



## Semel

Quote:


> Originally Posted by *Unwinder*
> 
> reading the same ADL GPU clock activity sensor as displayed on GPU clock graphs in AB, CCC, HwInfo .


Could you please explain how come HWiNFO shows the core clock dropping even at stock clocks (it happens much less than when the core is OCed), whereas GPU-Z, Trixx and Afterburner report that everything is stable even up to 1180 MHz with proper settings (disregarding the negative performance gain past 1150 MHz)? The scan interval is 1 second, I think. If out of 4 tools only one shows strange results, then I think it is safe to assume there is something wrong with that particular tool. I really like HWiNFO, but it proves to be unreliable when it comes to monitoring my Fury.
Quote:


> Originally Posted by *Gumbi*
> 
> What PSU do you have, and what are your other components?


It turns out there were no problems at all. HWiNFO didn't properly report my core clock (it sometimes showed throttling when there was none), unlike GPU-Z, Trixx and Afterburner.


----------



## Darklyric

So I'm tempted to grab a Fury non-X during the current sale with SWBF, which I was going to buy anyway but didn't really want to wait on a sale... so bonus there, and the pricing on Furys seems right now too.

What are your thoughts on the coolers on the Sapphire Tri-X vs the XFX Triple D?

Also, I know the Sapphire is a reference PCB, but is the XFX also reference? I'm probably going water, so this is a big deal.

Thanks for any thoughts, as I can't find much benchmarking of the XFX cooler.


----------



## Alastair

Quote:


> Originally Posted by *Darklyric*
> 
> So I'm tempted to grab a Fury non-X during the current sale with SWBF, which I was going to buy anyway but didn't really want to wait on a sale... so bonus there, and the pricing on Furys seems right now too.
> 
> What are your thoughts on the coolers on the Sapphire Tri-X vs the XFX Triple D?
> 
> Also, I know the Sapphire is a reference PCB, but is the XFX also reference? I'm probably going water, so this is a big deal.
> 
> Thanks for any thoughts, as I can't find much benchmarking of the XFX cooler.


Both XFX and Sapphire use reference PCBs. The only non-reference card out at the moment is the Asus Strix. I know the PowerColor Fury is also reference, but I haven't seen it yet.


----------



## Thoth420

Being just an average user, I have to say MSI AB installed while swapping a driver has caused issues for me quite a few times. Nothing Bradley's guide couldn't solve, and in the end I found that if I uninstall AB and its profiles (as well as RTSS), swap drivers, and then reinstall, the chance of an issue is 0% (so far).

I have also run into similar issues with EVGA Precision, but AFAIK they are basically similar software that both use RTSS.

Just saying, from my experience: problems sometimes pop up for me when those programs are installed, and all I was doing with the software was altering the fan speeds to be more aggressive.

Not trying to bash anything, just giving feedback from what I have experienced.

It has always been single-GPU configs, if that matters (Nvidia and AMD GPUs both had issues).

I find the snarky, elitist response from Unwinder counterproductive, and this same drama happened on the EVGA forums last year-ish: people complained of issues involving the RTSS version in the latest Precision, and the response was very nasty, with threats to quit producing the software. It seems egotistical to be unable to take criticism or bug feedback from end users.

I often hear ASUS and Nvidia play the old "we can't replicate your issue in our environment, so it MUST be user error". There are tons of different configs out there, so discounting issues off the cuff or with lackluster investigation makes my BS gauge pin.

Please take this as just one man's experience and opinion, because that's all it is. I also most likely lack some info to clarify it; my point was to highlight why some people may have bad things to say about it.

If it is user error, something should still be done to protect the reputation of the software. Fear spreads, so even rumors of issues on the web can spread fear, which leads to more negative post parroting. The water gets muddy as the fingers point around... there has got to be a better way to handle these kinds of issues.


----------



## xer0h0ur

Quote:


> Originally Posted by *Unwinder*
> 
> It's such a rare case that I have a chance to speak to a certified GPU programming guru in person. You seem to have rich experience developing GPU monitoring tools, and to know enough about the internals of each of them to detect "certified garbage" applications. I'm really wondering why no vendor has hired you yet to create software for them to replace those terribly broken tools.
> Now, can I please ask you to provide sample source code reading the "right" GPU clocks in realtime, to prove that the AB or HwInfo mentioned above show certified garbage? From my side, I see no problem creating and showing here an open-source application reading the same ADL GPU clock activity sensor as displayed on the GPU clock graphs in AB, CCC, HwInfo or virtually any other tool relying on ADL. I have no doubt that such an experienced person as you knows what ADL is. I'm afraid you have to do it to prove that you're not a certified liar, sir.


Nerve struck, confirmed. I am not here to prove anything to you. I am only relaying my own issues with your software, and I have seen plenty of other users on this forum report various oddball issues when using Afterburner, with clock throttling disappearing after removing it. While I did go to school back in 2001 for computer engineering, I do not work in the field, so I am by no means as experienced a software developer as you are. Even while experiencing issues with your software, note that I still keep using it exclusively as the only overvoltage/overclocking software on my PC. I hold no ill will against you; in fact my beef was with W1zzard and his delay in getting an approved/working version of Trixx out to the public. I am still waiting to see what you manage to get done with your upcoming update to Afterburner.


----------



## The Mac




----------



## fat4l

Quote:


> Originally Posted by *Unwinder*
> 
> It's such a rare case that I have a chance to speak to a certified GPU programming guru in person. You seem to have rich experience developing GPU monitoring tools, and to know enough about the internals of each of them to detect "certified garbage" applications. I'm really wondering why no vendor has hired you yet to create software for them to replace those terribly broken tools.
> Now, can I please ask you to provide sample source code reading the "right" GPU clocks in realtime, to prove that the AB or HwInfo mentioned above show certified garbage? From my side, I see no problem creating and showing here an open-source application reading the same ADL GPU clock activity sensor as displayed on the GPU clock graphs in AB, CCC, HwInfo or virtually any other tool relying on ADL. I have no doubt that such an experienced person as you knows what ADL is. I'm afraid you have to do it to prove that you're not a certified liar, sir.


My 50 cents...
I had a problem with Afterburner and dropping clocks.
This only happened when I enabled FreeSync + CrossFire.
With FreeSync off and CrossFire on, everything was fine.
I had to uninstall AB and reinstall my drivers, and then it worked perfectly.
Games are much smoother and 3DMark results are much better. I tried to install AB again and got the same issue: dropping clocks.
Sorry


----------



## mRYellow

Man, Sapphire Trixx's UI is awful... way too huge. Is there a way to make it smaller?


----------



## Unwinder

Quote:


> Originally Posted by *fat4l*
> 
> My 50 cents...
> I had a problem with Afterburner and dropping clocks.
> This only happened when I enabled FreeSync + CrossFire.
> With FreeSync off and CrossFire on, everything was fine.
> I had to uninstall AB and reinstall my drivers, and then it worked perfectly.
> Games are much smoother and 3DMark results are much better. I tried to install AB again and got the same issue: dropping clocks.
> Sorry


Seeing a performance hit from mixing two different framerate-limiting technologies on the same system has zero relation to the direct statement that _"AB is certified garbage at reporting GPU clocks throttling even when they aren't"_, which I treat as a lie and which I asked you to prove with a simple GPU clock monitoring code sample. Quite the opposite: you're seeing the expected performance hit corresponding to dropped clocks.
I'm wondering how many more unrelated posts I will see here from people who had some difficulty configuring AB.
sorry


----------



## xer0h0ur

So now using a framerate limiter to drop GPU usage is equal to GPU clock throttling? I expected better from you honestly. But by all means keep blaming user error instead of looking into your own software.


----------



## Unwinder

Quote:


> Originally Posted by *xer0h0ur*
> 
> So now using a framerate limiter to drop GPU usage is equal to GPU clock throttling? I expected better from you honestly. But by all means keep blaming user error instead of looking into your own software.


GPU DPM logic is tied to many parameters; GPU load-based clock control aimed at reducing power consumption is one of them. I did expect that you lack the knowledge to prove that AB's clock readings are certified garbage as you declared, but I didn't expect that you haven't got the balls to apologize after being caught on it.
And in case you didn't notice, and considering that you keep insisting I simply don't want to look into my own faulty software, I'm offering you the same very simple solution one more time: I'll create a small clock monitoring app with open source code reproducing MSI AB's clock sensor, and you'll fix the "certified garbage" implementation for everyone. Sounds like a fair deal to me.
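For context on the sensor being argued about: the ADL clock readout that AB, CCC and HWiNFO all rely on is public in AMD's ADL SDK (`ADL_Overdrive5_CurrentActivity_Get` filling an `ADLPMActivity` struct). A minimal Python sketch of that struct and its unit conversion, with the field layout taken from the public ADL headers; the actual DLL call is only indicated in a comment, since it requires `atiadlxx.dll` on a real AMD system:

```python
import ctypes

class ADLPMActivity(ctypes.Structure):
    """Mirrors the ADLPMActivity struct from AMD's public ADL SDK (Overdrive5)."""
    _fields_ = [
        ("iSize", ctypes.c_int),
        ("iEngineClock", ctypes.c_int),      # engine clock, in 10 kHz units
        ("iMemoryClock", ctypes.c_int),      # memory clock, in 10 kHz units
        ("iVddc", ctypes.c_int),             # core voltage, in mV
        ("iActivityPercent", ctypes.c_int),  # GPU load percentage
        ("iCurrentPerformanceLevel", ctypes.c_int),
        ("iCurrentBusSpeed", ctypes.c_int),
        ("iCurrentBusLanes", ctypes.c_int),
        ("iMaximumBusLanes", ctypes.c_int),
        ("iReserved", ctypes.c_int),
    ]

def engine_clock_mhz(activity: ADLPMActivity) -> float:
    """ADL reports clocks in 10 kHz units; convert to MHz."""
    return activity.iEngineClock / 100.0

def memory_clock_mhz(activity: ADLPMActivity) -> float:
    return activity.iMemoryClock / 100.0

# On a real Windows/AMD system you would load atiadlxx.dll and call
# ADL_Overdrive5_CurrentActivity_Get(adapter_index, ctypes.byref(activity)).
# Every tool polling this sensor sees the same raw numbers.
```

The point of the sketch: whatever fluctuation a tool shows on its clock graph is a direct rescaling of `iEngineClock`, so two ADL-based tools cannot disagree about the same sample.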


----------



## sugarhell

Fiji can alter clocks or even shut down shaders to reduce power consumption. If you don't want "clock throttling", disable PowerPlay.
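The load-based behavior described above can be illustrated with a toy model (the state table below is invented for illustration; real PowerPlay/DPM state tables live in the card's VBIOS):

```python
# Hypothetical DPM state table: (minimum load %, engine clock in MHz).
# Real PowerPlay states are defined per-card in the VBIOS, not here.
DPM_STATES = [(0, 300), (30, 700), (60, 1050)]

def select_clock(load_percent: int, powerplay_enabled: bool = True) -> int:
    """Pick the highest DPM state whose load threshold the current load reaches.

    With PowerPlay disabled the driver stops managing states, so the card
    simply stays at the manually set clock (modeled here as the top state).
    """
    if not powerplay_enabled:
        return DPM_STATES[-1][1]
    chosen = DPM_STATES[0][1]
    for threshold, clock in DPM_STATES:
        if load_percent >= threshold:
            chosen = clock
    return chosen

# Light load drops the reported clock. A monitoring graph shows this as
# "throttling", but it is just power management doing its job.
```

Under this model a frame limiter that lowers GPU load to, say, 45% would drop the reported clock to a middle state, which is exactly the "dropping clocks" users see in the graphs.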


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Fiji can alter clocks or even shut down shaders to reduce power consumption. If you don't want "clock throttling", disable PowerPlay.


And how do we do that for Fury cards? AFAIK there is no BIOS editing tool...

P.S. AMD's "frame limiter" in fact throttled my core clock, unlike RivaTuner's frame limiter.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> And how do we do that for Fury cards? AFAIK there is no BIOS editing tool...


You can disable PowerPlay in AB under the 'Settings' tab.


----------



## xer0h0ur

Quote:


> Originally Posted by *Unwinder*
> 
> GPU DPM logic is tied to many parameters; GPU load-based clock control aimed at reducing power consumption is one of them. I did expect that you lack the knowledge to prove that AB's clock readings are certified garbage as you declared, but I didn't expect that you haven't got the balls to apologize after being caught on it.
> And in case you didn't notice, and considering that you keep insisting I simply don't want to look into my own faulty software, I'm offering you the same very simple solution one more time: I'll create a small clock monitoring app with open source code reproducing MSI AB's clock sensor, and you'll fix the "certified garbage" implementation for everyone. Sounds like a fair deal to me.


From what I can tell, you're getting hung up on the words "certified garbage" more than anything else I or anyone else says. If what you want is an apology for using those specific words in admitted exaggeration, then yes: sorry for calling your software's monitoring certified garbage.

So, since you're talking about two different methods, let's be realistic here. You're on a forum based on overclocking. Not a single one of us gives a hoot about power consumption or saving pennies on our electricity bills. So for the sake of this conversation, let's assume people aren't trying to save power and are instead going for pure performance. Which method is Afterburner employing? Why do I, for instance, see Afterburner reporting my GPUs' clocks all over the place after installing a new driver? This is one of the scenarios I was referencing before. I can avoid this issue by removing Afterburner without remembering settings along with the previous driver (registry key wipe / DDU), then re-installing Afterburner after installing the new driver.

A previous poster alluded to Fiji being able to shut down shaders to reduce power consumption and suggested disabling PowerPlay to avoid this. That would normally be a perfectly acceptable workaround for me, except that once I set "Unofficial overclocking mode" to "without PowerPlay support", it remains stuck that way with 3D clocks non-stop unless I again remove Afterburner without remembering settings and re-install it. Simply toggling it back to the default setting no longer works. User error on my part, apparently.

There are other oddball issues that occur for me on occasion which I can't make heads or tails of either. For instance, at random, my GPUs' clock settings in Afterburner will set themselves low, way under the default clocks, and the sliders no longer allow me to set them back to my previous clocks or even the defaults. Sometimes a simple system restart gets rid of this, and sometimes I have to again remove the software without remembering settings and re-install.

I understand you felt offended by the words I chose to describe a feature of your software, but you're still entirely avoiding the possibility that there is something wrong in Afterburner. Previous versions of Afterburner worked flawlessly for me; I only started getting the aforementioned results with your last two public releases. However, you still seem to be missing the fact that I not only support your software, I rely on it. It's still the only overvoltage/overclocking software I use on my rig.

I am no genius, nor am I perfect. I previously had issues with games crashing on me either at loading or during gameplay, and I had come around to blaming Afterburner. I later isolated the cause to the OSD, which is related to RTSS aka RivaTuner, so clearly I am always open to the possibility of user error when diagnosing a cause.


----------



## sugarhell

That means you don't know what PowerPlay means, and also that you can't set up AB properly.


----------



## xer0h0ur

Quote:


> Originally Posted by *sugarhell*
> 
> That means you don't know what PowerPlay means, and also that you can't set up AB properly.


By all means, I am always open to correcting user error, but can you please tell me where I went wrong there? As far as I can tell, there is nothing to set up within Afterburner with respect to PowerPlay. You're given two or three options, if I remember correctly; I am not at my computer right now to say with certainty. Why would running without PowerPlay support and then going back to the default setting no longer drop the GPU clocks back to 2D mode, instead of maintaining 3D clocks?


----------



## battleaxe

This is about to get ugly(er)...


----------



## sugarhell

Quote:


> Originally Posted by *xer0h0ur*
> 
> By all means, I am always open to correcting user error, but can you please tell me where I went wrong there? As far as I can tell, there is nothing to set up within Afterburner with respect to PowerPlay. You're given two or three options, if I remember correctly; I am not at my computer right now to say with certainty. Why would running without PowerPlay support and then going back to the default setting no longer drop the GPU clocks back to 2D mode, instead of maintaining 3D clocks?


Removing PowerPlay means the GPU can't control its own clocks, so you have to do it manually via AB.

It's like the AMD Linux drivers, which lack PowerPlay, so Fury stays at 2D clocks forever.


----------



## xer0h0ur

Quote:


> Originally Posted by *sugarhell*
> 
> Removing PowerPlay means the GPU can't control its own clocks, so you have to do it manually via AB.
> 
> It's like the AMD Linux drivers, which lack PowerPlay, so Fury stays at 2D clocks forever.


So, just to be certain: you're saying that if I set unofficial overclocking mode to "without PowerPlay support" and then toggle it back to the default setting, Afterburner would not re-enable PowerPlay?


----------



## sugarhell

Quote:


> Originally Posted by *xer0h0ur*
> 
> So, just to be certain: you're saying that if I set unofficial overclocking mode to "without PowerPlay support" and then toggle it back to the default setting, Afterburner would not re-enable PowerPlay?


You have to control the clocks manually. If you enable PowerPlay again, everything goes back to normal. Next time, search more before you call something bad.


----------



## battleaxe

Quote:


> Originally Posted by *sugarhell*
> 
> Removing PowerPlay means the GPU can't control its own clocks, so you have to do it manually via AB.
> 
> It's like the AMD Linux drivers, which lack PowerPlay, so Fury stays at 2D clocks forever.


So what does it mean when you just check "disabled", compared to "without PowerPlay support"?


----------



## sugarhell

Quote:


> Originally Posted by *battleaxe*
> 
> So what does it mean when you just check "disabled", compared to "without PowerPlay support"?


Unofficial overclocking mode = disabled


----------



## xer0h0ur

Quote:


> Originally Posted by *sugarhell*
> 
> You have to control the clocks manually. If you enable PowerPlay again, everything goes back to normal. Next time, search more before you call something bad.


You're still not saying anything clearly here. I am specifically telling you that I am returning the setting back to default within Afterburner so that it's no longer on "without PowerPlay support". I am asking whether doing so re-enables PowerPlay. In other words: will Afterburner in fact re-enable PowerPlay or not when I set unofficial overclocking mode back to its default setting?


----------



## sugarhell

Quote:


> Originally Posted by *xer0h0ur*
> 
> You're still not saying anything clearly here. I am specifically telling you that I am returning the setting back to default within Afterburner so that it's no longer on "without PowerPlay support". I am asking whether doing so re-enables PowerPlay. In other words: will Afterburner in fact re-enable PowerPlay or not when I set unofficial overclocking mode back to its default setting?


Probably user error.

Going back to the default means you disable unofficial overclocking mode; that has nothing to do with PowerPlay. Then you have two modes: one with PowerPlay and one without. Man, if you just search the forum you will find it.


----------



## battleaxe

Quote:


> Originally Posted by *sugarhell*
> 
> Unofficial overclocking mode = disabled


Cool. Thanks. +1


----------



## The Mac

Deleted


----------



## dagget3450

Maybe I am not understanding what people are talking about with clock throttling, but does the following not work for you?
Quote:


> Op try these settings in MSI AB highlighted with blue circles below:
> 
> 
> See if clocks change now.


----------



## The Mac

ULPS is only relevant for xfire...


----------



## battleaxe

Quote:


> Originally Posted by *The Mac*
> 
> ULPS is only relevant for xfire...


You sure? Could be user error on my part, but it seems it has an effect on AB even with one GPU in use? My memory of the incident could be incorrect, though. It just seems like I ran into something before.


----------



## dagget3450

Quote:


> Originally Posted by *battleaxe*
> 
> You sure? Could be user error on my part, but it seems it has an effect on AB even with one GPU in use? My memory of the incident could be incorrect, though. It just seems like I ran into something before.


He is correct on the CrossFire part. The thing is, I never have clock throttling issues with these settings, though of course I run CrossFire. Maybe I don't understand what the arguments are, but for me MSI AB works fine. Considering I paid nothing for it, I don't throw complaints around either. Anyways, good luck with it, guys.


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> He is correct on the CrossFire part. The thing is, I never have clock throttling issues with these settings, though of course I run CrossFire. Maybe I don't understand what the arguments are, but for me MSI AB works fine. Considering I paid nothing for it, I don't throw complaints around either. Anyways, good luck with it, guys.


Oh, I'm not complaining about AB at all. I've used it for years and I like it. I don't have any issues with it. There can be some bugs, but I have no idea whether it's the software or other things, even the OS, causing said issues. For the most part it works great and is a very useful tool. Like you said, it's free on top of that, so what's to complain about?


----------



## Forceman

Quote:


> Originally Posted by *battleaxe*
> 
> You sure? Could be user error on my part, but it seems it has an effect on AB even with one GPU in use? My memory of the incident could be incorrect, though. It just seems like I ran into something before.


Don't know about Fury cards, but some 290X's would do the "idle black screen" thing if ULPS was enabled, even with only one card.


----------



## The Mac

This was true, but it was a PowerPlay issue.

Disabling ULPS would stop PowerPlay from entering its lowest state.

That's been fixed, however.

ULPS basically shuts off the 2nd card when it's not being used.

It's not supposed to affect single cards.
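For anyone who wants to toggle this by hand rather than through AB: the commonly cited switch is the `EnableUlps` value under the display adapter's class key. A sketch of the .reg fragment, with the caveat that the subkey index (`0000` here) varies per system and you should back up the registry first:

```reg
Windows Registry Editor Version 5.00

; Example only: disable ULPS on the adapter at subkey 0000.
; The subkey index differs per system -- check which one holds your GPU
; before importing, and export a backup of the key first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Setting it back to `dword:00000001` (or reinstalling the driver) restores the default behavior.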


----------



## xer0h0ur

Quote:


> Originally Posted by *sugarhell*
> 
> Probably user error.
> 
> Going back to the default means you disable unofficial overclocking mode; that has nothing to do with PowerPlay. Then you have two modes: one with PowerPlay and one without. Man, if you just search the forum you will find it.


Okay, so to re-enable PowerPlay I would just use "with PowerPlay support" in Afterburner, and it should go back to 2D clocks? If so, then that particular issue I had was user error on my part.


----------



## fat4l

Quote:


> Originally Posted by *Unwinder*
> 
> Seeing a performance hit from mixing two different framerate-limiting technologies on the same system has zero relation to the direct statement that _"AB is certified garbage at reporting GPU clocks throttling even when they aren't"_, which I treat as a lie and which I asked you to prove with a simple GPU clock monitoring code sample. Quite the opposite: you're seeing the expected performance hit corresponding to dropped clocks.
> I'm wondering how many more unrelated posts I will see here from people who had some difficulty configuring AB.
> sorry


I'm not saying AB is garbage software. It's very good software indeed. I just said "for me it's causing clock-dropping when used with FreeSync + CrossFire", for unknown reasons.
By saying this I was hoping that maybe you could look into it, as I want to use it again.

I can provide you with some more info and screenshots so you can actually see the fluctuation of the clocks. It's really crazy.


----------



## xSneak

I've been testing my card this week and I got it to 1135 core / 535 memory with +60mV (3840 SP). With +72mV it was only stable at 1140MHz. I get 7492 in Firestrike Extreme. If I put my computer to sleep with the overclock on, the display won't get a signal when I resume.
I love how quiet and power-efficient this card is compared to my old 290X. I never go past 40% fan and 72 degrees Celsius.


----------



## 98uk

Hello AMD people.

I purchased a Sapphire R9 Fury Tri-x 4gb. It's meant to be a good card, but I read a thing or two regarding coil whine.

Is this still a common thing, or solely related to early card batches?


----------



## Gumbi

Quote:


> Originally Posted by *98uk*
> 
> Hello AMD people.
> 
> I purchased a Sapphire R9 Fury Tri-x 4gb. It's meant to be a good card, but I read a thing or two regarding coil whine.
> 
> Is this still a common thing, or solely related to early card batches?


Every card has it to some extent; how bad it is usually comes down to the quality of the card, luck, and the user having a good PSU, etc.

The Tri-Xs are generally great, so you should have no issue, especially if you got the new edition model.


----------



## Agent Smith1984

Anyone tested Trixx with an XFX Fury yet?

I have tested everything... nothing but freezes.

Running Trixx with GPU OD enabled and disabled in CCC.

Running Trixx with AB open and ULPS disabled, without PowerPlay, with PowerPlay, running Trixx without AB running at all, and with the registry key removed from all AB settings.

Cleared drivers, reinstalled drivers, tried different drivers... no matter what I do, I get a quick flash when making any change in Trixx at all, and the system freezes.

I've reset the CMOS. I've tried everything.

I have seen one other person with the same issue, and they said it happens when using HDMI and not when using DP, but I can't test DP to see whether that is the root cause.

Would love to hear whether anyone has been successful using this app with an XFX card while running HDMI, or with an XFX card in general.


----------



## MerkageTurk

fellow members I have lower fps with Mortal Kombat than my 290x


----------



## mRYellow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone tested Trixx with an XFX Fury yet?
> 
> I have tested everything... nothing but freezes.
> 
> Running Trixx with GPU OD enabled and disabled in CCC.
> 
> Running Trixx with AB open and ULPS disabled, without PowerPlay, with PowerPlay, running Trixx without AB running at all, and with the registry key removed from all AB settings.
> 
> Cleared drivers, reinstalled drivers, tried different drivers... no matter what I do, I get a quick flash when making any change in Trixx at all, and the system freezes.
> 
> I've reset the CMOS. I've tried everything.
> 
> I have seen one other person with the same issue, and they said it happens when using HDMI and not when using DP, but I can't test DP to see whether that is the root cause.
> 
> Would love to hear whether anyone has been successful using this app with an XFX card while running HDMI, or with an XFX card in general.


Strange, no issue here. How about trying DVI port?


----------



## Agent Smith1984

Quote:


> Originally Posted by *mRYellow*
> 
> Strange, no issue here. How about trying DVI port?


These cards don't have one...


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mRYellow*
> 
> Strange, no issue here. How about trying DVI port?
> 
> 
> 
> These cards don't have one...
Click to expand...

HDMI to DVI adaptor?


----------



## SLK

Anyone here running Fury Tri-X crossfire on air? I just bought a second one and hoping for the best here.


----------



## Alastair

Quote:


> Originally Posted by *SLK*
> 
> Anyone here running Fury Tri-X crossfire on air? I just bought a second one and hoping for the best here.


let us know how it goes!


----------



## Alastair

So I can get Heaven to run nicely at 1200MHz with +108mV. I can let Heaven loop and it will have zero issues, and this is at 1440p, so the GPU is definitely being loaded. But running 3DMark just insta-crashes. Wondering if it's the drivers or if I borked 3DMark.


----------



## Semel

Quote:


> Originally Posted by *Alastair*
> 
> So I can get Heaven to run nicely at 1200MHz with +108mV. I can let Heaven loop and it will have zero issues, and this is at 1440p, so the GPU is definitely being loaded. But running 3DMark just insta-crashes. Wondering if it's the drivers or if I borked 3DMark.


Try adding +6mV at a time and testing 3DMark stability after each step. Oh, and even if it passes, don't get all cocky.

Instead, launch Witcher 3 and play it for at least 1 hour. Witcher 3 is da best GPU stability test... I remember I passed FurMark, 3DMark, Heaven, all games... Witcher 3 crashed within 1-2 hours.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anyone tested Trixx with an XFX Fury yet?
> 
> I have tested everything... nothing but freezes.
> 
> Running Trixx with GPU OD enabled and disabled in CCC.
> 
> Running Trixx with AB open and ULPS disabled, without PowerPlay, with PowerPlay, running Trixx without AB running at all, and with the registry key removed from all AB settings.
> 
> Cleared drivers, reinstalled drivers, tried different drivers... no matter what I do, I get a quick flash when making any change in Trixx at all, and the system freezes.
> 
> I've reset the CMOS. I've tried everything.
> 
> I have seen one other person with the same issue, and they said it happens when using HDMI and not when using DP, but I can't test DP to see whether that is the root cause.
> 
> Would love to hear whether anyone has been successful using this app with an XFX card while running HDMI, or with an XFX card in general.


Get a DP to HDMI adapter. Can you do that?


----------



## mRYellow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> These cards don't have one...


Sorry, had a moment of brilliance


----------



## Agent Smith1984

So, how many people are using HDMI?
Because I don't want to get an HDMI to DVI adapter if using HDMI is the issue, and I would consider getting a DP to HDMI adapter, but I wanted to wait for the new Club 3D DP to HDMI 2.0 adapter so I can get 4K60...

I'm sure some of the folks in here are using Trixx with HDMI. This is racking my brain... I will just try HDMI to an Asus 24" monitor I have and see if it's something to do with that, but I highly doubt it considering Afterburner works perfectly.


----------



## Gamedaz

* It could be your 4K set. If the HDMI link doesn't have the bandwidth, it might not clock the card appropriately, possibly?


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, how many people are using HDMI?
> Because I don't want to get an HDMI to DVI adapter if using HDMI is the issue, and I would consider getting a DP to HDMI adapter, but I wanted to wait for the new Club 3D DP to HDMI 2.0 adapter so I can get 4K60...
> 
> I'm sure some of the folks in here are using Trixx with HDMI. This is racking my brain... I will just try HDMI to an Asus 24" monitor I have and see if it's something to do with that, but I highly doubt it considering Afterburner works perfectly.


Is the XFX a reference PCB model? If it is a reference card, try flashing a Sapphire BIOS.


----------



## Himo5

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, how many people are using HDMI?
> Because I don't want to get an HDMI to DVI adapter if using HDMI is the issue, and I would consider getting a DP to HDMI adapter, but I wanted to wait for the new Club 3D DP to HDMI 2.0 adapter so I can get 4K60...
> 
> I'm sure some of the folks in here are using Trixx with HDMI. This is racking my brain... I will just try HDMI to an Asus 24" monitor I have and see if it's something to do with that, but I highly doubt it considering Afterburner works perfectly.


It's too late for me to test it tonight, but the thought just surfaced: could Trixx be sensitive to PCIe 2.0?

I'm OCing a Sapphire Fury X to 1170/555/+60 in Firestrike, running it in an FM2+ system with an A10-7870K, which can use the PCIe 3.0 lanes, but I also have an A10-6800K which can't.

Maybe I can swap the APUs and see if Trixx still works?

(I'm also running through an HDMI cable.)


----------



## SLK

Quote:


> Originally Posted by *sugarhell*
> 
> Is the XFX a reference PCB model? If it is a reference card, try flashing a Sapphire BIOS.


This.

I was going to say the same thing. Great minds think alike.

Agent Smith, I have a reference 1000MHz BIOS if you need it. Looks like TPU only has the 1040MHz BIOS.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Is the XFX a reference PCB model? If it is a reference card, try flashing a Sapphire BIOS.


Tried that too...


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Tried that too...


Do you have MSI AB installed at all? Any other program that might be reading I2C info from your GPU?


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Do you have MSI AB installed at all? Any other program that might be reading I2C info from your GPU?


Tried it with AB launched and closed...
Tried it with AB uninstalled and RivaTuner uninstalled too.


----------



## Thoth420

Mission accomplished, thanks to the tiny form factor of the Fury X! I was nervous it wouldn't fit by a tad, but everything does. More pics to come (not my work)...

Thinking of painting the heat spreaders on the RAM white.


----------



## SuperZan

Very nice, and as far as the heat spreader customisation goes, go for it. Gotta keep to the theme.


----------



## Thoth420

Will do! My GPU backplate is white, and I'm using white dye in the loop, etc.

I was told what I wanted was impossible, and then the Fury X came out; now a proof of concept is becoming a reality.


----------



## EpicOtis13

Quote:


> Originally Posted by *Agent Smith1984*
> 
> VOTES GET REPS


GTA V!


----------



## Agent Smith1984

Quote:


> Originally Posted by *EpicOtis13*
> 
> GTA V!


Already bought it!
Love it!

Also... loving that I moved and have 200Mb internet here; downloaded a 65GB game in 3hrs!


----------



## mRYellow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So, how many people are using HDMI?
> Cause I don't want to get an HDMI to DVI adapter if using the HDMI is the issue, and I would consider getting a DP to HDMI adapter, but I wanted to wait for the new CLub 3d DP to HDMI 2.0 adapter so I can get 4k60....
> 
> I'm sure some of the folks in here are using Trixx with HDMI. This is racking my brain... I will just try HDMI to an Asus 24" monitor I have and see if it's something to do with that, but I highly doubt it considering Afterburner works perfectly.


DP to HDMI would be better.

HDMI to DVI is the same thing... same source.


----------



## Himo5

I tried it with the A10-6800K and everything worked fine, so it's not a PCIE problem.


----------



## mRYellow

New Crimson drivers leaked

http://file2.mydrivers.com/2015/display/amd_radeon_crimson_15_30.zip

https://mega.nz/#!ZU1mhIDY!zKROMe4N29W1GKm96OF_GcbzsynOQKmSdmsEU07RbOo


----------



## Agent Smith1984

Quote:


> Originally Posted by *mRYellow*
> 
> New Crimson drivers leaked
> 
> http://file2.mydrivers.com/2015/display/amd_radeon_crimson_15_30.zip
> 
> https://mega.nz/#!ZU1mhIDY!zKROMe4N29W1GKm96OF_GcbzsynOQKmSdmsEU07RbOo


Nice!
http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/2/


----------



## 98uk

Drivers are official on AMD's site too now:

http://support.amd.com/en-us/download

EDIT: also...

It lives! I can't believe how quiet it is. Under load it is quieter than my stock R9 290


----------



## mRYellow

They are mate


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice!
> http://wccftech.com/amd-radeon-software-performance-analysis-is-this-the-crimson-tide/2/


It didn't help your overvolting issue by any chance, did it? You probably haven't tried yet, but let us know when you do!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> It didn't help your overvolting issue by any chance, did it? You probably haven't tried yet, but let us know when you do!


Going to test tonight!

Crossing my fingers.... I really want to juice this thing up man...


----------



## solariss

What kind of overclock are owners of the Sapphire Fury Tri-X getting? I can't get mine past 1110 stable, even with +72mV. I black-screen/crash at anything over 1110 in the Heaven benchmark, and I haven't even touched the memory yet.


----------



## Agent Smith1984

Boy, they've got some deals going on Newegg!!

New 290s for $200

290X Devil (watercooled) for $399

Fury for $500

Fury X for $590

GTX 980 for $440...

The list goes on...


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Boy, they've got some deals going on Newegg!!
> 
> New 290s for $200
> 
> 290X Devil (watercooled) for $399
> 
> Fury for $500
> 
> Fury X for $590
> 
> GTX 980 for $440...
> 
> The list goes on...


Which model 290s? 290s are insane value for money...


----------



## Agent Smith1984

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043&cm_re=r9_290-_-14-202-043-_-Product


----------



## dagget3450

So, the new Crimson drivers... for me, overclocking still gives the same fps as running stock. Not sure what's going on here; can anyone else test? I am using CF and getting the same results overclocked or stock at 4K. Weird.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043&cm_re=r9_290-_-14-202-043-_-Product


Yeah, they've had that sale going off and on for a few weeks now. Pretty crazy deal TBH. Very hard not to get one.


----------



## dagget3450

Quote:


> Originally Posted by *dagget3450*
> 
> So, the new Crimson drivers... for me, overclocking still gives the same fps as running stock. Not sure what's going on here; can anyone else test? I am using CF and getting the same results overclocked or stock at 4K. Weird.


Looks like Sapphire Trixx is broken for me on new drivers. It's not changing the clocks.


----------



## Alastair

Quote:


> Originally Posted by *dagget3450*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> So, the new Crimson drivers... for me, overclocking still gives the same fps as running stock. Not sure what's going on here; can anyone else test? I am using CF and getting the same results overclocked or stock at 4K. Weird.
> 
> 
> 
> Looks like Sapphire Trixx is broken for me on new drivers. It's not changing the clocks.
Click to expand...

Uninstall trixx before you install new drivers. Then do the drivers. Then reinstall trixx when done.


----------



## Thoth420

So I can't test yet but what are some impressions of the new Crimson software etc.?


----------



## xer0h0ur

Quote:


> Originally Posted by *Alastair*
> 
> Uninstall trixx before you install new drivers. Then do the drivers. Then reinstall trixx when done.


This ^

I always uninstall my GPU overvoltage/overclocking software when I am uninstalling my drivers / doing a driver registry key wipe. This way I ensure that any problems I run into were not driver installation related and I have as clean of an installation of a new driver as possible short of doing a new Windows installation. Wish I was on Windows 10 to try out this driver though.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> This ^
> 
> I always uninstall my GPU overvoltage/overclocking software when I am uninstalling my drivers / doing a driver registry key wipe. This way I ensure that any problems I run into were not driver installation related and I have as clean of an installation of a new driver as possible short of doing a new Windows installation. Wish I was on Windows 10 to try out this driver though.


I don't. I'm incredibly lazy and stupid. I just use DDU in safe mode and call it a day.

And sometimes it bites me in the butt too. LOL


----------



## xer0h0ur

I am on the same lazy boat my friend. I used to literally just DDU and install a new driver every single time a new one was released. That caught up to me after like a dozen drivers leaving behind registry keys that DDU didn't remove. So now unless there is a good reason to update drivers or I am trying to play a new title then I just stick with a driver I know to be stable for me. This would qualify as one of those drivers worth the effort to upgrade to so I am going to give it a whirl tonight when I get off work.


----------



## Gamedaz

*When installing the new drivers, does AMD's software detect the old drivers and remove them from the registry as well? (Where would they be located in there?) Why wouldn't AMD remove the older drivers properly? I've installed and removed drivers with Nvidia cards with zero issues.

* I expect it won't be too complicated.


----------



## xer0h0ur

Just because you uninstall something doesn't mean that every single registry key associated with what you uninstalled is removed. This isn't isolated to drivers/driver software either. It's a common occurrence with software. Thing is that unless you're diving into the Windows registry after removing something, you will likely never notice it.


----------



## Gamedaz

* Crimson installer keeps crashing??

Anyone have issues on Windows 8.1 64-bit with a 1080p HDMI display?

What is causing the installer to keep crashing? I've stopped numerous background processes, but the installer still stops responding. GPU is an XFX R9 Fury.


----------



## xer0h0ur

Idk man, you're not the only person running into this either. I won't find out for myself until I give it a go in a few hours when I get off from work.


----------



## Gamedaz

* Running an image restore now, just in case. Not sure why the installer keeps freezing; already opened a ticket with AMD. Can't wait to get it properly installed...









EDIT: Could this be a .NET Framework issue? Or a Windows Update requirement?


----------



## Noirgheos

So how are the performance improvements overall for people? Fallout 4 and Witcher 3 specifically, though.

Also, can anybody with a Fury (non-X) say whether they have coil whine or not? I'm planning on the Sapphire or XFX model. The XFX one is $70 cheaper, so it's looking good for it.


----------



## Jflisk

OMG WOW AMD fixed every problem I have ever had with Crimson.


----------



## Noirgheos

Quote:


> Originally Posted by *Jflisk*
> 
> OMG WOW AMD fixed every problem I have ever had with Crimson.


How? What was wrong in the first place?


----------



## Jflisk

Black Ops 3 - lag fixed. COD Advanced Warfare - Crossfire fixed. Battlefront - now playable in Crossfire. No black screen when Windows 10 starts. Just checking the performance boost at 2560x1440, but it looks marginal in all games with two Fury Xs. This is looking extremely good for once.


----------



## Noirgheos

Quote:


> Originally Posted by *Jflisk*
> 
> Black Ops 3 - lag fixed. COD Advanced Warfare - Crossfire fixed. Battlefront - now playable in Crossfire. No black screen when Windows 10 starts. Just checking the performance boost at 2560x1440, but it looks marginal in all games with two Fury Xs. This is looking extremely good for once.


Wow, were those issues due to Crossfire mostly?


----------



## dagget3450

The Crimson drivers are giving me grief with clock throttling. It's not with all games but with specific things, e.g. Valley stays maxed, while Descent Underground sits at 300-900 MHz clocks, and it's causing my 120 Hz vsync to drop to 60 or less in corners. I think I am going to roll back to the beta, as I can't seem to find a way to stop it.


https://www.reddit.com/r/3u49tb/are_fury_owners_getting_odd_clock_throttling_in/

Edit: it looks like it's an issue with vsync. If I turn off vsync I get max clocks, whereas with vsync I get clock throttling.


----------



## Crisium

To anyone whose driver installation stops responding: just wait. A long time. It will get there eventually. Start the installation, get to the progress bar, and just walk away from your computer. It will install.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> So how are the performance improvements overall for people? Fallout 4 and Witcher 3 specifically, though.
> 
> Also, can anybody with a Fury (non-X) say whether they have coil whine or not? I'm planning on the Sapphire or XFX model. The XFX one is $70 cheaper, so it's looking good for it.


I am using the XFX Fury, and it definitely has coil whine, but not that bad really, and it's actually a lot quieter than when I first got it.

Be aware that my XFX has completely unlockable CUs, if you are planning to try that... some Sapphires do too, and some don't, so it's a coin toss there. But most likely, any Fury you get will exhibit some degree of coil whine.

As far as price, last I saw, the Sapphire and the XFX were both $500 on Newegg, and both have excellent coolers (shocked at how good the one on the XFX is, though it makes sense, since it's the PowerColor PCS+ team's design).

It really just depends on whether you like yellow on your backplate, I guess


----------



## Gamedaz

Quote:


> Originally Posted by *Crisium*
> 
> To anyone who gets their driver installation not responding. Just wait. A long time. It will do it eventually. Start the installation to where you get to the progress bar and just walk away from your computer. It will install.


Not sure if you can walk away from it, since Windows stops the process as not responding.


----------



## p4inkill3r

Anyone installing Crimson needs to uninstall AB/Trixx/etc. beforehand.


----------



## 98uk

Quote:


> Originally Posted by *p4inkill3r*
> 
> Anyone installing Crimson needs to uninstall AB/Trixx/etc. beforehand.


Why? I didn't uninstall AB and Crimson works fine for me. AB is also working to OC the core clock.


----------



## p4inkill3r

Quote:


> Originally Posted by *98uk*
> 
> Why? I didn't uninstall AB and Crimson works fine for me. AB is also working to OC the core clock.


Many problems people are having seem to be attributed to OCing programs, in particular the memory error message that occurs during startup/during installation/during shutdown.


----------



## fat4l

Quote:


> Originally Posted by *p4inkill3r*
> 
> Many problems people are having seem to be attributed to OCing programs, in particular the memory error message that occurs during startup/during installation/during shutdown.


yep, I experienced this too. Some mem error during shutdown.


----------



## Crisium

Quote:


> Originally Posted by *Gamedaz*
> 
> Not sure if you can walk away from it since windows stops the process as not responding.


Mine was not stopped by Windows on its own, only if I tried to click anything after starting. If Windows forces your hand you can try clicking "Wait for Program to Respond".


----------



## hyp36rmax

Quote:


> Originally Posted by *p4inkill3r*
> 
> Anyone installing Crimson needs to uninstall AB/Trixx/etc. beforehand.


I'll try this when I get home. My install broke with AB.


----------



## xer0h0ur

Quote:


> Originally Posted by *fat4l*
> 
> yep, I experienced this too. Some mem error during shutdown.


Yup, same here. Even when it's "working" I get the mem error on shutdown. I am going to try one last time with another clean install and zero overclocks, to see if it's my overclocks causing that error and my bad CS:GO performance.


----------



## p4inkill3r

I received the mem error on shutdown until I DDU'd, uninstalled Trixx, then reinstalled the Crimson software.


----------



## Jflisk

Anyone with two Fury Xs have COD Ghosts multiplayer on their machines? I know it's an older game. Could you please try with and without Crossfire on the Crimson driver and let me know if it is a seizure fest, like lines moving all around the screen and black gun sights? Basically the only game I have a problem with. The game was a problem to begin with, but it seemed to have worked fine with the 15.7.11 beta. Thanks


----------



## xer0h0ur

Anyone else getting borked DX9 game performance with this driver? Particularly people using more than one GPU.


----------



## fat4l

Quote:


> Originally Posted by *xer0h0ur*
> 
> Anyone else getting borked DX9 game performance with this driver? Particularly people using more than one GPU.


Not sure, but in Crysis 1 (DX10?) the threat sensor, energy, and health displays are flickering...


----------



## Thoth420

I won't even swap GPU drivers without uninstalling all traces of OC software(I prefer MSI AB) as well as the previous driver. I certainly wouldn't expect the new software to install without issue with that stuff installed.


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*
> 
> I won't even swap GPU drivers without uninstalling all traces of OC software(I prefer MSI AB) as well as the previous driver. I certainly wouldn't expect the new software to install without issue with that stuff installed.


Most people pay lip service to following the same protocols, yet the day of a new release, all of a sudden they have issues.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> Most people pay lip service to following the same protocols, yet the day of a new release, all of a sudden they have issues.


I see it all the time: "It was fine the 30 other times I changed drivers..." (insert your avatar here as it perfectly represents my reaction)


----------



## SuperZan

Quote:


> Originally Posted by *p4inkill3r*
> 
> Most people pay lip service to following the same protocols, yet the day of a new release, all of a sudden they have issues.


Right? Installed on two systems without an issue.


----------



## Gamedaz

Quote:


> Originally Posted by *SuperZan*
> 
> Right? Installed on two systems without an issue.


* Did you have to turn your AV off?


----------



## SuperZan

Quote:


> Originally Posted by *Gamedaz*
> 
> * Did you have to turn your AV off?


I didn't need to do so, but I've seen that others did. It may just depend on the make/version of the individual's AV.


----------



## xer0h0ur

Quote:


> Originally Posted by *Gamedaz*
> 
> * Did you have to turn your AV off?


Oh man I forgot to disable my AV during all install attempts. I wonder if that had any effect.


----------



## diggiddi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Oh man I forgot to disable my AV during all install attempts. I wonder if that had any effect.


firewall too


----------



## MerkageTurk

Hi fellow members

When I play BF4, the clocks drop to 2D for a split second, causing stutter etc.


----------



## xer0h0ur

Quote:


> Originally Posted by *diggiddi*
> 
> firewall too


The only chance in hell of my system having its firewall disabled is if it's not connected to the internet.


----------



## Gamedaz

Quote:


> Originally Posted by *SuperZan*
> 
> I didn't need to do so, but I've seen that others did. It may just depend on the make/version of the individual's AV.


* So some have actually had to turn off AV for it to extract?... Maybe that's why it freezes; the AV might see it as an unknown .exe? Though it's never reported anything about the file.

* I know Bitdefender has signature logic that detects algorithms associated with day-one malware etc.


----------



## 98uk

Quote:


> Originally Posted by *MerkageTurk*
> 
> Hi fellow members
> 
> When I play BF4, the clocks drop to 2D for a split second, causing stutter etc.


Have you disabled ULPS?


----------



## Jflisk

My Ghosts problem was that I forgot to DDU.









I removed all the drivers, then DDUed.
Everything seems to be working now.


----------



## dagget3450

Quote:


> Originally Posted by *MerkageTurk*
> 
> Hi fellow members
> 
> When I play BF4, the clocks drop to 2D for a split second, causing stutter etc.


I am having this issue also, but in different games. It appears that if I disable vsync it doesn't do this. At any rate, dunno if it will help you any.

I don't want to roll back, because without the pixel patch my monitor is showing 120 Hz on the Crimson drivers. Really nice not to have to use a third-party app for a high refresh rate.


----------



## Agent Smith1984

I'm beginning to think that this XFX is hard locked...

Reset CMOS, tried different slots, full driver cleaning, reinstalled everything... several times!

Attempting any setting change in Trixx causes a freeze and a restart.

Afterburner works fine in every way.

It makes no sense at all....

Unfortunately no one else has an XFX card to test.

My last hope is to order a DP to HDMI adapter and see if it's an HDMI issue like the one other user I found.


----------



## MerkageTurk

I am using Trixx, so no ULPS.

I swear previous drivers had a 120 Hz display option, dagget?

GTA V runs fine.

But BF4 stutters when it clocks down to 2D clocks.

Should I return it?


----------



## Crisium

The new drivers gave me periodic clock drops in Battlefront. It would randomly throttle to anywhere between 600-900 ish just for a moment, which really stuttered the game. I had to set power limit +50 to get rid of it. I didn't have to do this before. This is with a 1090/550 Fury.


----------



## Noirgheos

Just letting you all know, you should report any bugs you experience in detail on the Bug Report page in the Crimson driver panel.


----------



## Otterfluff

I used DDU to remove the drivers and uninstalled all OC software -> installed Crimson, and no matter what I do, when I load up a game, after 10 seconds the drivers crash and drop back to Windows.

I've done this four times now and I'm not sure what I am doing wrong.


----------



## Otterfluff

Someone on reddit suggested using the AMD clean install utility and, surprise, it worked a charm. I have no issues with the Crimson driver. DDU did not work for me at all.

Amazing how much faster games load.


----------



## Crisium

OK, I'm still getting unacceptable throttling even at +50%. Gonna try to clean and reinstall the drivers; if that doesn't fix it, I am reverting.


----------



## xer0h0ur

So just to drop a few nuggets of information. Don't use the Crimson driver expecting your previous overclocks to be working just peachy with it. Using my overclocks that had been stable for me with lots of Catalysts did not work out for me at all. It seems to be particularly temperamental with memory overclocks. So as a base suggestion try not using any overclocks and verify if issues persist or go away. Using default vRAM clocks on my 290X stopped the shut down mem error I was getting.

The other tidbit I figured out was that this driver simply doesn't disable Crossfire in any way, shape, or form on the 295X2. It doesn't matter at all if you globally disabled Crossfire, disabled Crossfire on a game profile, or did both. Either way, Afterburner confirms both GPUs are clocking up and processing a load. Granted, I only confirmed this in CS:GO, so I'm not certain if it's game specific or perhaps isolated to DX9 games.

As a workaround I connected my monitor to the 290X instead of the 295X2 so that when I was gaming it was using that video card instead of the 295X2's GPUs. Afterburner then confirmed only GPU3 was clocking up and processing any load.


----------



## mRYellow

Quote:


> Originally Posted by *p4inkill3r*
> 
> Anyone installing Crimson needs to uninstall AB/Trixx/etc. beforehand.


I didn't uninstall Trixx, just disabled it from starting up with Windows, and have no issues.


----------



## fat4l

Quote:


> Originally Posted by *Otterfluff*
> 
> Someone on reddit suggested using the AMD clean install utility and, surprise, it worked a charm. I have no issues with the Crimson driver. DDU did not work for me at all.
> 
> Amazing how much faster games load.


Have you tried the newest DDU, made for the Crimson drivers?


----------



## Otterfluff

I just got the latest one off the Guru3d site at the time. Possible it was the older version. 15.7.0.1?


----------



## Randomdude

Disabled no programs, didn't uninstall drivers, just installed crimson and everything is fine.


----------



## rdr09

Quote:


> Originally Posted by *Randomdude*
> 
> Disabled no programs, didn't uninstall drivers, just installed crimson and everything is fine.


DUD - Didn't Uninstall Driver. That's my method. Just install over the old for both my AMD and Intel rigs.


----------



## mRYellow

Quote:


> Originally Posted by *rdr09*
> 
> DUD - Didn't Uninstall Driver. That's my method. Just install over the old for both my AMD and Intel rigs.


Classic!

BTW, are you guys enabling shader cache manually for your game profiles or are you leaving it on AMD Optimized?


----------



## Gamedaz

* It would be nice to get Crimson to install on my system. The latest news is that the ads in the Crimson installer get flagged by the AV and prevent the install from executing,







..* I've never used Bitdefender before, so I am not sure how up to date they are on false positives, but it could be its signature-detection software, which learns day-one-type malware, deeming the ad as possibly suspicious, although I get no warning from Bitdefender.

* As much as I would like to install the Crimson drivers by removing the AV, I'm gonna wait till AMD fixes this.


----------



## flopper

Quote:


> Originally Posted by *mRYellow*
> 
> Classic!
> 
> BTW, are you guys enabling shader cache manually for your game profiles or are you leaving it on AMD Optimized?


AMD Optimized will work great, as it's tested by AMD. With it forced on all the time, you might run into issues with games that weren't tested.
However, if you don't play many games, leave it on and check.


----------



## Jflisk

Quote:


> Originally Posted by *Gamedaz*
> 
> * It would be nice to get Crimson to install on my system. The latest news is that the ads in the Crimson installer get flagged by the AV and prevent the install from executing,
> 
> 
> 
> 
> 
> 
> 
> ..* I've never used Bitdefender before, so I am not sure how up to date they are on false positives, but it could be its signature-detection software, which learns day-one-type malware, deeming the ad as possibly suspicious, although I get no warning from Bitdefender.
> 
> * As much as I would like to install the Crimson drivers by removing the AV, I'm gonna wait till AMD fixes this.


You should be able to create an exception in any antivirus to install an application, and the exception would need to be created in Bitdefender, not by AMD; Bitdefender is the one flagging it.


----------



## Jflisk

Happy Thanksgiving all









Keeping it on point FURY X


----------



## josephimports

Quote:


> Originally Posted by *Jflisk*
> 
> Happy Thanksgiving all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Keeping it on point FURY X


Cheers.


















Happy Thanksgiving OCN.


----------



## Agent Smith1984

So, with Crimson I get bad frame drops in GTA V now, and my overclock of 1060/550 is no longer stable... I'm rolling back...

Happy Thanksgiving!


----------



## xer0h0ur

Like I said before, you can't assume your overclocks will be the same with this driver. So far it has shown it can't withstand the same overclocks the Catalysts did.

If you're getting frame drops, have you checked with Afterburner or some other monitoring software to see if your clocks are throttling or simply all over the place?


----------



## Thoth420

I pick her up this weekend probably Saturday!


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I pick her up this weekend probably Saturday!


Any more white and black and I would have to beg you to call it StormTrooper, LOL. That is an envy inducing rig. *stands up and salutes*


----------



## wdpir32k3

Does anyone know how to fix this? I'm trying to overclock my Furys in Crossfire with the new drivers, but it keeps making the cores go back to default clock speeds. Anyone know how to fix it so it keeps the overclock?


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Any more white and black and I would have to beg you to call it StormTrooper, LOL. That is an envy inducing rig. *stands up and salutes*


I figured a legit star wars mod was coming and didn't want to steal the name. It is heavily Storm Trooper inspired.


----------



## Gamedaz

* If I get instability with these drivers I might just do a fresh install and see how they perform.

Is anyone here using a 1080p resolution or HDTV panel? If so, how do the drivers work in most games?


----------



## Greenland

I think God hates me: first a dead unlocked Fury, now the new one maxes out at 1090 with +75 mV.


----------



## Crisium

So to fix my throttling that the new drivers introduced I have to manually change the 2D clocks to match the 3D clocks. You can do it in MSI Afterburner or Asus GPU Tweak. GPU Tweak: http://i.imgur.com/ZJHGn0i.png

Of course I must now remember to switch back and forth between the profiles when gaming and not gaming.


----------



## 98uk

Quote:


> Originally Posted by *Crisium*
> 
> So to fix my throttling that the new drivers introduced I have to manually change the 2D clocks to match the 3D clocks. You can do it in MSI Afterburner or Asus GPU Tweak. GPU Tweak: http://i.imgur.com/ZJHGn0i.png
> 
> Of course I must now remember to switch back and forth between the profiles when gaming and not gaming.


How do you do this in afterburner? I'm having trouble with BF4 stuttering as it moves to 2d clocks and back.

Have you a screenshot of how to do this in AB?


----------



## mRYellow

I can confirm that the new AB that's currently going through beta testing is working well with the Crimson driver set and now has voltage support for Furys. Clock speeds are also constant with no fluctuations.

Link to the imminent beta announcement:
http://forums.guru3d.com/showthread.php?p=5196977


----------



## zdziseq

This one is better:

http://forums.guru3d.com/showthread.php?t=404185


----------



## 98uk

Quote:


> Originally Posted by *mRYellow*
> 
> I can confirm that the new AB that's currently going through beta testing is working well with the Crimson driver set and now has voltage support for Furys. Clock speeds are also constant with no fluctuations.
> 
> Link to the imminent beta announcement:
> http://forums.guru3d.com/showthread.php?p=5196977


Sorry, how do you mean "Clock speeds are also constant with no fluctuations"? Is this something you noticed with your own card, or do you mean it's a fix in the new Afterburner?

I have the problem whereby in bf4 (specifically), the game stutters sometimes and the card appears to switch to 2d clocks and back. This doesn't happen for PCars.

The only software with 2d clock control I found was Asus GPU tweak and it is buggy at best... I was trying to manually set 2d clocks not to fall below 3d clocks, thus removing any associated stuttering.


----------



## mRYellow

Quote:


> Originally Posted by *98uk*
> 
> Sorry, how do you mean "Clocks speeds are also constant with no fluctuations". Is this something you noticed with your own card, or you mean it's a fix in the new afterburner?
> 
> I have the problem whereby in bf4 (specifically), the game stutters sometimes and the card appears to switch to 2d clocks and back. This doesn't happen for PCars.
> 
> The only software with 2d clock control I found was Asus GPU tweak and it is buggy at best... I was trying to manually set 2d clocks not to fall below 3d clocks, thus removing any associated stuttering.


I'm just stating that I've experienced no clock drops. Not sure why users are having this issue.


----------



## 98uk

Quote:


> Originally Posted by *mRYellow*
> 
> I'm just stating that i've experienced no clock drops. Not sure why users are having this issue.


Ah ok, fair enough.

Well, for me the clock issues seem related specifically to certain games. As I mentioned, Project Cars runs absolutely fine, whereas BF4 drops the clocks every so often.

As such, I wonder if this is caused specifically by the new Crimson "settings" area where you can change things per game. I will try to set everything to default and see what happens.

Right now, the only workaround is the mess of software that is Asus GPU Tweak. I checked afterburner, but it seems that the ability to change 2d/3d profiles is no longer there... or at least I don't have it available.


----------



## mRYellow

Quote:


> Originally Posted by *98uk*
> 
> Ah ok, fair enough.
> 
> Well, for me the clock issues seem related specifically to certain games. As I mentioned, Project Cars runs absolutely fine, whereas BF4 drops the clocks every so often.
> 
> As such, I wonder if this is caused specifically by the new Crimson "settings" area where you can change things per game. I will try to set everything to default and see what happens.
> 
> Right now, the only workaround is the mess of software that is Asus GPU Tweak. I checked afterburner, but it seems that the ability to change 2d/3d profiles is no longer there... or at least I don't have it available.


It is there in AB. You create a profile and then assign that profile to either 2D or 3D.


----------



## Szaby59

Can somebody upload the original Sapphire Fury *OC* vBIOSes ?

I guess it's: 113-1E3292U-Q4D and 113-1E3292U-O4C


----------



## 98uk

Quote:


> Originally Posted by *mRYellow*
> 
> It is there in AB. You create a profile and then assign that profile to either 2D or 3D.


I don't have this. I have a profile, but nowhere can I do anything else...


----------



## mRYellow

Quote:


> Originally Posted by *98uk*
> 
> I don't have this. I have a profile, but nowhere can I do anything else...


Strange, which build are you running?


----------



## 98uk

Quote:


> Originally Posted by *mRYellow*
> 
> Strange, which build are you running?


Afterburner 4.1.1 and the latest Crimson drivers...


----------



## xer0h0ur

Well, color me confused. I am running the same build version and I do see the same options as mRYellow.


----------



## 98uk

OK, resolved.

One must install the Rivatuner Statistics Server software as well! With only the AB software, the 2d/3d profile options are not available.









EDIT: Doesn't seem to make a difference anyway. 2D clocks are still 300 MHz despite what I choose...


----------



## MerkageTurk

I am getting blue screens "no driver thread or something"


----------



## velocityx

Quote:


> Originally Posted by *mRYellow*
> 
> Strange, which build are you running?


Have you enabled all the overclocking options on the first tab? Try a clean uninstall, deleting leftover files, and then do a clean install.


----------



## 98uk

Quote:


> Originally Posted by *velocityx*
> 
> Have you enabled all the overclocking options on the first tab? Try a clean uninstall, deleting leftover files, and then do a clean install.


See above...

OK, resolved.

One must install the Rivatuner Statistics Server software as well! With only the AB software, the 2d/3d profile options are not available.


----------



## MerkageTurk

Guys, my card does not have UEFI.

It's a Sapphire.


----------



## Jflisk

Quote:


> Originally Posted by *MerkageTurk*
> 
> Guys, my card does not have UEFI.
> 
> It's a Sapphire.


Depends on which card. The Fury X has a newer BIOS with UEFI, and it is available.

This one is for the Fury X. The .66 version gave me UEFI on my PowerColor card, but it is from a Sapphire Fury X.

https://www.techpowerup.com/vgabios/177517/sapphire-r9furyx-4096-150721.html

Remember to make a backup of your original BIOS.


----------



## Vesimas

Quick question: since here in Italy the prices are too inflated compared to the U.S., and since I found a nice promo, would you buy a Sapphire R9 Fury X for $560 (it's the lowest price in Italy, but I think it's the lowest also on Newegg)??? I think I'll buy an Asus MG279Q too. What do you think?









PS: don't look at the PC in my sig, because I'll start building the new one with a 5820K or 6700K


----------



## SuperZan

Quote:


> Originally Posted by *Vesimas*
> 
> Quick question: since here in Italy the prices are too inflated compared to the U.S., and since I found a nice promo, would you buy a Sapphire R9 Fury X for $560 (it's the lowest price in Italy, but I think it's the lowest also on Newegg)??? I think I'll buy an Asus MG279Q too. What do you think?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: don't look at the PC in my sig, because I'll start building the new one with a 5820K or 6700K


I can't say that I wouldn't, because I bought one for the full price earlier this year. It's a great card and the form factor is fantastic if you can slot the radiator in your case. The 500-odd EUR question you have to ask yourself is: are you okay with spending so much on a GPU knowing that Arctic Islands / Pascal are due next year? I rationalised it by telling myself that I'd wait for the second wave of next-gen GPUs, and so can count on the Fury X for at least a year of service in my primary gaming PC (and an indefinite term in my secondary or the HTPC).

Your situation may be different and if you're planning on being an early adopter of the new architectures next year, I'd hold off or buy something a bit more budget-friendly. If you want a great current-gen card, and know how you're going to play things when the Arctic Islands and Pascal come along, then by all means, you'll definitely enjoy the Fury X.


----------



## Thoth420

Quote:


> Originally Posted by *Vesimas*
> 
> Quick question: since here in Italy the prices are too inflated compared to the U.S., and since I found a nice promo, would you buy a Sapphire R9 Fury X for $560 (it's the lowest price in Italy, but I think it's the lowest also on Newegg)??? I think I'll buy an Asus MG279Q too. What do you think?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: don't look at the PC in my sig, because I'll start building the new one with a 5820K or 6700K


Wait on the Acer XF270HU... better FreeSync range and the same panel as the ASUS. Expecting a revision of the MG279Q as well.


----------



## MerkageTurk

Quote:


> Originally Posted by *Jflisk*
> 
> Depends on which card. The Fury X has a newer BIOS with UEFI, and it is available.
> 
> This one is for the Fury X. The .66 version gave me UEFI on my PowerColor card, but it is from a Sapphire Fury X.
> 
> https://www.techpowerup.com/vgabios/177517/sapphire-r9furyx-4096-150721.html
> 
> Remember to make a backup of your original BIOS.


How do i update it?


----------



## Jflisk

Quote:


> Originally Posted by *MerkageTurk*
> 
> How do i update it?


Look up the ATIflash directions; you can use it from an administrative command line. There are two tools in the one package, atiwinflash and atiflash; use the atiflash command. It should look like this:

Instructions:
http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

Atiflash download:
https://www.techpowerup.com/downloads/2531/atiflash-2-71/

From an elevated command prompt (0 = first card, 1 = second card, and so on):

atiflash -p 0 biosname.bin

Any problems, ask here; there are a lot of people who are really good with it.

For safety's sake, download GPU-Z and save the old BIOS before flashing. There should be two BIOSes on the card; the switch is at the back of the card, near the metal plate on the side, about a quarter of the way down, with positions one and two. Position two is the fallback: if you have a problem, shut off the computer and switch to two. Once the computer starts again, the switch can be moved back to BIOS one after the problem is fixed.
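To make the order of operations above concrete, here's a dry-run sketch of the sequence as a shell script. The file names are placeholders and the `-s` save switch is an assumption on my part (the post suggests GPU-Z for the backup instead); it only echoes the commands rather than flashing anything, so you can review them first.

```shell
# Dry-run sketch of the flash sequence above -- nothing is flashed,
# the commands are only printed for review.
ADAPTER=0                       # 0 = first card, 1 = second card, and so on
BACKUP_BIOS="fury_backup.rom"   # placeholder name for the saved vBIOS
NEW_BIOS="biosname.bin"         # placeholder name for the downloaded image

# Step 1: back up the current vBIOS (assumes atiflash's -s save switch;
# GPU-Z can do this step instead, as noted above).
echo "atiflash -s ${ADAPTER} ${BACKUP_BIOS}"

# Step 2: flash the new image to the same adapter, from an elevated prompt.
echo "atiflash -p ${ADAPTER} ${NEW_BIOS}"
```

Drop the `echo`s only once you're sure the adapter index and file names match your own card, and leave the dual-BIOS switch on position one so position two stays a clean fallback.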


----------



## BaddParrot

I wanted to take a sec & post here for you guys,

First off, I only semi/kind of know what I'm doing here. But I can read & learn/follow directions!

Asus Crosshair V Formula-Z
AMD FX -8350 (oc 4700)
GSkill Trident X series 16 GB (2x8GB) DDR3 SDRAM 1866 (PC3 14900)(8,9,9,24)
Sapphire R9 Fury X (550/1100)
Win7 64-bit, Service Pack 1

Pc was running GREAT.

Crimson - I installed it the day it came out. It looked cool & I had faith in AMD to have the bugs worked out.
That night, I noticed most of the games I played throttling the Fury X: Fallout 4, World of Warships (I'm a Supertester), even 3DMark's Firestrike Ultra appeared to be throttling (despite decent scores around 4000).

I used AMD's cleanup utility & reinstalled Crimson. Same results. I installed the 15.11.1 beta drivers over Crimson; same results.

I went into Afterburner & matched the 2d & 3d settings, same results.

The other issues I noticed with Crimson so far: it did NOT have all my games listed, & I actually had a few temporary black screens in the middle of World of Warships matches. These were fast (4-5 seconds), as if I lost my video driver for a sec.

Today I just realized it was not worth it; I did another AMD cleanup & just put the 15.11.1 CCC back in.

Everything's back to great now!

Maybe I will try again one of these days after AMD/MSI afterburner updates.


----------



## 98uk

Quote:


> Originally Posted by *MerkageTurk*
> 
> Guys my card does not have UEFI
> 
> It's Saphire


I have a Sapphire Fury Tri-X... pretty sure it said it had UEFI. Do you want it?

Not sure how to upload files...


----------



## wesbluemarine

My Sapphire R9 Nano doesn't have a UEFI vBIOS either.
I'm using a Fury X one with UEFI.
Would it be better if I found a dedicated R9 Nano vBIOS? I don't know where to find one.









EDIT
http://www.gigabyte.com/products/product-page.aspx?pid=5620#bios

Thanks GIGABYTE!


----------



## Jflisk

Quote:


> Originally Posted by *wesbluemarine*
> 
> My Sapphire R9 nano doesn' t have UEFI vbios too.
> I' m using a fury x one with UEFI.
> Is it better if i find a dedicated r9 nano vbios? I don't know where to find it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT
> http://www.gigabyte.com/products/product-page.aspx?pid=5620#bios
> 
> Thanks GIGABYTE!


You need to have the BIOS for the card you own. Not sure about the Nanos, but the Fury X BIOSes all come from AMD and should all be the same. You can ask Gigabyte if they have a newer BIOS for the Nano.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *BaddParrot*
> 
> I wanted to take a sec & post here for you guys,
> 
> First off, I only semi/kind of know what I'm doing here. But I can read & learn/follow directions!
> 
> Asus Crosshair V Formula-Z
> AMD FX -8350 (oc 4700)
> GSkill Trident X series 16 GB (2x8GB) DDR3 SDRAM 1866 (PC3 14900)(8,9,9,24)
> Sapphire R9 Fury X (550/1100)
> Win7 64 bit Sev pack 1


I have the same mb and ram as yours. My PC has been off for a few weeks while I was waiting for a new Fury X to arrive. I put the new card in today, and saw that the AMD drivers had changed to Crimson. So I let everything update and I am getting utterly _insulting_ fps in Fallout 4 (which is the only game that I've tried so far tonight).

Using the new Crimson suite, I left things pretty much stock in there. Now, my set up is _slightly_ different, in that I'm running two, yes TWO Fury X's (EK-blocked) at 1440p and I'm getting...ready for this?

14-19 FPS with these settings.

Something is clearly NOT right...

I've seen many posts on various forums with people running a single Fury X, or even a 290X and getting significantly better numbers with these settings. This should not be happening on a rig with twin Fury X's.

So I used Display Driver Uninstaller and completely cleaned the driver out of there. I put on a clean install of the Crimson suite and I'm still having the same issue.

(btw, it's really annoying that the built in FPS counter is built on an overlay and does not allow the screen capture utility to show the displayed FPS)

And I'm not sure if this would affect anything, but I'm running three MG279Q monitors, which are 144Hz, but I have FreeSync turned on on each of them, so the refresh rate is actually capped at 90Hz. I only display the game on the center monitor though.



Based on the settings in the image above, any thoughts/suggestions?


----------



## BaddParrot

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I have the same mb and ram as yours. My PC has been off for a few weeks while I was waiting for a new Fury X to arrive. I put the new card in today, and saw that the AMD drivers had changed to Crimson. So I let everything update and I am getting utterly _insulting_ fps in Fallout 4 (which is the only game that I've tried so far tonight.
> 
> Using the new Crimson suite, I left things pretty much stock in there. Now, my set up is _slightly_ different, in that I'm running two, yes TWO Fury X's (EK-blocked) at 1440p and I'm getting...ready for this?
> 
> 14-19 FPS with these settings.
> 
> Something is clearly NOT right...
> 
> I've seen many posts on various forums with people running a single Fury X, or even a 290X and getting significantly better numbers with these settings. This should not be happening on a rig with twin Fury X's.
> 
> So I used Display Driver Uninstaller and completely cleaned the driver out of there. I put on a clean install of the Crimson suite and I'm still having the same issue.
> 
> (btw, it's really annoying that the built in FPS counter is built on an overlay and does not allow the screen capture utility to show the displayed FPS)
> 
> And I'm not sure if this would affect anything, but I'm running three M279Q monitors, which are 144hz, but I have FreeSync turned on each of them, so the refresh rate would actually be capped up to 90hz. I only display the game on the center monitor though.
> 
> 
> 
> Based on the settings in the image above, any thoughts/suggestions?


Yes, I also had issues with the recommended settings (Pre-Crimson) with 1 fury X on Fallout 4.
I simply switched the antialiasing to the other setting (FXAA) & the 16 samples to 8, & it ran fine.
I'm using 1 Acer XG270HU 27" 1ms 144Hz @ 1440 & had to turn off FreeSync (some slight flickering).

It was funny you mentioned it. When I started playing, I enjoyed the beginning of the game so much, I ignored the 18-20 fps for the first 4 hours. I had forgotten all about it.


----------



## MalsBrownCoat

I just tried the settings that you mentioned above.

Absolutely no change whatsoever.


----------



## BaddParrot

I know I had some throttling with Crimson yesterday in FO4, but it certainly didn't drop my fps that low.
I edited my other post to include the screenshot.
As I posted, I did uninstall Crimson, but I was playing tonight for a while & I run 50-60 fps.

Other games running alright for you? Almost sounds like the Xfire is not working? Other than the settings, I'm unsure 'cause I'm a noob.


----------



## dartmaul15

I just wanted to drop by here and ask for some help.

I've heard the Sapphire Radeon R9 Fury Tri-X has quite a hefty heatsink, and it has left me worried about how that affects crossfire performance. Will I even fit 2 of them on the mobo options I've picked?

http://pcpartpicker.com/part/gigabyte-motherboard-gaz97xgaming5
http://pcpartpicker.com/part/msi-motherboard-z97gaming5
http://pcpartpicker.com/part/gigabyte-motherboard-gaz97xud3h

I'd love some help with this, as I REALLY want to put this monster into crossfire mode.


----------



## wesbluemarine

Quote:


> Originally Posted by *Jflisk*
> 
> You need to have the bios for the card you own. Not sure about the nanos but the FURY X BIOSES all come from AMD and should all be the same. You can ask gigabyte if they have a newer BIOS for the nano.


The Gigabyte one in the link works ok.


----------



## MerkageTurk

Guys, just updated my BIOS to UEFI with a fellow member's help.

So far so good; this is on a Sapphire Fury X.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *BaddParrot*
> 
> I Know I had some throttling with Crimson yesterday in FO4 but it certainly didn't drop my fps that low.
> I edited my other post to include the Screen shot.
> As I posted, I did uninstall crimson but I was playing tonight for a while & I run 50-60 fps.
> 
> Other games running alright for you? Almost sounds like the Xfire is not working? Other than the settings, I'm unsure cause I am noob.


I used DDU again and cleared off the Crimson driver. Installed 15.7.1. Made no changes to the stock settings of Catalyst, and kept the same settings that I was using for Fallout 4 (re: screenshot).

Now I'm getting 30 fps. An improvement, which clearly points to something being wrong with the Crimson driver. But there is still something very wrong here overall.
At this resolution, with Freesync on, I should be getting a _minimum_ of 50-60 fps on a single Fury X. And running _two_, it should be even more than that.

I just don't get it.


----------



## rdr09

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I used DDU again and cleared off the Crimson driver. Installed 15.7.1. Made no changes to the stock settings of Catalyst, and kept the same settings that I was using for Fallout 4 (re: screenshot).
> 
> Now I'm getting 30 fps. An improvement, which clearly points to something being wrong with the Crimson driver. But, there is still something very wrong here over all.
> At this resolution, with Freesync on, I should be getting a _minimum_ of 50-60 fps on a single Fury X. And running _two_, it should be even more than that.
> 
> I just don't get it.


Did you verify that 15.7.1 has a crossfire profile for FO4?

Anyway, not sure if you've done it . . . have you verified that crossfire works when you enable it in CCC? A simple test: run a synthetic bench like Firestrike with Afterburner running in the background to show graphs of CPU and GPU usage.


----------



## MalsBrownCoat

Haven't seen a specific profile for it. So I created one in CCC.



I had "Enable AMD CrossFire X" checked in CCC, but the "Enable AMD CrossFireX for applications that have no associated application profile" was not checked.

So, I checked that box and then I launched MSI Afterburner. With Fallout 4 running, I can see that there is activity now on GPU 2. However, FPS is still at ~30.

Even with CrossFire turned off, I should still be getting 50-60 fps. I don't have many games installed on this system yet, but maybe I'll see how Borderlands (Pre Sequel) does.


----------



## rdr09

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Haven't seen a specific profile for it. So I created one in CCC.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I had "Enable AMD CrossFire X" checked in CCC, but the "Enable AMD CrossFireX for applications that have no associated application profile" was not checked.
> 
> So, I checked that box and then I launched MSI Afterburner. With Fallout 4 running, I can see that there is activity now on GPU 2. However, FPS is still at ~30.
> 
> Even with CrossFire turned off, I should still be getting 50-60 fps. I don't have many games installed on this system yet, but maybe I'll see how Borderlands (Pre Sequel) does.


Not sure if that's gonna work for FO4. You're gonna have to test other games like BF4, which is known to work with crossfire.

worst case . . . you may have to reinstall the driver but i suggest to do this first . . .

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

Actually, I suggest disabling crossfire first, shutting down the system, and unplugging power to the secondary card, then using the method above. After clearing the driver, install the driver anew. Reboot and test with a single card. If it works, shut down and plug power back into the secondary. When you boot, crossfire should set itself automatically, unless that has changed with Crimson.

Edit: I don't use that method myself, and I don't recommend DDU. I normally just install the new driver over the old, but I disable crossfire every time.


----------



## xer0h0ur

Hold on man, you're trying to play a brand new game with old drivers and expecting it to work fine in crossfire? I would only be trying to use the Catalyst 15.11.1 OR the Crimson 15.11.1, they are two different drivers.

This is the link to the Catalyst 15.11.1: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## BaddParrot

Quote:


> Originally Posted by *xer0h0ur*
> 
> Hold on man, you're trying to play a brand new game with old drivers and expecting it to work fine in crossfire? I would only be trying to use the Catalyst 15.11.1 OR the Crimson 15.11.1, they are two different drivers.
> 
> This is the link to the Catalyst 15.11.1: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


That's what I rolled back to from Crimson, the 15.11.1 beta drivers.
But I still went to FXAA & 8 samples to get my 60fps in FO4. (With the single Fury X.)


----------



## BaddParrot

Personally, I would turn off the Xfire & try FO4 & note the changes.
Then install the 15.11.1 & note changes. Then try the Xfire again.

I really don't think the new card is bad. It has to be in the settings or drivers.


----------



## Otterfluff

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I used DDU again and cleared off the Crimson driver. Installed 15.7.1. Made no changes to the stock settings of Catalyst, and kept the same settings that I was using for Fallout 4 (re: screenshot).
> 
> Now I'm getting 30 fps. An improvement, which clearly points to something being wrong with the Crimson driver. But, there is still something very wrong here over all.
> At this resolution, with Freesync on, I should be getting a _minimum_ of 50-60 fps on a single Fury X. And running _two_, it should be even more than that.
> 
> I just don't get it.


I share your pain. I was running Fallout 4 at 55-60 in most areas, with drops down to 46, on a single Fury X pre-Crimson.

Now it's at 30-31 fps with drops to 25.

Something is off and I cannot figure out how to change it back, short of rolling back to the beta driver before Crimson.

*edit* Did try d/c'ing power from the second card; no change. I have re-done every edit from before. Crossfire never worked in Fallout 4, but after recent updates it did not degrade performance either. I have no idea what they stuffed up in this driver, but it's performing worse in Fallout 4 than the pre-Crimson beta drivers.


----------



## Otterfluff

OK, I figured it out. The profile for Fallout 4 does not work, but the global profile for all games does. By default, every profile is set to -50% clock. Change the default profile to 0% clock and you will get 100% of the core clock. I am now back to 55-60fps in Fallout 4.

I have no idea why they set the profiles all to -50% by default. It makes no sense.


----------



## xer0h0ur

I think they royally fubared the game profiles. I already reported my issue with Crossfire not disabling in CS:GO profile and others confirmed the same issue with other DX9 games as well.


----------



## hyp36rmax

Quote:


> Originally Posted by *xer0h0ur*
> 
> I think they royally fubared the game profiles. I already reported my issue with Crossfire not disabling in CS:GO profile and others confirmed the same issue with other DX9 games as well.


I found this also


----------



## caenlen

I need an XFX Fury X BIOS; you can save it from GPU-Z. Attempting a flash of my XFX R9 Fury to a full Fury X as soon as I can get my hands on the BIOS.

Someone help!!! Techpowerup doesn't have it, already checked. https://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=&model=R9+Fury+X&interface=&memType=&memSize=

edit: CUInfo says I can unlock it to full fury x.... just waiting on an XFX bios... come on guys


----------



## Digitalwolf

xfxr9.zip 46k .zip file


This is the bios I had saved from my XFX R9 Fury X a few months back. Hopefully that will help you...


----------



## Orgios

EDIT: oops sorry deleted bios as it was for an air version

And this is mine; having more than one won't hurt.







I hope all goes well. I managed to unlock mine to 3840; won't risk going further.


----------



## caenlen

Quote:


> Originally Posted by *Digitalwolf*
> 
> xfxr9.zip 46k .zip file
> 
> 
> This is the bios I had saved from my XFX R9 Fury X a few months back. Hopefully that will help you...


thanks digital, +rep


----------



## NBrock

Anyone doing more reviews on these since the new drivers? I only saw one, and it was for Battlefront on the new drivers.
I ended up ordering the PowerColor Fury X since they dropped the price. Looking forward to playing with a new card.

Also... anyone have any information on how the Fury series does with Folding@home?


----------



## lullerkitten

Hey all,

Just showing mine














Best result so far

http://www.3dmark.com/3dm/9497626?

ps: crimson driver and fallout 4 don't mix well


----------



## sugarhell

Next version of AB will finally support fury x


----------



## xer0h0ur

Quote:


> Originally Posted by *lullerkitten*
> 
> Hey all,
> 
> Just showing mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Best result so far
> 
> http://www.3dmark.com/3dm/9497626?
> 
> ps: crimson driver and fallout 4 don't mix well


If you're referring to FPS in Fallout 4, check the global settings and the Fallout 4 profile settings for your Fury X's clock speeds. Someone had an issue where the global GPU clock speed was set to negative 50%, which was cutting his framerate in half.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *lullerkitten*
> 
> Best result so far
> 
> http://www.3dmark.com/3dm/9497626?
> 
> ps: crimson driver and fallout 4 don't mix well


How are you getting 13519 with a single Fury X and I'm only getting 12759 with _two_?

http://www.3dmark.com/fs/6627329

It's values/figures like this that are causing me to think that something is wrong with my configuration.


----------



## sugarhell

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> How are you getting 13519 with a single Fury X and I'm only getting 12759 with _two_?
> 
> http://www.3dmark.com/fs/6627329
> 
> It's values/figures like this that are causing me to think that something is wrong with my configuration.


Your cpu is way slower.


----------



## MalsBrownCoat

It is?

Admittedly, I'm not well versed in the whole "AMD vs Intel" platform thing, but on paper (or rather, screen), it appears that our maximum clocks are very similar.

Does this have something to do with AMD not processing Physics very well?


----------



## sugarhell

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> It is?
> 
> Admittedly, I'm not well versed in the whole "AMD vs Intel" platform thing, but on paper (or rather, screen), it appears that our maximum clocks are very similar.
> 
> Does this have something to do with AMD not processing Physics very well?


It is, especially on single-thread performance. Check your combined score vs. the other one.


----------



## xer0h0ur

Never mind the "overall score", because 3DMark will always score higher on Intel's CPUs in the physics test, therefore skewing the hell out of the overall score. So you should look at the graphics score on your test: you're getting 27774 to his 17335 graphics score.
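To put those two graphics scores in perspective, here's a quick back-of-the-envelope calculation (illustrative only, since the two results come from different systems, so this isn't a clean crossfire-scaling measurement):

```python
# Graphics scores quoted above
single_card = 17335  # lullerkitten's single Fury X graphics score
dual_card = 27774    # MalsBrownCoat's dual Fury X graphics score

# Ratio of dual-card to single-card graphics score
scaling = dual_card / single_card
second_card_gain = (scaling - 1.0) * 100

print(f"Scaling: {scaling:.2f}x ({second_card_gain:.0f}% gain from the second card)")
# Scaling: 1.60x (60% gain from the second card)
```

So the second card is showing up in the graphics score; it's the physics/combined portion dragging the overall number down.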

AMD won't have a worthy FX series processor to compare with Intel's offerings until the Zen processor comes out.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *xer0h0ur*
> 
> Hold on man, you're trying to play a brand new game with old drivers and expecting it to work fine in crossfire? I would only be trying to use the Catalyst 15.11.1 OR the Crimson 15.11.1, they are two different drivers.


For what it's worth, I just tried using the Catalyst 15.11.1 drivers (which are still listed as beta), and my FPS is now 17-19.

So, the 15.7.1 drivers worked better than 15.11.1.

I'll try upgrading to Crimson now and check the default clock settings as previously specified.


----------



## MalsBrownCoat

After installing Crimson and setting the sliders for the clock settings within the Fallout 4 profile in Crimson, I went to launch Fallout (from the launcher within Crimson), and my system locked up. I did a hard reboot and couldn't even get into Windows. The Windows loading screen came up and just went black after that.

I had to unplug GPU2 from the power (can't remove it from the system because it's part of the custom loop with hardlines). Rebooted a few times and finally got Windows to let me in. I went back into Crimson and changed the slider for the card (since it was only showing one now) back to the middle (0%). Fired up Fallout 4 and had around 14-17 FPS.

At this point, it's pretty clear that it's not a single, or even dual-card issue. It has to be something with the drivers as a whole.

So for me:

Crimson = 14-17 FPS
15.11.1 = 17-19 FPS
15.7.1 = 30 FPS

Suffice it to say, I am less than satisfied with this result. I expected stellar results given the amount of money that I've invested into this build.

= /

I'm going to make myself a drink now, and force myself to laugh about this.


----------



## xer0h0ur

Dude I highly suggest doing a manual driver wipe using BradleyW's guide. You can even run DDU after doing that simply for redundancy but I can't help but think you have old driver remnants in your registry causing you hell. There is no reason that you should also be getting those issues while running a single card.

BTW when you checked your clock speeds did you also check the global settings for the GPU clock? If either the global or game profile settings have your clock speed reduced then it will affect you in game.


----------



## sugarhell

I do this every single time with zero problems:

DDU to safe mode
Clean but dont restart
CCleaner registry
Restart
Install
Restart

If you have multiple monitors disconnect them before you start


----------



## MalsBrownCoat

I haven't made any changes since the last posting, but after checking the following games, I'm getting:

Borderlands (Presequel) - 163 FPS
SWTOR - 109 FPS
BF4 - 138 FPS

Doesn't look like there is any activity on GPU2 though.



I had no idea that there was even a "Global Settings" tab, so thanks for that suggestion.

Here's what it looks like so far.





One tab of the Overdrive Settings seems to have the settings centered. Though, I can't tell if that's GPU1 or GPU2.



The second overdrive tab seems to have the GPU settings all the way to the right.



Based on Afterburner showing activity on GPU1, is it a fair assumption that Global Overdrive tab #2 is actually GPU1?

If so, that's kind of odd that it would be set up that way.


----------



## xer0h0ur

Good god, man. +91% should be trying to super ultra mega overclock your GPU. Play with the sliders and see how it affects your framerate. Something sure seems off there.


----------



## MalsBrownCoat

Very odd, considering that I did not set any of these myself. Like I said, I wasn't even aware that this tab was here.

Gremlins.

Anyway, will adjusting the sliders affect the FPS in real time, or will I need to restart the game each time to see the difference?

(sorry for the remedial questions)


----------



## xer0h0ur

Oh, I know this; not putting any blame on you. It's a reported Crimson driver glitch. One of several glitches in this driver.


----------



## xer0h0ur

Since you mentioned you're using Afterburner, you may need to uninstall it (without remembering settings) if it's conflicting with the Crimson driver. Although, for the record, I am not getting any conflicts while using Afterburner to set my GPU clocks.


----------



## p4inkill3r

Tips for Crimson installation:

uninstall Afterburner/Trixx/GpuTweak/etc.
AMD Clean Uninstall Utility
Install Crimson


----------



## MalsBrownCoat

^ I'll give that a try.

Though, I'll point out that in the meantime, I tried some further testing. And, I hadn't made any further changes after having simply tested the other games that I listed earlier.
Repeat: No settings had been changed in Crimson, _however_, now I'm getting a steady 30 FPS in Fallout 4.

What.

In the actual.

ShXt.

?!?

Okayyyyy.......

The follow up testing after noticing this -

In the Fallout 4 profile (note that CrossFire is disabled):

Test 1:
In the Profile Overdrive tab (first one), I moved the GPU clock slider to the left and put it at -60.5%. Clicked back in to the open Fallout game and had 30 FPS.
Profile Overdrive tab (second one) was still set at middle (0%).

In the Profile Overdrive tab (second one), I moved the GPU clock slider to the left and put it at -60.5%. Clicked back in to the open Fallout game and had 30 FPS.

Test 2:
In the Profile Overdrive tab (first one), I moved the GPU clock slider to the right and put it at +90.4%. Clicked back in to the open Fallout game and had 30 FPS.
Profile Overdrive tab (second one) was reset at middle (0%).

In the Profile Overdrive tab (second one), I moved the GPU clock slider to the right and put it at +90.4%. Clicked back in to the open Fallout game and had 30 FPS.

(I find it strange that despite the profile being set to CrossFire disabled, there were still 2 Profile Graphics and Profile OverDrive tabs available)

In the Global Settings:

Test 1:
In the Global Overdrive tab (first one), I moved the GPU clock slider to the left and put it at -60.5%. Clicked back in to the open Fallout game and had 30 FPS.

Test 2:
In the Global Overdrive tab (second one), I moved the GPU clock slider to the right and put it at +90.4%. Clicked back in to the open Fallout game and had 30 FPS.

It appears that using the sliders for either the profile overdrive in the Fallout 4 profile, or in the global settings, and moving them all the way to the right or almost all the way to the left for the GPU clock had no effect; even after hitting Apply and re-running the battery of tests after fully restarting the game.

I've somehow miraculously broken the 17-19 FPS barrier, but I still seem to be locked at 30 FPS, no matter what settings I change to.


----------



## xer0h0ur

For what it's worth, I personally believe that the game profiles do not disable Crossfire. Open Radeon Settings, click Preferences, then click Radeon Additional Settings; after a moment it's going to open up an old CCC-style menu, and you can globally disable Crossfire there. From what I understand, opening this particular menu will trigger the shutdown error, but it should still keep the setting change.


----------



## MalsBrownCoat

Very strange stuff going on. I think I've noticed a few things though. The first is that the secondary tab for each setting, whether it's Profile Graphics or Profile OverDrive, seems to be for GPU #1.
Why the first tab is not for GPU #1 makes no sense.

Anyway, the second thing that I noticed is that within the Fallout 4 profile, the profile does NOT like the slider to be at the middle (+/- 0).

Now, if I move the GPU Clock slider on Profile OverDrive tab #2, over to the far right (+90.4%), suddenly I'm getting about 4 FPS.

If I move the slider to the left, and really leave it anywhere other than 0 (tried -20%, -30%, -60%, -90%), I get a max of 30 FPS. Can't seem to get any higher no matter what I do.

Why on earth would a card such as this only perform when the GPU is underclocked? Shouldn't it at bare minimum perform very well at 0? And, shouldn't it be far more than a measly 30 FPS?

I find it hard to accept something as simple as "well, this driver is completely b0rked with this game", considering that others are using the same card, and the same driver, and getting substantially higher FPS.

Feels like I'm beating a dead horse here, but I'm really at my wits end trying to figure this out.

= /


----------



## xer0h0ur

You really need to do a thorough driver wipe including getting rid of Afterburner without remembering settings and wiping the registry of driver remnants. If you're not starting with a clean slate then frankly you're wasting time testing things that will likely never work or fix anything.


----------



## huzzug

Hey, congrats on your new Fury. In the Crimson page, go to game profiles. Create a new profile for Fallout 4 and set the preferences individually. Once done, go to the main game profile page; on the top left side you'll see "More:". Under this heading, select Enable All. I'm not sure if it stays the same after reboot, but I was having trouble with the W3 profile and doing this solved it. Not to mention the clocks not sticking to what is set in Overdrive.


----------



## en9dmp

Just thought I'd chime in with my experience of Fallout 4 with my 2 Fury Xs... 15.11.1 Beta drivers basically disable crossfire for the game by default but gave a slightly smoother experience than the previous 15.11 drivers I was using. When I DDU'ed and installed crimson I got way worse performance, where in the same area the frame rate would go from 60-30 just by looking around me. It seemed to really tank if there was any fog being rendered.

No workable crossfire profile exists, the only way to get perfect scaling is to force AFR mode, but this introduces a number of annoying artifacts, such as extreme lighting, corpse flickering, AO flickering and gives your character a black face! So all that isn't really worth it for 60fps.

I play in 4k and I've had to go back to 15.11.1 to get reasonable frame rates with a single card.

I don't use AA though, so that might be helping.


----------



## lullerkitten

Quote:


> Originally Posted by *huzzug*
> 
> Hey. Congrats on your new Fury. In the Crimson page, go to game profiles. Create a new profile for Fallout4 and set the preferences individually. Now once done, go to the main game profile page and on the top left side, you'll see More: Under this heading, select Enable All. I'm not sure if it stays the same after reboot, but I was having trouble with the W3 profile and doing this solved it. Not to mention, the clocks not sticking to what is set in Overdrive.


This is giving me pretty stable 60 fps everywhere... but now there's some sort of stuttering or delay going on :/
Quote:


> Originally Posted by *xer0h0ur*
> 
> If you're referring to FPS in Fallout 4, check the global settings and the Fallout 4 profile settings for your Fury X's clock speeds. Someone had an issue where the global GPU clock speed was set to negative 50% which was murdering his framerate in half.


Everything's normal here; still, it is only in the big cities outside where I get the serious frame drops... However, with the previous beta drivers before Crimson it was a lot better.
And I clean-installed the drivers etc. before Crimson.








Quote:


> Originally Posted by *en9dmp*
> 
> Just thought I'd chime in with my experience of Fallout 4 with my 2 Fury Xs... 15.11.1 Beta drivers basically disable crossfire for the game by default but gave a slightly smoother experience than the previous 15.11 drivers I was using. When I DDU'ed and installed crimson I got way worse performance, where in the same area the frame rate would go from 60-30 just by looking around me. It seemed to really tank if there was any fog being rendered.
> 
> No workable crossfire profile exists, the only way to get perfect scaling is to force AFR mode, but this introduces a number of annoying artifacts, such as extreme lighting, corpse flickering, AO flickering and gives your character a black face! So all that isn't really worth it for 60fps.
> 
> I play in 4k and I've had to go back to 15.11.1 to get reasonable frame rates with a single card.
> 
> I don't use AA though, so that might be helping.


Have you tried the Assassin's Creed crossfire profile from AMD? This worked for me on Skyrim with my 290's, and since it's the same engine... Don't know if you can choose that in the Crimson driver tho?


----------



## NBrock

Quote:


> Originally Posted by *xer0h0ur*
> 
> For what its worth I personally believe that the game profiles do not disable Crossfire. Open Radeon Settings, click Preferences then click Radeon Additonal Settings and after a moment its going to open up an old CCC style menu and you can globally disable Crossfire there. From what I understand opening this particular menu though will trigger the shut down error but it should still keep the setting change.


You would be correct. I am seeing that with my 295X2. It is killing me in Battlefront since it hates crossfire and I get mad flickering. No matter what I do it still uses the second GPU. I reported the issue to AMD, as I am sure others probably have.


----------



## en9dmp

Quote:


> Originally Posted by *lullerkitten*
> 
> Have you tried the assassin's creed crossfire profile from amd? This worked for me on skyrim with my 290's, and since it's the same engine... Don't know if you can choose that in crimson driver tho?


I've tried the skyrim profile as that is also the same engine but I believe all these profiles just disable crossfire. I've not seen any confirmation on various forums that confirms any crossfire scaling exists with any profile on this engine. Would be good if someone could confirm otherwise?

If they can't be bothered (or don't have the ability) to build a decent engine that's up to today's standards, I've no idea why they didn't just license one that is... Pretty frustrating for everyone who's struggling to run this well on the best hardware that currently exists.


----------



## 98uk

Has anyone fixed the 2D-clock stutter in BF4?

Once every few minutes the card drops to 300MHz and everything lags.


----------



## lullerkitten

Quote:


> Originally Posted by *en9dmp*
> 
> I've tried the skyrim profile as that is also the same engine but I believe all these profiles just disable crossfire. I've not seen any confirmation on various forums that confirms any crossfire scaling exists with any profile on this engine. Would be good if someone could confirm otherwise?
> 
> If they can't be bothered (or don't have the ability) to build a decent engine that's up to today's standards I've no idea why they didn't just licence one that is... Pretty frustrating for everyone who's struggling to run this well on the best hardware that currently exists.


Well, I can't find it anymore, but back then the Assassin's Creed crossfire profile worked for me (you still have to edit the .ini files a bit for some weird interactions though).









Or maybe try this? It seems to be working for some people: https://steamcommunity.com/app/377160/discussions/0/496881136898634986/


----------



## p4inkill3r

So, has anyone burned up their card due to Crimson?


----------



## Agent Smith1984

Quote:


> Originally Posted by *p4inkill3r*
> 
> So, has anyone burned up their card due to Crimson?


My card just continues to lock up when I try to use Trixx, so I can't even attempt to burn it up.









Looking forward to 4.2.1 Afterburner.....


----------



## lullerkitten

Not yet









I did have a weird one-time occurrence though :/ at stock, after a few hours of just browsing, I got screen artifacts, which is quite disturbing actually.









Since then I've been monitoring it, but voltage stays at 0.9 during normal browsing and GPU temps at 24 degrees, so... odd.


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My card just continues to lockup when I try to use Trixx, so I can't even attempt to burn it up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward to 4.2.1 Afterburner.....


Hey, I ran into a somewhat similar issue on my 290X Vapor-X. My entire PC would freeze if I opened MSI Afterburner (it worked OK with Trixx though), and this started after installing Crimson.

Turns out it was conflicting with the Crimson Overdrive software. The first thing that tipped me off: I set a fan speed of 50% in Trixx, then reset it, and after restarting the PC it reverted to 50%. Crimson mirrors whatever settings you apply in other software, so when you reboot, the changes come back even if you had reverted them.

I figured there was some kind of conflict, so I uninstalled the Crimson software (but kept the drivers), and MSI Afterburner worked again. Might help you a bit...


----------



## Jflisk

Quote:


> Originally Posted by *p4inkill3r*
> 
> So, has anyone burned up their card due to Crimson?


Quote:


> Originally Posted by *lullerkitten*
> 
> Not yet
> 
> 
> 
> 
> 
> 
> 
> 
> Nope not yet
> 
> Did have a weird one time occurance tho :/ at stock after few hours just browsing screen artifacts, which is quite disturbing actually
> 
> 
> 
> 
> 
> 
> 
> 
> 
> still after that happenstance i'm monitoring it but voltage stays with normal browsing at 0.9 and gpu tems at 24 degrees so... odd


The weird artifacts (green half screen, weird tearing): that's documented and normal on some Furys.


----------



## HagbardCeline

The only issue I've had since installing Crimson is a problem in Star Wars: Battlefront on the 20v20 Hoth map. The lighting is screwed up, and there are a couple of spots where, past a certain distance, everything turns black. One part of the cave also starts off looking strange, but it resolves when you step all the way in. I changed the graphics settings and it got slightly better, but it was still screwed up. Will probably revert.


----------



## dagget3450

All my Fury Xs fried. I installed the Crimson driver and it immediately ejected all the cooling fluid from the pump housings. Then it turned off my fans and loaded FurMark. It's weird because I didn't even have FurMark on my PC before installing Crimson. I couldn't exit it and shut down; even my power button was disabled, so I immediately tried to unplug the PSU from the wall. I received an instant electrical jolt and it kicked me back from the wall. I ran to the kitchen and got a bucket of water to douse the PC with, but apparently the Crimson driver had somehow changed the molecular makeup of the surface of the PC: the water just fell off and wouldn't absorb at all. As the GPUs glowed red hot, I began to hear an evil voice from the onboard speaker laughing at me as it said PowerPlay and ultra low power were disabled. It even broke the AMD GPUs I have on my workbench, which were not installed in anything; they literally melted through the desk and four feet into the ground, and the surrounding dirt was turned to glass. After I woke up....

lol, no issues here aside from clock throttling in some games.


----------



## Himo5

Quote:


> Originally Posted by *Jflisk*
> 
> The weird artifacts (green half screen, weird tearing): that's documented and normal on some Furys.


Better to call it for what it is rather than 'documented and normal', which is that replacing the card will not stop this happening, regardless of card manufacturer, operating system, application or driver version. In fact, AMD don't yet know what it is, let alone have a solution for it, as this Community thread shows.


----------



## Jflisk

Quote:


> Originally Posted by *Himo5*
> 
> Better to call it for what it is rather than 'documented and normal', which is that replacing the card will not stop this happening, regardless of card manufacturer, operating system, application or driver version. In fact, AMD don't yet know what it is, let alone have a solution for it, as this Community thread shows.


That would be where I got the information from. Mine does it too from time to time. Disabling ULPS seems to have helped.
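For anyone else trying the ULPS route: the usual method is zeroing the `EnableUlps` DWORD under the display-adapter class keys in the registry. A sketch of what that looks like as a .reg file; the `0000` instance below is an assumption (instance numbers vary per system), so search all the `00xx` subkeys for `EnableUlps` rather than trusting this exact path:

```
Windows Registry Editor Version 5.00

; Illustrative instance key only -- on your system, search every 00xx subkey
; under this class GUID for an EnableUlps value and zero each one you find.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed afterwards, and driver reinstalls tend to re-enable ULPS, so recheck after updating.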


----------



## BIGTom

Quote:


> Originally Posted by *en9dmp*
> 
> Just thought I'd chime in with my experience of Fallout 4 with my 2 Fury Xs... 15.11.1 Beta drivers basically disable crossfire for the game by default but gave a slightly smoother experience than the previous 15.11 drivers I was using. When I DDU'ed and installed crimson I got way worse performance, where in the same area the frame rate would go from 60-30 just by looking around me. It seemed to really tank if there was any fog being rendered.
> 
> No workable crossfire profile exists, the only way to get perfect scaling is to force AFR mode, but this introduces a number of annoying artifacts, such as extreme lighting, corpse flickering, AO flickering and gives your character a black face! So all that isn't really worth it for 60fps.
> 
> I play in 4k and I've had to go back to 15.11.1 to get reasonable frame rates with a single card.
> 
> I don't use AA though, so that might be helping.


I experienced the same performance issues with FPS drops in Fallout 4 using either of the Crimson drivers on my single Fury X. Reverted to Catalyst 15.11.1 and FPS is much gooder.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *xer0h0ur*
> 
> You really need to do a thorough driver wipe including getting rid of Afterburner without remembering settings and wiping the registry of driver remnants. If you're not starting with a clean slate then frankly you're wasting time testing things that will likely never work or fix anything.


I removed Afterburner entirely and told it not to remember previous settings.

Restarted.

Followed BradleyW's guide.

Restarted.

Ran DDU in Safe Mode.

Ran CCleaner for the registry.

Restarted.

Installed Catalyst 15.11.1 (beta).

Restarted.

Launched Fallout 4.

17-19 FPS.

CTRL-ALT-FKIT


----------



## p4inkill3r

Do you have vsync enabled? What about frame rate target control?

Something is very, very wrong for you to be getting those results.


----------



## MalsBrownCoat

OK, NOW we're getting somewhere. I *am* running 3 monitors (MG279s) at 1440p, which are FreeSync capable, and the setting was ON.

I did a scan through all the settings in CCC, and it had recognized that my monitors were capable and automatically turned the CCC/FreeSync setting ON.

I turned all three monitors' FreeSync OFF (after some fxckery with a monitor then deciding to be unrecognized and having to fiddle around with some settings... I think a triple-monitor setup doesn't like it when you turn off FreeSync on a monitor that isn't in the _first_ slot on the GPU).

So, verified that all three monitors had their FreeSync to OFF.

Verified that CCC was not enabling FreeSync.

Launched Fallout 4.

48-50 FPS.

Significantly better.

Still a little surprised that that's all I'm getting out of a Fury X, but at this point...I'll take it.

(I'm also wondering if Crimson would do better now. I'm not sure what differences there were between the _Catalyst_ version of 15.11.1 and the _Crimson_ version.)

*shrug*

Here's how the 3D Application Settings look now (note that this is stock, and I have not made any game profiles). Should I change anything?



And here is the Flat Panel Properties. Again, anything I should change here?


----------



## p4inkill3r

I think you should try upgrading to Crimson and attempting to play on just one monitor with Freesync enabled to gauge theoretical performance.

Attempting to run your setup as it stands seems pie in the sky, as I don't think there are working CFx or SLI profiles for Fallout 4 yet, much less fully fleshed-out multi monitor support.


----------



## lullerkitten

Quote:


> Originally Posted by *Himo5*
> 
> Better to call it for what it is rather than 'documented and normal', which is that replacing the card will not stop this happening, regardless of card manufacturer, operating system, application or driver version. In fact, AMD don't yet know what it is, let alone have a solution for it, as this Community thread shows.


That's comforting...


----------



## ht_addict

Has anyone done a repaste of their Fury X yet? If so, how easy is it?


----------



## xer0h0ur

Quote:


> Originally Posted by *NBrock*
> 
> You would be correct. I am seeing that with my 295x2. It is killing me in Battle Front since it hates crossfire and I get mad flickering. No matter what i do it still uses the second GPU. I reported the issue to AMD as I am sure others probably have.


We 295X2 owners got boned on this driver. Globally disabling crossfire as I suggested before will only disable crossfire between cards, but will not disable it on the 295X2 no matter what. In other words it will disable tri-fire between the 295X2 and 290X, as is the case in my rig, but the 295X2 itself will remain crossfired. Previously all you had to do was create an application profile within CCC and disable crossfire there and you were good to go. Can't do that anymore, and since the game profiles aren't sticking the crossfire setting on the 295X2... we're screwed.


----------



## Neon Lights

Quote:


> Originally Posted by *ht_addict*
> 
> Anyone do a repaste yet of their FuryX? if so how easy is it?


You have to unscrew a lot of screws and wipe the old thermal paste off, which is a lot of fiddling because there is a lot of it by default. If you have a cleaning agent it will come off quite a bit easier. Then you apply your new paste and put back all the screws you previously removed.


----------



## xer0h0ur

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I removed Afterburner entirely and told it not to remember previous settings.
> 
> Restarted.
> 
> Followed BradleyW's guide.
> 
> Restarted.
> 
> Ran DDU in Safe Mode.
> 
> Ran CCleaner for the registry.
> 
> Restarted.
> 
> Installed Catalyst 15.11.1 (beta).
> 
> Restarted.
> 
> Launched Fallout 4.
> 
> 17-19 FPS.
> 
> CTRL-ALT-FKIT


Holy hell mate. You need a priest or something.

Edit: So all those headaches were because of Freesync? WAT?


----------



## lullerkitten

Hmmm score is now 500 points lower with higher clocks







dammit


----------



## Neon Lights

Is anyone using Crossfire having problems disabling it in Radeon Software Crimson 15.11.1? Every time I set it to disabled and then click Apply, it just sets itself back to enabled and nothing happens (okay, actually the main Radeon Software window closes and then opens again).


----------



## Neon Lights

Quote:


> Originally Posted by *lullerkitten*
> 
> 
> 
> Hmmm score is now 500 points lower with higher clocks
> 
> 
> 
> 
> 
> 
> 
> dammit


How were you able to set the voltage to +198 mV? I can only set it to +75mV.


----------



## xer0h0ur

Quote:


> Originally Posted by *Neon Lights*
> 
> Anyone using Crossfire having problems disabling it in Radeon Software Crimson 15.11.1? Everytime I set it on disabled and then click apply it just sets it back to enabled and nothing happens (okay actually the main Radeon Software window closes and then opens again).


Just a copy pasta from what I said earlier:

Open Radeon Settings, click Preferences, then click Radeon Additional Settings; after a moment it's going to open an old CCC-style menu and you can globally disable Crossfire there. From what I understand, opening this particular menu will trigger the shutdown error, but it should still keep the setting change.

This works for disabling crossfire between video cards; it does not work for disabling crossfire on a 295X2. So as long as you're crossfired using two or more video cards, you should be able to globally disable crossfire like this. Trying to disable crossfire using game profiles is seemingly useless.


----------



## lullerkitten

Quote:


> Originally Posted by *Neon Lights*
> 
> How were you able to set the voltage to +198 mV? I can only set it to +75mV.


hex edit trixx


----------



## Neon Lights

Quote:


> Originally Posted by *lullerkitten*
> 
> hex edit trixx


Please give me your hex-edited files, or direct me to a guide on how to do this!

Does it give you better overclocks than the default +75mV?


----------



## Neon Lights

Quote:


> Originally Posted by *Neon Lights*
> 
> Please give me your hex edited files or give me/direct me to a guide on how to this!
> 
> Does it give you better overclocks than the default +75mV?


I think I found something about it:

https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/

Did I just accidentally skip something, or has raising the maximum possible overvoltage in Trixx not been explained or linked to in this thread?
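For what the hex edit mechanically amounts to: the guide has you overwrite the bytes encoding the +75 mV slider cap inside the Trixx executable. A minimal Python sketch of that kind of binary patch; the pattern and replacement bytes below are placeholders for illustration, not the real Trixx offsets (those are in the linked guide):

```python
from pathlib import Path


def patch_binary(path: str, pattern: bytes, replacement: bytes) -> bool:
    """Replace the first occurrence of `pattern` in the file with `replacement`.

    Returns False if the pattern isn't found (wrong version, or already patched).
    """
    assert len(pattern) == len(replacement), "patch must not change the file size"
    data = Path(path).read_bytes()
    index = data.find(pattern)
    if index == -1:
        return False
    Path(path).write_bytes(data[:index] + replacement + data[index + len(pattern):])
    return True


# Purely illustrative: pretend 0x4B (75) is the limit byte and raise it to 0xC8 (200).
# patch_binary("TriXX.exe", b"\x4B\x00\x00\x00", b"\xC8\x00\x00\x00")
```

Always patch a copy of the executable, and expect the byte pattern to differ between Trixx versions.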


----------



## xer0h0ur

People have mentioned it several times over. You missed it.


----------



## Neon Lights

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just a copy pasta from what I said earlier:
> 
> Open Radeon Settings, click Preferences, then click Radeon Additional Settings; after a moment it's going to open an old CCC-style menu and you can globally disable Crossfire there. From what I understand, opening this particular menu will trigger the shutdown error, but it should still keep the setting change.
> 
> This works for disabling crossfire between video cards, does not work for disabling crossfire on a 295X2. So as long as you're crossfired using 2 or more video cards you should be able to globally disable crossfire like this. Trying to disable crossfire using game profiles is seemingly useless.



I did that, but it seems I had some sort of bug or a bad installation, because it would always jump back to enabled. After restarting my PC again, I got a low-resolution screen, and when attempting to start Radeon Settings it said that no GPU driver was installed.
I then used DDU and installed Radeon Crimson 15.11.1 again (I had previously just installed it over CCC 15.11.1), and now I was able to disable Crossfire with that method.


----------



## lullerkitten

Quote:


> Originally Posted by *Neon Lights*
> 
> Please give me your hex edited files or give me/direct me to a guide on how to this!
> 
> Does it give you better overclocks than the default +75mV?


It gives me higher clocks (1180) when I'm feeding it 1.4 V... but the performance is actually a lot worse.









So for me, no. Everything above 1130 on the core needs too much extra voltage for the performance it buys, for me anyway; even +150 mV doesn't keep 1140 on the core stable.

And yup, you found it.
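A back-of-envelope look at why that last 1130-to-1140 step gets so expensive: dynamic power scales roughly with f·V², so a small clock gain bought with a big voltage bump blows up heat fast. The 1050 MHz / 1.212 V baseline below is an assumed stock-ish Fury X operating point, not a measured value:

```python
# Rough dynamic-power model, P ~ f * V^2 (illustrative; not measured Fiji data).
def relative_power(f_mhz: float, v: float,
                   f0_mhz: float = 1050.0, v0: float = 1.212) -> float:
    """Dynamic power relative to an assumed 1050 MHz / 1.212 V baseline."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2


# ~8.6% more core clock at roughly +150 mV costs on the order of 37% more
# dynamic power in this simple model.
print(relative_power(1140, 1.362))
```

The model ignores leakage (which also rises with voltage and temperature), so the real-world penalty is, if anything, worse.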


----------



## Neon Lights

Quote:


> Originally Posted by *Neon Lights*
> 
> I can do it with the same GPU clock (although I used 1231MHz).
> 
> I am able to run the "Furry and Tessy (GL4)" Test (settings are 1920x1080, 4xMSAA and Fullscreen) on the MSI Kombustor 2.5.0 without it crashing.
> At +10MHz, so 1241MHz, I get slight artifacts and at another +10MHz, so 1251MHz, I get a bit more artifacts and after a few seconds a crash.


I want to rectify what I wrote here. While I can run that test at 1231MHz with +75 mV, using Fallout 4 as a stability test I can only run at 1171MHz with +75 mV for a longer time without crashing.


----------



## Noirgheos

Can anybody here say whether the XFX R9 Fury is worth $60 less than the Sapphire R9 Fury?

What are the disadvantages to the XFX one?

http://www.newegg.ca/Product/Product.aspx?Item=N82E16814202157&cm_re=r9_fury-_-14-202-157-_-Product

http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150757&cm_re=r9_fury-_-14-150-757-_-Product


----------



## Orgios

Sapphire has a dual-BIOS switch and is somewhat better looking. I have the XFX and I am really satisfied: unlocked to 3840 shaders and mildly overclocked to 1075 with no changes to power, everything running smooth so far.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Orgios*
> 
> Sapphire has a dual bios switch and is somewhat better looking, I have the XFX and I am really sattisfied, unlocked to 3840 cu cores and mildly overclocked to 1075 with no changes to power, everything running smooth so far


Have you been able to use voltage control on your XFX? My CU's were hard locked on mine, so I couldn't unlock anything.


----------



## Jflisk

XFX dumped their lifetime warranty with the Fury X, which doesn't exactly give me the warm and fuzzies about these cards. Not to mention all the Fury Xs come from AMD regardless of the name on them.


----------



## NBrock

My Fury X just got delivered. I'll be hooking her up tonight.


----------



## Orgios

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have you been able to use voltage control on your XFX? My CU's were hard locked on mine, so I couldn't unlock anything.


If you mean through Sapphire Trixx, then yes I have, though I don't plan to OC the card that much. I'd rather leave it at stock voltages for the time being, so it's 3840 shaders @ 1075/540, stable for me (at least for now).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Orgios*
> 
> If you mean through sapphire trixx then yes I have , though I dont plan to oc the card that much , I'd rather leave it at stock voltages for the time being so its 3840 cu cores @1075 @540 stable for me (at least for now)


Are you using HDMI or DP?

I have the card, and trying to use Trixx in any way, shape, or form causes system reboots... no matter what I do.

I heard one other person mention they were getting this issue using HDMI but not with DP.

I can't currently test DP...


----------



## Orgios

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you using HDMI or DP?
> 
> I have the card, and trying to use Trixx in any way shape or form causes system reboots..... no matter what I do....
> 
> I heard one other person mention they were getting this issue using HDMI but not with DP.
> 
> I can't currently test DP...


I am using DisplayPort. Even when the GPU was unstable while I was testing OC settings, it would only hang for a second and recover.


----------



## Elmy

Can I be in the Club?

Just doing some testing before waterblocks go on....


----------



## NBrock

Has anyone got links to more info on overclocking these cards and the HBM, now that Trixx has unlocked voltage?

Also what has everyone been able to do for safe 24/7 speeds?


----------



## Otterfluff

Lowering the temps via a custom loop has a huge effect on higher HBM clocks. Getting 600MHz is easy for me just by lowering the temp; I add voltage to my HBM via hard mods, and the highest it seems to go is 630MHz at 1.4V, stable 24/7.

To be honest, the best overclocking info for Fury is in this thread, but you have to read back.
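For a sense of what a 500-to-600 MHz HBM bump is worth on paper: Fiji's HBM1 is a 4096-bit, double-data-rate interface, so peak bandwidth is just bus width × clock × 2 transfers per clock. A quick check:

```python
# Peak HBM bandwidth on Fiji (Fury / Fury X): 4096-bit bus, DDR (2 transfers/clock).
BUS_WIDTH_BITS = 4096


def hbm_bandwidth_gbps(clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for Fiji's 4096-bit DDR HBM at a given clock."""
    return BUS_WIDTH_BITS / 8 * clock_mhz * 1e6 * 2 / 1e9


print(hbm_bandwidth_gbps(500))  # stock 500 MHz: 512.0 GB/s
print(hbm_bandwidth_gbps(600))  # 600 MHz: 614.4 GB/s, a 20% gain
```

Whether games see that 20% depends on how bandwidth-bound they are, which matches the mixed scaling results reported in this thread.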


----------



## ht_addict

In the new Crimson drivers there is an option for Virtual Super Resolution. After enabling this, Call of Duty: Black Ops now renders in 4K, then scales down to 1080p on my OLED. Has anyone else enabled this feature? The game still runs at 60FPS with my


----------



## Clockster

Well, after installing Trixx on top of the latest Crimson drivers my card is now causing issues.
As soon as I join or play a game, the machine black-screens with audio still playing, and then I have to restart. I've tried absolutely everything I can think of, but it still does it.


----------



## NBrock

Quote:


> Originally Posted by *Otterfluff*
> 
> Lowering the temps via custom loop has huge effects on higher HBM clocks. Getting 600hz for me is easy just by lowering the temp, I add voltage to my HBM via hard mods and the highest it seems to go is 630mhz. @ 1.4v stable 24/7
> 
> To be honest the best overclocking info for fury is in this thread, but you have to read back.


Roger that. I did try looking, but there are so many pages, lol. Thanks for the info. What kind of temps are you talking about? My rig is in my basement, and this time of year it is nice and cool. Right now under load at 1100 core and 550 HBM it's running 37°C at 20% fan. I haven't tried bumping anything higher than that yet (just got the card and didn't have much time for testing).


----------



## Otterfluff

My water temps under load are 24C and the core gets to 29C. I have two fury X and they both perform the same for HBM overclocking. If you can get it cool enough then 600Mhz HBM is very doable and stable.


----------



## NBrock

Quote:


> Originally Posted by *Otterfluff*
> 
> My water temps under load are 24C and the core gets to 29C. I have two fury X and they both perform the same for HBM overclocking. If you can get it cool enough then 600Mhz HBM is very doable and stable.


Nice thanks for the info.


----------



## Arizonian

Got my 780 Ti sold off for $300 and picked up an *XFX Triple D Fury* for $480, plus got the Star Wars Battlefront bonus for a 3rd rig I was going to buy a copy for at Christmas anyway. Quite the bargain. I'm glad I did it last night, because this morning I see they are back up to $529. It came with a $20 pre-paid Visa rebate, so all said and done, $180 to upgrade minus the $60 game value was like getting my Fury for $120.









I'm a gamer, so I'm not concerned about benching or the small overclock headroom. I plan on keeping my U2713HM IPS 60Hz monitor for about another year.









Will post pics in my rig when it's all set up. Going to get the system set up with drivers using a backup XFX R7 370 4GB while I wait for it in the mail this week.


----------



## Jflisk

Quote:


> Originally Posted by *ht_addict*
> 
> In the new Crimson drivers there is an option for Virtual Super Resolution. After enabling this, Call Of Duty Black Ops now renders in 4k, then scales it down to 1080p on my OLED. Anyone else enable this feature? Game still run's 60FPS with my


If you go into BLOPS 3 and change the resolution to 2560x1440... I have 2x Fury X and have no problem running BLOPS 3 at 60FPS, and it looks way better than 1080p. Sometimes when you start the game the resolution might be off and create an oversized screen; go into options and switch from full screen to windowed and back to correct the resolution.


----------



## Masika

I am going to join the club and purchase a Fury or Fury X to last me until the X2 comes (upgrading from 6990s). So the question to pose to members: which one? Also, what brand? I am an ASUS nut, but I've been disappointed with their support of late. I am considering the Fury and trying my luck at unlocking it to a Fury X. All advice welcomed.


----------



## NBrock

I just picked up a PowerColor Fury X (my first PowerColor card ever; I usually get Sapphire). Quality seems fine, and the price was great during Black Friday as well, so I lucked out there.

I may be wrong, but I swear I read that the Fury Xs are all essentially the same cards and chips with different branding on them.


----------



## malitze

Quote:


> Originally Posted by *NBrock*
> 
> I just picked up a Power Color Fury X (my first Power Color card ever. I usually get Sapphire). Quality seems fine. Price was great during black Friday as well so I lucked out there.
> 
> I may be wrong but I swear I read that with the Fury X they are all essentially the same cards and chips with different branding on them.


Yep, they are


----------



## Agent Smith1984

Just traded the XFX Fury outright for a GTX 980 K|NGP|N.....

Traitor flames are understandable guys, sorry


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just traded the XFX Fury outright for a GTX 980 K|NGP|N.....
> 
> Traitor flames are understandable guys, sorry


I think with your CPU you're going to get more mileage out of that 980 than the Fury. That, and it looks like you are big on overclocking, so you will be happy I think. Good luck!

Quote:


> Originally Posted by *Elmy*
> 
> Can I be in the Club?
> 
> Just doing some testing before waterblocks go on....


I would be interested in some benchmarks to compare to my quad Fury X setup. Also, I thought you were getting the Fury X2??? Hmmm


----------



## JonDuma

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just traded the XFX Fury outright for a GTX 980 K|NGP|N.....
> 
> Traitor flames are understandable guys, sorry


To be fair, I sold my EVGA GTX 970 FTW 2-way SLI setup for a Sapphire R9 Fury (soon to be Crossfire).


----------



## Agent Smith1984

Quote:


> Originally Posted by *JonDuma*
> 
> To be fair, i sold my EVGA GTX 970 FTW 2-way SLI for Sapphire R9 Fury (soon on Crossfire)


My biggest reason for the trade is that I'm getting similar performance (if not better, if I can get the 980 into the 1500+ core range), and I'll have HDMI 2.0 for 60Hz on my 4K TV.


----------



## xer0h0ur

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just traded the XFX Fury outright for a GTX 980 K|NGP|N.....
> 
> Traitor flames are understandable guys, sorry


What is the point? If that is what satisfies you and makes you happy, then so be it. It's the miserable dolts who cry about things they can't change and refuse to get something else that annoy me.


----------



## xer0h0ur

Quote:


> Originally Posted by *dagget3450*
> 
> I think with your cpu your going to get more mileage on that 980 over the fury. That and it looks like you are big on overclocking so you will be happy i think. Good luck!
> I would be interested in some benchmarks to compare to my quad furyx setup. Also, i thought you were getting furyx2??? hmmm


He is sponsored. He got a Nano, he got those Fury Xs, and he will get Fury X2s.


----------



## MerkageTurk

Wow, my GPU is so bad: no OC, power limit at default and everything, and I'm getting driver crashes and the red screen of death.


----------



## Forceman

Quote:


> Originally Posted by *MerkageTurk*
> 
> Wow my gpu is so bad, no OC, power limit default and everything getting driver crashes and red screen of death


Sounds like an immediate return/RMA.


----------



## Jflisk

Quote:


> Originally Posted by *NBrock*
> 
> I just picked up a Power Color Fury X (my first Power Color card ever. I usually get Sapphire). Quality seems fine. Price was great during black Friday as well so I lucked out there.
> 
> I may be wrong but I swear I read that with the Fury X they are all essentially the same cards and chips with different branding on them.


You might want to check the BIOS on the PowerColor Fury X. .066 is the newest and also turns on UEFI. All Fury X cards are the same card; they all come from AMD and get a label slapped on. I have 2 of the PowerColors and had to update the BIOS on them: I had .63, the original BIOS, and there is a .64 and the latest, .66. I did my one card and still have to do the second.

This is the Sapphire BIOS for the Fury X, but it works on PowerColor:
https://www.techpowerup.com/vgabios/177517/sapphire-r9furyx-4096-150721.html

Proceed at your own risk, yada yada yada.


----------



## MerkageTurk

Should I return it then? Could it be driver related?


----------



## Forceman

Quote:


> Originally Posted by *MerkageTurk*
> 
> I should return it than? Could it be driver related?


Pretty sure the red screen of death is a card problem, not a driver one. You can try some different drivers to be sure if you want, but I don't think it'll make a difference if you are at stock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *xer0h0ur*
> 
> What is the point? If that is what satisfies you and makes you happy then so be it. Its the miserable dolts that cry about things they can't change and refuse to get something else that annoy me.


My thoughts exactly!


----------



## NBrock

Quote:


> Originally Posted by *Jflisk*
> 
> You might want to check the bios on the Power Color FURY X . .066 is the newest also turns on UEFI .All Fury X are The same card all come from AMD and get label slapped. I have 2 of the power colors and had to update the bios on them. I had like .63 the original bios there is a .64 and the latest .66 . I did my one card still have to do the second.
> 
> This is the sapphire Bios for FURY X but works on Power color.
> https://www.techpowerup.com/vgabios/177517/sapphire-r9furyx-4096-150721.html
> 
> Proceed at your own risk - yada yada yada


Good looking out. Any known improvements (memory timings?)

Looks like I have 015.048.000.064.005990....So I am going to assume that is .64


----------



## DMatthewStewart

Quote:


> Originally Posted by *battleaxe*
> 
> Depends if you plan on 4K or not, I think. If 4K, Fury is only gonna get stronger with drivers; happens every time. 1440p is less definitive, and at 1080p you may as well go NVIDIA, as Fury is just too far behind IMO. Personally, I think 1080p is yesterday's news and not worth NVIDIA claiming any bragging rights.
> 
> I think the Fury/X at 4K is a winner, especially now with volts unlocked. We should see in the next few weeks what these can really do. Remember when the 290X first came out? It wasn't long before it was owning the Titan. I'm hoping for a repeat.


I'm not considering 4K for a long time. I'm hooked on 144Hz; it's just so fluid and smooth. And since I'm on 1080p now, my next step up will be 1440p. I've been holding off on that because I can't justify the price jump: my monitor (1080p 144Hz) was right around $300, while 1440p at 120-144Hz is close to $600 and up. It's stupid. I'll wait until more people buy 4K and watch the price drop on 1440p panels.

I've also finally decided to wait for the next-gen GPUs. I have 290X Lightnings that are awesome right now, and the performance difference doesn't justify an upgrade. Especially since AMD just revealed more details about Arctic Islands today: double the HBM along with significant performance-per-watt increases. That will be here before we know it.


----------



## battleaxe

Quote:


> Originally Posted by *DMatthewStewart*
> 
> I'm not considering 4K for a long time. I'm hooked on 144Hz. It's just so fluid and smooth. And since I'm on 1080p now, my next step up will be 1440p. I've been holding off on that because I can't justify the price jump: my monitor (1080p 144Hz) was right around $300, while 1440p with 120-144Hz is close to $600 and up. It's stupid. I'll wait until more people buy 4K and watch the price drop on 1440p.
> 
> I've also finally decided to wait for the next-gen GPUs. I have 290X Lightnings that are awesome right now, and the performance difference doesn't justify an upgrade. Especially since AMD just revealed more details today about Arctic Islands: double the HBM along with significant performance-per-watt increases. That will be here before we know it.


I'm doing the same thing. Got 290Xs in CrossFire and waiting for next gen also. No point in upgrading right now at all, if you ask me. But kudos to those who do; I just plan to wait.


----------



## Jflisk

Quote:


> Originally Posted by *NBrock*
> 
> Good looking out. Any known improvements (memory timings?)
> 
> Looks like I have 015.048.000.064.005990....So I am going to assume that is .64


That would be correct: no UEFI on that one, but it is a good one. I had the .63 and had multiple problems that the newer BIOSes don't seem to give me.


----------



## Evil-Mobo

Quote:


> Originally Posted by *JonDuma*
> 
> To be fair, I sold my EVGA GTX 970 FTW 2-way SLI for a Sapphire R9 Fury (soon in CrossFire)


I'm the one who traded with him. I ordered the second card already and will be going CrossFire with them under water on my X99 build.

Will be looking around here for the what's what on these cards. I heard a rumor that they can be unlocked via BIOS to get closer to Fury X performance. Is this true? If so, where do I need to look?

Thanks guys


----------



## xer0h0ur

Quote:


> Originally Posted by *DMatthewStewart*
> 
> I'm not considering 4K for a long time. I'm hooked on 144Hz. It's just so fluid and smooth. And since I'm on 1080p now, my next step up will be 1440p. I've been holding off on that because I can't justify the price jump: my monitor (1080p 144Hz) was right around $300, while 1440p with 120-144Hz is close to $600 and up. It's stupid. I'll wait until more people buy 4K and watch the price drop on 1440p.
> 
> I've also finally decided to wait for the next-gen GPUs. I have 290X Lightnings that are awesome right now, and the performance difference doesn't justify an upgrade. Especially since AMD just revealed more details today about Arctic Islands: double the HBM along with significant performance-per-watt increases. That will be here before we know it.


Well, for what it's worth, there's not much of a difference from 120Hz to 144Hz, and DP 1.3 is bringing 120Hz @ 4K. Yet to be seen whether Arctic Islands or Pascal will have DP 1.3, but it's presumed they will. In any event, I am also in the wait-and-see crowd. If dual Fiji XTs = my triple Hawaii XTs, and Greenland is going to double the transistor count, then there is a fair chance a single top-end Arctic Islands GPU can replace the performance of my 3 GPUs. Would be hard to ignore that.
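
The DP 1.3 point above is really a bandwidth question, so here's a back-of-envelope check (my numbers; 8-bit color assumed and blanking overhead approximated, so treat it as a sketch):

```python
# Rough sanity check of "DP 1.3 is bringing 120Hz @ 4K": compare the
# bandwidth 3840x2160 @ 120 Hz needs against DP 1.3's effective rate.
def required_gbps(width, height, refresh_hz, bits_per_pixel=24,
                  blanking_overhead=1.05):
    """Approximate link bandwidth a video mode needs, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(3840, 2160, 120)  # roughly 25 Gbit/s
dp13 = 32.4 * 0.8                      # HBR3: 32.4 Gbit/s raw, 8b/10b coding
print(f"need ~{need:.1f} Gbit/s, DP 1.3 carries ~{dp13:.1f} Gbit/s")
```

So 4K @ 120Hz at 8-bit color only just squeezes into DP 1.3's HBR3 link rate, which is why DP 1.3 is the gating feature here.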


----------



## ManofGod1000

I am seriously considering an air-cooled XFX R9 Fury. I already tried 2 x R9 390 but they ran too hot for any long-term use, so I am returning them. (94C max while playing Crysis 3, and that is in the winter. :thumbsdown:) Right now, I am playing on an R9 290X, which does OK at 4K resolutions. I guess what I am wondering is, how much better is an air-cooled R9 Fury? Thanks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ManofGod1000*
> 
> I am seriously considering an air-cooled XFX R9 Fury. I already tried 2 x R9 390 but they ran too hot for any long-term use, so I am returning them. (94C max while playing Crysis 3, and that is in the winter. :thumbsdown:) Right now, I am playing on an R9 290X, which does OK at 4K resolutions. I guess what I am wondering is, how much better is an air-cooled R9 Fury? Thanks.


Fury shines at 4k...

It was a solid 6-10fps improvement going from 390 @ 1200/1700 to a fury @ 1060/560 for me!

It's not going to blow your mind though...


----------



## Evil-Mobo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Fury shines at 4k...
> 
> It was a solid 6-10fps improvement going from 390 @ 1200/1700 to a fury @ 1060/560 for me!
> 
> It's not going to blow your mind though...


What about two of them?


----------



## ManofGod1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Fury shines at 4k...
> 
> It was a solid 6-10fps improvement going from 390 @ 1200/1700 to a fury @ 1060/560 for me!
> 
> It's not going to blow your mind though...


Cool, thank you. However, I am going to be upgrading from an R9 290 unlocked to a 290X, at stock speeds. It should be maybe 10-15 fps more, stock to stock, I would think, based on what you just said. Also, it will run a lot cooler and require less power as well. (I have not tried one, but I am just extrapolating from what you have said above.)


----------



## Agent Smith1984

Quote:


> Originally Posted by *Evil-Mobo*
> 
> What about two of them?


Two Furys in 4K look to be almost as awesome as three 290s!


----------



## Evil-Mobo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Two Furys in 4K look to be almost as awesome as three 290s!


Sweet!


----------



## MerkageTurk

Still more RSODs. Guess you were right about the card being defective.

But I can't be bothered to remove it lol


----------



## ManofGod1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Fury shines at 4k...
> 
> It was a solid 6-10fps improvement going from 390 @ 1200/1700 to a fury @ 1060/560 for me!
> 
> It's not going to blow your mind though...


Well, like it or not, I now have an XFX R9 Fury on the way that I should receive early next week. (Newegg does not offer refunds, only exchanges.) However, I am sure I will love it for a single-card setup. Yes, the 980 Ti is faster, but not almost-$200 faster, and if I want, I can go with CrossFire Furys for far less than a 980 Ti SLI setup. (Assuming the prices come down in the next 6 months.)

Thank you for the help, I really appreciated it. I did love that R9 390 CrossFire setup but, between the fan noise and the 94C load temps, I returned them. (It is not even summer yet, so they would not have done well.) 4K gaming for me is where it is at. I am upgrading from a reference R9 290X at stock, so it will run cooler as well.


----------



## Jflisk

Quote:


> Originally Posted by *ManofGod1000*
> 
> Well, like it or not, I now have an XFX R9 Fury on the way that I should receive early next week. (Newegg does not offer refunds, only exchanges.) However, I am sure I will love it for a single-card setup. Yes, the 980 Ti is faster, but not almost-$200 faster, and if I want, I can go with CrossFire Furys for far less than a 980 Ti SLI setup. (Assuming the prices come down in the next 6 months.)
> 
> Thank you for the help, I really appreciated it. I did love that R9 390 CrossFire setup but, between the fan noise and the 94C load temps, I returned them. (It is not even summer yet, so they would not have done well.) 4K gaming for me is where it is at. I am upgrading from a reference R9 290X at stock, so it will run cooler as well.


If you're going to do it, I would suggest the Fury X if you can afford it.


----------



## ManofGod1000

Quote:


> Originally Posted by *Jflisk*
> 
> If you're going to do it, I would suggest the Fury X if you can afford it.


I fully agree, but I am already using a Noctua NH-D15 in a Fractal Design Define R3 case. As you can imagine, I have no room for the radiator and fans.


----------



## Jflisk

Quote:


> Originally Posted by *ManofGod1000*
> 
> I fully agree, but I am already using a Noctua NH-D15 in a Fractal Design Define R3 case. As you can imagine, I have no room for the radiator and fans.


It will be just fine, just put the rad behind the last fan and the case - just kidding.


----------



## ManofGod1000

Sheez, I just cannot seem to get it right. I found that the R9 Fury is too long for my case, so for the moment I am back to the 290X until I can figure out what I am going to upgrade to.


----------



## DMatthewStewart

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well, for what it's worth, there's not much of a difference from 120Hz to 144Hz, and DP 1.3 is bringing 120Hz @ 4K. Yet to be seen whether Arctic Islands or Pascal will have DP 1.3, but it's presumed they will. In any event, I am also in the wait-and-see crowd. If dual Fiji XTs = my triple Hawaii XTs, and Greenland is going to double the transistor count, then there is a fair chance a single top-end Arctic Islands GPU can replace the performance of my 3 GPUs. Would be hard to ignore that.


Righto! That's why when I'm monitor shopping I don't care if it's 120 or 144Hz. I put them into the same category.


----------



## Elmy

Quote:


> Originally Posted by *dagget3450*
> 
> I think with your CPU you're going to get more mileage on that 980 over the Fury. That, and it looks like you are big on overclocking, so you will be happy I think. Good luck!
> I would be interested in some benchmarks to compare to my quad Fury X setup. Also, I thought you were getting a Fury X2??? Hmmm


I am also getting 2 Fury X2s as well. Can't say when, of course... but they are coming...


----------



## ManofGod1000

Quote:


> Originally Posted by *Jflisk*
> 
> It will be just fine, just put the rad behind the last fan and the case - just kidding.


Well, I purchased an EVGA 980 Ti FTW since it was the only card that would fit my case (Define R3). The Furys are all too long, and I did not want a Nano, as cool-looking as they are. That, and EVGA is the only company I called that took care of me straight away. Gigabyte cards were too big and MSI had me on hold for too long. (Maximum video card length could be no bigger than 290mm.) Sorry AMD, but this time you were just not in the game for me, and I am an AMD fan.


----------



## Alastair

Quote:


> Originally Posted by *ManofGod1000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> It will be just fine, just put the rad behind the last fan and the case - just kidding.
> 
> 
> Well, I purchased an EVGA 980 Ti FTW since it was the only card that would fit my case (Define R3). The Furys are all too long, and I did not want a Nano, as cool-looking as they are. That, and EVGA is the only company I called that took care of me straight away. Gigabyte cards were too big and MSI had me on hold for too long. (Maximum video card length could be no bigger than 290mm.) Sorry AMD, but this time you were just not in the game for me, and I am an AMD fan.

Wait, what? Too long? What did I just read?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alastair*
> 
> wait what? Too long? What did I just read?


They are pretty long at 12.4"...
Mine fits in an S340 with room to spare though...


----------



## ManofGod1000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> They are pretty long at 12.4"...
> Mine fit in an s340 with room to spare though...


Yep, the longest card I could install would have been about 11.4 inches or so (290mm). Even the Gigabyte 980 Ti at 291mm would not have fit.


----------



## Agent Smith1984

I sure hope the 980 K|NGP|N clocks well, 'cause this Fury is really coming alive...
http://www.3dmark.com/fs/6704250

This is with the XFX BIOS, which 3DMark cannot detect for some reason... the card won't do anywhere near this on the Sapphire Fury BIOS (around 1060/550), but that BIOS allows the card to be detected as a Fury.

This is max clocks on stock voltage. The new owner is putting it under water with the same card in CrossFire, and is definitely going to push voltage. Should make for a monster 4K setup.

I'll miss her in some ways, but this will be a chance to get familiar with NVIDIA and do some crazy overclocking (as much of the fun as anything for me).


----------



## SuperZan

It's nice to have options and new things to try. Between the PCs and laptops in my house atm I've got red, yellow, gray/brown, and purple configurations, and they're all fun. Keep us updated on the Kingpin clocking vs. what you were able to do with the Fury!


----------



## Alastair

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> wait what? Too long? What did I just read?
> 
> 
> 
> They are pretty long at 12.4"...
> Mine fit in an s340 with room to spare though...

Oh, I thought he was talking about the Fury X. Which is short. And costs about the same as a 980 Ti.


----------



## petrvs

Has anyone tried to OC the card using a custom watercooling setup?

I am thinking of connecting the Fury X to the Predator 240 by EKWB and ramping up from there.
Thoughts?


----------



## Evil-Mobo

Quote:


> Originally Posted by *petrvs*
> 
> Has anyone tried to OC the card using a custom watercooling setup?
> 
> I am thinking of connecting the Fury X to the Predator 240 by EKWB and ramping up from there.
> Thoughts?


I will be running two R9 Furys (non-X) in my current build as soon as it's together; I am just waiting on the case from Parvum. Will post my results once everything is set up and OC'ed.


----------



## petrvs

Quote:


> Originally Posted by *Evil-Mobo*
> 
> I will be running two R9 Furys (non-X) in my current build as soon as it's together; I am just waiting on the case from Parvum. Will post my results once everything is set up and OC'ed.


Right, pardon me for asking so explicitly, but are you planning to put those in a custom water loop?

I am really curious to see VRM and core clock scaling with a really aggressive watercooled setup. I believe I read temps were in the mid-30s Celsius with the Predator and block rigged up, so...


----------



## Evil-Mobo

Quote:


> Originally Posted by *petrvs*
> 
> Right, pardon me for asking so explicitly, but are you planning to put those in a custom water loop?
> 
> I am really curious to see VRM and core clock scaling with a really aggressive watercooled setup. I believe I read temps were in the mid-30s Celsius with the Predator and block rigged up, so...


Yes, I apologize, I thought it was implied in my answer; I should have been more specific. I will be running two XFX R9 Fury cards in CrossFire on an X99 platform with a custom loop.


----------



## petrvs

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Yes, I apologize, I thought it was implied in my answer; I should have been more specific. I will be running two XFX R9 Fury cards in CrossFire on an X99 platform with a custom loop.


Great, keep me posted!

BTW, any results on similar setups yet / any guide on how to OC the card now that voltage etc. is unlocked?
I might make a post myself if nothing is there.


----------



## Evil-Mobo

Quote:


> Originally Posted by *petrvs*
> 
> Great keep me posted!
> 
> Btw any results on similar setups yet / any guide on how to OC the card now that voltage etc. is unlocked?
> I might make a post myself if nothing is there


This will be my first experience with AMD GPUs, so I'm in the same boat as you. I just hope I can unlock my XFX cards, which it seems most cannot be...


----------



## petrvs

Quote:


> Originally Posted by *Evil-Mobo*
> 
> This will be my first experience with AMD GPUs, so I'm in the same boat as you. I just hope I can unlock my XFX cards, which it seems most cannot be...


How come?
Could it be possible to use the MSI or Sapphire programs on it, given the cards are all the same, I wonder.


----------



## H4ZE

I want to OC my Fury X a little bit more. It is cool and not loud at all, so time to push a little bit more. But I cannot use the voltage slider in MSI AB, so I tried the TriXX software. But TriXX acts all weird: it OCs by itself to 1100MHz, and when I change it to, let's say, 1110, it automatically clocks back to 1100. This is problem number one I have.

The second problem is with the memory OC. When I put the HBM to 550MHz instead of 500, everything works fine until I start rebooting my system. Then it sometimes boots into Windows and the screen goes blue.

Things I have tried to solve this: updated my BIOS to the newest version, 66, and reinstalled the TriXX software.
I have the Sapphire version of the Fury X.


----------



## Evil-Mobo

Quote:


> Originally Posted by *petrvs*
> 
> How come?
> Could it be possible to use MSI or Sapphire programs on it given the cards are all the same, I wonder


Not 100% sure; like I said, I need to research more.

This is a good start:
http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


----------



## czin125

Anyone try running these settings on the Fury X, 720p --> 1080p?


----------



## petrvs

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Not 100% sure like I said need to research more
> 
> This is a good start:
> http://www.overclock.net/t/1567179/activation-of-cores-in-hawaii-tonga-and-fiji-unlockability-tester-ver-1-6-and-atomtool


Seems that other programs work

http://i.imgur.com/nx5x8VZ.png


----------



## Jflisk

Quote:


> Originally Posted by *petrvs*
> 
> Has anyone tried to OC the card using a custom watercooling setup?
> 
> I am thinking of connecting the Fury X to the Predator 240 by EKWB and ramping up from there.
> Thoughts?


I would stick with the AIO; you are going to need a lot of radiator to keep 2x Fury X cool. I have the loop for it and I have not added them yet. The AIO works better than good for what it is.


----------



## petrvs

Quote:


> Originally Posted by *Jflisk*
> 
> I would stick with the AIO; you are going to need a lot of radiator to keep 2x Fury X cool. I have the loop for it and I have not added them yet. The AIO works better than good for what it is.


The Predator, versus the stock cooler, makes the card run at 31C idle and 41C on load.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *xer0h0ur*
> 
> So all those headaches were because of Freesync? WAT?


For the most part, I'd say "yes". After turning all of the monitors' FreeSync to OFF, I was able to get between 48 and 50 FPS.

Quote:


> Originally Posted by *p4inkill3r*
> 
> I think you should try upgrading to Crimson and attempting to play on just one monitor with Freesync enabled to gauge theoretical performance.
> 
> Attempting to run your setup as it stands seems pie in the sky, as I don't think there are working CFx or SLI profiles for Fallout 4 yet, much less fully fleshed-out multi monitor support.


After updating to Crimson, again my FPS has gone down. I disabled CrossFire in the Fallout 4 profile and I'm still down to 32-41 FPS.
I'm only running the game on a single monitor (the other 2 are for the desktop/any open windows).

I'm just not comprehending how so many others are getting 60+ FPS with just about everything set to Ultra on a Fury X with Crimson drivers. = /

Does anything look wrong here?


----------



## p4inkill3r

Unless you duplicate the methodology as described here, I don't know how else to gauge whether or not something is amiss: http://www.techspot.com/review/1089-fallout-4-benchmarks/
Quote:


> Using Fraps we recorded 120 seconds of gameplay starting at the gas station where you meet your new best friend "Dogmeat". We then walk down the road to the town of "Concord" where we did a lap of the town and ran through a skirmish with a few raiders, which is where the frame rate often fell to its lowest value.
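
For anyone duplicating that methodology, the min/average numbers reduce to simple arithmetic over the recorded frametimes. A small sketch (my assumption: per-frame times in milliseconds, as a Fraps frametimes log records them):

```python
# Compute minimum and average FPS from a list of per-frame render
# times in milliseconds, the way frametime-based benchmarking tools do.
def fps_stats(frame_times_ms):
    """Return (min_fps, avg_fps) for a run of per-frame times in ms."""
    per_frame_fps = [1000.0 / t for t in frame_times_ms]
    # Average FPS is total frames over total elapsed seconds, NOT the
    # mean of per-frame FPS values (that would over-weight fast frames).
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    return min(per_frame_fps), avg_fps

low, avg = fps_stats([16.7, 20.0, 25.0, 16.7])
print(f"min {low:.1f} fps, avg {avg:.1f} fps")
```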


----------



## MalsBrownCoat

I just followed the same route and saw between 29 and 42 FPS; whereas according to that benchmark at 1440p, I should have seen between 53 and 69. Seems like I'm getting about half of what I should be.


----------



## p4inkill3r

Try setting the tessellation mode to AMD Optimized, maybe?


----------



## MalsBrownCoat

No change.

I just don't get it...

/me sighs

And for whatever it's worth, I've read on a few other forums about setting iPresentInterval=0 in the config file, but I shouldn't *have* to be doing that.

A Fury X should smoke these numbers right out of the box, without needing to hack the config file. Especially considering that others seem to be hitting the higher numbers without such modifications.

As for the Intel vs. AMD thing, it seems rather far-fetched to attribute such a _heavy_ FPS drop to the CPU when you have GPUs like this. It can't possibly be that simple... _can_ it?
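
For anyone else chasing this, the tweak being referenced is a one-line config edit that disables the engine's vsync cap (file location and section as commonly reported for Fallout 4; verify against your own install):

```ini
; Documents\My Games\Fallout4\Fallout4Prefs.ini (commonly cited location)
[Display]
iPresentInterval=0   ; 1 = vsync on (default), 0 = frame rate uncapped
```

Note that Fallout 4's physics are tied to frame rate, which is part of why people hesitate to uncap it this way.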


----------



## sugarhell

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I just followed the same route and saw between 29 and 42 FPS; whereas according to that benchmark at 1440p, I should have seen between 53 and 69. Seems like I'm getting about half of what I should be.


The benchmark used an Intel CPU.


----------



## dagget3450

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I just followed the same route and saw between 29 and 42 FPS; whereas according to that benchmark at 1440p, I should have seen between 53 and 69. Seems like I'm getting about half of what I should be.


You might want to verify whether your clocks are throttling on your Fury. I found in many games my GPU clocks dropping all the way down to 350MHz, averaging around 800-900, when using vsync on the Crimson drivers. Use an OSD app to monitor GPU clocks while playing. If you see anything lower than 1050, then that may be part of your issue as well. Then try running it with vsync turned off in the game. You may have to edit an ini file to turn vsync off.


----------



## p4inkill3r

Quote:


> Originally Posted by *dagget3450*
> 
> You might want to verify whether your clocks are throttling on your Fury. I found in many games my GPU clocks dropping all the way down to 350MHz, averaging around 800-900, when using vsync on the Crimson drivers. Use an OSD app to monitor GPU clocks while playing. If you see anything lower than 1050, then that may be part of your issue as well. Then try running it with vsync turned off in the game. You may have to edit an ini file to turn vsync off.


I haven't seen one instance of throttling in the entire time I've owned my Fury X.

Vsync issues, ok, but never any throttling.


----------



## p4inkill3r

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> No change.
> 
> I just don't get it...
> 
> /me sighs
> 
> And for whatever it's worth, I've read on a few other forums about setting iPresentInterval=0 in the config file, but I shouldn't *have* to be doing that.
> 
> A Fury X should smoke these numbers right out of the box, without needing to hack the config file. Especially considering that others seem to be hitting the higher numbers without such modifications.
> 
> As for the Intel vs. AMD thing, it seems rather far-fetched to attribute such a _heavy_ FPS drop to the CPU when you have GPUs like this. It can't possibly be that simple... _can_ it?


Why don't you run a benchmark like Heaven or 3dMark and see if you can duplicate the poor performance you're receiving in FO4?


----------



## Doomedx

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I'm still down to 32-41 FPS.


I've got the same issue as you: Fallout 4 is unplayable on my PC with an R9 Fury, i5 6600K, and a 144Hz Asus MG279Q at 1440p. I get 28-41 fps, what a joke. When I try 1080p I get a constant 90fps with FreeSync up.


----------



## Orgios

In the games I play (CoH2, BF4, etc.) I notice that GPU usage is never over 40-46% (Afterburner OSD) on my XFX Fury Air. Is this normal? I game in 4K (except Star Citizen, at 1440p).


----------



## Noirgheos

All these people having clocking issues... try going back to CCC 15.11.1.


----------



## zdziseq

MSI Afterburner 4.2.0 Available for Download !!


----------



## bkvamme

Changelog:
Quote:


> Revision history for MSI Afterburner v4.2.0:
> 
> • Added AMD Fiji graphics processors family support
> • Hardware abstraction layer architecture has been revamped to allow implementation of voltage control via direct access to GPU on-die voltage controllers (e.g. AMD Fiji SMC) in addition to previously supported external voltage controllers connected to GPU via I2C bus. Please take a note that direct access to AMD SMC from multiple simultaneously running hardware monitoring applications can be unsafe and result in collisions, so similar to I2C access synchronization we introduce global namespace synchronization mutex "Access_ATI_SMC" as SMC access synchronization standard. Other developers are strongly suggested to use it during accessing AMD GPU SMC in order to provide collision free hardware monitoring
> • Added core voltage control for reference design AMD RADEON R9 Fury / Nano series cards with on-die SMC voltage controller
> • Added unofficial overclocking support for PowerPlay7 capable graphics cards (AMD Tonga and newer graphics processors family). Please take a note that unofficial overclocking mode with completely disabled PowerPlay is currently not supported for PowerPlay7 capable hardware
> • Added version detection for AMD Radeon Software Crimson edition. Please take a note that new AMD Radeon Software versioning scheme is not backward compatible so now Catalyst version can be reported improperly if you reinstall older versions of Catalyst drivers on top of AMD Radeon Software Crimson edition without cleaning the registry up. Until the issue is addressed inside AMD Radeon Software Crimson edition installer, MSI Afterburner is providing compatibility switch "LegacyDriverDetection" in the configuration file allowing you to use legacy driver version detection mechanism if you're rolling back to legacy Catalyst drivers after AMD Radeon Software Crimson edition drivers
> • GPU usage monitoring filtering algorithms, aimed to filter GPU usage monitoring artifacts in AMD ADL API on AMD Sea Islands GPU family are now disabled by default. Filtering algorithms can still be enabled by power users via configuration file if necessary
> • Added core, memory and auxiliary PEXVDD voltage control for custom design MSI GTX980Ti Lightning series graphics cards with IR3595A+IR3567B voltage regulators
> • Added memory and VRM temperature monitoring for custom design MSI GTX980Ti Lightning series graphics cards with NCT7511Y thermal sensors
> • Now SDK includes detailed documentation for third party hardware database format, allowing experienced users to add voltage control support for custom design non-MSI graphics cards
> • Temperature monitoring for AMD Family 10h - 16h micro architecture CPUs is no longer experimental. Now thermal monitoring on such CPUs is unlocked by default
> • Slightly altered VRAM usage monitoring implementation for AMD and Intel graphics cards. Now total resident bytes are being displayed as VRAM usage instead of total committed bytes, and allocated blocks are no longer being rounded to 1MB boundary per block when calculating a total value
> • Improved skin engine. Added support for altered USF skins obfuscation scheme used in most recent versions of third party overclocking tools
> • Added Brazilian Portuguese localization
> • RivaTuner Statistics Server has been upgraded to v6.4.1


Download link: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
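
One detail in that changelog worth highlighting for tool developers: SMC access is supposed to be serialized across processes via a global named mutex called "Access_ATI_SMC". The real mechanism is a Windows named mutex; as a rough, portable sketch of the same pattern, here a POSIX file lock stands in for the named mutex (names and lock path are mine):

```python
# Conceptual sketch of the cross-process serialization the changelog
# asks monitoring tools to use: acquire a globally named lock around
# every SMC read/write so two tools never touch the controller at once.
# On Windows this would be CreateMutex("Access_ATI_SMC"); here a POSIX
# file lock stands in so the pattern is runnable anywhere.
import fcntl
import os
import tempfile

LOCK_PATH = os.path.join(tempfile.gettempdir(), "Access_ATI_SMC.lock")

class SmcAccessGuard:
    """Hold an exclusive cross-process lock while touching the
    (here imaginary) SMC, mirroring the Access_ATI_SMC convention."""
    def __enter__(self):
        self._fd = os.open(LOCK_PATH, os.O_CREAT | os.O_RDWR)
        fcntl.flock(self._fd, fcntl.LOCK_EX)  # blocks until exclusive
        return self
    def __exit__(self, *exc):
        fcntl.flock(self._fd, fcntl.LOCK_UN)
        os.close(self._fd)

with SmcAccessGuard():
    pass  # read/write voltage registers only while the lock is held
```

The point of the convention is simply that every tool agrees on the same lock name, so collisions become waits instead of corrupted I2C/SMC transactions.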


----------



## Noirgheos

Quote:


> Originally Posted by *bkvamme*
> 
> Changelog:
> Download link: http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


VOLTAGE CONTROL ON FURY? FINALLY! I'm back to Afterburner.


----------



## Masika

Quote:


> Originally Posted by *Noirgheos*
> 
> VOLTAGE CONTROL ON FURY? FINALLY! I'm back to Afterburner.


Dude, voltage control for Fury/390X has been around for 3-4 weeks...

http://wccftech.com/amd-fury-series-voltage-control-unlocked-sapphire-trixx/


----------



## bkvamme

Quote:


> Originally Posted by *Masika*
> 
> Dude voltage control for fury/390x has been around for 3-4 weeks....
> 
> http://wccftech.com/amd-fury-series-voltage-control-unlocked-sapphire-trixx/


I think he was referring to Afterburner finally implementing voltage control. From the wording, he has been using Trixx.


----------



## Noirgheos

Quote:


> Originally Posted by *bkvamme*
> 
> I think he was referring to Afterburner finally implementing voltage control. From the wording, he has been using Trixx.


Yep, I just don't like TriXX. Good to be back with Afterburner.


----------



## Agent Smith1984

How are folks doing with the Afterburner voltage control versus TriXX?

I was never able to get TriXX to work at all. I will test AB tonight before shipping my Fury off to the new owner.


----------



## Evil-Mobo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How are folks doing with the afterburner voltage control versus the Trixx?
> 
> I was never able to get trixx to work at all. I will test AB tonight before shipping my Fury off to the new owner.


Please let me know what it does, as we both already know how much of a letdown TriXX was lol. If AB works then that's what I will use...


----------



## Scorpion49

Hey guys, just curious if anyone has any experience with the Gigabyte WindForce card? I gave up hope for a nice air-cooled Fury after 5 cards with impossible-to-ignore coil whine, but this one looks to be a 100% custom PCB. I haven't been able to find a shot of it with the cooler off though; can anyone confirm?


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Hey guys, just curious if anyone has any experience with the Gigabyte Windforce card? I gave up hope for a nice air cooled Fury after 5 cards with impossible to ignore coil whine but this one looks to be 100% custom PCB. I haven't been able to find a shot of it with the cooler off though, can anyone confirm it?


Eh, I can't imagine going through five cards just to gamble on a sixth.
What's the definition of insanity, again?


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> Eh, I can't imagine going through five cards just to gamble on a sixth.
> What's the definition of insanity, again?


All of the others were reference style boards though, which is why I was hoping this one would be different.


----------



## Agent Smith1984

The Asus Strix is a custom board, and I hear it's got loud inductors too...


----------



## Scorpion49

Yeah, that's why I never considered that one. I've had noisy Asus cards before, but I don't think I've ever run across a whiny Gigabyte. Nobody seems to have one though.


----------



## 98uk

My Sapphire Fury has almost no coil whine. I only hear it if I put my head in my case.


----------



## battleaxe

Quote:


> Originally Posted by *98uk*
> 
> My Sapphire Fury has almost no coil whine. I only hear it if I put my head in my case


You can never go wrong with shoving your head inside a case.


----------



## waltercaorle

Ciao... I have a Fury Tri-X and the max overclock reached is 1180/570 at +72mV. Raising the voltage further to +100/120mV gives no benefit. With a fairly aggressive fan profile the card sits at 58-60C.
If I install a full-cover waterblock, would I improve anything?


----------



## NBrock

My PowerColor Fury X has almost no coil whine. It's there, but not bad unless I am closing the Valley benchmark and the credits screen is up... that son of a gun has made every card I have ever had coil-whine.


----------



## petrvs

Can an XFX Fury be used with Sapphire TriXX or Asus GPU Tweak?


----------



## Semel

So, anyone know how to remove the voltage limit from the new Afterburner? Unwinder said it is possible but didn't specify how to do it.

Atm voltage is limited to +100mV for Furys.

Cheers.


----------



## Noirgheos

Quote:


> Originally Posted by *waltercaorle*
> 
> Ciao... I have a Fury Tri-X and the max overclock I've reached is 1180/570 at +72mV. Raising the voltage to +100/120mV brings no benefit. With a fairly aggressive fan profile the card sits at 58-60C.
> If I install a full-cover waterblock, will I gain anything?


What kind of performance increase did you notice with that OC?


----------



## sugarhell

Quote:


> Originally Posted by *Semel*
> 
> So, does anyone know how to remove the voltage limit from the new Afterburner? Unwinder said it is possible but didn't specify how.
> 
> Atm voltage is limited to +100mV for Furys.
> 
> Cheers.


Go to the 290X thread and check my guide for the voltages. It should work, because the Fury and the reference 290X share the same voltage controller.


----------



## waltercaorle

Quote:


> Originally Posted by *Noirgheos*
> 
> What kind of performance increase did you notice with that OC?


Well, in game, with FreeSync keeping the fps in a 32-75 range @2560x1080, it matters less. In the most resource-hungry games, such as Crysis 3, it helps the minimum fps.

In terms of numbers it's difficult to quantify; I don't have enough results to calculate...

Anyway:
http://www.3dmark.com/compare/fs/6738861/fs/6536902

Besides, I just like seeing the frequencies go up. That's a game too.


----------



## Noirgheos

Quote:


> Originally Posted by *waltercaorle*
> 
> Well, in game, with FreeSync keeping the fps in a 32-75 range @2560x1080, it matters less. In the most resource-hungry games, such as Crysis 3, it helps the minimum fps.
> 
> In terms of numbers it's difficult to quantify; I don't have enough results to calculate...
> 
> Anyway:
> http://www.3dmark.com/compare/fs/6738861/fs/6536902
> 
> Besides, I just like seeing the frequencies go up. That's a game too.


Damn, that's a massive increase! It must translate well to some games.


----------



## Arizonian

Tested my new *XFX Triple D Fury's* legs out just now, running Crimson drivers on an IPS 1440p 60Hz monitor.



Spoiler: My Rig Pics!








Validation links
http://www.techpowerup.com/gpuz/details.php?id=99vqc
http://valid.x86.fr/08gyxx

4790 4.6 GHz - Fury 1075 MHz Core cough *stock memory* cough


Spoiler: Warning: Spoiler!








*3DMark11* *Performance P15395*
http://www.3dmark.com/3dm11/10637605

*3DMark11* *Extreme X6089*
http://www.3dmark.com/3dm11/10637653

*Unigine 'Valley' Benchmark 1.0 Extreme HD 76.8 Score*


Spoiler: Warning: Spoiler!







My first impressions of this GPU: simple in looks, but at least no clashing colors aesthetically. The LED tach meter on top by the power connectors is a nice touch. Very quiet when running at full load. No coil whine.









Didn't break a sweat with a manual fan curve: 51C @ 51% fan speed. It idles at 31C @ 23% fan speed.

"*Shadows of Mordor*" benchmark test Min 53.3 Max 82.6 *Avg 60*

At 1080 MHz core I got a lower benchmark score, and I haven't even touched voltage. Enough benching though, it's not my thing, and this already beats my 780 Ti in scores, temps and fan noise. Now to start up some games.









Update:

2560x1440 - Fury 1075 MHz Core +24 mV +50% Power limit - 57-58C Temp 61-62% Fan Speed (manual fan curve)

*Crysis 3* Very high No AA Min 41 Max 87 *Avg 64.4*
*Far Cry 4* - Kyrat Ultra Min 88 Max 119 *Avg 103*
*Star Wars* Battlefront Assault (40 player) Ultra Field of View 100% Min 81 Max 124 *Avg 102*

Slight bad news, possibly, for me: the fan closest to the case slots has a sporadic rattle when gaming. If I slightly touch the fan right in the center it completely stops. I'm gaming with the case open; with the case closed I can't hear the fan noise, if it's even still happening. Seems minor.

May have to exchange it. I'm a Premier member with Newegg, so it won't cost me regardless. I've got holiday extended exchange time, so I'll wait till the end of the month and see if this works itself out.

Overall I'm very happy with FURY at my resolution. It's easily crushing games maxed.


----------



## Noirgheos

Quote:


> Originally Posted by *Arizonian*
> 
> Tested my new *XFX Triple D Fury's* legs out just now, running Crimson drivers on an IPS 1440p 60Hz monitor.
> 
> 
> 
> Spoiler: My Rig Pics!
> 
> 
> 
> 
> 
> 
> 
> 
> Validation links
> http://www.techpowerup.com/gpuz/details.php?id=99vqc
> http://valid.x86.fr/08gyxx
> 
> 4790 4.6 GHz - Fury 1075 MHz Core cough *stock memory* cough
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> *3DMark11* *Performance P15395*
> http://www.3dmark.com/3dm11/10637605
> 
> *3DMark11* *Extreme X6089*
> http://www.3dmark.com/3dm11/10637653
> 
> *Unigine 'Valley' Benchmark 1.0 Extreme HD 76.8 Score*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My first impressions of this GPU: simple in looks, but at least no clashing colors aesthetically. The LED tach meter on top by the power connectors is a nice touch. Very quiet when running at full load. No coil whine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Didn't break a sweat with a manual fan curve: 51C @ 51% fan speed. It idles at 31C @ 23% fan speed.
> 
> "*Shadows of Mordor*" benchmark test Min 53.3 Max 82.6 *Avg 60*
> 
> At 1080 MHz core I got a lower benchmark score, and I haven't even touched voltage. Enough benching though, it's not my thing, and this already beats my 780 Ti in scores, temps and fan noise. Now to start up some games.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Update:
> 
> 2560x1440 - Fury 1075 MHz Core +24 mV +50% Power limit - 57-58C Temp 61-62% Fan Speed (manual fan curve)
> 
> *Crysis 3* Very high No AA Min 41 Max 87 *Avg 64.4*
> *Far Cry 4* - Kyrat Ultra Min 88 Max 119 *Avg 103*
> *Star Wars* Battlefront Assault (40 player) Ultra Field of View 100% Min 81 Max 124 *Avg 102*
> 
> Slight bad news, possibly, for me: the fan closest to the case slots has a sporadic rattle when gaming. If I slightly touch the fan right in the center it completely stops. I'm gaming with the case open; with the case closed I can't hear the fan noise, if it's even still happening. Seems minor.
> 
> May have to exchange it. I'm a Premier member with Newegg, so it won't cost me regardless. I've got holiday extended exchange time, so I'll wait till the end of the month and see if this works itself out.
> 
> Overall I'm very happy with FURY at my resolution. It's easily crushing games maxed.


Hold on those are 1440p framerates? Jesus. I think I should grab a 1440p monitor for my Fury.


----------



## p4inkill3r

Fury shines at 1440.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> Fury shines at 1440.


I don't think I should worry about any game for a few years at 1080p then.


----------



## Evil-Mobo

Wow, and I was concerned about how CrossFired Furys would run lol. So for sure I'm going to need and want a 4K monitor.


----------



## battleaxe

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Wow, and I was concerned about how CrossFired Furys would run lol. So for sure I'm going to need and want a 4K monitor.


CrossFire (at least at 4K) is where the Fury X really pulls ahead of the 980 Ti. When the drivers become more mature I'm sure the gap will widen too.


----------



## Evil-Mobo

Quote:


> Originally Posted by *battleaxe*
> 
> CrossFire (at least at 4K) is where the Fury X really pulls ahead of the 980 Ti. When the drivers become more mature I'm sure the gap will widen too.


I digress, my Fury's are only the measly non X models......


----------



## battleaxe

Quote:


> Originally Posted by *Evil-Mobo*
> 
> I digress, my Fury's are only the measly non X models......


Still... relevant. Fury is barely behind the X model


----------



## Noirgheos

Quote:


> Originally Posted by *battleaxe*
> 
> Still... relevant. Fury is barely behind the X model


With the unlocked stream processors it's basically a Fury X, but air cooled.


----------



## Evil-Mobo

Quote:


> Originally Posted by *Noirgheos*
> 
> With the unlocked stream processors it's basically a Fury X, but air cooled.


Will be under water


----------



## Noirgheos

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Will be under water


Unlocking them didn't even increase temps for me.


----------



## Arizonian

Sorry for asking if it's been answered.

Has anyone figured out how big an overclock (in MHz) a Fury needs to catch up with a Fury X?

I'm assuming an 8-10% OC on the core would do it.
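For a rough sense of the gap, here's a back-of-envelope sketch in Python. It assumes performance scales linearly with shader throughput (stream processors x core clock), which real games only approximate, so treat the result as an upper bound:

```python
# Back-of-envelope only: assumes performance scales linearly with
# shader throughput (stream processors x core clock), which real
# games only approximate.
FURY_SP, FURY_CLOCK = 3584, 1000      # R9 Fury stock
FURYX_SP, FURYX_CLOCK = 4096, 1050    # R9 Fury X stock

target = FURYX_SP * FURYX_CLOCK               # Fury X shader throughput
needed_clock = target / FURY_SP               # clock a Fury needs to match it
oc_percent = (needed_clock / FURY_CLOCK - 1) * 100

print(f"~{needed_clock:.0f} MHz needed ({oc_percent:.0f}% OC)")
# -> ~1200 MHz needed (20% OC)
```

By raw shader math that's ~20%, but games don't scale linearly with SP count, which is why reviews show a much smaller real-world gap and why an 8-10% OC guess is plausible in practice.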


----------



## Noirgheos

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry for asking if it's been answered.
> 
> Has anyone figured out how big an overclock (in MHz) a Fury needs to catch up with a Fury X?
> 
> I'm assuming an 8-10% OC on the core would do it.


None, unlocking the CUs using a BIOS edit puts it up there. An 8-10% OC is overkill. That results in around 2000 points more in Firestrike, surpassing any stock Fury X.


----------



## battleaxe

Quote:


> Originally Posted by *Noirgheos*
> 
> None, unlocking the CUs using a BIOS edit puts it up there. An 8-10% OC is overkill. That results in around 2000 points more in Firestrike, surpassing any stock Fury X.


I doubt all Furys can unlock to a Fury X? Or can they? Maybe I don't understand. If so, why would anyone buy an X model?


----------



## Noirgheos

Quote:


> Originally Posted by *battleaxe*
> 
> I doubt all Furys can unlock to a Fury X? Or can they? Maybe I don't understand. If so, why would anyone buy an X model?


Not all of them, mine was able to though.


----------



## p4inkill3r

Quote:


> Originally Posted by *battleaxe*
> 
> I doubt all Furys can unlock to a Fury X? Or can they? Maybe I don't understand. If so, why would anyone buy an X model?


Not having to mess with BIOSs, built in watercooling, higher binning, sweet design, etc.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> Not having to mess with BIOSs, built in watercooling, higher binning, sweet design, etc.


True, the X does look pretty sexy; it doesn't justify $100 more though.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> True, the X does look pretty sexy; it doesn't justify $100 more though.


That's a very subjective thing to put a price on.
Regardless, there are reasons to go Fury X, they just aren't everyone's.


----------



## JonDuma

How to OC my R9 Fury, not crazy OC but decent and stable OC only?
what is the recommended for:

GPU Clock
GPU Voltage
Memory Clock

Thanks.


----------



## rocket47

Quote:


> Originally Posted by *JonDuma*
> 
> How to OC my R9 Fury, not crazy OC but decent and stable OC only?
> what is the recommended for:
> 
> GPU Clock
> GPU Voltage
> Memory Clock
> 
> Thanks.


My Fury X does 1150/600 with +50% power limit and +36mV... but every card is different, I'm sure you know that already. Run some Heaven/Valley/3DMark and you will see what is stable for you.


----------



## Noirgheos

Hey guys, I have the exact same issue that this guy has in the video, on my Sapphire R9 Fury.




I'm on 15.11.1 CCC. Restarting fixes it, or just changing the resolution. I'm afraid it may happen again though. Anyone know what it is?


----------



## fjordiales

Quote:


> Originally Posted by *Noirgheos*
> 
> Hey guys, I have the exact same issue that this guy has in the video, on my Sapphire R9 Fury.
> 
> 
> 
> 
> I'm on 15.11.1 CCC. Restarting fixes it, or just changing the resolution. I'm afraid it may happen again though. Anyone know what it is?


Mine does that but even worse: half the screen goes green. Then I stumbled upon a suggestion regarding CrossFire. I have 3 Strix cards in TriFire and it only happens when using 2 of the 3 cards.

When all 3 are in CrossFire it never happens. Changing resolution or disabling/enabling CrossFire resets it for me.


----------



## Noirgheos

Quote:


> Originally Posted by *fjordiales*
> 
> Mine does that but even worse: half the screen goes green. Then I stumbled upon a suggestion regarding CrossFire. I have 3 Strix cards in TriFire and it only happens when using 2 of the 3 cards.
> 
> When all 3 are in CrossFire it never happens. Changing resolution or disabling/enabling CrossFire resets it for me.


When is the last time it happened? Were you using 15.11.1 CCC when it happened? Or Crimson?


----------



## fjordiales

Quote:


> Originally Posted by *Noirgheos*
> 
> When is the last time it happened? Were you using 15.11.1 CCC when it happened? Or Crimson?


This past weekend. Just in idle. Kinda annoying since I was browsing YouTube.

It happened on CCC and crimson but only when xfire 2/3 instead of 3/3.


----------



## Noirgheos

Quote:


> Originally Posted by *fjordiales*
> 
> This past weekend. Just in idle. Kinda annoying since I was browsing YouTube.
> 
> It happened on CCC and crimson but only when xfire 2/3 instead of 3/3.


Yep only ever happens in a browser for me. Try disabling hardware acceleration.


----------



## ht_addict

Same thing was happening with mine in CF. Thought maybe it was my OLED. Has to be a software issue.


----------



## fjordiales

Quote:


> Originally Posted by *ht_addict*
> 
> Same thing was happening with mine in CF. Thought maybe it was my OLED. Has to be a software issue.


2x or 3x Crossfire?


----------



## JonDuma

Quote:


> Originally Posted by *rocket47*
> 
> my furyX 1150/600 /+50powerlimit/ +36mv...but every card is different, im sure u know that already... Run some heaven/valley/3dmark and you will see what is stable for you.


Thanks,


----------



## Noirgheos

Guys, is it normal that I idle at 40C with the Sapphire Fury? Stock fan curve in the H440. When I tested it without the side panel it was 30C idle. Is it normal for it to have increased so much?


----------



## 98uk

Quote:


> Originally Posted by *Noirgheos*
> 
> Guys, is it normal that I idle at 40C with the Sapphire Fury? Stock fan curve in the H440. When I tested it without the side panel it was 30C idle. Is it normal for it to have increased so much?


Sounds like you need to optimise airflow in your case...

But, temps don't really mean anything to compare with other people when you don't know ambients


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> Guys, is it normal that I idle at 40C with the Sapphire Fury? Stock fan curve in the H440. When I tested it without the side panel it was 30C idle. Is it normal for it to have increased so much?


It is known that the H440 does not have the best airflow in its stock configuration.


----------



## Noirgheos

Quote:


> Originally Posted by *98uk*
> 
> Sounds like you need to optimise airflow in your case...
> 
> But, temps don't really mean anything to compare with other people when you don't know ambients


I mean 40C isn't dangerous, I know that. I haven't tested load temps yet, I'll have to check.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> It is known that the H440 does not have the best airflow in its stock configuration.


But would that make a 10C difference from open panel to closed panel?


----------



## p4inkill3r

Access to ambient air vs. stale, heated air.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> Access to ambient air vs. stale, heated air.


Guess so. Eh, my old XFX R9 280 idled a little higher, it capped out at 64C. Anything below 70C is fine I guess.


----------



## 98uk

Quote:


> Originally Posted by *Noirgheos*
> 
> But would that make a 10C difference from open panel to closed panel?


If it's just feeding off hot air, sure.

That's why so many cases had side-panel fans.


----------



## Noirgheos

http://imgur.com/Lw9e16K


Do you guys notice the little bend from the left end of the PCB? Is this OK?


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> 
> 
> http://imgur.com/Lw9e16K
> 
> 
> Do you guys notice the little bend from the left end of the PCB? Is this OK?


I can't see anything in that picture, maybe a close up of the area you're concerned with?


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> I can't see anything in that picture, maybe a close up of the area you're concerned with?


Look at the left end of the PCB. Draw a straight line along it and a few cm down you'll notice it straightens out, creating a bend.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> Look at the left end of the PCB. Draw a straight line along it and a few cm down you'll notice it straightens out, creating a bend.



This looks like normal bending to me.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> 
> This looks like normal bending to me.


The PCB doesn't look a tiny bit warped to you? Either way, some fishing line to hold it up should be ok?


----------



## ENTERPRISE

Hmmm, I am getting bored of waiting for the X2. I just want to know whether it is being released in December or not, or if it is just a paper launch; otherwise it's 2x standard Fury X cards for me. I dislike this waiting-on-news game lol.


----------



## PontiacGTX

Quote:


> Originally Posted by *ENTERPRISE*
> 
> Hmmm, I am getting bored of waiting for the X2. I just want to know whether it is being released in December or not, or if it is just a paper launch; otherwise it's 2x standard Fury X cards for me. I dislike this waiting-on-news game lol.


The rumours say a paper launch in December.

But waiting isn't bad: the price of the Fury X dropped from $650 to $560.

OcUK has the Fury X at £470:
https://www.overclockers.co.uk/powercolor-radeon-fury-x-4096mb-hbm-pci-express-graphics-card-ax-r9-fury-x-4gbd5-3dh-gx-182-pc.html

But really, if you are playing at 1440/1080, an R9 290/X is still quite a good placeholder until the 2016 cards come.

£150:
http://www.ebay.co.uk/itm/MSI-Radeon-r9-290-gaming-4gb-ddr5-OC-EDITION-/111843366168?hash=item1a0a625518:g:jhsAAOSwp5JWXz2d

In auction:
http://www.ebay.co.uk/itm/Asus-AMD-R9-290-Graphics-Card-4GB-VRAM-/301817607807?hash=item4645bb367f:g:bNQAAOSwbdpWZCif


----------



## ENTERPRISE

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ENTERPRISE*
> 
> Hmmm, I am getting bored of waiting for the X2. I just want to know whether it is being released in December or not, or if it is just a paper launch; otherwise it's 2x standard Fury X cards for me. I dislike this waiting-on-news game lol.
> 
> 
> 
> The rumours say a paper launch in December.
> 
> But waiting isn't bad: the price of the Fury X dropped from $650 to $560.
> 
> OcUK has the Fury X at £470:
> https://www.overclockers.co.uk/powercolor-radeon-fury-x-4096mb-hbm-pci-express-graphics-card-ax-r9-fury-x-4gbd5-3dh-gx-182-pc.html
> 
> But really, if you are playing at 1440/1080, an R9 290/X is still quite a good placeholder until the 2016 cards come.
> 
> £150:
> http://www.ebay.co.uk/itm/MSI-Radeon-r9-290-gaming-4gb-ddr5-OC-EDITION-/111843366168?hash=item1a0a625518:g:jhsAAOSwp5JWXz2d
> 
> In auction:
> http://www.ebay.co.uk/itm/Asus-AMD-R9-290-Graphics-Card-4GB-VRAM-/301817607807?hash=item4645bb367f:g:bNQAAOSwbdpWZCif

Yeah, I am hoping it is not just a paper launch... but it likely is. Nice find on the cards. I would usually go for a placeholder, as I have not gamed in a while since the sale of my 295X2, but I decided just to wait it out this time. Nice find on the cheaper Fury X though, may pick one of them up instead!


----------



## MerkageTurk

I returned my GPU after having the same issues below and RSOD.
Quote:


> Hey guys, I have the exact same issue that this guy has in the video, on my Sapphire R9 Fury.
> 
> 
> 
> 
> I'm on 15.11.1 CCC. Restarting fixes it, or just changing the resolution. I'm afraid it may happen again though. Anyone know what it is?


----------



## Semel

Quote:


> Originally Posted by *Noirgheos*
> 
> Hey guys, I have the exact same issue that this guy has in the video, on my Sapphire R9 Fury.
> 
> 
> 
> 
> I'm on 15.11.1 CCC. Restarting fixes it, or just changing the resolution. I'm afraid it may happen again though. Anyone know what it is?


Try changing your resolution and reverting it back, or unplugging and replugging the monitor cable from your card. One of these methods should help, and they are certainly preferable to rebooting your PC.

It's not a card issue, it's a driver/BIOS issue. It happens only under low load, mostly in a browser, and PowerPlay is probably to blame: it may not adjust voltage and clocks properly, or fast enough, when your load suddenly increases (Flash in a browser, etc.).


----------



## Arizonian

Ah well, boxed up my XFX TD Fury and expect to have an exchange from Newegg around December 21st. Since I may sell it someday, I wouldn't want that fan issue, even if minor, to be a deterrent, and I was worried it could get worse.

Only had it in my hands for two days but I was extremely satisfied on the performance, temperatures and acoustics. Can't wait to have it back.









Pulled out my back up XFX DD 370 4GB while I wait.


----------



## xkm1948

I am having some trouble using TriXX to overclock my Fury X. After I set the GPU clock to 1150 and GPU voltage to +72mV, the core clock goes back to 1050 by itself within minutes. Is this only me???


----------



## xer0h0ur

Make sure Overdrive/Crimson Software isn't conflicting with your 3rd party overclocking software.


----------



## Nafu

I guess it's a software problem. Try clearing things out and reinstalling it, or just go for an alternative.

BTW, the wait for the Fury X2 is getting long. Any specific date for its release?


----------



## xSneak

How much of a factor does temperature play in stability on these cards? I went from trixx to AB, but I used the stock fan profile and it was giving DX errors at the same OC settings. The only difference would be higher temps from the slower fan speed.


----------



## Alastair

Quote:


> Originally Posted by *xSneak*
> 
> How much of a factor does temperature play in stability on these cards? I went from trixx to AB, but I used the stock fan profile and it was giving DX errors at the same OC settings. The only difference would be higher temps from the slower fan speed.


Temperatures seem to be very important, especially when getting mileage out of the HBM.

How much voltage does AB allow you to add, guys? Does it allow a fair amount, or should you stay with TriXX?
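Since fan profiles and temps keep coming up: the manual fan curves in tools like TriXX or Afterburner boil down to linear interpolation between user-set (temperature, duty) points. A minimal sketch of the idea; the curve points below are invented for illustration, not anyone's actual profile:

```python
# Linear interpolation between user-set fan-curve points.
# The points below are invented for illustration only.
CURVE = [(30, 23), (50, 51), (65, 75), (75, 100)]  # (temp C, fan %)

def fan_speed(temp_c: float) -> float:
    """Return the fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding curve points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])

print(f"{fan_speed(58):.1f}")  # -> 63.8
```

A more aggressive profile just shifts those points left/up, trading noise for the lower load temps that seem to help HBM stability.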


----------



## petrvs

You want to use GPU Tweak II and flash the BIOS to uncork voltage control up to +2V:

http://hwbot.org/newsflash/2989_comprehensive_asus_r9_fury_strix_overclocking_guide_by_xtreme_addict_14501000_under_ln2_(_unlock_to_fury_x)

Get at least a waterblock if you plan to do this.

Check here: http://www.guru3d.com/articles-pages/ek-predator-360-aio-cpu-gpu-liquid-cooling-review,1.html


----------



## zdziseq

I think you don't have to flash anything. MSI AB is always unlocked... with a little trick.

Now that AB has support for Fury, it's only a matter of how much you want to have. I heard that +100mV is the default, so increasing this value to +200mV shouldn't be a problem.

(Example on an older card with direct voltage; Fury uses an offset.)


----------



## petrvs

Quote:


> Originally Posted by *zdziseq*
> 
> I think you don't have to flash anything. MSI AB is always unlocked... with a little trick.
> 
> Now that AB has support for Fury, it's only a matter of how much you want to have. I heard that +100mV is the default, so increasing this value to +200mV shouldn't be a problem.
> 
> (Example on an older card with direct voltage; Fury uses an offset.)


Sorry, but at the moment the voltage is locked to a max of +72mV on Fiji cards, so you have to flash if you want +2000mV (2V) of headroom.


----------



## zdziseq

Quote:


> Originally Posted by *petrvs*
> 
> Sorry, but at the moment the voltage is locked to a max of +72mV on Fiji cards, so you have to flash if you want +2000mV (2V) of headroom.


... locked by a program like AB, or by the BIOS?


----------



## petrvs

Quote:


> Originally Posted by *zdziseq*
> 
> ... locked by a program like AB, or by the BIOS?


I am not sure, to be honest. All I have seen is that AB and TriXX have voltage control, but only up to that level. If there are other tricks to open up voltage via software without flashing, I would like to know as well, since flashing makes you lose temperature monitoring.


----------



## Semel

Quote:


> Originally Posted by *petrvs*
> 
> Sorry, but at the moment the voltage is locked to a max of +72mV on Fiji cards




https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/

You can remove the voltage limit in AB as well, but I don't know how.

PS: BTW, AB officially has +100 (96mV) available.
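One possible reason a "+100" request applies as 96mV: voltage controllers step the VID in fixed increments, so a software offset gets rounded to a whole number of steps. A sketch of that quantization; the step sizes are illustrative assumptions, not confirmed values for Fiji:

```python
# Offset quantization sketch. Step sizes below are assumptions for
# illustration: SVI2 VIDs commonly step in 6.25 mV, while a 6 mV step
# would explain a "+100" request being applied as +96.
def quantize_offset(requested_mv: float, step_mv: float) -> float:
    """Round a requested voltage offset down to a whole number of VID steps."""
    steps = int(requested_mv // step_mv)  # whole steps, rounded down
    return steps * step_mv

print(quantize_offset(100, 6))     # -> 96
print(quantize_offset(100, 6.25))  # -> 100.0
```

So whatever number the slider shows, the controller only ever applies a multiple of its step size.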


----------



## Noirgheos

Quote:


> Originally Posted by *Alastair*
> 
> temperatures seem to be very important especially when getting milage out of the HBM.
> 
> How much voltage does Ab allow you to add guys? Stay with trixx. Or does Ab allow you to add a fair amount of voltage?


I got +125mV on MSI AB. Stuck with +60mV though; it was enough for 1180/540.


----------



## Kana-Maru

Ok, what is up with these Crimson drivers [Beta 15.11.1] when overclocking? I get more out of my GPU overclock using CCC 15.7.1, according to Firestrike results. Yes, at stock the Crimson drivers are quicker according to Fire Strike, but when I overclock I'm seeing less performance overall. After a cold boot yesterday I decided to run some tests while the room was very cold.

Here are some quick results with old data and new data:

Rig overclocked - CPU @ 4.8Ghz
*Fury X @ 1170Mhz + HBM stock*

*Catalyst 15.7.1*
Overclocked Performance Test: *15,811* - 8-27-2015
Overclocked Performance Test: *15,823* - 12-9-2015

*Crimson [Beta 15.11.1]*
The best I could get was:
Overclocked Performance Test: *15,777* - 12-9-2015

However, at *stock* settings the Crimson Drivers are boss:

Rig overclocked - CPU @ 4.8Ghz
*Fury X @ stock + HBM stock*

*Catalyst 15.7.1* - Original Results on my blog
Stock Performance Test: *15,073* - 8-16-15

*Catalyst 15.9.1 Beta*
Stock Performance Test: *15,111* - 10-16-2015

*Crimson [15.11]*
Stock Performance Test: *15,315* - 11-24-2015

You can clearly see AMD drivers getting better and better at stock settings. The stock performance increase shows when running FS Extreme and Ultra benchmarks. However, I can't say the same when I overclock. Yeah the overclock difference is minor, but I didn't expect to see a decrease in performance. Given that the latest drivers are needed for newer games I've decided to use the Crimson Drivers since I don't have to OC to get great performance. I just hope I can get an increase in performance when OC'ing in the future.

Overall I still love the Fury X. Quiet and the temps are always great. Pretty much all of my games are usually under 45c.

Has anyone noticed anything similar above?
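To put those Fire Strike numbers in proportion, here's a quick percentage-change check (scores copied from the figures above; the math is just (new - old) / old):

```python
# Percentage change of each stock result vs. the Catalyst 15.7.1 baseline,
# plus the overclocked regression, using the scores quoted above.
baseline = 15073                                   # Catalyst 15.7.1, stock
stock = {"Catalyst 15.9.1": 15111, "Crimson 15.11": 15315}
for driver, score in stock.items():
    print(f"{driver}: {100 * (score - baseline) / baseline:+.2f}%")

oc_1571, oc_crimson = 15823, 15777                 # overclocked runs, 12-9-2015
print(f"Crimson OC vs 15.7.1 OC: {100 * (oc_crimson - oc_1571) / oc_1571:+.2f}%")
```

That works out to roughly +0.25% and +1.61% at stock, against an OC regression of about -0.29%: real, but within run-to-run noise for Fire Strike.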


----------



## Noirgheos

Quote:


> Originally Posted by *Kana-Maru*
> 
> Ok, what is up with these Crimson drivers [Beta 15.11.1] when overclocking? I get more out of my GPU overclock using CCC 15.7.1, according to Firestrike results. Yes, at stock the Crimson drivers are quicker according to Fire Strike, but when I overclock I'm seeing less performance overall. After a cold boot yesterday I decided to run some tests while the room was very cold.
> 
> Here are some quick results with old data and new data:
> 
> Rig overclocked - CPU @ 4.8Ghz
> *Fury X @ 1170Mhz + HBM stock*
> 
> *Catalyst 15.7.1*
> Overclocked Performance Test: *15,811* - 8-27-2015
> Overclocked Performance Test: *15,823* - 12-9-2015
> 
> *Crimson [Beta 15.11.1]*
> The best I could get was:
> Overclocked Performance Test: *15,777* - 12-9-2015
> 
> However, at *stock* settings the Crimson Drivers are boss:
> 
> Rig overclocked - CPU @ 4.8Ghz
> *Fury X @ stock + HBM stock*
> 
> *Catalyst 15.7.1* - Original Results on my blog
> Stock Performance Test: *15,073* - 8-16-15
> 
> *Catalyst 15.9.1 Beta*
> Stock Performance Test: *15,111* - 10-16-2015
> 
> *Crimson [15.11]*
> Stock Performance Test: *15,315* - 11-24-2015
> 
> You can clearly see AMD drivers getting better and better at stock settings. The stock performance increase shows when running FS Extreme and Ultra benchmarks. However, I can't say the same when I overclock. Yeah the overclock difference is minor, but I didn't expect to see a decrease in performance. Given that the latest drivers are needed for newer games I've decided to use the Crimson Drivers since I don't have to OC to get great performance. I just hope I can get an increase in performance when OC'ing in the future.
> 
> Overall I still love the Fury X. Quiet and the temps are always great. Pretty much all of my games are usually under 45c.
> 
> Has anyone noticed anything similar above?


For temps? My Sapphire R9 Fury non-X stays at 52C maximum with a custom fan curve, and it's really quiet as well.

Performance? On Catalyst 15.11.1 it's great. I may upgrade to Crimson soon and see how much I can get out of an OC on it.


----------



## Agent Smith1984

How the hell did my XFX Fury Pro at 1080/590 pull off this graphics score?
http://www.3dmark.com/fs/6704250


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How the hell did my XFX Fury Pro at 1080/590 pull off this graphics score?
> http://www.3dmark.com/fs/6704250


This is normal, maybe 200 points slower than it should be.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> This is normal, maybe 200 points slower than it should be.


Nah, it's not lower than it should be... I found several like this. Mind you, I am talking graphics score only.









http://www.3dmark.com/fs/5933848


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How the hell did my XFX Fury Pro at 1080/590 pull off this graphics score?
> http://www.3dmark.com/fs/6704250


Not bad at all...


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nah, it's not lower than it should be... I found several like this. Mind you, I am talking graphics score only.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/5933848


It is. A Fury Pro at 1080 should score close to 17k GS.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> It is. A Fury Pro at 1080 should score close to 17k GS.


Show me where you are getting that information from


----------



## Agent Smith1984

So take a look at these three results....
http://www.3dmark.com/compare/fs/5933848/fs/6704250/fs/6748741

The two on the right are mine from my Fury, and from my 980 KPE @ 1510/7812

The result on the left is just a random, similarly clocked result (except 30MHz less on the HBM).

If you look at the FPS, it looks like the battle between all these cards is decided in GPU test 2; the first test shows almost identical results across all of the GPUs...









I am going to look up what exactly each individual test is testing for.


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So take a look at these three results....
> http://www.3dmark.com/compare/fs/5933848/fs/6704250/fs/6748741
> 
> The two on the right are mine from my Fury, and from my 980 KPE @ 1510/7812
> 
> The result on the left is just a random, similarly clocked result (except 30MHz less on the HBM).
> 
> If you look at the FPS, it looks like the battle between all these cards is decided in GPU test 2; the first test shows almost identical results across all of the GPUs...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am going to look up what exactly each individual test is testing for.


Just press the Fire Strike tab and it will fold down; there it shows what the tests do.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> Just press the firestrike tab and it will fold down and there it shows what the tests do.


Yeah, I'm not on my rig right now, so no 3dmark (at work).

I have been reading through the technical guide though....


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I'm not on my rig right now, so no 3dmark (at work).
> 
> I have been reading through the technical guide though....


Ahh okay, prepare for a wall of text. I found this:


Spoiler: Warning: Spoiler!



Graphics test 1

3DMark Fire Strike Graphics test 1 focuses on geometry and illumination. Particles are drawn at half resolution and dynamic particle illumination is disabled.

There are 100 shadow casting spot lights and 140 non-shadow casting point lights in the scene. On average, 3.9 million vertices containing 500,000 input patches for tessellation are processed per frame resulting in 5.1 million triangles being rasterized either to the screen or to the shadow maps.

Compute shaders are invoked 1.5 million times per frame for particle simulations and post processing. On average, 80 million pixels are processed per frame, which is lower than in Graphics test 2 as there is no depth of field effect.

Graphics test 2

3DMark Fire Strike Graphics test 2 focuses on particles and GPU simulations. Particles are drawn at full resolution and dynamic particle illumination is enabled.

There are two smoke fields simulated on GPU. Six shadow casting spot lights and 65 non-shadow casting point lights are present. On average, 2.6 million vertices containing 240,000 input patches for tessellation are processed and 1.4 million primitives are generated with geometry shaders. That results in 5.8 million triangles being rasterized per frame on average.Compute shaders are invoked 8.1 million times per frame for particle and fluid simulations and for post processing steps. On average, 170 million pixels are processed per frame. Post processing includes a depth of field effect.

Physics test

3DMark Fire Strike Physics test benchmarks the hardware's ability to run gameplay physics simulations on the CPU. The GPU load is kept as low as possible to ensure that only the CPU is stressed. The Bullet Open Source Physics Library is used as the physics library for the test.

The test has 32 simulated worlds. One thread per available CPU core is used to run simulations. All physics are computed on CPU with soft body vertex data updated to GPU each frame.

Combined test

3DMark Fire Strike Combined test stresses both the GPU and CPU simultaneously. The GPU load combines elements from Graphics test 1 and 2 using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and depth of field.

The CPU load comes from the rigid body physics of the breaking statues in the background. There are 32 simulation worlds running in separate threads each containing one statue decomposing into 113 parts. Additionally there are 16 invisible rigid bodies in each world except the one closest to camera to push the decomposed elements apart. The simulations run on one thread per available CPU core.
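That "one thread per available CPU core" scheduling is easy to picture in code. A minimal Python sketch of the pattern; the toy gravity integration stands in for Bullet, and all function names here are made up for illustration:

```python
import os
import threading

def simulate_world(world_id, steps, results):
    """Toy stand-in for one physics 'world': integrates a falling body
    so each thread has independent work, like the 32 simulated worlds."""
    y, v, dt = 100.0, 0.0, 1.0 / 60.0
    for _ in range(steps):
        v -= 9.81 * dt   # gravity
        y += v * dt
    results[world_id] = y

def run_physics_test(num_worlds=32, steps=600):
    """Spawn one worker thread per available CPU core and deal the
    simulation worlds out to them round-robin."""
    cores = os.cpu_count() or 1
    results = [0.0] * num_worlds
    threads = []
    for core in range(cores):
        worlds = range(core, num_worlds, cores)
        t = threading.Thread(
            target=lambda ws=worlds: [simulate_world(w, steps, results) for w in ws]
        )
        threads.append(t)
        t.start()
    for t in threads:
        t.join()
    return results

print(len(run_physics_test()))  # 32 simulated worlds
```

So on a quad core you get 4 threads each owning 8 worlds, and on an octa core 8 threads with 4 worlds apiece, which is why the Physics score scales with core count.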


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> Ahh okay, prepare for a wall of text i found this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Graphics test 1
> 
> 3DMark Fire Strike Graphics test 1 focuses on geometry and illumination. Particles are drawn at half resolution and dynamic particle illumination is disabled.
> 
> There are 100 shadow casting spot lights and 140 non-shadow casting point lights in the scene. On average, 3.9 million vertices containing 500,000 input patches for tessellation are processed per frame resulting in 5.1 million triangles being rasterized either to the screen or to the shadow maps.
> 
> Compute shaders are invoked 1.5 million times per frame for particle simulations and post processing. On average, 80 million pixels are processed per frame, which is lower than in Graphics test 2 as there is no depth of field effect.
> 
> Graphics test 2
> 
> 3DMark Fire Strike Graphics test 2 focuses on particles and GPU simulations. Particles are drawn at full resolution and dynamic particle illumination is enabled.
> 
> There are two smoke fields simulated on GPU. Six shadow casting spot lights and 65 non-shadow casting point lights are present. On average, 2.6 million vertices containing 240,000 input patches for tessellation are processed and 1.4 million primitives are generated with geometry shaders. That results in 5.8 million triangles being rasterized per frame on average.
> 
> Compute shaders are invoked 8.1 million times per frame for particle and fluid simulations and for post processing steps. On average, 170 million pixels are processed per frame. Post processing includes a depth of field effect.
> 
> Physics test
> 
> 3DMark Fire Strike Physics test benchmarks the hardware's ability to run gameplay physics simulations on the CPU. The GPU load is kept as low as possible to ensure that only the CPU is stressed. The Bullet Open Source Physics Library is used as the physics library for the test.
> 
> The test has 32 simulated worlds. One thread per available CPU core is used to run simulations. All physics are computed on CPU with soft body vertex data updated to GPU each frame.
> 
> Combined test
> 
> 3DMark Fire Strike Combined test stresses both the GPU and CPU simultaneously. The GPU load combines elements from Graphics test 1 and 2 using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and depth of field.
> 
> The CPU load comes from the rigid body physics of the breaking statues in the background. There are 32 simulation worlds running in separate threads each containing one statue decomposing into 113 parts. Additionally there are 16 invisible rigid bodies in each world except the one closest to camera to push the decomposed elements apart. *The simulations run on one thread per available CPU core.*


What really caught my eye was this load of bullcrap.


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What really caught my eye was this load of bullcrap.


Yeah. The simulations can't use 2 threads per CPU core, like HT.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah. The simulations cant use 2 thread per cpu core. Like Ht


I am almost positive they did that intentionally, because 3DMark 11 states nothing about multithreading in its technical information, yet it has no problem utilizing AMD cores....


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am almost positive they did that intentionally, because 3dmark 11 states nothing about multi threading in it's technical information, yet it has no problem utilizing AMD cores....


It uses one thread PER CPU core. It doesn't say they use only one CPU core.


----------



## fat4l

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am almost positive they did that intentionally, because 3dmark 11 states nothing about multi threading in it's technical information, yet it has no problem utilizing AMD cores....


Soooo how is your card + AB doing ? No more freezing ?


----------



## Agent Smith1984

Quote:


> Originally Posted by *fat4l*
> 
> Soooo how is your card + AB doing ? No more freezing ?


You'll have to check with Evil-Mobo, it's his now


----------



## waltercaorle

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How the hell did my XFX Fury Pro at 1080/590 pull off this graphics score?
> http://www.3dmark.com/fs/6704250


Be careful: if the Fury is not stable, results can be wrong.

@1182/573 - GS 18119
@1180/570 (stable) - GS 17060

http://www.3dmark.com/compare/fs/6530255/fs/6530169


----------



## Agent Smith1984

Quote:


> Originally Posted by *waltercaorle*
> 
> be careful, if Fury is not stable results can be wrong
> 
> @1182/573 - GS 18119
> @1180/570 (stable) - GS 17060
> 
> http://www.3dmark.com/compare/fs/6530255/fs/6530169


Interesting..... and again, the difference is "felt" in the second GPU test, as with all the other scoring comparisons I've seen.

That is a really nice score BTW.... I never got around to overvolting my Fury.









I am getting lower-than-expected numbers from my 980 now. Most people with my clocks are getting around 16k graphics, and I am in the 15,300 range. I think I need to look at stability and voltage some more.

Thanks for the comparison.


----------



## NBrock

So far I am gaming/Valley benchmark stable at 1200 core and 600 HBM. I can complete most Folding@home work units at 1200/600, but a few throw errors that don't at 1180/600.
Voltage is +65 and the power slider is maxed.


----------



## Alastair

Quote:


> Originally Posted by *NBrock*
> 
> So far I am gaming/Valley benchmark stable at 1200 core and 600 HBM. I can complete most Folding@home work units at 1200/600, but a few throw errors that don't at 1180/600.
> Voltage is +65 and the power slider is maxed.


that sounds great!


----------



## Scorpion49

Anyone want a $400 Nano?


----------



## The Mac

Holy smokes..


----------



## xer0h0ur

Breh, my wallet can't handle the deals. Make it staph!


----------



## NBrock

DAMN THAT'S A GREAT PRICE! Man if I hadn't gotten the Fury X I probably would have picked that up in a heartbeat.


----------



## The Mac

Dammit, I want it, but it's not enough of an upgrade over my 390 Vapor-X OC.


----------



## xer0h0ur

I want to build a small form factor PC for LANs around the time Zen debuts. Probably will wind up sticking a Nano in it. I can only hope the price will be as low or lower by end of next year.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone want a $400 Nano?


Trade you a 980 kpe with EVERYTHING!
1 week old!


----------



## Noirgheos

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Trade you a 980 kpe with EVERYTHING!
> 1 week old!


Come back to the Red team my friend...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> Come back to the Red team my friend...


LOL

I don't even care about that anymore, I just love playing with new toys, my friend. My attention span is just too short these days.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone want a $400 Nano?


If it was anyone except Tiger Direct, I'd jump on it.
If someone else wants to be the sacrificial lamb, I'll follow.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *Arizonian*
> 
> Tested my new *XFX Triple D Fury's* legs out just now running Crimson drivers on a IPS 1440p 60Hz monitor.
> 
> *Unigine 'Valley' Benchmark 1.0 Extreme HD 76.8 Score*
> 
> 
> Spoiler: Warning: Spoiler!


It's posts like _this_ that make me think that there is something very wrong with my system.



^
That's what I'm getting on Crossfire Fury X's with full EK blocks on Crimson.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> It's posts like _this_ that make me think that there is something very wrong with my system.
> 
> 
> 
> ^
> That's what I'm getting on Crossfire Fury X's with full EK blocks on Crimson.


Don't sweat it too much man, all the Heaven benches are garbage for multi-GPU testing. The scaling is terrible..

Ask him what his crack-strike score is


----------



## josephimports

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> It's posts like _this_ that make me think that there is something very wrong with my system.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> ^
> That's what I'm getting on Crossfire Fury X's with full EK blocks on Crimson.


Check and see if the load indicator lights are fully lit on both GPUs during the run.

Stock crossfire Fury X on Crimson drivers.


1100/550


----------



## NBrock

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> It's posts like _this_ that make me think that there is something very wrong with my system.
> 
> 
> 
> ^
> That's what I'm getting on Crossfire Fury X's with full EK blocks on Crimson.


You know what will help that benchmark number a lot? Turn off frame pacing for the program's profile in Crimson or CCC.

Just realized I don't see the frame pacing options in Crimson







It worked well for my 295x2 in Valley with CCC.


----------



## xer0h0ur

Quote:


> Originally Posted by *NBrock*
> 
> You know what will help that benching number a lot? Turn off frame pacing for the program's profile in Crimson or CCC.
> 
> Just realized I don't see the frame pacing options on Crimson
> 
> 
> 
> 
> 
> 
> 
> It worked well for my 295x2 in Valley with CCC.


Assuming the global setting sticks, it's right there: Gaming -> Global Settings -> Frame Pacing


----------



## clubber_lang

Hey guys... I have a Sapphire Fury Tri-X card here (freakin' love it!!) and I have a question about performance. I play mostly racing sims like iRacing, Game Stock Car, rFactor 2, Project CARS, etc.

Playing Project CARS, which demands quite a bit from a single card, I'm still up around 65-75 FPS with basically everything cranked to max except the grass (I turned that off because it's a frame rate killer). Using MSI Afterburner and monitoring it for a good 1-2 hours straight of driving, the card is maxed out at 100% the whole time, but it also stays extremely cool doing it. I never exceed 45-47C; 47C is the hottest it's ever gotten.

My question is, it seems I have some headroom to push the card a bit. But since I'm capped at 75 FPS by FreeSync on my Acer 34" XR34CK monitor and the card is pushing damn near that limit all the time anyway, what would you suggest I do to boost performance a little? I'm going to be trying out some FPS games soon and wanted all the headroom I could get. I know some of these newer FPS games can wreak havoc on a single card.

I am simply amazed by this R9 Fury. Actually shocked that it can run at 100% for 1-2 hours straight and never get above 47C! Man, my 7970s in CFX would get up around 70+C and still not perform as well (in race sims, most don't get along with CFX or SLI), or make my race sims look near this good. Best damn investment I could have made was buying this card. This thing kicks ass in the racing sims!


----------



## 98uk

Quote:


> Originally Posted by *clubber_lang*
> 
> Hey guys... I have a Sapphire Fury Tri-X card here (freakin' love it!!) and I have a question about performance. I play mostly racing sims like iRacing, Game Stock Car, rFactor 2, Project CARS, etc.
> 
> Playing Project CARS, which demands quite a bit from a single card, I'm still up around 65-75 FPS with basically everything cranked to max except the grass (I turned that off because it's a frame rate killer). Using MSI Afterburner and monitoring it for a good 1-2 hours straight of driving, the card is maxed out at 100% the whole time, but it also stays extremely cool doing it. I never exceed 45-47C; 47C is the hottest it's ever gotten.
> 
> My question is, it seems I have some headroom to push the card a bit. But since I'm capped at 75 FPS by FreeSync on my Acer 34" XR34CK monitor and the card is pushing damn near that limit all the time anyway, what would you suggest I do to boost performance a little? I'm going to be trying out some FPS games soon and wanted all the headroom I could get. I know some of these newer FPS games can wreak havoc on a single card.
> 
> I am simply amazed by this R9 Fury. Actually shocked that it can run at 100% for 1-2 hours straight and never get above 47C! Man, my 7970s in CFX would get up around 70+C and still not perform as well (in race sims, most don't get along with CFX or SLI), or make my race sims look near this good. Best damn investment I could have made was buying this card. This thing kicks ass in the racing sims!


Actually, for PCars the Nvidia lineup is much better.

But anyway, I have the same card as you and play DiRT, PCars, Assetto Corsa... and yes, the Fury is great. The cooler on it is awesome; I've never seen it go over 35% fan or temps over 65C.

On stock volts I could do about 1050MHz on the core, but it seems the latest Crimson beta drivers (the fan-fix ones) changed something, as it's somewhat less stable now...


----------



## clubber_lang

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Don't sweat it too much man, all the heaven benches are garbage for multi gpu testing. The scaling is terrible..
> 
> Ask him what his crack-strike score is


I haven't read much of this thread, but this one had me scratching my head a little. I pulled out my two Sapphire 7970s a few weeks ago and replaced them with a new Sapphire R9 Fury Tri-X card, but I ran some Heaven benchmarks on them before I did. The crossfire performance damn near blew me away. I don't know how to overclock, so everything is left at stock speeds. But these were the results, and I'd say I had damn near 98% scaling on those cards in Heaven. I set Heaven up as high as it would go (I think?)


----------



## clubber_lang

Quote:


> Originally Posted by *98uk*
> 
> Actually, for PCars the Nvidia lineup is much better.
> 
> But anyway, I have the same card as you and play dirt, PCars, assetto corsa... And yes, the Fury is great. The cooler on it is awesome and I've never seen it go over 35% fan and never seen temps over 65c.
> 
> On stock volts, I could do about 1050mhz on the core, but it seems the latest Crimson beta drivers (fan fix ones) changed something as that is more unstable somewhat...


Yeah, I didn't want to try the new Crimson drivers just yet. They still seem pretty buggy, and in the racing sim world not many guys have had much luck getting them to run well. I'm using the 15.7.1 drivers right now and they seem to be performing pretty well.

On the Project CARS thing, I know Nvidia definitely has the upper hand. Nvidia is strong in some sims and weak in others, and vice versa with the AMD cards. And don't get me wrong, even though I'm running an AMD R9 in PCars, everything is cranked and it looks freakin' beautiful at 3440x1440!

I was just amazed how cool it stays while running at 100%. Thought maybe I could push it a little more, but I'm not sure I really need to. More curious than anything else.


----------



## clubber_lang

Damn it! I just realized the guys above benched a Valley benchmark....not Heaven! DOH!


----------



## Orgios

Furys are not that good at 1080p; if you really want this baby to shine, bench at 1440p and 2160p.


----------



## petrvs

Quote:


> Originally Posted by *sugarhell*
> 
> It is. A fury pro at 1080 it should score close to 17k gs.


When you say Fury Pro, do you mean any Fury (non-X)?


----------



## Alastair

Quote:


> Originally Posted by *petrvs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> It is. A fury pro at 1080 it should score close to 17k gs.
> 
> 
> 
> When you say fury pro you mean any fury (non x) ?

Yip cause that uses the Fiji Pro silicone.


----------



## The Mac

Quote:


> Originally Posted by *Alastair*
> 
> Yip cause that uses the Fiji Pro silicone.


Fiji pro breast augmentation?

yes please.

lol


----------



## Noirgheos

Quote:


> Originally Posted by *Orgios*
> 
> Furys are not that good at 1080p , if you really want this baby to shine bench at 1440p and 2160p


What do you mean? I won't be getting full performance at 1080p? That's what I get from your reply.

I am consistently beating the 980 at 1080p.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Noirgheos*
> 
> What do you mean? I won't be getting full performance at 1080p? Thats what I get from your reply.
> 
> I am consistently beating the 980 at 1080p.


I've ended up with a final OC on my 980 KPE of 1525/7740 @ 1.18v under load.









I see about even performance (1% or less difference, swinging either direction depending on the title) between this 980 at 1080p and my Fury running at 1060/560.

The Fury seems to be about 5% faster at those clocks in 4K, though.

I lost about 3-5 FPS in GTA V at high custom settings in 4K, but since HDMI had me limited to 30Hz anyway, I actually picked up over 20 frames using the 980.

It has been interesting making the switch from AMD to Nvidia after all this time, because the two architectures respond very differently from an overclocking standpoint. I have really enjoyed the 980, but there are a lot of variables when overclocking it. The AMD card is pretty straightforward, old-school overclocking: you just add juice, turn the fan up, and keep pushing until it's out of room. With the 980, I actually found my highest OC by undervolting 7mV.









I will probably run this card until we see Pascal and Arctic Islands, and what exactly they are capable of.

I really enjoyed the Fury though, and how cool it stayed and how it performed, but I was disappointed with the lack of clocking room... that's not really a slight against the card itself, just a negative for someone like me who REALLY enjoys overclocking, tweaking things, and benchmarking...

One thing for certain, is that HBM is an amazing graphics memory architecture.


----------



## Noirgheos

Is anyone getting clock fluctuations resulting in stuttering in certain games after Crimson? I'm getting it in games; any idea how to fix it?

EDIT: Disabling Witcher 3's profile helped in that game, but stutters are still there. Anyone know how to fix the fluctuating clocks?


----------



## 98uk

Quote:


> Originally Posted by *Noirgheos*
> 
> Is anyone getting clock fluctuations resulting in stuttering in certain games after Crimson? I'm getting it in games, any idea how to fix?
> 
> EDIT: Disabling Witcher 3's profile helped in that game, stutters are still there. Anyone know how to fix fluctuating clocks?


In bf4 Yep...


----------



## Noirgheos

Quote:


> Originally Posted by *98uk*
> 
> In bf4 Yep...


Rolled back to 15.11.1 CCC

I'll wait until they release a Crimson driver that doesn't mess up my clocks.


----------



## rocket47

Quote:


> Originally Posted by *Noirgheos*
> 
> Rolled back to 15.11.1 CCC
> 
> I'll wait until they release a Crimson driver that doesn't mess up my clocks.


I did that 2 days ago; no regrets.


----------



## dagget3450

Weird how they did the Crimson UI for things like changing resolutions or refresh rates. I rolled back as well due to the clock issue; I really feel they rushed Crimson out, or just haven't finished it yet. No more clock issues on 15.10 for me. I'll wait a while as well; maybe they'll have it settled next year.


----------



## Noirgheos

It really is a shame; in games that easily ran over 60 FPS, frametimes were really good and it felt so smooth. Really hope they have an official fix ready; I know there is an unofficial one.


----------



## waltercaorle

Edit


----------



## waltercaorle

http://forums.guru3d.com/showthread.php?t=404465
If someone wants to try...


----------



## Noufel

Hi Fury people








Anyone with Strix Fury CrossFire: I'd like to know about temps and, if possible, performance numbers (there are few reviews of Fury CFX).
Thanks


----------



## JonDuma

R9 Nano for only 529 USD


----------



## Kana-Maru

Quote:


> Originally Posted by *waltercaorle*
> 
> http://forums.guru3d.com/showthread.php?t=404465
> If someone want to try...


Tried it, and it appeared to work fine in one of my programs. No downclocking, which resulted in much higher FPS. I'll try it with Fire Strike later.


----------



## Dirgeth

Hey..
Just bought a new Gigabyte R9 Fury X 4GB in the Czech Republic, and it's still the first revision with a little bit of pump coil whine








But it's OK... not that bad











OC looks like 1120/560 without +mV
and 1170/560 +60mV


----------



## p4inkill3r

Quote:


> Originally Posted by *Dirgeth*
> 
> Hey..
> Just bought a new Gigabyte R9 Fury X 4GB in the Czech Republic, and it's still the first revision with a little bit of pump coil whine
> 
> 
> 
> 
> 
> 
> 
> 
> But its ok... not that bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OC looks like 1120/560 without +mV
> and 1170/560 +60mV


Welcome aboard!


----------



## ozyo

Quote:


> Originally Posted by *waltercaorle*
> 
> http://forums.guru3d.com/showthread.php?t=404465
> If someone want to try...


3DMark doesn't like it, or I'm doing it wrong
without CB
http://www.3dmark.com/3dm/9712856?
with CB
http://www.3dmark.com/3dm/9712777?


----------



## Thoth420

I've got some smexy Fury X pics coming soon; just finishing out the semester. My sig rig is finally finished and safely home, though.


----------



## Noirgheos

Anyone know how to disable the power management in Crimson? That's what's causing the stutters.


----------



## The Mac

AB/RP force 3d clocks?


----------



## Alastair

Quote:


> Originally Posted by *Dirgeth*
> 
> Hey..
> Just bought a new Gigabyte R9 Fury X 4GB in the Czech Republic, and it's still the first revision with a little bit of pump coil whine
> 
> 
> 
> 
> 
> 
> 
> 
> But its ok... not that bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OC looks like 1120/560 without +mV
> and 1170/560 +60mV


That is a second revision unit.


----------



## Semel

Quote:


> Originally Posted by *Noirgheos*
> 
> Anyone know how to disable the power management in Crimson? Thats whats causing the stutters.


Afterburner -> General -> Unofficial overclocking mode -> without PowerPlay support?

PS
I'm having a strange issue with Afterburner.

Trixx: OC stable. AB 4.2.0: OC unstable (crashes within 3-5 minutes).

I'm running my Fury at 1140/570, power limit +50, +72mV, custom fan curve (temperature never exceeds 52-53C under load).

Trixx: everything is stable. Furmark, benchmarks, hours of testing, no issues in different games and, most importantly, in Witcher 3 (one of the most overclock-sensitive games I've ever seen).

AB 4.2.0 (same OC): Witcher 3 crashes within 3-5 minutes every time.

I have the AMD Catalyst 15.11.1 driver installed, but I've also tried the latest Crimson driver and AB has the same issue there. Unfortunately I couldn't check Trixx on Crimson, because that driver resets the core clock to 1000 all the time (I didn't even touch OverDrive, obviously; I can OC only the memory).

I tried uninstalling Trixx, uninstalling the AMD drivers with DDU in safe mode and reinstalling them, uninstalling and reinstalling AB, and whatnot.

I only have "unlock voltage control" and "extend official overclocking limits" enabled (so that I can OC the HBM).

PS: Fury unlocked to 3840. Stable (no artifacts, no throttling, no nothing) in everything at default clocks and, as I've mentioned, at my current OC with Trixx.


----------



## Gamedaz

Quote:


> Originally Posted by *dagget3450*
> 
> Weird how they did the crimson ui for things like changing resolution or refresh rates. I rolled back as well due to clock issue, but i really feel they rushed crimson out or just haven't finished it yet? No more clock issues on 15.10 for me. Ill wait a while as well maybe they will have it settled next year.


* I've had issues downloading and extracting it onto my system; apparently AV blocks it or slows down the extraction. I'm currently waiting for an updated release with bug fixes for this and the other issues users are experiencing with the Crimson drivers. Currently on 15.11 with no major issues yet. I'm able to play NBA 2K15 on AMD drivers without crashing; my previous Nvidia card with updated drivers would not play the game without crashing every 5 minutes.


----------



## Semel

I think I know what causes crashes when using Afterburner vs. Trixx.

Check the voltages (right column):

Trixx stock voltage: http://i.imgur.com/0X7Wo6N.jpg

AB stock voltage: http://i.imgur.com/VO5s4rf.jpg

Now, pay attention:

Trixx OC +72mV: http://i.imgur.com/DEgWmYi.jpg

AB +72mV: http://i.imgur.com/wzuW5w8.jpg

It looks like Trixx actually pushes more than +72mV for whatever reason, whereas AB is more accurate in this regard.

PPS

OK... I've increased +mV in AB to +96mV (which should be enough, judging by the stable Trixx OC voltage) and I still got a crash within 2-3 minutes.

Judging by the AB logs (http://i.imgur.com/x4YLyLQ.jpg), *this time AB was pushing less voltage* than specified most of the time.

That's all using the so-called "official" OC mode in AB, not the unofficial mode that Trixx uses.

I guess I'll have to stick to Trixx..
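If anyone wants to automate that requested-vs-delivered comparison, here's a rough Python sketch over a hypothetical hardware-monitor log export. The column names and the numbers below are invented for illustration; they are not Afterburner's actual log format, so adapt the parsing to whatever your export looks like.

```python
import csv
import io

# Hypothetical excerpt of a hardware-monitor log export;
# columns and values are made up for illustration only.
LOG = """timestamp,requested_mv,reported_mv
00:01,1272,1266
00:02,1272,1259
00:03,1272,1270
00:04,1272,1255
"""

def undervolt_samples(log_text, tolerance_mv=10):
    """Return (timestamp, gap) for samples where the reported core voltage
    sits more than `tolerance_mv` below what was requested."""
    rows = csv.DictReader(io.StringIO(log_text))
    return [
        (r["timestamp"], int(r["requested_mv"]) - int(r["reported_mv"]))
        for r in rows
        if int(r["requested_mv"]) - int(r["reported_mv"]) > tolerance_mv
    ]

for ts, gap in undervolt_samples(LOG):
    print(f"{ts}: {gap} mV below requested")
```

With real logs you'd eyeball the flagged timestamps against when the crash happened.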


----------



## Tgrove

Double post


----------



## Tgrove

Quote:


> Originally Posted by *Noirgheos*
> 
> Anyone know how to disable the power management in Crimson? Thats whats causing the stutters.


Use ClockBlocker; it has worked wonders for me:

http://forums.guru3d.com/showthread.php?t=404465


----------



## xer0h0ur

Semel, if you were by chance using the Afterburner OSD, don't. I have had issues in the past where RivaTuner caused game crashes when Afterburner's OSD was active.


----------



## Semel

Quote:


> Originally Posted by *xer0h0ur*
> 
> Semel if you were by chance using the Afterburner OSD, don't use it. I have had issues with rivatuner in the past that has caused game crashing by having Afterburner's OSD active.


The solution to the AB issues I had is to enable "unofficial overclocking mode with PowerPlay support". Trixx actually uses this unofficial mode itself.

This way AB pushes exactly the same voltage Trixx does at the same settings.


----------



## xer0h0ur

At least you figured it out then. Good man.


----------



## Semel

Quote:


> Originally Posted by *xer0h0ur*
> 
> At least you figured it out then. Good man.


Nah, it was Unwinder who told me to do it







I wonder though whether AB's unofficial mode with PowerPlay support works with Crimson... Trixx doesn't seem to.


----------



## Leinei

Quote:


> Originally Posted by *Dirgeth*
> 
> Hey..
> Just bought a new one Gigabyte R9 Fury X 4GB in Czech Republic and still have first revision with little bit coiling pump noise
> 
> 
> 
> 
> 
> 
> 
> 
> But its ok... not that bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OC looks like 1120/560 without +mV
> and 1170/560 +60mV


Solid first post!


----------



## battleaxe

So when do the new cards come out? With HBM2 on them? Is there a prospective date yet?


----------



## Noirgheos

So how do I OC the HBM VRAM? MSI AB said they allow it in their recent update.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> So how do I OC the HBM VRAM? MSI AB said they allow it in their recent update.


Extend official overclocking limits.


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> So when do the new cards come out? With HBM2 on them? Is there a prospective date yet?


Nothing from the Fiji generation will have HBM2, so the cards you're thinking of are the next-generation Arctic Islands video cards. There is no official date for those, just as there is no official date for Pascal. It's believed Pascal will debut in Q2 and Arctic Islands in Q3, but that is all speculation.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> Nothing from the Fiji generation will have HBM2. So the cards you're thinking of are the next generation Arctic Islands video cards. There is no official date for those just as there is no official date for Pascal either. Its believed Pascal will debut Q2 and Arctic Islands Q3 but that is all speculation.


Arctic Islands. Cool. Thanks man. Hoping they are great cards.


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> Arctic Islands. Cool. Thanks man. Hoping they are great cards.


Yup, I am excited to see what AMD has in store for us for that generation. Arctic Islands is supposed to be a new revision of GCN (possibly GCN 2.0, AMD only officially recognizes there being two existing versions of GCN despite us distinguishing between 1.0, 1.1 and 1.2). Its also going to have a process node shrink from 28 nanometer to 16 nanometer (albeit a hybrid instead of a true 16 nanometer process). So they will be able to pack higher transistor counts into the dies than before (AMD and Nvidia claim around double the transistor counts of current gen). Then there is of course the benefit and use of HBM2 bringing about double the bandwidth and substantially higher vRAM capacity.

The outlier in all of this, to me, is the inclusion or exclusion of DisplayPort 1.3. DP 1.3 was finalized a long time ago and has been awaiting use. It's going to be the first monitor/TV connection capable of delivering UHD 3840x2160 (often referred to as 4K) @ 120Hz.

Another thing, which Nvidia's cards clearly don't lack yet AMD's current gen does, is HDMI 2.0. It was a bit of a head-scratcher to see AMD not put HDMI 2.0 into the Fiji reference design. To the average monitor user this was a non-issue, but to all those people who use TVs instead of monitors it was nonsensical, as nearly no TVs have a DisplayPort, which locked them to the older HDMI's 30Hz versus HDMI 2.0's 60Hz.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yup, I am excited to see what AMD has in store for us for that generation. Arctic Islands is supposed to be a new revision of GCN (possibly GCN 2.0, AMD only officially recognizes there being two existing versions of GCN despite us distinguishing between 1.0, 1.1 and 1.2). Its also going to have a process node shrink from 28 nanometer to 16 nanometer (albeit a hybrid instead of a true 16 nanometer process). So they will be able to pack higher transistor counts into the dies than before (AMD and Nvidia claim around double the transistor counts of current gen). Then there is of course the benefit and use of HBM2 bringing about double the bandwidth and substantially higher vRAM capacity.
> 
> The outlier in all of this to me is the inclusion or exclusion of Displayport 1.3. DP 1.3 has been finalized a long time ago and has been awaiting use. Its going to be the first monitor/tv connection capable of delivering UHD 3840x2160 (often referred to as 4K) @ 120Hz.
> 
> Another thing, which Nvidia's cards clearly don't lack yet AMD's current gen does, is the inclusion of HDMI 2.0. It was a bit of a head scratcher to see AMD not put in HDMI 2.0 into the Fiji reference design. To the average monitor user this was a non-issue but to all those people which use TVs instead of monitors this was nonsensical as nearly no TVs have a Displayport and were locked to the older HDMI's 30Hz versus HDMI 2.0's 60Hz.


What do you mean by hybrid? It won't be a full 16nm die?


----------



## The Mac

GloFo/Samsung is 14nm, not 16nm.

16nm is TSMC.

It's a hybrid 14/20nm process.


----------



## The Mac

Quote:


> Originally Posted by *xer0h0ur*
> 
> Another thing, which Nvidia's cards clearly don't lack yet AMD's current gen does, is the inclusion of HDMI 2.0. It was a bit of a head scratcher to see AMD not put in HDMI 2.0 into the Fiji reference design. To the average monitor user this was a non-issue but to all those people which use TVs instead of monitors this was nonsensical as nearly no TVs have a Displayport and were locked to the older HDMI's 30Hz versus HDMI 2.0's 60Hz.


They chose to go the dongle route, club3d just released it.

They had been working with AMD on it for a while.


----------



## xer0h0ur

Quote:


> Originally Posted by *The Mac*
> 
> They chose to go the dongle route, club3d just released it.
> 
> They had been working with AMD on it for a while.


Look, I get that, but it's not like the cards themselves are cheap enough to justify spending money on a DisplayPort-to-HDMI 2.0 dongle on top of the cost of the card. AMD isn't converting users through that methodology. Then there is the fact that these dongles have a high failure rate regardless of which company makes them, so you are also risking having to spend on a replacement sometime in the foreseeable future. It's not a smart tactic, in my opinion.


----------



## xer0h0ur

Quote:


> Originally Posted by *The Mac*
> 
> GloFo/Samsung is 14nm, not 16nm,
> 
> 16nm is TSMC.
> 
> its a hybrid 14/20.


I'm not sure who you're responding to, but I believe AMD themselves had confirmed that Zen will be manufactured on the 14nm process and Arctic Islands on the 16nm process. Unless something has changed in the past few weeks that I missed.


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> What do you mean by hybrid? It won't be a full 16nm die?


http://www.dailytech.com/TSMC+Hypes+Its+Upcoming+10+nm+Process+Amid+Struggles+to+Hit+Volume+at+16+nm/article37298.htm

"TSMC's strategy to achieve 16 nm production is controversial, as it's built "on top of" TSMC's 20 nm process. It uses 16 nm transistors. However, for the backplane it uses 20 nm interconnects (bonding pads, electrical contacts, insulating layers, and metal layers) making the CLN16FFC/FF+ nodes oddball hybrids of 16 nm and 20 nm technologies (to be fair, analysis by Chipworks indicates Samsung may be using this approach, too)."

So 16 nanometer transistors and 20 nanometer interconnects.


----------



## The Mac

Arctic Islands will be made by GloFo, not TSMC. They do not have a 16nm process.

It was co-developed with Samsung.

It's 14LPE, a 14/20nm hybrid.

AFAIK, Arctic Islands was always going to be made by GloFo.


----------



## xer0h0ur

http://wccftech.com/amd-arctic-islands-16nm-launch-coming-quarters/

Take a read. There is apparently no official confirmation of which foundry is producing Arctic Islands; however, this particular statement by Lisa Su sounds like a slip-up to me: "And in the third quarter, we also taped out multiple products in FinFET technologies across both of our foundry partners that are on track to enter production next year."

So that statement, along with the reports floating around, is why people are drawing that conclusion: that GlobalFoundries would be making Zen and TSMC would be making Arctic Islands. Pascal is being manufactured by TSMC, btw, so it's not a stretch at all that they will end up producing both.


----------



## The Mac

Dunno, first I've heard of that. The rumor has been 14nm GloFo for several months now.


----------



## xer0h0ur

Yeah, you're right about that. For months it was that GlobalFoundries would be producing Arctic Islands at 14 nanometers, and the AMD trolls were shoving that in the Nvidia trolls' faces non-stop.


----------



## The Mac

Quote:


> Originally Posted by *xer0h0ur*
> 
> Look, I get that but its not like the cards themselves are cheap enough to justify spending the money on a displayport to hdmi 2.0 dongle on top of the cost of the card. AMD isn't converting users through that methodology. Then there is the fact that these dongles have a high failure rate regardless of which company makes them so you are also risking the possibility of having to spend on a replacement sometime in your foreseeable future. Its not a smart tactic in my opinion.


There is no data on failure rates of this device. Yet.

It's $35. Considering how tiny the 4K 60Hz TV market is (monitors have DP), it's a more than reasonable solution, and price.

The reality is DP is a superior connector, and AMD would prefer you use it.


----------



## xer0h0ur

Bud, you're talking to a guy that went ****** mode levels of overboard on his rig to make it 4K @ 60Hz ready. To me, $35 is a non-factor if that video card is what does it for me. I just know it's a problem on a global scale when people overseas have to pay ridiculous prices for that same dongle plus shipping, or outrageous prices from suppliers that have them on hand overseas. Then, of course, there is the crowd that refuses to pay extra for something separate when the competition's hardware already includes it. Some people view that as an inconvenience.

You're right about the failure-rate thing. I was in fact jumping to conclusions there, simply based off the terrible failure rates suffered by all the DisplayPort adapters that came before this one. One would hope they sorted it out for this one.


----------



## Maximization

I could not take it anymore; ordered 2 Sapphire-brand Fury Xs and a 40" 4K monitor. Can't wait!!!!!


----------



## looncraz

I have a hypothesis as to how the camp that suspects AMD GPUs will be on 16 FF+ and the camp that suspects them to be on 14nm LPP could both be right... and it actually would have a LOT of advantages for AMD if true.

1. The highest-end Arctic Islands GPU is on 16nm FF.
2. The rest are on 14nm LPP.

Advantages for AMD:

1. If 16nm is delayed, nVidia also suffers.
2. If 16nm yields are bad, nVidia also suffers.
3. Smaller dies on 14nm offer power and cost advantages over nVidia on 16nm.
4. If 14nm dies are available first, AMD can beat nVidia to the next generation.
5. If 16nm is first, AMD will, at worst, essentially tie with nVidia to market.

And I'm sure there are others. The only downside is porting GCN2 to two processes, but that may not be such a big deal, as it's not much more than converting a database and making up for any design-library incompatibilities, which might be rather small considering the apparent similarity between these processes.

Thoughts?


----------



## The Mac

They are both FinFET, btw.

It's all a lot of guessing at this point.

Personally, I'd like to see them use GloFo, as they have Samsung backing the node, and it would separate them from any problems TSMC may encounter slowing down Nvidia.


----------



## HagbardCeline

A few pages back, I talked about how Battlefront wasn't displaying right in certain situations since upgrading my drivers to crimson. Here are the screencaps. Another symptom, is that when I hover my mouse over certain buttons in the menus, the text disappears. Going to be wiping the drivers and installing older ones shortly.


----------



## Scorpion49

Interesting, Sapphire now has an R9 Fury Nitro card with custom PCB and 1050mhz clock with a DVI port!

Quote:


> The fastest R9 Fury in the market.
> 
> *Equipped with a SAPPHIRE original PCB design. An improved robust and efficient power design.*
> 
> A 20% improvement in the reduction of the VRM temperature.
> 
> Better connectivity: Native support of DL-DVI-D, plus HDMI + DP x 3.
> 
> *Minimal Coil Whine. Choke has been fine tuned to minimize the coil whine.*


----------



## The Mac

The reference Furies don't have a DVI port?


----------



## 98uk

Quote:


> Originally Posted by *The Mac*
> 
> the reference furies dont have a DVI port?


Mine only has HDMI and DP.

I use an adaptor to DVI.


----------



## xer0h0ur

Quote:


> Originally Posted by *The Mac*
> 
> the reference furies dont have a DVI port?


Don't quote me on this one, but I vaguely remember the Strix having a DVI port?

Edit: The Gigabyte Windforce Fury also appears to have a DVI port.


----------



## Alastair

So much beauty. So much wow.


----------



## xer0h0ur

Much graphics, such beauty, skeet skeet.


----------



## SuperZan

The beautiful symmetry!


----------



## MalsBrownCoat

Your blocks should have come with single-space mounting brackets.

Both of mine did.

Check your box? Unless you're wanting to keep the dual-space brackets...


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Interesting, Sapphire now has an R9 Fury Nitro card with custom PCB and 1050mhz clock with a DVI port!


So in a twist of fate, I ended up requesting an exchange on the XFX Triple D Fury I returned due to a loose fan blade above 60% fan speed, and have just ordered a *Sapphire Nitro Fury* instead. Should be here before the weekend.


Quote:


> Originally Posted by *Alastair*
> 
> So much beauty. So much wow.
> 
> 
> Spoiler: Warning: Spoiler!


Really nice


----------



## The Mac

Sapphire has better coolers anyway...


----------



## Noirgheos

So I talked to @SapphireEd on Twitter today. He said there is no difference between the Tri-X and the Nitro other than the custom PCB, the ports and the cooler. He said temps and noise should be about the same, with a little less coil whine, if any is present. My Tri-X sure as hell has minimal coil whine.

He also said that at similar clocks they will perform pretty much the same. Honestly, to me, if this is more than $20 over the Tri-X, it's simply not worth it.


----------



## xer0h0ur

Quote:


> Originally Posted by *Noirgheos*
> 
> So I talked to @SapphireEd on Twitter today. He said there is no difference between the Tri-X and the Nitro other than the custom PCB, the ports and the cooler. He said temps and noise should be about the same, a little less coil whine, if any is present. My Tri-X sure as hell has minimal coil whine.
> 
> He also said at similar clocks they will perform pretty much the same. Honestly. to me, if this is more than $20 extra than the Tri-X, its simply not worth it.


FWIW, wccftech.com was reporting that the Nitro card will have a 1050MHz base clock versus the 1000MHz base clock on the Tri-X, so either Ed is wrong or WCCFTech is wrong.

Edit: whoops. Clock speed was already said here.


----------



## Arizonian

Quote:


> Originally Posted by *The Mac*
> 
> Sapphire has better coolers anyway...


At the same $499.99 price the OC+ was, it was a good deal for an even exchange since I wasn't getting a refund. They are selling the Nitro OC for $489.99 @ 1020MHz core. The Newegg rep said there weren't many in stock.
Quote:


> Originally Posted by *Noirgheos*
> 
> So I talked to @SapphireEd on Twitter today. He said there is no difference between the Tri-X and the Nitro other than the custom PCB, the ports and the cooler. He said temps and noise should be about the same, a little less coil whine, if any is present. My Tri-X sure as hell has minimal coil whine.
> 
> He also said at similar clocks they will perform pretty much the same. Honestly. to me, if this is more than $20 extra than the Tri-X, its simply not worth it.


I thought the difference was the Black Diamond chokes for the coil whine?


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> So much beauty. So much wow.


Aside from improving aesthetics and saving space, what do the EK blocks provide performance-wise that the stock Fury X coolers don't? Given they are poor overclockers and all, I guess I don't see the point. I am skipping water blocks this time around on my Fury Xs unless there is proof they will do better on EK. I would love to see what results Elmy gets, as I suppose he is putting 4 of them under water.


----------



## Luxkeiwoker

Hey guys,

Yesterday I received my R9 Nano and noticed that it doesn't have UEFI support. It's a Sapphire card.

Is there any vBIOS available with UEFI support for the R9 Nano?


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> Aside from aesthetically improving looks, and space. What do the EK blocks provide performance wise that the stock FuryX coolers don't? Given they are poor overclockers and all, i guess i don't see the point. I am skipping water blocks this time around on my FuryX's unless there is proof they will do better on EK. I would love to see What results Elmy gets as i suppose he is putting 4 of them under water.


They lower the core and VRM temps quite a bit and allow higher overclocking. Same as always for full-cover blocks.


----------



## xer0h0ur

Water cooling master race. I don't take steps backwards. Next rig will just have an even better water cooling loop.


----------



## flopper

Quote:


> Originally Posted by *dagget3450*
> 
> Aside from aesthetically improving looks, and space. What do the EK blocks provide performance wise that the stock FuryX coolers don't? Given they are poor overclockers and all, i guess i don't see the point. I am skipping water blocks this time around on my FuryX's unless there is proof they will do better on EK. I would love to see What results Elmy gets as i suppose he is putting 4 of them under water.


Custom-made is better overall. Those of us who do water cooling tend to make sure we get the optimal for our money. I recently changed my CPU water block from an EK one that was getting a bit old to the newest iteration. It's a much better design, so I'm happy with the updated block for mounting and coolness.

If I'd bought a Fury with water cooling, it's likely I wouldn't change it, but then I will most likely do that with the 14nm cards, as I've had my eye on them for a while now. Waiting for some serious action from next year's cards.


----------



## Noirgheos

https://twitter.com/i/web/status/676932366133280771

Is this dangerous, guys? I mean the PCB? I know PCBs are really flexible, but still... I'll fix it in around a month, but just for peace of mind until then.


----------



## xer0h0ur

Quote:


> Originally Posted by *Noirgheos*
> 
> 
> __ https://twitter.com/i/web/status/676932366133280771
> Is this dangerous guys? I mean the PCB? I know PCBs are really flexible but still... will fix in around a month but just for peace of mind until then.


Perfectly average, if not less-than-average, GPU sag from how I see it in that photo. It wouldn't worry me, at least.


----------



## Noirgheos

Quote:


> Originally Posted by *xer0h0ur*
> 
> Perfectly average if not less than average GPU sag from how I see it in that photo. It wouldn't worry me at least.


But look towards the left end of the PCB; that's OK, right? It looks deflected.


----------



## SuperZan

Quote:


> Originally Posted by *dagget3450*
> 
> Aside from aesthetically improving looks, and space. What do the EK blocks provide performance wise that the stock FuryX coolers don't? Given they are poor overclockers and all, i guess i don't see the point. I am skipping water blocks this time around on my FuryX's unless there is proof they will do better on EK. I would love to see What results Elmy gets as i suppose he is putting 4 of them under water.


The EK blocks are certainly better than the Tri-X air cooling that those Furies had before.

As for performance over stock Fury X cooling, it's been said above, but a ten-degree difference on that card could mean a lot when it comes to the already-stubborn overclocking. The Fury X AIO isn't -bad- but I wouldn't call it great by any means.


----------



## xer0h0ur

Quote:


> Originally Posted by *Noirgheos*
> 
> But look towards the left end of the PCB, thats ok right? It looks deflected.


You might be surprised by the amount of abuse these PCBs can take. I accidentally put a hell of a lot of pressure on the PCB of my 295X2 when I tried to double up the thermal pad over the PLX chip. It literally flexed the PCB hard as hell. I panicked when I noticed it and took everything apart then used TIM instead of a thermal pad on the PLX chip. I was sure I had permanently damaged the PCB but everything was fine.


----------



## Noirgheos

Quote:


> Originally Posted by *xer0h0ur*
> 
> You might be surprised by the amount of abuse these PCBs can take. I accidentally put a hell of a lot of pressure on the PCB of my 295X2 when I tried to double up the thermal pad over the PLX chip. It literally flexed the PCB hard as hell. I panicked when I noticed it and took everything apart then used TIM instead of a thermal pad on the PLX chip. I was sure I had permanently damaged the PCB but everything was fine.


Good to know. So I can wait until I get some fishing line to fix it.


----------



## Scorpion49

Quote:


> Originally Posted by *Noirgheos*
> 
> Good to know. So I can wait until I get some fishing line to fix it.


Something that causes a lot of flexing by the PCI-E bracket like that is the motherboard alignment. What you can do is lay the case on its side, take the card out, and then loosen the motherboard screws. After that, put the card in and tighten it first, before the motherboard, and it will help line up the slots a little better.


----------



## Noirgheos

Quote:


> Originally Posted by *Scorpion49*
> 
> Something that causes a lot of the flexing by the PCI-E brackets like that is the motherboard alignment. What you can do is lay the case on its side, take the card out and then loosen the motherboard screws. After that, put the card in and tighten it first before the motherboard and it will help to line up the slots a little better.


In my case the motherboard is too low, so if I loosened it, it would most likely get worse due to gravity.


----------



## Scorpion49

Quote:


> Originally Posted by *Noirgheos*
> 
> In my case the motherboard is too low, so if I loosened it, it would most likely get worse due to gravity.


Can't push it up any? Usually the board has some wiggle to it on the standoffs. Was just a thought.


----------



## Noirgheos

Quote:


> Originally Posted by *Scorpion49*
> 
> Can't push it up any? Usually the board has some wiggle to it on the standoffs. Was just a thought.


Sadly no. I'll try re-seating it when I get my new CPU and install it, but after that I'll resort to fishing line.


----------



## dagget3450

Quote:


> Originally Posted by *SuperZan*
> 
> The EK blocks are certainly better than the Tri-X air cooling that those Furies had before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for performance over stock Fury X cooling, it's been said above but a ten-degree difference on that card could mean a lot when it comes to the already-stubborn overclocking. The Fury X AIO isn't -bad- but I wouldn't call it great by any means.


Yeah, I think my point is: where is the proof that full-cover blocks allow more overclocking than the hybrid cooler on the Fury X? I suspect Fury X overclocking is fairly limited even with cooler temps. I am not referring to benchmark runs, either, but gaming and stability.


----------



## SuperZan

Quote:


> Originally Posted by *dagget3450*
> 
> Yeah, i think my point is where is the proof the full cover blocks are allowing more overclocking over the hybrid cooler on furyX. I suspect FuryX overclocking is fairly limited even with cooler temps. I am not referring to benchmark runs either but gaming and stability.


I don't think you're wrong, and obviously some definitive proof would be nice, but given the difficulties that mounting two fat radiators with limited-length tubing presents for a pair of Fury Xs, I think the extra potential overclocking headroom is just a nice bonus to the more centralised and concise full loop. If in fact some demonstrative evidence can be presented as far as improved OC potential, all the better. I have to think it would improve the limited OC on the Fury X, though whether or not it would be measurable in a game or benchmark is obviously debatable.


----------



## Jflisk

Quote:


> Originally Posted by *SuperZan*
> 
> I don't think you're wrong and obviously some finite proof will be nice, but given the difficulties of mounting two fat radiators with limited-length tubing that a pair of Fury X presents, I think the extra potential overclocking headroom is just a nice bonus to the more centralised and concise full-loop. If in fact some demonstrative evidence can be presented as far as improved OC potential, all the better. I have to think that it would improve the limited OC on Fury X, though whether or not it would be measurable in a game or benchmark is obviously debatable.


I have no preference either way. I kept the AIO on both my Fury Xs, although I could add water blocks to both of them; to me it just didn't make sense. When you hit the limit of the silicon lottery, that's it. It does not matter how many radiators you add. The AIO can be set for a 52C max at 100% fan, and I have never seen my fan hit 100% at 52C. I think the max OC without voltage is 1050/1100 safe. Adds a few points to 3DMark.


----------



## NBrock

Overclocking isn't terrible. Sure, you don't get as many MHz as Nvidia, but I was very happy with my Fury X. If I keep temps down (easy with two fans) I can run 600 on the HBM without an issue. My peak core clock is 1200. It's stable for gaming, but for Folding@home there are some work units that don't like 1200 on the core.


----------



## Alastair

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Your blocks should have come with single-space mounting brackets.
> 
> Both of mine did.
> 
> Check your box? Unless you're wanting to keep the dual-space brackets...


I'm keeping the brackets on because I can't find the original ones for my case.


----------



## Alastair

Quote:


> Originally Posted by *dagget3450*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So much beauty. So much wow.
> 
> 
> 
> 
> 
> Aside from aesthetically improving looks, and space. What do the EK blocks provide performance wise that the stock FuryX coolers don't? Given they are poor overclockers and all, i guess i don't see the point. I am skipping water blocks this time around on my FuryX's unless there is proof they will do better on EK. I would love to see What results Elmy gets as i suppose he is putting 4 of them under water.

Cards never break 40C, even when overclocked. Secondly, my Sapphire cards didn't come with water cooling. Thirdly, I don't trust the VRM cooling on the Fury X. Fourthly, if you are someone with a custom loop, I don't see how you can integrate the AIO into your already-existing loop. As I have already had a custom loop for two years, it seemed pretty logical to just add my cards back into my loop after taking the old ones out. And dunno about you, but 1200MHz at +125mV doesn't seem like a bad deal to me, considering I have the headroom to go all the way to +300mV thanks to the headroom provided by my custom loop.


----------



## MrKoala

Connecting the AIO block into a custom loop is possible and not difficult. I think there was a thread somewhere showing this with the 295X2; I'll see if I can find it.

But as you said, custom block is better especially for the VRM.


----------



## xer0h0ur

Quote:


> Originally Posted by *MrKoala*
> 
> Connecting the AIO block into a custom loop is possible and not difficult. I think there was a thread somewhere showing this with 295X2. I'll see if I can find it.
> 
> But as you said, custom block is better especially for the VRM.


It may make some level of sense for Fury X users to do that, since the AIO solution cools the VRM as well as the GPU, but on the 295X2 there is a distinct advantage to using a full-cover block, because the Asetek AIO cooler on the 295X2 does not cool the VRMs, only the dual GPUs. Those VRMs got legitimately toasty without some sort of intervention, be that changing the thermal pads to Fujipolys, doing the manual VRM fan-control mod, flat-out waterblocking the card, or some combination of these.


----------



## Dirgeth

So guys, what OC are you using for 24/7 gaming? Do you have +mV?

I just don't have a good feeling about VRM temps... yes, the core is under 50, but the VRM? Around 100 at default?


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> It may make some level of sense for Fury X users to do that since the AIO solution cools the VRM as well as the GPU but on the 295X2 there is a distinct advantage to using a full cover block because the Asetek AIO cooler on the 295X2 does not cool the VRMs. Only the dual GPUs. Those VRMs got legit toasty without some sort of intervention. Be that changing the thermal pads to Fujipolys, doing the manual VRM fan control mod, flat out waterblocking the card or some combination of these.


I put mine in a loop purely because I wanted a more reliable cooling solution and didn't want to worry about playing pump-noise-lottery swap games. My rig was all about being a looker this time anyway, and the cooler the Fury X comes with is just ugly, like any AIO CLC. If it isn't good enough for a CPU, then it certainly isn't good enough for my Fury X.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I put mine in a loop purely because I wanted a more reliable cooling solution and not have to worry about playing pump noise lottery swap games. My rig was all about being a looker this time anyway and the cooler the fury x comes with is just ugly like any AIO CLC. If it isn't good enough for CPU then it certainly isn't good enough for my Fury X


Breh, you tryna make me even more jelly of that beauty?


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> Breh, you tryna make me even more jelly of that beauty?


I feel like it is too legendary for my hands to even touch.


----------



## Noirgheos

15.12 is out, guys. Gonna see if it fixes the downclocking issues. If not, I'll use ClockBlocker.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> I feel like it is too legendary for my hands to even touch.


Dude, it's a work of art.


----------



## dagget3450

Quote:


> Originally Posted by *Noirgheos*
> 
> 15.12 is out guys. Gonna see if it fixes the downlocking issues. If not I'll use clockblocker.


I look forward to your results, as I am waiting this time to upgrade drivers. I am growing tired of using third-party apps to fix driver issues or a lack of features. If they do fix the clocking issue, then I'll gladly update to Crimson again.


----------



## Noirgheos

Quote:


> Originally Posted by *dagget3450*
> 
> I look forward to your results as i am waiting this time to upgrade drivers. I am growing tired of using 3rd party apps to fix driver issues or lack of features. If they do fix the clocking issue then i'll gladly update to crimson again.


Nope, no fix, but try this: https://community.amd.com/thread/17600

It's not a third-party program, and it worked for me. Before doing this, the clock issues were still there.

Good link: https://community.amd.com/thread/176003


----------



## huzzug

I think that link is dead. Might wanna update it.


----------



## Noirgheos

Well, it seems disabling ULPS didn't solve everything. Can anyone here comment on ClockBlocker's safety? It looks weird...

No need anymore, guys. I reverted back to 15.11.1 CCC.

Crimson is honestly a piece of garbage. They should have fixed this before releasing it.


----------



## The Mac

For you...

Works great for me; nice FPS uplifts in the games I play...


----------



## Elmy

Quote:


> Originally Posted by *dagget3450*
> 
> Aside from aesthetically improving looks, and space. What do the EK blocks provide performance wise that the stock FuryX coolers don't? Given they are poor overclockers and all, i guess i don't see the point. I am skipping water blocks this time around on my FuryX's unless there is proof they will do better on EK. I would love to see What results Elmy gets as i suppose he is putting 4 of them under water.


Ordering water blocks today.

Going with Aquacomputer, as they look like the best aesthetic fit for my build. The EK copper ones aren't bad either. I have been EK for my last 3 builds... going to try another brand just because, unless someone can talk me out of it.


----------



## JunkaDK

So... I installed the new drivers and tweaked MSI Afterburner, but I can't get my head around why I can't match the graphics score of this guy in 3DMark.

Look at this 3DMark comparison (I'm JunkaDK).

I see that he is using an old driver, but my GPU is clocked higher than his. Anyone have any idea what he could be doing differently? Maybe unlocked cores?

Regards, Junka


----------



## p4inkill3r

Quote:


> Originally Posted by *JunkaDK*
> 
> So.. i installed the new drivers. Tweaked MSI afterburner.. but i can't get my head around why i can't match the Graphics score of this guy in 3D Mark.
> 
> Look a this 3Dmark comparison (Im JunkaDK)
> 
> I see that he i using an old driver, but my GPU is clocked higher than his. Anyone have any idea what he could be doing different? Maybe unlocked cores?
> 
> Regards, Junka


Turn off the tessellation control in Crimson's 3DMark profile.

Here's one of mine for comparison: http://www.3dmark.com/fs/6802832


----------



## JunkaDK

Quote:


> Originally Posted by *p4inkill3r*
> 
> Turn off the tesselation control in Crimson's 3dMark profile.
> 
> Here's one of mine for comparison: http://www.3dmark.com/fs/6802832


Nice man

Thanks, will try that later and report back.

What OC program do you use? MSI? Can you share your settings to achieve 1180MHz?

I'm very impressed.

Junka


----------



## Arizonian

Look what arrived at the office today.

This is going to make for a long day; can't wait to get home and plug it in.


----------



## p4inkill3r

Quote:


> Originally Posted by *JunkaDK*
> 
> Nice man
> 
> Thanks, will try that later and report back.
> 
> What OC program do you use? MSI Afterburner? Can you share your settings to achieve 1180 MHz?
> 
> I'm very impressed.
> 
> Junka


Yes, I use Afterburner, with max voltage and power limit; I don't worry about stability or temps for 3DMark runs.


----------



## p4inkill3r

Quote:


> Originally Posted by *Arizonian*
> 
> Look what arrived at the office today.
> 
> This is going to make for a long day; can't wait to get home and plug it in.


Nice, the Nitros are some awesome looking cards.


----------



## Lorem Ipsum

Got a FreeSync monitor for my Tri-X Fury today, but found that my usual OC (1150 MHz) was unstable with it, despite being fine for hours of gaming previously. I've dropped it to 1100 MHz for now and I'll ramp it up slowly. Anyone else experienced poorer overclocking with FreeSync? Could it be a driver thing? I'm still on 15.10.

Or maybe the VRMs are getting tired, but I wouldn't have thought the card would age that fast with just +75 mV...


----------



## battleaxe

Quote:


> Originally Posted by *Noirgheos*
> 
> Well it seems disabling ULPS didn't solve everything. Can anyone here comment on clockblocker's safety? It looks weird...
> 
> No need anymore guys. I reverted back to 15.11.1 CCC.
> 
> Crimson is honestly a piece of garbage. They should fix this before releasing it.


I've noticed that AB is buggy for me. I have to use Trixx to get ULPS disabled; I can't even trust AB to turn it off anymore on my system. Just something to try if you were using AB.


----------



## Alastair

Quote:


> Originally Posted by *Arizonian*
> 
> Look what arrived at the office today.
> 
> This is going to make for a long day; can't wait to get home and plug it in.


Wait, what? A Nitro Fury?


----------



## The Mac

Where you been, man? They released them last week.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> Wait, what? A Nitro Fury?


Quote:


> Originally Posted by *The Mac*
> 
> Where you been, man? They released them last week.


They've been out for a while. The Nitro Fury X was released last week.


----------



## Arizonian

Quote:


> Originally Posted by *Alastair*
> 
> Wait, what? A Nitro Fury?


Yup. The OC is going for $489.99 and the OC+ for $499.99. At this price range, better than the 980.
Quote:


> Originally Posted by *p4inkill3r*
> 
> Nice, the Nitros are some awesome looking cards.






Quote:


> Originally Posted by *battleaxe*
> 
> They've been out for a while. The Nitro Fury X was released last week.


Hardeee harhar


----------



## The Mac

I am so tempted to get one...


----------



## Maximization

4K upgrade kit assembled. Goodbye, old 1200 x 1920. This might take hours, but it should be fun.


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> Yup. The OC is going for $489.99 and the OC+ for $499.99. At this price range, better than the 980.
> 
> 
> 
> Hardeee harhar


LOL... yeah. No one caught that except you... LOL


----------



## Maximization

Stupid question: does anyone know of drivers that work on Vista 64 for the Fury X? I'm using the basic drivers; I guess that's OK for work software, but there's no acceleration.


----------



## p4inkill3r

Quote:


> Originally Posted by *Maximization*
> 
> Stupid question: does anyone know of drivers that work on Vista 64 for the Fury X? I'm using the basic drivers; I guess that's OK for work software, but there's no acceleration.


Is there any reason in particular you're using Vista still?


----------



## Maximization

Quote:


> Originally Posted by *p4inkill3r*
> 
> Is there any reason in particular you're using Vista still?


Yeah, long story. The software was originally for XP 32-bit; I had to do some... stuff to make it work in Vista Ultimate. The software is out of publication, but it works for my needs. Modifying it further for Win 10 seems like a lot of work, plus 10 is less secure on the internet than a locked-down Vista. I know, stupid; well, that's why I still use Vista Ultimate. Win 10 is no problem, I am going to install the other X card now.


----------



## Maximization

My waterblocked, overclocked 7870s could barely get past 5000. My GOD!!!!!!!!!!


----------



## SuperZan

Quote:


> Originally Posted by *Maximization*
> 
> My waterblocked, overclocked 7870s could barely get past 5000. My GOD!!!!!!!!!!


;D Welcome to a brave new world!


----------



## Arizonian

*Sapphire Nitro R9 Fury* -- 4790K @ 4.6 GHz / 2560x1440 60 Hz IPS monitor
*Crimson 15.12 drivers*

*Club Validation Link*

*Benching*: highest temp 54C at 55% fan speed, on a manual fan curve with a 30% minimum fan speed starting at 30C; idles at 27C @ 30% fan speed.
*Gaming*: highest temp 58C at 60% fan speed on the same curve; idles at 27C @ 30% fan speed.
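Out of curiosity, the curve described above can be sketched as a simple piecewise-linear map. Only the 30C/30% floor and the 58C/60% gaming peak come from the post; the 80C/100% ceiling is purely my assumption for illustration:

```python
def fan_speed(temp_c):
    """Piecewise-linear fan curve: 30% floor below 30C, ramping to 60% at 58C
    (the reported gaming peak), then to 100% at an assumed 80C ceiling."""
    points = [(30, 30.0), (58, 60.0), (80, 100.0)]  # (temp C, fan %)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    # Linear interpolation between the two surrounding breakpoints.
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(27))  # idle: 30.0
print(fan_speed(44))  # halfway between 30C and 58C: 45.0
```

Most fan-control tools (Afterburner, Trixx) implement exactly this kind of breakpoint interpolation under the hood, so the sketch maps directly onto the sliders.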

*Sapphire Nitro 1100 MHz Core*

*FS 13544*

*FS Extreme 7163*

*FS Ultra 3864*

*3DMark11 P15524*

*3DMark11 X6089*

*Shadows Of Mordor* - Ultra - *Min 44 Max 66 Avg 60*
*Star Wars* - Ultra, Field of View 100%, "Walker Assault (40 player)" - *Min 82 Max 127 Avg 105*
*Crysis 3* - Very High - *Min 44 Max 94 Avg 58*
*Far Cry 4* - Ultra - *Min 49 Max 128 Avg 104*



Spoiler: Unigine Valley 77.6







*Sapphire Nitro 1125 MHz Core*

*FS 13654*

*FS Extreme 7278*

*FS Ultra 3957*

*3DMark 11 P15586*

*3DMark 11 X6266*

*Shadows Of Mordor* - Ultra - *Min 44 Max 81 Avg 60*



Spoiler: Unigine Valley 78.9







*Sapphire Nitro 1160 MHz Core*

FS *13890*
FS Extreme *7427*
FS Ultra *4052*

3DMark11 P *15828*
3DMark11 X *6392*



Spoiler: Unigine 80.1







*Sapphire Nitro 1175 MHz Core*

*FS 14026*

*FS Extreme 7519*

*FS Ultra 4099*

*3Dmark11 16067*

*3Dmark11 Extreme 6453*

1160 MHz and 1175 MHz temps reached 58C 62% Fan Speed

I don't hear any coil whine with the case open. Tested the fans up to 100%: no rattle at any level, unlike the one defective fan on the XFX TD I exchanged. I even got to keep Star Wars BF on the exchange, thanks Newegg.

Really happy with the temps, acoustics, and gaming performance. Best I've had from a single GPU on all accounts in my rig to date.









Now to get back to Star Wars BF, they are giving bonus experience and it's hard to pass up.









*EDIT*: Updated Sapphire Nitro 1175 MHz Core overclock scores.
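As a sanity check on how those clocks translate into scores, here's a quick back-of-the-envelope pass over the Fire Strike numbers from the post above ("efficiency" is just my shorthand for score gain divided by clock gain, not an official metric):

```python
# Core MHz -> Fire Strike score, taken from the results posted above.
runs = {1100: 13544, 1125: 13654, 1160: 13890, 1175: 14026}
base_clock = 1100
base_score = runs[base_clock]

for clock, score in sorted(runs.items())[1:]:
    clock_gain = clock / base_clock - 1   # fractional core-clock increase
    score_gain = score / base_score - 1   # fractional score increase
    print(f"{clock} MHz: +{clock_gain:.1%} clock -> +{score_gain:.1%} score "
          f"(efficiency {score_gain / clock_gain:.0%})")
```

Roughly half the clock gain shows up as score, which is typical when the benchmark is partly limited by memory bandwidth and the CPU-weighted combined test.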


----------



## Alastair

Oh wow. And how does that PCB look? Good? Coil whine? Think people will end up making blocks for those?

Update on my rig. Bled and now being leak tested. Looking good.


----------



## Maximization

Yeah, ordered 2 EK blocks. The 2 rads hanging outside my case is silly.


----------



## baii

Has anyone repasted an air-cooled Fury? Any noticeable gain?


----------



## Semel

*Arizonian*

Have you tried OCing your card?

Could you post firestrike results too?

Thanx.


----------



## Arizonian

Quote:


> Originally Posted by *Semel*
> 
> *Arizonian*
> 
> Have you tried OCing your card?
> 
> Could you post firestrike results too?
> 
> Thanx.


I hadn't run FS, but I thought I'd take a peek at what's under the hood, and here's what I got...

*Sapphire Nitro 1100 MHz Core*

*FS 13544*

*FS Extreme 7163*

*FS Ultra 3864*

*3DMark11 P15524*

*3DMark11 X6089*

*Sapphire Nitro 1125 MHz Core*

*FS 13654*

*FS Extreme 7278*

*FS Ultra 3957*

*3DMark 11 P15586*

*3DMark 11 X6266*

I added +12 mV at +125 MHz core, but I'm not sure I needed it; no extra voltage was needed to keep it at 1100 MHz core. Raised it to +18 mV when I tried for a +150 MHz overclock, but it crashed. Haven't attempted more voltage yet.

Temps are not holding this overclock back; it's running surprisingly cool. I didn't even hear it break a sweat, with highest temps of 54C at 55% fan speed during the entire benchmarking session.

I tried to push my 1125 MHz scores to 1150 MHz on the core, but Fire Strike crashed, and though 3DMark11 didn't crash, the scores got worse. Next step for me is to see if I can sustain 1125 MHz core while gaming; if so, I'll call this a good 24/7 OC.









Update new Unigine Score @ 1125 Mhz Core - *78.9*



*Crimson 15.12 drivers*


----------



## JunkaDK

Quote:


> Originally Posted by *Arizonian*
> 
> I hadn't run FS, but I thought I'd take a peek at what's under the hood, and here's what I got...
> 
> *Sapphire Nitro 1100 MHz Core*
> 
> *FS 13544*
> 
> *FS Extreme 7163*
> 
> *FS Ultra 3864*
> 
> *3DMark11 P15524*
> 
> *3DMark11 X6089*
> 
> *Sapphire Nitro 1125 MHz Core*
> 
> *FS 13654*
> 
> *FS Extreme 7278*
> 
> *FS Ultra 3957*
> 
> *3DMark 11 P15586*
> 
> *3DMark 11 X6266*
> 
> I added +12 mV at +125 MHz core, but I'm not sure I needed it; no extra voltage was needed to keep it at 1100 MHz core. Raised it to +18 mV when I tried for a +150 MHz overclock, but it crashed. Haven't attempted more voltage yet.
> 
> Temps are not holding this overclock back; it's running surprisingly cool. I didn't even hear it break a sweat, with highest temps of 54C at 55% fan speed during the entire benchmarking session.
> 
> I tried to push my 1125 MHz scores to 1150 MHz on the core, but Fire Strike crashed, and though 3DMark11 didn't crash, the scores got worse. Next step for me is to see if I can sustain 1125 MHz core while gaming; if so, I'll call this a good 24/7 OC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Update new Unigine Score @ 1125 Mhz Core - *78.9*
> 
> 
> 
> *Crimson 15.12 drivers*


My R9 Fury Strix is running at 1163 MHz, mem at 572 MHz, with max core, max voltage, and max cooler speed. Doing 15100 in FS with an i7-5930K at 4.7 GHz.


----------



## dagget3450

Quote:


> Originally Posted by *JunkaDK*
> 
> My R9 Fury Strix is running at 1163 MHz, mem at 572 MHz, with max core, max voltage, and max cooler speed. Doing 15100 in FS with an i7-5930K at 4.7 GHz.


Are you comparing graphics score or overall score? Regular Fire Strike is way more CPU-heavy, so more threads help a lot.


----------



## Toxsick




----------



## JunkaDK

Quote:


> Originally Posted by *dagget3450*
> 
> Are you comparing graphics score or overall score? Regular Fire Strike is way more CPU-heavy, so more threads help a lot.


I know the CPU adds a lot to the score. Graphics score is 17126 and Physics is 18448.


----------



## Semel

*Arizonian*
Quote:


> I tried to push my 1125 MHz scores to 1150 MHz on the core, but Fire Strike crashed, and though 3DMark11 didn't crash, the scores got worse. Next step for me is to see if I can sustain 1125 MHz core while gaming


You could try 1140, for instance. There's still room for improvement.

I got mine to 1140/570 stable at +72 mV (unofficial overclocking mode in AB/Trixx, so it's ~1.28 V under load) and +50 power. http://www.3dmark.com/3dm/9757182? (for some reason 3DMark doesn't detect my core/memory clocks)

I could go for 1150, but it requires a whopping +108 mV, and past 1150 (up to 1180) I get a negative performance gain, ending up at default-clock performance regardless of the +mV used.
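If the ~1.28 V at +72 mV reading above is accurate, the implied stock load voltage is roughly 1.21 V. A trivial sketch for estimating load voltage from a software offset (the baseline constant is my inference from that report, not an official figure, and actual Fiji VID varies per card):

```python
# Assumed stock load voltage, inferred from the ~1.28 V @ +72 mV report above.
STOCK_LOAD_V = 1.206

def load_voltage(offset_mv):
    """Estimated core voltage under load for a given software offset in mV."""
    return STOCK_LOAD_V + offset_mv / 1000.0

print(round(load_voltage(72), 3))   # ~1.278 V, matching the reported ~1.28 V
print(round(load_voltage(108), 3))  # the offset reported as needed for 1150 MHz
```

Useful mainly for eyeballing how quickly offsets stack up toward the +200 mV modded-Trixx ceiling mentioned later in the thread, which is a community figure rather than an AMD spec.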
Quote:


> Crimson 15.12 drivers


Get rid of that and install the 15.11.1 Catalyst. Crimson screws up OCing because it messes with core clocks/voltages. Even ClockBlocker doesn't help much; it makes the OC more stable, but not as perfectly stable as the 15.11.1 Catalyst drivers.


----------



## SuperZan

http://www.3dmark.com/3dm/9813684

Just playing with different clocks on my Fury X / Fury CrossFire. They seemed to like 1125/500; I'll probably delve into the HBM side sometime this coming week.


----------



## Maximization

Curious, what is the max safe mV on the Fury X?


----------



## devilhead

Quote:


> Originally Posted by *Toxsick*


Nvidia enthusiast with a Fury X and incorrectly populated RAM







The RAM should be in the red slots.


----------



## Toxsick

Quote:


> Originally Posted by *devilhead*
> 
> Nvidia enthusiast with a Fury X and incorrectly populated RAM
> 
> 
> 
> 
> 
> 
> 
> The RAM should be in the red slots.


It's already fixed.









And yeah cba to change that title.


----------



## p4inkill3r

Quote:


> Originally Posted by *Maximization*
> 
> Curious, what is the max safe mV on the Fury X?


People have modded Trixx to push +200 mV.


----------



## Thoth420

Finally got to test her out!
Whisper quiet, and I even got lucky on my first XF270HU (edited)! No dead pixels or IPS glow, even at the stock 80!





Better pics soon


----------



## xer0h0ur

A G-sync monitor for an AMD card? Wat?


----------



## Thoth420

Quote:


> Originally Posted by *xer0h0ur*
> 
> A G-sync monitor for an AMD card? Wat?


It's the FreeSync variant.
XF = FreeSync
XB = G-Sync

Both use the same AU Optronics panel.

The XF is matte too... the XB is glossy, fail.


----------



## JunkaDK

http://www.3dmark.com/compare/fs/6871047/fs/5663590 - Turning tessellation off made a HUGE difference... 1k extra points in 3DMark, but it's not a valid result.


----------



## xer0h0ur

Quote:


> Originally Posted by *Thoth420*
> 
> It's the FreeSync variant.
> XF = FreeSync
> XB = G-Sync
> 
> Both use the same AU Optronics panel.
> 
> The XF is matte too... the XB is glossy, fail.


Oh, well, I had googled the model number you posted, which was XB, so I was like... WAT.

I'm in the market for a 1440p FreeSync 27/28" 144Hz monitor. I'm torn between going IPS or sticking with TN.


----------



## Thoth420

Damn phone's autocorrect...

The Acer is your best option then. It's been poorly marketed; most don't know it exists, including Newegg.

I haven't found a high-refresh 2560 x 1440 TN that doesn't suffer from pixel inversion, so I opted for the IPS. It's a very fast panel, but I don't play CS, just BF4 for FPS currently, so it fits my needs.


----------



## MalsBrownCoat

Quote:


> Originally Posted by *p4inkill3r*
> 
> Turn off the tessellation control in Crimson's 3DMark profile.
> 
> Here's one of mine for comparison: http://www.3dmark.com/fs/6802832


I turned tessellation off to see how well I'd fare.

This is with 2 Fury X's.

Suffice it to say, I really should have gone with an Intel CPU rather than AMD. = (

http://www.3dmark.com/compare/fs/6851030/fs/6851176/fs/6802832


----------



## Thoth420

Anyone getting surface texture corruption in Hitman Absolution with a Fury X? I tried crimson 15.11 and 15.12 it occurs with both. Witcher 3 and Deus Ex exhibit no issues. :\


----------



## Jflisk

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I turned tessellation off to see how well I'd fare.
> 
> This is with 2 Fury X's.
> 
> Suffice it to say, I really should have gone with an Intel CPU rather than AMD. = (
> 
> http://www.3dmark.com/compare/fs/6851030/fs/6851176/fs/6802832


The Fury Xs start to shine in Fire Strike Extreme scores. Regular Fire Strike doesn't push the card (or cards) hard enough.


----------



## xer0h0ur

Some people like to knock AMD's DX11 driver overhead at 1080p or lower. That makes sense with lower-end to mid-range cards, but I always ask what anyone is doing gaming at 1080p or lower with AMD's top-end offerings. It makes no sense.


----------



## Semel

Quote:


> Originally Posted by *xer0h0ur*
> 
> It makes no sense.


It makes perfect sense if you want to play modern games at maxed-out settings at 60+ fps (Witcher 3, etc.). Anyway, it's a fact that the 980 Ti IS better than the Fury X; there's no arguing about it. It's a sad fact. And who cares what happens when DX12 games start getting released? New cards will be out from AMD and Nvidia. Even now, the Ashes of the Singularity benchmark shows that DX12 just gets the Fury real close to a *default*-clocked 980 Ti, and the 980 Ti is a beast overclocking-wise.


----------



## SuperZan

Quote:


> Originally Posted by *Semel*
> 
> It makes perfect sense if you want to play modern games at maxed-out settings at 60+ fps (Witcher 3, etc.). Anyway, it's a fact that the *980 Ti IS better than the Fury X*; there's no arguing about it. It's a sad fact. And who cares what happens when DX12 games start getting released? New cards will be out from AMD and Nvidia. Even now, the Ashes of the Singularity benchmark shows that DX12 just gets the Fury real close to a *default*-clocked 980 Ti, and the 980 Ti is a beast overclocking-wise.


At 1920x1080, absolutely. They draw even game-to-game in 2560x1440, and Fury X tends to hold a slight advantage at 4k.

At 2560x1440 my single Fury X was capable of playing everything at a 55+ FPS rate. Crossfire-wise I was able to get a Fury on deep sale, add it to my system, and use Crossfire with in-game and benchmark performance at least 90% that of a Fury X Crossfire. The Fury and Fury X are already very strong at 4k - in Crossfire I've yet to find a title that won't hold 60fps. I bring this up because it's a common refrain that you need 2 of (x) top-line card to truly enjoy 4k. With Nvidia, this is true. You need two Titan X's or two 980ti's. You can't grab a 980 at discount and work it into your system, even if the price/performance worked out in favour of such.

When offering blanket statements about which product is "better" without qualification, your assertion can only be considered as subjective as anybody else's. For me, with my monitor, my budget/buying habits, my existing system, etc. the Fury X was and is 'better' than the 980ti. Of course, I'm not playing anything at 1920x1080 on a 4k monitor.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> It makes perfect sense if you want to play modern games at maxed-out settings at 60+ fps (Witcher 3, etc.). Anyway, it's a fact that the 980 Ti IS better than the Fury X; there's no arguing about it. It's a sad fact. And who cares what happens when DX12 games start getting released? New cards will be out from AMD and Nvidia. Even now, the Ashes of the Singularity benchmark shows that DX12 just gets the Fury real close to a *default*-clocked 980 Ti, and the 980 Ti is a beast overclocking-wise.


So enjoy your 980 Ti. That still has nothing to do with my point.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> It makes perfect sense if you want to play modern games at maxed-out settings at 60+ fps (Witcher 3, etc.). Anyway, it's a fact that the 980 Ti IS better than the Fury X; there's no arguing about it. It's a sad fact. And who cares what happens when DX12 games start getting released? New cards will be out from AMD and Nvidia. Even now, the Ashes of the Singularity benchmark shows that DX12 just gets the Fury real close to a *default*-clocked 980 Ti, and the 980 Ti is a beast overclocking-wise.












At 1080p yes.

But this is 2015, and 1080p is yesterday's news. Anyone spending 980 Ti money on a card should already have a 4K display. But, but... wait... uh... oh.. um...

Yeah.


----------



## Arizonian

Quote:


> Originally Posted by *Thoth420*
> 
> Finally got to test her out!
> Whisper quiet, and I even got lucky on my first XF270HU (edited)! No dead pixels or IPS glow, even at the stock 80!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Better pics soon


*A*. Nice looking rig. *B*. Sweet monitor. *C*. Please do post pics.

I think this monitor has a lot going for it when you consider the ports include *HDMI 2.0* as well as the *DP 1.2a+* industry standard. I know it's taboo to discuss here, but it could be supported by Nvidia IF they want to adopt adaptive sync down the road. A 1440p 144Hz AHVA panel with a light matte coating.









Now to pay off some debt so I can get this baby to go with my new Fury, which has no issues gaming at maxed settings with 40+ FPS minimums in the most demanding games.


----------



## Thoth420

Quote:


> Originally Posted by *Arizonian*
> 
> *A*. Nice looking rig. *B*. Sweet monitor. *C*. Please do post pics.
> 
> I think this monitor has a lot going for it when you consider the ports include *HDMI 2.0* as well as the *DP 1.2a+* industry standard. I know it's taboo to discuss here, but it could be supported by Nvidia IF they want to adopt adaptive sync down the road. A 1440p 144Hz AHVA panel with a light matte coating.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now to pay off some debt so I can get this baby to go with my new Fury, which has no issues gaming at maxed settings with 40+ FPS minimums in the most demanding games.


Thanks, and will do. Also, this is the FreeSync version, with a 40 to 144Hz range.


----------



## Otterfluff

Watercool finally came out with their Fury X waterblocks.

http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Categories/Wasserkühler/GPU_Kuehler/"Radeon%20R9%20Series"

Took their time. If I'd waited, I would have been waiting almost four months now. It's only the computer-rendered image; I would like to see an actual photo.


----------



## Clockster

Well, it's been fun, but I'm officially departing the Fury X club.
My 980 Ti Lightning will be here in the next couple of hours.

See you guys for the X2.


----------



## methadon36

Quote:


> Originally Posted by *Clockster*
> 
> Well, it's been fun, but I'm officially departing the Fury X club.
> My 980 Ti Lightning will be here in the next couple of hours.
> 
> See you guys for the X2.


What was the deciding factor in the switch?


----------



## Clockster

Quote:


> Originally Posted by *methadon36*
> 
> What was the deciding factor in the switch?


Sold my Fury X for a really good price and picked up the Lightning for a good price as well.
The deciding factor for me, though, was that after Crimson launched I started getting black screens every now and then. Even after a clean install, the problem remained.
On top of that, I noticed my stable overclock wasn't stable anymore, and after the whole black-screen thing I was just over it.


----------



## Semel

Quote:


> Originally Posted by *xer0h0ur*
> 
> So enjoy your 980 Ti. That still has nothing to do with my point.


ROFLMAO, logic.

So if I state a practically proven/known fact that makes AMD fanatics unhappy, then I must have an Nvidia card. OK.

I'm sorry to disappoint you, but I've got an AMD Fury card. If you'd paid any attention when reading this thread, you would know this.
Quote:


> Originally Posted by *SuperZan*
> 
> At 1920x1080, absolutely. They draw even game-to-game in 2560x1440, and Fury X tends to hold a slight advantage at 4k..


At 2560 you won't get a stable 60 fps in many games (and more will come out) on a single Fury X at maxed-out settings. 4K is not worth discussing, as the fps is unacceptable (single card). We are not talking about CrossFire or SLI here.

And if you take the 980 Ti's OC scaling/potential into consideration, it gets really sad.
Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But this is 2015. 1080p is yesterday's news. Anyone spending 980ti money on a card should already have a 4k display. But, but... wait... uh... oh.. um...
> 
> Yeah.


A 4K display, and playing games at a cinematic 30 fps at maxed-out settings? No thank you.


----------



## battleaxe

Quote:


> Originally Posted by *Semel*
> 
> A 4K display, and playing games at a cinematic 30 fps? No thank you.


30 fps??? Whaaaaat????

On what?

Being a little dramatic, are we? I play at 4K and I've never seen 30 fps; usually 70-100 fps is the norm. And I could definitely turn things down a bit if I needed to. I have no idea what people are talking about when they say 4K gaming isn't 'there' yet. It certainly works perfectly for me.


----------



## ManofGod1000

Quote:


> Originally Posted by *Semel*
> 
> ROFLMAO, logic.
> 
> So if I state a practically proven/known fact that makes AMD fanatics unhappy, then I must have an Nvidia card. OK.
> 
> I'm sorry to disappoint you, but I've got an AMD Fury card. If you'd paid any attention when reading this thread, you would know this.
> At 2560 you won't get a stable 60 fps in many games (and more will come out) on a single Fury X at maxed-out settings. 4K is not worth discussing, as the fps is unacceptable (single card). We are not talking about CrossFire or SLI here.
> 
> And if you take the 980 Ti's OC scaling/potential into consideration, it gets really sad.
> A 4K display, and playing games at a cinematic 30 fps at maxed-out settings? No thank you.


Games look considerably better at 4K with some reduced settings than at 1080p maxed out, any day of the week. 1080p is yesterday's news for anyone who owns the latest, fastest video card available. (That is, unless you are a 120fps whore or something.) I am on 4K and my 980 Ti runs really well, and the games look fantastic even with some settings reduced.

I would have gone with an AMD setup again, but my case was not big enough for the air-cooled R9 Fury and I did not feel like getting a new case. Amazing how easy it is to have differing points of view and still get along. However, if you think 4K at 60fps maxed out is the only possibility and anyone who thinks otherwise is a nut, well, I cannot help you. Sorry, but I do not think $1300 in video cards is justifiable. (I would love a 980 Ti SLI setup, but not at that cost.)


----------



## NBrock

I don't have any issues running a solid 60+ FPS in all my games on my Fury X @ 2560x1440. I play everything with V-Sync at 60 for my monitor and it never dips.

I play Fallout 4 on Ultra, Battlefront maxed, Battlefield 4 maxed, Diablo 3 maxed, War Thunder maxed, heavily modded Skyrim, and a bunch of other games that should obviously run at 60+ FPS. The only game I originally had issues with was Fallout 4 not wanting to utilize the full clock of my GPU, but after I installed ClockBlocker that is a thing of the past.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> ROFLMAO, logic.
> 
> So if I state a practically proven/known fact that makes AMD fanatics unhappy, then I must have an Nvidia card. OK.
> 
> I'm sorry to disappoint you, but I've got an AMD Fury card. If you'd paid any attention when reading this thread, you would know this.
> At 2560 you won't get a stable 60 fps in many games (and more will come out) on a single Fury X at maxed-out settings. 4K is not worth discussing, as the fps is unacceptable (single card). We are not talking about CrossFire or SLI here.
> 
> And if you take the 980 Ti's OC scaling/potential into consideration, it gets really sad.
> A 4K display, and playing games at a cinematic 30 fps at maxed-out settings? No thank you.


I would guess you're still a child, considering you took my point about AMD cards and turned it into Fury/X versus 980 Ti. I'm not comparing performance in the least. I'm only saying you're a fool to spend 980 Ti / Titan X / Fury X money to game at 1080p or lower. 1440p is bar none the lowest resolution I even bother with. You can keep your peasant 1080p.


----------



## OptimusToaster

What's the difference between the Fury X and the Nano if both cards have full-cover waterblocks, i.e. no thermal limits?

Looking at the specs alone, I'd say there isn't much, but I don't know.


----------



## xer0h0ur

Quote:


> Originally Posted by *OptimusToaster*
> 
> What's the difference between the Fury X and the Nano if both cards have full-cover waterblocks, i.e. no thermal limits?
> 
> Looking at the specs alone, I'd say there isn't much, but I don't know.


Have they made waterblocks for the Nano yet? Basically, the Nano has a fully unlocked Fury X die in it, but due to thermal limitations it was never built with the intention of maintaining the Fury X's clock speeds, so it throttles to stay within acceptable temperatures and its TDP. At least that's how I remember it.


----------



## NBrock

Quote:


> Originally Posted by *OptimusToaster*
> 
> What's the difference between the Fury X and the Nano if both cards have full-cover waterblocks, i.e. no thermal limits?
> 
> Looking at the specs alone, I'd say there isn't much, but I don't know.


Not as much voltage control, and only one 8-pin connector. It also has an algorithm built into the firmware that fights to keep it at the target TDP, if I remember correctly. It would be interesting to see if there are BIOS mods to change this, or if it could be flashed to a modded Fury X BIOS.


----------



## Toxsick

Quote:


> Originally Posted by *Clockster*
> 
> Sold my Fury X for a really good price and picked up the lightning for a good price as well.
> Deciding factor for me though was after Crimson launched I started getting black screens every now and then. Even after a clean install, the problem remained.
> On top of that, I noticed my stable overclock wasn't stable anymore and after the whole black screen thing I was just over it.


Had the same issue here.

My FreeSync monitor would cause in-game artifacts and black screens as well.
Had to downgrade to 15.7 to fix it.


----------



## Jflisk

Quote:


> Originally Posted by *xer0h0ur*
> 
> Have they made waterblocks for the Nano yet? Basically, the Nano has a fully unlocked Fury X die in it, but due to thermal limitations it was never built with the intention of maintaining the Fury X's clock speeds, so it throttles to stay within acceptable temperatures and its TDP. At least that's how I remember it.


Evidently EK has had them for a while.
http://www.performance-pcs.com/catalogsearch/result/?q=nano+waterblock


----------



## xer0h0ur

Good god that is a tiny video card waterblock.


----------



## Neon Lights

Quote:


> Originally Posted by *Clockster*
> 
> Sold my Fury X for a really good price and picked up the Lightning for a good price as well.
> The deciding factor for me, though, was that after Crimson launched I started getting black screens every now and then. Even after a clean install, the problem remained.
> On top of that, I noticed my stable overclock wasn't stable anymore, and after the whole black-screen thing I was just over it.


I can personally understand anyone who buys a 980 Ti instead of a Fury, or switches to one, because, at least at the moment, the 980 Ti always has the better performance (overclocked).

I hope that when DirectX 12 and the other low-level APIs finally get used properly in games, the Fury cards will see their deserved performance lead.


----------



## xer0h0ur

Quote:


> Originally Posted by *Neon Lights*
> 
> I can personally understand anyone who buys a 980 Ti instead of a Fury, or switches to one, because, at least at the moment, the 980 Ti always has the better performance (overclocked).
> 
> I hope that when DirectX 12 and the other low-level APIs finally get used properly in games, the Fury cards will see their deserved performance lead.


For what it's worth, Microsoft is already up to its usual garbage, delaying DX12 titles from release, and I'm beginning to believe that industry adoption of DX12 is not going to be nearly as big as imagined, all because Vulkan has a far wider user base.

"Vulkan on the other hand can work on multiple OS which range from Windows (XP/Vista/7/8/8.1/10), Linux, SteamOS, Android. Also unlike Mantle, Vulkan will be able to run multiple GPUs from various vendors allowing a more wider support than any previous API. Another leverage over previous APIs is that Vulkan adopts the first open standard cross-API intermediate language for parallel compute and graphics known as SPIR-V, allowing developers to write programs for Vulkan in their own choice of programming language."

http://wccftech.com/khronos-group-vulkan-api-release-imminent/

It's worth a read.

There are already developers who have flat-out said there's no point in creating a DX12 backend for their software, since Vulkan is better for the industry as a whole and reaches a larger user base.


----------



## Neon Lights

Yes, I know; that's why I wrote "DirectX 12 and the other Low Level APIs". Driver optimization aside, it doesn't matter to me personally (since I'm not planning to use another OS besides Windows 10, or Windows in general) whether DirectX 12 or Vulkan is used, because they have the same capabilities performance-wise.
The Mantle API, however, has, as far as I know, actual performance advantages over the other low-level APIs, and while I hope there will be some games that use its potential, the chances of that are unfortunately not very high.


----------



## xer0h0ur

Mantle is the backbone for LiquidVR so take that as you will.


----------



## Neon Lights

But I wonder how many games will use LiquidVR.


----------



## diggiddi

Quote:


> Originally Posted by *xer0h0ur*
> 
> Mantle is the backbone for LiquidVR so take that as you will.


And Vulkan


----------



## Thoth420

Pegged my artifacting issue down to my display, sadly. I am considering trying out some 4K since the most "competitive" game I play lately is BF4...

Any suggestions? I was so laser focused on 2560 x 1440 @ 144Hz that I never bothered looking at the 4K offerings.

Also, will my single Fury X be able to handle it? I am not big on AA, but how much VRAM do textures eat up at that reso on ultra?

Looking to replay Hitman Abso, Deus Ex HR and my first play through of The Witcher 3.
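As a rough back-of-envelope (assumed numbers, not measured from any game): the raw render targets at 4K are small compared to texture pools, so it's the ultra texture settings that push toward the 4GB limit, not the resolution itself. A minimal sketch:

```python
# Back-of-envelope VRAM math for a single 4K render target.
# Assumption: 4 bytes per pixel (RGBA8); real games keep several
# buffers (color, depth, post-processing) plus the texture pool,
# which is what actually dominates VRAM use on ultra.
def buffer_mib(width, height, bytes_per_pixel=4):
    # width * height * bytes per pixel, converted to MiB
    return width * height * bytes_per_pixel / 2**20

print(round(buffer_mib(3840, 2160), 1))  # one 4K RGBA8 buffer: 31.6 MiB
```

So even a handful of full-resolution buffers is only a few hundred MB; the rest of the 4GB goes to textures and geometry.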


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> Pegged my artifacting issue down to my display sadly. I am considering trying out some 4K since the most "competive" game I play lately is BF4...
> 
> Any suggestions? I was so lazer focused on 2560 x 1440 @ 144hz that I never bothered looking at the 4K offerings.
> 
> Also will my single Fury X be able to handle it? I am not big on AA but how much VRAM do textures eat up at that reso on ultra?
> 
> Looking to replay Hitman Abso, Deus Ex HR and my first play through of the witcher 3.


I have the keyboard-smash Acer model and it works very nicely. Until the GPUs get a little stronger and we get 120Hz panels, it's about as good as you will need.


----------



## iTurn

Question (and it's not meant to bash the OP): why are these 4 very different video cards lumped into one thread?

Kinda makes it hard to search for info.


----------



## xer0h0ur

Just do like everyone else and ask questions instead of reading or searching in the thread ¯\_(ツ)_/¯


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> Just do like everyone else and ask questions instead of reading or searching in the thread ¯\_(ツ)_/¯


LOL... pretty much.


----------



## Neon Lights

I have a question which is not directly Fury-related, but I don't want to open a thread because it's not that big of an issue.

This was an issue on my 7970s too, so I'm relatively sure it's my monitor (Eizo FG2421): especially during startup, a few pixels are green, always in the same area of the screen, but only on specific backgrounds (e.g. the BIOS screen at startup that shows the mainboard logo, and also my desktop background). When I switch to an application I can see no green pixels. Or could it be my DisplayPort cable?


----------



## Arizonian

OK, gave it a little push today: +60 mV, 1160 MHz Core at 1.3250V, running a tad warmer at 61C with 64% manual fan speed.

FS *13890*
FS Extreme *7427*
FS Ultra *4052*

3DMark11 P *15828*
3DMark11 X *6392*



Spoiler: Unigine 80.1







Any suggestions on what's a safe zone for added mV on air?

Want to see if I can push 16K in 3DMark11; I need another 172 points. Haven't tried gaming to see if I can keep this overclock yet, that's next. So far it's proven to be a nice card, benching +160 MHz on the Core.


----------



## p4inkill3r

That card is tough, throw +96mv at it, it can take it.


----------



## Maximization

slowly building up


----------



## Maximization

just delivered... nice..............


----------



## Arizonian

Quote:


> Originally Posted by *Maximization*
> 
> just delivered... nice..............
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice indeed... can't wait to see this setup.


----------



## Maximization

Quote:


> Originally Posted by *Arizonian*
> 
> Nice indeed... can't wait to see this setup.


Well, I ordered the wrong bridge connector; thank god I had some extra barbs left over from waterblocking my 7870 CrossFire setup. I went loopy with the tubies hehehe


----------



## Maximization

Maxed out at 1120 MHz core speed. When voltage was unlocked it became unstable, and when I tried to overclock the memory it became unstable too. Using Afterburner. Temps are no problem, but I did not win the silicon lottery. Still kicking though.

http://ranker.sisoftware.net/show_run.php?q=c2ffc8f1d7b6d7ead8e9d8e8cebc81b197f297aa9abccff2c2

http://www.3dmark.com/fs/6934184


----------



## fat4l

Hi guys.
Is there any review showing the performance of the Fury X with Crimson 15.12 drivers (or even Crimson 15.11) in comparison to the 980Ti?


----------



## p4inkill3r

Quote:


> Originally Posted by *Maximization*
> 
> maxed out at 1120 core speed, when voltage unlocked became unstable, when tried to overclock memory became unstable. using afterburner. temps no problem but i did not win silicon lottery. still kicking though.
> 
> http://ranker.sisoftware.net/show_run.php?q=c2ffc8f1d7b6d7ead8e9d8e8cebc81b197f297aa9abccff2c2
> 
> http://www.3dmark.com/fs/6934184


I'd keep looking into your settings and/or other factors, because your card should be able to do better than that IMO.


----------



## Maximization

Trixx kept crashing and ASUS GPU Tweak was difficult. MSI Afterburner works, but I think the Crimson drivers changed a lot. Is there a better OC app?


----------



## p4inkill3r

Uninstall Trixx/AB/GPUTweak and the AMD drivers. Reinstall Crimson, then install Afterburner.


----------



## Arizonian

Quote:


> Originally Posted by *p4inkill3r*
> 
> That card is tough, throw +96mv at it, it can take it.


Thanks for the encouragement. Added +96mV and got this baby 15% overclocked, 24/7 at 1150 MHz Core, GAMING!

Been through more than a few hours of SW: Battlefront; finally reached level 32 to unlock Chewbacca's Bow.









Moved on to Crysis 3 and Far Cry 4, and suffice it to say this is a stable +150 MHz on the Core while GAMING. I'm impressed. Straightforward overclocking.

65C 75% fan speed was highest I saw. Steady 1150 MHz Core all the way through, Crimson 15.12 no issues.

I was used to a 780Ti running 82-84C at 85% fan speed, with higher acoustics, and being downclocked by 13 MHz straps down to 1176 MHz.









Overall, after having some time with my Nitro Fury, I'm very content. I've been enjoying the vibrant graphics gaming too. Honestly, at the $500 price point it's competitively beating the 980 in price/performance, and it has HBM.









I can now begin to move to a FreeSync monitor at some point, and I'm looking into the *Acer XF270HU*. Not thrilled with the stand, but the specs are sweet for a gaming display.

This will keep me more than content @ 1440p until AMD's next GPU release sometime in Q3 or Q4.


----------



## Maximization

Does the BIOS version determine whether the voltage can be unlocked? It appears I have an older BIOS version. It's the same on both cards though.


----------



## The Stilt

Quote:


> Originally Posted by *Maximization*
> 
> Does the bios version determine if the voltage can be unlocked? It appears i have an older bios version. They are both the same for both cards though.


Some BIOSes might limit the maximum voltage to a lower value; however, all of them MUST allow at least 1.30000V (VDDC).


----------



## p4inkill3r

Quote:


> Originally Posted by *Arizonian*
> 
> Thanks for the encouragement. Added +96mV and got this baby 15% over clocked 24/7 at 1150 MHz Core GAMING!
> 
> Been through more than a few hours of SW:Battlefront , finally reached 32nd lvl to unlock Chewbacca's Bow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Moved onto Crysis 3, FarCry 4 and suffice to say this is a stable +150 MHz on the Core while GAMING. I'm impressed. Straight forward over clocking.
> 
> 65C 75% fan speed was highest I saw. Steady 1150 MHz Core all the way through, Crimson 15.12 no issues.
> 
> I was used to 780Ti running 82-84C temp 85% fan speed, higher acoustics, and being down clocked by 13 MHz straps down to 1176 MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overall after having some time with my Nitro Fury, I'm very content. Have been enjoying the vibrant graphics gaming too. Honestly, at the $500 price point it's competitively beating 980 in price / performance and has HBM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can now begin to move onto a Freesync monitor at some point and I'm looking into the *Acer XF270HU* . Not thrilled with stand but specs are sweet for gaming display.
> 
> This will keep me more than content @ 1440p until AMD's next GPU release sometime in Q3 or Q4.


Good to see you're satisfied.









Regarding the monitor, I'm also in the market for a FreeSync display and have been looking at the Acer you linked as well. I have had a PB278Q for a couple of years, and while it's beautiful, I'm playing more games than doing productivity as of late, and I'm anxious to see how well FreeSync works.


----------



## p4inkill3r

Quote:


> Originally Posted by *Maximization*
> 
> Does the bios version determine if the voltage can be unlocked? It appears i have an older bios version. They are both the same for both cards though.


I suggest using Afterburner and verifying that you have its settings set correctly: Unlock voltage control, extending official overclocking limits, and disabling ULPS.


----------



## fat4l

Quote:


> Originally Posted by *The Stilt*
> 
> Some bioses might limit the maximum voltage to lower value, however all of them MUST allow at least 1.30000V (VDDC).


Are you allowed to tell us which ones?
I'm planning on buying a Sapphire Fury X, and I want to custom watercool this card, so obviously I'm planning to increase the volts significantly. I therefore wonder if my choice is fine.


----------



## The Stilt

Quote:


> Originally Posted by *fat4l*
> 
> Are u allowed to tell us which ones ?
> I'm planning on buying a Sapphire Fury X and I want to custom wcool this card so obviously, I'm planning to increase the volts significantly so therefore I wonder if my choice is fine


All of the Fury X cards I've seen allow VDDC up to 1.48V (which is obviously plenty). It doesn't really matter which Fury X you pick, since they are all identical (MBA board). In case your default BIOS limits the voltage, you can simply swap the BIOS for another one which doesn't. All Fury X BIOSes are interchangeable since they are all MBA boards.


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> I suggest using Afterburner and verifying that you have its settings set correctly: Unlock voltage control, extending official overclocking limits, and disabling ULPS.


I don't trust Afterburner with toggling ULPS anymore, particularly after verifying with RadeonMod that it's only disabling ULPS in a single GPU's settings instead of universally for all of them. RadeonMod allows you to do so for all of them. Before that I was doing it manually in the registry.
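For anyone curious what that manual registry edit looks like, here's a sketch of a .reg fragment. The `{4d36e968-...}` display-adapter class GUID is the standard Windows one, but the four-digit subkey (`0000`, `0001`, ...) varies per adapter, and you have to cover every subkey that has an `EnableUlps` value; missing one is exactly the single-GPU-only pitfall described above.

```
Windows Registry Editor Version 5.00

; Disable ULPS for the adapter under subkey 0000. Repeat the block
; for 0001, 0002, etc. so every GPU in the system is covered.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot (or driver restart) is needed before the change takes effect.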


----------



## 98uk

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't trust Afterburner with toggling ULPS anymore. Particularly after verifying with RadeonMod that its only disabling ULPS on a single GPU's settings instead of universally for all of them. RadeonMod allows you to do so for all of them. Before I was doing it manually in the registry.


Out of interest, what is the benefit of disabling ULPS? I see it recommended, but I'm not sure why exactly.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't trust Afterburner with toggling ULPS anymore. Particularly after verifying with RadeonMod that its only disabling ULPS on a single GPU's settings instead of universally for all of them. RadeonMod allows you to do so for all of them. Before I was doing it manually in the registry.


I've noticed the same thing on 290X and 390X cards. Seems AB is just buggy as you-know-what for disabling ULPS. Use Trixx instead while AB is running; then it should all work okay. (For everyone else, that is, as you already know this.)
Quote:


> Originally Posted by *98uk*
> 
> Out of interest, what is the benefit of disabling ULPS? I see it recommended, but I'm not sure why exactly.


If you have multiple cards you will experience multiple issues: crashes, monitoring not working correctly, and voltage not unlocking correctly in AB or any other program. It basically stops your cards from working correctly when you're using more than one, big time.


----------



## Maximization

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't trust Afterburner with toggling ULPS anymore. Particularly after verifying with RadeonMod that its only disabling ULPS on a single GPU's settings instead of universally for all of them. RadeonMod allows you to do so for all of them. Before I was doing it manually in the registry.


thanks for pointing out RadeonMod never heard of it


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't trust Afterburner with toggling ULPS anymore. Particularly after verifying with RadeonMod that its only disabling ULPS on a single GPU's settings instead of universally for all of them. RadeonMod allows you to do so for all of them. Before I was doing it manually in the registry.


Eh, as recently as last month I was running a 2x 290 setup and did not have any issue, nor have I seen this claim repeated elsewhere.
Quote:


> Originally Posted by *98uk*
> 
> Out of interest, what is the benefit of disabling ULPS? I see it recommended, but I'm not sure why exactly.


Users that experience low usage on their second GPU, throttling, or other issues can many times trace it back to ULPS.


----------



## 98uk

Quote:


> Originally Posted by *battleaxe*
> 
> If you have multiple cards you will experience multiple issues, including crashes, monitoring not working correctly, and voltage will not unlock correctly in AB or any other program. So it basically screws up your cards from working correctly when using more than one, big time.


Quote:


> Originally Posted by *p4inkill3r*
> 
> Users that experience low usage on their second GPU, throttling, or other issues can many times trace it back to ULPS.


Interesting. So, for single GPU setups, it should make no difference?


----------



## p4inkill3r

Quote:


> Originally Posted by *98uk*
> 
> Interesting. So, for single GPU setups, it should make no difference?


I believe that is correct; only non-primary cards are affected by low-power state.


----------



## fat4l

Can anyone run 3DMark Firestrike and Firestrike Extreme on a Fury X (1150 MHz at least) with a highly clocked CPU and show me links so I can compare with my current card, please?

FS http://www.3dmark.com/fs/6734662
FSX http://www.3dmark.com/fs/6590481


----------



## xer0h0ur

Quote:


> Originally Posted by *p4inkill3r*
> 
> Eh, as recently as last month I was running a 2x 290 setup and did not have any issue, nor have I seen this claim repeated elsewhere.
> Users that experience low usage of their second GPU, throttling, or other issues can be traded to ULPS many times.


Well, considering that RadeonMod is new, you wouldn't really have seen this claim anywhere else. You're welcome to download and use it yourself so you can see that ULPS in multi-GPU scenarios is not being disabled on all of them.


----------



## xer0h0ur

Quote:


> Originally Posted by *Maximization*
> 
> thanks for pointing out RadeonMod never heard of it


It's a great little app that is still being upgraded and updated. I wouldn't be surprised if one day it completely makes RadeonPro useless for people that use RP's dynamic V-sync. I just wish it would allow me to force CrossFire on or off, since that would be an effective workaround for the bug in Crimson Software where the CrossFire setting doesn't stick in game profiles.


----------



## p4inkill3r

Quote:


> Originally Posted by *fat4l*
> 
> Can anyone run 3dMark Firestrike and Firestrike eXtreme on Fury X (1150Mhz at least) and a highly clocked CPU and show me links so I can compare with my currect card pls ?
> 
> FS http://www.3dmark.com/fs/6734662
> FSX http://www.3dmark.com/fs/6590481


6700K @ 4.6 / Fury X @ 1180/585MHz
Firestrike: http://www.3dmark.com/fs/6802832
Firestrike Extreme: http://www.3dmark.com/fs/6802501
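For context on what a 585 MHz memory clock means on Fiji, a quick sketch, assuming HBM1's 4096-bit bus with two transfers per clock (the usual published figures; treat the numbers as an estimate):

```python
# Rough memory-bandwidth estimate for Fiji's HBM1.
# Assumptions: 4096-bit bus width, double data rate (2 transfers/clock).
def hbm_bandwidth_gbs(mem_clock_mhz, bus_width_bits=4096):
    bytes_per_transfer = bus_width_bits / 8  # bits -> bytes
    return mem_clock_mhz * 1e6 * 2 * bytes_per_transfer / 1e9

print(hbm_bandwidth_gbs(500))  # stock 500 MHz: 512.0 GB/s
print(hbm_bandwidth_gbs(585))  # 585 MHz: ~599 GB/s
```

Which is why memory overclocks on Fiji tend to show small gains: there is already far more bandwidth than the core can typically use.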


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well considering that RadeonMod is new you wouldn't have seen this claim anywhere else really. You're welcome to download and use it yourself so you can see that ULPS in multi-GPU scenarios is not being disabled on all of them.


I just may do that.


----------



## fat4l

Quote:


> Originally Posted by *p4inkill3r*
> 
> 6700k @ 4.6/Fury X @1180/585MHz
> Firestrike http://www.3dmark.com/fs/6802832
> Firestrike Extreme: http://www.3dmark.com/fs/6802501


Thank you very much









Observed results:
FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS.


----------



## p4inkill3r

Quote:


> Originally Posted by *fat4l*
> 
> Thank you very much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Observed results are :
> FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
> FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS.


For completion's sake, Firestrike Ultra: http://www.3dmark.com/fs/6812511


----------



## JunkaDK

To all Fury owners









Just thought I would share my latest FS score. I unlocked all cores on my Asus R9 Fury STRIX and set a new #1 record for my config (i7-5930k + R9 Fury).









Here it is : http://www.3dmark.com/fs/6968466

I LOVE Tweaking, especially when it pays off









/Junka


----------



## fat4l

Quote:


> Originally Posted by *p4inkill3r*
> 
> For completion's sake, Firestrike Ultra: http://www.3dmark.com/fs/6812511


*2x 290X* vs *Fury X*
Both setups clocked high: 1200/1700 vs 1180/590 MHz.
2x 290X wins by ~40% on average:

FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS


----------



## Jflisk

Quote:


> Originally Posted by *fat4l*
> 
> *2x 290X* vs *Fury X*
> Both setups clocked high: 1200/1700 vs 1180/590 MHz.
> 2x 290X wins by ~40%
> 
> FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
> FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
> FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS


Now try 2x Fury X against 2x R9 290X. You would need two Fury X cards, with CrossFire enabled, to see a gain over what you already have; that comes out to about 3x R9 290X performance, with less power used. Doesn't really matter though, any of the configurations above will lay waste to almost any game.


----------



## baii

But the 390X has 8GB; throw that at 4K or ultrawide and compare again?


----------



## battleaxe

Quote:


> Originally Posted by *baii*
> 
> but 390x have 8GB, throw that in 4k or ultrawide and compare again?


Won't make much difference in these tests. Only in games where more than 4GB is used.


----------



## Noirgheos

Guys, if I want to reinstall Windows but I lost my key, will selecting "Remove everything" to reinstall Windows from within the OS work fine?

Before I do that, should I reinstall Windows with a new CPU? Same mobo.


----------



## fjordiales

Quote:


> Originally Posted by *Noirgheos*
> 
> Guys if I want to re-install Windows, but I lost my key, will selecting remove everything re-install Windows from within the OS work fine?
> 
> Before I do that, should I re-install Windows with a new CPU? Same MOBO.


Try this.

https://www.magicaljellybean.com/keyfinder/


----------



## ht_addict

Quote:


> Originally Posted by *Maximization*
> 
> just delivered... nice..............


Thinking of going the same route. Was there a significant drop in temperatures between the AIO and EKWB? Could you list the part numbers you ordered? Thanks


----------



## Maximization

About 5-10 degrees cooler at the same load. I ordered the wrong connector bridge, so you might want to call before ordering that. I went with the EK brand; Aquacomputer blocks were tempting me too.

EK-FC R9 Fury X Backplate - Black EK-FC-R9-FURYX-BP-BK

EK-FC R9 Fury X Water Block - Acetal EK-FC-R9-FURYX-CA


----------



## rv8000

Anyone having GPU usage spikes with FO4? Fresh Windows 10 install, 15.12, and every other game works great. FPS is all over the place, FreeSync causes the screen to flash because it's being locked, and everything I've googled or searched for so far has led me to no solution. Anyone running the game fine with a single Fury/Fury X @ 1440p?


----------



## allofyourdreams

Hello,

I am looking for some feedback on the Gigabyte R9 Fury WF3. I could not find any review or performance test anywhere, and I'm interested in buying one. Any owner of this card here who can share some feedback?

Thank you


----------



## NBrock

Quote:


> Originally Posted by *rv8000*
> 
> Anyone having GPU usage spikes with FO4? Fresh 10 install, 15.12, and every other game works great. FPS is all over the place, freesync causes the screen to flash because it's being locked, and anything i've googled or searched for so far has lead me to no solution. Anyone running the game fine with single fury/fury x @ 1440p?


I have GPU usage issues in Fallout 4 as well. I logged it, and the GPU clock is also not running at full speed in game. I had to use ClockBlocker to get the game to run smoothly.


----------



## JunkaDK

Quote:


> Originally Posted by *JunkaDK*
> 
> To all Fury owners
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just thought i would share my latest FS score. I unlocked all core's on my Asus R9 Fury STRIX and set a new #1 record for my config (i7-5930k + R9 Fury )
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here it is : http://www.3dmark.com/fs/6968466
> 
> I LOVE Tweaking, especially when it pays off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> /Junka


After even more tweaking







NEW best score







http://www.3dmark.com/fs/6999418 : 15900 Points FS.


----------



## en9dmp

I have massive usage spikes as well, but it could be because I'm forcing an optimized 1x1 CrossFire profile on my Fury Xs. I get wild usage variations from 0-100% across both cards constantly. Still better performance than using one card though.

ClockBlocker is only useful where you can actually see the GPU clocks throttling down. If the usage is fluctuating but the clocks are at maximum, then it won't help. I suspect it's more down to the game itself. So many people, including myself, are having major issues getting a decent gaming experience with top-end hardware.


----------



## Thoth420

Hey all, the monitor I am currently using to troubleshoot a problem (sig monitor is in for RMA, perhaps refund) doesn't work with the mini DP to DP cable it came with...

Anyone using a cable they ordered from Amazon that works on a Fury X with no issues?
I don't know what brands are good, etc.

Monitor in question is a Dell U2715H.


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all, the monitor I am currently using to troubleshoot a problem (sig monitor is in for RMA perhaps refund) doesn't work with the mini dp to dp cable it came with...
> 
> Anyone using a cable they ordered from amazon that works on a Fury X with no issues?
> I don't know what brands are good etc.
> 
> Monitor in question is a dell u2715h


I'm using Cable Matters DP cables with no issues.


----------



## Thoth420

Quote:


> Originally Posted by *battleaxe*
> 
> I'm using CableMatters DP cables with no issues


Thanks I will order one overnight and give it a try.


----------



## Arizonian

*Sapphire Nitro 1175 MHz Core i7 4790K @ 4.6 GHz*

*3DMark11 15928*
*3Dmark11 Extreme 6453*

*FS 14026*
*FS Extreme 7519*
*FS Ultra 4099*

I have finally fully explored benching the core on my Fiji chip. I've pretty much met my match at 1175 MHz at least with +96mV added on air.

Gaming comfortably at 1150 MHz stable 58-60C.









I have not begun to look at memory overclocking, but I'm not sure why I would need it with the memory bandwidth I already have at stock clocks. In past generations, memory overclocks seemed to bring minimal FPS gains in games and came with higher temps that usually equated to a lower Core clock.

I ended up crashing while benching at 1185 MHz on the core.

This thing is a beast gaming at 1440p, producing lower temperatures than I've ever seen. It idles at 33C without any fans! I like to run fans, so mine usually idles at 26C with 30% fan speed.

Sapphire did an amazing job with their 8 layer PCB Fury and Tri-X fans on this Nitro with both acoustics and temperatures.


----------



## xer0h0ur

The HD series benefited more from overclocking the vRAM than the Hawaii or Fiji generations did, all due to how much bandwidth the GPU needed versus how much was available.


----------



## battleaxe

I'm already bored. I wish the second gen Fury would come out with HBM2 already.









Nothing very interesting to spend my hard earned money on TBH


----------



## baii

Quote:


> Originally Posted by *Arizonian*
> 
> *Sapphire Nitro 1175 MHz Core i7 4790K @ 4.6 GHz*
> 
> *3DMark11 15928*
> *3Dmark11 Extreme 6453*
> 
> *FS 14026*
> *FS Extreme 7519*
> *FS Ultra 4099*
> 
> I have finally fully explored benching the core on my Fiji chip. I've pretty much met my match at 1175 MHz at least with +96mV added on air.
> 
> Gaming comfortably at 1150 MHz stable 58-60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have not begun to look at memory overclocking but I'm not sure why I would need it with the memory bandwidth that I have already at stock clocks. Gaming, in the past memory overclocks seem to have minimal gains to higher FPS and come with higher temps that usually equivocate to lower Core.
> 
> I ended up crashing while benching at 1185 MHz on the core.
> 
> This thing is a beast gaming at 1440p producing lower temperatures than I've ever seen. It idles 33 Celsius without any fans! I like to run fans so mine idles usually a 26 Celsius at 30% fan speed.
> 
> Sapphire did an amazing job with their 8 layer PCB Fury and Tri-X fans on this Nitro with both acoustics and temperatures.


How high does the fan go under load?


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> I'm already bored. I wish the second gen Fury would come out with HBM2 already.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing very interesting to spend my hard earned money on TBH


There won't be any 2nd generation Fiji card with HBM2. The only cards that will get it are the 2016 Arctic Islands generation. The only remaining Fiji card yet to be released is the Fury X2, which is still HBM1: 4GB per GPU, 8GB total.


----------



## battleaxe

Quote:


> Originally Posted by *xer0h0ur*
> 
> There won't be any 2nd generation Fiji card with HBM2. The only cards that will get it are the 2016 Arctic Islands generation. The only remaining Fiji card yet to be released is the FuryX2 which is still HBM1 4GB per GPU 8GB total.


Yeah, I know... wish it were here already.

I'm underwhelmed with this generation of cards from both Nvidia and AMD. Only about 10-15% better than what we have with the 390X. Kinda lame if you ask me. No, I'm not complaining, just bored, and I wish the next gen were here already; it's frustrating seeing the best scores of the 290X and 390X series beaten by such a pathetically low margin.


----------



## Maximization

Depends what you're upgrading from; my 40" 4K is butter smooth now. It was a worthwhile upgrade to tide me over until PCIe 4 comes out and I do a new system build. The overclocking software needs to be better, though: the moment I unlock voltage I get instability (unless I'm doing it wrong), and scores go down even when stable. The best score I'm getting is in this range:

http://www.3dmark.com/fs/7006396

If I add more voltage and more memory speed, the scores go down.
I did upgrade the BIOS; 4K was flickering, and the new BIOS seemed to take care of it. GPU scaling definitely needs to be on.


----------



## Arizonian

Quote:


> Originally Posted by *baii*
> 
> How high the fan go under load?


You know, I didn't check that last round, so I ran 3DMark11 once more to see, and it was 58C with 62% manual fan speed.

Also beat my last score and broke the mark I wanted *3Dmark11 16067*


----------



## Noirgheos

Does it make sense that I would drop below 60 FPS at any time in Battlefield 4, maxed out with 4xAA at 1080p? Sapphire Fury.

Went as low as 52 in some situations...


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> Does it make sense that I would drop below 60FPS at anytime on Battlefield 4 maxed out with 4xAA at 1080p? Sapphire Fury.
> 
> Went as low as 52 in some situations...


64 player map? Which map? Tons of stuff going on? What is the rest of your computer?

Dropping to 52 FPS isn't the end of the world.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> 64 player map? Which map? Tons of stuff going on? What is the rest of your computer?
> 
> Dropping to 52 FPS isn't the end of the world.


It kind of is with this card.

Siege of Shanghai, yes, 64 players. It was just after the tower collapsed and there was kind of a white haze. Running an i5 4670K, which I'm upgrading to an i7 4790K in a few days. 16GB of 2400MHz RAM (yes, I set it to XMP).

What disturbs me is that a guy running a 970 and the same settings as me was getting the same FPS for the most part... he had an i5 4460.


----------



## ht_addict

Did some playing around with Overclocking. Was able to hit 1175/570. Here are my FS Scores.

FS 1.1: http://www.3dmark.com/fs/7008151
FSX 1.1: http://www.3dmark.com/fs/7008522


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> It kind of is with this card.
> 
> Siege of Shanghai, yes 64 players. was just after the tower collapsed and there was kind of a white haze. Running an i5 4670K which I'm upgrade to an i7 4790K in a few days. 16GB of 2400MHz (Yes I set it to XMP).
> 
> What disturbs me is that a guy running a 970 and the same settings as me was getting the same FPS for the most part... he had an i5 4460.


If that is disturbing to you, so be it, but it sounds like you may have unrealistic expectations of what results you should be receiving. The difference between your Fury, his 970, your 4670k, and his 4460, especially at 1080p, is very small.


----------



## Maximization

Quote:


> Originally Posted by *ht_addict*
> 
> Did some playing around with Overclocking. Was able to hit 1175/570. Here are my FS Scores.
> 
> FS 1.1: http://www.3dmark.com/fs/7008151
> FSX 1.1: http://www.3dmark.com/fs/7008522


Dynamite scores!!!


----------



## JonDuma

Happy New Year!

Is anyone else experiencing a low buzzing coil-whine noise at idle with your Sapphire R9 Fury?
Mine is not even OC'd.

thanks


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> If that is disturbing to you, so be it, but it sounds like you may have unrealistic expectations of what results you should be receiving. The difference between your Fury, his 970, your 4670k, and his 4460, especially at 1080p, is very small.


When benchmarks (which do say what map they use) get a minimum of 82 FPS with the same settings, it worries me. Then again, they all use i7s, so maybe my upgrade will help. I'm not even on Crimson yet, thanks to that downclocking stuff.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> When benchmarks (who do say what map they use) get a minimum of 82FPS with the same settings, it worries me. Then again they all use i7s, so maybe my upgrade will help. I'm not even on Crimson yet thanks to that downclocking stuff.


https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/10.html

What is your average FPS? Just stating that you drop to 52 isn't enough information.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/10.html
> 
> What is your average FPS? Just stating that you drop to 52 isn't enough information.


Kind of hard to measure. I'd say high 70s to mid 80s. You think the i7 will make that much of a difference in this game? It seems to in multiplayer.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> Kind of hard to measure. I'd say high 70s to mid 80s. You think the i7 will make that much of a difference in this game? It seems to in multiplayer.


No, I doubt it will make that much of a difference.
If you want to get a handle on your performance, download 3DMark Fire Strike and Unigine Heaven 4.0 and figure out where you stand.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> No, I doubt it will make that much of a difference.
> If you want to get a handle on your performance, download 3DMark Fire Strike and Unigine Heaven 4.0 and figure out where you stand.


Alright, how much do you think my score is lowered by the i5? I know i7s boost the score quite a bit, and I'm running the card at 1000MHz since I didn't get the OC version.


----------



## p4inkill3r

You'll have to run them and see, and don't be afraid to OC either your GPU or your CPU!


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> You'll have to run them and see, and don't be afraid to OC either your GPU or your CPU!


I'm not, I just can't be bothered to go into the BIOS and OC my CPU. The GPU I might as well bump up.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> You'll have to run them and see, and don't be afraid to OC either your GPU or your CPU!





http://imgur.com/1RXFh7v


There you go. It seems the GPU OC did not apply, so it was 1000MHz, and my CPU was at stock. My graphics score is actually above what is reported by most sites, but my overall is lowered, most likely thanks to my CPU. Does this seem right to you?


----------



## p4inkill3r

You're leaving a lot of performance on the table, but that score looks like everything is functioning properly.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> You're leaving a lot of performance on the table, but that score looks like everything is functioning properly.


We'll see how much it increases when I put my i7 in and find a stable OC. Crimson will most likely boost it as well once they solve the downclocking.


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> We'll see how much it increases when I put my i7 in and find a stable OC. Crimson will most likely boost it as well once they solve the downclocking.


I have no issues with Crimson, downclocking or anything else.

If you want the highest frame rates, you have to overclock.


----------



## rv8000

Quote:


> Originally Posted by *JonDuma*
> 
> Happy New Year!
> 
> Is anyone else experiencing a low buzzing coil whine when idle with your Sapphire R9 Fury?
> Mine isn't even overclocked.
> 
> thanks


Yes, above ~75 FPS my coil whine is noticeable, but such is the life of having a high-end card. For now, my quest for a whine-free card has ended.


----------



## Thoth420

I think I got a bad GPU








Seeing artifacting in games with everything in my system at stock clocks. I have tried 15.7, 15.11 and Crimson 15.12...persists regardless.


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*
> 
> I think I got a bad GPU
> 
> 
> 
> 
> 
> 
> 
> 
> Seeing artifacting in games with everything in my system at stock clocks. I have tried 15.7, 15.11 and Crimson 15.12...persists regardless.


Sounds like it.


----------



## Noirgheos

Quote:


> Originally Posted by *Thoth420*
> 
> I think I got a bad GPU
> 
> 
> 
> 
> 
> 
> 
> 
> Seeing artifacting in games with everything in my system at stock clocks. I have tried 15.7, 15.11 and Crimson 15.12...persists regardless.


Send it in. Looks like you got a bad one, sorry to say.


----------



## battleaxe

Quote:


> Originally Posted by *Thoth420*
> 
> I think I got a bad GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seeing artifacting in games with everything in my system at stock clocks. I have tried 15.7, 15.11 and Crimson 15.12...persists regardless.


RMA my friend, RMA... no way that's right or normal.


----------



## dagget3450

Quote:


> Originally Posted by *p4inkill3r*
> 
> I have no issues with Crimson, downclocking or anything else.
> 
> If you want the highest frame rates, you have to overclock.


I know you told me at one point that Crimson doesn't downclock for you, almost like you implied I was full of it in your response. I have also seen you do this to other people. If it's working well for you, that's great, but despite what you believe outside of your own experience, other people are having clock issues with Crimson on Fury. I'm sorry, but we shouldn't need a third-party app to keep clocks from dropping in games. I hope they fix it soon, because it really is an issue for many people.


----------



## p4inkill3r

Quote:


> Originally Posted by *dagget3450*
> 
> I know you told me at one point that Crimson doesn't downclock for you, almost like you implied I was full of it in your response. I have also seen you do this to other people. If it's working well for you, that's great, but despite what you believe outside of your own experience, other people are having clock issues with Crimson on Fury. I'm sorry, but we shouldn't need a third-party app to keep clocks from dropping in games. I hope they fix it soon, because it really is an issue for many people.


If I thought you were full of it, I would have told you that. I don't know what thread or topic you're talking about in specific, but the issue is by no means ubiquitous.


----------



## xer0h0ur

Whenever I see people reporting issues that the majority don't experience, I can't help but think they would benefit from a manual registry wipe following BradleyW's guide, which removes the leftover registry keys that DDU and the AMD uninstaller miss.


----------



## Maximization

Speaking as a 4K user, my monitor actually had its own software and a profile that had to be installed. The more bleeding-edge the technology, the more problems you will have, and the more tweaking and troubleshooting it takes for a good experience. For example, I actually had to set my monitor to 1:1 scaling so that GPU scaling wasn't needed.



With Thoth420's problem, it may be better to explore all software avenues first.


----------



## Thoth420

Quote:


> Originally Posted by *Maximization*
> 
> Speaking as a 4K user, my monitor actually had its own software and a profile that had to be installed. The more bleeding-edge the technology, the more problems you will have, and the more tweaking and troubleshooting it takes for a good experience. For example, I actually had to set my monitor to 1:1 scaling so that GPU scaling wasn't needed.
> 
> 
> 
> With Thoth420's problem, it may be better to explore all software avenues first.


I am trying to iron out software first. The problems only occur in some games not all. At least so far.


----------



## Maximization

Quote:


> Originally Posted by *Thoth420*
> 
> I am trying to iron out software first. The problems only occur in some games not all. At least so far.


I noticed something else once everything was set up: AMD is stacking its Fury X RAM, at least. I have 64GB of system RAM, so I'm probably limited to overclocking and stabilizing all this RAM ...


----------



## JonDuma

Quote:


> Originally Posted by *rv8000*
> 
> Yes, above ~75 FPS my coil whine is noticeable, but such is the life of having a high-end card. For now, my quest for a whine-free card has ended.


Hi and thanks, but the low buzzing coil whine is at idle.
I understand that when gaming or under load it's perfectly fine. The only problem is that if I RMA it I'll pay international shipping; otherwise the card works perfectly.

http://www.3dmark.com/compare/fs/6752244/fs/6751497#


----------



## JunkaDK

Quote:


> Originally Posted by *Arizonian*
> 
> *Sapphire Nitro 1175 MHz Core i7 4790K @ 4.6 GHz*
> 
> *3DMark11 15928*
> *3Dmark11 Extreme 6453*
> 
> *FS 14026*
> *FS Extreme 7519*
> *FS Ultra 4099*
> 
> I have finally fully explored benching the core on my Fiji chip. I've pretty much met my match at 1175 MHz at least with +96mV added on air.
> 
> Gaming comfortably at 1150 MHz stable 58-60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have not begun to look at memory overclocking, but I'm not sure why I would need it with the memory bandwidth I already have at stock clocks. In the past, gaming memory overclocks have seemed to give minimal FPS gains and come with higher temps that usually equate to a lower core clock.
> 
> I ended up crashing while benching at 1185 MHz on the core.
> 
> This thing is a beast gaming at 1440p, producing lower temperatures than I've ever seen. It idles at 33C without any fans! I like to run fans, so mine usually idles at 26C at 30% fan speed.
> 
> Sapphire did an amazing job with their 8 layer PCB Fury and Tri-X fans on this Nitro with both acoustics and temperatures.


Why not add 60MHz to the RAM? I'm running at 565MHz.


----------



## Arizonian

Quote:


> Originally Posted by *JunkaDK*
> 
> Why not add 60MHz to the RAM? I'm running at 565MHz.


I just don't see as much benefit from a memory overclock for increasing FPS in games as I do from overclocking the core. Might still tinker with it, though.


----------



## Noirgheos

Quote:


> Originally Posted by *Arizonian*
> 
> I just don't see as much benefit from a memory overclock for increasing FPS in games as I do from overclocking the core. Might still tinker with it, though.


In Bethesda games, OCing the VRAM really helps with frametimes and average FPS.
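For context on what a memory overclock buys on Fiji, the bandwidth math is simple: HBM1 is double data rate on a 4096-bit bus. A rough sketch (stock 500MHz and the 565MHz figure from the posts above; the function name is just for illustration):

```python
def hbm_bandwidth_gb_s(clock_mhz, bus_width_bits=4096):
    """Theoretical bandwidth in GB/s: HBM1 transfers twice per clock (DDR)
    across Fiji's 4096-bit bus."""
    return clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

stock = hbm_bandwidth_gb_s(500)  # Fury X stock HBM clock
oc = hbm_bandwidth_gb_s(565)     # the 565 MHz overclock mentioned above
print(stock, oc)
```

So the jump from 500 to 565MHz takes the theoretical bandwidth from 512 GB/s to roughly 579 GB/s, about a 13% increase, which is why it can show up in frametimes even when the card already has bandwidth to spare.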


----------



## ht_addict

Quote:


> Originally Posted by *Maximization*
> 
> about 5-10 degrees cooler at same load, I ordered wrong connector bridge you might want to call before ordering that. I went with EK brand, aquacomputer blocks were tempting me also.
> 
> EK-FC R9 Fury X Backplate - Black EK-FC-R9-FURYX-BP-BK
> 
> EK-FC R9 Fury X Water Block - Acetal EK-FC-R9-FURYX-CA


With the temp drop, what are your settings when overclocking


----------



## Maximization

Quote:


> Originally Posted by *ht_addict*
> 
> With the temp drop, what are your settings when overclocking


4.6 on cpu 1250 on gpu


----------



## Tobiman

I'm interested in the white Fury Nano by ASUS. Will be picking one up in about a month's time. Just started saving.







Still not sure if I should just go for a Fury or Fury X instead. I like the power savings on the Nano, and the color, but I don't need it and still feel the Fury X might be the better choice all things considered. And why does XFX have a water-cooled Fury that costs just as much as a Fury X?...lol


----------



## Thoth420

Yep, a dead plumbed-in and blocked Fury X...lasted 5 days...wooooo :\

Never even OC'd; it won't even work right at stock.


----------



## Maximization

Quote:


> Originally Posted by *Thoth420*
> 
> Yep, a dead plumbed-in and blocked Fury X...lasted 5 days...wooooo :\
> 
> Never even OC'd; it won't even work right at stock.


RMA that SOB


----------



## Thoth420

Quote:


> Originally Posted by *Maximization*
> 
> RMA that SOB


That's the plan.
Sigh....I wanted to finally play Witcher 3.


----------



## JonDuma

Quote:


> Originally Posted by *ht_addict*
> 
> Did some playing around with Overclocking. Was able to hit 1175/570. Here are my FS Scores.
> 
> FS 1.1: http://www.3dmark.com/fs/7008151
> FSX 1.1: http://www.3dmark.com/fs/7008522


What is your GPU voltage for 1175/570?


----------



## ht_addict

Quote:


> Originally Posted by *JonDuma*
> 
> what is your GPU Voltage for 1175/570?


Using TriXX, I upped the voltage by +75mV and maxed the power limit.


----------



## fat4l

Quote:


> Originally Posted by *ht_addict*
> 
> Did some playing around with Overclocking. Was able to hit 1175/570. Here are my FS Scores.
> 
> FS 1.1: http://www.3dmark.com/fs/7008151
> FSX 1.1: http://www.3dmark.com/fs/7008522


Huh...
2x Fury X vs 2x 290X and you win by only 4% in graphics score?

http://www.3dmark.com/compare/fs/7008151/fs/7019395

I would recommend going Intel, mate
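For anyone wanting to run the same comparison on their own results, the relative gap is a one-liner; the scores below are hypothetical placeholders, not the linked 3DMark results:

```python
def pct_gain(new_score, old_score):
    """Percentage gain of new_score over old_score."""
    return 100.0 * (new_score - old_score) / old_score

# Hypothetical graphics scores, for illustration only
print(round(pct_gain(26000, 25000), 1))
```

A small graphics-score gap like this at the same settings usually points at a platform (CPU) bottleneck rather than the GPUs themselves, which is the argument being made here.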


----------



## p4inkill3r

Quote:


> Originally Posted by *fat4l*
> 
> Huh...
> 2x Fury X vs 2x 290X and you win by only 4% in graphics score?
> 
> http://www.3dmark.com/compare/fs/7008151/fs/7019395
> 
> I would recommend going Intel, mate


He'd get higher scores in benchmarks, sure, but in most gaming scenarios, he's not going to see a discernible difference.


----------



## Noirgheos

Does anyone get quite a bit of stuttering in Witcher 3 on 15.11.1 CCC?

Getting really annoying now...


----------



## nickcnse

Hey guys, hope this is the right spot to ask. Just got my second R9 Fury X added into my system, running both with EK blocks/backplates and connected by an EK parallel bridge. I tested both of these cards before installing them; both were working fine individually. Now that I have installed both cards into my computer, I am unable to run any graphically intensive program. Any time I run a game or a graphics benchmark (3DMark), my computer crashes. I have tried with CrossFire enabled and disabled, setting my computer back to stock clocks and with an overclock, and ran a memory test to rule that out as well. What would you guys suggest I test next? And is there any way to fully disable one card without taking it out of my computer, so I can see which card is causing the instability? Thanks.

System Info:

I7 5820k
MSI X99s Krait edition
G.Skill Ripjaw series DDR 4 2400
LEPA g1200
Samsung PRO 128gb ssd

EDIT:

Well, I've been troubleshooting this for about a week now, and finally, when I break down and ask a question in the forum, I fix it. I've done multiple driver sweeps but didn't realize each one reset the ULPS state. After the latest driver sweep and disabling all ULPS settings, my computer looks to be running smoothly. Thanks anyway everyone!
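For anyone hitting the same thing: ULPS is commonly disabled by setting `EnableUlps` to 0 under the display-adapter class key in the registry. A sketch of the tweak as a .reg fragment — note the numbered subkey (`0000` here) varies per adapter on each system, so check which subkeys belong to your Radeon cards and back up the registry first:

```reg
Windows Registry Editor Version 5.00

; Example only: the numbered subkey (0000, 0001, ...) differs per system and per GPU.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like Sapphire TriXX and MSI Afterburner expose a "disable ULPS" toggle that flips the same setting, which is the less error-prone route.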


----------



## MrKoala

Even though this was not the problem in your case: when you add new hardware to an existing system and it crashes under load, check the PSU first. The PSU may be rated to deliver enough power, but they can still go bad.


----------



## battleaxe

Quote:


> Originally Posted by *nickcnse*
> 
> Hey guys,hope this is the right spot to ask. Just got my second r9 fury x added into my system running them both with EK blocks/back plates and connected by an EK parallel bridge. I've tested both of these cards before installing into my system both were working fine individually. Now that I have installed both cards into my computer I am unable to run any graphically intensive program. Any time I run a game or a graphics benchmark (3d Mark) my computer crashes. I have tried both cross-fire enabled and disabled, setting my computer back to stock clocks, with an overclock, and ran a memory test to rule that out as well. What would you guys suggest I test next? And is there anyway to fully disable one card without taking it out of my computer so I can see which card is causing the instability? Thanks.
> 
> System Info:
> 
> I7 5820k
> MSI X99s Krait edition
> G.Skill Ripjaw series DDR 4 2400
> LEPA g1200
> Samsung PRO 128gb ssd
> 
> EDIT:
> 
> Well I've been trouble shooting this for about a week now and finally when I break down to ask a question in the forum I fix it. I've done multiple driver sweeps but didn't realize it reset the ULPS states every time I was doing so. After a latest driver sweep and disabling all ULPS settings my computer looks to be running smoothly. Thanks anyways everyone!


Glad you got it figured. Sometimes that's just what it takes, doing it yourself. lol


----------



## MalsBrownCoat

Hey guys, cross posting this, but figured I might have better luck with your expertise in this thread.

I'm running an ASUS Crosshair V Formula Z, with 2 ASUS Fury X's.

Haven't had any issues in a while, but today, I lost power to my condo, twice. The first time, I rebooted back up and everything was just fine.
The second time, I rebooted to find that I had nothing on any of my 3 screens, and no signal was found for any of the monitors.

I took a look at my motherboard and I saw that it's showing an error "b2" and a lit "boot device LED".
Let me repeat that NOTHING has changed. The PC did not move, so nothing became unseated. And the power cables were working just fine a few moments ago.

I also noticed that one of my GPUs, in the first PCIE slot, is running with all of its power LEDs solid red. This GPU runs all 3 of my monitors.

Underneath it, in PCIE slot 2, is another Fury X. Which currently has NO LED activity on it. Not even one.

So, basic troubleshooting 101 here.

I unplugged the power cables from GPU 1 (in PCIE slot 1) and left the power cables in GPU2 (PCIE slot 2).
I also unplugged all monitors from GPU1 and then plugged a single monitor into GPU2.
Rebooted.

No Q code. No motherboard LED errors. The monitor that is plugged in displays Windows, the GPU power LEDS operate as normal and booting seems to work.

I then reversed the process and unplugged the power from GPU 2 and put power and a single monitor on GPU 1.

Same problem as before. Error "b2", a lit "boot device LED" and full power usage on the GPU LEDs.

Now, I can't really swap GPU 1 with GPU 2 and try the same scenario, because the entire system is watercooled, with custom hard lines and EK plates. This isn't just something I can pull out and switch around. So I can't test if it's GPU 1, or the PCIE slot that it's on.

Does it sound like one or the other is fried? And how in the hell would that happen? Why wouldn't anything else have been affected?
Also to note, that I'm running my system through a pretty hefty surge protector, so as far as the system (should be) concerned; it just no longer had power.


----------



## Maximization

Quote:


> Originally Posted by *Noirgheos*
> 
> Does anyone get quite a bit of stuttering in Witcher 3 on 15.11.1 CCC?
> 
> Getting really annoying now...


Turn off Surface Format Optimization and see if it helps. For some reason it fixed everything in Skyrim, though that is a much older game.


----------



## the9quad

Quote:


> Originally Posted by *Noirgheos*
> 
> Does anyone get quite a bit of stuttering in Witcher 3 on 15.11.1 CCC?
> 
> Getting really annoying now...


How many GPUs do you have? I know it stutters really badly with 3 GPUs in my rig, but it's smooth with only two running.


----------



## Noirgheos

Quote:


> Originally Posted by *the9quad*
> 
> How many GPUs do you have? I know it stutters really badly with 3 GPUs in my rig, but it's smooth with only two running.


Just one Fury.


----------



## Noirgheos

Quote:


> Originally Posted by *Maximization*
> 
> Turn off Surface Format Optimization and see if it helps. For some reason it fixed everything in Skyrim, though that is a much older game.


Will try tomorrow!


----------



## ht_addict

Quote:


> Originally Posted by *fat4l*
> 
> Huh...
> 2x Fury X vs 2x 290X and you win by only 4% in graphics score?
> 
> http://www.3dmark.com/compare/fs/7008151/fs/7019395
> 
> I would recommend going Intel, mate


What would you recommend? Is it really worth the $$$ to have Intel over AMD? We're talking twice the price, if not more.


----------



## dagget3450

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Hey guys, cross posting this, but figured I might have better luck with your expertise in this thread.
> 
> I'm running an ASUS Crosshair V Formula Z, with 2 ASUS Fury X's.
> 
> Haven't had any issues in a while, but today, I lost power to my condo, twice. The first time, I rebooted back up and everything was just fine.
> The second time, I rebooted to find that I had nothing on any of my 3 screens, and no signal was found for any of the monitors.
> 
> I took a look at my motherboard and I saw that it's showing an error "b2" and a lit "boot device LED".
> Let me repeat that NOTHING has changed. The PC did not move, so nothing became unseated. And the power cables were working just fine a few moments ago.
> 
> I also noticed that one of my GPUs, in the first PCIE slot, is running with all of its power LEDs solid red. This GPU runs all 3 of my monitors.
> 
> Underneath it, in PCIE slot 2, is another Fury X. Which currently has NO LED activity on it. Not even one.
> 
> So, basic troubleshooting 101 here.
> 
> I unplugged the power cables from GPU 1 (in PCIE slot 1) and left the power cables in GPU2 (PCIE slot 2).
> I also unplugged all monitors from GPU1 and then plugged a single monitor into GPU2.
> Rebooted.
> 
> No Q code. No motherboard LED errors. The monitor that is plugged in displays Windows, the GPU power LEDS operate as normal and booting seems to work.
> 
> I then reversed the process and unplugged the power from GPU 2 and put power and a single monitor on GPU 1.
> 
> Same problem as before. Error "b2", a lit "boot device LED" and full power usage on the GPU LEDs.
> 
> Now, I can't really swap GPU 1 with GPU 2 and try the same scenario, because the entire system is watercooled, with custom hard lines and EK plates. This isn't just something I can pull out and switch around. So I can't test if it's GPU 1, or the PCIE slot that it's on.
> 
> Does it sound like one or the other is fried? And how in the hell would that happen? Why wouldn't anything else have been affected?
> Also to note, that I'm running my system through a pretty hefty surge protector, so as far as the system (should be) concerned; it just no longer had power.


Not sure what all you tried, but power failures almost always break power supplies first. It's possible it took out more than itself, but you need to fully vet the PSU. Maybe use a PSU tester, or probe the voltages with a meter. If you have a spare PSU, try it as well. I would say it's rather rare to lose hardware in a power failure, as most PSUs nowadays have decent protection. Typically they just die, and it could be one rail or the unit as a whole that can't supply a proper load anymore.

Good luck, and I hope it's just the PSU.
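For anyone probing rails with a multimeter, the ATX spec allows roughly ±5% deviation on the main positive rails. A trivial sanity-check helper (the measured values in the example are placeholders):

```python
# Nominal ATX rail voltages; the ATX spec allows about +/-5% on these main rails
ATX_RAILS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}

def rail_ok(rail, measured_volts, tolerance=0.05):
    """True if a measured rail voltage is within ATX tolerance of nominal."""
    nominal = ATX_RAILS[rail]
    return abs(measured_volts - nominal) <= nominal * tolerance

print(rail_ok("+12V", 11.7))  # inside the 11.4-12.6 V window
print(rail_ok("+12V", 11.2))  # a sagging 12V rail, out of spec
```

Measuring under load matters: a failing PSU can read fine at idle and sag only when the GPUs draw current, which matches the boot-time symptoms described above.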


----------



## MalsBrownCoat

Quote:


> Originally Posted by *dagget3450*
> 
> Not sure what all you tried, but power failures almost always break power supplies first. It's possible it took out more than itself, but you need to fully vet the PSU.


Appreciate the input, and it's a sound notion, but this is absolutely not a psu issue. As I mentioned, I validated against that by swapping the power cables between the two GPUs. Each set of cables works fine on GPU 2, but if any power is applied to GPU 1 (in PCIE slot 1), no boot is achieved.

Having hard lines makes swapping any cards around a very difficult process to endure. Especially when there is a strong hesitation to put any pressure on them whatsoever (for fear of cracking). = /

My basic sense of logic is stabbing me with the possibility of one of two things:

Either GPU 1 may be fried, or PCIE slot 1 is.

I'm just baffled at how either of those scenarios could have occurred; especially with a strong surge protector and an AX1200i.


----------



## battleaxe

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Appreciate the input, and it's a sound notion, but this is absolutely not a psu issue. As I mentioned, I validated against that by swapping the power cables between the two GPUs. Each set of cables works fine on GPU 2, but if any power is applied to GPU 1 (in PCIE slot 1), no boot is achieved.
> 
> Having hard lines makes swapping any cards around a very difficult process to endure. Especially when there is a strong hesitation to put any pressure on them whatsoever (for fear of cracking). = /
> 
> My basic sense of logic is stabbing me with the possibility of one of two things;
> 
> Either GPU 1 may be fried, or PCIE slot 1 is.
> 
> I'm just baffled at how either of those scenarios could have occurred; especially with a strong surge protector and an AX1200i.


I had this happen about a year ago and had to RMA the board. ASUS sent me a motherboard because it was toast. That being said, you will eventually want to rule out the GPU in the first slot by switching them. A pain for sure, but worth a shot.


----------



## MalsBrownCoat

Thankfully, both the board and the card are relatively new. So RMA'ing shouldn't be much of a problem.

If it's the card, it's going to be such a pain to remove the EK block and put it back to stock for return. And, hopefully ASUS won't pull their typical "we don't have any more of these in stock, so we'd like to send you a Fury (non X)" fxckery.

If it's the board....well, honestly, I'm a bit underwhelmed on the performance of using an AMD chip. At this point, I'd consider getting a replacement Crosshair from ASUS, but not installing it. Instead, I'd probably sell it and try out the new Maximus VIII Formula and go with an Intel chip.

*sigh*


----------



## Maximization

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Thankfully, both the board and the card are relatively new. So RMA'ing shouldn't be much of a problem.
> 
> If it's the card, it's going to be such a pain to remove the EK block and put it back to stock for return. And, hopefully ASUS won't pull their typical "we don't have any more of these in stock, so we'd like to send you a Fury (non X)" fxckery.
> 
> If it's the board....well, honestly, I'm a bit underwhelmed on the performance of using an AMD chip. At this point, I'd consider getting a replacement Crosshair from ASUS, but not installing it. Instead, I'd probably sell it and try out the new Maximus VIII Formula and go with an Intel chip.
> 
> *sigh*


There are little stickers on the stock rear GPU bracket; you might need to make sure that's back in place. I noticed them when I put on my blocks. I don't know if they check for it, though.


----------



## Noirgheos

Does anyone here not have stuttering in games with VSYNC on with Crimson?


----------



## dagget3450

Quote:


> Originally Posted by *Noirgheos*
> 
> Does anyone here not have stuttering in games with VSYNC on with Crimson?


I did, and I couldn't figure out how to fix it. If you monitor your GPU clocks while in game with VSYNC, you will see your clocks jump all over the place. I just rolled back to pre-Crimson drivers and have no issue. I am trying to be patient with Crimson, but at this point I am tempted to just give up on AMD for a while. I have had nothing but trouble with Fury multi-GPU. NVIDIA has their hands in all the new titles, and AMD can't seem to help themselves on a basic level.
I think it's time for me to take a break from PC gaming, because I am surely not going to support NVIDIA either. My R9s weren't perfect but definitely worked better. The two biggest issues I have with the Fury X are garbage overclocking and multi-GPU driver hell.


----------



## 98uk

Was there ever any fix for the clocks dropping in some games? I had it specifically in BF4 where it would drop from 1000mhz to around 300mhz, thus causing stuttering.

I haven't been able to test the newest 15.12 drivers yet, but not sure whether it's best to try them, or just start with a clean re-install.


----------



## Maximization

Quote:


> Originally Posted by *dagget3450*
> 
> I did, and I couldn't figure out how to fix it. If you monitor your GPU clocks while in game with VSYNC, you will see your clocks jump all over the place. I just rolled back to pre-Crimson drivers and have no issue. I am trying to be patient with Crimson, but at this point I am tempted to just give up on AMD for a while. I have had nothing but trouble with Fury multi-GPU. NVIDIA has their hands in all the new titles, and AMD can't seem to help themselves on a basic level.
> I think it's time for me to take a break from PC gaming, because I am surely not going to support NVIDIA either. My R9s weren't perfect but definitely worked better. The two biggest issues I have with the Fury X are garbage overclocking and multi-GPU driver hell.


It was easier overclocking my 7870s, though of course they were still not as fast. I have noticed that with too much voltage I get instability; the best I can do in Fury X CFX is +36mV and +50% on the power limit. Some software does not like HBM, I'm thinking, so I have to work on custom profiles. I can't say it's a bad experience; everything seems butter smooth, and on DisplayPort 2 everything seems fine.

I can't go back to NVIDIA either. AMD never disabled cards in software like NVIDIA did with Ageia. I was a lot younger then, and that was money they just threw away.


----------



## Maximization

Quote:


> Originally Posted by *98uk*
> 
> Was there ever any fix for the clocks dropping in some games? I had it specifically in BF4 where it would drop from 1000mhz to around 300mhz, thus causing stuttering.
> 
> I haven't been able to test the newest 15.12 drivers yet, but not sure whether it's best to try them, or just start with a clean re-install.


That's throttling, I think, caused by heat or too much voltage. There are built-in safeguards to protect the card.


----------



## 98uk

Quote:


> Originally Posted by *Maximization*
> 
> That's throttling, I think, caused by heat or too much voltage. There are built-in safeguards to protect the card.


I don't think so; it doesn't happen in other games. It seems related to the 2D/3D profiles, and the stuttering only occurs every few minutes, not constantly.

Temps aren't over 60C at load either.


----------



## AliNT77

Quote:


> Originally Posted by *98uk*
> 
> I don't think so, doesn't happen in other games. Seems related to 2d/3d profiles and the stuttering only occurs every few minutes, not constant.
> 
> Temps aren't over 60c at load either.


I had your problem when my R9 290's VRM temp reached 120C.

Check your VRM temps.


----------



## fat4l

Quote:


> Originally Posted by *98uk*
> 
> I don't think so, doesn't happen in other games. Seems related to 2d/3d profiles and the stuttering only occurs every few minutes, not constant.
> 
> Temps aren't over 60c at load either.


Try ClockBlocker.


----------



## OGBeandip

Well guys, I may be joining you soon. I'm highly considering selling my Titan Xs and switching to 2 Fury Xs, or 2 Fury X2s.

I'm not familiar with AMD setups, though, and haven't bought an AMD card in a long time. How is CrossFire scaling, and does running 4 GPUs have the same issues it does with NVIDIA? Is custom BIOS overclocking a thing on these AMD cards, or do people stick to stock?

I'll be running them on a 1080 radiator, so cooling's no problem.


----------



## huzzug

May I ask why you would want to change from Titan Xs to Fury Xs? They perform close to each other at the top end, so at most it may be a sidegrade in terms of performance.


----------



## OGBeandip

Quote:


> Originally Posted by *huzzug*
> 
> May I ask why you would want to change from Titan Xs to Fury Xs? They perform close to each other at the top end, so at most it may be a sidegrade in terms of performance.


Honestly, it's primarily politics. I don't agree with a lot of NVIDIA's business practices and would like to support AMD, even more so now that they compete so closely at lower prices.


----------



## Tobiman

From what I'm seeing in benchmarks, you'll want to stick to two cards, and that works in like 70% of all games.


----------



## eqc6

Has anyone here switched from 295X2 + 290X tri-fire to dual Fury Xs? If so, how was the performance increase?


----------



## OGBeandip

Quote:


> Originally Posted by *Tobiman*
> 
> From what i'm seeing in benchmarks, you'll want to stick to two cards and that works in like 70% of all games.


2 cards or 2 GPUs? CrossFire Fury X2 is only 2 cards, but it's quad-GPU.


----------



## p4inkill3r

Quote:


> Originally Posted by *OGBeandip*
> 
> 2 cards or 2 GPUs? CrossFire Fury X2 is only 2 cards, but it's quad-GPU.


Tri/Quadfire setups are rarely (if ever) fully utilized, same with Tri-SLI.
Fury Gemini or Fury X Crossfire would be the ideal setup IMO.


----------



## xer0h0ur

Quote:


> Originally Posted by *eqc6*
> 
> Has anyone here switched from 295X2 + 290X tri-fire to dual Fury Xs? If so, how was the performance increase?


Assuming tri-fire and crossfire is working between both setups then performance is near identical without bringing overclocking into the mix. Once overclocked the waters get muddied so I can't comment there.


----------



## OGBeandip

Quote:


> Originally Posted by *p4inkill3r*
> 
> Tri/Quadfire setups are rarely (if ever) fully utilized, same with Tri-SLI.
> Fury Gemini or Fury X Crossfire would be the ideal setup IMO.


I'm aware of the low-utilization issues. I'm more concerned with actual visual issues becoming a problem. A lot of people who run tri-SLI claim microstuttering and visual glitches are quite common. I'm curious if this is a trend with AMD's CrossFire as well.

As far as performance goes, I know the third and fourth cards are rarely worthy upgrades.


----------



## p4inkill3r

Quote:


> Originally Posted by *OGBeandip*
> 
> I'm aware of the low-utilization issues. I'm more concerned with actual visual issues becoming a problem. A lot of people who run tri-SLI claim microstuttering and visual glitches are quite common. I'm curious if this is a trend with AMD's CrossFire as well.
> 
> As far as performance goes, I know the third and fourth cards are rarely worthy upgrades.


http://www.legitreviews.com/amd-radeon-r9-fury-x-4-way-crossfire-setup-benchmarked_167338


----------



## OGBeandip

Quote:


> Originally Posted by *p4inkill3r*
> 
> http://www.legitreviews.com/amd-radeon-r9-fury-x-4-way-crossfire-setup-benchmarked_167338


Good read on scaling. Thanks.


----------



## eqc6

Assuming CrossFire works for both setups, the dual Fury Xs should still run a little better and cooler, you think? Right now my 295X2 exhausts a lot of heat through the radiator, and I'd be happy if the Fury Xs would exhaust less heat.


----------



## xer0h0ur

Well yeah, if you're comparing a 295X2's heat output to a single Fury X, then there is going to be a clear difference in how much hot air is coming from the radiator/fan. If you're comparing it to dual Fury Xs, you shouldn't see much of a difference. The 295X2's GPUs ran hotter, but its AIO cooler was only cooling those dies, while on the Fury X the AIO cools the die and the VRMs. Toss a second card into the mix and you're dealing with about as much heat.


----------



## eqc6

Quote:


> Originally Posted by *xer0h0ur*
> 
> Well yeah, if you're comparing a 295X2's heat output to a single Fury X, then there is going to be a clear difference in how much hot air is coming from the radiator/fan. If you're comparing it to dual Fury Xs, you shouldn't see much of a difference. The 295X2's GPUs ran hotter, but its AIO cooler was only cooling those dies, while on the Fury X the AIO cools the die and the VRMs. Toss a second card into the mix and you're dealing with about as much heat.


Well damnit, lol. I have 2 unopened Fury X sitting in my desk and I'm struggling to decide if I should try them out or just return them for a refund.


----------



## xer0h0ur

I mean it still should be less heat but not by a huge margin.


----------



## eqc6

Also, in Witcher 3 none of my 3 GPUs are being fully utilized, and the best I can do at 4K is 30fps with HairWorks and AA off. But when I read some of the benchmarks with dual Fury Xs, they seem to perform way better than my setup. Is it possible that CrossFire runs better than tri-fire? This is one of the reasons I considered dual Fury Xs.


----------



## fjordiales

Quote:


> Originally Posted by *eqc6*
> 
> Also, in Witcher 3 none of my 3 GPUs are being fully utilized, and the best I can do at 4K is 30fps with HairWorks and AA off. But when I read some of the benchmarks with dual Fury Xs, they seem to perform way better than my setup. Is it possible that CrossFire runs better than tri-fire? This is one of the reasons I considered dual Fury Xs.


Witcher 3 doesn't support tri-fire (yet). I had the same issue with my setup. I even emailed AMD customer support about it, and the reply I got was from a level 2 tech, who runs quad-fire himself and has to disable CrossFire for certain games.

For me it's easier now with Crimson to change from 3 GPUs to 2, but still... Max Payne 3 and Tomb Raider utilize tri-fire. I think Mordor does too...


----------



## eqc6

Quote:


> Originally Posted by *fjordiales*
> 
> Witcher 3 doesn't support tri-fire (yet). I had the same issue with my setup. I even emailed AMD customer support about it, and the reply I got was from a level 2 tech, who runs quad-fire himself and has to disable CrossFire for certain games.
> 
> For me it's easier now with Crimson to change from 3 GPUs to 2, but still... Max Payne 3 and Tomb Raider utilize tri-fire. I think Mordor does too...


Wait... so Witcher supports crossfire but not tri-fire?

And yes, the other games you listed support tri-fire, since I played all of those in 4K.


----------



## The Stilt

Could someone with a Fury X, or a fully unlockable Fury card based on the reference (Fury X) PCB, try the BIOS I attached? Please use CUInfo to dump the information prior to and after the BIOS flash. In the absolute worst case the BIOS will render the card unbootable; however, it will cause no permanent damage, and the secondary BIOS behind the switch makes it extremely easy to recover.

Thanks


----------



## fjordiales

Quote:


> Originally Posted by *eqc6*
> 
> Wait... so Witcher supports crossfire but not tri-fire?
> 
> And yes, the other games you listed support tri-fire, since I played all of those in 4K.


Witcher 3 actually has negative scaling in tri-fire: it performs worse with 3+ cards compared to 2.


----------



## eqc6

Quote:


> Originally Posted by *fjordiales*
> 
> Witcher 3 actually has negative scaling in trifire. Performs worse with 3+ cards compared to 2.


Really? So that would explain why it runs so poorly on my system. Every game that works in CrossFire runs great on my tri-fire setup, so this is the first game that runs poorly. You're making me reconsider returning my dual Fury Xs, lol. It's a shame that of all the games out there, it's the one I want to play the most that has poor performance, and it looks amazing at 4K too.


----------



## fat4l

Quote:


> Originally Posted by *The Stilt*
> 
> Could someone with a Fury X, or a fully unlockable Fury card based on the reference (Fury X) PCB, try the BIOS I attached? Please use CUInfo to dump the information prior to and after the BIOS flash. In the absolute worst case the BIOS will render the card unbootable; however, it will cause no permanent damage, and the secondary BIOS behind the switch makes it extremely easy to recover.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FIJI4-8.zip 102k .zip file


Are you planning something bigger?

C'mon people, do it. If I had a Fury X, I would!


----------



## p4inkill3r

Quote:


> Originally Posted by *fat4l*
> 
> Are you planning something bigger?
> 
> C'mon people, do it. If I had a Fury X, I would!


I'm scared.


----------



## huzzug

Said no enthusiast ever


----------



## p4inkill3r

Quote:


> Originally Posted by *huzzug*
> 
> Said no enthusiast ever











I hate flashing BIOSes, even from someone as well regarded as Stilt.


----------



## fjordiales

Quote:


> Originally Posted by *eqc6*
> 
> Really? So that would explain why it runs so poorly on my system. Every game that works in CrossFire runs great on my tri-fire setup, so this is the first game that runs poorly. You're making me reconsider returning my dual Fury Xs, lol. It's a shame that of all the games out there, it's the one I want to play the most that has poor performance, and it looks amazing at 4K too.


You should be good with dual Fury Xs. I have triple Fury Strix cards unlocked to 4032, and I deactivate the middle card since it's the hottest card in tri-fire. When I play games with great tri-fire scaling, I activate all 3.


----------



## dagget3450

Quote:


> Originally Posted by *The Stilt*
> 
> Could someone with a Fury X, or a fully unlockable Fury card based on the reference (Fury X) PCB, try the BIOS I attached? Please use CUInfo to dump the information prior to and after the BIOS flash. In the absolute worst case the BIOS will render the card unbootable; however, it will cause no permanent damage, and the secondary BIOS behind the switch makes it extremely easy to recover.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FIJI4-8.zip 102k .zip file


What does this BIOS do? I might consider it if I knew more.


----------



## Maximization

Quote:


> Originally Posted by *The Stilt*
> 
> Could someone with a Fury X, or a fully unlockable Fury card based on the reference (Fury X) PCB, try the BIOS I attached? Please use CUInfo to dump the information prior to and after the BIOS flash. In the absolute worst case the BIOS will render the card unbootable; however, it will cause no permanent damage, and the secondary BIOS behind the switch makes it extremely easy to recover.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FIJI4-8.zip 102k .zip file


tempting


----------



## fat4l

Quote:


> Originally Posted by *dagget3450*
> 
> What does this BIOS do? I might consider it if I knew more.


Improve performance by 100%









(Lie)


----------



## The Stilt

Quote:


> Originally Posted by *dagget3450*
> 
> What does this BIOS do? I might consider it if I knew more.


I added some custom code to control the shader engine arrays at higher precision. It is useless for people who have a Fury X or a fully unlockable Fury; however, it should be able to restore significantly more units from harvested cores with actual defects in them. For Fiji PRO, AMD always disables 2 compute units from each shader engine (4 x 2 = 8). In most cases not all of those units are actually defective; they are just disabled to produce an even number of units in each engine. Currently the unlocking method doesn't allow unlocking a specific compute unit in a specific shader engine.
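For anyone eyeballing their own dumps against this: the `SEn hw/sw:` lines CUInfo prints appear to encode a per-engine disable mask. Below is a rough parser; the field layout (upper 16 bits of the hw word = one disable bit per CU, low word = lock flags) is only my reading of the dumps posted in this thread, not anything documented, so treat it as a sketch.

```python
# Hypothetical parser for the CUInfo "SEn hw/sw:" lines pasted in this thread.
# Assumption (inferred from the dumps, not from CUInfo's source): the upper
# 16 bits of the hw word are the per-shader-engine CU disable mask, one bit
# per compute unit; the low word appears to carry lock flags.

import re

CUS_PER_SE = 16  # Fiji: 16 compute units per shader engine

def parse_se_line(line):
    """Return (disabled_count, disabled_bit_indices) for one 'SEn hw/sw:' line."""
    m = re.search(r"hw/sw:\s*([0-9A-Fa-f]{8})\s*/\s*([0-9A-Fa-f]{8})", line)
    if not m:
        raise ValueError("not a CUInfo SE line: " + line)
    hw = int(m.group(1), 16)
    mask = hw >> 16                      # upper word = CU disable mask
    disabled = [i for i in range(CUS_PER_SE) if mask & (1 << i)]
    return len(disabled), disabled

dump = """\
SE1 hw/sw: 00030001 / 00000000 [..............xx]
SE2 hw/sw: 00050001 / 00000000 [.............x.x]
SE3 hw/sw: 00030001 / 00000000 [..............xx]
SE4 hw/sw: 00030001 / 00000000 [..............xx]"""

total_disabled = sum(parse_se_line(l)[0] for l in dump.splitlines())
print(64 - total_disabled, "of 64 CUs active")  # matches CUInfo's own summary line
```

Checked against the dumps in this thread: `00030001` decodes to 2 disabled CUs per engine (the `[..............xx]` display), and `80010000` to CUs 0 and 15 (`[x..............x]`).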


----------



## ozyo

soon







Quote:


> Originally Posted by *The Stilt*
> 
> I added some custom code to control the shader engine arrays at higher precision. It is useless for people who have a Fury X or a fully unlockable Fury; however, it should be able to restore significantly more units from harvested cores with actual defects in them. For Fiji PRO, AMD always disables 2 compute units from each shader engine (4 x 2 = 8). In most cases not all of those units are actually defective; they are just disabled to produce an even number of units in each engine. Currently the unlocking method doesn't allow unlocking a specific compute unit in a specific shader engine.


Well, why not. I will try it.


----------



## battleaxe

Quote:


> Originally Posted by *The Stilt*
> 
> I added some custom code to control the shader engine arrays at higher precision. It is useless for people who have a Fury X or a fully unlockable Fury; however, it should be able to restore significantly more units from harvested cores with actual defects in them. For Fiji PRO, AMD always disables 2 compute units from each shader engine (4 x 2 = 8). In most cases not all of those units are actually defective; they are just disabled to produce an even number of units in each engine. Currently the unlocking method doesn't allow unlocking a specific compute unit in a specific shader engine.


That... is pretty cool sir...


----------



## Noirgheos

How does the BIOS work, guys? Also, has 16.1 solved the downclocking in games for people?


----------



## Joselotek

Hello, my R9 Fury Nitro says:

Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00030001 / 00000000 [..............xx]
SE2 hw/sw: 00050001 / 00000000 [.............x.x]
SE3 hw/sw: 00030001 / 00000000 [..............xx]
SE4 hw/sw: 00030001 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/O) / SW locks: 0 (R/W).
Sorry, all 8 disabled CUs can't be unlocked by BIOS replacement.

Is there any way to unlock any shaders?


----------



## ozyo

@The Stilt I did it:
http://www.techpowerup.com/gpuz/details.php?id=c8b8g

stock BIOS
http://www.3dmark.com/3dm/10174176?
FIJI4-8 BIOS
http://www.3dmark.com/3dm/10174053?
The shader unit count decreased and the texture fillrate increased. Now what?


----------



## Joselotek

How?


----------



## Joselotek

So there was no performance increase? Also, did you flash it with the normal tool even though it said you can't unlock the shaders?


----------



## ozyo

Quote:


> Originally Posted by *Joselotek*
> 
> So there was no performance increase? Also, did you flash it with the normal tool even though it said you can't unlock the shaders?


It's a Fury X; I'm trying The Stilt's BIOS.
If you flash your Fury to a Fury X, you will get an increase in performance.


----------



## Joselotek

But my card says it's not able to unlock any cores by flashing the BIOS.


----------



## Jflisk

New AMD beta driver 16.1
http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.1-Hotfix-Release-Notes.aspx


----------



## The Stilt

Quote:


> Originally Posted by *ozyo*
> 
> @The Stilt I did it:
> http://www.techpowerup.com/gpuz/details.php?id=c8b8g
> 
> stock BIOS
> http://www.3dmark.com/3dm/10174176?
> FIJI4-8 BIOS
> http://www.3dmark.com/3dm/10174053?
> The shader unit count decreased and the texture fillrate increased. Now what?


Thanks for testing, seems to work as intended.

Did you use CUInfo by any chance? I would need to see the information from it too, even though it appears to be working.


----------



## ozyo

Quote:


> Originally Posted by *The Stilt*
> 
> Thanks for testing, seems to work as intended.
> 
> Did you use CUInfo by any chance? I would need to see the information from it too, even though it appears to be working.


Unfortunately no; I'm rebuilding my rig and it will take 2 days at least.


----------



## Semel

Quote:


> Originally Posted by *The Stilt*
> 
> Could someone with a Fury X or a fully unlockable, reference PCB (Fury X PCB) based Fury card try the bios I attached? Please use CUInfo to dump the information, prior and after the bios flash. In the absolute worst case the bios will render the card as unbootable, however it will cause no permanent damage. Also the secondary bios behind the switch will make it extremely easy to recover.
> 
> Thanks


I already have a Fury unlocked to 3840. 4096 worked, but I had artifacts all over the screen. I don't suppose your BIOS would fix that?


----------



## hrockh

Pretty cool work on the BIOS!
Can anyone share the folding PPD for a Nano? I'm really curious.


----------



## NBrock

Quote:


> Originally Posted by *hrockh*
> 
> pretty cool work on the bios!
> can anyone share the folding PPD for a Nano? I'm really curious


I know the Nano throttles to stay around its TDP, but to give you an idea, my Fury X at stock clocks gets 380k-500k+ PPD depending on the work unit.


----------



## The Stilt

Quote:


> Originally Posted by *Semel*
> 
> I already have a Fury unlocked to 3840. 4096 worked, but I had artifacts all over the screen. I don't suppose your BIOS would fix that?


Post your original CUInfo dump, with original bios (shaders locked).


----------



## fjordiales

Quote:


> Originally Posted by *The Stilt*
> 
> Post your original CUInfo dump, with original bios (shaders locked).


Hello, I'm just wondering if my BIOS has a chance with this?

Adapters detected: 3
Card #1 PCI ID: 1002:7300 - 1043:049E
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00030000 / 00000000 [..............xx]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00050000 / 00000000 [.............x.x]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.
Card #2 PCI ID: 1002:7300 - 1043:049E
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 80010000 / 00000000 [x..............x]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.
Card #3 PCI ID: 1002:7300 - 1043:049E
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00210000 / 00000000 [..........x....x]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 00030000 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.

furybios.zip 102k .zip file


----------



## Alastair

Quote:


> Originally Posted by *The Stilt*
> 
> 
> > Originally Posted by *Semel*
> > 
> > I already have fury unlocked to 3840. 4096 worked but I had artifacts all over the screen.. I don't suppose your bios would fix that?
> 
> Post your original CUInfo dump, with original bios (shaders locked).

Stilt, is there any way that you know of to get my one Fury to 3840 with your custom BIOS? It will only unlock 3 CUs with the stuff provided by the unlock thread. That's with the low and high BIOS, and all of them give me 4096 but loads of artifacts. I'll post you a shot of my current info dump when I get the chance.


----------



## ECPowers

Anyone know if I can flash a Fury X BIOS onto my R9 Nano without breaking it? I want to see if it lets me overclock a little bit higher.


----------



## huzzug

Isn't your Nano a Fury X with a smaller power envelope? How would it change anything with regard to performance?


----------



## ECPowers

At my current settings, even adding +6mV will cause it to throttle, so I'm stuck at 1040/560. I was hoping the Fury X BIOS would give me a little more headroom.


----------



## huzzug

Yes, but the Fury X has more power connectors to draw additional juice. Yours is limited to a single 8-pin connector, which together with the PCIe slot can supply about 225W to the card.
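For reference, the arithmetic behind that 225W figure, using the PCIe spec limits (75W from the slot, 75W per 6-pin, 150W per 8-pin). The connector counts below are the reference designs (Nano with one 8-pin, Fury X with two), so adjust for partner boards. A quick sketch:

```python
# Rough board power-budget arithmetic from PCIe spec connector limits.
PCIE_SLOT_W = 75   # PCIe x16 slot
PIN6_W = 75        # 6-pin PEG connector
PIN8_W = 150       # 8-pin PEG connector

def board_budget(six_pin=0, eight_pin=0):
    """Total spec power available to the card, in watts."""
    return PCIE_SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

print("Nano (1x 8-pin):  ", board_budget(eight_pin=1), "W")  # 225 W
print("Fury X (2x 8-pin):", board_budget(eight_pin=2), "W")  # 375 W
```

In practice cards can pull past these spec numbers briefly, but the spec budget is why the Nano has less overclocking headroom on paper.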


----------



## ozyo

@The Stilt
stock
Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 1002:0B36
DevID [7300] Rev [C8] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00000001 / 00000000 [................]
SE2 hw/sw: 00000001 / 00000000 [................]
SE3 hw/sw: 00000001 / 00000000 [................]
SE4 hw/sw: 00000001 / 00000000 [................]
64 of 64 CUs are active. HW locks: 0 (R/O) / SW locks: 0 (R/W).
All CUs in this chip are already active.

FIJI4-8
Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 1002:0B36
DevID [7300] Rev [C8] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00000001 / 00000000 [................]
SE2 hw/sw: 00000001 / 00000000 [................]
SE3 hw/sw: 00000001 / 00000000 [................]
SE4 hw/sw: 00000001 / 00000000 [................]
64 of 64 CUs are active. HW locks: 0 (R/O) / SW locks: 0 (R/W).
All CUs in this chip are already active.

btw card @ 4x pcie


----------



## Gamedaz

* Finally installed AMD's Crimson 16.1 hotfix drivers for my XFX R9 Fury.

* The install went smooth: no pauses, no uninstalling AV, etc. I like the per-game OC options.

Tested the GPU max temp and fan profile settings. Now getting lower fan speeds @ 58% with temps @ 54C (was getting 100% fan before).

GPU clocks hit 1087MHz! Which is fine for 1080p at a 15% clock setting. Will try to push it more; hitting 1100+ would be ideal and would make this GPU competitive against Nvidia.

Tried to OC Project Cars ~ used 16% GPU clock and 10% power @ 58C ~ was unable to start the game due to a "GPU Temp Exceeds" notice. * I assume it's because the GPU's temperature rose, and when it passed my 58C threshold the game was shut off (although the driver locks the entire machine after this happens?). * Resetting the clocks and power allows the game to play normally again.

VSR is a neat feature I used on my previous Nvidia card; it worked great, and I'm glad to see it's been implemented in AMD's drivers, although it is not available in many older titles. Where Nvidia seemed to have it enabled for every game title I owned, AMD lacks coverage, which is a pity since older games have too many jaggies and VSR smooths everything out nice and clear. * Ghostbusters shows green lines at the side of the screen with VSR, so it's not really playable.

I would like to know if there is an FPS counter in the Radeon Settings menu. I can't seem to find one, and I would prefer to use it instead of third-party software to keep things simple when I need to check frame rates.


----------



## josephimports

Quote:


> Originally Posted by *NBrock*
> 
> I have GPU usage issues in Fallout 4 as well. I logged it and the GPU clock is also not running full speed in game. I had to use ClockBlocker to get the game to run smoothly.


I waited for a new driver release before trying it out. When 16.1 failed to correct the downclocking, Clockblocker really came through. FO4 and GTAV are both running smoother now with no downclocking whatsoever.












----------



## Gamedaz

Quote:


> Originally Posted by *josephimports*
> 
> I waited for a new driver release before trying it out. When 16.1 failed to correct the downclocking, Clockblocker really came through. FO4 and GTAV are both running smoother now with no downclocking whatsoever.


That seems to be a problem a lot of people are complaining about right now; I assume it will be on the list of fixes in next month's release. The way I see it, Crimson won't match Nvidia's drivers for possibly another year, so it could be some time before the drivers are stable on most systems. It's up to us to give AMD feedback on driver issues. I'm sure the AMD Radeon graphics division plans to increase their driver update frequency and the number of fixes per release if they want to reach a broader market share.


----------



## Maximization

Is ClockBlocker safe? Don't the thermal and voltage protections get overridden?


----------



## diggiddi

Quote:


> Originally Posted by *Gamedaz*
> 
> * Finally installed AMD's Crimson 16.1 hotfix drivers for my XFX R9 Fury.
> 
> * The install went smooth: no pauses, no uninstalling AV, etc. I like the per-game OC options.
> 
> Tested the GPU max temp and fan profile settings. Now getting lower fan speeds @ 58% with temps @ 54C (was getting 100% fan before).
> 
> GPU clocks hit 1087MHz! Which is fine for 1080p at a 15% clock setting. Will try to push it more; hitting 1100+ would be ideal and would make this GPU competitive against Nvidia.
> 
> Tried to OC Project Cars ~ used 16% GPU clock and 10% power @ 58C ~ was unable to start the game due to a "GPU Temp Exceeds" notice. * I assume it's because the GPU's temperature rose, and when it passed my 58C threshold the game was shut off (although the driver locks the entire machine after this happens?). * Resetting the clocks and power allows the game to play normally again.
> 
> VSR is a neat feature I used on my previous Nvidia card; it worked great, and I'm glad to see it's been implemented in AMD's drivers, although it is not available in many older titles. Where Nvidia seemed to have it enabled for every game title I owned, AMD lacks coverage, which is a pity since older games have too many jaggies and VSR smooths everything out nice and clear. * Ghostbusters shows green lines at the side of the screen with VSR, so it's not really playable.
> 
> I would like to know if there is an FPS counter in the Radeon Settings menu. I can't seem to find one, and I would prefer to use it instead of third-party software to keep things simple when I need to check frame rates.


No FPS counter, but you can have one if you are running the game through Steam.


----------



## ozyo

Is there any way to read the VRM temps?


----------



## NBrock

Quote:


> Originally Posted by *Maximization*
> 
> Is ClockBlocker safe? Don't the thermal and voltage protections get overridden?


ClockBlocker is safe. The GPU will still shut itself off if it gets too hot; those limits are set in hardware by AMD and the manufacturers. You can also pick and choose which applications to run it with. Out of the box it only runs at full clock with full-screen apps.


----------



## NBrock

Quote:


> Originally Posted by *josephimports*
> 
> I waited for a new driver release before trying it out. When 16.1 failed to correct the downclocking, Clockblocker really came through. FO4 and GTAV are both running smoother now with no downclocking whatsoever.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Glad it is working for you. It really did help with Fallout for me. It's funny, because the 290 and 295X2 I ran it with before didn't have any issues... just the Fury X (same drivers).


----------



## Gamedaz

* I could see the AMD BIOS having some sort of power-efficiency measurement built into it. Suppose setting a 10% clock reduced PSU output efficiency by a set margin (using Crimson); aftermarket OC software could ignore those margins and allow the VRMs to accept more voltage and ripple, though a PSU with less voltage ripple could OC the card more consistently with less voltage applied. Is it possible to test different PSUs for OCing the GPU? I know my XFX 600 has some terrible voltage ripple at high wattage draw, which could be why my GPU locks up. I increased my temp limit to 62C and it played for 15 minutes, then the game crashed. Another issue: when it crashes and I get the driver heat error, the card is not warm or even hot.


----------



## Semel

Unwinder is going to release a new Afterburner version in January that will feature an unofficial overclocking mode *without PowerPlay* support for Fury cards. If enabled, it will essentially eliminate any downclocking issues when gaming. So it's nice to have a backup plan if AMD fails to deliver.


----------



## The Stilt

Please explain these "downclocking" issues to me, like to a complete imbecile.

What happens, and in what conditions?
Does increasing the PowerTune limit change the behavior, etc.?


----------



## Semel

Quote:


> Originally Posted by *The Stilt*
> 
> What happens


The GPU downclocks its core (quite often, and to really low clocks compared to the 15.11.1 Catalyst driver) when it "thinks" it can get away with being more "power efficient" (say, when opening the inventory or a map, but it happens in-game as well, *especially* when vsync is enabled), which results in lower fps and unstable OCing.
Quote:


> , in what conditions?


Games + the Crimson driver.
Quote:


> Does increasing PowerTune limit change the behavior,


Do you mean the power limit in Afterburner? No.

We really need a "performance mode" (when I play games I don't need any "power efficient" nonsense screwing things up; I didn't buy a top-end GPU for it to be "power efficient" while gaming) like the one Nvidia has in its control panel, or we just wait for the new Afterburner release, because I don't think AMD is gonna fix this anytime soon.

PS: That's on Fury cards. I dunno if it happens on other cards.


----------



## 98uk

I have had the downclocking issue on my Fury only in BF4. It seems to drop the clocks every once in a while, which in turn causes lag.

ClockBlocker appears to have rectified this.


----------



## Semel

Quote:


> Originally Posted by *98uk*
> 
> I have had the downclocking issue on my Fury in only BF4. It seems to drop the clocks every once in a while which in turn causes lag.
> 
> Clockblocker appears to have rectified this.


I have it in every game when using the Crimson driver. As for ClockBlocker... it's a nice workaround, but it is not perfect and doesn't fully eliminate the OC instability issues caused by this nonsense. (Checked in Witcher 3, the game most sensitive to OCing: 15.11.1 is perfectly stable 24/7, as in any other game/benchmark; Crimson crashes within minutes even with ClockBlocker enabled, and without ClockBlocker it happens much faster.)


----------



## The Stilt

If the temperature is below the throttling temperature (75°C) and the power draw stays below the TDP (270W at PowerTune / Power Control @ 0%), it is definitely a driver issue and not directly related to PowerTune or PowerPlay. Disabling PowerPlay is definitely not the correct approach to solving it, since it will mess up the power management completely.
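The two conditions above can be written down as a quick sanity check. The 75°C throttle point and 270W TDP come from the post; the linear scaling of the limit with the PowerTune percentage is the usual understanding of the slider, so take this helper as illustrative rather than exact:

```python
# Illustrative check of Fiji throttle conditions: throttling kicks in at
# 75C or when board power exceeds the TDP (270W at PowerTune 0%; assumed
# here to scale linearly with the PowerTune / Power Control percentage).
THROTTLE_TEMP_C = 75.0
TDP_W = 270.0

def throttling_expected(temp_c, power_w, powertune_pct=0):
    """True if hardware throttling is expected at the given readings."""
    limit_w = TDP_W * (1 + powertune_pct / 100.0)
    return temp_c >= THROTTLE_TEMP_C or power_w > limit_w

# Below both limits: any downclocking here would point at the driver.
print(throttling_expected(55, 240))                    # False
print(throttling_expected(80, 240))                    # True (temperature)
print(throttling_expected(60, 290))                    # True (power at 0%)
print(throttling_expected(60, 290, powertune_pct=10))  # False (limit is now 297W)
```

In other words, if your monitoring shows something like 55C and 240W and the core still drops clocks, that matches the driver-issue diagnosis rather than PowerTune doing its job.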


----------



## Semel

Generally my OCed GPU sits at 40-55C when gaming (vsync on). It depends on the game/load.


----------



## The Stilt

Quote:


> Originally Posted by *Semel*
> 
> Generally my GPU sits at 40-55C when under gaming load. It depends on the game.


Is there a continuously replicable difference in benchmarks (e.g 3DMark) between the drivers without the issue and the newer Crimson ones?


----------



## Noirgheos

Quote:


> Originally Posted by *Semel*
> 
> Generally my OCed GPU sits at 40-55C (vsync on) when gaming..It depends on a game\load


Which OS are you on?


----------



## Gamedaz

* It seems that AMD needs to rectify the clocking issues with the Crimson drivers; otherwise, what's the point in having a feature that collects dust for 1-2 years while the video drivers are improved?

* It seems it would take a large team to go into the BIOS tree, find what lines of code cause the downclocking, and build some type of logic that says: if the power threshold is set to a given %, then increase voltage/clocks until the temp threshold is reached, then power down if it goes above (92C). That is possibly the reason some people have success OCing with aftermarket software; AMD does not want to risk damage to GPUs just to satisfy the market using 1440p or higher displays, where 1100-1500MHz clocks are standard for driving those types of displays.

Or they could use the HBM to feed different shader caches from the game and squeeze those through the memory instead of the GPU. I'm sure there are plenty of different ways to successfully overclock the GPU; using third-party engineers seems to be the only option for any manufacturer.


----------



## Semel

Quote:


> Originally Posted by *The Stilt*
> 
> Is there a continuously replicable difference in benchmarks (e.g 3DMark) between the drivers without the issue and the newer Crimson ones?


I dunno. I clearly see a difference in stability (when OCed) and real-time performance (thanks to monitoring tools) in games. I've installed/uninstalled drivers too many times, and I don't think I'm gonna do it again just to compare 3DMark Fire Strike scores.









The issue is widespread. Check the hardware forums, Reddit, etc. Even AMD acknowledged the problem:

https://twitter.com/i/web/status/676423582520487937

Quote:


> Originally Posted by *Noirgheos*
> 
> Which OS are you on?


Win 10 64-bit


----------



## sugarhell

Quote:


> Originally Posted by *The Stilt*
> 
> If the temperature is below the throttling temperature (75°C) and the power draw stays below the TDP (270W, PowerTune / Power Control @ 0%), it is definitely a driver issue and not directly related to PowerTune or PowerPlay. Disabling PowerPlay is definitely not the correct approach to solve it, since it will mess up the power management completely.


It's a driver issue, and has been for years. In past years the method to avoid this problem was:

Disable PowerPlay.
Then set two profiles: a 2D one and a 3D one.
The 2D one is the default clocks of your GPU via the BIOS.
The 3D one, most of the time, was the default clocks of your GPU +1 MHz.
Manually change from 3D clocks to 2D with AB.

A pain in the ...

This was the perfect way for the 7970 to achieve 3D clocks all the time without throttling. But with the 290X and Fury it seems they changed PowerPlay or something.
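The old workaround described above amounts to a manual two-state clock table. As a sketch (illustrative only; the real thing was done by hand in MSI Afterburner, and the profile values here are just example 7970-class clocks):

```python
# Sketch of the old two-profile workaround: with PowerPlay disabled the
# driver never switches power states itself, so the user must flip between
# a 2D and a 3D profile manually. Values are illustrative assumptions.

PROFILE_2D = {"core_mhz": 925, "mem_mhz": 1375}   # BIOS default clocks
PROFILE_3D = {"core_mhz": 926, "mem_mhz": 1375}   # default + 1 MHz

def pick_profile(game_running: bool) -> dict:
    # Stands in for the manual Afterburner profile switch described above.
    return PROFILE_3D if game_running else PROFILE_2D

print(pick_profile(True)["core_mhz"])   # 926
print(pick_profile(False)["core_mhz"])  # 925
```

The +1 MHz trick exists only so the "3D" profile is distinct from the BIOS defaults while behaving identically in practice.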


----------



## Semel

Quote:


> Originally Posted by *sugarhell*
> 
> Disable powerplay


The current Afterburner version doesn't support that for Fury cards yet... Unwinder to the rescue!


----------



## hyp36rmax

Price drop on the Nano to $499! Mighty tempting for two.


----------



## 98uk

Is anyone here running out of graphics memory? I haven't yet... but I seem to be getting very close.

With BF4, playing at 2560x1440 with everything on ultra and 2x AA, I'm using ~3.6 GB of GPU memory.

Doesn't bode well for 4K or new games.


----------



## Noirgheos

My question is, how did reviewers test Crimson without downclocking?

EDIT: lol vsync was off


----------



## Noirgheos

Quote:


> Originally Posted by *98uk*
> 
> Is anyone here running out of graphics memory. I haven't done yet... but seems to be going very close.
> 
> With BF4, playing at 2560x1440, with all ultra 2 x AA, i'm using ~3.6gb GPU memory.
> 
> Doesn't bode well for 4k or new games


What kind of FPS do you get with those settings on your Fury?


----------



## 98uk

Quote:


> Originally Posted by *Noirgheos*
> 
> What kind of FPS do you get with those settings on your Fury?


Between 70-120, depending on the map and what's going on.


----------



## The Stilt

Someone try this and see if it makes any difference on the downclocking issue. The BIOS has only two SCLK states (instead of eight), so in theory it should address it, unless it is a hardware-related issue (power- or thermal-related throttling). It might cause some undefined behavior during UVD or VCE, but that's what I'm asking you guys to find out. In case there is an issue with the BIOS (no picture, restart while loading the display driver, etc.) you can recover by using the second BIOS (primary/secondary switch).

The BIOS is intended for the *Fury X* MBA (113-C8800100-103) board. AFAIK there is currently only a single version available, so it should be the correct one regardless of the ODM or the age of the card.

FuryXLCB2.zip 102k .zip file


If anyone tries it, please report the changes in behavior, issues, improvements, etc.
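For anyone who hasn't flashed a card before, the usual sequence with the ATIFlash command-line utility looks roughly like the below. This is a sketch assuming the DOS/Windows `atiflash` tool and adapter index 0; always save a backup first so the dual-BIOS switch isn't your only safety net:

```shell
# Typical ATIFlash sequence (adapter index assumed to be 0 -- confirm with -i).
atiflash -i                      # list adapters, confirm the Fury X index
atiflash -s 0 stock_backup.rom   # save the current BIOS before anything else
atiflash -p 0 FuryXLCB2.rom      # program the modified BIOS, then reboot
```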


----------



## Noirgheos

Quote:


> Originally Posted by *98uk*
> 
> Between 70-120 depending on the map and what's going on


That's insane! I sometimes drop below 60 FPS at 1080p! Only in certain areas though, like spikes. But damn, what the hell is wrong with my PC? My CPU can't be the bottleneck, it's a 4790K. Is running CCC 15.11.1 instead of Crimson really limiting me that much?


----------



## 98uk

Quote:


> Originally Posted by *Noirgheos*
> 
> Thats insane! I sometimes drop below 60FPS at 1080p! Only in certain areas though. Like spikes. But damn. What the hell is wrong with my PC? My CPU can't bottleneck it, its a 4790K. Is running 15.11.1 CCC instead of Crimson really limiting me that much?


I'm using Crimson 15.12 with a stock 4770K. You might also want to check that your RAM is fast enough; I was using 1600 MHz before and that held me back massively.


----------



## Sonikku13

Given the price drop on the Radeon R9 Nano, does it become worth buying at $350? (I get a $150 account credit on my credit card for spending $500)

My current rig specs:
A10-7850K
Radeon R7 iGPU
ASUS A88X-PRO
8 GB DDR3-1866 SDRAM
240 GB SanDisk Extreme Pro
Antec High Current Gamer 750W

I miss ultra settings, I would love to have them again.


----------



## Jbod

Hi guys. I figured I would chime in


----------



## JunkaDK

Quote:


> Originally Posted by *Jbod*
> 
> Hi guys. I figured I would chime in


Thanks a lot... now I have to get a new case again... 2nd time this week


----------



## xer0h0ur

Alienware's cases usually tend to be pretty cool and I love their lighting, but their motherboards are certified hot garbage. You're BIOS-locked from ever truly overclocking your CPU much at all. It's why I only run my 4930K at 4.2 GHz, with 4.3 GHz being as high as I can push it with some additional turbo boost voltage. Either way, the only method of overclocking the BIOS allows is jacking up the turbo boost multiplier.


----------



## AliNT77

Quote:


> Originally Posted by *Sonikku13*
> 
> Given the price drop on the Radeon R9 Nano, does it become worth buying at $350? (I get a $150 account credit on my credit card for spending $500)
> 
> My current rig specs:
> A10-7850K
> Radeon R7 iGPU
> ASUS A88X-PRO
> 8 GB DDR3-1866 SDRAM
> 240 GB SanDisk Extreme Pro
> Antec High Current Gamer 750W
> 
> I miss ultra settings, I would love to have them again.


A Nano for $350 is absolutely sick!
Go ahead and grab it...
But your CPU is a major bottleneck...
You need a fully overclocked i5 to take full advantage of the Nano (at 1080p).


----------



## Jbod

Quote:


> Originally Posted by *JunkaDK*
> 
> Thanks alot.. now i have to get a new case again.. 2nd time this week


Quote:


> Originally Posted by *xer0h0ur*
> 
> Alienware's cases usually tend to be pretty cool and I love the lighting to them but their motherboards are certified hot garbage. You're BIOS locked from ever truly overclocking your CPU much of anything. Its why I only run my 4930K @ 4.2 GHz with 4.3GHz being as high as I can push it with some additional turbo boost voltage. Either way the only method of overclocking the BIOS allows is jacking up the turbo boost multiplier.


http://valid.x86.fr/j4lvil

http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/4.6.jpg.html



----------



## Alastair

Quote:


> Originally Posted by *Jbod*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JunkaDK*
> 
> Thanks alot.. now i have to get a new case again.. 2nd time this week
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *xer0h0ur*
> 
> Alienware's cases usually tend to be pretty cool and I love the lighting to them but their motherboards are certified hot garbage. You're BIOS locked from ever truly overclocking your CPU much of anything. Its why I only run my 4930K @ 4.2 GHz with 4.3GHz being as high as I can push it with some additional turbo boost voltage. Either way the only method of overclocking the BIOS allows is jacking up the turbo boost multiplier.
> 
> Click to expand...
> 
> http://valid.x86.fr/j4lvil
> 
> http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/4.6.jpg.html
> 
> http://valid.x86.fr/j4lvil
Click to expand...

This isn't the thread for this. But Cinebench isn't proof of stability.


----------



## NBrock

I think he is just proving he can overclock on his alienware.


----------



## Noirgheos

Quote:


> Originally Posted by *98uk*
> 
> I'm using Crimson 15.12 with a stock 4770k. You might want to also check your RAM is fast enough, I was using 1600mhz before and that held me back massively.


Is that on the testing or practice level? Also, my RAM is at 2400 MHz; I set it to XMP. I'm guessing Crimson does help with FPS a decent bit.


----------



## Dirgeth

Has anyone found a way to see VRM temps on the Fury X?

I'm not worried about core temps... those will always be OK... but VRM temps are the danger for overclocking.


----------



## Alastair

Quote:


> Originally Posted by *Dirgeth*
> 
> Someone find a way to see a VRM temps on Fury X ?
> 
> I am not woried on core temps.. it will be always OK.. but VRM temps are danger for overclocking..


You can't. There ye go.


----------



## xer0h0ur

I don't use Cinebench or CPU-Z, so really I'm not sure what he was trying to show with that screenshot either. Alienware PCs come factory "overclocked" anyway, so it's not like he has to prove anything. The thing is that Alienware's whack overclock is only applied to one core and not the rest. You have to manually go into the BIOS and apply the turbo boost multiplier across the rest of the cores, or else it's not actually overclocking the entire processor. There are many other things that blow about their BIOS and motherboards, but I really don't feel like hate-posting right now.


----------



## Sonikku13

Quote:


> Originally Posted by *AliNT77*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> Given the price drop on the Radeon R9 Nano, does it become worth buying at $350? (I get a $150 account credit on my credit card for spending $500)
> 
> My current rig specs:
> A10-7850K
> Radeon R7 iGPU
> ASUS A88X-PRO
> 8 GB DDR3-1866 SDRAM
> 240 GB SanDisk Extreme Pro
> Antec High Current Gamer 750W
> 
> I miss ultra settings, I would love to have them again.
> 
> 
> 
> a Nano for 350$ is Absolutely sick?
> Go ahead and grab it... ?
> But your cpu is a major bottleneck...
> U need a fully overclocked i5 to take advantage of the nano (in 1080p)
Click to expand...

I've run a 290X on the same setup in the past, so I'm not concerned about that. What I am concerned with is Pascal and Arctic Islands clobbering the Nano. So, bite now or bite later?


----------



## Maximization

Quote:


> Originally Posted by *Sonikku13*
> 
> I've run a 290X on the same setup in the past, so I'm not concerned about that. What I am concerned with is Pascal and Arctic Islands clobbering the Nano. So, bite now or bite later?


I think it is fair to say they will both clobber the Nano. All new tech does. Eventually you will have one card, and then CPU on-die video, that will tear through 4K and 8K.


----------



## Jbod

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't use Cinebench or CPU-Z so really I am not sure what he was trying to show with that screenshot either. Alienware PCs come factory "overclocked" anyways so its not like he has to prove anything. The thing is that Alienware's whack overclock is only applied to one core and not the rest. You have to manually go into the BIOS and apply the turbo boost multiplier across the rest of the cores or else its not actually overclocking the entire processor. There are many other things that blow about their BIOS and motherboards but I really don't feel like hate posting right now.


I am showing that with the new Alienware (nothing older) you get full control over the FSB, multiplier and voltage. This means that you can overclock with it just like you would with any other PC. If you look closely there is a label inside.

http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/label_zpsmiutu2r6.jpg.html

Basically MSI is making all of the main parts for it now and the motherboard is just like their entry-level X99 board. However, it also supports 3-way SLI and CrossFireX with a good layout.

Here is the overclocking tool located in the Command Center.

http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/CPUsettings_zpsbu4qphkf.jpg.html

As you can see, FSB, multiplier and voltage are fully unlocked.

As for whether it's stable? It is. I got lucky with my 5820K and temps are very low for the clocks.

The keen-eyed among us will know that with older Alienwares there was no real way to bypass the standard overclocking in the BIOS. So, for example, you would pick a Level 1 or Level 2 overclock, both of which are totally lame.

Things have changed now though. They're no different from any other board and you can overclock properly with them. As I said, the keen-eyed will know that you could not increase the multiplier to gain those sorts of clocks on older units.

However, even the older Alienware boards (starting with the Dell Aurora and Dell Area 51) also used MSI for their boards.


----------



## xer0h0ur

Quote:


> Originally Posted by *Jbod*
> 
> I am showing that with the new Alienware (nothing older) you get full control over the FSB, multi and voltage. This means that you can overclock with it just like you would with any other PC. If you look closely there is a label inside.
> 
> http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/label_zpsmiutu2r6.jpg.html
> 
> Basically MSI are making all of the main parts for it now and the motherboard is just like their X99 entry level board. However, it also supports 3 way SLI and Crossfire X with a good layout.
> 
> Here is the overclocking tool located in the command center.
> 
> http://s72.photobucket.com/user/timmahtiburon/media/AlienFX/CPUsettings_zpsbu4qphkf.jpg.html
> 
> As you can see, FSB multi and voltage are fully unlocked.
> 
> As for whether it's stable? it is. I got lucky with my 5820k and temps are very low for the clocks.
> 
> The keen eyed among us would know that with older Alienwares there was no real way to bypass the standard overclocking in the bios. So for example you would pick level 1 or level 2 overclock, both of which are totally lame.
> 
> Things have changed now though. They're no different to any other board and you can overclock properly with them. As I said, keen eyed would know that you can not increase the multiplier to gain those sorts of clocks on older units.
> 
> However, even the older Alienware boards (starting with the Dell Aurora and Dell Area 51) also used MSI for their boards.


Well, it's about time they gave users that overclockability, because those whack Level 1/Level 2 OC profiles were all the Aurora R4 had, aside from tweaking the turbo boost multiplier and raising the additional turbo boost voltage. It may have been a motherboard manufactured by MSI, but its BIOS was completely stripped down to nothing. Hell, they didn't even have a CPU virtualization option. Among the stupidity, the motherboard only had two full-speed SATA III ports, etc. It's garbage.


----------



## MAMOLII

Quote:


> Originally Posted by *The Stilt*
> 
> Someone try this and see if it makes any difference on the downclocking issue. The bios has only two SCLK states (instead of eight) so in theory it should address it, unless it is a hardware related issue (power or thermal related throttling). It might cause some undefined behavior during UVD or VCE, but that´s what I´m looking you guys to find out. In case there is a issue with the bios (no picture, restart while loading the display driver, etc) you can recover by using the second bios (primary / secondary).
> 
> The bios is intended for *Fury X* MBA (113-C8800100-103) board. AFAIK there is currently only a single version available, so it should be the correct one regardless of the ODM or the age of the card.
> 
> FuryXLCB2.zip 102k .zip file
> 
> 
> If anyone tries it, please report the changes in behavior, issues, improvements, etc.


Sorry, can someone tell me: if I have 113-C8800100-*102*, I can flash this, right? Aren't all Fury X reference boards the same? I see the ASUS reference Fury X starts with 115-, etc., and the first Fury X BIOS was 113-C8800100-*101*. Every BIOS I tried works... just trying to figure out what this board numbering means.


----------



## Noirgheos

Guys, I got 40 MHz more on my VRAM. Is this dangerous? Is there a reason they locked it? I got 8 FPS more in Shadow of Mordor and around that in other games. Core temps went up quite a bit; I have 1050 MHz on the core. It doesn't break 60°C though.


----------



## p4inkill3r

HBM?
I can bench at +100MHz on my RAM, don't sweat it.


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> HBM?
> I can bench at +100MHz on my RAM, don't sweat it.


Ok good. The HBM OC gave me some massive boosts in games that are VRAM heavy.


----------



## Maximization

i can't overclock the HBM at all


----------



## The Stilt

Quote:


> Originally Posted by *MAMOLII*
> 
> sorry can someone tell me if i have 113-c8800100-*102* i can flash this right? not all furyx reference boards the same? i see asus reference fury x starts with 115- ect.. and furyx first bios 113-c8800100-*101* every bios i tried works...just try to figure out what is this board thing


It will work on all Fury X boards currently available.


----------



## dagget3450

Quote:


> Originally Posted by *Maximization*
> 
> i can't overclock the HBM at all


Just out of curiosity, have you tried overclocking a single Fury? I quickly found out that in CrossFire my overclocking headroom is pretty much non-existent, if you count 1070/540 as overclocked. For every card I added, my clocks dropped by about half. Yeah, of course I'm going by short-term stability for benching. Never got around to doing much more, as I've been disappointed with everything multi-GPU outside of the cool temps.


----------



## Maximization

Quote:


> Originally Posted by *dagget3450*
> 
> just out of curiosity have you tried overclocking a single fury? I quickly found out in crossfire my overclocking pretty much is non existent. I mean if you count 1070/540 as overclocked. For every card i added my clocks dropped in about half or so. Yeah, of course i am going by short term stability for benching. Never got around to doing much more as i have been disappointed with everything mult-igpu outside of cool temps.


I have not, no... I do have the DIP switches to disable PCIe slots, but I'll take your experience as the standard. I think the Fury's stacked VRAM limits the stability of the overclock in CrossFire.


----------



## baii

Does the Fury X pump use a normal fan-header plug, or does it only plug into the GPU?


----------



## sadboyz

Does anyone else have that problem where your entire screen goes black and you have to hard reset after updating to the latest amd drivers?


----------



## Maximization

Quote:


> Originally Posted by *sadboyz*
> 
> Does anyone else have that problem where your entire screen goes black and you have to hard reset after updating to the latest amd drivers?


Turning off Surface Format Optimization worked in the applications where I was having the problem. I found out later that I had to install the driver and software for my monitor; I can leave it on now. I am using DisplayPort 1.2.


----------



## Arizonian

All in the last couple days just started seeing news on Nitro Fury.

*Hexus* - *Sapphire releases the Nitro R9 Fury graphics card*

*Legit Reviews* - *SAPPHIRE NITRO R9 FURY Series Video Cards Announced*

Review for Sapphire Nitro Fury by *WCCF* - *AMD's Fiji Pro and Sapphire's Nitro Review*






They went on sale December 16, 2015, when I got mine, and I'm still not sure why, with these absolutely low temps, the Fury X didn't come out air cooled. Sapphire's Nitro did Fury justice. Except for power consumption and some 1080p games, it beats the 980 in performance, acoustics, and temperatures. Too bad for AMD this wasn't ready much sooner; a lot of lost sales. Until Pascal or Polaris comes out, if you're building a new rig or replacing a failing GPU and can't wait, this is the card for this price range.


----------



## The Stilt

Quote:


> Originally Posted by *baii*
> 
> Does the fury x pump use normal fan header plug,or it only plug into the GPU?


Fury X pump uses a custom 6-pin connector. It is basically a standard 4-pin PWM connection (VDD, VSS, Tacho & PWM) with two additional pins for temperature.


----------



## baii

Thanks, just put it in: pump whine, and buzz on load. Meh, and it is the new pump (the chrome-ish sticker). I think I'm gonna stick with the air Fury. My current air Fury buzzes pretty badly though, ehhh.

Getting tired of exchanging cards zzzz.


----------



## 98uk

Quote:


> Originally Posted by *Arizonian*
> 
> All in the last couple days just started seeing news on Nitro Fury.
> 
> *Hexus* - *Sapphire releases the Nitro R9 Fury graphics card*
> 
> *Legit Reviews* - *SAPPHIRE NITRO R9 FURY Series Video Cards Announced*
> 
> Review for Sapphire Nitro Fury by *WCCF* - *AMD's Fiji Pro and Sapphire's Nitro Review*
> 
> 
> 
> 
> 
> 
> They went on sale December 16, 2015 when I got mine and I'm still not sure why with these absolutely low temps Fury X didn't come out air cooled. Sapphire's Nitro did Fury justice. Except for power consumption and some 1080 games it beats 980 in performance, acoustics, and temperatures. Too bad for AMD this wasn't ready much sooner, a lot of lost sales. Until pascal or polaris comes out if your building a new rig or replacing failing GPU and can't wait, this is the card for this price range now.


How do these compare to the standard "non-Nitro" Sapphire Fury? I bought one and I'm not sure whether I should have waited for the Nitro version.









But the temps are crazy... I switched from a reference-fan Gigabyte 290 to the Fury and I kept checking my PC because I was scared it wasn't working, the fans were so quiet. I was used to a rocket in my case when gaming with my old card.


----------



## dagget3450

Quote:


> Originally Posted by *98uk*
> 
> How are these comparing to the standard "Non-Nitro" Sapphire Fury? I bought one and not sure whether I should have waited for the Nitro version
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But, the temps are crazy... I switched from a reference fan Gigabyte 290 to the Fury and I kept checking my PC as I was scared it wasn't working because the fans were so quiet. I was used to a rocket in my case when gaming with my old card.


Hehe, that is a big contrast in GPUs and sound. I recently put a few reference 290Xs back on stock air cooling from water blocks. Crazy how loud they can get. Sometimes I have to look at the LEDs or fans to make sure my Furys are on, and I'm always glancing at temps. I guess it's a habit now from owning 290Xs, hehe.


----------



## Creator

Have a Nano on the way. Looking forward to benching it in my SFF Skylake build.


----------



## Arizonian

Quote:


> Originally Posted by *98uk*
> 
> How are these comparing to the standard "Non-Nitro" Sapphire Fury? I bought one and not sure whether I should have waited for the Nitro version
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But, the temps are crazy... I switched from a reference fan Gigabyte 290 to the Fury and I kept checking my PC as I was scared it wasn't working because the fans were so quiet. I was used to a rocket in my case when gaming with my old card.


The Nitro Fury has double-sided Black Diamond chokes to lessen coil whine, an 8-layer instead of 6-layer 2 oz copper PCB, and a 6-phase power circuit, plus a DVI-D port that the Tri-X doesn't have. Not 100% sure on the Tri-X specs, but I think that's the difference.

Quote:


> Originally Posted by *dagget3450*
> 
> Hehe that is a big contrast of gpus and sound. I recently put a few reference 290x back on stock air cooling from water blocks. Crazy how loud they can get. Sometimes i have to look at the leds or fans to make sure my furies are on and always glancing for temps i guess its a habit now from owning 290x hehe.


I had a 290X reference when it first launched; almost two months later I sold it at a profit to a miner. Then, while waiting for the non-reference 290X, which took forever to come out, the 780 Ti came along. An amazing turnaround from the leaf blower my wife was upset about when I gamed in the main room. This Fury is the quietest GPU I've ever owned to date. I had to keep GPU-Z up and kept checking it when I first got it to make sure it was running OK.









I never thought I'd lose so much value on that $680 780 Ti: with the $30 back plate I had to buy, it sold two months ago for $300. It's the price early adopters pay to play, but I never thought that a year later a $330 970 could keep up with it and do better. Lesson learned. Moving forward I'm sticking with $500 GPUs regardless, since I sell them or pass them along to other rigs within a year or two at most. This $500 purchase makes me feel as giddy as I did years ago when I bought a 580 for $500. It's my comfort price range.


----------



## Doran

I know I'm a little behind the eight-ball but I just got my Sapphire R9 Fury X yesterday and thought I'd post an image. I'm using a Corsair Obsidian 350D and the Fury X fits perfectly, although I'm going to have to modify the air flow a little bit.

I did have to remove the 2.5" drive bay to mount the rad and use a Corsair 2.5" to 3.5" mounting bracket for my Samsung 850 EVO 500 GB.

System Specs:

Corsair Obsidian 350D
Intel i7 4790K @ 4.5 GHz
ASUS ROG Maximus VII Gene Micro ATX
Corsair Vengeance Pro 16GB DDR3 2400MHz
Sapphire R9 Fury X
Samsung 850 EVO 500 GB SSD
EVGA SuperNOVA 850 G2 80+ Gold
Corsair Hydro Series H75 CPU Cooler


----------



## p4inkill3r

Quote:


> Originally Posted by *Doran*
> 
> I know I'm a little behind the eight-ball but I just got my Sapphire R9 Fury X yesterday and thought I'd post an image. I'm using a Corsair Obsidian 350D and the Fury X fits perfectly, although I'm going to have to modify the air flow a little bit.
> 
> I did have to remove the 2.5" drive bay to mount the rad and use a Corsair 2.5" to 3.5" mounting bracket for my Samsung 850 EVO 500 GB.
> 
> System Specs:
> 
> Corsair Obsidian 350D
> Intel i7 4790K @ 4.5 GHz
> ASUS ROG Maximus VII Gene Micro ATX
> Corsair Vengeance Pro 16GB DDR3 2400MHz
> Sapphire R9 Fury X
> Samsung 850 EVO 500 GB SSD
> EVGA SuperNOVA 850 G2 80+ Gold
> Corsair Hydro Series H75 CPU Cooler


Very clean.


----------



## Mopar63

Quote:


> Originally Posted by *98uk*
> 
> How are these comparing to the standard "Non-Nitro" Sapphire Fury? I bought one and not sure whether I should have waited for the Nitro version


In theory the Nitro should be able to overclock better with its more robust power delivery system. In practice Fury cards do not overclock worth a crap, so it's a non-issue. The Nitro card does come with a higher out-of-the-box clock speed, but it is MINOR: 1050 vs. 1040 MHz.

The larger PCB should allow for cooler component temps, specifically the VRMs. That review makes it look like the actual GPU temp is a little higher; I would expect that, with the cooler losing the wide-open fan at the end for better airflow.


----------



## p4inkill3r

Quote:


> Originally Posted by *Mopar63*
> 
> In theory the Nitro should be able to overclock better with a more robust power delivery system. In practice Fury cards do not overclock worth crap so a none issue. The Nitro card does come with a higher out of the box clock speed but it is MINOR, 1050 vs 1040.
> 
> The larger PCB should allow for cooler component temps, specifically VRM. That review makes it look like the actual GPU temp is a little higher, would expect that with the cooler losing the wide open fan at the end for better air flow.


Sapphire gets theirs to 1200MHz, which is pretty nice.

http://sapphirenation.net/sapphire-nitro-r9-fury-technical-overview/
Quote:


> The overclocking results listed in this section were achieved with SAPPHIRE's TriXX utility. Core voltage, power limits and fan speed were maxed out to remove any potential bottlenecks during the overclocking. Furthermore, the card was subjected to 15 minutes of intensive Crysis 3 gameplay to really put its stability to the test.
> 
> Maximum overclock of our sample is a very impressive 1200 MHz on the GPU-that's over 14% higher than stock frequency! There can be no doubt that this is among the best overclocking R9 FURY cards on the market. The overclocking potential of the SAPPHIRE NITRO R9 FURY is very similar to many non-reference GTX 980 cards.


----------



## Gamedaz

Quote:


> Originally Posted by *p4inkill3r*
> 
> Sapphire gets theirs to 1200MHz, which is pretty nice.
> 
> http://sapphirenation.net/sapphire-nitro-r9-fury-technical-overview/


*What sort of engineering is implemented in this R9 Fury card? Do they replace the GPU with hand-picked models from a bin that OC better?* Or did they just go into the BIOS and rewrite it entirely to allow the higher clocks and voltages, which seem to show stable 51°C temps regardless of the voltage input?

I suspect the voltage-efficiency tolerances built into the BIOS are causing OC issues for this specific GPU, so a PSU that deviates more than a 3% threshold would not be able to OC a standard R9 Fury, although a BIOS that relaxes those voltage tolerances will OC better, possibly such as the Sapphire Nitro. IMO most PSUs have voltage tolerances that exceed a 3% threshold under certain loads and circumstances; such deviations, from my understanding, are output as heat, which the VRMs on the PCB are designed to dissipate and balance. It would be interesting to see whether R9s can actually be OCed with a re-flashed BIOS.


----------



## Otterfluff

It also said the Nitro takes three slots, not two; does that mean the heatsink is larger than the Tri-X's? Perhaps it has better cooling from a larger build.

I wish there was info about water-blocking a Fury Nano, how much that decreases the throttling, and how far it can overclock. I have a feeling that with a proper waterblock removing the thermal throttling it should overclock nicely, if not better than a Fury X. It has less power delivery because it needs LESS power to do the same work a regular Fury X does with more. Ideally, better-binned chips like that should be able to clock higher with less voltage, and they do have some room to move. It would be very interesting to see the results and testing from someone who has a Nano under water with a decent 50°C-or-less water temp.

Even if a Nano was only hitting 1050 core with zero thermal throttling, it would be a better deal to buy it with a custom waterblock than a Fury X by itself for anyone plumbing one into a custom loop.


----------



## Arizonian

^^^^^Sapphire Nitro Fury takes 2 1/2 slots^^^^^


----------



## baii

Quote:


> Originally Posted by *Otterfluff*
> 
> Also said the Nitro takes three slots not two, that means the heatsink is larger than the tri-x right? Perhaps it has better cooling from a larger build?.
> 
> I wish there was info about water-blocking a Fury nano and how that decreases the throtling and how far it can overclock. I have a feeling with proper waterblock removing the thermal throttling it should overclock nicely if not better than a fury X. It has less power delivery because it needs LESS power to do the same as a reg Fury X with more. Ideally better binned chips like that should be able to get higher with less and they do have some room to move. It would be very interesting to see the results and testing from someone who has a nano under water with a decent 50C or less water temp.
> 
> Even if a nano was hitting 1050 core with zero thermal throttling it would be a better deal to buy it with a custom waterblock than a fury X by itself for anyone plumbing one into a custom loop.


The Tri-X is said to be 2.5 slots.
I compared photos and the heatsinks look the same, though not one review removed the shroud on the Nitro, so I can't be sure.


----------



## Scorpion49

Quote:


> Originally Posted by *baii*
> 
> The tri x is said to be 2.5 slot
> I had compare photos and the heatsink look the same, though no 1 review remove the shroud on the nitro, so can't be sure.


Tri-X and Nitro are the same width, 2.5 slots. The Nitro is taller than the PCI-E bracket, though, by about 3/4 inch compared to the Tri-X.

Anyway, really enjoying my new Nitro Fury; no coil whine is a plus. Obviously the fact that it likes to run at 406 MHz in most of my games and give me only single-digit frame rates is a minus, but hey, no graphics card company can be perfect, right? Joking aside, what driver do I need to roll back to from 16.1 to stop this from happening while at the same time not breaking FO4?


----------



## Alastair

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *baii*
> 
> The tri x is said to be 2.5 slot
> I had compare photos and the heatsink look the same, though no 1 review remove the shroud on the nitro, so can't be sure.
> 
> 
> 
> Tri-X and Nitro are the same width, 2.5 slot. The Nitro is taller than the PCI-E bracket though by about 3/4 inch compared to the Tri-X.
> 
> Anyways, really enjoying my new Nitro Fury, no coil whine is a plus. Obviously the fact that it likes to run at 406mhz in most of my games and give me only single-digit frame rates is a minus, but hey, no graphics card company can be perfect right? Joking aside, what driver do I need to roll back to from 16.1 to stop this crap from happening while at the same time not breaking FO4?
Click to expand...

Just install ClockBlocker.


----------



## 98uk

Quote:


> Originally Posted by *Mopar63*
> 
> In theory the Nitro should be able to overclock better with a more robust power delivery system. In practice Fury cards do not overclock worth crap so a none issue. The Nitro card does come with a higher out of the box clock speed but it is MINOR, 1050 vs 1040.
> 
> The larger PCB should allow for cooler component temps, specifically VRM. That review makes it look like the actual GPU temp is a little higher, would expect that with the cooler losing the wide open fan at the end for better air flow.


The standard Sapphire Fury is 1000MHz, not 1040MHz. The OC version was 1040MHz, but I haven't seen stock of that on Amazon for ages, so perhaps it's been replaced by the Nitro?
Quote:


> Originally Posted by *Otterfluff*
> 
> Also said the Nitro takes three slots, not two; that means the heatsink is larger than the Tri-X, right? Perhaps it has better cooling from a larger build?


The standard Fury is three slots too, IIRC? Same cooling system as the Nitro.


----------



## Scorpion49

Quote:


> Originally Posted by *98uk*
> 
> The standard Fury is three slots too iirc? Same cooling system as the Nitro


They're 2.5 slots and they're not the same. The Nitro is taller and the heatsink is larger. The fans seem to be the same though.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyways, really enjoying my new Nitro Fury, no coil whine is a plus. Obviously the fact that it likes to run at 406mhz in most of my games and give me only single-digit frame rates is a minus, but hey, no graphics card company can be perfect right? Joking aside, what driver do I need to roll back to from 16.1 to stop this crap from happening while at the same time not breaking FO4?


Which games besides FO4 does this happen in?


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> Which games besides FO4 does this happen in?


I've been having it in Armored Warfare/World of Tanks as well, and to a lesser extent DA:I. I should have noted that I am using ClockBlocker right now, but it just seems weird to have to use a 3rd-party program to do what the drivers should be doing already.

I don't remember having these problems playing the same games on Fury cards before the Crimson drivers came along, though. Otherwise, I'm really happy with the Nitro: it has a little bit of coil buzz (not whine, it's very low frequency) that can't be heard with the case shut, cools nicely and runs 1200/550 pretty easily.


----------



## p4inkill3r

I've been waiting for some common thread to pop up that links those with this downclocking issue together, and I have yet to see it. Very strange.


----------



## Scorpion49

I'm not sure, but I have noticed a lot more 380/380X people complaining about the issue; it seems much more common on those cards.


----------



## xer0h0ur

I keep logs of my gameplay to check for clock changes while I am playing, and while I do get fluctuations, it's by a matter of a few MHz, not hundreds like people normally experience. I have not needed to use ClockBlocker just yet.
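For anyone wanting to sanity-check their own logs the same way, here is a minimal Python sketch. The two-column (timestamp, core MHz) layout is an assumption for illustration; real sensor logs (e.g. from GPU-Z) have more columns, so adjust the indices to match your file.

```python
# Sketch: flag big core-clock drops in a sensor log. The simple
# two-column CSV layout (timestamp, core_mhz) is hypothetical;
# adapt the column indices to whatever your logging tool writes.
import csv
import io

def find_downclocks(log_text, floor_mhz=900):
    """Return (timestamp, mhz) samples where the core fell below floor_mhz."""
    reader = csv.reader(io.StringIO(log_text))
    events = []
    for row in reader:
        ts, mhz = row[0], float(row[1])
        if mhz < floor_mhz:
            events.append((ts, mhz))
    return events

sample = "0.0,1050\n0.5,1048\n1.0,406\n1.5,1050\n"
print(find_downclocks(sample))  # the 406MHz sample should be flagged
```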


----------



## The Stilt

Nobody has a Fury X card to test the BIOS I've posted previously? Come on now, the worst thing that can happen is that you need to lift your fat arses from the chair and toggle the BIOS switch


----------



## Maximization

Quote:


> Originally Posted by *The Stilt*
> 
> Nobody has a Fury X card to test the BIOS I've posted previously? Come on now, the worst thing that can happen is that you need to lift your fat arses from the chair and toggle the BIOS switch


I was waiting for someone else to try it first. I do have a fat arse, though.


----------



## Scorpion49

Quote:


> Originally Posted by *The Stilt*
> 
> Nobody has a Fury X card to test the BIOS I've posted previously? Come on now, the worst thing that can happen is that you need to lift your fat arses from the chair and toggle the BIOS switch


I'd try it but I don't have a Fury X.


----------



## xkm1948

I have a Fury X but I see no reason to try a custom BIOS when the stock one works perfectly. And I am pretty sure few FuryX owners would try something like this.


----------



## p4inkill3r

Quote:


> Originally Posted by *The Stilt*
> 
> Nobody has a Fury X card to test the BIOS I've posted previously? Come on now, the worst thing that can happen is that you need to lift your fat arses from the chair and toggle the BIOS switch


I have a beautiful arse but hate flashing custom BIOSes, as previously stated.


----------



## Maximization

I am afraid my crossfire goes BOOM


----------



## ozyo

Quote:


> Originally Posted by *The Stilt*
> 
> Nobody has a Fury X card to test the BIOS I've posted previously? Come on now, the worst thing that can happen is that you need to lift your fat arses from the chair and toggle the BIOS switch


which one ?


----------



## The Stilt

Quote:


> Originally Posted by *ozyo*
> 
> which one ?


 FuryXLCB2.zip 102k .zip file


----------



## ozyo

Quote:


> Originally Posted by *The Stilt*
> 
> FuryXLCB2.zip 102k .zip file


I don't have the downclocking issue.


----------



## viper16341

Can you tell me what's in that BIOS? I mean the clocks and voltage. Are there any Fury X mod BIOSes around? How many mV up would be all right?

Thanks Guys!


----------



## The Stilt

That's just a test BIOS to see if it fixes the downclocking issues. The clocks and voltages have not been touched; only some of the "middle" clock states have been removed.


----------



## Mopar63

Quote:


> Originally Posted by *p4inkill3r*
> 
> Which games besides FO4 does this happen in?


The issue I saw in FO4 was tied to frame rate limiting within the game. I actually saw framerates limited to 30 FPS (half my refresh rate), and thus my card's core downclocked since it was not needed.

I turned off vsync in the game, but that caused the game's physics to be a mess. So I used the Frame Rate Target Control in Crimson to limit the frame rate to 60 and below; problem solved.


----------



## Scorpion49

Quote:


> Originally Posted by *Mopar63*
> 
> The issue I saw in FO4 was tied to frame rate limiting within the game. I actually saw framerates limited to 30 FPS (half my refresh rate), and thus my card's core downclocked since it was not needed.
> 
> I turned off vsync in the game, but that caused the game's physics to be a mess. So I used the Frame Rate Target Control in Crimson to limit the frame rate to 60 and below; problem solved.


The top is what I get without ClockBlocker, the bottom is what I get with ClockBlocker and a 75fps cap (yes, they are both at 75fps because I had to go up against a wall to be able to see the overlay; normally without ClockBlocker my fps is in the teens to 20s out in the world):


----------



## p4inkill3r

The bottom shows 1099MHz but 0% usage.

I'm assuming this screenshot is from Fallout 4 also?


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> The bottom shows 1099MHz but 0% usage.
> 
> I'm assuming this screenshot is from Fallout 4 also?


Yeah, the usage goes to 0 every few seconds. I think it's an RTSS bug, since FO4 is pretty resistant to proper hooking. The usage does not actually drop to 0 in AB.


----------



## Thoth420

Quote:


> Originally Posted by *Mopar63*
> 
> The issue I saw in FO4 was tied to frame rate limiting within the game. I actually saw framerates limited to 30 FPS (half my refresh rate), and thus my card's core downclocked since it was not needed.
> 
> I turned off vsync in the game, but that caused the game's physics to be a mess. So I used the Frame Rate Target Control in Crimson to limit the frame rate to 60 and below; problem solved.


Gamebryo... never assess a system with games using that engine. It has never been anything but a broken mess.


----------



## dagget3450

Quote:


> Originally Posted by *Thoth420*
> 
> Gamebryo... never assess a system with games using that engine. It has never been anything but a broken mess.


Except it is not just FO4; I can reproduce these clock issues in UT Alpha, Descent Underground, and a few others. It seems linked to Fury and vsync on Crimson. If I want I can turn off vsync, but then I get screen tearing. The downclocks might still be happening, but they're way less noticeable. I was at 120Hz and it was dropping to 60fps with clocks around 350 to 500; it was extremely annoying.

If I can get some time away from installing a new kitchen sink, I'll tinker with my Fury X and The Stilt's BIOS.


----------



## josephimports

Quote:


> Originally Posted by *The Stilt*
> 
> That´s just a test bios to see if it fixes the downclocking issues. The clocks or voltages have not been touched, only some of the "middle" clock states have been removed.


Unfortunately, the problem persists.

Bios verification


Spoiler: Warning: Spoiler!







Usage


Spoiler: Warning: Spoiler!







SS of test spot. Crimson settings at default.


Spoiler: Warning: Spoiler!


----------



## The Stilt

Quote:


> Originally Posted by *josephimports*
> 
> Unfortunately, the problem persists.
> 
> Bios verification
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Usage
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> SS of test spot. Crimson settings at default.
> 
> 
> Spoiler: Warning: Spoiler!


Thanks for testing!









If you got the time and the interest, could you test two more things:

A) Remove the display driver and its accessories through "Add & Remove Programs", remove Afterburner (and other 3rd-party tools: Trixx, iTurbo, GPUTweak, etc.) and clean the display driver with DDU. Then install just the display driver and test without any 3rd-party tools installed. If you need to observe the clocks, use GPU-Z and log the telemetry to a file.

B) Test with the FuryXLCB3 (attached) BIOS to see if the problem gets *more intense* or occurs more frequently or quickly.

I just want to be absolutely sure that the problem is caused by a driver issue instead of a 3rd party application, before I call it in.

Thanks









FuryXLCB3.zip 102k .zip file


----------



## JunkaDK

So, a question for all the Fury (non-X) owners.

I've done a LOT of testing with overclocking. I can get it to run stable at approx 1160MHz with the memory at 550MHz, but I have to run the fans at 90%+ speed to keep it below 60°C; if it gets over 60, my PC crashes. I'm guessing it's the VRM that gets too hot, since I'm running at +48mV? Can anyone back this up?

If I lower the clock to 1080 with just +12mV, then I get away with running the fan at 30-40% to keep it cool, and it doesn't crash before it hits around 75.

At stock it will go way higher... close to 90.


----------



## Radox-0

Quote:


> Originally Posted by *Otterfluff*
> 
> Also said the Nitro takes three slots, not two; that means the heatsink is larger than the Tri-X, right? Perhaps it has better cooling from a larger build?
> 
> I wish there was info about water-blocking a Fury Nano, how much that decreases the throttling, and how far it can overclock. I have a feeling that with a proper waterblock removing the thermal throttling, it should overclock nicely, if not better than a Fury X. It has less power delivery because it needs LESS power to do the same as a regular Fury X does with more. Ideally, better-binned chips like that should be able to get higher with less, and they do have some room to move. It would be very interesting to see the results and testing from someone who has a Nano under water with a decent 50C or less water temp.
> 
> Even if a Nano was hitting 1050 core with zero thermal throttling, it would be a better deal to buy it with a custom waterblock than a Fury X by itself for anyone plumbing one into a custom loop.


My Nano is under a water block; it actually performs pretty amazingly and handily outperforms the Fury Tri-X it replaced.

On the stock reference cooler mine did 1085MHz on the core in Firestrike, while in games it was most comfortable at 1050MHz, with the fan ramped up of course to prevent throttling. In a custom loop with the EK block it reaches 1125MHz on the core, and for gaming I leave it at 1100MHz; this is with the 50% extra on the power limit.

Of course, at this stage there is zero throttling, unlike what I got with the stock cooler, and for gaming it maintains 1100MHz on the core and 550 on the memory throughout. Great little card.


----------



## Lorem Ipsum

Quote:


> Originally Posted by *JunkaDK*
> 
> So, a question for all the Fury (non-X) owners.
> 
> I've done a LOT of testing with overclocking. I can get it to run stable at approx 1160MHz with the memory at 550MHz, but I have to run the fans at 90%+ speed to keep it below 60°C; if it gets over 60, my PC crashes. I'm guessing it's the VRM that gets too hot, since I'm running at +48mV? Can anyone back this up?
> 
> If I lower the clock to 1080 with just +12mV, then I get away with running the fan at 30-40% to keep it cool, and it doesn't crash before it hits around 75.
> 
> At stock it will go way higher... close to 90.


I have a similar trend with temperatures and stability, though it looks like I need a bit less fan with my Tri-X cooler. It's stable at 1165 if the fans are 100%, but that's too loud for gaming. At 1130, it seems fine at 65 degrees, which translates to about 40% fans on the Tri-X which is quiet enough for me, so that's what I use for games.

It's easy to see why they decided to watercool the Fury-X, though I imagine they could have got away with an air cooled variant at stock clocks.

Note that I just slide the voltage as high as it goes in Trixx to +72mV. Not sure if this is really the best thing to do or not...


----------



## Alastair

Quote:


> Originally Posted by *Lorem Ipsum*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JunkaDK*
> 
> So, a question for all the Fury (non-X) owners.
> 
> I've done a LOT of testing with overclocking. I can get it to run stable at approx 1160MHz with the memory at 550MHz, but I have to run the fans at 90%+ speed to keep it below 60°C; if it gets over 60, my PC crashes. I'm guessing it's the VRM that gets too hot, since I'm running at +48mV? Can anyone back this up?
> 
> If I lower the clock to 1080 with just +12mV, then I get away with running the fan at 30-40% to keep it cool, and it doesn't crash before it hits around 75.
> 
> At stock it will go way higher... close to 90.
> 
> 
> 
> I have a similar trend with temperatures and stability, though it looks like I need a bit less fan with my Tri-X cooler. It's stable at 1165 if the fans are 100%, but that's too loud for gaming. At 1130, it seems fine at 65 degrees, which translates to about 40% fans on the Tri-X which is quiet enough for me, so that's what I use for games.
> 
> It's easy to see why they decided to watercool the Fury-X, though I imagine they could have got away with an air cooled variant at stock clocks.
> 
> Note that I just slide the voltage as high as it goes in Trixx to +72mV. Not sure if this is really the best thing to do or not...

The chip itself can handle up to 1.4V (so +200mV) in ideal situations from what I am told, i.e. full-cover water blocks. So I doubt +72mV on air will cause any sort of issues.


----------



## JunkaDK

Quote:


> Originally Posted by *Lorem Ipsum*
> 
> I have a similar trend with temperatures and stability, though it looks like I need a bit less fan with my Tri-X cooler. It's stable at 1165 if the fans are 100%, but that's too loud for gaming. At 1130, it seems fine at 65 degrees, which translates to about 40% fans on the Tri-X which is quiet enough for me, so that's what I use for games.
> 
> It's easy to see why they decided to watercool the Fury-X, though I imagine they could have got away with an air cooled variant at stock clocks.
> 
> Note that I just slide the voltage as high as it goes in Trixx to +72mV. Not sure if this is really the best thing to do or not..


I think I can go 1130 at +24mV at around 50% fan speed... that's just around when it starts to make "noise".







At 30-40% I can barely hear it.

Actually, what I wrote was wrong: MSI Afterburner goes up to +96mV, and that's what I run when I go for 1160+ with mem at 550, not +48mV. This is with 90-100% fan to make it stable.


----------



## JunkaDK

Quote:


> Originally Posted by *Alastair*
> 
> The chip itself can handle up to 1.4V (so +200mV) in ideal situations from what I am told, i.e. full-cover water blocks. So I doubt +72mV on air will cause any sort of issues.


So, according to what you are saying, it is the VRM overheating that causes the crashes?


----------



## The Stilt

I wouldn't consider 1.4V safe, for the sake of the VRM alone. Despite being a six-phase (VDDC) solution, it runs extremely hot even at stock. The default VDDC for the lowest-leaking Fiji XT ASIC is 1.25000V (before vdroop), so 1.40000V is pretty harsh (you're looking at >380W power draw for the GPU alone).
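The >380W figure follows roughly from power scaling with the square of core voltage at a fixed clock. A quick sketch of that estimate, where the 300W stock GPU-only draw is an illustrative assumption rather than a measured number:

```python
# Rough estimate of GPU power at a higher core voltage, assuming power
# scales with the square of voltage at a fixed clock. The 300W stock
# figure is an illustrative assumption, not a measured value.
def scaled_power(stock_watts, stock_v, new_v):
    return stock_watts * (new_v / stock_v) ** 2

print(round(scaled_power(300, 1.25, 1.4), 1))  # -> 376.3
```

Leakage rises with voltage too, so the real increase would be somewhat worse than this quadratic estimate suggests.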


----------



## Otterfluff

Quote:


> Originally Posted by *The Stilt*
> 
> I wouldn't consider 1.4V safe, for the sake of the VRM alone. Despite being a six-phase (VDDC) solution, it runs extremely hot even at stock. The default VDDC for the lowest-leaking Fiji XT ASIC is 1.25000V (before vdroop), so 1.40000V is pretty harsh (you're looking at >380W power draw for the GPU alone).


Only safe under water with a lot of spare radiator capacity.


----------



## Otterfluff

Quote:


> Originally Posted by *Radox-0*
> 
> trike.
> My Nano is under a water block, actually performs pretty amazing and handily out performs my Fury Tri-X it replaced.
> 
> On the stock reference cooler mine did 1085 Mhz on the core in firestrike while in games it was most comfortable at 1050 Mhz with fan ramped up of course to prevent throttling. In a custom loop with the EK block it reaches 1125 Mhz on the core and for gaming I leave it at 1100 Mhz, this is with the 50% extra on power limit.
> 
> Of course at this stage there is 0 throttling as I got with the stock cooler and for gaming maintains 1100 Mhz on the core and 550 on the Memory throughout. Great little card.


Very interesting. What is your water temperature under load? Have you tried raising the voltage using programs like Trixx or Afterburner? 1100MHz with no extra voltage is pretty amazing; I would love to hear what it could do with a little extra voltage.


----------



## Radox-0

Quote:


> Originally Posted by *Otterfluff*
> 
> Very interesting. What is your water temperature under load? Have you tried raising the voltage using programs like Trixx or Afterburner? 1100MHz with no extra voltage is pretty amazing; I would love to hear what it could do with a little extra voltage.


That was with +6 on the voltage in AB. Admittedly I'm not sure how far you should push these, but from the above it seems I can eke out a bit more, so I will add some more.

Water temps do get on the warm side, as it's on a slimline 240mm rad with a 4690K (it's an HTPC). Water temps get to about 40-42 degrees after 4 hours or so of playing intensive games with the 4690K @ 4.6GHz, the Nano at 1110MHz with 550 on the memory, and case fans at 55%, so I'm not too worried. Of course it's not an issue as such, because the Nano only gets to about 45 degrees, which is still well below where it throttles.


----------



## Scorpion49

Quote:


> Originally Posted by *JunkaDK*
> 
> So, a question for all the Fury (non-X) owners.
> 
> I've done a LOT of testing with overclocking. I can get it to run stable at approx 1160MHz with the memory at 550MHz, but I have to run the fans at 90%+ speed to keep it below 60°C; if it gets over 60, my PC crashes. I'm guessing it's the VRM that gets too hot, since I'm running at +48mV? Can anyone back this up?
> 
> If I lower the clock to 1080 with just +12mV, then I get away with running the fan at 30-40% to keep it cool, and it doesn't crash before it hits around 75.
> 
> At stock it will go way higher... close to 90.


All 4 of my Fury Tri-X cards had terrible, terrible thermal paste applications. Once I re-did them, the temps dropped considerably. Something to consider, anyways.

For example, this one was crashing a lot and one of the HBM stacks had no thermal paste on it at all (also check out the dry-as-cardboard VRM thermal pad, replaced with fujipoly extreme):


----------



## baii

Quote:


> Originally Posted by *Scorpion49*
> 
> All 4 of my Fury Tri-X cards had terrible, terrible thermal paste applications. Once I re-did them, the temps dropped considerably. Something to consider, anyways.
> 
> For example, this one was crashing a lot and one of the HBM stacks had no thermal paste on it at all (also check out the dry-as-cardboard VRM thermal pad, replaced with fujipoly extreme):


My Tri-X had white gel-like paste. BTW, since you have 4 of them, do you hear any buzzing/fluttering sound under load?


----------



## Scorpion49

Quote:


> Originally Posted by *baii*
> 
> My trix had white gel like paste. Btw, since you have 4 of them, do you hear any buzzing/fluttering sound under load?


I went through 4 Tri-X cards and 1 XFX Fury before giving up for a while; they all had BAD coil whine, not just a flutter.


Spoiler: Warning: Spoiler!



Tri-X




XFX







The Nitro I have is much better, it gets a very quiet flutter in some games, but I can only hear it when the case is open and I put my head inside. It also gets a little high-pitched whine at very high frame rates (400+) but most cards do that anyways.


----------



## fat4l

Quote:


> Originally Posted by *josephimports*
> 
> Unfortunately, the problem persists.
> 
> Bios verification
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Usage
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> SS of test spot. Crimson settings at default.
> 
> 
> Spoiler: Warning: Spoiler!


Just one question:
how do you flash in Windows? What atiflash are you using, and is it any different from flashing in DOS?

I've always used DOS to flash with atiflash.









Quote:


> Originally Posted by *The Stilt*
> 
> I wouldn't consider 1.4V safe, for the sake of the VRM alone. Despite being a six-phase (VDDC) solution, it runs extremely hot even at stock. The default VDDC for the lowest-leaking Fiji XT ASIC is 1.25000V (before vdroop), so 1.40000V is pretty harsh (you're looking at >380W power draw for the GPU alone).


What about the highest VDDC (highest-leaking chips)?


----------



## Radox-0

Quote:


> Originally Posted by *Otterfluff*
> 
> Very interesting. What is your water temperature under load? Have you tried raising the voltage using programs like Trixx or Afterburner? 1100MHz with no extra voltage is pretty amazing; I would love to hear what it could do with a little extra voltage.


Seems my card does not like voltage; it does not seem to make much of a difference. It will overclock the same with +6mV added as it did with +54, so 1125/550 in benchmarks, but in games it works best and has no issues around the 1105MHz mark.


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> how do you flash in Windows? What atiflash are you using, and is it any different from flashing in DOS?
> 
> I've always used DOS to flash with atiflash.


There is ATIWinflash; I don't use it, as it can go pear-shaped if you have, say, an unstable OS/overclock on the system. It was also way slower than a DOS flash the last time I used it, when it came included in a package from Asus to update the DCUII 290X I used to own.


----------



## baii

Quote:


> Originally Posted by *Scorpion49*
> 
> I went through 4 Tri-X and 1 XFX Fury before giving up for a while, they all had BAD coil whine, not just a flutter.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Tri-X
> 
> 
> 
> 
> XFX
> 
> 
> 
> 
> 
> 
> 
> The Nitro I have is much better, it gets a very quiet flutter in some games, but I can only hear it when the case is open and I put my head inside. It also gets a little high-pitched whine at very high frame rates (400+) but most cards do that anyways.


Eh, in the exact same boat. Recently tried a Fury X, and the same thing. I even tried Plasti Dip on the card everywhere, and it was zero help.
I have 2 Tri-X coming in; we'll see what happens. The dadadededade just gets so annoying.

The reason I didn't get a Nitro is just that I doubt they have any chance of getting unlocked, and the NE review isn't exactly saying it has zero whine.

Maybe I can try an Asus, but then fan noise will be back to haunt me.


----------



## Otterfluff

Quote:


> Originally Posted by *Radox-0*
> 
> Seems my card does not like voltage; it does not seem to make much of a difference. It will overclock the same with +6mV added as it did with +54, so 1125/550 in benchmarks, but in games it works best and has no issues around the 1105MHz mark.


Have you tried doing it in small increments? What program are you using to test with? Normally red artifacts and flashes represent undervoltage, while black screens, frame drops and crashing represent too much voltage. You have to very slowly edge things up and down based on those markers and try to get something stable. Someone correct me if I'm wrong; I'm just going off my own experience.

I would not normally dump +54mV in right off the bat. I would start small and edge it up very slowly.

Also your water temps are impressive for a single 240mm radiator!


----------



## Scorpion49

Quote:


> Originally Posted by *baii*
> 
> Eh, in the exact same boat. Recently tried a Fury X, and the same thing. I even tried Plasti Dip on the card everywhere, and it was zero help.
> I have 2 Tri-X coming in; we'll see what happens. The dadadededade just gets so annoying.
> 
> The reason I didn't get a Nitro is just that I doubt they have any chance of getting unlocked, and the NE review isn't exactly saying it has zero whine.
> 
> Maybe I can try an Asus, but then fan noise will be back to haunt me.


What PSU do you have? My whine was much more pronounced with a Corsair RM850i than it is with my Rosewill or Antec units.


----------



## Radox-0

Quote:


> Originally Posted by *Otterfluff*
> 
> Have you tried doing it in small increments? What program are you using to test with? Normally red artifacts and flashes represent undervoltage, while black screens, frame drops and crashing represent too much voltage. You have to very slowly edge things up and down based on those markers and try to get something stable. Someone correct me if I'm wrong; I'm just going off my own experience.
> 
> I would not normally dump +54mV in right off the bat. I would start small and edge it up very slowly.
> 
> Also your water temps are impressive for a single 240mm radiator!


Yes, sorry, what I meant was that pretty much any voltage within that range, going up in increments, did not make a difference.

Essentially the AMD drivers were crashing to the desktop on multiple programs and tests. I suspect this is as far as this sample will go. Not too bad, as my prior Nano did not overclock half as well.

Yep, pretty happy with the temps, more so considering it's a full-cover motherboard block also cooling the mosfets and PCH in addition to the CPU and GPU; the Vardar fans are doing a reasonable job getting the heat out.


----------



## baii

Quote:


> Originally Posted by *Scorpion49*
> 
> What PSU do you have? My whine was much more pronounced with a Corsair RM850i than it is with my Rosewill or Antec units.


I tried a CM V650 and a Seasonic X650; it made no difference.

I had several, yes several, 290/290Xs that did the same thing, and 2-3 290/290Xs that hardly made any buzz using the same PSU. So I doubt it's the PSU.


----------



## josephimports

Quote:


> Originally Posted by *fat4l*
> 
> Just one question:
> how do you flash in Windows? What atiflash are you using, and is it any different from flashing in DOS?


Thanks to some help from @Jflisk, I've found it to be quick and simple.

Download Atiflash 2.71.

Create a new folder in C:\ and name it atiflash.

Extract the atiflash zip file into the newly created folder.

Copy the BIOS file you would like to flash into this folder as well.

Run a CMD prompt as admin.

Type cd c:\atiflash

For GPU adapter info, type atiflash -i

To save a GPU BIOS, use the info found with -i. Type atiflash -s (GPU NUM) (BIOS FILE NAME) [BIOS FILE SIZE]

Example: atiflash -s 0 Fiji 40000

To flash a BIOS, type atiflash -p 0 biosname.rom (0 for the top card or 1 for the second card, depending on card position).

Example: "atiflash -p 0 Fiji.rom". This process takes only seconds to complete. Restart the PC once complete.

List of commands can be found here
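The invocations above can be sketched as a small Python helper that just assembles the command lines. The -i / -s / -p flags are the ones from the post; nothing here actually executes atiflash, it only builds the argv lists you would run from an admin prompt:

```python
# Sketch of the atiflash invocations described above, built as argv lists.
# The -i / -s / -p flags come from the post; run the printed commands
# yourself from an admin prompt in the folder holding atiflash.
def info_cmd():
    # List installed adapters and their GPU numbers.
    return ["atiflash", "-i"]

def save_cmd(gpu_num, out_file, size=None):
    # Dump the current BIOS of adapter gpu_num to out_file.
    cmd = ["atiflash", "-s", str(gpu_num), out_file]
    if size is not None:
        cmd.append(str(size))  # optional BIOS file size argument
    return cmd

def flash_cmd(gpu_num, rom_file):
    # Program rom_file onto adapter gpu_num.
    return ["atiflash", "-p", str(gpu_num), rom_file]

print(" ".join(save_cmd(0, "Fiji", 40000)))  # atiflash -s 0 Fiji 40000
print(" ".join(flash_cmd(0, "Fiji.rom")))    # atiflash -p 0 Fiji.rom
```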


----------



## Noirgheos

So, do I have anything to look forward to when AMD finally fixes the downclocking in Crimson? I heard the driver was supposed to help with stuttering in games... does it?

I get so much of it in 15.11.1 CCC that I feel like smashing my card to pieces sometimes.


----------



## baii

Got the 2 new Tri-X cards.

The good news is they can both be unlocked to 3840, and they whine less than my first card.

The bad thing is, they still whine worse than my old "second best" R9 290 (out of like 6-7 290s).

Why can't they make a card that doesn't whine? It is louder than the fan...


----------



## xer0h0ur

Those who feel like added voltage doesn't make a difference might want to try Trixx instead of Afterburner, or vice versa.


----------



## Tobiman

Looks like I may be getting a Fury instead of a Nano. Leaning towards the Strix for looks, but it seems to be a problematic model, at least on Newegg.


----------



## 98uk

Quote:


> Originally Posted by *Scorpion49*
> 
> What PSU do you have? My whine was much more pronounced with a Corsair RM850i than it is with my Rosewill or Antec units.


I was preparing for coil whine after I got my Sapphire Fury, but I'm happy to say there is nothing at all, not even under full load. This is using my Seasonic SSR-650RM (G Series).

I have to say, it's probably the best card I've owned in quite a few years. Powerful, extremely quiet (even under load) and very cool. My only gripe is that there isn't an 8GB model, because even playing BF4 I'm hitting 3.5GB of used graphics memory...


----------



## dagget3450

Quote:


> Originally Posted by *baii*
> 
> Got the 2 new Tri-X cards.
> 
> The good news is they can both be unlocked to 3840, and they whine less than my first card.
> 
> The bad thing is, they still whine worse than my old "second best" R9 290 (out of like 6-7 290s).
> 
> Why can't they make a card that doesn't whine? It is louder than the fan...


I think part of this coil whine issue is a by-product of the low fan noise. The thing runs too cool and quiet on fans for its own good.


----------



## The Stilt

The noise (whine) is caused by physical movement, which changes its frequency as the current through the inductor (drawn by the GPU) varies. All inductors "whine", but the noise level / frequency varies between different inductors / structures. Generally the surface-mounted inductors produce more noise, since they are not attached as firmly to the PCB. Some of the AIBs use through-hole inductors for their own designs (not available for the Fury X or Nano). These inductors are generally less noisy because they are usually better shielded than the SMD inductors and cannot move around as much due to their mounting method.

If you don't care about the warranty, you can reduce the noise produced by the inductor by using some heat-resistant epoxy. The additional bond between the inductor and the PCB will reduce the movement of the inductor and that way reduce the noise. Some AIBs have used this method on their cards straight from the factory (PowerColor).


----------



## baii

I did try Plasti Dip on all the inductors on the first card; it made no difference. It just bothers me that I had reference 290s (new and used) that whined much less. They should be improving, not saying "we can't fix it".


----------



## The Stilt

You mean Plasti Dip type of product? If so, it is probably too thin or elastic to prevent the inductor from oscillating.

Something like this should work better:





Note how PowerColor did a sloppy job applying the glue. It is supposed to be packed between the inductor and the PCB, not to be sprayed on the inductor.


----------



## baii

Probably. Gonna try an EVGA G2 before returning the cards. Any other PSU recommendations that can be grabbed from Micro Center? This whole whine thing has been bothering me since the beginning of the build. It is so easy to make every other component dead silent, and this is the one thing I have no control over.


----------



## Scorpion49

Quote:


> Originally Posted by *baii*
> 
> Probably. Gonna try an EVGA G2 before returning the cards. Any other PSU recommendation that can be grabbed from Microcenter? This whole whine thing has been bothering me since the beginning of the build. It is so easy to make every other component dead silent, and this is the one thing I have no control of.


Probably a good idea. I'm going to MC tomorrow myself to pick up a 4K monitor.


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Probably a good idea. I'm going to MC tomorrow myself to pick up a 4K monitor.


Sweet. Keep us posted which one you get and how it does.


----------



## JunkaDK

If I change the thermal pads and paste on my GPU, can I use my CPU paste (Arctic Silver) on the GPU chip?

https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/front_full.jpg

This is what the original looks like: https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/cooler2.jpg

Or would it be better to buy a thermal pad that would cover the GPU and mem chips instead of applying paste?
https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/memory.jpg


----------



## JunkaDK

Quote:


> Originally Posted by *Scorpion49*
> 
> Getting hot, not particularly but there isn't an HBM temp sensor that I know of, and one of them was only half covered even with all of that stuff on there. Was just curious to see what they did. Too much TIM is bad, its an insulator when you get more than a thin layer.
> 
> EDIT: I also noticed there is a tiny chip in the core by one of the HBM stacks. I don't like that.
> 
> This is what I ended up doing:


Curious to know how this ended? Did you notice any difference after repasting and changing the thermal pads?

My problem (I think) is that the VRMs get too hot under heavy load (OC to 1165 MHz). It will shut down over 62 degrees, and I can only keep it under 62 when the fan is at 100% = NOISE







My card is the Asus R9 Fury STRIX

Also... is the Fiji thermal pad 1 mm, or thicker?


----------



## Scorpion49

Quote:


> Originally Posted by *JunkaDK*
> 
> If i change the thermal pads and paste on my GPU, can i use my CPU paste (arctic silver) on the GPU chip?
> 
> https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/front_full.jpg
> 
> this is what the original looks like : https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/cooler2.jpg
> 
> OR would it be better to buy a thermal pad that would cover the GPU and mem chips? instead of applying paste.
> https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/memory.jpg


That is not a thermal pad on the GPU/HBM! Don't put thermal pads there! Also, I would not put AS5 on it either; get something good like Gelid GC Extreme that spreads easily (I use NT-H1 personally).

Quote:


> Originally Posted by *JunkaDK*
> 
> Curious to know how this ended? Did you notice any difference after repasting and changing the thermal pads?
> 
> My problem (I think) is that the VRMs get too hot under heavy load (OC to 1165 MHz). It will shut down over 62 degrees, and I can only keep it under 62 when the fan is at 100% = NOISE
> 
> 
> 
> 
> 
> 
> 
> My card is the Asus R9 Fury STRIX
> 
> Also... is the fuji thermal pad 1mm? or thicker?


On that particular card my core temps dropped almost 15C; I'm not sure on the VRM temps because the Tri-X didn't have a sensor for them. I reasoned that it was wise to replace the pads while I was in there because the old stuff was like a stick of gum, hard as a rock and crumbling. I know I was able to get an extra 25 MHz out of it (which was very good when Fiji first came out with no voltage control, custom BIOS, etc.).

Not sure on the thickness of the pads, I just eyeballed it to compare with some I had already and then made sure it had good contact when I put it back together, but I think they were 1.0mm. The Asus may be different.


----------



## JunkaDK

Quote:


> Originally Posted by *Scorpion49*
> 
> That is not thermal pad on the GPU/HBM! Don't put thermal pads there! Also, I would not put AS5 on it either, get something good like Gelid GC extreme that spreads easily (I use NT-H1 personally).
> On that particular card my core temps dropped almost 15C, I'm not sure on the VRM temps because the Tri-X didn't have a sensor for it. I reasoned that it was wise to replace it when I was in there because the old stuff was like a stick of gum, hard as a rock and crumbling. I know I was able to get an extra 25mhz out of it (which was very good when Fiji first came out with no voltage control/custom BIOS, etc).
> 
> Not sure on the thickness of the pads, I just eyeballed it to compare with some I had already and then made sure it had good contact when I put it back together, but I think they were 1.0mm. The Asus may be different.


I'm gonna try it on my Asus. Thanks a lot for these tips. I will let you know how it works out.







+rep


----------



## Noirgheos

Does anyone else's Tri-X Fury have the PCB kind of bending towards the port side of the card? It's almost touching the backplate. Is this a case of the backplate being too tightly screwed to the PCB? I removed the card to check if it was just stress from being installed, still there.

Should I loosen that area of the backplate?


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet. Keep us posted which one you get and how it does.


Well, I drove an hour and a half out there and got it, only to unbox extreme disappointment (again). I swear to god buying monitors is the worst thing in the entire world when it comes to PCs. I grabbed the AOC U2879VF, which I think a few people here have; it's a 4K FreeSync monitor (the same 28" panel as in all of the other 28" models). Mine has a huge cluster of dead pixels dead center in the screen, and the entire right side light-bleeds so badly along the edge that I can see it on white backgrounds, let alone black.

Now I get to decide if I want to drive all the way back with it or send it to AOC to get replaced instead.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I drove an hour and a half out there and got it, only to unbox extreme disappointment (again). I swear to god buying monitors is the worst thing in the entire world when it comes to PC's. I grabbed the AOC U2879VF which I think a few people here have, its a 4K Freesync panel (same 28" panel as in all of the other 28"). Mine has a huge cluster of dead pixels dead center in the screen and the entire right side light-bleeds so badly along the edge that I can see it on white backgrounds let alone black.
> 
> Now I get to decide if I want to drive all the way back with it or send it to AOC to get replaced instead.


That sucks, I love mine.
Drive back IMO!


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> That sucks, I love mine.
> Drive back IMO!


Yeah your posts about it are what convinced me to try it. I had the Samsung U28E590D which is the same panel but I could not for the life of me get freesync working and I sent it back. Now that Freesync drivers have matured a little and I had a great XL2730Z I decided to give it another go. What overdrive settings do you use? I'm going to try and revive the dead pixels and see if I can't fix them.


----------



## p4inkill3r

The only settings I had to change were in the OSD, Crimson automatically recognized the display and enabled Freesync.


----------



## SLK

Quote:


> Originally Posted by *Noirgheos*
> 
> Does anyone else's Tri-X Fury have the PCB kind of bending towards the port side of the card? It's almost touching the backplate. Is this a case of the backplate being too tightly screwed to the PCB? I removed the card to check if it was just stress from being installed, still there.
> 
> Should I loosen that area of the backplate?


I can't tell you how many times I had to adjust the heatsink and VGA bracket due to Sapphire warping the PCB by overtightening/misaligning the screws. They are also great at crushing the aluminum heatsink fins on the cooler.

I would loosen every screw on the card and tighten them back up. Something like a folded thermal pad may also be obstructing the PCB and causing the warping.


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> The only settings I had to change were in the OSD, Crimson automatically recognized the display and enabled Freesync.


Yeah, FreeSync seems to be working great. I managed to get the dead pixels down to just one, which I could ignore. But the light bleed on the right is awful; even at 25% brightness I can see it radiating an inch out from the bezel when I turn on a game or anything not solid white.


----------



## Tobiman

Quote:


> Originally Posted by *The Stilt*
> 
> The noise (whine) is caused by physical movement, which changes it´s frequency as the current through the inductor (drawn by the GPU) varies. All inductors inductors "whine", but the noise level / frequency varies between the different inductors / structures. Generally the surface mounted inductors produce more noise, since they are not attached as firmly to the PCB. Some of the AIBs use through-hole inductors for their own designs (not available for Fury X or Nano). These inductors are generally less noisy because they are usually better shielded than the SMD inductors and they cannot move around as much due their mounting method.
> 
> If you don´t care about the warranty, you can reduce the noise produced by the inductor by using some heat-resistant epoxy. The additional bond between the inductor and the PCB will reduce the movement of the inductor and that way reduce the noise. Some AIBs have used this methods on their cards straight from the factory (PowerColor).


Yup. My PCS+ R9 290 has this and there's zero coil whine.


----------



## p4inkill3r

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah freesync seems to be working great. I managed to get the dead pixels down to just one, that I could ignore. But the light bleed on the right is awful, even at 25% brightness I can see it radiating an inch out from the bezel when I turn on a game or anything not solid white.


That's too bad. The light bleed on mine is there but not to a degree that interferes with my experience.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> The only settings I had to change were in the OSD, Crimson automatically recognized the display and enabled Freesync.


Merely detected Freesync or turned it on globally? I ask because mine was detected but default off on the Acer XF270HU. Haven't tested the BenQ yet...waiting on a cable.


----------



## Scorpion49

Quote:


> Originally Posted by *p4inkill3r*
> 
> That's too bad. The light bleed on mine is there but not to a degree that interferes with my experience.


Mine looks like this at 10% brightness:


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*
> 
> Merely detected Freesync or turned it on globally? I ask because mine was detected but default off on the Acer XF270HU. Haven't tested the BenQ yet...waiting on a cable.


Enabled globally.

Quote:


> Originally Posted by *Scorpion49*
> 
> Mine looks like this at 10% brightness:


It may not translate via picture all the way, but that would probably be an acceptable amount of bleed for me; I do tend to be more accepting of flaws than some, however.


----------



## Thoth420

Thanks Painkiller.


----------



## maximusdd

My 250d build fury x, h100i


----------



## Sonikku13

Hopefully ordering my Radeon R9 Nano this weekend, the white one from ASUS. Bad timing in terms of Pascal and Polaris coming out in 2H 2016, but I gotta do it, I'm on a derpy A10-7850K iGPU atm.


----------



## Thoth420

Quote:


> Originally Posted by *Sonikku13*
> 
> Hopefully ordering my Radeon R9 Nano this weekend, the white one from ASUS. Bad timing in terms of Pascal and Polaris coming out in 2H 2016, but I gotta do it, I'm on a derpy A10-7850K iGPU atm.


Nice, the white one is sick looking! If it makes you feel better, I plumbed a Fury X into my loop, backplate and all, and as soon as Polaris is out it's getting replaced.









I'm a GPU junkie


----------



## Arizonian

Ok, finally have time to post. So last week I ran some benchmarks comparing the CCC 15.7.1, Crimson 15.12, and Crimson 16.1 drivers. Went back to Crimson 16.1 and not even 10 mins into playing Star Wars BF I got an RSOD. I cleaned the registry, which I didn't take the time to do the last time, and reinstalled Crimson 16.1. Everything is running smooth again. This weekend I went from 35th to 38th level in Star Wars BF overclocked @ 1150 on the core without a single issue. Seems the registry clean is key, at least for me.

Quote:


> Originally Posted by *maximusdd*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My 250d build fury x, h100i


Congrats - looks nice








Quote:


> Originally Posted by *Sonikku13*
> 
> Hopefully ordering my Radeon R9 Nano this weekend, the white one from ASUS. Bad timing in terms of Pascal and Polaris coming out in 2H 2016, but I gotta do it, I'm on a derpy A10-7850K iGPU atm.


Whether you're building new, replacing broken hardware, or upgrading something that can't handle gaming, you don't have much choice. A lot of people are in your boat all the time. At $499 I'd suggest an air-cooled Fury unless there is a reason you want a Nano, which I assume is by choice.

I did hear a rumor of another price break.

http://www.overclock.net/t/1588455/kitguru-source-amd-to-cut-price-of-fury-gpu#post_24813267
Quote:


> Originally Posted by *Thoth420*
> 
> Nice white one is sick looking! If it makes you feel better I plumbed a Fury X into my loop backplate and all and soon as Polaris is out it's getting replaced.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I'm a GPU junkie*


Me too.

I'm putting this perfectly fine Fury up for sale when reference Polaris comes out and shooting for an air-cooled aftermarket Polaris. In the meantime I'm in no hurry, playing at 1440p.


----------



## baii

Quote:


> Originally Posted by *Scorpion49*
> 
> Probably a good idea. I'm going to MC tomorrow myself to pick up a 4K monitor.


Just tested them with the G2; not much difference, maybe less skippy/scratchy, but it is there. I just can't make myself not notice it, since the rig is mostly for madVR and video watching. And the Fury has just a bit more than enough performance for the settings I like. A 980 won't cut it, so I would need to jump to a 980 Ti, which is much more expensive, like 30-40% more.
Maybe when the Nitro comes down in price. Newegg doesn't do refunds, grrr; I don't want to end up with a card that I can't use.


----------



## Noirgheos

Quote:


> Originally Posted by *Arizonian*
> 
> Ok finally have time to post. So last week I ran some benchmarks doing comparison of CCC 15.7, Crimson 15.1 Crimson 16.1 drivers. Went back to Crimson 16.1 and not even 10 mins into playing StarWars BF I got RSOD. I cleaned the registry which I didn't take the time to do the last time and reinstalled Crimson 16.1. Everything is running smooth again. This weekend I went from 35th to 38th level in StarWars BF over clocked @1150 on the core without a single issue. Seems registry clean is key, at least for me.
> 
> Congrats - looks nice
> 
> 
> 
> 
> 
> 
> 
> 
> Whether your building new, replacing broke, upgrading what's can't handle gaming you don't have much choice. A lot of people in your boat all the time. At $499 I'd suggest an air cooled Fury unless there is a reason you want a Nano which I assume it by choice.
> 
> I did hear rumor of another price break.
> 
> http://www.overclock.net/t/1588455/kitguru-source-amd-to-cut-price-of-fury-gpu#post_24813267
> Me too.
> 
> I'm putting this perfectly fine Fury on sale when Polaris reference comes out and shooting for an air cooled aftermarket Polaris. In the meantime I'm in no hurry playing at 1440p.


Which one has the least stuttering?


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I drove an hour and a half out there and got it, only to unbox extreme disappointment (again). I swear to god buying monitors is the worst thing in the entire world when it comes to PC's. I grabbed the AOC U2879VF which I think a few people here have, its a 4K Freesync panel (same 28" panel as in all of the other 28"). Mine has a huge cluster of dead pixels dead center in the screen and the entire right side light-bleeds so badly along the edge that I can see it on white backgrounds let alone black.
> 
> Now I get to decide if I want to drive all the way back with it or send it to AOC to get replaced instead.


Ack, sorry to hear this. It would be nice to be able to try out a monitor in store at places like MC or Fry's Electronics.
Quote:


> Originally Posted by *Noirgheos*
> 
> Which one has the least stuttering?


I'm sorry, but I do not have stuttering using Crimson; then again, I also don't have Fallout 4 or COD Black Ops 3. The only games I'm actively playing are Star Wars BF, CS:GO, Far Cry 4 and a little SOM. My next game is most likely Metal Gear Solid V: The Phantom Pain.

I only tried CCC 15.7.1 for a SOM benchmark and did not game on it to compare actual gaming; I came back to AMD on the Crimson release. I thought Crimson 16.1 was an improvement over Crimson 15.12 when I saw FPS go up in SOM. Only later I learned it wasn't an improvement but more of a fix to bring back the performance levels of CCC 15.7.1.

*Shadow of Mordor* 2560x1440 60 Hz IPS

CCC15.7.1
Max *113* Min *58* Avg *83*

Crimson 15.12
Max *66* Min *44* Avg *60*

Crimson 16.1
Max *116* Min *60* Avg *84*

I'd suggest Crimson 16.1 based on the improvements over Crimson 15.12, and if one encounters driver issues with Crimson 16.1, the best bet is to just roll back to CCC 15.7.1 until the next release.
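For a sense of scale, the average-FPS deltas between those runs can be worked out with a throwaway sketch (the numbers are copied from the results above):

```python
# Average-FPS deltas vs. the CCC 15.7.1 baseline (Shadow of Mordor, 2560x1440).
results = {"CCC 15.7.1": 83, "Crimson 15.12": 60, "Crimson 16.1": 84}
baseline = results["CCC 15.7.1"]
deltas = {name: round((avg - baseline) / baseline * 100, 1)
          for name, avg in results.items()}
for name, avg in results.items():
    print(f"{name}: {avg} FPS avg ({deltas[name]:+.1f}% vs CCC 15.7.1)")
```

Crimson 15.12 lands roughly 28% below the baseline, while 16.1 is back within about 1%, which matches the "fix, not improvement" reading above.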


----------



## maximusdd

Anything I should be adding to my setup?


----------



## JunkaDK

Quote:


> Originally Posted by *maximusdd*
> 
> Anything I should be adding to my setup?


It would help to know your current setup. All I see is a blurry image in the thread.


----------



## maximusdd

CPU: Intel i7 4790k @ 4.4GHz
CPU Cooler: Corsair H100i
Motherboard: Asus Maximus VI Impact
Case: Corsair Obsidian 250D
RAM: Kingston HyperX Beast 16 GB 2133MHz
GFX: SAPPHIRE Radeon R9 FURY X
HDD: Kingston HyperX 3K 240GB SSD
HDD: Samsung 850 Evo 500GB SSD
PSU: SilverStone Technology 600W SFX


----------



## JunkaDK

Quote:


> Originally Posted by *maximusdd*
> 
> CPU: Intel i7 4790k @ 4.4GHz
> CPU Cooler: Corsair H100i
> Motherboard: Asus Maximus VI Impact
> Case: Corsair Obsidian 250D
> RAM: Kingston HyperX Beast 16 GB 2133MHz
> GFX: SAPPHIRE Radeon R9 FURY X
> HDD: Kingston HyperX 3K 240GB SSD
> HDD: Samsung 850 Evo 500GB SSD
> PSU: SilverStone Technology 600W SFX


I would say that is a sweet setup. Can you even fit anything else in there?


----------



## maximusdd

Is it worth adding a fan on the front (push-pull config), behind the GPU radiator?

Temps are great at the mo


----------



## JunkaDK

Quote:


> Originally Posted by *maximusdd*
> 
> Is it worth adding a fan on the front -push pull config, behind gpu radiator?
> 
> Temps are great at the mo


Nope, not IMO. One fan is fine.


----------



## Scorpion49

I did some back-to-back testing with 15.11.1 vs 16.1 in Fallout 4. The Crimson drivers allow the clocks to run way too low to "save power", and the card doesn't respond fast enough to load demands, causing massive stuttering and irregular frame rates. 15.11.1 maintains my set clock speed at all times, while 16.1 is constantly dropping clocks, which seems ultimately useless on a GPU rated at 275W, to be honest. I bought this thing for performance, not to sip electricity like a budget card.

What's worse, with 16.1 it seems to get "stuck" in low clock states often, even in demanding sections of gameplay, so my frames might drop down to single digits while CPU usage stays low and GPU usage is 100%. I kept getting killed in one mission in a tunnel because the card would lock at 306 MHz and I couldn't play the damned game that way. Why in God's name does AMD do this crap to us?

Here are some examples:

(top 16.1)
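One way to quantify this kind of behaviour, rather than eyeballing an overlay, is to log sensor data (GPU-Z, for example, can write a CSV log) and count how often the card sits in a low clock state while under full load. A minimal sketch, assuming a CSV with `GPU Clock [MHz]` and `GPU Load [%]` columns (the exact header names vary by tool and version, so they are parameters here):

```python
import csv

def throttled_samples(path, clock_col="GPU Clock [MHz]", load_col="GPU Load [%]",
                      min_clock=900, min_load=90):
    """Count log samples where the GPU is heavily loaded but clocked low,
    i.e. the 'stuck in a low clock state' symptom described above."""
    stuck = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clock = float(row[clock_col])
                load = float(row[load_col])
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed or incomplete rows
            total += 1
            if load >= min_load and clock < min_clock:
                stuck += 1
    return stuck, total
```

Running it over a log captured during a gaming session gives a stuck/total ratio; a consistently high ratio points at the driver holding clocks down, rather than momentary dips.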


----------



## zimm16

ClockBlocker fixed this issue for me. AMD got too aggressive in their latest drivers as far as power savings go.

I have reverted back to 15.12 due to other issues in 16.1

GL

ClockBlocker


----------



## NBrock

Anyone else get weird screen flicker randomly?
So far it has only happened 3 times.
I will be doing stuff and all of a sudden the screen goes crazy (green and different colored lines and flickering, and the size gets all funny). If I unplug my HDMI cable and plug it back in, it goes away.

System is Windows 10 Pro x64
Maximus VI Impact
4770k
Sapphire Fury X (using newest beta drivers 16.1)
16GB 2400 GSkill ram
Corsair RM850 PSU

So far it has happened twice today while I am connected to my work computer via Screen Connect.
The first time it happened I believe I was loading either Fallout 4 or Battlefield 4.


----------



## Jflisk

Quote:


> Originally Posted by *NBrock*
> 
> Anyone else get weird screen flicker randomly?
> So far it has only happened 3 times.
> I will be doing stuff and all the sudden the screen goes crazy (green and different colored lines and flickering and the size gets all funny). If I unplug my HDMI cable and plug it back in it goes away.
> 
> System is Windows 10 Pro x64
> Maximus VI Impact
> 4770k
> Sapphire Fury X (using newest beta drivers 16.1)
> 16GB 2400 GSkill ram
> Corsair RM850 PSU
> 
> So far it has happened twice today while I am connected to my work computer via Screen Connect.
> The first time it happened I believe I was loading either Fallout 4 or Battle Field 4.


Yep, known issue to AMD. I get it also every so often; I will post the links in a few.

Right-clicking the screen, changing the resolution, and then changing it back works also.

What BIOS are you running? Might help; I have not seen the issue since I went to this BIOS. It also turns on UEFI:
https://www.techpowerup.com/vgabios/177517/sapphire-r9furyx-4096-150721.html

AMD link; they are well aware of it:
https://community.amd.com/thread/188642?q=fury%20x


----------



## Wagnelles

I'm pretty sure this question was asked before, but what case can you recommend for a Fury X CF setup? It looks a bit tricky to put two Fury X rads + an AIO CPU cooler all together without making everything look an ugly mess.


----------



## NBrock

I have BIOS 64.005990. What is everyone using to flash their GPU BIOS these days? ATIFlash?


----------



## p4inkill3r

Quote:


> Originally Posted by *Wagnelles*
> 
> I'm pretty sure this question was asked before, but what case can you recommend for a Fury X CF setup? It looks a bit tricky to put two Fury X rads + an AIO CPU cooler all together without making everything look an ugly mess.


Ugliness is relative, but my Corsair Air 540 can (and hopefully will) house 2x Fury X and my crappy H110.


----------



## Jflisk

Quote:


> Originally Posted by *NBrock*
> 
> I have Bios 64.005990. What is everyone using to flash their GPU bios these days? ATIFlash?


See post 6532:
http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-x2-fiji-owners-club/6530#post_24806707


----------



## Faydodo

Hi,
Getting a Fury Tri-X OC tomorrow and I'd like some advice for overclocking:
I want a card that can last 2 years and be stable, so no hardcore OC.
Should I unlock shaders + overclock memory and core, or just overclock?
What kind of OC would be safe and stable with a decent perf gain? 1100/550?
Thanks


----------



## SuperZan

Quote:


> Originally Posted by *Wagnelles*
> 
> I'm pretty sure this question was asked before, but what case can you recommend for a Fury X CF setup? It looks a bit tricky to put two Fury X rads + an AIO CPU cooler all together without making everything look an ugly mess.


I've gone Fury X / Fury Crossfire because I was able to get a tremendous deal on the Fury, but the Corsair 760t I've got would very easily accommodate a second Fury X along with my AIO.


----------



## rv8000

Has anyone with Fiji noticed texture corruption on alt-tabbing in any games?

Ever since Crimson, AFAIK, I'll window back into games such as GW2 or FO4 and textures will be missing or corrupted on specific items. I went to double-check whether my card was going bad with a few quick OCCT tests and noticed that once I windowed out of the full-screen test and then went back to full screen, the errors would jump from 0 to ~4 million.


----------



## ht_addict

Quote:


> Originally Posted by *SuperZan*
> 
> I've gone Fury X / Fury Crossfire because I was able to get a tremendous deal on the Fury, but the Corsair 760t I've got would very easily accommodate a second Fury X along with my AIO.


I had a 760t but moved up to the 900D. So much more room.


----------



## SuperZan

Quote:


> Originally Posted by *ht_addict*
> 
> I had a 760t but moved up to the 900D. So much more room.


I'd imagine so. I already feel like I have too much space with the 760t.


----------



## Maximization

Quote:


> Originally Posted by *rv8000*
> 
> Has anyone with Fiji noticed texture corruption on alt-tabbing in any games?
> 
> Ever since crimson afaik, I'll window back into games such as GW2, FO4, and textures will be missing or corrupted on specific items. I went to double check if my card was going bad with a few quick OCCT tests and noticed that once I windowed out of the full screen test and then went back to full screen the errors would jump from 0 to ~4 million.


I haven't noticed that with alt-tab. For specific items to just be missing is strange.


----------



## wdpir32k3

Do any of you guys know if AMD will be fixing the clock issues with the Furys, so I don't have to use ClockBlocker?


----------



## Faydodo

Hi, received my Fury Tri-X today.
Tried OCing with MSI AB:
power limit +20%
+36 mV
core 1120
HBM 550
max temp 50°C at 50% fan speed

Got this score: http://www.3dmark.com/3dm/10428834
Stock: http://www.3dmark.com/3dm/10424230

Should I push further or lower it a bit for a safe and stable OC? I don't have a way to check VRM temps, so I'd just like to be sure.
Thanks.


----------



## p4inkill3r

Keep pushing!

I dismantled and repasted my Fury X this morning using NT-H1 paste and the results were dramatic; 49C is the highest temperature I've seen so far in my testing at 1175MHz/580MHz, when my old temps were around 60C.

I didn't take pics, but the paste job from the factory was very uneven and haphazardly applied; I suggest everyone take a look at their own!


----------



## xer0h0ur

Quote:


> Originally Posted by *wdpir32k3*
> 
> Do any of you guys know if AMD will be fixing the clock issues with the Fury's? So I don't have to use clock blocker


No one has any idea when, if at all, AMD will get around to fixing bugs in their drivers. They still haven't fixed the game profiles in any of the Crimson drivers. Crossfire will not disable on a 295X2 unless you install the CCC after installing the Crimson driver so you can manually create an application profile in the CCC.


----------



## MAMOLII

Quote:


> Originally Posted by *p4inkill3r*
> 
> Keep pushing!
> 
> I dismantled and repasted my Fury X this morning using NT-H1 paste and the results were dramatic; 49C is the high temperature I've seen so far in my testing at 1175MHz/580Mhz when my old temps were around 60C.
> 
> I didn't take pics, but the paste job from the factory was very uneven and haphazardly applied; I suggest everyone take a look at their own!


Yep, every Fury X stock paste application is like that... did you change the thermal pad on the VRMs? I am afraid to do it without a backup thermal pad in the house in case I ruin the stock one by mistake.


----------



## xer0h0ur

What is there to be afraid of? All you need to do is make sure it's the same size and thickness. If you're already changing the paste, you might as well go for broke and change the VRM pad.


----------



## NBrock

Any idea what the stock pad's thickness is? I was contemplating swapping both paste and pad out.


----------



## xer0h0ur

I can't say with certainty, but if I had to guess it would be 1mm thick. Highly doubt it's 0.5mm or 1.5mm.


----------



## 98uk

What are the max voltages deemed safe for Fury to run at?


----------



## p4inkill3r

Quote:


> Originally Posted by *MAMOLII*
> 
> Yep, every Fury X stock paste application is like that... did you change the thermal pad on the VRMs? I am afraid to do it without a backup thermal pad in the house in case I ruin the stock one by mistake.


I didn't because it looked like there was good, solid contact between the pad and the pipe. I have a 12'' strip of 1mm phobya sitting here too but I figured that there wouldn't be too much difference in comparison to the TIM.


----------



## Faydodo

Hmm, so I tried overclocking my Fury Tri-X and it's stable in benchmarks, but it's weird in games.
I OCed at 1120/550, +50% power, +48 mV. I'm not getting any throttling and temps never exceed 50°C, but after like 20 min of Battlefront it stops working and I'm getting a driver error.
Also, AC Syndicate has artifacts in the menu even at stock.
Is it just the Crimson driver being bad, or an unstable OC?

----------



## Maximization

Quote:


> Originally Posted by *Faydodo*
> 
> Hmm so I tried overclocking my fury tri-x and it's stable in games and benchmarks but it's weird in games
> I OCed at 1120/550 +50% power +48mV , I'm not getting any throttle, temps never exceed 50°C but after like 20 mn of battlefront it stops working and i'm getting a driver error.
> Also AC syndicate has artifacts on the menu even at stock
> Is it just crimson driver being bad or unstable OC ?


IMO the memory overclock is too high.


----------



## Radox-0

Has anyone found a sweet spot for the memory overclock? My Nano can settle happily at 1130 MHz core and 570 MHz on the memory with extra voltage and under water, but it seems the frames / synthetic scores actually go down at that level. The card at 1114 and 508, for example, seems to perform considerably better. May need to try and find the sweet spot, methinks.
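The sweep Radox-0 describes can be made systematic: run the same repeatable benchmark at each memory step and take the clock with the best score. The scores below are made-up placeholders purely to illustrate the peak-then-drop shape:

```python
# Illustrative only: benchmark score vs. HBM clock (MHz). The scores are
# placeholders; in practice, re-run the same benchmark at each step.
sweep = {500: 14200, 520: 14350, 545: 14500, 570: 14380, 600: 13900}
best_clock = max(sweep, key=sweep.get)
print(f"sweet spot ~ {best_clock} MHz (score {sweep[best_clock]})")
```

With real scores in the dictionary, `best_clock` is the sweet spot for that particular benchmark; a card can be "stable" past that point while still scoring worse.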


----------



## maximusdd

Guys, is my 600W PSU enough for my rig, or should I have something like 850W?


----------



## p4inkill3r

Quote:


> Originally Posted by *Faydodo*
> 
> Hmm so I tried overclocking my fury tri-x and it's stable in games and benchmarks but it's weird in games
> I OCed at 1120/550 +50% power +48mV , I'm not getting any throttle, temps never exceed 50°C but after like 20 mn of battlefront it stops working and i'm getting a driver error.
> Also AC syndicate has artifacts on the menu even at stock
> Is it just crimson driver being bad or unstable OC ?


My experience with HBM overclocking seems to suggest that whether or not a setting is stable is almost application-specific.

GTA 5 in particular seems almost impossible for me to run OC'd.


----------



## p4inkill3r

Quote:


> Originally Posted by *maximusdd*
> 
> Guys is my 600w PSU enough of my rig or should I have something like 850w?


What are your specs? You should fill out the Rigbuilder so that they're always visible.


----------



## maximusdd

Will do, my specs are :
CPU: Intel i7 4790k @ 4.4GHZ
CPU Cooler: Corsair H100i
Motherboard: Asus Maximus Impact vi
Case: Corsair Obsidian 250D
RAM: Kingston HyperX Beast 16 GB 2133MHz
GFX:SAPPHIRE Radeon R9 FURY X
HDD: Kingston Hyper X 3K 240GB SSD
HDD: Samsung 850 Evo 500GB SSD
PSU: SilverStone Technology 600W SFX
Case: Corsair Obsidian 250D


----------



## p4inkill3r

Quote:


> Originally Posted by *maximusdd*
> 
> Will do, my specs are :
> CPU: Intel i7 4790k @ 4.4GHZ
> CPU Cooler: Corsair H100i
> Motherboard: Asus Maximus Impact vi
> Case: Corsair Obsidian 250D
> RAM: Kingston HyperX Beast 16 GB 2133MHz
> GFX:SAPPHIRE Radeon R9 FURY X
> HDD: Kingston Hyper X 3K 240GB SSD
> HDD: Samsung 850 Evo 500GB SSD
> PSU: SilverStone Technology 600W SFX
> Case: Corsair Obsidian 250D


600w is plenty.
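For a rough sanity check, here is a back-of-the-envelope sketch of that rig's peak draw. The wattage figures are approximate published TDP/board-power numbers, not measurements of this particular build:

```python
# Rough peak-draw estimate for the rig above. Each figure is an
# approximate published TDP / typical-board-power value, padded a
# little for overclocking headroom -- assumptions, not measurements.
parts = {
    "i7-4790K @ 4.4GHz (with OC headroom)": 120,
    "R9 Fury X (typical board power)": 275,
    "motherboard + RAM": 50,
    "2x SSD": 10,
    "H100i pump + case fans": 25,
}

peak_w = sum(parts.values())
print(f"Estimated worst-case draw: ~{peak_w} W")
print(f"Load on a 600 W unit: ~{peak_w / 600:.0%}")
```

Even with generous headroom, that lands well inside a quality 600W unit's budget.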


----------



## Faydodo

Hmm, MSI AB and GPU-Z give me 1.212-1.218V VDDC average on +60mV; are these normal values? (Fury Tri-X)
Well, these seem like normal values for a Tri-X. Should I just go +96mV (max) and try to get a stable OC, or would that be too much? (1.24-1.25V)


----------



## xkm1948

Fury X2 is finally here!


----------



## p4inkill3r

That looks like Falcon Northwest's logo on the top pic, but what's the source for the bottom one?


----------



## xer0h0ur

It's an article from wccftech.com, and it's a Falcon Northwest PC they were showcasing.

http://wccftech.com/amds-dual-fiji-based-radeon-r9-fury-x2-graphics-card-spotted-vrla/


----------



## baii

Double the pump whine?


----------



## SuperZan

Interesting. I still don't know what to make of the VR delay rationale. The target market isn't exactly over-large to begin with, and I suspect many went as I did and grabbed Crossfire Fiji. I traditionally like to purchase double flagship about six months into a new line's lifespan and was interested in a more compact build with the X2 but the wait pushed my time boundary a little too far.


----------



## xer0h0ur

Quote:


> Originally Posted by *SuperZan*
> 
> Interesting. I still don't know what to make of the VR delay rationale. The target market isn't exactly over-large to begin with, and I suspect many went as I did and grabbed Crossfire Fiji. I traditionally like to purchase double flagship about six months into a new line's lifespan and was interested in a more compact build with the X2 but the wait pushed my time boundary a little too far.


You're never going to have a solid answer on that one. AMD claims they were holding it for the VR push. Rumor had it they wouldn't have been able to have enough Fury X2 supply had they released on the previous schedule. Now you're also seeing OEMs wanting to put Fury X2s into their systems. So I would lean towards it being a bit of both.

Quote:


> Originally Posted by *baii*
> 
> Double the pump whine?


So how long will this one last as the strongest video card in the world? Too bad Nvidia gave up on dual-GPU cards for the Maxwell generation.


----------



## NBrock

Quote:


> Originally Posted by *baii*
> 
> Double the pump whine?


I don't have any pump whine on my Fury X so I would imagine this won't have it either.


----------



## SuperZan

Quote:


> Originally Posted by *NBrock*
> 
> I don't have any pump whine on my Fury X so I would imagine this won't have it either.


My Fury X is also devoid of pump whine and my Fury is quiet as well. I don't doubt that problems with pump/coil whine existed, of course, but it seems to have been greatly diminished as we've gone months into the production lifespan.


----------



## p4inkill3r

Quote:


> Originally Posted by *xer0h0ur*
> 
> Its an article from wccftech.com and it is a Falcon Northwest PC they were showcasing.
> 
> http://wccftech.com/amds-dual-fiji-based-radeon-r9-fury-x2-graphics-card-spotted-vrla/


I didn't even know they were still in business TBH.

Looks like a sick card though.


----------



## p4inkill3r

Quote:


> Originally Posted by *baii*
> 
> Double the pump whine?


----------



## Greenland

I just got the Fury Nitro, but the clock at default is 1020, not 1050. Also, how do I use the second BIOS? (There is a button near the rear I/O, but it doesn't change anything.) Help!


----------



## HandOfZeus

Hello guys,
so I was looking to upgrade my 7970 due to its age.

I can get a Sapphire Fury X for 450-500€ used. Is it worth it? I know the 980 Ti is slightly better, but I'm not planning to pay 200€ more for 5 more fps. My only concern is the 4GB of VRAM.

I'm planning to start using Unreal Engine, and I assume it will need some serious VRAM.

Can anyone give me some insights?


----------



## JunkaDK

Quote:


> Originally Posted by *Greenland*
> 
> I just got the Fury Nitro but the clock at default is 1020, not 1050. Also, where can I use the second bios ( there is a button near rear I/O but it doesn't change anything. Help!


The second BIOS is the same as the first. It's meant as a backup if you want to play around with flashing your BIOS: if something goes wrong, you can always switch to the other BIOS and boot safely. Dunno about the clock, though. I just run my card at 1080MHz as a nice quiet OC.


----------



## Greenland

The Nitro version doesn't have a slider to switch between BIOSes. I'm aware there are two BIOSes, one with the temp target at 75°C and the second at 80°C. The thing is my default clock is 1020 while the actual stock clock is 1050.


----------



## JunkaDK

Quote:


> Originally Posted by *Greenland*
> 
> The Nitro version doesn't have a slide to switch between bioses. I'm aware there are 2 bios, one with the temp target at 75 and the 2nd at 80. The thing is my default clock is 1020 while the actual default clock is at 1050.


So dual BIOS but no switch? I don't get that. My card only has one BIOS. I have an Asus R9 Fury STRIX.


----------



## baii

Any way to load the Afterburner fan profile and the Trixx memory OC at the same time on startup? The Trixx fan profile seems wonky for me; it doesn't really load on startup, and Afterburner can't do the memory OC.


----------



## MAMOLII

Damn card, my Fury X. I really regret not buying the 980 Ti instead! I had a 3870 X2, later a 5850, and a 7950 before the Fury. After the Crimson mess I tried the last Catalyst, but raising the clocks even 1MHz gives me a lot of artifacts in games! It's crazy, totally random: I can play for two hours at 1130/550, and the next day I see artifacts even at 1100/510. I had to format the PC three times because Crimson + DDU drove Windows crazy; even MSI Afterburner could not see my card.

... I am gonna open the card and change the paste, to see whether all the HBM chips are under paste or whether there's too much paste. After that I give up!


----------



## Alastair

Quote:


> Originally Posted by *MAMOLII*
> 
> damn card my furyx i really regret that i didn't buy the 980ti instead! I had a 3870x2 later one 5850 and a 7950 before fury...after crimson crap i tried the last catalyst but even 1 mhz upping the clocks i get a lot of artifacts in games!this its crazy its totally random i can play for 2 hours at 1130-550 and the next day even in 1100-510 i see artifacts i had to format the pc 3 times cause crimson+DDU make the windows crazy even msi afterburner could not see my card
> 
> 
> 
> 
> 
> 
> 
> ... i am gonna try to open the card and change paste see if all HMB chips are under paste or if its too much paste after that i give up!


ok.


----------



## ebinkerd

Has anyone flashed their Nano with the Fury X BIOS? I really like the Nano, but the voltage jumps around a lot compared to the Fury X. I've been messing around with both for the past few days and have seen that the Fury X maintains a stable clock speed with small variations during normal use (internet, docs, etc.), while on the Nano the core clock and voltage are constantly changing. So I was curious how the Nano would perform with the Fury X BIOS, and everything worked: it booted, fan speed works, and GPU-Z recognizes it as a Fury X. However, in Unigine the reported FPS was near half of what I was getting with the Nano BIOS, though visually it did not feel like the FPS reporting was accurate. I also noticed the idle clock speed was stable, just like the Fury X. I did try overclocking the card and found I could reach core clocks that I otherwise could not: 1200 core and 550 memory with +24mV was the most I tried, and it stayed under 60°C in Fire Strike. I switched back to the Nano BIOS after a driver reset and boot failure. So I am curious if anyone else has played with this?


----------



## Radox-0

Quote:


> Originally Posted by *ebinkerd*
> 
> Has anyone flashed their Nano with the Fury X bios? I really like the Nano, but it seems that the voltage jumps around a lot compared to the Fury X. I've been messing around with both for the past few days and have seen that the Fury X maintains a stable clock speed with small variations during normal use, ie internet, docs, ... Compare to the Nano though, it seems that the core clock is constantly changing, as well as the voltage. So I was curious to see how the Nano performed with the Fury X bios and everything worked well. Booted, fans speed works, GPU-Z recognizes it as a Fury X. However, when using Unigine I found the FPS were near half of what I was getting with the Nano bios, but visually it did not feel like the FPS reporting was accurate. I also noticed the clock speed at idle was stable just like the Fury X. I did try overclocking the card and found that I was able to reach core clocks that I otherwise would not be able to. 1200 Core and 550 Mem with +24Mv was the most I was able to try and stayed under 60c in firestrike. I switched back to the Nano bios after a driver reset and boot failure. So i am curious if anyone has played with this?


Sounds odd that it's fluctuating. I know the Nano is more aggressive about reducing clock speeds as it gets warmer, which may be the issue. If you keep it cool enough by notching up the fan RPM, it should stay fairly steady on core and memory clocks. I found I could maintain 1085MHz and 550 with no fluctuation with the cooler cranked up, and it's even better now under water, which has removed the temperature aspect from the equation.

If you have not already, I would just crank up the fan and see if it holds stable.


----------



## Sonikku13

Bought my Radeon R9 Nano last Saturday. Hopefully will play with it on the first Friday of February.


----------



## ebinkerd

Quote:


> Originally Posted by *Radox-0*
> 
> Sounds odd that its fluctuating. I know the Nano is more aggressive in reducing clock speeds as it gets warmer which may be the issue. If you keep it cool enough by notching up the fan RPM is should stay fairly steady core and memory clock. Found I could maintain 1085 Mhz and 550 with no fluctuation with cooler cranked up and now even better under water which has removed the temperature aspect from the equation.
> 
> If you have not already, I would just crank up the fan and see if it holds stable.


It's not that I am having any issues with the Nano; it works perfectly. I just want the overclockability the Fury X has. The throttling is a headache and makes it difficult to push the card. I currently have my Nano on the stock BIOS set to 1100MHz and 550 with no issues. When I overclocked the Fury X I was able to push it past 1200MHz with voltage at +24mV, and the same with the Nano on the Fury X BIOS. So it seems to me there is more headroom in the Nano.

The FPS reporting is the other issue. I am not sure how Fire Strike or Unigine calculate FPS, or whether the card sends the info, but what it was reporting did not feel accurate.


----------



## JunkaDK

Quote:


> Originally Posted by *ebinkerd*
> 
> Its not that I am having any issues with the Nano, it works perfectly, I just want the overclock-ability the Fury X has. The throttling is just a headache and makes it difficult to push the card. I currently have my Nano on stock bios set to 1100Mhz amd 550 with no issues. When I overclocked the Fury X I was able to push it past 1200Mhz with voltage at +24mv and the same with the Nano with the Fury X bios. So to me it seems there is more head room for the Nano.
> 
> The FPS reporting is the other issue, I am not sure how Firestrike or Unigine calculate FPS, or if the card sends the info, but what is was reporting did not feel like it was accurate.


I would not try to push the Nano upwards of 1200MHz.

Is your Nano watercooled? If it's anything like the R9 Fury (non-X), the VRMs will get VERY hot. The Fury X is watercooled, so I think that's the main difference and what makes it better at overclocking. My R9 Fury STRIX shuts down at 1165MHz unless the GPU fans run at 90%+... HURRICANE


----------



## ebinkerd

Quote:


> Originally Posted by *JunkaDK*
> 
> I would not try to push the Nano upwards of 1200Mhz
> 
> 
> 
> 
> 
> 
> 
> Is your nano watercooled? If its anything like the R9 (non-x) the vrm's wil get VERY hot .. The Fury X is watercooled so i think thats the main difference and what makes it better at overclocking. My R9 Strix shuts down at 1165Mhz unless the GPS fans run at +90%... HURRICANE


Stock Nano. I ran one benchmark at 1200 and core temps didn't break 60°C. Of course, no clue on the VRM. It would be hard to say how the Nano's VRM reacts; it doesn't seem to share any characteristics with the Fury, and until something conclusive comes along I'll just take it slow. I need to get a temp probe, though. All this is kinda useless until I can verify the FPS problem.


----------



## JunkaDK

Quote:


> Originally Posted by *ebinkerd*
> 
> Stock Nano. I ran one benchmark at 1200 and core temps didnt break 60c. Of course no clue on the VRM. But it would be hard to say how the Nano VRM reacts, it doesn't seem to share any characteristics of the Fury though and until something conclusive comes about, i'll just take it slow. Need to get a temp probe though. All this is kinda useless until I can verify the FPS problem.


But at 1200, to keep it below 60°C I'm guessing the fan is at max? If not, then the Nano truly is great. I like my Fury, but I'm gonna change the TIM. A lot of people see great improvements after that.


----------



## ebinkerd

Quote:


> Originally Posted by *JunkaDK*
> 
> But at 1200 to keep it below 60 im guessing the fans it at max? if not then the nano truly is great.. I like my fury, but im gonna change the TIM. Alot see great improvements after that.


I have the fan set to run at 100% at 60°C atm. Does removing the cooler void the warranty?


----------



## JunkaDK

Quote:


> Originally Posted by *ebinkerd*
> 
> I have the fan set to run at 100% at 60c atm. Does removing the cooler void the warranty?


I would think so, but I don't know for sure. I'm gonna do it anyway.


----------



## flopper

Quote:


> Originally Posted by *MAMOLII*
> 
> damn card my furyx i really regret that i didn't buy the 980ti instead! I had a 3870x2 later one 5850 and a 7950 before fury...after crimson crap i tried the last catalyst but even 1 mhz upping the clocks i get a lot of artifacts in games!this is crazy its totally random i can play for 2 hours at 1130-550 and the next day even in 1100-510 i see artifacts i had to format the pc 3 times cause crimson+DDU make the windows crazy even msi afterburner could not see my card
> 
> 
> 
> 
> 
> 
> 
> ... i am gonna try to open the card and change paste see if all HMB chips are under paste or if its too much paste after that i give up!


might be your motherboard acting up.

Quote:


> Originally Posted by *Sonikku13*
> 
> Bought my Radeon R9 Nano last Saturday. Hopefully will play with it on the first Friday of February.


Nanos are priced great now, for sure.
Quote:


> Originally Posted by *ebinkerd*
> 
> Its not that I am having any issues with the Nano, it works perfectly, I just want the overclock-ability the Fury X has. .


Buy a Fury X then....
Expecting that from a Nano, which is made for a different power envelope, seems like a waste of time.

Personally, I'd watercool the Nano and call it a day, or just buy the Fury X, which comes with water cooling already.


----------



## ebinkerd

Quote:


> Originally Posted by *flopper*
> 
> might be your motherboard acting up.
> Nano are priced great now for sure
> Buy furyx then....
> expecting such with a Nano made for a different power envelope seems like a waste of time.
> 
> Personally I watercool the Nano and call it a day or just buy the furyx having wc on it already.


I have both. I am more interested in the limitations of the design. No one has found that yet.


----------



## Radox-0

Quote:


> Originally Posted by *ebinkerd*
> 
> Its not that I am having any issues with the Nano, it works perfectly, I just want the overclock-ability the Fury X has. The throttling is just a headache and makes it difficult to push the card. I currently have my Nano on stock bios set to 1100Mhz amd 550 with no issues. When I overclocked the Fury X I was able to push it past 1200Mhz with voltage at +24mv and the same with the Nano with the Fury X bios. So to me it seems there is more head room for the Nano.
> 
> The FPS reporting is the other issue, I am not sure how Firestrike or Unigine calculate FPS, or if the card sends the info, but what is was reporting did not feel like it was accurate.


Did you bench any games? I have found with my Nano that when I push it to 1135+ and 550+ (under water), the fps in those two games actually gets lower than when I am at 1110 and 500, for example.


----------



## ebinkerd

Quote:


> Originally Posted by *Radox-0*
> 
> Did you bench games or anything? I have found my Nano when I push it to 1135+ and 550+ (under water) the fps in those two apparently get lower then when I am at 1110 and 500 for example.


I haven't played any games with the Fury X bios. Here are the firestrike scores: Nano W/ Fury X bios http://www.3dmark.com/fs/7338890, Nano /default bios http://www.3dmark.com/fs/7337138

So same clock settings, half the fps. I can tell the FPS it was reporting was not the actual frame rate.


----------



## Creator

Anyone here with Nanos have acceptable coil whine? The one I just got for my SFF build may be the worst coil whine I've ever heard. I've read the articles about it, but wasn't expecting it to be this bad. I might just return it and go for a 970 Mini or R9 380 compact instead.


----------



## Radox-0

Quote:


> Originally Posted by *Creator*
> 
> Anyone here with Nanos have acceptable coil whine? The one I just got for my SFF build may be the worst coil whine I've ever heard. I've read the articles about it, but wasn't expecting it to be this bad. I might just return it and go for a 970 Mini or R9 380 compact instead.


My current one is okay in regards to whine. My first one however was awful.


----------



## BoloisBolo

Can anybody with an EK waterblock on their Nano help me out? I just have a quick question about how far the inlet/outlet terminal sits from the bracket.


----------



## Radox-0

Quote:


> Originally Posted by *BoloisBolo*
> 
> Can anybody with a ek waterblock on their nano help me out? I just have a quick question about how far the inlet/outlet terminal is to the bracket.


It's 30mm from the bracket to the start of the terminal.


----------



## BoloisBolo

Quote:


> Originally Posted by *Radox-0*
> 
> Its 30mm from bracket to start of terminal.


Thanks for the info!


----------



## MAMOLII

Quote:


> Originally Posted by *flopper*
> 
> might be your motherboard acting up.


I have a Sabertooth 990FX R2.0 and an EVGA 1200 PSU. Everything worked fine one month ago with my 7950 running 1100/1750 all the time. I tried Assassin's Creed Syndicate at 1100/500 and I see artifacts, but at stock everything is fine!!! Temps are around 50-55°C. I tried to OC via Afterburner, Trixx, and even Crimson Overdrive... 3DMark works OK at 1130/560 on stock volts, but in games even 1MHz up and the Crimson settings app stops working and the game crashes to desktop! It's crazy. After three PC formats I cleared everything and tried every possible combination, even with the last Catalyst beta; even ClockBlocker didn't help!


----------



## Radox-0

Not sure if it's been mentioned already, but it's interesting nonetheless, and something I was not really aware of when it comes to overclocking Fiji chips. Comments by an AMD rep on another forum:

_"Fiji's MCLK overdrives in discrete steps, so while various overclocking tools are increasing in 5Mhz steps the MCLK is actually only able to support 500.00/545.45/600.00/666.66MHz and right now it just rounds to the nearest step. If you are intending to overclock HBM on Fiji, then 545Mhz works well on all four of my Fury X cards. 600MHZ proved a step too far on each card and resulted in instability.

So for all those people setting their MCLK at 570Mhz, it's actually running at 545Mhz"_

Interesting IMO.
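The rep's behaviour can be sketched in a few lines. The step list comes straight from the quote above; the round-to-nearest rule is my reading of his description, not something AMD has documented:

```python
# Sketch of the MCLK stepping the AMD rep describes: overclocking
# tools accept any 5 MHz value, but the hardware snaps to the nearest
# of four supported HBM straps (list taken from the quote above).
STEPS = [500.00, 545.45, 600.00, 666.66]

def effective_mclk(requested_mhz: float) -> float:
    """Return the strap the card would actually run at."""
    return min(STEPS, key=lambda s: abs(s - requested_mhz))

print(effective_mclk(570))  # 545.45 -- matches the rep's example
print(effective_mclk(550))  # 545.45
print(effective_mclk(640))  # 666.66
```

Which would explain why small slider changes often appear to do nothing at all.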


----------



## JunkaDK

Quote:


> Originally Posted by *Radox-0*
> 
> Not sure if its been mentioned already, but intresting none the less and something I was not really aware about when it comes to OC Fiji chips, comments by a AMD Rep on another forum:
> 
> _"Fiji's MCLK overdrives in discrete steps, so while various overclocking tools are increasing in 5Mhz steps the MCLK is actually only able to support 500.00/545.45/600.00/666.66MHz and right now it just rounds to the nearest step. If you are intending to overclock HBM on Fiji, then 545Mhz works well on all four of my Fury X cards. 600MHZ proved a step too far on each card and resulted in instability.
> 
> So for all those people setting their MCLK at 570Mhz, it's actually running at 545Mhz"_
> 
> Interesting IMO.


Very much.. thanks for sharing


----------



## ebinkerd

Quote:


> Originally Posted by *Radox-0*
> 
> Not sure if its been mentioned already, but intresting none the less and something I was not really aware about when it comes to OC Fiji chips, comments by a AMD Rep on another forum:
> 
> _"So for all those people setting their MCLK at 570Mhz, it's actually running at 545Mhz"_
> 
> Interesting IMO.


Odd. When I set mine to 560 it crashes, but 550 is fine. I wonder if the overvolt acts as a multiplier of sorts, due to it only being adjustable in 6mV increments. And how does the overclocking tool report the clocks at the set frequency when in fact the settings are discrete? Doesn't make much sense to me.


----------



## 98uk

Quote:


> Originally Posted by *MAMOLII*
> 
> damn card my furyx i really regret that i didn't buy the 980ti instead! I had a 3870x2 later one 5850 and a 7950 before fury...after crimson crap i tried the last catalyst but even 1 mhz upping the clocks i get a lot of artifacts in games!this is crazy its totally random i can play for 2 hours at 1130-550 and the next day even in 1100-510 i see artifacts i had to format the pc 3 times cause crimson+DDU make the windows crazy even msi afterburner could not see my card
> 
> 
> 
> 
> 
> 
> 
> ... i am gonna try to open the card and change paste see if all HMB chips are under paste or if its too much paste after that i give up!


Are you sure that's not a power-delivery issue (e.g. motherboard or PSU)? It could also just be that the card is broken; that happens with any brand, AMD or Nvidia.


----------



## JunkaDK

Quote:


> Originally Posted by *ebinkerd*
> 
> Odd, when I set mine to 560 it crashes, 550 is fine though. I wonder if the over volt acts a multiplier of sort due to it only being adjustable at 6mv increments. How does the overclocking tool report the clocks at the set freq when in fact the settings are discrete. Doesn't make much sense to me.


You can easily set it to 550 or 560, but according to this info you are really only getting 545 no matter what the numbers say, so there's no point/gain in setting it above 545 unless you go to 600.


----------



## ebinkerd

Quote:


> Originally Posted by *JunkaDK*
> 
> You can easily set it to 550 or 560, but according to this info, you are really only getting 545 no matter what the numbers say.. so no point / gain in setting it above 545 unless you go to 600


I think you are missing my point. A 560 memory clock causes my card to crash, while 550 is stable. If it was in fact always running at 545.45, because both 550 and 560 are nearer to 545.45 than to 600, why would it crash?


----------



## JunkaDK

Quote:


> Originally Posted by *ebinkerd*
> 
> I think you are missing my point. 560 mem clock causes my card to crash, while 550 is stable. If it was infact always running at 545.5 due to 550 and 560 being nearer to 545.5 than 600, why would it crash?


Well, you are still asking the card for more performance, which it can only deliver in specific increments like 545, 600, etc. Maybe it puts a strain on some components, or adds more voltage, or whatever.

It's just a theory.


----------



## Radox-0

Quote:


> Originally Posted by *ebinkerd*
> 
> Odd, when I set mine to 560 it crashes, 550 is fine though. I wonder if the over volt acts a multiplier of sort due to it only being adjustable at 6mv increments. How does the overclocking tool report the clocks at the set freq when in fact the settings are discrete. Doesn't make much sense to me.


Probably needs some further investigation, but it does tally to an extent with what I have seen, as I generally crash when I reach the next discrete step of 600. As for the overclocking tools, it's not really unexpected that they don't show the actual values; you always find quirks with them, and I expect this is one of them. I will try to extract more information from said rep, as it's quite interesting and I did not know about it myself.


----------



## p4inkill3r

Quote:


> Originally Posted by *Radox-0*
> 
> Not sure if its been mentioned already, but intresting none the less and something I was not really aware about when it comes to OC Fiji chips, comments by a AMD Rep on another forum:
> 
> _"Fiji's MCLK overdrives in discrete steps, so while various overclocking tools are increasing in 5Mhz steps the MCLK is actually only able to support 500.00/545.45/600.00/666.66MHz and right now it just rounds to the nearest step. If you are intending to overclock HBM on Fiji, then 545Mhz works well on all four of my Fury X cards. 600MHZ proved a step too far on each card and resulted in instability.
> 
> So for all those people setting their MCLK at 570Mhz, it's actually running at 545Mhz"_
> 
> Interesting IMO.


What's the source for that, if you'd be so kind?


----------



## Radox-0

Quote:


> Originally Posted by *p4inkill3r*
> 
> What's the source for that, if you'd be so kind?


Can I link to other forums? Not too sure of the rules.

If not, Google the Fiji X owners thread. It's on another forum with a very similar name to this one; the post is on page 233, by an AMD rep.


----------



## Radox-0

Quote:


> Originally Posted by *p4inkill3r*
> 
> What's the source for that, if you'd be so kind?


Checking the rules, seems fine. Got the info from here: https://forums.overclockers.co.uk/showthread.php?t=18678073&page=233

Page 233 post # 6977 by AMD Representative


----------



## Otterfluff

Quote:


> Originally Posted by *Radox-0*
> 
> Checking the rules, seems fine. Got the info from here: https://forums.overclockers.co.uk/showthread.php?t=18678073&page=233
> 
> Page 233 post # 6977 by AMD Representative


That might explain why, as soon as I overclock past 630MHz HBM, I instantly get instability. It must be rounding up to 666.66MHz. I knew there was a wall there, and that must be it. 630MHz is stable on both my Fury Xs, while 631 gives instant white-spot artifacting.

I also noticed on overclockers.co.uk that one guy mentioned flashing his Fury X to a Fury Nano BIOS, and that it removed the coil whine from his Fury X. Worth testing on my days off; I'll try it over the weekend.


----------



## 98uk

Quote:


> Originally Posted by *Otterfluff*
> 
> That might explain why as soon as I overclock past 630Mhz HBM you instantly get instability. It must be rounding it up to 666.66Mhz. I knew there was a wall there and that must be it. 630Mhz is stable for on both my Fury X while 631 gives instant white spot artifacting.
> 
> I also noticed on overlockers.co.uk one guy mentioned flashing his fury X to a Fury Nano bios and that it removed the coil whine from his Fury X. Worth testing on my days off, Ill try it over the weekend.


Has anyone proven any real world benefit to memory overclocking? I'm talking more about fps in games as opposed to benchmarks.

I'm running mine at 545mhz right now.


----------



## Otterfluff

It was proven to scale well in games, but just like core overclocking it does not give much more than 10-12% overall from combined HBM + core overclocks.

600MHz HBM has been proven stable with good water cooling; in my experience that means under 40°C.


----------



## diggiddi

Quote:


> Originally Posted by *98uk*
> 
> Has anyone proven any real world benefit to memory overclocking? I'm talking more about fps in games as opposed to benchmarks.
> 
> I'm running mine at 545mhz right now.


http://www.hardwareasylum.com/reviews/video/sapphire_r9fury-nitro/page13.aspx

"Overclocking Conclusion

Overclocking is something most enthusiasts like to do and while 89Mhz over the factory overclock may not sound like much there was a notable increase in our synthetic benchmark scores and an average increase of 10fps across the board in our game tests.

The sad part is, 89Mhz doesn't sound like much, and it really isn't however, 260Mhz adding to the memory clock is really what gave us those scores. Bandwidth went from 512GB/s up to 645GB/s and was completely stable throughout our testing session"
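The review's bandwidth numbers check out against Fiji's published 4096-bit HBM bus. A quick sketch (the bus width and double-data-rate factor are Fiji's spec-sheet values):

```python
# HBM bandwidth on Fiji: clock (MHz) x 2 (double data rate)
# x 4096-bit bus, converted from megabits/s to gigabytes/s.
def hbm_bandwidth_gbs(mclk_mhz: float, bus_bits: int = 4096) -> float:
    return mclk_mhz * 2 * bus_bits / 8 / 1000

print(hbm_bandwidth_gbs(500))  # 512.0  -> the stock 512 GB/s
print(hbm_bandwidth_gbs(630))  # 645.12 -> the review's ~645 GB/s
```

So the quoted 512 → 645 GB/s jump corresponds exactly to a 500 → 630MHz memory clock.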


----------



## 98uk

Quote:


> Originally Posted by *diggiddi*
> 
> http://www.hardwareasylum.com/reviews/video/sapphire_r9fury-nitro/page13.aspx
> 
> "Overclocking Conclusion
> 
> Overclocking is something most enthusiasts like to do and while 89Mhz over the factory overclock may not sound like much there was a notable increase in our synthetic benchmark scores and an average increase of 10fps across the board in our game tests.
> 
> The sad part is, 89Mhz doesn't sound like much, and it really isn't however, 260Mhz adding to the memory clock is really what gave us those scores. Bandwidth went from 512GB/s up to 645GB/s and was completely stable throughout our testing session"


Interesting. Well, I haven't tried 600MHz yet. If what the posts above say is true (the memory steps from 500/545/600, etc.), then it may be a bridge too far.


----------



## Radox-0

Quote:


> Originally Posted by *98uk*
> 
> Has anyone proven any real world benefit to memory overclocking? I'm talking more about fps in games as opposed to benchmarks.
> 
> I'm running mine at 545mhz right now.


I thought I would give it a try; alas, after about 15 tests my SFX PSU blew up :O

Only managed to bench two games, Tomb Raider and Shadow of Mordor, and was partway through Arkham Knight when it failed.

My very basic results, with each test averaged over two or three similar runs to try to get a better picture. Resolution was 3440 x 1440, with the CPU at 4.4GHz throughout.

Tomb Raider
1050 core / 500 Memory = Min=84, max = 121 Avg=101.1
1050 core / 545 Memory = Min=88, max=120 Avg=104.1
1050 core / 565 Memory = Min=87, max =121 Avg =104.5
1050 core / 600 memory = Fail

So not a massive difference, and it could be argued it's within the margin of error on all the results. Sadly, 600MHz failed, which would have been an interesting result.

Shadow Of Mordor
1050 core / 500 Memory = Min=41.19, max=108.71 Avg=70.70
1050 core / 545 Memory = Min=51.23, max = 108.84 Avg=71.84
1050 core / 565 Memory = Min=52.23, max =107.77 Avg =71.05
1050 core / 600 memory = Fail

Similar situation to Tomb Raider: the memory OC seems to make little difference, for me at least. The min on the 500 memory run is just a quirk of Shadow of Mordor dipping to various low FPS in the first second after you launch the benchmark.

Hopefully someone with a card that can reach 600MHz will be able to perform some tests. Seems my Nano is not up to the task, even under water.
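To make the "within margin of error" point concrete, here is a quick script over the averages from the tables above (numbers copied straight from those runs):

```python
# Percent change of each memory-clock average versus the 500 MHz
# baseline, per game; averages are taken from the results above.
results = {
    "Tomb Raider":      {500: 101.1, 545: 104.1, 565: 104.5},
    "Shadow of Mordor": {500: 70.70, 545: 71.84, 565: 71.05},
}

for game, runs in results.items():
    base = runs[500]
    for mclk, avg in sorted(runs.items()):
        delta = (avg - base) / base * 100
        print(f"{game} @ {mclk} MHz memory: {delta:+.1f}% vs 500 MHz")
```

Every delta comes out under about 3%, which is why the memory OC looks like noise here.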


----------



## Semel

Is there anyone here who had a stable OC on the Catalyst drivers but is now crashing on the latest Crimson drivers, even with ClockBlocker enabled? My OC that was stable on 15.11.1 just doesn't seem to like the Crimson drivers at all....


----------



## Decade

Quote:


> Originally Posted by *Otterfluff*
> 
> I also noticed on overlockers.co.uk one guy mentioned flashing his fury X to a Fury Nano bios and that it removed the coil whine from his Fury X. Worth testing on my days off, Ill try it over the weekend.


This is intriguing... my X's coil whine is the loudest component of my build.


----------



## xer0h0ur

Quote:


> Originally Posted by *Semel*
> 
> Is there anyone here who had a stable OC using catalyst drivers but now crashing using the latest crimson drivers even with clockblocker enabled? My stable (15.11.1)OC just doesn't seem to like crimson drivers at all....


I mentioned this the moment Crimson went live. The Crimson drivers just don't handle overclocks as well as the Catalyst drivers did.


----------



## Kana-Maru

I had issues with certain Crimson drivers and OC'ing, but with Crimson 16.1 I have no issues running the OCs that were stable on the Catalyst drivers. So far so good.
---

I finally got around to running a few tests on my AMD Fury X using the latest Crimson drivers. All I can say is that AMD continues to impress me with their drivers and quick hot fixes. 16.1 has a lot of hot fixes and is in the beta stage. The increases at stock and overclocked are pretty good for a simple driver install. Here are my percentage increases.

*AMD Fury X STOCK Settings*
*3DMark FireStrike Performance % Increase:*
_From Catalyst Driver 15.7.1 -[7/29/2015]- to Crimson Driver 16.1 -[1/17/2016]-_
-Overall Score: *+3%*
-Graphics Score: *+2.19%*
-Combined Score: *+5.89%*

*3DMark FireStrike Extreme % Increase:*
_From Catalyst Driver 15.7.1 -[7/29/2015]- to Crimson Driver 16.1 -[1/17/2016]-_
-Overall Score: *+2.24%*
-Graphics Score: *+2.20%*
-Combined Score: *+2.84%*

*Crysis 3 4K 100% Maxed - No MSAA- Performance % Increase:*
_From Catalyst Driver 15.7.1 -[7/29/2015]- to Crimson Driver 16.1 -[1/17/2016]-_
FPS: *+13.33%*
Frame Rate: *+4.48%*

My FireStrike overclock scores have increased as well. I haven't run any MGSV: TPP benchmarks, but I have noticed that my 1440p/1600p FPS are higher than they used to be. I just can't say for sure without benchmarking everything.

*AMD Fury X STOCK Settings*
*3DMark Feature Test [DX12] Performance % Increase:*
_From Catalyst Driver 15.7.1 -[7/29/2015]- to Crimson Driver 16.1 -[1/17/2016]-_
DirectX 12 Increase: *+5.85%*

Mantle still performed better than DX12, though: 1.57% better, with 16,374,768 draw calls at stock settings versus roughly 16.1M for DX12.
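The percentage figures above are just (new − old) / old × 100. As a rough cross-check, plugging in the draw-call numbers (using the rounded "16.1M" DX12 figure) lands in the same ballpark as the quoted 1.57%:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

mantle_draw_calls = 16_374_768
dx12_draw_calls = 16_100_000  # the rounded "16.1M" figure from the post

print(f"Mantle over DX12: {pct_increase(dx12_draw_calls, mantle_draw_calls):+.2f}%")
# ~ +1.71% with the rounded DX12 number; the exact (unrounded)
# DX12 score would give the quoted 1.57%.
```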


----------



## MAMOLII

Crimson 16.1 is fine for OC, more stable than the previous Crimsons! For me every game or app is different: I can run 3DMark, Call of Duty and the Heaven benchmark at 1140/560 on stock volts!! But in Assassin's Creed Syndicate, even at 1100/545, I see artifacts and a crash after 5-10 seconds







So I ended up at 1075/500 for this game







Any suggestions for stress tests? I don't know if FurMark + ClockBlocker is going to work and prevent downclocking, but I will give it a try!


----------



## Semel

Hmm... I switched back to Catalyst 15.11.1 and my OC is still unstable in *Rise of the Tomb Raider*. In fact it is unstable even at 1130 instead of 1140; I had to reduce my OC to 1120 while keeping the same voltage (+72mV). And the card runs hotter than when I was playing Witcher 3(!), which is pretty "impressive" considering how GPU-heavy Witcher 3 was and how sensitive to OCing it was (Witcher 3 was stable, as were benchmarks and other games).

Guys, is anyone here playing the new Tomb Raider? How are your overclocks?


----------



## rocket47

Quote:


> Originally Posted by *Semel*
> 
> Hmm.. I switched back to 15.11.1 catalyst and my OC is still unstable in *Rise of the tomb raider*.. in fact it is unstable even at 1130 instead of 1140...I had to reduce my OC to 1120 keeping same voltage. (+72mV) And the card is running hotter than when i was playing Witcher 3(!) which is pretty "impressive" considering how GPU heavy Witcher 3 was and how sensitive to OCing it was.(Witcher 3 was stable as well as benchmarks\other games)
> 
> Guys, anyone here is playing the new Tomb Raider?. How's your overclocks?


I've seen a topic with Tomb Raider overclocks on Reddit; a lot of people had to reduce their overclock because it wasn't stable. You're not alone.


----------



## xer0h0ur

Well, FWIW the true way to judge temperature at load is to put the GPU at full load for a period of time. That could mean using something like Aida64, folding or using a game that maintains a consistent high load. For instance a game that used to cause higher temperatures for me was Hitman Absolution. It was one of the reasons I jumped to waterblocking my cards.


----------



## MAMOLII

I wonder if our cards can handle future heavy DX12 games at stock clocks... we know there is not much overclocking headroom on the Furys.


----------



## 98uk

Quote:


> Originally Posted by *MAMOLII*
> 
> wonder if our cards can handle future heavy games dx12 at stock clocks...we now there is no overclocking headroom for furys but not that much


Doubt it. Like the first-gen DX10 cards... they were released before DX10 and then actually weren't powerful enough to play games in that mode.

I had an Nvidia 8800 GTS 320MB


----------



## MAMOLII

It's all about marketing, but the drivers are not optimized yet, so there is headroom there! I am sure that a year from now, in new DX12 titles, it's going to leave the 980 Ti behind! But today I can't run Assassin's Creed over 1080MHz, it crashes. Only 30MHz, come on...







Now I see Tomb Raider reduces the "stable" OC of other users. That's so disappointing! AMD put the clocks so high they played at the edge! I saw reviews from the summer hitting an 8% OC on the core, about 1120-1150 at stock volts and 550 to 570 on the RAM... and then a new game comes out and the clocks go down and down...


----------



## xer0h0ur

I spent 2 years on Catalyst drivers and rarely did I ever run into having to lower my overclocks. I still don't trust these Crimson drivers.


----------



## MAMOLII

They play a lot with GPU states. I see that when a game crashes, the card jumps down to around 900-something MHz with volts around 1.10V, then back up to 1100MHz and 1.20V, and crash. Especially in in-game cinematics, when a story video plays (the driver drops clocks and volts to a middle state), then back to the game aaaaannnddd crash as the clocks come back up. It's like CPU overclocking with EIST, C1E and SpeedStep enabled! We only need one 2D state and one 3D state.







Tried ClockBlocker but it didn't work for me.


----------



## 98uk

ClockBlocker works perfectly here. Keeps core speeds high when I require it.


----------



## tysonischarles

Finally signed up


----------



## SuperZan

Quote:


> Originally Posted by *tysonischarles*
> 
> Finally signed up


One of us! One of us!  Welcome welcome! Not that there's much of a difference but which manufacturer did you go with?


----------



## tysonischarles

Quote:


> Originally Posted by *SuperZan*
> 
> One of us! One of us!
> 
> 
> 
> 
> 
> 
> 
> Welcome welcome! Not that there's much of a difference but which manufacturer did you go with?


I went with the XFX Fury X









Although it won't be reference water cooled for too much longer


----------



## SuperZan

Very nice, enjoy! Crimson is a bit troublesome with overclocks and Crossfire to this point, but works great on a single stable card. Hopefully a few issues will be ironed out when the RTTR optimisations are released with 16.1.1 Crimson.


----------



## tysonischarles

I'll only be single card for now; my SFX PSU is 600W, so not a heap of room for two Fury Xs. But I'll be sure to fiddle once I get there


----------



## 98uk

Quote:


> Originally Posted by *SuperZan*
> 
> Very nice, enjoy
> 
> 
> 
> 
> 
> 
> 
> Crimson is a bit troublesome with overclocks and Crossfire to this point but works great on a single stable card. Hopefully a few issues will be ironed out when the RTTR optimisations are released with 16.1.1 Crimson.


I noticed that with the latest beta. As soon as I installed it, my previous OC was unstable and crashed.

Went back to the latest WHQL and it works perfectly again


----------



## Baldrex

I just upgraded my rig from a 290X recently, so sign me up










Quote:


> Originally Posted by *98uk*
> 
> I noticed that with the latest beta. As soon as I installed it, my previous oc was unstable and crashed.
> Went back to the latest whql and works perfectly again


How high did you manage to push your fury card?


----------



## 98uk

Quote:


> Originally Posted by *Baldrex*
> 
> How high did you manage to push your fury card?


Right now I'm still going up in steps.

Current stable is 1070MHz/545MHz with +18mV and power limit +25%


----------



## SuperZan

Quote:


> Originally Posted by *98uk*
> 
> Right now, i'm still going up in steps.
> 
> Current stable is 1070mhz/545mhz with +18mV and power limit +25%


At similar volts my Sapphire Fury X managed 1135/545 whilst my XFX Fury was able to hit 1120/500, so I think you'll be able to squeeze a bit more out at those power levels.


----------



## 98uk

Quote:


> Originally Posted by *SuperZan*
> 
> At similar volts my Sapphire Fury X managed 1135/545 whilst my XFX Fury was able to hit 1120/500, so I think you'll be able to squeeze a bit more out at those power levels.


Yep, just going up in steps and using a few days' worth of gaming to look at stability. No great rush.

Would like 1100MHz core, that'd suit nicely.

Not sure what max mV I can safely apply though...


----------



## SuperZan

Quote:


> Originally Posted by *98uk*
> 
> Yep, just going up in steps and using a few days worth of gaming to look at stability. No great rush.
> 
> Would like 1100mhz core, that'd suit nicely.
> 
> Not sure what max mV I can apply safely though...


There are a few here that know the exact numbers better, but I was advised early on by @p4inkill3r in particular that you can feed +70mV very safely without issue, and that has proven correct for both of my cards. It's good not to rush though; I did the same thing and found different points where I was gaming-stable versus benching-stable, and am very happy with what I was able to get.


----------



## Otterfluff

The 16.1.1 Crimson driver seems to have fixed a lot of missing Crossfire settings, i.e. Crossfire profiles remembering what settings you set.

I tried out Fallout 4 Crossfire and it finally works, a good FPS boost, but in game it's missing half of the textures. Has anyone else had a go with this yet? I looked for a custom Crossfire profile for Fallout 4 but I did not see one listed.


----------



## Baldrex

Quote:


> Originally Posted by *98uk*
> 
> Right now, i'm still going up in steps.
> 
> Current stable is 1070mhz/545mhz with +18mV and power limit +25%


Nice







How was the temperature with that air cooler after the overclock? Did it get really hot or just a little?
Quote:


> Originally Posted by *SuperZan*
> 
> At similar volts my Sapphire Fury X managed 1135/545 whilst my XFX Fury was able to hit 1120/500, so I think you'll be able to squeeze a bit more out at those power levels.


Wow, so the air-cooled Fury actually overclocks better than the water-cooled Fury X.

Is it because the Fury is equipped with a custom PCB and cooler compared to the Fury X?
Quote:


> Originally Posted by *Otterfluff*
> 
> 16.1.1 Crimson driver seems to of fixed alot of missing crossfire settings, ie crossfire profiles, remembering what settings you set it to.
> 
> I tried out the fallout 4 crossfire and it finally works, good fps boost but in game it's missing half of the textures. Has anyone else had a go with this yet? I looked for a custom crossfire profile for fallout 4 but I did not see one listed.


Are you playing it at 4k with max settings?


----------



## 98uk

Quote:


> Originally Posted by *Baldrex*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> How about the temperature with that air cooler after the overclock? Did it get really hot or just a little?


So far I've noticed no difference in temps. However, ambients are changing massively; one day it's 16°C and sunny, the next day it's snowing.

In gaming with a custom fan profile, I don't see it over 60°C... It's a stupidly cool card. The fan is nowhere near 100% either.


----------



## p4inkill3r

Quote:


> Originally Posted by *Baldrex*
> 
> Wow so the air cooled Fury actually overclocks better than the water cooled Fury X.


I wouldn't go that far. My Fury X benches at 1180/590 and there are some guys that get 1200MHz on the core.
http://www.3dmark.com/fs/6812511


----------



## Baldrex

Quote:


> Originally Posted by *98uk*
> 
> So far I've noticed no difference in temps. However, ambients are changing massively, one day it's 16c and sunny, next day it's snowing


Good to hear that.
Quote:


> Originally Posted by *p4inkill3r*
> 
> I wouldn't go that far. My Fury X benches at 1180/590 and there are some guys that get 1200MHz on the core.
> http://www.3dmark.com/fs/6812511


That's an awesome overclock on your Fury X. Wow, at 1200MHz that Fury X must be a beast.


----------



## en9dmp

Quote:


> Originally Posted by *Otterfluff*
> 
> 16.1.1 Crimson driver seems to of fixed alot of missing crossfire settings, ie crossfire profiles, remembering what settings you set it to.
> 
> I tried out the fallout 4 crossfire and it finally works, good fps boost but in game it's missing half of the textures. Has anyone else had a go with this yet? I looked for a custom crossfire profile for fallout 4 but I did not see one listed.


Can anyone else confirm this? I've not yet had a reason to move from 15.11.1, but a working Crossfire profile for Fallout 4 would certainly give me one... I'm sick of playing at 1440p with another Fury X in reserve. I've seen a lot of people on various forums confirming unplayable texture and flickering issues, but mostly with older cards. If it genuinely doesn't work, then why on earth did they release it? Possibly they only tested it on pre-v1.3 versions of Fallout 4? Anyone know how to revert the update to test it?


----------



## Thoth420

Regarding Fallout 4 complaints:

[81536] Foliage/Water may Ripple/Stutter when game is launched in High/Ultra graphics mode


----------



## SuperZan

Quote:


> Originally Posted by *Baldrex*
> 
> Wow so the air cooled Fury actually overclocks better than the water cooled Fury X.


Quote:



> Originally Posted by *p4inkill3r*
> 
> I wouldn't go that far. My Fury X benches at 1180/590 and there are some guys that get 1200MHz on the core.
> http://www.3dmark.com/fs/6812511


Yes, and I've still got more room to go with the Fury X as the cooling is superior. I think the only reason the Fury did so well at those particular voltage levels is that, according to cuinfo, my air-cooled Fury was CU-locked for binning purposes rather than for defects.



As I push voltage higher the Fury X takes charge.


----------



## tysonischarles

As someone who's particularly fresh to overclocking, do we have any reference material to read over to get started?


----------



## baii

Just up the clock until it crashes, then back down. No magic, and 99% of the time it won't hurt your card.
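That step-up routine can be sketched as a tiny loop. `is_stable` here is a hypothetical stand-in for whatever stress test you trust (a benchmark pass, an hour of gaming, etc.):

```python
def find_max_clock(is_stable, start_mhz=1050, step_mhz=10, limit_mhz=1200):
    """Raise the core clock one step at a time until the stability
    test fails (or a hard limit is hit), then settle on the last
    step that passed."""
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example: pretend the card artifacts above 1130 MHz.
print(find_max_clock(lambda mhz: mhz <= 1130))  # -> 1130
```

In practice you'd also back off an extra notch for headroom, since "benchmark stable" and "gaming stable" often differ, as posts above note.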


----------



## 98uk

What do you guys use for power limit on afterburner?

Does it really make any difference?


----------



## Radox-0

Quote:


> Originally Posted by *98uk*
> 
> What do you guys use for power limit on afterburner?
> 
> Does it really make any difference?


On the Nano here. I just set it to +50% power target, and it does seem to make a difference. With the normal settings I can push about 1105MHz or so, stable and benchmarked. With it at +50% I can push to a notch just under 1150, which for the Nano is fairly decent (benchmark stable).


----------



## jamaican voodoo

I'll be joining this club on Friday; just ordered 2 XFX Fury Xs from Amazon. I thought about the 980 Ti, but I'm not a fan of Nvidia's schemes, so AMD was the only option.


----------



## Neon Lights

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I'll be joining this clube on friday just ordered 2 XFX fury x from amazon. i though about the 980ti but i'm not a fan of nvidia schemes so AMD was the only option.


Sorry for the dick answer, but unless you want to support the better company and/or the more future-proof product, the 980 Ti is the better card by quite a bit for games. That is for DirectX 11; for DirectX 12 they may tie (1500MHz vs 1200MHz overclocks for Nvidia and AMD) or the Fury X may even be a little better. At the moment, and probably for at least a year, AMD graphics cards just perform worse in games. There are just too many AAA games, still in production and popular, that do not use DirectX 12/Vulkan/Mantle.


----------



## p4inkill3r

Quote:


> Originally Posted by *Neon Lights*
> 
> Sorry for the dick answer but unless you want to support the better company and/or the more future proof product, the 980ti is the better card by quite a bit for games. That is for DirectX 11, however for DirectX 12 they may tie (1500MHz vs 1200MHz overclock for Nvidia and AMD) or the Fury X may even be a little better. At the moment, and probably for at least a year, AMD graphic cards just have a worse performance in games. There are just too many AAA games still in production and popular that do not use DirectX 12/Vulkan/Mantle.


Yeah, don't say things like that in the Owners Club _and_ to a guy that is excited about his purchase. That's just poor taste.


----------



## Jflisk

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I'll be joining this clube on friday just ordered 2 XFX fury x from amazon. i though about the 980ti but i'm not a fan of nvidia schemes so AMD was the only option.


Now what you do is turn Virtual Super Resolution on, set your games to 2560x1440 and enjoy the magic. The 16.1.1 beta drivers seem to fix a lot (especially if you're into BLOPS 3 or Rise of the Tomb Raider).


----------



## NBrock

And the performance of the Fury X is NOT as bad as some would have you believe. I have been playing everything I want at 4K with one Fury X and a 4770K and have not had to sacrifice settings. Those games include Battlefield 4, Diablo 3, Fallout 4, Star Wars Battlefront, War Thunder, heavily modded Skyrim, and Rise of the Tomb Raider. Drivers ARE getting better with every update. The Fury series did have a rough start, but it is not a terrible card. I can say that all of my AMD cards have had a long life and continued support for a very long time, with continued improvements.


----------



## BoloisBolo

I've had my Fury for a while now, but am just now joining the club. Also have a Nano that I picked up on the cheap to replace my Fury, but I decided to use it for another project. Maybe I will pick up another Fury for some Crossfire goodness in the future.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Jflisk*
> 
> Now what you do is turn the super virtual resolution on and set your games at 2550X1440 enjoy the magic . The 16.1.1 B drivers seem to fix a lot ( especially if your into BLOPS 3 or Rise of the Tomb Raider)


Awesome, I'm looking forward to it. I bought the Philips BDM4065UC 40" 4K monitor about a month ago, so I'm quite excited to see what these cards can do.


----------



## jamaican voodoo

Good to know, buddy. I play a lot of BF4 and Star Wars Battlefront; the Frostbite engine seems to play well with AMD cards.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Neon Lights*
> 
> Sorry for the dick answer but unless you want to support the better company and/or the more future proof product, the 980ti is the better card by quite a bit for games. That is for DirectX 11, however for DirectX 12 they may tie (1500MHz vs 1200MHz overclock for Nvidia and AMD) or the Fury X may even be a little better. At the moment, and probably for at least a year, AMD graphic cards just have a worse performance in games. There are just too many AAA games still in production and popular that do not use DirectX 12/Vulkan/Mantle.


I understand where you're coming from, but I have never had any issues with my AMD cards, and most of the games I play favour AMD cards anyway. But like I said, Nvidia's schemes are a massive turn-off for me. Other than that the 980 Ti is a fantastic card, just not for me.


----------



## 98uk

Quote:


> Originally Posted by *jamaican voodoo*
> 
> good to know buddy i play alot of BF4 and Starwars battlefront, the frostbite engine seem to play well with amd cards.


My Fury is really doing well in BF4, everything on ultra at 2560x1440.

I have some clock issues, but they're fixed with ClockBlocker. I will install the newer drivers tonight as apparently they fix it.


----------



## Neon Lights

Quote:


> Originally Posted by *p4inkill3r*
> 
> Yeah, don't say things like that in the Owners Club _and_ to a guy that is excited about his purchase. That's just poor taste.


Well, I said it was actually a "dick answer", and I am also using two Fury Xs myself. What I wanted to get across is that if you look at just the performance, the 980 Ti is quite a bit better than the Fury X, especially when overclocked (because it overclocks to at least 1500MHz).
Quote:


> Originally Posted by *NBrock*
> 
> And the performance on the Fury X IS NOT as bad as some would have you believe. I have been playing everything I want at 4k with one Fury X and a 4770k and have not had to sacrifice settings. Those games include Battle Field 4, Diablo 3, Fallout 4, Star Wars Battle Front, War Thunder, heavily modded Skyrim, and Rise of the Tomb Raider. Drivers ARE getting better with every update. The Fury series did have a rough start but it is not a terrible card. I can say that all of my AMD cards have had a long life and continued support for a very long time with continued improvements.


Yes, it may not do badly in games generally, but its performance is quite a bit worse than the 980 Ti's.


----------



## p4inkill3r

Quote:


> Originally Posted by *Neon Lights*
> 
> Well, I said it was actually a "dick answer" and I also am using two Fury Xs myself. What I just wanted to get across is that if you look at just the performance the 980ti is quite a bit better than the Fury X, especially when overcklocked (because it overclocks to at least 1500MHz).
> Yes, it may not do bad in games generally, but its performance is quite a bit worse than the 980ti's.


I don't see how that matters, because I doubt anyone posts and hopes to receive a dick answer.


----------



## Neon Lights

Quote:


> Originally Posted by *p4inkill3r*
> 
> I don't see how that matters, because I doubt anyone posts and hopes to receive a dick answer.


Who would want a dick answer?


----------



## Sonikku13

Is my A10-7850K at 4.1 GHz bottlenecking my Nano in FFXIV? I get 90 FPS uncapped at 1080p max. I also got 8 GB of RAM. A friend has an i5-4690K, 980 Ti, and 16 GB of RAM and he gets 130 FPS uncapped at 1080p max.

Whether it is bottlenecking or not, my SP3 i3, for comparison, gets 15 FPS uncapped at 720p low. And my OCed A10-7850K at 4.1 GHz with a 960 MHz iGPU clock had been getting 45 FPS at 720p low. So either way, it is a massive upgrade.

If it is bottlenecking my Nano, would it be worth getting a 1440p monitor?


----------



## p4inkill3r

Quote:


> Originally Posted by *Neon Lights*
> 
> Who would want a dick answer?


Maybe the same person that gives one? I honestly don't know, and I still don't know what you hoped to accomplish by FYIing the dude post-purchase that his $1200 worth of cards kinda suck and he should've bought something else.


----------



## NBrock

Quote:


> Originally Posted by *Neon Lights*
> 
> Well, I said it was actually a "dick answer" and I also am using two Fury Xs myself. What I just wanted to get across is that if you look at just the performance the 980ti is quite a bit better than the Fury X, especially when overcklocked (because it overclocks to at least 1500MHz).
> Yes, it may not do bad in games generally, but its performance is quite a bit worse than the 980ti's.


That depends on a lot of things. Stock for stock, it's actually pulling ahead with the newer drivers in quite a few titles. That brings me to my next point: not everyone plays the same games, so where Nvidia does better it may not even matter to some people. Not everyone buys a card and overclocks, so the overclocking depends on the user. The overclock on the Fury series may not reach the same numbers as the 980 Ti, but performance still scales pretty well with what it can do. Not to mention there are tons of non-reference 980 Ti setups that do "blow it out of the water" but also cost a good bit more. Not everyone agrees with Nvidia's past or present business practices, so that's another reason someone might not choose an Nvidia card. There are many more reasons one card is not the right card for everyone... not to mention you are in the wrong thread for your Nvidia fanboy trolling.


----------



## NBrock

Quote:


> Originally Posted by *Sonikku13*
> 
> Is my A10-7850K at 4.1 GHz bottlenecking my Nano in FFXIV? I get 90 FPS uncapped at 1080p max. I also got 8 GB of RAM. A friend has an i5-4690K, 980 Ti, and 16 GB of RAM and he gets 130 FPS uncapped at 1080p max.
> 
> Whether it is bottlenecking or not, my SP3 i3, for comparison, gets 15 FPS uncapped at 720p low. And my OCed A10-7850K at 4.1 GHz with a 960 MHz iGPU clock had been getting 45 FPS at 720p low. So either way, it is a massive upgrade.
> 
> If it is bottlenecking my Nano, would it be worth getting a 1440p monitor?


A good thing to do would be to monitor your GPU's clocks in game. It may be throttling to stay within the power limits, or the clocks may be fluctuating due to the issues with some of the drivers. If it is throttling, I would start by checking the temps and, if those are fine, increasing the power limit in Catalyst Control Center/Crimson, Sapphire Trixx or MSI Afterburner.

Also, there is a good chance that A10 is holding you back. For example, my 8350 @ 4.9GHz did not perform as well in some games as my 4770K @ 4.1GHz when I swapped to my current rig.
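If you do log the clocks (most overclocking tools can dump readings to a file), spotting throttling is just a matter of counting samples that sit below your set clock. A minimal sketch, with a made-up list of readings standing in for a real log:

```python
def throttle_ratio(samples_mhz, target_mhz, tolerance_mhz=15):
    """Fraction of clock samples that fall below target - tolerance,
    i.e. moments where the card was likely throttling/downclocking."""
    low = [s for s in samples_mhz if s < target_mhz - tolerance_mhz]
    return len(low) / len(samples_mhz)

# Hypothetical in-game log: mostly pegged at 1000 MHz with a few dips.
log = [1000, 1000, 925, 1000, 860, 1000, 1000, 1000]
print(f"throttled {throttle_ratio(log, 1000):.0%} of the time")  # prints "throttled 25% of the time"
```

Anything well above a few percent during steady gameplay is worth chasing with temps or the power limit.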


----------



## Neon Lights

Quote:


> Originally Posted by *p4inkill3r*
> 
> Maybe the same person that gives one? I honestly don't know, and I still don't know what you hoped to accomplish by FYIing the dude post-purchase that his $1200 worth of cards kinda suck and he should've bought something else.


I was not trying to say that he should have bought something else; I was trying to present an objective statement on the performance differences between the Fury X and 980 Ti.


----------



## p4inkill3r

Quote:


> Originally Posted by *Neon Lights*
> 
> I was not trying to say that he should have bought something else, I was trying present an objective statement on the performance differences between the Fury X and 980ti.


You failed in that regard.


----------



## 98uk

Quote:


> Originally Posted by *Neon Lights*
> 
> I was not trying to say that he should have bought something else, I was trying present an objective statement on the performance differences between the Fury X and 980ti.


But doing that in an owners club thread is a bit of a dick move, tbh


----------



## Neon Lights

Quote:


> Originally Posted by *NBrock*
> 
> That depends on a lot of things. Stock for Stock it's actually pulling ahead with newer drivers in quite a few titles. That brings me to my next point...not everyone plays the same games so where Nvidia does better it may not even matter to some people. Not everyone buys a card and overclocks so the overclocking depends on the user. The overclock on the Fury series may not reach the same numbers as the 980ti but the performance still scales pretty well with the amount it can do. Not to mention there are tons of non reference setups for the 980ti that do "blow it out of the water" but also cost a good bit more. Not everyone agrees with Nvidia's past or present business practices....so again another reason why they may not choose an Nvidia card. There are many more reasons one card is not the right card for everyone....not to mention you are in the wrong thread for your Nvidia fanboy trolling.


Considering this site is called "overclock.net" and the Fury X's stock cooling system is the best there is for overclocking, I would say it is safe to assume that overclocking the Fury X is natural (unless someone really explicitly says that overclocking should be disregarded). Also, even the reference 980 Ti's boost usually gets above 1200MHz.

There is not a single game where the Fury X does better than the 980 Ti, I think.

I would say I am anything but an Nvidia fanboy. Are you really that dumb, or have you never read anything I wrote before that post (even in this one I wrote that I am using two Fury Xs myself)? I do not like Nvidia at all; I think they are behaving very badly as a GPU company. I have never owned an Nvidia graphics card.


----------



## Neon Lights

Quote:


> Originally Posted by *p4inkill3r*
> 
> You failed in that regard.


I apparently did for you.


----------



## Neon Lights

Quote:


> Originally Posted by *98uk*
> 
> But, doing that in an owners club thread is a bit of a dick move tbh


I said it is "a dick answer".


----------



## 98uk

Quote:


> Originally Posted by *Neon Lights*
> 
> I said it is "a dick answer".


Yeah, and unless someone is asking, people don't really want "dick answers"


----------



## Neon Lights

Quote:


> Originally Posted by *98uk*
> 
> Yeah and unless someone is asking, people don't really want" dick answers"


I was just trying to make a statement; it was also not really directed towards him.


----------



## jamaican voodoo

It's all good guys, I wasn't offended by his statement. Let's move on.


----------



## HexagonRabbit

Well, I for one will be joining the club very soon. I'm updating my system and I'm going to try my hand at a showcase system. I've been building for years, but until this year I haven't really pushed any of my systems as far as I could.
I still don't know which Fury I will be getting, but more than likely the Nano. Painkiller's suggestions have me leaning towards a Fury non-X, but something in the back of my mind thinks a mini system for the living room would be nice, and a Nano or two would transfer to a mini case well.
But, for funzies, I'm making an all-AMD build:
FX 9590
Crosshair V
(a yet unknown Fury card)
R9 Radeon SSD
R9 Radeon 2400 RAM
.....etc.

I really don't care about how well the 980 Ti runs. A handful of FPS over 60 is meaningless to me. I will say nothing makes me as envious and simultaneously happy for others like computer hardware, so Voodoo, happy for your very expensive purchase. I have about 1200 to burn and something like that sounds tempting.


----------



## Otterfluff

Quote:


> Originally Posted by *Neon Lights*
> 
> Sorry for the dick answer but unless you want to support the better company and/or the more future proof product, the 980ti is the better card by quite a bit for games. That is for DirectX 11, however for DirectX 12 they may tie (1500MHz vs 1200MHz overclock for Nvidia and AMD) or the Fury X may even be a little better. At the moment, and probably for at least a year, AMD graphic cards just have a worse performance in games. There are just too many AAA games still in production and popular that do not use DirectX 12/Vulkan/Mantle.


Are you honestly suggesting he should get two 980 Tis for SLI? I think that idea is a ******ed proposition. I could understand telling him to go for a single-GPU setup, but if he's going for a dual setup the Fury X is still the better deal.

I will admit that Fury X Crossfire profiles have been very slow to get out the door since Crimson came in, but Nvidia have not been much faster.


----------



## SuperZan

Quote:


> Originally Posted by *Otterfluff*
> 
> Are you honestly suggesting he should get two 980 TI for SLI? I think that idea is a ******ed proposition I could understand telling him to go for a single gpu setup but if he's going for a dual setup the Fury X is still the better deal.
> 
> I will admit that fury X crossfire profiles have been very slow to get out the door recently since crimson came in, but nvidia have not been too much faster.


With the Fury X2 on the horizon, I think the pace of profile updates will increase significantly. It would make very little sense to release a dual-GPU card with the intent of capitalising on the VR niche without improving multi-GPU support. As for Fiji Crossfire, my experience has been an anecdotal 95% positive. My previous primary gaming rig was actually 770 SCs in SLI, and I can honestly say I prefer Crossfire's implementation. The profiles may have been a bit quicker with Nvidia, but the system (which to be fair has improved from its earliest inception) is still much more finicky than Crossfire. I'm able to run a Fury X and a Fury in Crossfire at the same clock speed (~1125 for gaming, typically) to no ill effect. I've also had better luck getting Crossfire to stick in non-profiled games with the 16.1.1 driver. It's not all roses, but I feel the forward momentum.


----------



## 98uk

So the new beta completely fixed downclocking for me!


----------



## en9dmp

Quote:


> Originally Posted by *NBrock*
> 
> And the performance on the Fury X IS NOT as bad as some would have you believe. I have been playing everything I want at 4K with one Fury X and a 4770K and have not had to sacrifice settings. Those games include Battlefield 4, Diablo 3, Fallout 4, Star Wars Battlefront, War Thunder, heavily modded Skyrim, and Rise of the Tomb Raider. Drivers ARE getting better with every update. The Fury series did have a rough start, but it is not a terrible card. I can say that all of my AMD cards have had a long life and continued support, with improvements, for a very long time.


Sorry, but there's no chance 4K is acceptable on one Fury X for Fallout 4 unless you're playing on medium-low settings or you're happy playing at 30fps... I have a slightly better setup and need to crank down to 1440p to keep 60fps. Even then it drops into the 40s sometimes.

My laptop with an 8GB 980M actually provides a better experience in Fallout 4 than my Fury X.


----------



## NBrock

Quote:


> Originally Posted by *en9dmp*
> 
> Sorry, but no chance 4k is acceptable on one fury X for fallout 4 unless you're playing on medium - low settings or you're happy playing at 30fps... I have slightly better setup and need to crank down to 1440p to keep 60fps. Even then it drops into the 40s sometimes.
> 
> My laptop with 8gb 980m actually provides a better experience in fallout 4 than my fury X.


Are you playing on 16.1.1? The only thing I turned down was tessellation. Even 16.1 was fine. Also, what's your overclock, and are you throttling? I have to use ClockBlocker in Fallout 4, but the game runs fine.


----------



## en9dmp

Quote:


> Originally Posted by *NBrock*
> 
> You playing on 16.1.1? The only thing I turned down was tessellations. Even 16.1 was fine. Also what's your overclock? Also are you throttling? I have to use Clock Blocker in Fallout 4 but the game runs fine.


Every Crimson version I've tried has been worse than Catalyst 15.11.1, so I've been sticking with that. Fury X overclock is at 1125/525, 4790K @ 4.6GHz, 32GB DDR3-2400 CL10. Tried various optimisation mods as well as unmodded and still get rubbish performance. I force 32x tessellation because the god rays flicker like crazy otherwise. I use ClockBlocker to prevent throttling; temps are low as I'm using a full water-cooling loop. Just figured the game was unoptimised as hell... I was excited about trying 16.1.1 as it has a crossfire profile, but apparently it's useless with v1.3.

What FPS are you getting, then? Test walking around Sanctuary or in the streets around Corvega.


----------



## 98uk

Quote:


> Originally Posted by *en9dmp*
> 
> Every crimson version I've tried has been worse than catalyst 15.11.1 so I've been sticking with that. Fury X overclock is at 1125 525, 4790k @ 4.6Ghz, 32Gb DDR3 2400 CL10. Tried various optimisation mods as well as unmodded and still get rubbish performance. I force 32x tessellation because the god rays flicker like crazy otherwise. I use clock blocker to prevent throttling, temps are low as using full water cooling loop. Just figured the game was unoptimised as hell... was excited about trying 16.1.1 as has a crossfire profile but apparently it's useless with v1.3.
> 
> What FPS are you getting then? Test walking around sanctuary or in the streets around corvega


The new betas completely fixed the clock issues for me.

Don't need to run clock blocker any more.


----------



## NBrock

Quote:


> Originally Posted by *en9dmp*
> 
> Every crimson version I've tried has been worse than catalyst 15.11.1 so I've been sticking with that. Fury X overclock is at 1125 525, 4790k @ 4.6Ghz, 32Gb DDR3 2400 CL10. Tried various optimisation mods as well as unmodded and still get rubbish performance. I force 32x tessellation because the god rays flicker like crazy otherwise. I use clock blocker to prevent throttling, temps are low as using full water cooling loop. Just figured the game was unoptimised as hell... was excited about trying 16.1.1 as has a crossfire profile but apparently it's useless with v1.3.
> 
> What FPS are you getting then? Test walking around sanctuary or in the streets around corvega


I get 50-60. My tessellation is set at 16 if I remember correctly. Also running 1150 core and 545 mem. The game is terribly unoptimized, but for me 16.1 and 16.1.1 helped a lot. I have not tried the game without ClockBlocker on the new driver (16.1.1). When I get off work I can get you some screenshots. I think the lowest FPS I saw was 45ish. My monitor's FreeSync range is 40-60, which helps a lot with it looking super smooth all the time. In Battlefield 4 I never noticed it go below 59FPS (same with Battlefront and War Thunder). I was rather surprised by how well it did, to be honest. I was expecting to have to give up some settings.


----------



## jamaican voodoo

Oh man, I'm so excited for the new rig and these bad boys. Getting some Swiftech Komodo waterblocks for them from PPCs.
This weekend will be nothing but gaming at 4K


----------



## NBrock

Nice! Let us know what you think! Also, we are gonna need pics!


----------



## Jflisk

Quote:


> Originally Posted by *jamaican voodoo*
> 
> Oh man i'm so excited for the new right and these bad boys, getting some swiftech komodo waterblock for them from ppcs
> This weekend will be nothing but gaming at 4k


I was thinking about water blocking my Fury Xs, and I have the pumps and rad. I just don't think it's worth it. But let me know what temps you get; I might give it a go. Thanks


----------



## jamaican voodoo

Quote:


> Originally Posted by *Jflisk*
> 
> I was thinking about water blocking my Fury Xs and I have the pumps and rad. Just don't think its worth it - But let me know what temps you get might give it a go. Thanks


Yes, I will let you know what temps I'm getting. I live in Florida, so take that into account. I have a 360 and a 240 Monsta rad from Alphacool, which should be ample cooling.

Quote:


> Originally Posted by *NBrock*
> 
> Nice! Let us know what you think! also we are gonna need pics!


Yes, I will most definitely take some pics!


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> Every crimson version I've tried has been worse than catalyst 15.11.1 so I've been sticking with that. Fury X overclock is at 1125 525, 4790k @ 4.6Ghz, 32Gb DDR3 2400 CL10. Tried various optimisation mods as well as unmodded and still get rubbish performance. I force 32x tessellation because the god rays flicker like crazy otherwise. I use clock blocker to prevent throttling, temps are low as using full water cooling loop. Just figured the game was unoptimised as hell... was excited about trying 16.1.1 as has a crossfire profile but apparently it's useless with v1.3.
> 
> What FPS are you getting then? Test walking around sanctuary or in the streets around corvega


16.1.1 works with the v1.3 game patch now! 16.1.1 has to be re-downloaded from the AMD website and re-installed.


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> 16.1.1 with the 1.3 version works now! 16.1.1 has to be re-downloaded from the AMD website and re-installed.


Crossfire works? Ok gotta go try that now...!


----------



## Rossky

3D printing my own front plate to make my Fury X actually "my own" seemed too boring and I don't own a 3D Printer so I decided to customize it in a different way.
Sorry for the blurry pic - got shaky hands.


----------



## p4inkill3r

Quote:


> Originally Posted by *Rossky*
> 
> 3D printing my own front plate to make my Fury X actually "my own" seemed too boring and I don't own a 3D Printer so I decided to customize it in a different way.
> Sorry for the blurry pic - got shaky hands.


That looks pretty dope. What kind of paint did you use?


----------



## Rossky

Quote:


> Originally Posted by *p4inkill3r*
> 
> That looks pretty dope. What kind of paint did you use?


It seems to be acrylic paint; I spray painted it. Hope that helps... Google Translate can't do much for me here. Before deciding to paint it at all, I ran the card under load to make sure the painted areas don't heat up.


----------



## p4inkill3r

I think it's pretty cool.

Welcome to the site.


----------



## Rossky

Quote:


> Originally Posted by *p4inkill3r*
> 
> I think its pretty cool.
>
> Welcome to the site.


Thank you
Been visiting it over and over for over a year and now finally got something to show, too.
Will try to get some better pictures and some of the complete rig by tomorrow


----------



## tysonischarles

What are you guys using to OC? Just got mine on an EK block; only hit 41 degrees after running through Heaven 3 times. Gonna get the Metro Redux benchmark soon.

Just need to know what utility to use? Thanks :D


----------



## SuperZan

I'm using Afterburner; I haven't experienced any real conflicts since the 4.2 revision. Trixx is another option. Speaking of AB, I managed an 1175/545 clock on the air-cooled Fury that made it through Catzilla a few times, but it's just not having the 600 step on the HBM at that core clock; I'll try dropping the core to see if I can squeeze a bit more out of the HBM.


----------



## tysonischarles

I've been messing around with Sapphire TRIXX and Heaven while Metro Redux downloads.

Almost got right through at 1150 with stock voltage and 500MHz memory clock. 1140 seems to be my stable at stock mV and mem clock.


----------



## Sonikku13

Quote:


> Originally Posted by *NBrock*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> Is my A10-7850K at 4.1 GHz bottlenecking my Nano in FFXIV? I get 90 FPS uncapped at 1080p max. I also got 8 GB of RAM. A friend has an i5-4690K, 980 Ti, and 16 GB of RAM and he gets 130 FPS uncapped at 1080p max.
> 
> Whether it is bottlenecking or not, my SP3 i3, for comparison, gets 15 FPS uncapped at 720p low. And my OCed A10-7850K at 4.1 GHz with a 960 MHz iGPU clock had been getting 45 FPS at 720p low. So either way, it is a massive upgrade.
> 
> If it is bottlenecking my Nano, would it be worth getting a 1440p monitor?
> 
> 
> 
> A good thing to do you be to monitor your GPU's clocks in game. It may be throttling to stay within the power limits or throttling due to the issues with some of the drivers causing the clocks to fluctuate. If they are throttling I would start by checking the temps and increasing the power limit either in Catalyst Control Center/Crimson or Sapphire Trixx or MSI Afterburner if your temps are fine.
> 
> Also there is a good chance that A10 is holding you back. For example my 8350 @ 4.9 GHz did not perform as well in some games as my 4770k @ 4.1 GHz when i swapped to my current rig.

The game is completely playable at 90 FPS at 1080p maximum, so I'm not too concerned about it. Though down the road I want an MG279Q for 90Hz FreeSync 1440p IPS goodness, which should alleviate the potential bottlenecking.


----------



## p4inkill3r

Quote:


> Originally Posted by *tysonischarles*
> 
> I've been messing around with Sapphire TRIXX and heaven while Metro Redux Downloads.
> 
> Almost got right through at 1150 with stock voltage and 500mhz memory clock. 1140 seems to be my stable at stock mV and Mem Clock.


Start bumping voltage!


----------



## tysonischarles

Quote:


> Originally Posted by *p4inkill3r*
> 
> Start bumping voltage!


I hit 1170 with +75mV and 50% Power limit.

Do I now start tuning down the power limit and mV?


----------



## p4inkill3r

I always leave the power limit maxed, and you have room to keep going up.


----------



## tysonischarles

I'm at 1150 on the GPU and 550 on the HBM; it seems to artifact if I touch anything else :/

Went from 2049 at stock in Heaven to 2216, so just over an 8% increase. I don't know if that's good or not, hahaha


----------



## p4inkill3r

That's with max voltage applied?


----------



## tysonischarles

yessir!


----------



## $k1||z_r0k

Does anyone know how to repair AMD Furys? I voided the warranty and need to repair some.


----------



## p4inkill3r

Quote:


> Originally Posted by *tysonischarles*
> 
> yessir!


Run some Firestrike Extreme/Ultra, determine if bumping core/lowering mem is worth it.


----------



## p4inkill3r

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> does anyone know how to repair AMD Fury's? i voided the warranty and need to repair some


May need to elaborate a bit.


----------



## $k1||z_r0k

Quote:


> Originally Posted by *p4inkill3r*
> 
> May need to elaborate a bit.


One day when I tried to turn on the computer, it would not turn on with my card in it (it powers on for a fraction of a second, then the failsafe kicks in). When I tried to turn it on several more times, I saw a light in the case; one of the chips sparked and the plastic melted a little on it. I have another dead Fury card. Would it be possible to solder off the burnt chip and transplant the affected chip(s), or is the root cause something else?


----------



## p4inkill3r

Quote:


> Originally Posted by *$k1||z_r0k*
> 
> one day when i tried to turn on the computer it would not turn on with my card in it. (powers on for a fraction of a second, then failsafe kicks in.) then when i try to turn it on several times, while i'm doing it i see a light in the case and one of the chips sparks and plastic melts a little on the chip. i have another dead Fury card, would it be possible if i could solder off the burnt chip, transplant the affected chip(s) or is the root cause something else?


Anything is possible, I suppose, but why is the warranty void? Did you kill the other card too?


----------



## tysonischarles

Quote:


> Originally Posted by *p4inkill3r*
> 
> Run some Firestrike Extreme/Ultra, determine if bumping core/lowering mem is worth it.


I'll give that a try, thank you!


----------



## Otterfluff

Quote:


> Originally Posted by *en9dmp*
> 
> Sorry, but no chance 4k is acceptable on one fury X for fallout 4 unless you're playing on medium - low settings or you're happy playing at 30fps... I have slightly better setup and need to crank down to 1440p to keep 60fps. Even then it drops into the 40s sometimes.
> 
> My laptop with 8gb 980m actually provides a better experience in fallout 4 than my fury X.


From day 1 I was running Fallout 4 on my Fury X at 4K resolution with max settings, 16x tessellation, and no god rays. But it required a lot of optimizations to get it to run nicely with 45fps minimums and 55fps maximums. I even had to resort to overclocking my RAM. With a lot of effort and tweaking I went from 25-45fps to 45-55fps.

Fallout 4 is an unoptimized mess that, depending on your rig, won't run well no matter what video card you have. Moving over to a 980 Ti would not have given you a better result; it was and still is a very badly optimized game.

The fact that I could get it to run on my Fury X in a very optimal fashion on release means your rig and setup were just not optimal. Don't blame your Fury X for the crap job they did on the game.

It is humorous that right now a single Fury X is doing better on the most recent updates for Fallout 4 than a 980 Ti; it's a bad joke for Nvidia owners, considering it's a GameWorks game.


----------



## tysonischarles

Managed to hit 1170 on the clock and 530 on memory, tested with metro 2033 redux benchmark. What is everyone else getting?


----------



## 98uk

Quote:


> Originally Posted by *tysonischarles*
> 
> Managed to hit 1170 on the clock and 530 on memory, tested with metro 2033 redux benchmark. What is everyone else getting?


The memory only overclocks in steps apparently, so it's probably not 530MHz. More likely it's running at 545MHz (or stock speed; I forget how it decides the steps).


----------



## SuperZan

500, 545, 600, and the devil's overclock, 666.66. It rounds to the nearest step, whether that be up or down.
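That snapping behaviour can be sketched in a few lines. This is just an illustration of nearest-step rounding as described in the posts above, not AMD driver code; the step list (including 666.66) is taken straight from the thread:

```python
# Nearest-step snapping for the HBM clock, as described above.
# Illustrative only: the step list comes from the thread, not from AMD docs.
HBM_STEPS = [500.0, 545.0, 600.0, 666.66]

def snap_hbm_clock(requested_mhz: float) -> float:
    """Round a requested HBM clock (MHz) to the closest allowed step."""
    return min(HBM_STEPS, key=lambda step: abs(step - requested_mhz))

print(snap_hbm_clock(530))  # 545.0 -- a "530" overclock actually runs at 545
print(snap_hbm_clock(510))  # 500.0 -- rounds down when the lower step is closer
```

This matches the earlier observation that a 530MHz setting most likely runs at 545MHz, since 530 is closer to 545 than to 500.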

Quote:


> Originally Posted by *Radox-0*
> 
> Checking the rules, seems fine. Got the info from here:https://forums.overclockers.co.uk/showthread.php?t=18678073&page=233
> 
> Page 233 post # 6977 by AMD Representative


----------



## tysonischarles

Very interesting! Well, I couldn't even get Firestrike to load at 1170/530 (545?). Just finishing up my CPU OC, 3.5 to 4.4 on my 6600K.

I'll start from stock and work my way up again in Firestrike (if it loads), then test in Metro and Valley/Heaven.

Got through Fire Strike Ultra at 1170/545. Gained 8% from stock.


----------



## JunkaDK

Quote:


> Originally Posted by *SuperZan*
> 
> 500-545-600-and the devil's overclock, 666.66. Rounds it to the nearest step, whether that be up or down.


I see you have a Fury X and a Fury in your rig.. Does it work in CrossfireX?


----------



## aembers

I have an inverted-motherboard case, so if I install my Fury X it will be upside down. Will that cause any problems, like coolant not touching the heatsink because air is in the way?


----------



## Alastair

Quote:


> Originally Posted by *JunkaDK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SuperZan*
> 
> 500-545-600-and the devil's overclock, 666.66. Rounds it to the nearest step, whether that be up or down.
> 
> 
> 
> I see you have a Fury X and a Fury in your rig.. Does it work in CrossfireX?

Yes, but Crossfire will default to using 3584 shaders and 224 TMUs on both cards; it bypasses the extra enabled parts of the Fury X.


----------



## Flamingo

The R9 Nano cooler is a hybrid blower. If I covered one part of the heatsink (where the power connector is, to avoid heat being dissipated into the case), how much of an impact would it have on cooling?

Edit: Nah, seems like a bad idea, since there is a solid line separating the two zones.


----------



## en9dmp

Quote:


> Originally Posted by *Otterfluff*
> 
> From day 1 Fallout 4 I was running it on my Fury X with max settings with 16x tesalation, no god rays in 4k resoultion. But it required alot of optimizations to get it to run nicely with 45fps minimals and 55fps max. I even had to resort to overclocking my ram. With alot of effort and tweaking I went from 25-45 fps -> 45-55fps.
> 
> Fallout 4 is a unoptimized mess that depending on your rig wont run well no matter what video card you had. It's one hell of a mess. Moving over to a 980 TI would not of given you a better result, It was and still is a very badly un-optimized mess.
> 
> The fact that I could get it to run on my Fury X in a very optimal fashion on release means your rig and setup was just not optimal, don't blame your fury X for the crap job they did on the game.
> 
> It is humorous that right now a single Fury X is doing better on the most recent updates for fallout 4 than a 980 TI, it's a bad joke for nvidia owners considering it's a gameworks game.


Good work on the FPS increase. Would you mind posting the contents of your fallout4prefs.ini? I've made so many tweaks but haven't really found a good configuration. Are you using any mods to gain performance?


----------



## Otterfluff

Started playing XCOM 2, and it works under Crossfire in Optimize 1x1. Sometimes I get some texture flashing; other times I don't. That game is brutal at 4K; even after I turn settings down I only get 25-30fps in some areas with one GPU. Crossfire would be really useful in this game, and I hope they make a profile, as I easily get 45-60fps with Crossfire enabled.
Quote:


> Originally Posted by *en9dmp*
> 
> Good work on the FPS increase. Would you mind posting the contents of your fallout4prefs.ini? I've made so many tweaks but haven't really found a good configuration. Are you using any mods to gain performance?


I am using two mods that help a lot:
FPS Dynamic Shadows - Shadow Boost

Fallout 4 Alternative Launcher

The alternative launcher exposes all the Fallout4Prefs.ini edits, as well as some settings that are not in the file to start with. One of the gems is enabling multi-core threading via "iNumHWThreads"; by default, Fallout 4 only uses one core.



Spoiler: Warning: Spoiler!



[Display]
flocalShadowMapHalveEveryXUnit=750.0000
focusShadowMapDoubleEveryXUnit=450.0000
fShadowBiasScale=1.0000
fDirShadowDistance=3000.0000
fShadowDistance=3000.0000
uiOrthoShadowFilter=2
uiShadowFilter=2
iShadowMapResolution=2098
uPipboyTargetHeight=700
uPipboyTargetWidth=876
iVolumetricLightingQuality=-1
bVolumetricLightingEnable=0
bSAOEnable=1
iDirShadowSplits=3
bVolumetricLightingForceCasters=0
iTiledLightingMinLights=40
bComputeShaderDeferredTiledLighting=1
iMaxFocusShadowsDialogue=4
iMaxFocusShadows=1
bForceIgnoreSmoothness=0
fBlendSplitDirShadow=40.0000
bSinglePassDirShadow=1
bEnableWetnessMaterials=1
fTessFactorMaxDistanceScale=100.0000
sAntiAliasing=
fLeafAnimDampenDistEnd=4600.0000
fLeafAnimDampenDistStart=3600.0000
fMeshLODFadePercentDefault=1.2000
fMeshLODFadeBoundDefault=256.0000
fMeshLODFadeScalar=1.0000
fMeshLODLevel2FadeTreeDistance=2048.0000
fMeshLODLevel1FadeTreeDistance=2844.0000
fInteriorMeshLODLevel2FadeDist=1950.0000
fInteriorMeshLODLevel1FadeDist=2600.0000
fMeshLODLevel2FadeDist=3000.0000
fMeshLODLevel1FadeDist=4000.0000
iMaxAnisotropy=1
iPresentInterval=0
bTopMostWindow=0
bMaximizeWindow=0
bBorderless=1
bFull Screen=0
iSize H=2160
iSize W=3840
bAllowShadowcasterNPCLights=0
iScreenShotIndex=0
fMaxFocusShadowMapDistance=450.0000
bPrecipitationOcclusion=1
iMaxSkinDecalsPerFrame=3
iMaxDecalsPerFrame=10
iTexMipMapSkip=0
[Imagespace]
bDoDepthOfField=1
bScreenSpaceBokeh=1
bMBEnable=1
bLensFlare=1
[Pipboy]
fPipboyEffectColorB=0.0900
fPipboyEffectColorG=1.0000
fPipboyEffectColorR=0.0800
[VATS]
fModMenuEffectHighlightPAColorB=0.4100
fModMenuEffectHighlightPAColorG=0.8200
fModMenuEffectHighlightPAColorR=1.0000
fModMenuEffectPAColorB=0.4100
fModMenuEffectPAColorG=0.8200
fModMenuEffectPAColorR=1.0000
fModMenuEffectHighlightColorB=0.0824
fModMenuEffectHighlightColorG=1.0000
fModMenuEffectHighlightColorR=0.0706
fModMenuEffectColorB=0.4200
fModMenuEffectColorG=0.9900
fModMenuEffectColorR=0.4900
[MAIN]
fSkyCellRefFadeDistance=150000.0000
bCrosshairEnabled=1
fHUDOpacity=1.000
bSaveOnPause=1
bSaveOnTravel=1
bSaveOnWait=1
bSaveOnRest=1
[LightingShader]
bScreenSpaceSubsurfaceScattering=0
bScreenSpaceReflections=1
[General]
bGamepadEnable=1
bPipboyCompanionEnabled=0
iStoryManagerLoggingEvent=-1
bEnableStoryManagerLogging=0
uGridsToLoad=5
SIntroSequence=0
iNumHWThreads=4
bPlayMainmenuMusic=1
uExterior Cell Buffer=36
[Interface]
bDialogueSubtitles=0
bGeneralSubtitles=0
iHUDColorB=21
iHUDColorG=255
iHUDColorR=18
bDialogueCameraEnable=1
bShowCompass=1
[Controls]
fMouseHeadingSensitivity=0.0300
fGamepadHeadingSensitivity=0.6667
bAlwaysRunByDefault=1
bInvertYValues=0
bGamePadRumble=1
[GamePlay]
iDifficulty=2
bShowFloatingQuestMarkers=1
bShowQuestMarkers=1
[Particles]
iMaxDesired=750
[SaveGame]
fAutosaveEveryXMins=10.0000
[AudioMenu]
fAudioMasterVolume=1.0000
fVal7=1.0000
uID7=0
fVal6=1.0000
uID6=0
fVal5=1.0000
uID5=0
fVal4=0.6500
uID4=138006
fVal3=0.6500
uID3=1007612
fVal2=1.0000
uID2=94881
fVal1=0.6500
uID1=466532
fVal0=0.6500
uID0=554685
[Water]
bUseWaterDisplacements=1
bUseWaterRefractions=1
bUseWaterReflections=1
bUseWaterDepth=1
[TerrainManager]
fTreeLoadDistance=75000.0000
fBlockMaximumDistance=100000.0000
fBlockLevel2Distance=80000.0000
fBlockLevel1Distance=32000.0000
fBlockLevel0Distance=20000.0000
fSplitDistanceMult=1.1000
bShowLODInEditor=0
[Grass]
fGrassStartFadeDistance=2000.0000
fGrassMaxStartFadeDistance=7000.0000
fGrassMinStartFadeDistance=400.0000
[Decals]
uMaxDecals=100
[LOD]
fLODFadeOutMultSkyCell=1.0000
fLODFadeOutMultObjects=5.0000
fLODFadeOutMultItems=3.0000
fLODFadeOutMultActors=5.0000
[Launcher]
bEnableFileSelection=0
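For anyone comparing their own INI dump against the one above, here's a quick way to check whether the iNumHWThreads tweak is actually set. The helper below is my own sketch (the function name is made up, and the path you pass is your own install's), using only Python's standard-library INI parser:

```python
# Minimal sketch: read iNumHWThreads from a Fallout4Prefs.ini-style file.
# The helper name is illustrative, not part of either mod mentioned above.
import configparser
from typing import Optional

def get_num_hw_threads(ini_path: str) -> Optional[int]:
    cp = configparser.ConfigParser()
    cp.optionxform = str  # preserve Bethesda's mixed-case key names
    cp.read(ini_path)
    if cp.has_option("General", "iNumHWThreads"):
        return cp.getint("General", "iNumHWThreads")
    return None  # key absent: the game falls back to its default threading
```

Note that Bethesda INIs contain keys with embedded spaces ("bFull Screen", "iSize W"); configparser handles those without special treatment.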


----------



## SuperZan

Quote:


> Originally Posted by *JunkaDK*
> 
> I see you have a Fury X and a Fury in your rig.. Does it work in CrossfireX?


Yep, works just as @Alastair stated. I was actually not intending to build a Crossfire rig when I bought the Fury X (which was itself an impulse buy). When the Fury came to me on offer (~£320 after tax) it was an easy decision. Both cards run gaming-stable at 1150/545 and if Crossfire is broken for a given game, I can run it single-GPU on a Fury X. At some point I may replace the Fury with a Nano and move the air-cooled Fury into my second system which is currently on 7970Ghz Crossfire.


----------



## tysonischarles

So here's what I have after some testing!

FireStrike Ultra @ Stock : 6966
FireStrike Ultra @ 1170/545 : 7394

Increase of 6.14%

Unigine Heaven @ Stock: 2055
Unigine Heaven @ 1170/545 : 2246

Increase of 9.29%

Unigine Valley @ Stock: 3229
Unigine Valley @ 1170/545: 3532

Increase of 9.38%

Metro 2033 Redux Benchmark @ Stock: Average Framerate: 35.33
Metro 2033 Redux Benchmark @ 1170/545: Average Framerate: 37.67

Increase of 6.62%

That is as far as I could push it whilst keeping it stable; it never went above 40 degrees. Seems kind of lackluster given how expensive the card was.

That averages out to a 7.86% increase across my 4 benchmarks.

So are you guys having better results? Similar? Let me know
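For anyone who wants to check the arithmetic, here's a tiny script (mine, not from the post) that recomputes the per-benchmark gains from the raw stock/overclocked numbers quoted above:

```python
# Recompute the stock -> overclocked gains from the raw scores above.
results = {
    "FireStrike Ultra": (6966, 7394),
    "Unigine Heaven": (2055, 2246),
    "Unigine Valley": (3229, 3532),
    "Metro 2033 Redux avg FPS": (35.33, 37.67),
}

# Percentage gain for each benchmark: (overclocked / stock - 1) * 100
gains = {name: (oc / stock - 1.0) * 100.0 for name, (stock, oc) in results.items()}
for name, pct in gains.items():
    print(f"{name}: +{pct:.2f}%")

# Simple mean across the four benchmarks
print(f"Average: +{sum(gains.values()) / len(gains):.2f}%")  # ~7.86%
```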


----------



## p4inkill3r

You should fill out the Rigbuilder so we have an idea of what the rest of your computer looks like.
A 10% increase in Heaven/Valley is a pretty good return.


----------



## jamaican voodoo

It's finally here! I will take some pics once I get them out of the box!!! 2016 build


----------



## SuperZan

Quote:


> Originally Posted by *jamaican voodoo*
> 
> it;s finally here i will take some pics once i get them out the box!!! 2016 build
> 
> 
> Spoiler: Warning: Spoiler!


Beautimous! Looking forward to the unboxing/build.


----------



## en9dmp

Quote:


> Originally Posted by *en9dmp*
> 
> Crossfire works? Ok gotta go try that now...!


OMG, the scaling is complete garbage... all this profile does is introduce a ton of input lag, stuttering, and a lower frame rate than with Crossfire disabled. Seriously, how do they get away with this?


----------



## Maximization

Quote:


> Originally Posted by *en9dmp*
> 
> OMG the scaling is complete garbage... all this profile does is introduce a ton of input lag, stuttering and a lower frame rate than with crossfire disabled. Seriously how do they get away with this?



There's no rig in your Rigbuilder. Also, at 4K you need to fine-tune your display; it is an important step.


----------



## en9dmp

Quote:


> Originally Posted by *Maximization*
> 
> there is no rig builder, at 4k you need to fine tune your display also, it is an important step


Errr... what are you talking about? I'm talking about the non-existent scaling in the crossfire profile for fallout 4.

Rig builder? Fine tune your display? OK man...


----------



## Maximization

Quote:


> Originally Posted by *en9dmp*
> 
> Errr... what are you talking about? I'm talking about the non-existent scaling in the crossfire profile for fallout 4.
> 
> Rig builder? Fine tune your display? OK man...


I am talking to the king of the kingdom of idiots. What is your resolution, man??


----------



## en9dmp

Quote:


> Originally Posted by *Maximization*
> 
> I am talking to the king of the kingdom of idiots, what is your resolution man??


I think I need some of whatever you're smoking... I play all other games at 4K/60fps, except Fallout 4, which runs like a turd. Either way, my point is that the new Crossfire profile for Fallout 4 gives much worse performance than running on a single card.

I'm intrigued to hear your advice on fine-tuning the display. Maybe that will help the Crossfire scaling?


----------



## Maximization

Quote:


> Originally Posted by *en9dmp*
> 
> I think I need some of whatever you're smoking... I play all other games at 4k 60fps except fallout 4 which runs like a turd. Either way my point is that the new crossfire profile for fallout 4 in the new drivers gives much worse performance than running on a single card.
> 
> I'm intrigued to hear your advice on fine tuning the display. Maybe that will help the crossfire scaling?


model number of monitor?


----------



## en9dmp

Quote:


> Originally Posted by *Maximization*
> 
> model number of monitor?


Are you for real? I'm playing on a 58in 4K TV.

Are you seriously suggesting that FPS gains can be made through monitor settings?

FreeSync will give a perceivably smoother experience at lower frame rates, but that's irrelevant to the point I was making...


----------



## Maximization

Quote:


> Originally Posted by *en9dmp*
> 
> Are you for real? I'm playing on a 58in 4k TV.
> 
> Are you seriously suggesting you think FPS gains can be made through monitor settings?
> 
> Freesync will give a perceivable smoother experience at lower frame rates but that's irrelevant to the point I was making...


I am not aware of any 58" 4k dedicated computer monitors


----------



## en9dmp

Quote:


> Originally Posted by *Maximization*
> 
> I am not aware of any 58" 4k dedicated computer monitors


The clue is in the word TV... FYI, it's the only TV range in existence (now discontinued) which takes a DisplayPort input and supports 4K @ 60Hz. The clued-up will know it's the Panasonic AX range, and at the time the Fury X came out it was the only option for 4K 60fps living-room gaming, given AMD's decision not to support HDMI 2.0.

Anyway, I've no idea why I'm still replying to you...


----------



## Thoth420

Quote:


> Originally Posted by *en9dmp*
> 
> The clue is in the word TV... FYI it's the only TV range in existence (now discontinued) which takes a displayport input and supports 4k @ 60hz. The clued up will know it's a Panasonic AX range, and at the time the fury x came out out was the only option for 4k 60fps living room gaming given AMD's decision not to support hdmi 2.0.
> 
> Anyway, I've no idea why I'm still replying to you...


How long is your DP cable?


----------



## bluezone

Hi guys,

I've been in the 7900 owners group with my HD 7950s for a while now.
I've been waiting on delivery of an R9 Nano from AMD (won it in a contest when the Radeon Software drivers were introduced, back at the end of Nov).
I've been busy reading back up to page 400 of the R9 Fury owners thread; haven't hit the intro and opinions of the Nano yet in the back pages.
How has the Nano been for use and overclocking?


----------



## Maximization

Quote:


> Originally Posted by *en9dmp*
> 
> The clue is in the word TV... FYI it's the only TV range in existence (now discontinued) which takes a displayport input and supports 4k @ 60hz. The clued up will know it's a Panasonic AX range, and at the time the fury x came out out was the only option for 4k 60fps living room gaming given AMD's decision not to support hdmi 2.0.
> 
> Anyway, I've no idea why I'm still replying to you...


You need DP 1.2 to get 60Hz.


----------



## tysonischarles

Quote:


> Originally Posted by *p4inkill3r*
> 
> You should fill out the rigbuilder so we have an idea of what the rest of your computer looks like.
> 10% increase in Heaven/Valley is a pretty good return.


Done! Now how do I put it in my sig?


----------



## SuperZan

Quote:


> Originally Posted by *en9dmp*
> 
> Errr... what are you talking about? I'm talking about the non-existent scaling in the crossfire profile for fallout 4.
> 
> Rig builder? Fine tune your display? OK man...


Rigbuilder. You know, the option up there on the topbar of the site? Right next to News/Gaming/Reviews? The "clued up" will know that it's the easiest way for people to see your system components and specs. The even more "clued up" will know that components and specs have a massive impact on the kind of advice people can offer you. The resolution is important because different monitors, resolutions, and even display cables have different limitations/hangups/issues. In short, the more detail you can offer people, the more helpful they can be. With the latest hotfix driver my Crossfire setup is buttery smooth at Ultra everything, even with Gimpworks garbage enabled. You can easily look at my *Rigbuilder* rig in my signature to see what I'm using. Would that we could do the same with your setup, perhaps we'd have some advice for you.


----------



## SuperZan

Quote:


> Originally Posted by *tysonischarles*
> 
> done! now how do i put it in my sig?


Click on your avatar in the top right corner of the site, scroll down to Forum Signature, Edit, and there should be an option to pop it in.


----------



## tysonischarles

Done! Thanking you


----------



## p4inkill3r

Quote:


> Originally Posted by *bluezone*
> 
> Hi Guys.
> 
> I've been on the 7900 owners group with my HD 7950's for a while now.
> I've been waiting on delivery of a R9 Nano from AMD (won it in a contest when the Radeon software drivers were introduced, back at the end of Nov).
> I've been busy reading back up to page 400 of the R9 Fury owners thread. Haven't hit the intro and opinions of the Nano yet in the back pages.
> How has the Nano been for use and overclocking?


First off, congratulations on winning a Nano, what a sweet prize!
Secondly, the Nano seems to be a distant third to the Fury and Fury X in popularity, albeit for justifiable reasons. There are happy owners and others that had cards that had perceptible coil noise or other issues that caused them to not enjoy it so much, but for a free card, I think you'll love it.


----------



## bluezone

Quote:


> Originally Posted by *p4inkill3r*
> 
> First off, congratulations on winning a Nano, what a sweet prize!
> Secondly, the Nano seems to be a distant third to the Fury and Fury X in popularity, albeit for justifiable reasons. There are happy owners and others that had cards that had perceptible coil noise or other issues that caused them to not enjoy it so much, but for a free card, I think you'll love it.


Good to hear it's a card to look forward to. I'll have to keep an ear out for coil whine. Probably no way to RMA it if there's a problem, but free is free.
Just wish they would hurry up and send the darn thing. I'm left playing Rise of the Tomb Raider without it. LOL


----------



## bluezone

Quote:


> Originally Posted by *p4inkill3r*
> 
> First off, congratulations on winning a Nano, what a sweet prize!
> Secondly, the Nano seems to be a distant third to the Fury and Fury X in popularity, albeit for justifiable reasons. There are happy owners and others that had cards that had perceptible coil noise or other issues that caused them to not enjoy it so much, but for a free card, I think you'll love it.


Thank You by the way.


----------



## tysonischarles

I will mention the coil whine, my card practically sings to me when I'm benching it hahah


----------



## bluezone

Quote:


> Originally Posted by *tysonischarles*
> 
> I will mention the coil whine, my card practically sings to me when I'm benching it hahah


Show tunes, hip hop, or pop?


----------



## tysonischarles

Quote:


> Originally Posted by *bluezone*
> 
> Show tunes, hip hop, or pop?


Kinda like dubstep! Doesn't help that my network card has awful whine as well... I legit have a choir of whine


----------



## jamaican voodoo

OK, so here are the beasts officially


----------



## SuperZan

Darn right, more power! Very nice, they should look great on the blocks.


----------



## bluezone

Quote:


> Originally Posted by *tysonischarles*
> 
> Kinda like dubstep! doesn't help that my network card has awful whine as well...I legit have a choir of whine


No choir with my 7950's, though one of them occasionally belts out Hank Williams country and western.









----------



## jamaican voodoo

Quote:


> Originally Posted by *SuperZan*
> 
> Darn right, more power! Very nice, they should look great on the blocks.


I can't wait to get them up and running so excited lol, currently putting the blocks on them will post pics after.


----------



## tysonischarles

I may make a recording on my SoundCloud for you to listen to ahaha


----------



## bluezone

Quote:


> Originally Posted by *tysonischarles*
> 
> I may make a recording on my SoundCloud for you to listen to ahaha


Ah, music to my ears.


----------



## jamaican voodoo

alright guys the blocks are on and they look freaking good!!


----------



## SuperZan

Very clean, enjoy firing them up.


----------



## Alastair

Quote:


> Originally Posted by *jamaican voodoo*
> 
> alright guys the blocks are on and they look freaking good!!


what blocks are those? They don't look like EKWB.


----------



## jamaican voodoo

Swiftech Komodos with the stock Fury backplate.


----------



## Otterfluff

That's the first time I have seen a clean and complete photo of the Swiftech Komodo for the Fury X attached to a card. It looks much better than all their promo pics. Thanks for sharing!


----------



## en9dmp

Quote:


> Originally Posted by *SuperZan*
> 
> Rigbuilder. You know, the option up there on the topbar of the site? Right next to News/Gaming/Reviews? The "clued up" will know that it's the easiest way for people to see your system components and specs. The even more "clued up" will know that components and specs have a massive impact on the kind of advice people can offer you. The resolution is important because different monitors, resolutions, and even display cables have different limitations/hangups/issues. In short, the more detail you can offer people, the more helpful they can be. With the latest hotfix driver my Crossfire setup is buttery smooth at Ultra everything, even with Gimpworks garbage enabled. You can easily look at my _*Rigbuilder*_ rig in my signature to see what I'm using. Would that we could do the same with your setup, perhaps we'd have some advice for you.


I only ever read and post on my phone, so no Rigbuilder anywhere. To be honest I wasn't actually asking for advice; I was stating a fact that the CF profile is useless, which is also confirmed on other forums. People coming back talking about monitors and cables should really be knowledgeable enough to know that has nothing to do with my issue. If I were complaining of the display cutting out or the refresh locking at 30Hz, then talking about DP length and version might be sensible, but I'm not. Fallout 4 is the only game I have issues with. I'm running Fury X Crossfire, a 4790K, and 32GB PC2800 C10. This usually allows me to max out my games in 4K, bar the notable exception of FO4. What res are you playing at? That's quite a claim that the new drivers boosted performance that much for you.


----------



## en9dmp

Quote:


> Originally Posted by *Maximization*
> 
> you need DP 1.2 to get 60 Hz


Yes..... yes you do


----------



## SuperZan

Quote:


> Originally Posted by *en9dmp*
> 
> I only ever read and post on my phone so no rigbuilder anywhere. To be honest I wasn't actually asking for advice I was stating a fact that the CF profile is useless, which is also confirmed in other forums. People coming back talking about monitors and cables should really be knowledgeable enough to know that has nothing to do with my issue. If I was complaining of display cutting out or the refresh locking at 30hz then talking about DP length and version might be sensible, but I'm not. Fallout 4 is the only game I have issues with. I'm running fury x crossfire, 4790k, 32Gb PC2800 C10. This usually allows me to max out my games in 4k, bar the notable exception of FO4. What res are you playing at? That's quite a claim that the new drivers boosted performance that much for you.


4k, 3840x2160, LG 27MU67. I'm not the only one experiencing improved performance, so I'm really not sure how it's "quite a claim". With a combination of high and ultra settings it was playable on my monitor when I was only really benefiting from a sole Fury X. Functional Crossfire has made it exceptional. I know that the profile was broken when 16.1.1 was first released, but the improved version fixed the missing texture issues. You've a better CPU than mine, the same amount of RAM, and presumably, the same driver release. This makes questions about your monitor/cable relevant - it's not out of the realm of possibility that somebody might not know that they needed DP 1.2 compatibility for 60Hz 4k.

If you weren't asking for advice, wonderful. If you're here to claim that the CF profile is useless for a fact despite that not being a universal, fine, but I have to wonder why you'd not want to crowd-source an answer short of 'driver useless'. A lot of the people in this thread are very knowledgeable. If you're just here to throw offhand comments when people offer to try to help you, then do please jog on. <3


----------



## en9dmp

Quote:


> Originally Posted by *SuperZan*
> 
> 4k, 3840x2160, LG 27MU67. I'm not the only one experiencing improved performance, so I'm really not sure how it's "quite a claim". With a combination of high and ultra settings it was playable on my monitor when I was only really benefiting from a sole Fury X. Functional Crossfire has made it exceptional. I know that the profile was broken when 16.1.1 was first released, but the improved version fixed the missing texture issues. You've a better CPU than mine, the same amount of RAM, and presumably, the same driver release. This makes questions about your monitor/cable relevant - it's not out of the realm of possibility that somebody might not know that they needed DP 1.2 compatibility for 60Hz 4k.
> 
> If you weren't asking for advice, wonderful. If you're here to claim that the CF profile is useless for a fact despite that not being a universal, fine, but I have to wonder why you'd not want to crowd-source an answer short of 'driver useless'. A lot of the people in this thread are very knowledgeable. If you're just here to throw offhand comments when people offer to try to help you, then do please jog on. <3


The main reason for my post was to warn people who, like me, were probably hoping for the driver to really make a difference, so they don't get their hopes up... After I posted I had a look on the Overclockers UK forums and there are a number of posts confirming the same results as me. It seems like at the moment you may well be in the lucky minority; hence, yes, I was surprised at your claim. I would be interested to see if others on the forum are seeing positive or negative effects from the new FO4 Crossfire profile. At the moment the sample size is too small to really tell.

For the avoidance of doubt, I'll hook my main rig up to my new 1440p monitor and run in 4k VSR to see if there's any difference. This would rule out any questions over the cable or TV.


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> Hi Guys.
> 
> I've been on the 7900 owners group with my HD 7950's for a while now.
> I've been waiting on delivery of a R9 Nano from AMD (won it in a contest when the Radeon software drivers were introduced, back at the end of Nov).
> I've been busy reading back up to page 400 of the R9 Fury owners thread. Haven't hit the intro and opinions of the Nano yet in the back pages.
> How has the Nano been for use and overclocking?


AFAIK it can be overclocked to reach Fury X speeds with the +50% power limit. That makes it better than the air-cooled Fury.


----------



## SuperZan

It is entirely possible that I drew a favourable lot this time 'round. PC specs being what they are, variability is bound to happen. Each release of Crimson has resulted in reports of it not getting on with some systems, more so than I can remember with Catalyst. That said, most of my issues (mostly to do with Crossfire) seem to have been resolved with this latest patch, but it's certainly a YMMV sort of thing.


----------



## SuperZan

Quote:


> Originally Posted by *Flamingo*
> 
> AFAIK it can be overclocked to reach Fury X speeds with the +50% power limit. That makes it better than the air-cooled Fury.


Different reviews and some users have said that it throttles more easily than Fury or Fury X, but being anecdotal I wouldn't take it as gospel. It still seems to have a bit of headroom for overclocking, and fully-unlocked shader count is quite nice.


----------



## Radox-0

Quote:


> Originally Posted by *bluezone*
> 
> Hi Guys.
> 
> I've been on the 7900 owners group with my HD 7950's for a while now.
> I've been waiting on delivery of a R9 Nano from AMD (won it in a contest when the Radeon software drivers were introduced, back at the end of Nov).
> I've been busy reading back up to page 400 of the R9 Fury owners thread. Haven't hit the intro and opinions of the Nano yet in the back pages.
> How has the Nano been for use and overclocking?


Nice, congrats. Having been fussy, I went through the Fury X, a pair of Furys and a pair of Nanos before finally settling on keeping a Nano for my HTPC. My thoughts on it:

In terms of performance it's a great card. At stock Nano speeds (1000 MHz core / 500 MHz memory) it outperformed both my Fury cards, which were the OC versions of the Sapphire Tri-X (1040 MHz), thanks to having the full Fiji core vs a stripped-down but faster core. But it could not sustain that, as the stock cooler on the Nano comes with a fan profile that heavily favours silence, so as a result you will see throttling of the card, landing it in the 900-950 MHz range for daily use. To negate this, just up the fan profile and it will solve the issue.

In terms of coil whine, I have found it's common to all the Fiji cards rather than the Nano specifically. My Fury X had it, my Fury had it and my Nano had it. Only one of my Fury samples and my current Nano produce a reasonable amount, which I would class as acceptable, and once you enable V-Sync on, say, a 60 fps monitor it's fine. The other cards however had a terrible amount, so pot luck on that front I expect.

In terms of overclocking, both my Nano samples on the air cooler could hit and hold 1050 MHz for daily use with the fan ramped up. Memory on one card was totally unstable, while the second card could hit the next discrete memory step of 545 MHz, but not reach the step after that at 600 MHz. At 1050 MHz it was at base Fury X performance, and traded blows with my Fury cards that could overclock much higher, once again thanks to the full Fiji core vs stripped-down but faster. For synthetic benchmarks both cards could reach the 1085-1090 MHz mark okay on the stock cooler, but the fans were ramped up hard to maintain that.

Under water it's all changed and the Nano has become beastly.







My current sample can still hit 545 on the memory, which has not changed as 600 MHz seems too far, but on the core it can hit 1140 MHz in synthetics and settles in at 1125 MHz for daily use. That pretty much puts it at decently overclocked Fury X levels, which is nice. At this point, however, cooling is not the issue; rather, I expect the limited power draw and cut-down power phases are what keep it from reaching the high 1150-1200 MHz mark some Fury Xs / Furys are capable of.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Otterfluff*
> 
> That's the first time I have seen a clean and complete photo of the Swifttech Komodo for the Fury X attached to a card. It looks much better than all their promo pics. Thanks for sharing!


A lot of people are starting to like the way the blocks look as well.


----------



## Pulsarius

Hello there, first post!







I'll be getting an XFX Fury Triple Dissipation in 2 months. Just wondering if there's a custom BIOS for that card? Can anyone point me in the right direction? I saw a custom BIOS for the Fury X, but I guess it won't work on the Fury?

Before you ask: yes, I have an EK nickel+acetal waterblock + backplate waiting for that card.

Thanks, and good to be here.


----------



## tysonischarles

Welcome to the Choir of Coil Whine







As for bios, I have no idea :')


----------



## SuperZan

Quote:


> Originally Posted by *Pulsarius*
> 
> Hello there, first post!
> 
> I'll be getting an XFX Fury Triple Dissipation in 2 months. Just wondering if there's a custom BIOS for that card? Can anyone point me in the right direction? I saw a custom BIOS for the Fury X, but I guess it won't work on the Fury?
> 
> Before you ask: yes, I have an EK nickel+acetal waterblock + backplate waiting for that card.
> 
> Thanks, and good to be here.


Congrats! I own an XFX Fury as well, love the card  As for a modded bios, I'm actually investigating the same thing so I'll let you know what I find unless somebody beats me to it


----------



## xer0h0ur

Quote:


> Originally Posted by *jamaican voodoo*
> 
> alright guys the blocks are on and they look freaking good!!


Skeet skeet! That is a thing of beauty. Really like those transparent block terminals.


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> AFAIK it can be overclocked to reach Fury X speeds with the +50% power limit. That makes it better than the air-cooled Fury.


I'll give that a try when AMD finally decides to forward the card to PCPER.

Thanks for the info.


----------



## bluezone

Quote:


> Originally Posted by *Radox-0*
> 
> In terms of overclocking, both my Nano samples on the air cooler could hit and hold 1050 MHz for daily use with the fan ramped up. Memory on one card was totally unstable, while the second card could hit the next discrete memory step of 545 MHz, but not reach the step after that at 600 MHz. At 1050 MHz it was at base Fury X performance, and traded blows with my Fury cards that could overclock much higher, once again thanks to the full Fiji core vs stripped-down but faster. For synthetic benchmarks both cards could reach the 1085-1090 MHz mark okay on the stock cooler, but the fans were ramped up hard to maintain that.
> 
> Under water it's all changed and the Nano has become beastly.
> 
> My current sample can still hit 545 on the memory, which has not changed as 600 MHz seems too far, but on the core it can hit 1140 MHz in synthetics and settles in at 1125 MHz for daily use. That pretty much puts it at decently overclocked Fury X levels, which is nice. At this point, however, cooling is not the issue; rather, I expect the limited power draw and cut-down power phases are what keep it from reaching the high 1150-1200 MHz mark some Fury Xs / Furys are capable of.


Awesome what you're getting from your Nano.

I'm looking forward to trying a few things myself once I've gotten familiar with my card. If I'm really impressed with it, maybe twins in the future.

Thanks


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> OMG the scaling is complete garbage... all this profile does is introduce a ton of input lag, stuttering and a lower frame rate than with crossfire disabled. Seriously how do they get away with this?


I just tried it a bit and for me (2x Fury X, [email protected], [email protected] with a 120FPS frame limit), it scales decently. There seems to be a little bit of something like rubber banding. There is more input lag, but that is always the case with multi-GPU (at least without DirectX 12, Vulkan, or Mantle), and without VSync it is not bad for me. If there is no CPU limit, I am able to get 120FPS where I previously was not able to.

I will try to record videos with Crossfire enabled and with Crossfire disabled and upload them (video files, not YouTube).


----------



## en9dmp

Quote:


> Originally Posted by *Neon Lights*
> 
> I just tried it a bit and for me (2x Fury X, [email protected], [email protected] with a 120FPS frame limit), it scales decently. There seems to be a little bit of something like rubber banding. There is more input lag, but that is always the case with multi-GPU (at least without DirectX 12, Vulkan, or Mantle), and without VSync it is not bad for me. If there is no CPU limit, I am able to get 120FPS where I previously was not able to.
> 
> I will try to record videos with Crossfire enabled and with Crossfire disabled and upload them (video files, not YouTube).


I spent a lot of time yesterday playing around with a lot of settings and seem to have found the culprit... HBAO+, which looks great and has no perceivable impact on frame rates when running a single Fury X, is what has caused most of the stuttering for me with Crossfire enabled. Changing to SSAO results in much smoother gameplay; however, in this mode the AO effect in game constantly flickers (in exactly the same way as forcing AFR or using the Crysis 3 profile did in the older 15.11.1 drivers). So it seems like they haven't resolved the AO issues in their profile, but it's something I can live with. Can anyone else confirm they have the SSAO shimmering with CF enabled?


----------



## en9dmp

Quote:


> Originally Posted by *Otterfluff*
> 
> Started playing XCOM 2 and it works under Crossfire in Optimize 1x1. Sometimes I get some texture flashing, other times I don't? That game is brutal at 4K; even after I turn settings down I only get 25-30fps in areas with one GPU. So Crossfire would be really useful in this game; I hope they make a profile, as I easily get 45-60fps with Crossfire enabled.
> I am using two mods that help a lot:
> FPS dynamic shadows - Shadow Boost
> 
> Fallout 4 alternative launcher
> 
> The Fallout 4 alternative launcher will have all the fallout4prefs.ini edits available to change, as well as some others that are not in it to start with. One of the gems is enabling multi-core threading with "iNumHWThreads"; by default Fallout 4 only uses one core.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> [Display]
> flocalShadowMapHalveEveryXUnit=750.0000
> focusShadowMapDoubleEveryXUnit=450.0000
> fShadowBiasScale=1.0000
> fDirShadowDistance=3000.0000
> fShadowDistance=3000.0000
> uiOrthoShadowFilter=2
> uiShadowFilter=2
> iShadowMapResolution=2098
> uPipboyTargetHeight=700
> uPipboyTargetWidth=876
> iVolumetricLightingQuality=-1
> bVolumetricLightingEnable=0
> bSAOEnable=1
> iDirShadowSplits=3
> bVolumetricLightingForceCasters=0
> iTiledLightingMinLights=40
> bComputeShaderDeferredTiledLighting=1
> iMaxFocusShadowsDialogue=4
> iMaxFocusShadows=1
> bForceIgnoreSmoothness=0
> fBlendSplitDirShadow=40.0000
> bSinglePassDirShadow=1
> bEnableWetnessMaterials=1
> fTessFactorMaxDistanceScale=100.0000
> sAntiAliasing=
> fLeafAnimDampenDistEnd=4600.0000
> fLeafAnimDampenDistStart=3600.0000
> fMeshLODFadePercentDefault=1.2000
> fMeshLODFadeBoundDefault=256.0000
> fMeshLODFadeScalar=1.0000
> fMeshLODLevel2FadeTreeDistance=2048.0000
> fMeshLODLevel1FadeTreeDistance=2844.0000
> fInteriorMeshLODLevel2FadeDist=1950.0000
> fInteriorMeshLODLevel1FadeDist=2600.0000
> fMeshLODLevel2FadeDist=3000.0000
> fMeshLODLevel1FadeDist=4000.0000
> iMaxAnisotropy=1
> iPresentInterval=0
> bTopMostWindow=0
> bMaximizeWindow=0
> bBorderless=1
> bFull Screen=0
> iSize H=2160
> iSize W=3840
> bAllowShadowcasterNPCLights=0
> iScreenShotIndex=0
> fMaxFocusShadowMapDistance=450.0000
> bPrecipitationOcclusion=1
> iMaxSkinDecalsPerFrame=3
> iMaxDecalsPerFrame=10
> iTexMipMapSkip=0
> [Imagespace]
> bDoDepthOfField=1
> bScreenSpaceBokeh=1
> bMBEnable=1
> bLensFlare=1
> [Pipboy]
> fPipboyEffectColorB=0.0900
> fPipboyEffectColorG=1.0000
> fPipboyEffectColorR=0.0800
> [VATS]
> fModMenuEffectHighlightPAColorB=0.4100
> fModMenuEffectHighlightPAColorG=0.8200
> fModMenuEffectHighlightPAColorR=1.0000
> fModMenuEffectPAColorB=0.4100
> fModMenuEffectPAColorG=0.8200
> fModMenuEffectPAColorR=1.0000
> fModMenuEffectHighlightColorB=0.0824
> fModMenuEffectHighlightColorG=1.0000
> fModMenuEffectHighlightColorR=0.0706
> fModMenuEffectColorB=0.4200
> fModMenuEffectColorG=0.9900
> fModMenuEffectColorR=0.4900
> [MAIN]
> fSkyCellRefFadeDistance=150000.0000
> bCrosshairEnabled=1
> fHUDOpacity=1.000
> bSaveOnPause=1
> bSaveOnTravel=1
> bSaveOnWait=1
> bSaveOnRest=1
> [LightingShader]
> bScreenSpaceSubsurfaceScattering=0
> bScreenSpaceReflections=1
> [General]
> bGamepadEnable=1
> bPipboyCompanionEnabled=0
> iStoryManagerLoggingEvent=-1
> bEnableStoryManagerLogging=0
> uGridsToLoad=5
> SIntroSequence=0
> iNumHWThreads=4
> bPlayMainmenuMusic=1
> uExterior Cell Buffer=36
> [Interface]
> bDialogueSubtitles=0
> bGeneralSubtitles=0
> iHUDColorB=21
> iHUDColorG=255
> iHUDColorR=18
> bDialogueCameraEnable=1
> bShowCompass=1
> [Controls]
> fMouseHeadingSensitivity=0.0300
> fGamepadHeadingSensitivity=0.6667
> bAlwaysRunByDefault=1
> bInvertYValues=0
> bGamePadRumble=1
> [GamePlay]
> iDifficulty=2
> bShowFloatingQuestMarkers=1
> bShowQuestMarkers=1
> [Particles]
> iMaxDesired=750
> [SaveGame]
> fAutosaveEveryXMins=10.0000
> [AudioMenu]
> fAudioMasterVolume=1.0000
> fVal7=1.0000
> uID7=0
> fVal6=1.0000
> uID6=0
> fVal5=1.0000
> uID5=0
> fVal4=0.6500
> uID4=138006
> fVal3=0.6500
> uID3=1007612
> fVal2=1.0000
> uID2=94881
> fVal1=0.6500
> uID1=466532
> fVal0=0.6500
> uID0=554685
> [Water]
> bUseWaterDisplacements=1
> bUseWaterRefractions=1
> bUseWaterReflections=1
> bUseWaterDepth=1
> [TerrainManager]
> fTreeLoadDistance=75000.0000
> fBlockMaximumDistance=100000.0000
> fBlockLevel2Distance=80000.0000
> fBlockLevel1Distance=32000.0000
> fBlockLevel0Distance=20000.0000
> fSplitDistanceMult=1.1000
> bShowLODInEditor=0
> [Grass]
> fGrassStartFadeDistance=2000.0000
> fGrassMaxStartFadeDistance=7000.0000
> fGrassMinStartFadeDistance=400.0000
> [Decals]
> uMaxDecals=100
> [LOD]
> fLODFadeOutMultSkyCell=1.0000
> fLODFadeOutMultObjects=5.0000
> fLODFadeOutMultItems=3.0000
> fLODFadeOutMultActors=5.0000
> [Launcher]
> bEnableFileSelection=0


Hi Otter, thanks for posting your settings; they did help with narrowing down some of the problem areas. Though I'm not surprised this gives you good fps, as you've disabled most of the intensive features like the volumetric lighting and god rays, and you're even running with AF at 1x... the latter makes everything look terrible though; surely you can up that to 16x without going below your target frame rate?


----------



## Otterfluff

I certainly can turn up stuff now that Crossfire works; however, god rays and volumetric lighting do very little for me, and I do run at 4K resolution. Even AF has diminishing returns in visual gain at 4K; many texturing effects become less of a big deal when running at 4K res. But I will definitely turn it up now that my twin Fury Xs are playing nicely.

God rays and volumetric lighting tax fps the heaviest and are not worth turning on if you aren't getting decent fps to start with.


----------



## Neon Lights

Quote:


> Originally Posted by *en9dmp*
> 
> I spent a lot of time yesterday playing around with a lot of settings and seem to have found the culprit... HBAO+, which looks great and has no perceivable impact on frame rates when running a single Fury X, is what has caused most of the stuttering for me with Crossfire enabled. Changing to SSAO results in much smoother gameplay; however, in this mode the AO effect in game constantly flickers (in exactly the same way as forcing AFR or using the Crysis 3 profile did in the older 15.11.1 drivers). So it seems like they haven't resolved the AO issues in their profile, but it's something I can live with. Can anyone else confirm they have the SSAO shimmering with CF enabled?


For me, the game looks the same as it does with one graphic card. The usage went up from about 50% per GPU with SSAO to about 65% per GPU with HBAO+ (at least in the one scene where I tested it) with a 120FPS limit.

Quote:


> Originally Posted by *Otterfluff*
> 
> I certainly can turn up stuff now that Crossfire works; however, god rays and volumetric lighting do very little for me, and I do run at 4K resolution. Even AF has diminishing returns in visual gain at 4K; many texturing effects become less of a big deal when running at 4K res. But I will definitely turn it up now that my twin Fury Xs are playing nicely.
> 
> God rays and volumetric lighting tax fps the heaviest and are not worth turning on if you aren't getting decent fps to start with.


I have directly compared the Afterburner values (GPU usage, CPU usage, FPS) with God Rays set to High (volumetric lighting quality 2) and set to Ultra (volumetric lighting quality 3) and noticed that neither the GPU usage nor the CPU usage goes up (by more than a few percent), but the FPS decrease (by about 25%) with a 120 FPS limit. As far as I know, that means the game (or the Nvidia God Rays in it) is not well optimized.


----------



## Otterfluff

I can easily get 16x AF now in Fallout 4 Crossfire. I am getting around 55-80 fps in game. I notice it goes lower with Frame Rate Target Control; the downclocking is much less than before, but it's still there. I would recommend using RivaTuner Statistics Server to limit the fps. I have my 4K monitor overclocked to 75 Hz, so I set a 75 fps limit in RivaTuner.

I also have the Shadow Boost target set to 65 fps.
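For anyone curious what a frame limiter like RivaTuner's actually does, a minimal sketch (purely illustrative; this is not RTSS's real implementation, just the basic pacing technique of sleeping out the remainder of each frame's time budget):

```python
import time

def frame_limited_loop(target_fps, n_frames, render=lambda: None):
    """Cap a render loop at target_fps by sleeping out each frame's budget."""
    budget = 1.0 / target_fps
    next_deadline = time.perf_counter()
    frame_times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        render()  # the actual frame work would happen here
        next_deadline += budget
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait until this frame's deadline
        frame_times.append(time.perf_counter() - start)
    return frame_times

# Cap a dummy workload at 75 fps, matching the 75 Hz monitor mentioned above:
frame_times = frame_limited_loop(75, 30)
```

Pacing against a running deadline (rather than sleeping a fixed amount after each frame) is what keeps frame times even, which is why a limiter can feel smoother than an uncapped but fluctuating frame rate.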


----------



## Noirgheos

Is there a way to cool HBM without a waterblock? Just a 40 MHz OC on it gave me 10 FPS average in benchmarks; thing is, my temps went to 64°C max from 54°C...


----------



## p4inkill3r

Quote:


> Originally Posted by *Noirgheos*
> 
> Is there a way to cool HBM without a waterblock? Just a 40MHz OC on it gave me 10FPS avg in benchmarks, thing is, my temps went to 64C max from 54C...


What's wrong with 64C?


----------



## Noirgheos

Quote:


> Originally Posted by *p4inkill3r*
> 
> What's wrong with 64C?


Nothing, it's just that I'm afraid to push it further. If 40 MHz gave 10°C more... I'd like to try 580 MHz, and 74°C isn't appealing.


----------



## tysonischarles

Well if you read a bit further back, the HBM has set clock speeds: 500/545/600/666 MHz.

I'd be happy with what you have, as it's most likely actually running at 545. From my testing, anything above that causes a lot of problems. I'd try working up the GPU clock speed instead. Go back about two pages and read the link; it makes more sense than me
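For context on why those discrete steps each matter so much: Fiji's HBM sits on a 4096-bit interface, so every step adds a large chunk of raw bandwidth. A quick illustrative calculation (bus width from AMD's published Fiji specs; the clock steps are the ones listed above):

```python
# Raw HBM bandwidth at Fiji's discrete memory clock steps.
# bandwidth = clock * 2 (double data rate) * bus width in bytes

BUS_WIDTH_BITS = 4096  # Fiji's HBM interface width

def hbm_bandwidth_gbs(clock_mhz):
    """Peak theoretical bandwidth in GB/s for a given HBM clock."""
    return clock_mhz * 1e6 * 2 * BUS_WIDTH_BITS / 8 / 1e9

for step in (500, 545, 600, 666):
    print(f"{step} MHz -> {hbm_bandwidth_gbs(step):.0f} GB/s")
# 500 MHz is the stock setting, giving the familiar 512 GB/s figure.
```

So even the single step from 500 to 545 MHz is roughly a 9% jump in raw memory bandwidth, which lines up with why a small HBM overclock moves benchmark numbers noticeably.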


----------



## Arizonian

So I started to entertain the idea of a 60 Hz 4K over a 144 Hz 1440p freesync monitor. I'm trying to get a feel for others experience running 4K on a single fury or fury X.

1. How well are you doing keeping 60 FPS average? Which games, what settings?
2. Is running no AA at 4K still as brilliant as 1440 with AA turned ON?

I'm hoping with medium settings maintaining 60 FPS would be possible on my nitro fury as an interim card that I will be selling when AIB's come out with their versions of Polaris.

I've been weighing waiting on DP 1.3 for 4K support up to 144 Hz, but I'm not sure a single Polaris will be able to push it properly; there's no knowing what it holds until it's released. I do not want to Crossfire, or SLI, ever again.

I'm not a competitive gamer and spend more time using monitor for multi-media like streaming than gaming. Currently there aren't any 144 Hz freesync IPS monitors that have caught my interest due to either low freesync range capped at 90 FPS or overdrive not working properly on others.

Whatever I choose will be for two years or more so it's been a tough choice. Thanks in advance for any replies. I may have found a nice 60 Hz freesync 4k monitor with DP1.2a and HDMI 2.0.


----------



## SuperZan

I've got an LG 27MU67 that I used with a single Fury X for about two months before picking up my Fury for Crossfire. I was able to keep 50-60 FPS relatively well in most newer games, provided I used a mix of medium and high settings and turned off the big offenders (HBAO+, max tessellation, etc.). There were occasional forays into the 40s, but nothing too troubling, as my panel is Freesync, albeit with a limited range. Older games/MMOs were toast: no problems at high settings and the like.


----------



## Neon Lights

Quote:


> Originally Posted by *Neon Lights*
> 
> I just tried it a bit and for me (2x Fury X, [email protected], [email protected] with 120FPS frame limit), it scales decently. There seems to be a little bit of something like rubber banding. There is more input lag, but that is always the case with Multi GPU (at least without DirectX 12, Vulcan, Mantle) and without VSync it is not bad for me. If there is no CPU limit, I am able to get 120FPS when I previously was not able to.
> 
> I will try to record videos with Crossfire enabled and with Crossfire disabled and upload them (video files, not YouTube).


So I just tried to record the videos, but when I view them there appears to be stuttering in them that was not present while I was playing the game.

Below is a link to the videos. The first one was recorded using AMD's accelerated H.264 encoding, the second one using the Lagarith Lossless codec; both were recorded with Bandicam (version 3.0.2.1014) at 120FPS onto an HDD that has no other load on it (in case that could be a problem and cause the stuttering).
Does anyone have an idea what could cause it?

https://drive.google.com/folderview?id=0Bw0viD8oaIA5aFlZdVZvdlVlRGM&usp=sharing


----------



## bluezone

Quote:


> Originally Posted by *Neon Lights*
> 
> So I just tried to record the videos, but when I view them there appears to be stuttering in them that was not present when I was playing the game.
> 
> This is a link to the videos. The first one is recorded using the H.264 accelerated encoding by AMD, the second one using the Lagarith Lossless codec, both recorded using Bandicam (version 3.0.2.1014) at 120FPS on an HDD that has no other load on it (in case that could be a problem and cause the stuttering).
> Does anyone have an idea what could cause it?
> 
> https://drive.google.com/folderview?id=0Bw0viD8oaIA5aFlZdVZvdlVlRGM&usp=sharing


I'm not an expert, but maybe a frame rate mismatch, or too high of a bit rate if it's 4K. Just shots in the dark.


----------



## Neon Lights

Quote:


> Originally Posted by *bluezone*
> 
> I'm not an expert but maybe a frame rate mismatch or too high of bit rate if its 4k. Just shots in the dark.


It is 1080p @120FPS.

How can a frame rate mismatch happen?


----------



## bluezone

The recording rate doesn't exactly match the display rate. I believe it's like picket fencing.

A few years ago, when the Sandy Bridge CPUs came out, the iGPU presented an output of 60 fps. The problem is that movie playback is 59.25 fps (or something like that). So when you viewed movie content using the iGPU, you would get a jumping in the video playback presentation.

(Edited because far too tired when I first wrote this.)
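The picket-fencing idea can be sketched in a few lines. Assuming a 60 Hz capture of a 59.94 fps source (illustrative numbers, not measured from the actual videos), sampling on the capture clock duplicates a source frame roughly once a second, which shows up as a visible hitch:

```python
# Sketch: capturing a 59.94 fps source on a 60 Hz clock.
# Each capture tick grabs whichever source frame is current at that instant.
source_fps = 59.94
capture_fps = 60.0

# Source frame index visible at each of the 60 capture ticks in one second.
sampled = [int(tick / capture_fps * source_fps) for tick in range(60)]

# A repeated index means the same source frame was captured twice: a hitch.
repeats = sum(1 for a, b in zip(sampled, sampled[1:]) if a == b)
print(repeats)  # 1 duplicated frame in this one-second window
```

The closer the two rates, the rarer the hitch, but it never goes away unless the rates match exactly.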


----------



## bluezone

Okay I've had coffee now.

So what might all this prove? Well, I think that the high-rate video (120 fps) is capturing and amplifying the slight variations (uneven frame times) in the gameplay video. So yes, I think your gameplay is hitching or stuttering.


----------



## bluezone

Tom's Hardware has an article on undervolting R9 Fury.

http://www.tomshardware.com/reviews/msi-afterburner-undervolt-radeon-r9-fury,4425.html

This caught my eye:

Note: Every GPU is unique. This means that the lowest number at which a processor still functions without errors can be different from one to the next. What's more, errors don't always become apparent immediately. All of a graphics card's features need to be used for a prolonged period of time to determine if the configuration is stable.

Cards made during AMD's early production cycle are particularly prone to display problems after undervolting. These GPUs were only made into Furies by a somewhat adventurous unlocking process. The newer ones are a lot easier to undervolt. However, even with these, your mileage will vary.

I'm thinking out loud here.

High frame rates usually require high current draw. How gentle is the cut-off point of the power delivery circuit on the Fury family?

Wow, good read. More questions than answers though.

1) Do Fury chips that don't overclock well undervolt well?
2) Do attempts at hot-modding the voltage regulator bridge circuit require an equal increase in resistance to balance out the voltage drop (in the other half of the bridge circuit), in order to maintain current flow (sensitivity) and prevent clipping and voltage-regulation noise?
3) Does this mean the Nano may get better-overclocking chips, because they are newer and more stable? Unfortunately coupled with a less powerful voltage delivery system.

Remember, I'm just thinking out loud.


----------



## AliNT77

They (and everyone else) should test undervolting with the R9 Nano.

Undervolting is the best way to overclock the Nano, because it always power throttles.

So when you undervolt it, the chip uses less power than at stock voltage for the same frequency, so it doesn't throttle as much as before.

I personally tested this with my R9 290 @ -30% power limit.

With stock voltage it was running @ 780-850MHz.

But with -100mV it was running at a rock-solid 947MHz.
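The logic above can be sketched with the usual dynamic-power approximation (power roughly proportional to V² × f). The voltage and clock numbers below are purely illustrative, not measurements from any Fiji or Hawaii card:

```python
# Rough sketch: why undervolting helps a power-throttled card.
# Assumes dynamic power ~ C * V^2 * f (constant C dropped, arbitrary units).
def relative_power(voltage_v: float, freq_mhz: float) -> float:
    return voltage_v ** 2 * freq_mhz

stock = relative_power(1.20, 1000)        # hypothetical stock voltage/clock
undervolted = relative_power(1.10, 1000)  # same clock, roughly -100 mV

# Less power at the same frequency leaves headroom under the power limit,
# so the card can hold its clock instead of throttling down.
print(f"{undervolted / stock:.2f}")  # 0.84 -> ~16% less power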


----------



## Crisium

Quote:


> Originally Posted by *tysonischarles*
> 
> Well if you read a bit further back, the HBM has set clock speeds: 500/545/600/666.
> 
> I'd be happy with what you have, as it's most likely actually at 545. And from my testing, anything above causes me a lot of problems. I'd try to work up the GPU clock speed. Go back like two pages and read the link; it makes more sense than me.


I have my memory at 550. Should I set it to 545 instead? What actually happens when I set it to 550?


----------



## SuperZan

Quote:


> Originally Posted by *Crisium*
> 
> I have my memory at 550. Should I set it to 545 instead? What actually happens when I set it to 550?


It shouldn't make a difference whether you set it to 550 or 545. At 550, it's clocking down to 545. At 573, it should start clocking up to 600.
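If the discrete-step behaviour is as described, the effective HBM clock is just nearest-step rounding; a small sketch, with the step list assumed from the AMD rep's comments quoted elsewhere in the thread:

```python
# Sketch of the reported Fiji HBM behaviour: MCLK snaps to the nearest
# of a few discrete steps, whatever the overclocking tool requests.
HBM_STEPS_MHZ = [500.00, 545.45, 600.00, 666.66]

def effective_mclk(requested_mhz: float) -> float:
    """Step the requested HBM clock would actually run at (nearest step)."""
    return min(HBM_STEPS_MHZ, key=lambda step: abs(step - requested_mhz))

print(effective_mclk(550))  # 545.45 -> a 550 setting clocks down
print(effective_mclk(570))  # 545.45 -> the "570 is really 545" case
print(effective_mclk(573))  # 600.0  -> past the midpoint, rounds up
```

The midpoint between 545.45 and 600 is about 572.7, which is why 573 is where the jump to 600 happens.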


----------



## tysonischarles

I actually don't know at what point it goes up or down to whichever step, so I can't actually answer the question.


----------



## SuperZan

It works in discrete increments of 500/545.45/600/666.66 MHz, rounding to the step nearest your requested clockspeed.

Quote:


> Originally Posted by *Radox-0*
> 
> Checking the rules, seems fine. Got the info from here: https://forums.overclockers.co.uk/showthread.php?t=18678073&page=233
> 
> Page 233 post # 6977 by AMD Representative


Quote: AMDMatt's post @ overclockers.co.uk


> "Nice one Kaap. Here's a fun fact for all your Fiji overclockers.
> 
> Fiji's MCLK overdrives in discrete steps, so while various overclocking tools are increasing in 5Mhz steps the MCLK is actually only able to support 500.00/545.45/600.00/666.66MHz and right now it just rounds to the nearest step. If you are intending to overclock HBM on Fiji, then 545Mhz works well on all four of my Fury X cards. 600MHZ proved a step too far on each card and resulted in instability.
> 
> So for all those people setting their MCLK at 570Mhz, it's actually running at 545Mhz."


Anecdotally, Fiji does not care for insta-boosting memory clockspeed, at least on my cards. I've had success running up to 545, running a Fire Strike or Catzilla, letting it sit for a moment, and then attempting another run at 600.


----------



## 98uk

What voltage bump are you guys running for 1100MHz core?

I thought I had it stable at 1100/550 @ +18mV (+50% power); it survived BF4 and 3DMark... but crashed very quickly in PCars.

I bumped to +36mV and now PCars is stable... but is +36mV a lot for 1100MHz?


----------



## JunkaDK

Quote:


> Originally Posted by *98uk*
> 
> What voltage bump are you guys running for 1100mhz core?
> 
> I thought I had it stable at 1100/550 @ +18mV (+50% power), it survived BF4 and 3Dmark... but crashed very quickly in Pcars.
> 
> I bumped to +36mV and now Pcars is stable... but is +36mV a lot for 1100mhz?


From my experience, you can throw as much power at it as you want... my problem is the VRM overheating; the core is fine. I'm gonna replace the TIM on my Asus R9 Fury Strix and hope that it will lower the temps of the VRM. People have reported massive temp drops on the core from doing this too: 10+ degrees.


----------



## ebinkerd

Quote:


> Originally Posted by *98uk*
> 
> What voltage bump are you guys running for 1100mhz core?
> 
> I thought I had it stable at 1100/550 @ +18mV (+50% power), it survived BF4 and 3Dmark... but crashed very quickly in Pcars.
> 
> I bumped to +36mV and now Pcars is stable... but is +36mV a lot for 1100mhz?


Which card? Fury? I have my Nano set to 1100/545 @+12mv stable on Firestrike, Unigine, and all games.


----------



## 98uk

Quote:


> Originally Posted by *ebinkerd*
> 
> Which card? Fury? I have my Nano set to 1100/545 @+12mv stable on Firestrike, Unigine, and all games.


Sapphire Fury (non-Nitro). I thought it was fine at +18mV, but Project Cars seemed to kill it fairly quickly.


----------



## p4inkill3r

Quote:


> Originally Posted by *98uk*
> 
> Sapphire Fury (non-nitro). I thought it was fine at +18mV, but Project Cars seemed to kill it fairly quick


Witcher 3 kills my OCs over and over, but most other games are stable around 1150/545 with +55mv on my Fury X.


----------



## 98uk

Quote:


> Originally Posted by *p4inkill3r*
> 
> Witcher 3 kills my OCs over and over, but most other games are stable around 1150/545 with +55mv on my Fury X.


Yeah, PCars just died again... no idea why.

I can play hours upon hours of BF4 with no issue... but Pcars must be stock...


----------



## JunkaDK

Quote:


> Originally Posted by *98uk*
> 
> Yeah, PCars just died again... no idea why.
> 
> I can play hours upon hours of BF4 with no issue... but Pcars must be stock...


Are you monitoring temps or load?


----------



## p4inkill3r

Quote:


> Originally Posted by *98uk*
> 
> Yeah, PCars just died again... no idea why.
> 
> I can play hours upon hours of BF4 with no issue... but Pcars must be stock...


GTA5 was the same way for many people that kept getting crashes; it just didn't like anything except stock settings for whatever reason.


----------



## 98uk

Quote:


> Originally Posted by *JunkaDK*
> 
> Are you monitoring temps or load?


Yup, core temps are never even hitting 60c...


----------



## JunkaDK

Quote:


> Originally Posted by *98uk*
> 
> Yup, core temps are never even hitting 60c...


Did you compare the load pattern in this game to other games? Is it maxed out constantly or fluctuating? V-sync on/off?


----------



## ebinkerd

So it seems undervolting the Nano yields some impressive results.

This one is at 1090/545 @ -18mv http://www.3dmark.com/3dm/10696867?

This one is 1075/545 @ -6mv http://www.3dmark.com/fs/7505360


----------



## tysonischarles

Quote:


> Originally Posted by *ebinkerd*
> 
> So it seems undervolting the Nano yields some impressive results.
> 
> This one is at 1090/545 @ -18mv http://www.3dmark.com/3dm/10696867?
> 
> This one is 1075/545 @ -6mv http://www.3dmark.com/fs/7505360


Seems too good to be true... would a Fury X yield similar results, I wonder?

Edit: For a second there I thought it was Firestrike Extreme, and that a Nano was smashing my Fury X by like 3k points hahah.


----------



## Arizonian

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> Spoiler: My question
> 
> 
> 
> So I started to entertain the idea of a 60 Hz 4K over a 144 Hz 1440p freesync monitor. I'm trying to get a feel for others experience running 4K on a single fury or fury X.
> 
> 1. How well are you doing keeping 60 FPS average? Which games, what settings?
> 2. Is running no AA at 4K still as brilliant as 1440 with AA turned ON?
> 
> I'm hoping with medium settings maintaining 60 FPS would be possible on my nitro fury as an interim card that I will be selling when AIB's come out with their versions of Polaris.
> 
> I've been weighing waiting on DP1.3 for 4K support up to 144 Hz, but I'm not sure a single Polaris will be able to push it properly; I won't know what it holds until it's released. I do not want to CrossFire, or SLI, ever again.
> 
> I'm not a competitive gamer and spend more time using monitor for multi-media like streaming than gaming. Currently there aren't any 144 Hz freesync IPS monitors that have caught my interest due to either low freesync range capped at 90 FPS or overdrive not working properly on others.
> 
> Whatever I choose will be for two years or more so it's been a tough choice. Thanks in advance for any replies. I may have found a nice 60 Hz freesync 4k monitor with DP1.2a and HDMI 2.0.


Quote:


> Originally Posted by *SuperZan*
> 
> I've got an LG 27MU67 that I used with a single Fury X for about two months before picking up my Fury for crossfire. I was able to keep 50-60 FPS relatively well in most newer games, provided I used a mix of medium and high settings and turned off the big offenders (HBAO+, max tess, etc.). Occasional forays into the 40's but nothing too troubling as my panel is Freesync, albeit a limited range. Older games/MMO's were toast, no problems at High settings and the like.


Thank you for your reply. +1 rep for sharing as I'm within $200 of reaching my goal and about to make a $700 purchase. So along comes *THIS* very appealing 4K monitor with HDMI 2.0 and DP 1.2a support.

I'm glad ViewSonic decided to do away with the bird feet stand on this new gaming series.

As for gaming, I wouldn't mind turning down game settings while looking down the road patiently for when Polaris launches. Suddenly this Nitro Fury has more value/appeal to me than before, allowing me to explore 4K. And not to forget, LiquidVR looks promising.


----------



## p4inkill3r

Quote:


> Originally Posted by *Arizonian*
> 
> Thank you for your reply. +1 rep for sharing as I'm within $200 of reaching my goal and about to make a $700 purchase. So along comes *THIS* very appealing 4K monitor with HDMI 2.0 and DP 1.2a support.
> 
> I'm glad ViewSonic decided to do away with the bird feet stand on this new gaming series.
> 
> As for gaming, I wouldn't mind turning down game settings while looking down the road patiently for when Polaris launches. Suddenly this nitro fury has more value / appeal to me than before allowing me to explore 4K and not to forget liquid VR looks promising


Not a bad looking monitor at all, and 4K is awesome even when not at maximum ultra galaxy settings and pegged 60FPS.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> Not a bad looking monitor at all, and 4K is awesome even when not at maximum ultra galaxy settings and pegged 60FPS.


Once Polaris and Pascal drop I'm making the jump up to 4K. I really appreciate above 60fps on Ultra Galaxy Worldender 9000 on PC since I also play some titles on the xbone.

The console has been a saving grace while I wait for my GPU replacement. I should have it all up and running by Saturday and then finally it's Witcher 3 time!


----------



## Kana-Maru

I'm enjoying my 4K experiences with stock clocks. I couldn't wait and had to jump on 4K gaming last year.


----------



## Menno

I am running a Fury X under water with an EKWB block at stock now, only because I had the pump whine and the shop did not want to replace it, so this was the only way lol. I flashed a Sapphire BIOS with UEFI on it because the stock MSI BIOS did not have GOP/UEFI. I can't add voltage with Afterburner; what do you guys use? The Sapphire tool?


----------



## tysonischarles

I use Sapphire TriXX for my overclocks.

I do have some interesting test results! I don't have any screenshots of this though as my internet was down and I didn't save the Firestrike results, I did however write them down on a piece of cardboard.

So at stock I got 12,835. I did an increase with the HBM frequency to 545 but nothing to the voltage or the GPU, I scored 12970, so nothing major from memory alone. From my previous testing I knew that 1140 on the GPU was my stable on stock voltage.

So I started at +30mv to see how far I could creep up, having normally just maxed the mV setting thinking it would be best.
At 1160/545 +30mV I managed to score 13660. At 1170/545 +30mV I hit 13749. I think that's my highest stable score so far. I then for the sake of curiosity, maxed out the mV to +75, thinking it would stabilize it, when in fact it lowered my result to 13642. So more mV isn't actually better for performance's sake. I then pushed to 1175/545 +30mV and managed to score 13786. My highest score to date! Whilst I observed 1 little flicker during the benchmark, I'm still chuffed by the score. I pumped up the mV to +75 in an attempt to stabilize 1175, but my score went down to 13638.

I attempted 1180/545 +30mV but it crashed 5 seconds in after several artifacts. Increasing the mV in increments of 10 didn't help stabilize 1180 or even get me through the Benchmark.

I'd love to have some other people test in a similar manner to see if lower mV actually correlates to better performance. I'm sure some of you probably already knew this, but it wasn't known to me, and I'm sure there are others that don't know either.


----------



## 98uk

Quote:


> Originally Posted by *tysonischarles*
> 
> I use Sapphire Trixxx for my Overclocks.
> 
> I do have some interesting test results! I don't have any screenshots of this though as my internet was down and I didn't save the Firestrike results, I did however write them down on a piece of cardboard.
> 
> So at stock I got 12,835. I did an increase with the HBM frequency to 545 but nothing to the voltage or the GPU, I scored 12970, so nothing major from memory alone. From my previous testing I knew that 1140 on the GPU was my stable on stock voltage.
> 
> So I started at +30mv to see how far I could creep up, having normally just maxed the mV setting thinking it would be best.
> At 1160/545 +30mV I managed to score 13660. At 1170/545 +30mV I hit 13749. I think that's my highest stable score so far. I then for the sake of curiosity, maxed out the mV to +75, thinking it would stabilize it, when in fact it lowered my result to 13642. So more mV isn't actually better for performance's sake. I then pushed to 1175/545 +30mV and managed to score 13786. My highest score to date! Whilst I observed 1 little flicker during the benchmark, I'm still chuffed by the score. I pumped up the mV to +75 in an attempt to stabilize 1175, but my score went down to 13638.
> 
> I attempted 1180/545 +30mV but it crashed 5 seconds in after several artifacts. Increasing the mV in increments of 10 didn't help stabilize 1180 or even get me through the Benchmark.
> 
> I'd love to have some other people test in a similar matter to see if lower mV actually correlates to better performance. I'm sure some of you probably already knew this, but it wasn't known to me and I'm sure there are others that don't know either.


A lot of the numbers you refer to are within a margin of error and probably don't relate to any change in performance.

E.g. 13786 to 13638


----------



## tysonischarles

Well, I have tangible proof now; my last three runs @ 1170/545 +30mV:

13825
13848
13832

http://www.3dmark.com/fs/7509973
http://www.3dmark.com/fs/7510031
http://www.3dmark.com/fs/7510076
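A quick way to see whether two scores differ by more than run-to-run noise is to look at the spread of repeated runs. Using the three scores above:

```python
# Run-to-run spread of the three 1170/545 +30mV Fire Strike runs.
runs = [13825, 13848, 13832]
mean = sum(runs) / len(runs)
spread_pct = (max(runs) - min(runs)) / mean * 100
print(f"{spread_pct:.2f}%")  # 0.17% spread across runs
```

By that measure the roughly 1% gap between the earlier +30mV and +75mV scores looks larger than pure noise, though more runs at each setting would make that firmer.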


----------



## 98uk

Quote:


> Originally Posted by *tysonischarles*
> 
> Well I have tangible proof now, my last three runs @ 1170/545 +30mv
> 
> 13825
> 13848
> 13832
> 
> http://www.3dmark.com/fs/7509973
> http://www.3dmark.com/fs/7510031
> http://www.3dmark.com/fs/7510076


Nice, this is my Fury at 1100/545 for your reference:

http://www.3dmark.com/fs/7489971


----------



## tysonischarles

Quote:


> Originally Posted by *98uk*
> 
> Nice, this is my Fury at 1100/545 for your reference:
> 
> http://www.3dmark.com/fs/7489971


Thanks man









Found this:
http://www.3dmark.com/fs/6584305

Currently installing the older drivers to see if I can hit the 14k group, given this guy's clock speed (1,115 MHz) is slower than mine hahah.


----------



## HexagonRabbit

Quote:


> Originally Posted by *Radox-0*
> 
> Nice, congrats. Having been fussy, I went through the Fury X, a pair of Fury's and a pair of Nano's before finally settling on keeping a Nano for my HTPC. My thoughts on it.
> 
> In terms of performance its a great card. At stock Nano speeds (1000 Mhz / 500 Memory) it out performed both my Fury cards which were the OC versions of the sapphire Tri-X (1040 Mhz) thanks to having the full Fiji core vs a stripped down but faster core. But it could not sustain that as the stock cooler on the Nano comes with a fan profile that favours silence, heavily, so as a result you will see throttling of the card landing it in the 900-950 Mhz range for daily use. To negate this you should up the fan profile and it will solve this issue.
> 
> In terms of coil whine, I have found its common to all the Fiji cards rather then the Nano specifically. My Fury X had it, my Fury had it and my Nano had it. Only one of my Fury samples and my current Nano seem to produce a reasonable amount which I would class as acceptable and once you enable v-sync on say a 60 fps monitor its fine. The other cards however had a terrible amount so pot luck on that front I expect.
> 
> In terms of overclocking, both my Nano samples on the air cooler could hit and hold 1050MHz for daily use with the fan ramped up. Memory on one card was totally unstable, while the second card could hit the next discrete memory overclock of 545 MHz, but not reach the next step up of 600 MHz. At 1050MHz it was at base Fury X performance, and traded blows with my Fury cards that could overclock much higher, once again thanks to the full Fiji core vs stripped down but faster. For synthetic benchmarks both cards could reach the 1085-1090 MHz mark okay on the stock cooler, but the fans were ramped up hard to maintain that.
> 
> Under water its all changed and the Nano has become beastly
> 
> 
> 
> 
> 
> 
> 
> My current sample can hit 545 on the memory, which has not changed as 600 MHz seems too far, but in synthetics it can hit 1140 Mhz, and for daily use it settles in at 1125 Mhz on the core, pretty much putting it at decently overclocked Fury X levels, which is nice. At this point, however, cooling is not the issue; rather, I expect the limited power draw and cut-down power phases are what keep it from reaching the high 1150-1200 Mhz mark some Fury X's / Fury's are capable of.


Thank you for this. With the price drop the Nano is a great choice. I'm ordering mine today.


----------



## tysonischarles

Not trying to turn you away from the Nano, I mean, I'm sure it's great, but why have a Nano when you can have a Fury X possessed by the Devil himself!

Pic related


----------



## Radox-0

Price, I imagine. At least here in the UK, and most places it seems, the Nano has had quite a price cut and is considerably cheaper than the Fury X, with performance not too far behind once you notch up the fans to lower the thermal throttling.

Besides, a Nano under water can reach those sorts of speeds as well.







Maybe not the 666 bit in your score, however.








Quote:


> Originally Posted by *HexagonRabbit*
> 
> Thank you for this. With the price drop the Nano is a great choice. I'm ordering mine today.


Np. Enjoy the card


----------



## ebinkerd

Isn't the graphics score the number that matters? Mine's 16691: http://www.3dmark.com/3dm/10696867. This is undervolted too.

My highest graphics score was 16722.


----------



## Radox-0

Quote:


> Originally Posted by *ebinkerd*
> 
> Isn't the graphics score the number that matters? Mines 16691, http://www.3dmark.com/3dm/10696867. This is undervolted too.
> 
> My highest graphics score was 16722.


If you're comparing GPUs then it's a better measure than the combined score, which also accounts for the CPU's physics score.

On that note, I am getting bizarre readings on my card. The Nano at 1114 MHz on the core and 508 (500, basically) seems to score just a notch under 17,000 (16,934) on the GPU score. Pushing it to 1130 and 545 actually seemed to decrease the score while trying to break a 17,000 GPU score. Odd, as it seems to be the case for all tests above 1130-1150 on my card.









http://www.3dmark.com/compare/fs/7317488/fs/7128062#


----------



## ebinkerd

Quote:


> Originally Posted by *Radox-0*
> 
> If your comparing GPU's then its a better measure then the combined score which accounts for the CPU's Physics score.
> 
> On that note, I am getting bizarre readings on my card. The nano at 1114 Mhz on the core and 508 (500 basically) seems to score just a notch under 17,000 (16,934) on the GPU score. Pushing it to 1130 and 545 actually seemed to decrease the score while trying to break 17,000 GPU score. Odd as it seems to be the case for all tests above 1130-1150 on my card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/compare/fs/7317488/fs/7128062#


That's a touch higher than when I had my Fury X clocked to 1100/545 @ +12mV. Seriously though, try undervolting it. My undervolting scores are better than my overvolting scores. I have no idea why. I have the Firestrike scores to back it up.

This is my highest over volt @+12mv, http://www.3dmark.com/fs/7337344
This is my highest under volt @-18mv, http://www.3dmark.com/fs/7506002


----------



## tysonischarles

Quote:


> Originally Posted by *ebinkerd*
> 
> Thats a touch higher than when I had my Fury X clocked to 1100/545 @+12mv. Seriously though, try undervolting it. My undervolting scores are better than my over volt. I have no idea why. I have the firestrike scores to back it up.
> 
> This is my highest over volt @+12mv, http://www.3dmark.com/fs/7337344
> This is my highest under volt @-18mv, http://www.3dmark.com/fs/7506002


This^^

I got better scores with 1170/545 +30mV than I did with 1170/545 +75mV.

And also, if you are going to touch your memory, you should set it straight to 545 as that's the increment it will bump up to.


----------



## HexagonRabbit

Quote:


> Originally Posted by *tysonischarles*
> 
> Not try to turn you away from the Nano, I mean, I'm sure it's great, but why have a Nano when you can have a Fury X possessed by the Devil himself!
> 
> Pic related


Money. I'm tapped out. I spent close to $700 on new watercooling stuff, and part of that was a waterblock for the Nano. Because I'm using a Core P5, whatever card I get goes into the loop. A Fury X w/ a block puts it over $750.


----------



## tysonischarles

Quote:


> Originally Posted by *HexagonRabbit*
> 
> Money. I'm tapped out. I spent close to 700 on new watercooling stuff and part of that was a waterblock for the nano. Because I'm using a core P5, whatever card i get goes into the loop. A fury x w/ a card puts it over $750


It's cool man, I just needed a version to post my 666 benchmark hahaha


----------



## Semel

Quote:


> HBM has set clock speeds. 500/545/600/666.


Where did you lot get that?

So all monitoring tools misreport HBM frequency if set to anything else?


----------



## Radox-0

Quote:


> Originally Posted by *Semel*
> 
> Where did you lot get that?
> 
> So all monitoring tools misreport HBM frequency if set to anything else?


I provided the information. I frequent another forum, and the information was posted by myself based on an AMD representative's comments. Here was my initial post:

*Checking the rules, seems fine. Got the info from here: https://forums.overclockers.co.uk/showthread.php?t=18678073&page=233

Page 233 post # 6977 by AMD Representative*

I actually followed up with a question about that and asked if monitoring tools are essentially providing false readings, and they confirmed that to be the case: Page 234 post # 6996
https://forums.overclockers.co.uk/showthread.php?t=18678073&page=234


----------



## fjordiales

Anyone here with CrossFire Fury (X) having issues being unable to monitor/OC the memory of the 2nd card? I'm using the latest Crimson drivers & MSI AB 4.2. I have a funny feeling I'm missing something.

GPU-Z is showing ULPS active, but I already disabled it and restarted.


----------



## josephimports

Quote:


> Originally Posted by *fjordiales*
> 
> Anyone here with xfire fury(x) having issues not able to monitor/OC the memory of the 2nd card? I'm using the latest crimson drivers & MSI AB 4.2. I have a funny feeling I'm missing something.
> 
> Gpuz is showing ulps active but I already disabled and restarted.


To disable ULPS, use regedit. Find all instances of EnableULPS and set them to zero, then restart.

To enable memory OC on the 2nd card using AB, you will have to disable or remove the 1st card, then go into settings in AB and enable the extended OC limits option. For some reason, AB does not apply this setting to both cards simultaneously.

Another option is to use Trixx, as it OCs memory on both cards just fine.


----------



## Maximization

Quote:


> Originally Posted by *fjordiales*
> 
> Anyone here with xfire fury(x) having issues not able to monitor/OC the memory of the 2nd card? I'm using the latest crimson drivers & MSI AB 4.2. I have a funny feeling I'm missing something.
> 
> Gpuz is showing ulps active but I already disabled and restarted.


Yeah, I can't overclock memory at all; Trixx crashes on my system, so I can only use AB.


----------



## bluezone

QUOTE "Just Delivered: Accell DisplayPort 1.2 to HDMI 2.0 Active Adapter"

http://www.pcper.com/news/General-Tech/Just-Delivered-Accell-DisplayPort-12-HDMI-20-Active-Adapter

I thought this may have been of interest to some Fury owners.


----------



## Arizonian

Quote:


> Originally Posted by *bluezone*
> 
> Just Delivered: Accell DisplayPort 1.2 to HDMI 2.0 Active Adapter
> 
> http://www.pcper.com/news/General-Tech/Just-Delivered-Accell-DisplayPort-12-HDMI-20-Active-Adapter


Question for anyone who might know,

I'm ordering a 60 Hz 4K Async panel tomorrow that comes with HDMI 2.0, from what I understand I wouldn't need anything other than regular HDMI cable or would I need this adapter too?


----------



## bluezone

Quote:


> Originally Posted by *Arizonian*
> 
> Question for anyone who might know,
> 
> I'm ordering a 60 Hz 4K Async panel tomorrow that comes with HDMI 2.0, from what I understand I wouldn't need anything other than regular HDMI cable or would I need this adapter too?


From what I've read so far, you would need the adapter, or a DisplayPort 1.2 capable TV, to get 60 Hz.

Still waiting on my Nano. Damn you, AMD.


----------



## Arizonian

Quote:


> Originally Posted by *bluezone*
> 
> From what I've read so far, you would need the adapter or Display port 1.2 capable TV to have 60 Hz.
> 
> Still waiting on my Nano. dam you AMD.


Thanks. Boy, that would stink to have the *XG2700* arrive and not have the right cable. It comes with a mini-DP to DP 1.2a cable, so I think I'm good. Will do some more research.


----------



## provost

http://s1364.photobucket.com/user/provostelite/media/Mobile Uploads/image_zps7oodeoa4.jpeg.html

I know these may not be able to run the new Hitman in DX 12 well,

http://s1364.photobucket.com/user/provostelite/media/Mobile Uploads/image_zpspgr1eloh.jpeg.html

But, will this work? Lol

Now what to do with my Titans? Maybe I will set 'em alight and tweet a pic to Jen-Hsun to let him know what I think of his software stack "optimization" strategy.


----------



## Arizonian

Congrats.









I think you and I may have moved over to AMD for close to the same reasons, but I digress. I'm sure you can do better than I did overclocking it. Enjoy experimenting and keep us posted on your thoughts.

I also think I'm getting this HDMI 2.0 cable to play it safe. I am using DP currently. Funny, but before my Fury I didn't even think to entertain 4K. I'm excited.

http://www.newegg.com/Product/Product.aspx?Item=N82E16812117611&cm_re=HDMI_2.0_cable-_-12-117-611-_-Product

PS: if anyone has suggestions on a really good HDMI 2.0 cable, please post.


----------



## dagget3450

Quote:


> Originally Posted by *tysonischarles*
> 
> Not try to turn you away from the Nano, I mean, I'm sure it's great, but why have a Nano when you can have a Fury X possessed by the Devil himself!
> 
> Pic related


If you have the Steam version of Firestrike with that score, you get the achievement "You Devil" for the 666 in your score







- if it will validate


----------



## tysonischarles

Quote:


> Originally Posted by *dagget3450*
> 
> If you have steam version of firestrike with that score you get the achievement :"You Devil" for the 666 in your score
> 
> 
> 
> 
> 
> 
> 
> - if it will validate


It is the Steam version :O except the drivers aren't official for that bench, so it won't validate


----------



## Fyrwulf

Quote:


> Originally Posted by *provost*
> 
> Now what to do with my Titans? May be I will set em alight and tweet a pic to Jen-Hsun to let him know what I think of his software stack "optimization" strategy.


Nah, there are always silly nVidia fanboys to take advantage of. Just sell them off.


----------



## SuperZan

Quote:


> Originally Posted by *Fyrwulf*
> 
> Nah, there are always silly nVidia fanboys to take advantage of. Just sell them off.


 My old 770's can confirm.


----------



## provost

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think you and I may have moved over to AMD for close to the same reasons, but I digress. I'm sure you can do better than I at overclocking it. Enjoy experimenting and keep us posted on your thoughts.
> 
> I also think I'm getting this HDMI 2.0 cable to play it safe; I'm using DP currently. Funny, but before my Fury I didn't even entertain 4K. I'm excited.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16812117611&cm_re=HDMI_2.0_cable-_-12-117-611-_-Product
> 
> PS: if anyone has suggestions on a really good HDMI 2.0 cable, please post.


Thanks Arizonian. I didn't realize that you had moved over too. I know you weren't thrilled about NV's "disclosure" faux pas, to put it politely, related to the 970s. And I have been explicitly or implicitly threatening to dump Nvidia for a while... Lol, as I just don't like the direction Nvidia is headed in, both practically and philosophically speaking. From a consumer standpoint, Nvidia has been taking its install base for granted for far too long... Lol. Anyway, I digress; I think you and I have been on a similar upgrade path since the 690 days







.
You are a conscientious fella, and you always do thorough research before making a purchase (heck, I think I just piggybacked off your research when I bought one of my monitors... lol, I believe it was the Dell monitor... lol)
Me, on the other hand, I am programmed to shoot first and ask questions later (figuratively speaking, of course)... lol. Call it a necessary job hazard







.

In any event, looking forward to having fun with this card until Polaris is released.


----------



## Arizonian

@provost Here are my OC results if you'd like to compare.

*My Nitro Fury Benchmarks Post*

If you're benching, I'd just go straight to 1175 on core with +96 mV. Gaming without voltage: 1100 with just +50% power limit is all you need. I'm going to wager your Nitro can do the same no problem


----------



## keikei

Hey guys,

I ordered a new Fury and it's coming in very soon. I hope to join the club and possibly try OCing a little. Quick question: what is the performance difference between a normal Fury and the Nitro version? I was set on getting a Nitro, then Newegg decided to get greedy and raised the price. I went with the regular Fury from SuperBiiz. I can't wait!


----------



## JunkaDK

Quote:


> Originally Posted by *Arizonian*
> 
> @provost Here are my OC results if you'd like to compare.
> 
> *My Nitro Fury Benchmarks Post*
> 
> If you're benching, I'd just go straight to 1175 on core with +96 mV. Gaming without voltage: 1100 with just +50% power limit is all you need. I'm going to wager your Nitro can do the same no problem


Very nice benchmarking thread. I'm curious: when you benched at 1160 and 1175, did you try running Valley over an extended period? My card crashes after 5-10 mins even with fans at 100% and the core around 60 when it crashes, so I suspect the VRM is overheating. I ordered new thermal pads for the VRM and Gelid Extreme for the GPU. Gonna be interesting to test with that









(Asus R9 Fury STRIX)


----------



## Jflisk

Guys fight for the red .

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638

Thanks


----------



## Arizonian

Quote:


> Originally Posted by *provost*
> 
> Thanks Arizonian. I didn't realize that you had moved over too. I know you weren't thrilled about NV's "disclosure" faux pas, to put it politely, related to the 970s. And I have been explicitly or implicitly threatening to dump Nvidia for a while... Lol, as I just don't like the direction Nvidia is headed in, both practically and philosophically speaking. From a consumer standpoint, Nvidia has been taking its install base for granted for far too long... Lol. Anyway, I digress; I think you and I have been on a similar upgrade path since the 690 days
> 
> 
> 
> 
> 
> 
> 
> .
> You are a conscientious fella, and you always do thorough research before making a purchase (heck, I think I just piggybacked off your research when I bought one of my monitors... lol, I believe it was the Dell monitor... lol)
> Me, on the other hand, I am programmed to shoot first and ask questions later (figuratively speaking, of course)... lol. Call it a necessary job hazard
> 
> 
> 
> 
> 
> 
> 
> .
> 
> In any event, looking forward to having fun with this card until Polaris is released.


Now that you brought that up: ironically, I was originally looking at a 144Hz IPS G-Sync monitor, so I bought two 970's in preparation. Rather than SLI 780 Ti's with only 3GB VRAM, I felt I'd go with an extra gig for a little longevity. I would never have purchased them for just 512 MB more VRAM than my 780 Ti at 1440. Luckily I have two 1080p monitors on other rigs, so I split them up and put them to good use. I do not plan on getting rid of the 970s on the other rigs I maintain anytime soon. They are a great value at 1080p, and by the time I'm done with them there, probably one of the best. Needless to say, that changed everything as far as the main rig.

Other reasons like GameWorks, the green initiative, GeForced experience, and non-acceptance of the VESA Async standard give me pause. A company becomes great making awesome graphics cards but can lose sight when it puts its stockholders in front of its end users. Either way, as a consumer, at the end of the day it doesn't bother me; all I have to do is use my dollars elsewhere. Luckily we still have choice. Yet I can only say this in an AMD club setting with less fear of disagreement.









Having said that, I also feel AMD's restructuring is poised well, with their architecture ready for upcoming DX12, and we are already seeing games come out supporting it. I feel DX12 will be the norm much sooner than anticipated. On a single GPU I have no issues gaming, which is all that matters. Good luck on yours.

Ordered the XG2700 this morning; will be here Wednesday. First 4K Async with HDMI 2.0 to support full UHD at 60 Hz.








Quote:


> Originally Posted by *keikei*
> 
> Hey guys,
> 
> I ordered a new Fury and it's coming in very soon. I hope to join the club and possibly try OCing a little. Quick question: what is the performance difference between a normal Fury and the Nitro version? I was set on getting a Nitro, then Newegg decided to get greedy and raised the price. I went with the regular Fury from SuperBiiz. I can't wait!


Congrats









I've been watching the prices since November, and quite often a Fury comes on sale for around $499: first the Tri-X, then the XFX DD at $499.99, then the Nitro OC at $479.99. Prices go back up for maybe a couple weeks, then it or something else comes on sale again in that range. Shopping around like you did is the way to do it. $499.99 is the sweet spot.
Quote:


> Originally Posted by *JunkaDK*
> 
> Very nice benchmarking thread. I'm curious: when you benched at 1160 and 1175, did you try running Valley over an extended period? My card crashes after 5-10 mins even with fans at 100% and the core around 60 when it crashes, so I suspect the VRM is overheating. I ordered new thermal pads for the VRM and Gelid Extreme for the GPU. Gonna be interesting to test with that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Asus R9 Fury STRIX)


No, I did not run it for a lengthy period of time. I let it go through one complete pass to load textures and then I run my benchmark. I will usually do this twice and take the best result. Does a gaming marathon on SWBF for 5 hours clocked at 1160 core on Saturday count?

An HBM memory OC of 50 MHz gave me a 10C increase in temp and 5 FPS in SOM, so I keep memory *cough* at stock. Unless something changes, poor OC results will be no different with HBM2. Not the end of the world; the memory bandwidth at stock is not shabby at all.
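For anyone curious why stock bandwidth is "not shabby at all", here's the back-of-the-envelope math (a rough sketch using Fiji's published 4096-bit bus and 500 MHz memory clock; double-check against your own card's specs):

```python
# Rough HBM1 bandwidth math for Fiji (4096-bit bus, 500 MHz, double data rate).
def hbm_bandwidth_gbps(mem_clock_mhz, bus_width_bits=4096):
    # DDR: two transfers per clock; /8 converts bits to bytes,
    # /1000 converts MB/s to GB/s.
    return bus_width_bits * 2 * mem_clock_mhz / 8 / 1000

print(hbm_bandwidth_gbps(500))  # stock: 512.0 GB/s
print(hbm_bandwidth_gbps(550))  # +50 MHz OC: 563.2 GB/s
```

So even at stock you're sitting on 512 GB/s, which is why a 50 MHz bump buys so little.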

@jamaican voodoo congrats on your crossfires.


----------



## provost

Quote:


> Originally Posted by *Arizonian*
> 
> @provost Here are my OC results if you'd like to compare.
> 
> *My Nitro Fury Benchmarks Post*
> 
> If you're benching, I'd just go straight to 1175 on core with +96 mV. Gaming without voltage: 1100 with just +50% power limit is all you need. I'm going to wager your Nitro can do the same no problem


Thanks! This would give me a running start, as I am fairly green (or shall I say not red enough yet... Lol) when it comes to AMD cards. Much appreciated.









Quote:


> Originally Posted by *Arizonian*
> 
> Now that you brought that up: ironically, I was originally looking at a 144Hz IPS G-Sync monitor, so I bought two 970's in preparation. Rather than SLI 780 Ti's with only 3GB VRAM, I felt I'd go with an extra gig for a little longevity. I would never have purchased them for just 512 MB more VRAM than my 780 Ti at 1440. Luckily I have two 1080p monitors on other rigs, so I split them up and put them to good use. I do not plan on getting rid of the 970s on the other rigs I maintain anytime soon. They are a great value at 1080p, and by the time I'm done with them there, probably one of the best. Needless to say, that changed everything as far as the main rig.
> 
> Other reasons like GameWorks, the green initiative, GeForced experience, and non-acceptance of the VESA Async standard give me pause. A company becomes great making awesome graphics cards but can lose sight when it puts its stockholders in front of its end users. Either way, as a consumer, at the end of the day it doesn't bother me; all I have to do is use my dollars elsewhere. Luckily we still have choice. Yet I can only say this in an AMD club setting with less fear of disagreement.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Having said that, I also feel AMD's restructuring is poised well, with their architecture ready for upcoming DX12, and we are already seeing games come out supporting it. I feel DX12 will be the norm much sooner than anticipated. On a single GPU I have no issues gaming, which is all that matters. Good luck on yours.
> 
> Ordered the XG2700 this morning; will be here Wednesday. First 4K Async with HDMI 2.0 to support full UHD at 60 Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats
> 
> 
> 
> 
> 
> 
> 
> .


Well, congrats on your new monitor purchase too! I have been lukewarm on the whole VRR monitor thing due to not wanting to lock myself into an ecosystem, but now that I have decided which way my future purchases will be directed, I may also pick up async (which is vendor agnostic, well, for AMD and Intel anyway, right?... Lol), and it doesn't hurt that I wouldn't have to pay a relatively hefty premium for the privilege of locking myself into an ecosystem... Lol

I can probably check some of the same boxes, and then some, for my reasons to move to AMD after being on the Nvidia platform for five-plus years and having owned pretty much every high-end NV card until Maxwell... Lol. But since you mentioned shareholders, etc., I don't believe that Nvidia's current strategy is even good for them. That's a discussion for another place and time, though, and maybe it will make for a good case study at some of the B-schools one day, starting with Stanford..


----------



## stargate125645

I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?


----------



## keikei

Quote:


> Originally Posted by *stargate125645*
> 
> I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?


Sapphire Nitro.


----------



## NBrock

Quote:


> Originally Posted by *stargate125645*
> 
> I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?


I would get the Nano personally if you don't want the Fury X. With the Nano you still get the full count of stream processors and many are able to hit or exceed Fury X speeds. Also I believe the Fury Nano is still cheaper than the Fury (non X).


----------



## tysonischarles

In regards to your +mV: you honestly don't need to go over +30 mV. From my overclocks, 30 is the best, and going higher actually reduces your benchmark scores, so I honestly would suggest you don't just max mV and ramp it up.

Start at stock with 545 on the memory, as that's a fixed step on the memory; anything below will round up or down to that number (or 500), and anything above will round down (or up; the next step is 600, which is extremely unstable from my testing).

Then bump the GPU up as high as you can on stock mV, and when you can't go any higher, bump up your mV in increments of 10, pushing as high as you can on each +10 before adding another +10.

This gave me my best results, ending up at a stable 1170/545 @ +30mV. Maxing out the mV with 1170/545 actually reduced my numbers.
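The fixed-step rounding described above can be sketched as a toy snippet (the 500/545/600 MHz steps come from this post's testing; actual driver rounding behavior may differ):

```python
# Toy sketch: requested HBM clocks snap to the nearest fixed step.
# Steps taken from the post above (500/545/600 MHz); not an official list.
HBM_STEPS = [500, 545, 600]

def snap_to_step(requested_mhz, steps=HBM_STEPS):
    # Pick the step with the smallest distance to the requested clock.
    return min(steps, key=lambda s: abs(s - requested_mhz))

print(snap_to_step(520))  # -> 500 (rounds down)
print(snap_to_step(560))  # -> 545 (rounds down)
print(snap_to_step(580))  # -> 600 (rounds up; unstable per the post)
```

In other words, dialing in 560 buys you nothing over 545, and anything that snaps to 600 is likely to crash.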


----------



## stargate125645

Quote:


> Originally Posted by *keikei*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?
> 
> 
> 
> Sapphire Nitro.

Really? An R9 390 will max out most games at 2560x1440? I've been out of the game too long!


----------



## tysonischarles

Quote:


> Originally Posted by *stargate125645*
> 
> I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?


As for this: unless you're set on AMD, I would suggest a 980/980 Ti, on the basis that my card never hits over 35 degrees yet I can barely squeeze a 10% increase from overclocking. Not only that, but my coil whine is like a violin; it's very audible from within my case.

If you are set on AMD, then the Nano. On a waterblock you can get almost Fury X results, and also very respectable results with an altered fan profile on air. But the same issues as above would most likely apply


----------



## NBrock

I am pretty lucky I guess. I have no pump noise or coil whine on my Fury X. The only time I get coil whine is when Valley Benchmark is closing.


----------



## stargate125645

Quote:


> Originally Posted by *tysonischarles*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> I am interested in getting a Fury (not the X) myself for a new gaming rig (5930K, Sabertooth X99 motherboard, 4x4GB DDR4), but am debating whether I should just wait until next round and buy two of the new ones for 21:9 gaming. Which of the Fiji cards should I get if I decide to buy a holdover until the next generation? I'm leaning towards an R9 Fury from XFX, and would use it as a backup card once the next generation comes out. Note that I will not be buying a 21:9 monitor for a little while; for now I will be using my 2560x1440 monitor for games at least until the next generation of AMD GPUs comes out, so could I get away with an even smaller GPU upgrade until then and be able to max out pretty much every game?
> 
> 
> 
> As for this: unless you're set on AMD, I would suggest a 980/980 Ti, on the basis that my card never hits over 35 degrees yet I can barely squeeze a 10% increase from overclocking. Not only that, but my coil whine is like a violin; it's very audible from within my case.
> 
> If you are set on AMD, then the Nano. On a waterblock you can get almost Fury X results, and also very respectable results with an altered fan profile on air. But the same issues as above would most likely apply

I'm not keen on messing with watercooling unless it's built-in. Is the nano worth the premium over a 390 that the other person suggested? A 980Ti would defeat the purpose of saving money now for later purchases given it costs in excess of $600.


----------



## NBrock

Honestly all the games I ran on my 290x @ 2560x1440 ran just fine. I thought you were debating on a Fury series card. If that doesn't matter to you a 290x, 390, or 390x would be fine. Do keep in mind I overclocked it a good bit over stock.

Played BF4, Diablo 3, Skyrim, TF2, Civ 5, Star Wars Battle Front, CS GO, Modded Skyrim...


----------



## stargate125645

Quote:


> Originally Posted by *NBrock*
> 
> Honestly all the games I ran on my 290x @ 2560x1440 ran just fine. I thought you were debating on a Fury series card. If that doesn't matter to you a 290x, 390, or 390x would be fine. Do keep in mind I overclocked it a good bit over stock.
> 
> Played BF4, Diablo 3, Skyrim, TF2, Civ 5, Star Wars Battle Front, CS GO, Modded Skyrim...


I was thinking I had to get a Fury in order to max out games at 2560x1440. Apparently I don't, and an R9 390 works, which means I should save money now since the card is only going to be a holdover anyway.


----------



## NBrock

Quote:


> Originally Posted by *stargate125645*
> 
> I was thinking I had to get a Fury in order to max out games at 2560x1440. Apparently I don't, which means I should save money now since the card is going to be a holdover anyway.


I wouldn't even look at new cards if you don't care. I have seen the occasional 290/290x for good prices on the forums and eBay, some going for 180-230 USD depending on 290 or 290x and whether it's a reference design.


----------



## tysonischarles

Quote:


> Originally Posted by *stargate125645*
> 
> I'm not keen on messing with watercooling unless it's built-in. Is the nano worth the premium over a 390 that the other person suggested? A 980Ti would defeat the purpose of saving money now for later purchases given it costs in excess of $600.


A lot of users would vouch for the Nano being the best pick of all the Fiji cards. But I would still honestly recommend team green: pay a little more, get a phenomenal amount more overclocking headroom, and don't have a violin play every time you game. From what I saw briefly on Newegg, the difference between a Nano and a 980 Ti is $109 USD. To me, the better choice is obvious.


----------



## stargate125645

Quote:


> Originally Posted by *tysonischarles*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> I'm not keen on messing with watercooling unless it's built-in. Is the nano worth the premium over a 390 that the other person suggested? A 980Ti would defeat the purpose of saving money now for later purchases given it costs in excess of $600.
> 
> 
> 
> A lot of users would vouch for the Nano being the best pick of all the Fiji cards. But I would still honestly recommend team green: pay a little more, get a phenomenal amount more overclocking headroom, and don't have a violin play every time you game. From what I saw briefly on Newegg, the difference between a Nano and a 980 Ti is $109 USD. To me, the better choice is obvious.

If this weren't a holdover card, absolutely. But if even an R9 390 handles my intermediate needs as others have suggested, it would be dumb to buy a 980, let alone a Fury, when the $100+ could go to next generation dual cards.


----------



## tysonischarles

Quote:


> Originally Posted by *stargate125645*
> 
> If this weren't a holdover card, absolutely. But if even an R9 390 handles my intermediate needs as others have suggested, it would be dumb to buy a 980, let alone a Fury, when the $100+ could go to next generation dual cards.


Well in that case, avoid a new card and buy a second-hand card that will tide you over for the time being


----------



## provost

Quote:


> Originally Posted by *Arizonian*
> 
> Ordered the XG2700 this morning; will be here Wednesday. First 4K Async with HDMI 2.0 to support full UHD at 60 Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats


Ok, now that I have settled on my video card vendor going forward (and what a relief; actually this is one consumer discretionary purchase that makes me feel really good, instead of just feeling good for the sake of acquiring and owning something... lol), I am trying to decipher the whole HDMI 2.0, DisplayPort 1.3, etc. landscape for async/freesync VRR monitors. I tried to quickly google it, but it seems like it's gonna take a bit more brain damage to pick the right display to go with this card, and more importantly with my Polaris build, whenever big Polaris is released. Do you mind me asking how you narrowed your choice down to this particular model? I know that you do thorough research on your purchases, so I figured I would start my process by soliciting your input. Tks.


----------



## provost

Quote:


> Originally Posted by *Jflisk*
> 
> Guys fight for the red .
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638
> 
> Thanks


Well, I only fight for ME ...lol









But benching can be fun. I still have my phase cooler, but it is a lot of work to prep everything if you don't bench like some of the other pros here... lol
Has anyone tried freezing the Fury/Fury X, and does anyone know if it scales at all with cold?


----------



## fjordiales

Quote:


> Originally Posted by *josephimports*
> 
> To disable ULPS, use regedit. Find all instances of EnableULPS and set to zero then restart.
> 
> To enable memory OC on 2nd card using AB, you will have to disable or remove the 1st card. Then go into settings in AB and enable the extended oc limits option. For some reason, AB does not apply this setting to both cards simultaneously .
> 
> Another option is to use Trixx, as it OC's memory on both cards just fine.


Thanks for the Trixx tip. It now works.

Quote:


> Originally Posted by *Maximization*
> 
> yeah I can't overclock memory at all, TRIXX crashes on my system can only use AB.


I might go back to AB if Trixx acts up on me, but will try the other tips from the quote above.


----------



## tysonischarles

Quote:


> Originally Posted by *provost*
> 
> Well, I only fight for ME ...lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But benching can be fun. I still have my phase cooler, but it is a lot of work to prep everything if you don't bench like some of the other pros here... lol
> Has anyone tried freezing the Fury/Fury X, and does anyone know if it scales at all with cold?


Well, my card doesn't go over ~35 degrees and I can't get higher than 1170/550, so I wouldn't think temperature is the restricting factor on these cards


----------



## Arizonian

Quote:


> Originally Posted by *provost*
> 
> Ok, now that I have settled on my video card vendor going forward (and what a relief; actually this is one consumer discretionary purchase that makes me feel really good, instead of just feeling good for the sake of acquiring and owning something... lol), I am trying to decipher the whole HDMI 2.0, DisplayPort 1.3, etc. landscape for async/freesync VRR monitors. I tried to quickly google it, but it seems like it's gonna take a bit more brain damage to pick the right display to go with this card, and more importantly with my Polaris build, whenever big Polaris is released. Do you mind me asking how you narrowed your choice down to this particular model? I know that you do thorough research on your purchases, so I figured I would start my process by soliciting your input. Tks.


Ok so best to explain it,
Quote:


> "HDMI 2.0 increases the maximum TMDS per-channel throughput from 3.4 Gbit/s to 6 Gbit/s, which allows for a maximum total TMDS throughput of 18 Gbit/s. This allows HDMI 2.0 to carry 4K resolution at 60 frames per second (fps)"


In short, prior to this you'd actually be playing 4K at 30 FPS max, because HDMI 1.4a only supports 4K UHD up to 30 Hz.









HDMI 2.0 supports 4K UHD up to a 60 Hz refresh rate, so you get 60 FPS gaming now.
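A quick sanity check on those numbers (a rough sketch; the 4400x2250 total timing including blanking is the standard CTA-861 4K60 mode, and the 10/8 factor is TMDS 8b/10b coding overhead):

```python
# Why 4K60 needs HDMI 2.0: the total 4K60 timing including blanking intervals
# is 4400 x 2250 pixels (CTA-861), giving a 594 MHz pixel clock.
pixel_clock = 4400 * 2250 * 60            # 594,000,000 pixels/s
payload_gbit = pixel_clock * 24 / 1e9     # 24-bit RGB -> 14.256 Gbit/s
on_wire_gbit = payload_gbit * 10 / 8      # 8b/10b TMDS coding -> 17.82 Gbit/s

# 17.82 Gbit/s fits inside HDMI 2.0's 18 Gbit/s ceiling,
# but not HDMI 1.4's 10.2 Gbit/s -- hence the 30 Hz limit there.
print(on_wire_gbit)
```

So 4K60 lands just under the 18 Gbit/s ceiling, which is exactly why HDMI 1.4 tops out at 4K30.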









So a monitor equipped with HDMI 2.0, regardless of synchronization, is what you want for 4K UHD; otherwise don't buy it.

I'm speculating Async should work at least in the 30-60 FPS range, hoping even lower. It looks like, without reviews, I'm taking a leap of faith and will be an early adopter, but I'm feeling confident. As a Newegg Premier member I will use that to my advantage if needed. Will be putting it through the under-the-microscope ritual for the normal defects: bad backlight bleed, visible IPS glow when gaming or watching movies, and the normal dead-pixel hunt, which can't be unseen once seen.









Finally, the next step, if I'm not mistaken, will be HDMI 3.0, which will take 4K UHD to 120 Hz and 120 FPS. I expect none of those monitors will be in my price range, and no one GPU will rule them all for at least a few years to come. Hope that helps.

Will be posting results in the *ViewSonic XG2700 4K Async IPS Discussion Thread*

*UPDATE*: I chose it because it's the only 4K Async monitor out with HDMI 2.0 (*just released*), and ViewSonic doesn't have a bad history with panels. Nothing superb, but I'd rate it above ASUS monitor quality control and in line with or slightly better than Dell, IMO.
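For the 30-60 FPS speculation above, the way a variable-refresh window behaves can be sketched roughly like this (purely illustrative; the 40-60 Hz window here is hypothetical, not the XG2700's published range, and frame multiplication below the window needs LFC-style driver support):

```python
def panel_refresh(fps, vrr_min=40, vrr_max=60):
    """Refresh rate an adaptive-sync panel would run at for a given FPS.
    Hypothetical 40-60 Hz window; purely illustrative."""
    if fps >= vrr_max:
        return vrr_max                 # capped at the panel maximum
    if fps >= vrr_min:
        return fps                     # inside the window: refresh tracks FPS
    # Below the window: repeat each frame enough times to land back in range
    # (LFC-style frame multiplication, where the driver supports it).
    mult = 2
    while fps * mult < vrr_min:
        mult += 1
    return min(fps * mult, vrr_max)

print(panel_refresh(52))  # -> 52 (refresh tracks FPS)
print(panel_refresh(25))  # -> 50 (each frame shown twice)
```

Inside the window the panel simply follows the GPU; below it, the experience depends on whether frame multiplication is available, which is why the window's lower bound matters when picking a monitor.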


----------



## provost

Quote:


> Originally Posted by *tysonischarles*
> 
> Well, my card doesn't go over ~35 degrees and I can't get higher than 1170/550, so I wouldn't think temperature is the restricting factor on these cards


Ok, thanks. Guess I won't be busting out my phase cooler until big Polaris then... lol
Sure, it would be fun to bench against my ex-comrades who might still be on the Nvidia platform









Quote:


> Originally Posted by *Arizonian*
> 
> Ok so best to explain it,
> 
> In short, prior to this you'd be actually playing 4K at 30 FPS max because HDMI 1.4a only supports 4K UHD only to 30 Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HDMI 2.0 supports 4K UHD up to 60 Hz refresh rate and will get 60 FPS gaming now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So all monitors equipped with HDMI 2.0 regardless of synchronization is what you want on 4K UHD or don't buy it.
> 
> I'm speculating Async should work from at the least 30-60 FPS range, hoping even lower. It looks like without reviews I'm taking a leap of faith and will be early adopter but I'm feeling confident. As Newegg Premier member I will be using it to my advantage if needed. Will be putting under the microscope ritual for the normal defects like bad back light bleed, visible IPS glow when gaming or watching movies and the normal dead pixel hunt which can't be unseen once seen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally next step if I'm not mistaken will be HDMI 3.0 which will take 4K UHD to 120 Hz and 120 FPS. Expecting not one of those monitors will be in my price range and not one GPU will rule them all for at least a few years to come. Hope that helps.
> 
> Will be posting results in the *ViewSonic XG2700 4K Async IPS Discussion Thread*
> 
> *UPDATE*: I chose it because it's the only 4K Async monitor out with HDMI 2.0 (*just released*) and ViewSonic doesn't have bad history with panels. Nothing superb but I'd rate it above ASUS monitor quality control and or in line or slightly better than Dell IMO.


Thanks, yes, it is helpful







Kind of like a Coles Notes version








Will be keeping an eye on your review/discussion thread.


----------



## stargate125645

How is async different from Freesync?


----------



## Arizonian

Quote:


> Originally Posted by *stargate125645*
> 
> How is async different from Freesync?


Freesync is AMD's name for adaptive synchronization, which has been accepted as a VESA standard, and Intel is moving forward with it. Nvidia can come out and support Async, maybe by Volta if not by Pascal, if they want. Async is not proprietary like Gsync.

I may entertain another Nvidia card if they support Async some day.


----------



## stargate125645

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> How is async different from Freesync?
> 
> 
> 
> Freesync is AMD's name for adaptive synchronization, which has been accepted as a VESA standard, and Intel is moving forward with it. Nvidia can come out and support Async, maybe by Volta if not by Pascal, if they want. Async is not proprietary like Gsync.
> 
> I may entertain another Nvidia card if they support Async some day.

So you're saying Freesync and Async are the same thing and terms are used interchangeably?


----------



## bluezone

Quote:


> Originally Posted by *stargate125645*
> 
> So you're saying Freesync and Async are the same thing and terms are used interchangeably?


See this if you want to know a bit about Async.

http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing


----------



## stargate125645

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stargate125645*
> 
> So you're saying Freesync and Async are the same thing and terms are used interchangeably?
> 
> 
> 
> See this if you want to know a bit about Async.
> 
> http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing

Different type of Async?


----------



## bluezone

Quote:


> Originally Posted by *stargate125645*
> 
> Different type of Async?


He might be referring to FreeSync using part of the Adaptive-Sync standard; maybe he is shortening Adaptive Sync down to Async. But I generally use Async to refer to Asynchronous Compute or Asynchronous Shaders.


----------



## xer0h0ur

Yeah, it's not a good idea to shorten Adaptive Sync to Async, since that usually refers to Asynchronous Compute.


----------



## Radox-0

Quote:


> Originally Posted by *stargate125645*
> 
> Different type of Async?


Yep, two different things being discussed.

Basically, as Arizonian stated, AMD uses FreeSync, which is essentially their name for Adaptive-Sync, a technology that has been accepted as a VESA standard. It's great because, unlike G-Sync, it's purely a port-based technology, provided you have the right card. G-Sync, in addition to being proprietary, needs hardware in the panel itself, which leads to a price premium.

Async is typically shorthand for asynchronous compute, an entirely different subject.
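To picture what asynchronous compute buys you, here's a very loose CPU-side sketch (Python threads standing in for GPU queues, so this is an analogy, not the actual GPU mechanism): serializing two independent jobs takes roughly the sum of their times, while keeping both in flight takes roughly the max.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def graphics_work():
    time.sleep(0.05)   # stand-in for a frame's graphics-queue work
    return "frame done"

def compute_work():
    time.sleep(0.05)   # stand-in for a compute-queue job (physics, post-FX)
    return "compute done"

# Serial: one queue idles while the other runs
t0 = time.perf_counter()
results_serial = [graphics_work(), compute_work()]
serial = time.perf_counter() - t0

# "Async": both queues in flight at once
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(graphics_work), pool.submit(compute_work)]
    results_async = [f.result() for f in futures]
overlap = time.perf_counter() - t0

print(serial > overlap)  # overlapping the independent work finishes sooner
```

Again, purely an illustration of the scheduling idea; FreeSync/Adaptive-Sync has nothing to do with any of this.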


----------



## Arizonian

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yeah, it's not a good idea to shorten Adaptive Sync to Async, since that usually refers to Asynchronous Compute.


Quote:


> Originally Posted by *Radox-0*
> 
> Yep, two different things being discussed.
> 
> Basically, as Arizonian stated, AMD uses FreeSync, which is essentially their name for Adaptive-Sync, a technology that has been accepted as a VESA standard. It's great because, unlike G-Sync, it's purely a port-based technology, provided you have the right card. G-Sync, in addition to being proprietary, needs hardware in the panel itself, which leads to a price premium.
> 
> Async is typically shorthand for asynchronous compute, an entirely different subject.


Yeah, I think you're both right, and to avoid confusion I'll use the term "adaptive sync" without cutting corners moving forward, even when discussing a monitor. I changed the XG2700 thread title to reflect this as well.


----------



## bluezone

Quote:


> Originally Posted by *Arizonian*
> 
> Yeah, I think you're both right, and to avoid confusion I'll use the term "adaptive sync" without cutting corners moving forward, even when discussing a monitor. I changed the XG2700 thread title to reflect this as well.


You're welcome


----------



## buildzoid

Has anyone else here noticed that raising core voltage in software leads to lower FPS? Because after getting weird benchmark results I went and did some testing and got this:



My Fury Xs are PowerColor cards and both BIOSes on them are the same. So I'm guessing that both of my BIOSes are 250W BIOSes, but that doesn't explain why raising the power limit doesn't help in the slightest with the throttling. Also, GPU-Z doesn't pick up on any of the throttling since it seems to last less than 100ms. The GPUs were below 50C core temperature, so the throttling isn't temperature related.

I did the testing in Unigine with the 16.1.1b driver. I used both Trixx and Afterburner and both got the same results.


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> Has anyone else here noticed that raising core voltage in software leads to lower FPS? Because after getting weird benchmark results I went and did some testing and got this:
> 
> 
> 
> My Fury Xs are PowerColor cards and both BIOSes on them are the same. So I'm guessing that both of my BIOSes are 250W BIOSes, but that doesn't explain why raising the power limit doesn't help in the slightest with the throttling. Also, GPU-Z doesn't pick up on any of the throttling since it seems to last less than 100ms. The GPUs were below 50C core temperature, so the throttling isn't temperature related.
> 
> I did the testing in Unigine with the 16.1.1b driver. I used both Trixx and Afterburner and both got the same results.


I did not find that to be the case when I tried my hard volt mods on the pre-Crimson driver when I was benching. It would be interesting to see if that is reproduced if you roll back to pre-Crimson drivers, because that's when everyone started to have throttling issues.


----------



## buildzoid

I get voltage-related performance loss even on 15.7.1; I just didn't bother testing it properly.

I already have an idea for a fix if it's due to power limits: I'm going to flash the high-power Sapphire Fury Tri-X OC BIOS. If I have a power problem, that BIOS should solve it. I'm also applying the 4096-core unlock to it, so performance should be the same.

EDIT: The BIOS flash didn't help at all; the card still gets lower FPS with more voltage.


----------



## Otterfluff

So you think it is down-clocking based on a voltage max? That would certainly explain why we have so much trouble with basic overclocks.

My original testing was done with Fire Strike, but I never monitored FPS.


----------



## buildzoid

I'm working on a high-power-limit BIOS ATM. If it works I'll have a 511W power limit. I'm also trying to hack Vcore and HBM, so it might be a few hours before I get any results.


----------



## Otterfluff

I will be interested to hear how it goes.


----------



## gupsterg

@buildzoid

In regards to your PM about cracking the PowerLimit: I posted this back on 07/11/15, perhaps no one noticed it? I also have an updated screenie; IIRC I never posted it, as BIOS modding wasn't gaining support in the thread.

Near the end of 2015 I did sort of drop off the forum, but I did send a PM asking how BIOS modding was going on your end, as I thought you'd have used the screenie I posted or made some developments of your own. I did state at the time to use the Hawaii thread, since with ATOM BIOS being used there are a lot of similarities (in a way).

Developments on my side regarding BIOS modding, via Hawaii:

a) Gained insight into how PowerPlay has pointers within it to its sections. I can apply/have applied this to Fury/X (did so this morning).

b) I have cracked how VoltageObjectInfo programs the IR3567B on Hawaii, so I can change the VDDC limit, etc. I have yet to look at Fury/X due to time. I'm guessing that, since you're using hard mods, this isn't useful to you.

Another insight has occurred regarding Fury/X ROMs, but it's too early to state; members need to get involved in BIOS modding for this to move forward.

Glad to read you're getting into BIOS modding.


----------



## buildzoid

@gupsterg

The problem is that I have no idea how to get the hex to line up with your screenshots. I know that all the power and clock stuff is in the A140 to A350 range, but your screenshots start at 0.

Also I stopped watching this thread for some time. So I missed your newer screenshots. Though now that I have them I should make much faster progress.

I think I'll do a 999A 999W BIOS because 511W 420A seems to have worked. I wish I had all my usual equipment because then I could measure GPU power draw to check if the TDP hacks I've been doing actually help.


----------



## Radox-0

Quote:


> Originally Posted by *buildzoid*
> 
> Has anyone else here noticed that raising core voltage in software leads to lower FPS? Because after getting weird benchmark results I went and did some testing and got this:
> 
> 
> 
> My Fury Xs are PowerColor cards and both BIOSes on them are the same. So I'm guessing that both of my BIOSes are 250W BIOSes, but that doesn't explain why raising the power limit doesn't help in the slightest with the throttling. Also, GPU-Z doesn't pick up on any of the throttling since it seems to last less than 100ms. The GPUs were below 50C core temperature, so the throttling isn't temperature related.
> 
> I did the testing in Unigine with the 16.1.1b driver. I used both Trixx and Afterburner and both got the same results.


Nice write-up. Yeah, I noticed something similar and posted a while back thinking it was just me. Although I am on the Nano; under water it can be pushed quite high, as temps typically remain at or under 30 degrees when testing.

I find anything in excess of +18 mV will actually start to result in a drop-off in performance. I can, for example, get a stable 1120 MHz at +12 mV, which results in my best score. Going further to the highest stable clock my Nano can actually do, 1135 MHz at +64 mV, the reported performance will be a lot worse.


----------



## gupsterg

@buildzoid

No worries, I'll explain why my screenies start at zero.

I use AtomDis to create a tables list for a ROM (guide in the Hawaii BIOS thread OP), so I know where the PowerPlay table is and only view that section of the ROM.

Besides using that method, you will see the hex values of pointers within sections of PowerPlay when comparing with other PowerPlays (not all are marked on the screenie below; it was just a quick meddle this morning).

Below is the last update I did regarding the Fury PP (still a work in progress).

Perhaps it's time for another thread just focusing on BIOS mods for Fury/X by you guys?

*** edit ***

Something regarding the Fury X VoltageObjectInfo for you:

i) you'll also note some values in the black section which change in PT1/PT3.
ii) another tip: this table grows upwards from FF 00 01 07 0C, so if you add values, make the change from there.
iii) if you add values to the table, change the table length at the beginning of the table and the pointer as needed.
iv) if the table size changes, a section in the ROM needs changing, as other tables will have shifted; ref the heading *How to edit ROM for data/command table length changes* (be aware the info will need to be applied in the context of the Fury ROM, which is bigger and may well have more tables).

Sorry to be cryptic, but due to the potentially card-destroying nature of this mod I never documented it fully in the Hawaii thread, only leaving crumbs of info, so I'm doing the same here.
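To make iii) concrete, here's a small Python sketch of the bookkeeping involved: bumping a table's 16-bit little-endian size field, then re-zeroing the legacy VBIOS checksum (traditionally the byte at offset 0x21, with the image size in 512-byte blocks at offset 0x02). The table location here is purely illustrative; verify everything against your own ROM with AtomDis before flashing, given the card-destroying potential mentioned above.

```python
def patch_table_length(rom: bytearray, table_off: int, delta: int) -> None:
    """Adjust the 16-bit little-endian size field at the start of an ATOM table."""
    size = rom[table_off] | (rom[table_off + 1] << 8)
    size += delta
    rom[table_off] = size & 0xFF
    rom[table_off + 1] = (size >> 8) & 0xFF

def fix_checksum(rom: bytearray) -> None:
    """Make the image sum to 0 mod 256 via the legacy checksum byte at 0x21."""
    image_size = rom[0x02] * 512        # size field counts 512-byte blocks
    rom[0x21] = 0
    rom[0x21] = (-sum(rom[:image_size])) & 0xFF

# Illustrative only: grow a hypothetical table at 0x100 by 4 bytes
rom = bytearray(1024)
rom[0x02] = 2                           # 2 * 512 = 1024-byte image
rom[0x100] = 0x20                       # pretend the table length is 0x20
patch_table_length(rom, 0x100, 4)
fix_checksum(rom)
```

Any pointers to tables sitting after the one you grew also need shifting, per iv); this sketch deliberately leaves that part out.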


----------



## p4inkill3r

Quote:


> Originally Posted by *buildzoid*
> 
> Has anyone else here noticed that raising core voltage in software leads to lower FPS? Because after getting weird benchmark results I went and did some testing and got this:
> 
> 
> 
> My Fury Xs are PowerColor cards and both BIOSes on them are the same. So I'm guessing that both of my BIOSes are 250W BIOSes, but that doesn't explain why raising the power limit doesn't help in the slightest with the throttling. Also, GPU-Z doesn't pick up on any of the throttling since it seems to last less than 100ms. The GPUs were below 50C core temperature, so the throttling isn't temperature related.
> 
> I did the testing in Unigine with the 16.1.1b driver. I used both Trixx and Afterburner and both got the same results.


Nice information. +rep


----------



## HexagonRabbit

Now I'm waiting on my Nano. Having this card will make it easier to double up later. Not that any of the Furys won't crossfire, but my OCD won't allow that; I have to be symmetrical. I'm excited as hell and wish I didn't have to upgrade so many WC parts, or I would have gotten another monitor and maybe another Nano. Still, I've rocked this 7970 long enough.


----------



## buildzoid

Bad news, people. Even with the 511 watt, 511 amp BIOS the card trips OCP or OPP at 400W with +200mV. So it looks like there is something other than the BIOS dictating the behavior of the card.


----------



## mattwalter85

Just thought I'd show off my ASUS R9 FURY STRIX. So far I'm pretty impressed with this card and I hope to get a second one soon!


----------



## Arizonian

Quote:


> Originally Posted by *mattwalter85*
> 
> Just thought I'd show off ASUS R9 FURY STRIX. So far im pretty impressed with this card and I hope to get a second one soon!
> 
> 
> Spoiler: Warning: Spoiler!


Looks great @mattwalter85, congrats and welcome to the club and OCN with your second post.

@HexagonRabbit post pics when she's installed.


----------



## Flamingo

Has anyone tried V1Tech or ColdZero for custom backplates? I'm thinking of getting one for my Nano, and I'm wondering if it will affect temperatures.

Apparently the VRMs get very hot and would not like a cable to touch them.

*Edit:*

I thought some of the VRM modules were placed behind the card. Apparently not. I was thinking of getting the EK backplate since it's aluminum and has thermal pads to provide passive cooling, but that seems over the top. Acrylic would be fine too it seems, but since the price difference is small, I'm leaning towards the EK one.

Is it possible to just take out the 4 screws on the edges and use them to mount the EK backplate?


----------



## mattwalter85

Thanks bud, I appreciate it! I used to spend all my time over at OCC, but nobody seems to get on there anymore, so I've moved over to the dark side


----------



## Thoth420

New Fury X tested on Crimson 16.1.1, working flawlessly with all games so far! 2560 x 1440 @ 144Hz. The first one was a dud.

Now to wait for my Agent 47 figurine to arrive to sit on top of the backplate.


----------



## provost

Quote:


> Originally Posted by *Flamingo*
> 
> Has anyone tried V1Tech or ColdZero for custom backplates? Im thinking to get one for my Nano, and Im wondering if it will affect temperatures.
> 
> Apparently the VRMs get very hot and would not like an cable to touch em.
> 
> *Edit:*
> 
> I thought some of the VRM modules were placed behind the card. Apparently, not. I was thinking of getting the EK backplate since its aluminum and has thermal pads too to provide passive cooling, but that seems over the top. Acyclic would be fine too it seems, but since the price difference is little, im leaning towards the EK block.
> 
> Is it possible just to take out 4 screws on the edges, and using them to place the EK backplate?


I tried ordering from ColdZero with some color graphics, but he told me they were in the process of ordering the machine to do that. I just ended up getting one from an artisan here. Acrylic plates gave no problems with VRMs on some other cards I used to run; I'd assume it would be the same for the Nano.

Quote:


> Originally Posted by *Thoth420*
> 
> New Fury X tested on Crimson 16.1.1, working flawlessly with all games so far! 2560 x 1440 @ 144Hz. The first one was a dud.
> 
> Now to wait for my Agent 47 figurine to arrive to sit on top of the backplate.


That's a sweet looking theme.


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*
> 
> New Fury X tested on Crimson 16.1.1, working flawlessly with all games so far! 2560 x 1440 @ 144Hz. The first one was a dud.
> 
> Now to wait for my Agent 47 figurine to arrive to sit on top of the backplate.


Post some more pics of your comp!


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> Post some more pics of your comp!


Will do tomorrow when I can get some natural light in my room.

@Provost Thanks


----------



## Arizonian

Quote:


> Originally Posted by *Thoth420*
> 
> New Fury X tested on Crimson 16.1.1, working flawlessly with all games so far! 2560 x 1440 @ 144Hz. The first one was a dud.
> 
> Now to wait for my Agent 47 figurine to arrive to sit on top of the backplate.


Looks really beautiful but we know the figurine will be the icing on the cake.


----------



## Maximization

Quote:


> Originally Posted by *Thoth420*
> 
> New Fury X tested on Crimson 16.1.1, working flawlessly with all games so far! 2560 x 1440 @ 144Hz. The first one was a dud.
> 
> Now to wait for my Agent 47 figurine to arrive to sit on top of the backplate.


looking goood


----------



## Awsan

Simple question

R9 fury > R9 nano > R9 390x

OR

R9 fury = R9 nano > R9 390x


----------



## keikei

Quote:


> Originally Posted by *Awsan*
> 
> Simple question
> 
> *
> R9 fury > R9 nano > R9 390x*
> 
> OR
> 
> R9 fury = R9 nano > R9 390x


Nano is very close to the Fury. http://www.anandtech.com/show/9621/the-amd-radeon-r9-nano-review/5


----------



## Radox-0

R9 Fury = R9 Nano > R9 390X (if you notch up the Nano's fans very slightly; most reviews show the Nano under conditions in which it thermal throttles. Removing said throttling, my Nano could match my Fury cards (a pair of Sapphire Tri-Xs) even with the lower core clock, thanks to being the full Fiji core)

R9 Nano > R9 Fury > R9 390X (Nano under water)


----------



## Flamingo

Quote:


> Originally Posted by *Awsan*
> 
> Simple question
> 
> R9 fury > R9 nano > R9 390x
> 
> OR
> 
> R9 fury = R9 nano > R9 390x







Nano is usually 2-4 frames behind the Fury.


----------



## HexagonRabbit

Quote:


> Originally Posted by *Flamingo*
> 
> 
> 
> 
> 
> 
> Nano is usually 2-4 frames behind the Fury.


That's with very old drivers. AMD seems to have stepped up their driver game with Crimson. Also, the Nano throttles hard; I'd like to see stock Nano results under water.

At least that's what I'm getting from what I've looked up.


----------



## Awsan

OK, so let's lay down my options: I own a 250D with 2x80mm, 2x120mm, and 1x200mm fans (enough airflow) and won't water-cool the GPU at all.

So will going with the Fury (as it has aftermarket coolers) be worth it, or should I just get a Nano?


----------



## keikei

Quote:


> Originally Posted by *Awsan*
> 
> Ok so lets lay down my options,I own a 250D with 2x80mm, 2x120mm, 1x200mm (Enough airflow) and wont water the GPU at all.
> 
> So will going with the Fury (As it has after market coolers) be worth it or should I just get a nano?


The Nitro is the larger Fury card out now, and I'm not sure it'll fit your case. The Nano will fit, and as of typing this it's also slightly cheaper. I'd go Nano: near-identical performance with better efficiency and half the size. Remember the Nano launched at $650; it's now $480.


----------



## Luftdruck

Is there anyone living in Europe who has a Fury X with a waterblock and would sell their Fury X housing/cooling unit to me?


----------



## Thoth420

Quote:


> Originally Posted by *Arizonian*
> 
> Looks really beautiful but we know the figurine will be the icing on the cake.


That and the game not releasing as a broken mess. If it comes out solid then I am painting the suit and tie on the white side section next to the panel. Maybe an ICA logo on the front panel.

Pictures as some have requested (sorry for the medium-quality images, my phone sucks and I have GAD so my hands shake); this was the best I could manage in my dungeon lair. I tried to get shots of anything that matters. Yes, that is a 2.5-inch laptop HDD under the Intel SSD.

All build credit goes to *Maingear Computers*. I am merely a padawan who could never build this.


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!







It's hard to see, but there is a T connector in the back for bleeding the loop that I just discovered.


----------



## xer0h0ur

If anyone waterblocking a Fury X is selling the illuminated RADEON logo from the card, let me know. #NEEDDAT


----------



## p4inkill3r

Quote:


> Originally Posted by *Thoth420*


Very nice.


----------



## Thoth420

Quote:


> Originally Posted by *p4inkill3r*
> 
> Very nice.


Thank you much, sir!

My dream system.


----------



## HexagonRabbit

Well, I got my Nano last night and tossed it in to test it before putting on the water block. Wow... the coil whine.
Keep in mind that I use an open-air case, but still, that is one of the loudest cards I've ever owned (the record still goes to the 6800 Ultra).


----------



## Radox-0

Quote:


> Originally Posted by *HexagonRabbit*
> 
> Well I got my Nano last night and tossed it in to test it before putting on the water block. Wow....the coil whine.
> Keep in mind that I use an open air case but still, that is one of the loudest card I've ever owned (that record still goes to the 6800 Ultra).


Yeah, kinda pot luck really. The first Nano I had was awful. The second one I received, which is the one I use now, is much better and overclocks much nicer. Similar for my Fury cards: one was terrible while the replacement was okay.

You going to keep it?


----------



## looncraz

Quote:


> Originally Posted by *HexagonRabbit*
> 
> Well I got my Nano last night and tossed it in to test it before putting on the water block. Wow....the coil whine.
> Keep in mind that I use an open air case but still, that is one of the loudest card I've ever owned (that record still goes to the 6800 Ultra).


Make sure you let the thing run for a long time in whatever situation causes it the most coil whine. My 7870XT had the worst coil whine I'd ever heard, but I ran it overnight in one of the game menus and the sound was pretty much gone. Now, years later, the card is completely silent - it's even in my noise-sensitive HTPC build.


----------



## p4inkill3r

Quote:


> Originally Posted by *looncraz*
> 
> Make sure you let the thing run for a long time in whatever situation causes it the most coil whine. My 7870XT had the worst coil whine I'd ever heard, but I ran it overnight in one of the game menus and the sound was pretty much gone. Now, years later, the card is completely silent - it's even in my noise-sensitive HTPC build.


Good advice.
It doesn't always work, but letting it sit at the Unigine Heaven credits screen overnight (gives me 5k FPS) may be enough to blast it into relative silence.


----------



## baii

So I have been using my fully unlocked Fury for a while and have the same dilemma as a few pages ago / as posted in the unlock thread.

With it fully unlocked, I can't push more than 1050, or 1070/80-ish overvolted, vs. 3840 or stock with a few more MHz?


----------



## spyshagg

guys

guys

guys

It's time!!!

*AMD vs NVIDIA*

> Run three benchmarks, print screen the score with GPU-Z, CPU-Z and the custom wallpaper

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50

Put your FURYS to WORK!
Let's do this! Come on!


----------



## HexagonRabbit

Quote:


> Originally Posted by *Radox-0*
> 
> You going to keep it?


Yeah. I think it's a little more noticeable because it's an open-air case 1 foot from my ear. My wife snores, so a little coil whine is tolerable. I'll break it in soon enough.


----------



## bluezone

Guys, HardOCP did some testing of different GPUs in Rise of the Tomb Raider. The results were interesting; the wow factor was the CrossFire scaling of the R9 390X: 99.7%.

http://www.hardocp.com/article/2016/02/15/rise_tomb_raider_video_card_performance_review/10#.VsTPFOTSk7I

There were problems with the R9 Fury X in CrossFire, though it did reach 90% scaling.


----------



## Jflisk

Anyone with a Steam account: they are giving 2 games away for free. Look up Golden Axe or Jet Set Radio; it will entitle you to 4 games, the 2 mentioned being pretty good. Go to "packages that include this game" and install. Oh, and keeping it on topic for the forum: go Fury X.


----------



## Kana-Maru

Quote:


> Originally Posted by *spyshagg*
> 
> guys
> guys
> guys
> It's time!!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> *AMD vs NVIDIA*
> 
> > Run three benchmarks, print screen the score with GPU-Z, CPU-Z and the custom wallpaper
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50
> 
> Put your FURYS to WORK!
> Let's do this! Come on!


Ummmm........
Quote:


> *3DMark Vantage - Performance
> 3DMark11 - Performance
> 3DMark - Fire Strike*


1080p Performance ONLY???
No 1440p Extreme or 4K Ultra Category benchmarks?

Yeah I think I'll pass.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> Ummmm........
> 1080p Performance ONLY???
> No 1440p Extreme or 4K Ultra Category benchmarks?
> 
> Yeah I think I'll pass.


You are right. My GTX 980 with a nice overclock will compete with and sometimes beat the hell out of my Fury at 1080/560, but at 4K my Fury was a good bit quicker.

Sold the Fury, and now I'm selling the k|ngp|n GTX 980. Ready to go CF 390 again!


----------



## xTesla1856

Good morning guys, and excuse me in advance for not reading through the entire thread.

I currently own two Titan Xs in SLI, but since Nvidia refuses to fix Surround + SLI issues that have existed for 10+ years now, I find myself peeking over to the red camp more and more often. Now, if there is anyone in here who has had experience with both TX SLI / 980 Ti SLI and with Fury Xs in Crossfire, I would love to hear what your experience was like and how the platforms compare.

I keep hearing that AMD is where it's at in terms of multi-monitor gaming, something that I use every day. How superior is it to Surround? And are 4GB of VRAM enough for 3x1080p? I apologize if these are too many questions at once, but I'd love to finally dabble with an AMD card.


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are right.. My gtx 980 with nice overclock will compete with and sometimes beat the hell out of my fury at 1080/560 but at 4k my fury was a good bit quicker.
> 
> Sold fury, and now I'm selling k|ngp|n gtx 980. Ready to go cf 390 again!


Exactly. Nvidia's low-overhead DX11 drivers plus crazy-high OCs will prevail every time. Fiji seems to shine at 1440p\1600p and especially at 4K in my tests. Of course the games matter as well [GameWorks = crazy tessellation & closed source code], as do the optimizations\patches etc.

I was definitely checking out those Radeon 390s in CF. A single 390 beats the GTX 970 for sure, and the Radeon 390 in CF is great bang for your buck from the benchmarks I've seen. If you catch them on sale or use rebates you can get two for about $600, which is cheaper than the Fury X, 980 Ti, and Titan X. I felt bad for GTX 980 purchasers this time around.

I've been seeing a lot of talk about 4K not being pleasant with a single GPU. Am I the only one enjoying 4K gaming with my single Fury X at stock clocks?


----------



## Arizonian

Quote:


> Originally Posted by *Kana-Maru*
> 
> Exactly. Nvidia DX11 low overhead drivers plus crazy high OCs will prevail every time. Fiji seems to shine at 1440p\1600p and especially at 4K in my test. Of course the games matter as well [Gameworks = crazy tessellation & closed source code] and the optimizations\patches etc.
> 
> I was definitely checking out those Radeon 390s in CF. A single 390 beats the GTX 970 for sure. The Radeon 390 in CF is great bang for your bucks from the benchmarks I've seen. If you catch em on sale or use rebates you can get two for about $600 which is cheaper than the Fury X and 980 Ti+Titan X. I felt bad for GTX 980 purchasers this time around.
> 
> I've been seeing a lot of talk about 4K not being pleasant with a single GPU. *Am I the only one enjoying 4K gaming with my single Fury X at stock clocks?*


I just recently had my hands on a 4K panel for one very long night and I enjoyed it on a single Nitro Fury - *here*. Hope next panel is a keeper.


----------



## Kana-Maru

Quote:


> Originally Posted by *Arizonian*
> 
> I just recently had my hands on a 4K panel for one very long night and I enjoyed it on a single Nitro Fury - *here*. Hope next panel is a keeper.


Very nice, and those were some great scores. I get close to 60fps in some games, and the performance @ 4K is great from what I've benchmarked and tested. It appears you are having a great experience as well. My frame times were extremely stable with no micro-stutter or input-lag issues. I can support 60Hz @ 4K.


----------



## spyshagg

Quote:


> Originally Posted by *Kana-Maru*
> 
> Ummmm........
> 1080p Performance ONLY???
> No 1440p Extreme or 4K Ultra Category benchmarks?
> 
> Yeah I think I'll pass.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> You are right.. My gtx 980 with nice overclock will compete with and sometimes beat the hell out of my fury at 1080/560 but at 4k my fury was a good bit quicker.
> 
> Sold fury, and now I'm selling k|ngp|n gtx 980. Ready to go cf 390 again!


Well, this is about team effort, not personal pride, but hey, maybe next year!

Cheers anyway.


----------



## NBrock

Quote:


> Originally Posted by *Kana-Maru*
> 
> Exactly. Nvidia DX11 low overhead drivers plus crazy high OCs will prevail every time. Fiji seems to shine at 1440p\1600p and especially at 4K in my test. Of course the games matter as well [Gameworks = crazy tessellation & closed source code] and the optimizations\patches etc.
> 
> I was definitely checking out those Radeon 390s in CF. A single 390 beats the GTX 970 for sure. The Radeon 390 in CF is great bang for your bucks from the benchmarks I've seen. If you catch em on sale or use rebates you can get two for about $600 which is cheaper than the Fury X and 980 Ti+Titan X. I felt bad for GTX 980 purchasers this time around.
> 
> I've been seeing a lot of talk about 4K not being pleasant with a single GPU. Am I the only one enjoying 4K gaming with my single Fury X at stock clocks?


I am loving it so far on my Fury X with my 4k Free Sync monitor. I was worried I was going to have to give up a lot of settings for it to run smoothly.


----------



## Agent Smith1984

Those with Fury Nanos...

What kind of overclocks are you getting with voltage control?

Thinking about going back to the red team and picking one up...


----------



## Radox-0

My Nano is under water, so results will be better than what one gets on the stock cooler. This is what I currently get with memory at 545 MHz:

1075 MHz / +50% power target / -12 mV (I undervolt the card)
1085 MHz / +50% power target / 0 mV
1100 MHz / +50% power target / +6 mV
1110 MHz / +50% power target / +6 mV
1120 MHz / +50% power target / +12 mV
1130 MHz / +50% power target / +36 mV
1140 MHz / +50% power target / +48 mV

However, once I add anything more than 12 mV to stabilise the higher overclocks, the performance actually decreases. 1130 and 1140 MHz also seem unstable in games, so 1120/545 is the sweet spot for my card it seems.

Did not do too much testing on air, to be honest, but I recall being able to maintain 1080 MHz with it, though that was with fans ramped up fully; for daily use 1050/545 was a nice spot with no need to add voltage. In honesty I found the temperature was the limiter rather than the voltage.
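As a trivial sketch of reading that table: pick the highest clock whose offset stays at or below the point where extra voltage starts costing performance (+12 mV on this particular card; not a general rule).

```python
# (core MHz, voltage offset mV) pairs from the list above
vf_points = [(1075, -12), (1085, 0), (1100, 6), (1110, 6),
             (1120, 12), (1130, 36), (1140, 48)]

PERF_DROP_MV = 12  # offsets beyond this reduced measured performance on this card

best = max(mhz for mhz, mv in vf_points if mv <= PERF_DROP_MV)
print(best)  # 1120, the sweet spot quoted above
```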


----------



## NBrock

Quote:


> Originally Posted by *Radox-0*
> 
> My Nano is under water so would be more then what one gets on stock cooler. This is what I currently get with memory at 545 Mhz:
> 
> 1075 Mhz / +50% power target / -12mv (I undervolt the card)
> 1085 Mhz / +50% Power target / 0mv
> 1100 Mhz / +50% Power target/ + 6mv
> 1110 Mhz / +50% power target / +6mv
> 1120 Mhz / +50% power target / +12 mv
> 1130 Mhz / +50% power target / +36 mv
> 1140 Mhz / + 50% power target / +48 mv
> 
> However once I add anything more then 12 mv so to stabilise the higher over clocks, the performance actually decreases. 1130 and 1140 mhz also seems unstable in games so 1120 / 545 is the sweet spot for my card it seems.
> 
> Did not do too much testing on Air to be honest but recall being able to maintain 1080 Mhz with, but that was with fans ramped up fully, for daily use 1050 /545 was a nice spot with 0 need to add voltage. In honesty found the temperature was the limiter as such rather then the voltage.


Some good info thanks!


----------



## Kana-Maru

Quote:


> Originally Posted by *spyshagg*
> 
> well this is about team effort not personal pride but hey maybe next year!
> cheers any way


Personal pride? It's DX11 and 1080p, for crying out loud, lol. $650+ graphics cards are definitely going to be played @ 1440p\1600p & 4K. Nvidia's optimized low-overhead serial DX11 drivers will win every time; with the crazy-high overclocks and tweaks gamers use, they will win across the board at 1080p in most cases. Fiji shines at higher resolutions, and AMD has clearly been focusing on parallel APIs [Mantle\DX12\Vulkan] for a long time now. I will say that I'm impressed with the performance AMD is pushing out with their DX11 drivers for many games [not with synthetic benchmarks in mind, actual performance in games]. Cheers.

Quote:


> Originally Posted by *NBrock*
> 
> I am loving it so far on my Fury X with my 4k Free Sync monitor. I was worried I was going to have to give up a lot of settings for it to run smoothly.


Glad I'm not the only one. Those review sites are depressing; I guess if it isn't a 60 fps average it's pathetic. What about a 55 fps average? I mean, you won't really notice the difference. Unfortunately I don't have a FreeSync monitor. I thought I would have to give up a lot of settings as well, but I max out all settings in every game I play. The only setting I will tone down, since I don't really see the difference at 4K, is high anti-aliasing [MSAA/SMAA etc.]. Everything else is maxed.


----------



## spyshagg

Quote:


> Originally Posted by *Kana-Maru*
> 
> Personal pride? It's DX11 and 1080p for crying out loud lol. $650+ graphics cards are definitely going to be played at 1440p/1600p and 4K. Nvidia's optimized low-overhead serial DX11 drivers will win every time, and with the crazy high overclocks and tweaks gamers use they will win across the board at 1080p in most cases. Fiji shines at higher resolutions, and AMD has clearly been focusing on parallel APIs [Mantle/DX12/Vulkan] for a long time now. I will say that I'm impressed with the performance AMD is pushing out with their DX11 drivers in many games [not with synthetic benchmarks in mind, actual performance in games]. Cheers
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Glad I'm not the only one. Those review sites are depressing; I guess if it isn't a 60 fps average it's pathetic. What about a 55 fps average? I mean, you won't really notice the difference. Unfortunately I don't have a FreeSync monitor. I thought I would have to give up a lot of settings as well, but I max out all settings in every game I play. The only setting I will tone down, since I don't really see the difference at 4K, is high anti-aliasing [MSAA/SMAA etc.]. Everything else is maxed.


Ok man, if you don't want to contribute no one is forcing you!

cheers


----------



## NBrock

This is the monitor I ended up with http://www.newegg.com/Product/Product.aspx?Item=N82E16824160278
Looks like it is out of stock at Newegg and only available there through third-party sellers. The Newegg price was $349. It's a TN panel, but it's a darn good one (same panel as the Samsung). The FreeSync range is 40-60 Hz and I can tell ya it's buttery smooth.


----------



## stargate125645

Newegg has a shell shocker today for a 21:9 AOC monitor, an IPS one, if you want that instead of buying from a different retailer.


----------



## p4inkill3r

Quote:


> Originally Posted by *NBrock*
> 
> This is the monitor I ended up with http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=U2879VF&N=-1&isNodeId=1
> Looks like it is out of stock at NewEgg and just available through other retailers on NewEgg. The NewEgg price was $349. It's a TN panel but it's a darn good one (same as the Samsung). The Free Sync range is 40-60 and I can tell ya it's buttery smooth.


I have the same one and I can concur, it's dope.


----------



## kelleh

Quote:


> Originally Posted by *NBrock*
> 
> I am loving it so far on my Fury X with my 4k Free Sync monitor. I was worried I was going to have to give up a lot of settings for it to run smoothly.


I run my non-watercooled (as of yet) Nano at a 1090 clock and 545 memory with zero voltage increase. My temps also never go above 55°C.


----------



## keikei

Sign me up gentlemen.











Spoiler: Warning: Spoiler!


----------



## jamaican voodoo

i'm loving my fury x setup, here's what i got going; pics are from my iPhone. the monitor is a Philips BDM4065UC 40" Class 4K


----------



## Arizonian

Quote:


> Originally Posted by *keikei*
> 
> Sign me up gentlemen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats on a Nitro Fury - welcome to the club







Got mine running 4K now.








Quote:


> Originally Posted by *jamaican voodoo*
> 
> i'm loving my fury x setup here what i got going, pics are from my iPhone. the monitor is Philips BDM4065UC 40" Class 4K
> 
> 
> Spoiler: Warning: Spoiler!


Really really nice


----------



## keikei

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats with a Nitro Fury - welcome to the club
> 
> 
> 
> 
> 
> 
> 
> Got mine running 4K now.
> 
> 
> 
> 
> 
> 
> 
> 
> Really really nice


How's the eye candy @4k?


----------



## Arizonian

Quote:


> Originally Posted by *keikei*
> 
> How's the eye candy @4k?


I think the best way to explain it is to remember the 'wow' factor when you went from 1080 to 1440 for the first time, just not quite as much 'wow'. I haven't had it too long, but another way to describe it: the colors in Crysis 3 at 4K without AA looked as brilliant to me as 1440 with AA. Streaming Netflix seemed to be a very crisp picture too.


----------



## keikei

Quote:


> Originally Posted by *Arizonian*
> 
> I think the best way to explain it is to remember the 'wow' factor when you went from 1080 to 1440 for the first time, just not quite as much 'wow'. I haven't had it too long, but another way to describe it: the colors in Crysis 3 at 4K without AA looked as brilliant to me as 1440 with AA. Streaming Netflix seemed to be a very crisp picture too.


I find with this card I don't have to compromise as much as, say, with a 290. It barely fits the case, lol. I can only imagine what Polaris will bring to gaming. I'm even testing some games at 5K; at that res, AA is absolutely not needed.


----------



## Arizonian

Quote:


> Originally Posted by *keikei*
> 
> I find with the card i dont have to compromise as much as say with a 290. It barely fits the case. Lol. I can only imagine what polaris will bring to gaming. I'm even testing some games on 5k. At that res, aa is absolutely not needed.


Will be nice, as expecting a modest 30% gain in 4K from a single Polaris (Fury equivalent) is not unfathomable.

Example: SOM (Ultra) @ 4K
Now: 30 min / 45 avg / 60 max
After: 39 min / 58 avg / 78 max

I feel we'll finally see sales of 4K monitors increase in the common household as the premium monitor to have. Right now it just doesn't seem like a lot of people are making the move yet; I'm waiting on the market and what might fully unfold in the way of displays with synchronization.

Took me over a year to decide myself since the ROG Swift TN panel released. It's finally time. I don't expect manufacturers to make 4K monitors at a 144 Hz refresh rate until a single GPU can keep a minimum of 80 FPS and a 100 FPS average. I just don't see manufacturers realistically catering to the very small niche who can push it properly with two graphics cards. I know some think we'll be seeing those types of monitors by next year, but I'm going to speculate differently.

Good times poised.


----------



## keikei

Quote:


> Originally Posted by *Arizonian*
> 
> Will be nice, as expecting a modest 30% gain in 4K from a single Polaris (Fury equivalent) is not unfathomable.
> 
> Example: SOM (Ultra) @ 4K
> Now: 30 min / 45 avg / 60 max
> After: 39 min / 58 avg / 78 max
> 
> I feel we'll finally see sales of 4K monitors increase in the common household as the premium monitor to have. Right now it just doesn't seem like a lot of people are making the move yet; I'm waiting on the market and what might fully unfold in the way of displays with synchronization.
> 
> Took me over a year to decide myself since the ROG Swift TN panel released. It's finally time. I don't expect manufacturers to make 4K monitors at a 144 Hz refresh rate until a single GPU can keep a minimum of 80 FPS and a 100 FPS average. I just don't see manufacturers realistically catering to the very small niche who can push it properly with two graphics cards. I know some think we'll be seeing those types of monitors by next year, but I'm going to speculate differently.
> 
> Good times poised.


DP 1.3 will be standard on Polaris, and presumably on Pascal as well. I can see manufacturers releasing those high-Hz 4K monitors within two more years if not sooner. Dell will release their OLED monitor this March and it'll be the first out of the gate with those specs, but at a ridiculous price of 5 grand.


----------



## shadowxaero

So a close friend of mine bought a 980 Ti... which meant... benchmark wars. So in my valiant attempt to beat a card that cost 200 dollars more than mine ;-) I started trying a few things, one of which was flashing an older, or rather different, BIOS.

My Fury Tri-X shipped with BIOS 015.049.000.003.000000 and I flashed it to 015.049.000.002.000000, which I believe is just the Tri-X OC BIOS.

Upon doing this I was able to get a stable OC of 1225/600 with +54 mV, although 600 MHz on the HBM yielded lower scores, so I ran Fire Strike at 1225/590. Nonetheless the results impressed me.

Firestrike Ultra: 4429
http://www.3dmark.com/fs/7614849

Firestrike Extreme: 8103
http://www.3dmark.com/fs/7614895

Before flashing the BIOS, I wasn't stable at anything over 1160.


----------



## stargate125645

Quote:


> Originally Posted by *shadowxaero*
> 
> So a close friend of mine bought a 980 Ti... which meant... benchmark wars. So in my valiant attempt to beat a card that cost 200 dollars more than mine ;-) I started trying a few things, one of which was flashing an older, or rather different, BIOS.
> 
> My Fury Tri-X shipped with BIOS 015.049.000.003.000000 and I flashed it to 015.049.000.002.000000, which I believe is just the Tri-X OC BIOS.
> 
> Upon doing this I was able to get a stable OC of 1225/600 with +54 mV, although 600 MHz on the HBM yielded lower scores, so I ran Fire Strike at 1225/590. Nonetheless the results impressed me.
> 
> Firestrike Ultra: 4429
> http://www.3dmark.com/fs/7614849
> 
> Firestrike Extreme: 8103
> http://www.3dmark.com/fs/7614895
> 
> Before flashing bios, I wasn't stable at anything over 1160.


What software are you using to OC the card and apply additional voltage? Trixx?


----------



## shadowxaero

Quote:


> Originally Posted by *stargate125645*
> 
> What software are you using to OC the card and apply additional voltage? Trixx?


Afterburner 4.2.0


----------



## p4inkill3r

1225MHz is pretty great.


----------



## shadowxaero

Quote:


> Originally Posted by *p4inkill3r*
> 
> 1225MHz is pretty great.


I wanna go higher lol, but literally anything over that crashes; a single MHz to 1226 and black screen haha. And when I add more voltage than +54, weird things start happening: I start getting signal loss and a lot of flickering (only when running at 2160p/60Hz; at 30Hz I can add voltage up to +96 and be fine).


----------



## Noirgheos

Where would I get the Tri-X OC BIOS and how would I apply it? It may solve my stutter.


----------



## kelleh

Fastest Air Cooled Nano ever?

http://www.3dmark.com/3dm/10080908 - Fire Strike Ultra

http://www.3dmark.com/fs/7620211 Fire Strike

These settings are about gaming stable, though I game at lower clocks so as not to push my GPU/CPU when not needed.


----------



## JonDuma

Hi, what do you use to OC the Fury? Is it Trixx or MSI Afterburner?

Thanks,
Jon


----------



## JonDuma

Quote:


> Originally Posted by *98uk*
> 
> What voltage bump are you guys running for an 1100 MHz core?
> 
> I thought I had it stable at 1100/550 @ +18 mV (+50% power); it survived BF4 and 3DMark... but crashed very quickly in pCARS.
> 
> I bumped to +36 mV and now pCARS is stable... but is +36 mV a lot for 1100 MHz?


Quote:


> Originally Posted by *JonDuma*
> 
> Hi, what do you use to OC the fury? Is it trixxx oy msi afterburner?
> 
> Thanks,
> Jon


.


----------



## kelleh

Quote:


> Originally Posted by *JonDuma*
> 
> Hi, what do you use to OC the fury? Is it trixxx oy msi afterburner?
> 
> Thanks,
> Jon


I use Trixx


----------



## bluezone

Has anyone tried loading the Radeon Software beta for Vulkan?

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Vulkan-Beta.aspx?webSyncID=c22ca125-14e6-3ba7-2140-16c1a2a5d2d5&sessionGUID=8b8499e2-2b83-2dbd-6791-6bde957b1409

Some of the Guru3D members have been loading it over top of the Crimson 16.1.1 software package. The reason is that it seems to reduce hitching and stutter in certain games, Rise of the Tomb Raider for one. In RotTR the load-level stutter that occurs when fast traveling to areas is gone. I've tried this in the worst-case scenario, e.g. Windows 7 with HD 7950s in CrossFire.

Note this doesn't add any apparent FPS, only smoothness in gameplay.

Can anyone running R9 Fury in CrossFire give this a go and let me know how it works for them?

P.S. AMD has finally shipped out my R9 Nano. So hopefully I should receive it next week. Woooo whoooo.


----------



## Wagnelles

Please don't take this question as an attempt to spark a fight.

But I'm looking to have a dual-GPU solution in my next build. How is Fury X CrossFire doing with the new Crimson drivers? Any word about FreeSync (compared to G-Sync)?

Thanks!


----------



## Maximization

Quote:


> Originally Posted by *Wagnelles*
> 
> Please don't take this question as an attempt to spark a fight.
> 
> But I'm looking to have a dual-GPU solution in my next build. How is Fury X CrossFire doing with the new Crimson drivers? Any word about FreeSync (compared to G-Sync)?
> 
> Thanks!


In my use so far with a 4K 40" display, Fury X CrossFire is plenty of power. I am using the Star Citizen beta; screen movements are smooth and graphics are clear on Crimson.


----------



## Wagnelles

Quote:


> Originally Posted by *Maximization*
> 
> In my use so far with a 4K 40" display, Fury X CrossFire is plenty of power. I am using the Star Citizen beta; screen movements are smooth and graphics are clear on Crimson.


Actually my thing is 21:9. The suggested monitor for such a setup would be the FreeSync variant of the Acer Predator X34.


----------



## Himo5

Has anyone at all on this thread experienced the Fury X Display Problem or is aware that after 6 months AMD still hasn't the slightest idea how to reproduce it - let alone solve it?


----------



## Noirgheos

Quote:


> Originally Posted by *Himo5*
> 
> Has anyone at all on this thread experienced the Fury X Display Problem or is aware that after 6 months AMD still hasn't the slightest idea how to reproduce it - let alone solve it?


AMD knows how to reproduce it, and to be frank, the fix is easy as hell: just don't let the card drop to 2D clocks at idle, and even then it won't always happen. I've gotten it twice since I've had the Fury (3 months); the simple fix is to just change resolution and then change back, done.

With ClockBlocker I haven't had it, which might be because it keeps the Fury above 300MHz most of the time.


----------



## Himo5

Quote:


> Originally Posted by *Noirgheos*
> 
> AMD knows how to reproduce it, and to be frank, the fix is easy as hell: just don't let the card drop to 2D clocks at idle, and even then it won't always happen. I've gotten it twice since I've had the Fury (3 months); the simple fix is to just change resolution and then change back, done.
> 
> With ClockBlocker I haven't had it, which might be because it keeps the Fury above 300MHz most of the time.


Well, on the Community site last Thursday (Feb 17th) they said they were still trying to solve it, and there's not been any update to their previous report of being unable to reproduce it. If we knew how to set it off we would at least be able to avoid it, but even the 2D-clock theory has not been officially confirmed there.


----------



## buildzoid

I did even more testing on the voltage-based throttling. It is 100% power based. The bad news is that it is not affected by BIOS power settings. I've also gone and gotten out all my power measurement gear, which is why it became so obvious what's happening.

I don't have a table for the new results since they aren't worth a table.
Here's what I got on the new round of testing:

Fury X at 500/500 MHz, 1.05 V: 165 FPS, about 110 W power draw
Fury X at 500/500 MHz, 1.39 V: 165 FPS, about 230 W power draw

Fury X at 1050 MHz, 1.2 V: 302 FPS, about 310 W power draw
Fury X at 1050 MHz, 1.4 V: *287 FPS*, about 420 W power draw

To remove the throttling I tried taking the 350 W Sapphire Fury Tri-X BIOS, unlocking it to 4096 cores, and using that. That did absolutely nothing to the "micro" throttle (that's what I'm gonna call it, since GPU-Z doesn't pick it up). However, based on the Tri-X BIOS I made my own 511 W/511 A, 768 W/768 A, and 65000 W/65000 A BIOSes. None of those did anything either. I tried doing BIOS mods for more voltage, but as far as I can tell that didn't work at all (I'm probably doing it wrong).

In theory, physical power and voltage mods should not be affected, because whatever software is screwing with the core clocks will be fed false information about what the GPU is actually doing and therefore won't kick in. There's a good chance that even the simple physical volt mod on its own will be enough to trick the power management. There is also a chance that a combination of physical volt mod + power-hacked BIOS will work.

Now I just need to get all my volt-modding supplies again (I moved from CZ to the UK).
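The figures above track reasonably well with the usual first-order dynamic-power approximation P ≈ k · f · V² (leakage, which grows quickly with voltage, comes on top of this and is ignored here). A rough sanity-check sketch in Python, using only the numbers from this post:

```python
# Scale a measured power draw to a new clock/voltage point using the
# first-order dynamic power model P ~ k * f * V^2. Static/leakage power
# is ignored, so high-voltage points should come in above the estimate.

def scale_power(p_ref, f_ref, v_ref, f, v):
    """Scale reference draw p_ref (W) at f_ref (MHz) / v_ref (V) to f / v."""
    return p_ref * (f / f_ref) * (v / v_ref) ** 2

# Reference point from the post: 500 MHz core at 1.05 V, ~110 W.
print(f"1050 MHz @ 1.20 V: ~{scale_power(110, 500, 1.05, 1050, 1.20):.0f} W "
      f"(reported ~310 W)")
print(f" 500 MHz @ 1.39 V: ~{scale_power(110, 500, 1.05, 500, 1.39):.0f} W "
      f"(reported ~230 W)")
```

The 1.2 V point lands within about 3% of the reported draw, while the 1.39 V point undershoots by roughly 37 W, consistent with leakage climbing fast once the voltage is pushed.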


----------



## Wagnelles

Got another question:

*Is FreeSync + CrossFire + VSR possible?* That's a reason to get Fury X's for me. In case you don't know, G-Sync + SLI + DSR cannot all be active at the same time; it's a known issue that Nvidia has refused to fix for more than a year.

Edited because I created a separate thread for everything else I wanted to ask.


----------



## akromatic

Just wondering, what is the overclock potential of the R9 Nano when watercooled? So far all the benches I've found indicate that it will throttle on air, but there is nothing on water.

Thinking whether I should get the Nano ($769 AUD) or the Fury X ($999 AUD) given the current price drop on the Nano.

I don't quite see a point in the regular air-cooled Fury when the Nano is currently cheaper and is better when overclocked.


----------



## Radox-0

Quote:


> Originally Posted by *akromatic*
> 
> Just wondering, what is the overclock potential of the R9 Nano when watercooled? So far all the benches I've found indicate that it will throttle on air, but there is nothing on water.
> 
> Thinking whether I should get the Nano ($769 AUD) or the Fury X ($999 AUD) given the current price drop on the Nano.
> 
> I don't quite see a point in the regular air-cooled Fury when the Nano is currently cheaper and is better when overclocked.


Posted my results a while back. Under water there's no issue temps-wise with any setting; 1125 MHz/545 MHz seems to be the sweet spot for my card with +12 mV. I can push it to 1145 MHz/545 MHz, but this requires around +36 mV. It seems that once you go over a certain amount of voltage performance decreases, so while I can stabilise higher overclocks thanks to the higher voltage required to get there, performance actually drops. Not unique in this, as buildzoid's great posts prior allude to.

So long story short: watercooled, you will likely reach the performance of, and top out comparably to, a moderately overclocked Fury X. The Fury X however will have a higher top limit, it seems.


----------



## akromatic

Quote:


> Originally Posted by *Radox-0*
> 
> Posted my results a while back. Under water there's no issue temps-wise with any setting; 1125 MHz/545 MHz seems to be the sweet spot for my card with +12 mV. I can push it to 1145 MHz/545 MHz, but this requires around +36 mV. It seems that once you go over a certain amount of voltage performance decreases, so while I can stabilise higher overclocks thanks to the higher voltage required to get there, performance actually drops. Not unique in this, as buildzoid's great posts prior allude to.
> 
> So long story short: watercooled, you will likely reach the performance of, and top out comparably to, a moderately overclocked Fury X. The Fury X however will have a higher top limit, it seems.


Got any bench results and power draw figures?

I'm not really interested in getting max overclocks, as I'm very conscious of my power draw (SFX PSU limits) and cooling limitations since I'll be cutting down on as much rad space as I can, but I'm still after Fury X speeds and perhaps catching the 980 Ti.

There are 2 scenarios I'm looking at getting a Nano for:

1: watercooling in my Raijintek Metis, which currently has an i7 3770 + GTX 970 under a single 120mm rad (temps are fine, but I hope to keep the Nano within similar temp margins)
2: a custom, as-small-as-possible mATX case with 2x Nano watercooled and a lower-TDP (~80 W) LGA2011-3 CPU that I'm hoping to cool with just 240mm worth of rad space.

Currently I'm running triple display with 2x 1080p and a 2560x1080 ultrawide in the middle, and I'm intending to move up to 2x 1440p with a 3440x1440 ultrawide.


----------



## Radox-0

Quote:


> Originally Posted by *akromatic*
> 
> Got any bench results and power draw figures?
> 
> I'm not really interested in getting max overclocks, as I'm very conscious of my power draw (SFX PSU limits) and cooling limitations since I'll be cutting down on as much rad space as I can, but I'm still after Fury X speeds and perhaps catching the 980 Ti.
> 
> There are 2 scenarios I'm looking at getting a Nano for:
> 
> 1: watercooling in my Raijintek Metis, which currently has an i7 3770 + GTX 970 under a single 120mm rad (temps are fine, but I hope to keep the Nano within similar temp margins)
> 2: a custom, as-small-as-possible mATX case with 2x Nano watercooled and a lower-TDP (~80 W) LGA2011-3 CPU that I'm hoping to cool with just 240mm worth of rad space.
> 
> Currently I'm running triple display with 2x 1080p and a 2560x1080 ultrawide in the middle, and I'm intending to move up to 2x 1440p with a 3440x1440 ultrawide.


Don't have too many benches saved but can do some checks for you on Wednesday. Had the LG curved 3440x1440 ultrawide but sold it Friday to make way for the ASUS ROG 100 Hz 3440x1440 panel; it should hopefully arrive tomorrow, so I will do some further benches on Wednesday. Here are Fire Strike / Ultra examples in the meantime: http://www.3dmark.com/fs/7547292/ http://www.3dmark.com/fs/7547342

However I can help somewhat on the SFX bit. Mine is in my SFX build too (the "White Fusion" rig in my sig), so I am always conscious of power as well. I use the 500 Watt Silverstone SFX unit, and with a 4690K overclocked to 4.7 GHz at 1.31 volts and the Nano @ 1145/545 MHz, power target +50%, +42 mV, the peak amount I pulled from the wall according to my power meter was 437 watts over about 3 hours of mixed usage and benchmarking, trying to get the max figure I could to basically ensure the PSU was sufficient.

For temps, I run the CPU (and PCH/VRMs, as it's a complete motherboard block) and Nano under a single 240mm loop with the EK SE 240mm rad (their slim radiator) and a pair of EK Vardars, and temps are reasonable. Under gaming scenarios, with fans going away at about 1200 RPM, the temperatures on the Nano reach about 45 degrees and 50 on the CPU, which while hot for a custom loop is not really an issue per se. Although I expect this will differ quite a bit for yourself.
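For anyone sizing an SFX unit the same way: a wall meter reads input power, which includes the PSU's conversion loss, so the DC load the unit actually delivers is lower than the wall figure. A small sketch of the arithmetic; the 90% efficiency is an assumed ballpark for this class of unit at this load, not a measured number:

```python
# Estimate the DC-side load and remaining headroom of a PSU from a
# wall-power reading. The efficiency figure is an assumption, not measured.

def psu_headroom(wall_watts, rated_watts, efficiency=0.90):
    """Return (estimated DC load in W, headroom vs. the PSU's DC rating)."""
    dc_load = wall_watts * efficiency
    return dc_load, rated_watts - dc_load

# Peak wall reading from the post: 437 W on a 500 W SFX unit.
dc, spare = psu_headroom(437, 500)
print(f"~{dc:.0f} W DC load on a 500 W unit, ~{spare:.0f} W of headroom")
```

So the 437 W peak corresponds to roughly 393 W of DC load, leaving about 107 W of margin on the 500 W unit, which matches the conclusion that the PSU was sufficient.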


----------



## akromatic

Yeah, well at stock clocks I'm going to assume it's gonna run a tad hotter than my current CPU + 970 on a single slim 120, where they are both around 55°C each with my Noiseblocker fan @ 1200 RPM. I guess I'm comfortable with them both running up to around 70°C load, as that's still better than air temps and should not throttle.

Cool, I'm also currently using a 500 W SFX-L. I've been reading your past posts and you mentioned that you killed an SFX PSU; mind sharing that story?

Glad to know that you are running a 1440p ultrawide, though I intend to run surround, hence I need to move to team red as team green doesn't support mixed resolutions. How is the performance on the ultrawide?


----------



## Radox-0

Quote:


> Originally Posted by *akromatic*
> 
> Yeah, well at stock clocks I'm going to assume it's gonna run a tad hotter than my current CPU + 970 on a single slim 120, where they are both around 55°C each with my Noiseblocker fan @ 1200 RPM. I guess I'm comfortable with them both running up to around 70°C load, as that's still better than air temps and should not throttle.
> 
> Cool, I'm also currently using a 500 W SFX-L. I've been reading your past posts and you mentioned that you killed an SFX PSU; mind sharing that story?
> 
> Glad to know that you are running a 1440p ultrawide, though I intend to run surround, hence I need to move to team red as team green doesn't support mixed resolutions. How is the performance on the ultrawide?


Yep, fully agree: while the temps are not amazing for a loop, they're far better than what you get on air and eliminate throttling altogether.

Yes, I should have gone back and amended that! My fault; it was not the PSU that died, it was the pump. My pump/res is located underneath the PSU area and it appears the O-ring was not fully sealed. It looks like some fluid made its way into the pump's motor and killed it, basically. I assumed the smell and noise at the time were coming from the PSU. Replaced the pump last week, problem solved. The PSU is fine.

Performance-wise the Nano runs pretty nicely on the ultrawide, more so under water, where it's basically giving Fury X performance. Most titles run high/ultra settings pretty happily once you turn off the GameWorks features in some titles such as Arkham Knight and The Witcher, or notch down overly demanding settings such as vegetation in GTA 5.

I will see if I can pull together some benchmarks in some titles and record them on Wednesday once I have a monitor again.


----------



## Alastair

I wonder if the performance decrease when the voltage goes over a certain point is caused by some sort of throttling, and if there is a way around it? I mean, this is the first GPU I have seen that doesn't seem to respond well to voltage. Has anyone tried contacting AMD about it?


----------



## Flamingo

My R9 Nano has finally arrived!

What drivers should I install straight away? Crimson or the ones before? Is the downclocking issue related to any specific drivers?


----------



## Radox-0

Quote:


> Originally Posted by *Flamingo*
> 
> My R9 Nano has finally arrived!
> 
> What drivers should I install straightaway? Crimson or the ones before? The downclocking issue is related to any specific drivers?


I just use the latest Crimson ones.

Downclocking issue?


----------



## dagget3450

I am being a little salty with my Fury experience; that said, can anyone here look at the below post I made from the fanboy war thread? While it appears the Fury cards ran with tessellation on and the 290Xs didn't, the scores are still strange. I realize most don't have 3 Fury GPUs, but even 2-way might help for data. It appears there is negative scaling, or a heavier CPU limitation?

The below data is 3DMark 11 @ the Performance preset; you can also refer to http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd - the OP has a table with results. It looks like Fury is ignoring the CF profile and running with tessellation on. Even then, the scores seem way low, and lower than tweaked 290X runs.

discussion also started here --> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/500#post_24918967

I would like to think I am just wrong here. So perhaps while my system is down (back up hopefully tomorrow) someone can prove I am wrong.
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> My point is, if you look at the details something isn't right, and I am able to replicate this on my setup as well. His platform is pretty close to yours; his RAM speed is higher and you're slightly ahead on CPU clock by 100 MHz or so... But what I'm talking about specifically is the GPU tests in 3DMark 11.
> 
> 3DMark 11 GPU tests | htaddict | mus1mus
> GPU test 1 | 170.33 fps | 194.26 fps
> GPU test 2 | 280.74 fps | 271.02 fps
> GPU test 3 | 345.63 fps | 372.36 fps
> GPU test 4 | 149.06 fps | 174.18 fps
> 
> There is something buggy with this bench and Fury in multi-GPU. Here is mine on the same bench with 4x Fury X on a 5960X
Click to expand...


----------



## Jflisk

Quote:


> Originally Posted by *Himo5*
> 
> Has anyone at all on this thread experienced the Fury X Display Problem or is aware that after 6 months AMD still hasn't the slightest idea how to reproduce it - let alone solve it?


Had it happen yesterday. It only happens on the desktop, sporadically; I can game for hours straight with no problems. I have reported it on AMD's site and on the same forum you are probably linking.


----------



## Flamingo

Quote:


> Originally Posted by *Radox-0*
> 
> I Just use latest Crimson ones.
> 
> Downclocking issue?


The problem where the card needs ClockBlocker to run properly, or something like that.


----------



## NBrock

Quote:


> Originally Posted by *dagget3450*
> 
> I am being a little salty with my Fury experience; that said, can anyone here look at the below post I made from the fanboy war thread? While it appears the Fury cards ran with tessellation on and the 290Xs didn't, the scores are still strange. I realize most don't have 3 Fury GPUs, but even 2-way might help for data. It appears there is negative scaling, or a heavier CPU limitation?
> 
> The below data is 3DMark 11 @ the Performance preset; you can also refer to http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd - the OP has a table with results. It looks like Fury is ignoring the CF profile and running with tessellation on. Even then, the scores seem way low, and lower than tweaked 290X runs.
> 
> discussion also started here --> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/500#post_24918967
> 
> I would like to think I am just wrong here. So perhaps while my system is down (back up hopefully tomorrow) someone can prove I am wrong.


Maybe with that 4th card active it is causing the PCIe speed to go down? Not sure what board you have, but maybe the 4th card needs to go in a different slot?


----------



## dagget3450

Quote:


> Originally Posted by *NBrock*
> 
> Maybe with that 4th card active it is causing PCIE speed to go down? Not sure what board you have but maybe the 4th card needs to go in a different slot?


I'm sure it does, but even then 3-way vs 3-way is still lower. So when I am back up I'll run 3-way. Even setting aside my results, looking at 3 Furys against 3 290Xs: is tessellation going to cause such a huge deficit that it runs slower on Fury? Shouldn't the speed difference alone at least have it equal or slightly ahead on Fury?


----------



## Origondoo

Hi all,

Does anyone know when the Fury X2, aka Gemini, will be released?

I have a new build missing a GPU. It's mITX, so I'm looking for a dual-GPU card.


----------



## NBrock

Quote:


> Originally Posted by *dagget3450*
> 
> I'm sure it does, but even then 3-way vs 3-way is still lower. So when I am back up I'll run 3-way. Even setting aside my results, looking at 3 Furys against 3 290Xs: is tessellation going to cause such a huge deficit that it runs slower on Fury? Shouldn't the speed difference alone at least have it equal or slightly ahead on Fury?


The Furys should definitely be faster. Maybe it's a driver issue. Have you tried uninstalling the drivers and using something like DDU to wipe any traces of them, then installing the latest 16.1.1 beta drivers? It also might be worth running ClockBlocker during your benchmarks; maybe one or more of the cards are not running at full speed for whatever reason.


----------



## keikei

Quote:


> Originally Posted by *Origondoo*
> 
> Hi all,
> 
> Does anyone know when the Fury X2, aka Gemini, will be released?
> 
> I have a new build missing a GPU. It's mITX, so I'm looking for a dual-GPU card.


Best estimate is early Q2 2016.


----------



## Origondoo

Quote:


> Originally Posted by *keikei*
> 
> Best estimate is early Q2 2016.


Thanks for the info.
Hope they will set the price point accordingly, since the next gen is around the corner.


----------



## keikei

Quote:


> Originally Posted by *Origondoo*
> 
> Thanks for info.
> Hope they will set the price point accordingly, since the next gen is around the corner


Best-case scenario for Polaris is late summer/early autumn; worst case is late winter. I don't even know if Pascal has an announced launch date yet. Plenty of rumors and speculation though.


----------



## Flamingo

R9 Nano with SG-13 Case





How are these temperatures? (With the chassis cover on.)



I ran Heaven standard benchmark and got a score:



Fans ran at 52% speed max and were not annoyingly loud yet, but you could notice them running faster than at idle.

The rear grille metal of the card was so hot I couldn't keep my finger on it for long. The front of the case was okay-ish, but definitely warm.

CPU temps slightly increased too:
On Intel Graphics, CPU Temp: 47C
On R9 Nano, CPU Temp: 51C


----------



## Medusa666

I need some advice from you guys,

I bought an XFX R9 Nano in November last year, and I love this little guy to bits even though it's in an E-ATX system. Needless to say, I'm very happy with the performance, the form factor and everything else about it.

Thing is, over the last month I've had weird things happening. The screen blacks out for 2-4 seconds while gaming, then comes back on; the screen makes a small popping sound (hard to describe) and an artifact-like thingy flashes horizontally across the screen for a second; I get black square artifacts (very small on a 1440p screen); and so on. This only happens in one game, Elite Dangerous: Horizons. In Windows the screen went nuts twice and everything got scrambled, and in Heroes of the Storm some (not all) textures flicker constantly.

All in all, these small nuisances occur relatively frequently but at random. I suspected the card was faulty, but I ran 3DMark and FurMark for 40-60 minutes and nothing showed up: it works fine, no errors or display artifacts.

The vendor has offered to take the card in for testing; they told me they will run it in FurMark and the like for a full 24 hours. Since the errors I described above are intermittent, what worries me is that if they don't find any faults I will receive the card back after it has been stressed really hard, i.e. in worse shape than I left it in.

What should I do? Could it be the display? It's an Acer XG270HU 1440p 144 Hz FreeSync monitor, the GPU is not overclocked, the PSU is a Corsair RM1000i (quality stuff) and the motherboard is an MSI GODLIKE X99.

Halp! :) Pls.


----------



## keikei

^
Quote:


> Originally Posted by *Medusa666*
> 
> I need some advice from you guys,
> 
> I bought a XFX R9 Nano in november last year, I love this little guy to bits even though I got him in a E-ATX system. Needless to say, I'm very happy with the performance and the form factor and everything else about it.
> 
> Thing is, the last month I had weird things happening. Screen blacks out for 2-4 seconds, then comes back on while gaming, screen does a small explosive sound ( hard to describe ) and an artifact like thingy flashes horizontally across the screen for a second, I get black square artifacts ( 1440P screen so very small ) and so on. This only happens in one game, Elite Dangerous Horizons. While in Windows the screen went nuts two times and everything got mixed up, and while playing Heroes of The Storm the textures flickers constantly ( not all, just some ).
> 
> All in all, I got all these small nuisances that occurs relative frequently but random. I suspected that it was the card that was faulty, but thing is I ran 3D mark and Furmark for like 40-60 min and nothing shows up, works good, no errors or display artifacts.
> 
> The vendor has offered to take the card in for testing, they told me they will run the card for hours in Furmark etc for a full 24 HRS. And since these errors I wrote about above are non consistent what worries me is that if they do not find any faults I will recieve the card back after it being stressed really hard, i.e in worse shape than I left it in.
> 
> What should I do? Could it be the display? It is a Acer XG270HU 1440P 144 Hz Freesync monitor, the GPU is not overclocked and the PSU is a Corsair RM1000i ( quality stuff ), motherboard is MSI GODLIKE X99.
> 
> Halp! : ) Pls.


Sounds like a driver issue. There is a hotfix for the latest beta. Also try reinstalling, or rolling back to a known working driver.


----------



## p4inkill3r

Quote:


> The vendor has offered to take the card in for testing, they told me they will run the card for hours in Furmark etc for a full 24 HRS.


Who is this vendor?


----------



## Medusa666

Quote:


> Originally Posted by *p4inkill3r*
> 
> Who is this vendor?


It's just the local store I bought it from. I asked them what kind of tests they do, and they said they do a lot, among them 12-24 hours of FurMark, which sounds extremely punishing.

I don't want to name them here in this thread though; their intentions are not bad.


----------



## Medusa666

Quote:


> Originally Posted by *keikei*
> 
> ^
> Sounds like a driver issue. There is hotfix for the latest beta. Also try reinstalling or rolling back to a known workable driver.


Yeah, I have been considering that. This has occurred with all Crimson drivers; I can't recall if it happened prior to Crimson, so it may be worth rolling back to the last Catalyst and trying that before I ship the card.

Funny thing is, everything works fine 95% of the time; these errors happen sporadically and infrequently.


----------



## p4inkill3r

Quote:


> Originally Posted by *Medusa666*
> 
> It is just the local store I bought it from, I asked them what kind of tests they do and they said that they do alot but among them are 12-24HR Furmark, which sounds extremely punishing.
> 
> I don't want to name them here in this thread though, their intentions are not bad.


Fair enough, but I'd be hesitant about handing over my card to them.

Perhaps reinstalling your drivers or just writing it off as behavior limited to a couple of titles would be an equitable solution for you?


----------



## Medusa666

Quote:


> Originally Posted by *p4inkill3r*
> 
> Fair enough, but I'd be hesitant about handing over my card to them.
> 
> Perhaps reinstalling your drivers or just writing it off as behavior limited to a couple of titles would be an equitable solution for you?


Yeah, been considering that too. Thing is, in my country, for the first 6 months it's up to the vendor to prove that the customer caused the fault; after 6 months it's the customer who has to prove the card is faulty, i.e. it gets harder to have it replaced later on.

But you are right: it's just a few titles that have problems, and if the card breaks down completely they will swap it out, 6 months or not.

I'll do some more tests and see what happens.


----------



## Thoth420

Quote:


> Originally Posted by *Medusa666*
> 
> I need some advice from you guys,
> 
> I bought a XFX R9 Nano in november last year, I love this little guy to bits even though I got him in a E-ATX system. Needless to say, I'm very happy with the performance and the form factor and everything else about it.
> 
> Thing is, the last month I had weird things happening. Screen blacks out for 2-4 seconds, then comes back on while gaming, screen does a small explosive sound ( hard to describe ) and an artifact like thingy flashes horizontally across the screen for a second, I get black square artifacts ( 1440P screen so very small ) and so on. This only happens in one game, Elite Dangerous Horizons. While in Windows the screen went nuts two times and everything got mixed up, and while playing Heroes of The Storm the textures flickers constantly ( not all, just some ).
> 
> All in all, I got all these small nuisances that occurs relative frequently but random. I suspected that it was the card that was faulty, but thing is I ran 3D mark and Furmark for like 40-60 min and nothing shows up, works good, no errors or display artifacts.
> 
> The vendor has offered to take the card in for testing, they told me they will run the card for hours in Furmark etc for a full 24 HRS. And since these errors I wrote about above are non consistent what worries me is that if they do not find any faults I will recieve the card back after it being stressed really hard, i.e in worse shape than I left it in.
> 
> What should I do? Could it be the display? It is a Acer XG270HU 1440P 144 Hz Freesync monitor, the GPU is not overclocked and the PSU is a Corsair RM1000i ( quality stuff ), motherboard is MSI GODLIKE X99.
> 
> Halp! : ) Pls.


Sounds like my defective Fury X, sorry to say. It was confirmed the card was the problem, and I had the black squares and issues with a 144 Hz 1440p panel too.

Also, something to consider before blaming the GPU: the cable. AMD recommends a VESA-certified DP 1.2a cable of 2 meters or shorter (which is very short) for that resolution and refresh rate. The panels all seem to come with substandard cabling, so you can try the Accell brand, which I am using at the moment, or perhaps a 2 m or shorter Belkin (a more reliable company) first before RMAing the GPU. I ruled out cabling: the Belkin and Accell both work on my new GPU but did not solve the issues on the old one, so in my case the GPU was defective. But definitely rule out cabling first with a panel like that.


----------



## drm8627

hey guys. just stopping in to ask a quick question:

Do you think Linus's ghetto cooling solution would work for an R9 Nano?




I have a reason for asking; I'm just wondering if that'd work for the R9 Nano.


----------



## baii

Quote:


> Originally Posted by *drm8627*
> 
> hey guys. just stopping in to ask a quick question:
> 
> DO you think Linus's ghetto cooling solution would work for a r9 nano?
> 
> 
> 
> 
> I have a reason, im jw if thatd work for the r9 nano.


If the block is big enough for the core and the HBM, kind of, but you still need to take care of the VRM. I think that mounting method originated somewhere here on OCN.


----------



## drm8627

Quote:


> Originally Posted by *baii*
> 
> If the block is big enough for core and the hbm, kind of, as you still need to take care of the vrm. I think that mounting method was originally somewhere here on ocn.


Too bad Corsair hasn't made an HG10-type bracket for the Nano. That would be perfect.


----------



## baii

Quote:


> Originally Posted by *drm8627*
> 
> too bad corsair hasnt made an hg10 type bracket for the nano. that would be perfect.


Well, for the extra $30 + $50, you can almost buy a Fury X instead.


----------



## drm8627

Quote:


> Originally Posted by *baii*
> 
> Well for 30+50, you can almost by a fury x instead.


Well, in my area the Nano goes for $450 USD.

The only reason I want to watercool it is that I'm putting it in a TU100 case with a 35 W TDP CPU, and I plan on running the Nano at 1440p.

This is all a theoretical PC.

There is enough room in the front of the case to mount an AIO, which could exhaust any excess heat from the low-TDP CPU while still keeping the Nano cool, since there isn't much airflow in the case and the Nano throttles anyway.

Anyway, the Nano is $450 and the Fury X is $650.
I'm just planning on 1440p, locked at 60 Hz, with graphics cranked through the roof through NV Inspector.
The Nano should be overkill for that, let alone a Fury X.
It's a hobby build I'm trying to work some kinks out of.


----------



## akromatic

Quote:


> Originally Posted by *baii*
> 
> If the block is big enough for core and the hbm, kind of, as you still need to take care of the vrm. I think that mounting method was originated somewhere here on ocn.


I highly doubt it; the die and the HBM modules are at different heights. The only AIO with a remote chance of fitting is the stock cooler of the Fury X.

The VRM is not a big deal; it can be cooled by other means, or you can reuse the stock VRM heatsink.


----------



## looncraz

Quote:


> Originally Posted by *akromatic*
> 
> highly doubt it , the die and the HBM modules are at different height. the only AIO with remote chance of fitting is the stock cooler of the fury x.
> 
> vrm is not a big deal, can be cooled by other means or use the stock VRM sink.


I thought they were at nearly the exact same height? Because the Fury [X] coolers have flat bases.


----------



## Wickedtt

Hey all, I just joined the Fury X club the other day. I have a small issue and wondered if anyone has insight: whenever I set my voltage to anything above +36 mV it black-screens, and it sometimes black-screens even at that setting. Any ideas, or should I get an RMA?


----------



## Medusa666

Quote:


> Originally Posted by *Thoth420*
> 
> Sounds like my defective fury x sorry to say. It was confirmed the card was the problem and I had the black squares and issues with 144hz 1440 panel. Also something to consider before the GPU is the cable. A 2meter or shorter(which is very short) DP 1.2a VESA certified cable is what AMD recommends for that reso and refresh rate. The panels all seem to come with substandard cabling so you can try the Accel brand which I am using at the moment or perhaps a 2m Belkin(more reliable company) or shorter cable first before RMAing the GPU. I ruled out cabling as the Belkin and Accel both work on my new GPU but did not solve the issues on the old one so in my case the GPU was defective but def rule out cabling first for a panel like that.


Thanks for the informative and good reply!

I have been thinking about the cable, but I believe it is unlikely it would cause these random effects, especially in only a few selected titles.

However I will try the older drivers before returning the card for RMA service.


----------



## Thoth420

Quote:


> Originally Posted by *Medusa666*
> 
> Thanks for the informative and good reply!
> 
> I have been thinking about the cable, but I believe it is unlikely it would cause these random effects, especially in only a few selected titles.
> 
> However I will try the older drivers before returning the card for RMA service.


No worries, and I hope you figure it out.

I forgot to mention that I have read quite a few reviews of the XG describing a popping sound followed by basically what you reported. Check Newegg and Amazon user reviews for some more insight on that. The XG was the only 144 Hz 1440p panel (aside from the new Eizo) I hadn't tried, mostly because of the negative reviews, which sounded like catastrophic failure.


----------



## Arizonian

So I did a quick SteamVR ready test: 1 Fury = 9.4 score. Fiji is ready for VR.











Spoiler: My SteamVR score!


----------



## Alastair

Quote:


> Originally Posted by *looncraz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *akromatic*
> 
> highly doubt it , the die and the HBM modules are at different height. the only AIO with remote chance of fitting is the stock cooler of the fury x.
> 
> vrm is not a big deal, can be cooled by other means or use the stock VRM sink.
> 
> 
> 
> I thought they were at nearly the exact same height? Because the Fury [X] coolers have flat bases.
Click to expand...

There is a difference in height: the HBM stacks are taller than the core. Maybe the Fury X coolers are flat, but I can confirm the Tri-X cooler that came off my Tri-X Fury took the difference into consideration.


----------



## Alastair

Quote:


> Originally Posted by *drm8627*
> 
> hey guys. just stopping in to ask a quick question:
> 
> DO you think Linus's ghetto cooling solution would work for a r9 nano?
> 
> 
> 
> 
> I have a reason, im jw if thatd work for the r9 nano.


Nope. The HBM is taller than the core.


----------



## Flamingo

Quote:


> Originally Posted by *Arizonian*
> 
> So did a quick SteamVR ready test, 1 Fury = 9.4 Score. Fiji is ready for VR.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: My SteamVR score!


Fury Nano = 8.4

curves are pretty similar










Spoiler: My SteamVR score!


----------



## baii

My Sapphire Fury cooler is flat, IIRC? It doesn't have a step; maybe it is slightly concave?
The Fury waterblocks seem to be flat too, as far as pictures show.


----------



## Kana-Maru

Quote:


> Originally Posted by *Arizonian*
> 
> So did a quick SteamVR ready test, 1 Fury = 9.4 Score. Fiji is ready for VR.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: My SteamVR score!


Quote:


> Originally Posted by *Flamingo*
> 
> Fury Nano = 8.4
> 
> curves are pretty similar
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: My SteamVR score!


I ran two tests: one at 4 GHz with DDR3-1400 and the other at 4.8 GHz with DDR3-1675. I'm on the X58 platform, by the way. I ran my Fury X at *stock settings* as well.

*4 GHz / DDR3-1400* = 9.3


Spoiler: Warning: Spoiler!







*4.8 GHz / DDR3-1675* = 9.6


Spoiler: Warning: Spoiler!







4 GHz is still going strong on my 1st-gen motherboard. You are correct: Fiji is definitely ready for VR.
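Those two runs suggest the test is largely GPU-bound. A quick back-of-the-envelope check of the scaling, using only the scores quoted above:

```python
# SteamVR Performance Test scores reported above (Fury X at stock)
low_clock, low_score = 4.0, 9.3      # 4 GHz / DDR3-1400
high_clock, high_score = 4.8, 9.6    # 4.8 GHz / DDR3-1675

cpu_gain = high_clock / low_clock - 1     # +20% CPU clock
score_gain = high_score / low_score - 1   # ~+3% score

print(f"CPU clock +{cpu_gain:.0%} -> score +{score_gain:.1%}")
```

A 20% CPU clock bump buying only ~3% more score points at the Fury X, not the aging X58 platform, as the limiting factor in this test.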


----------



## HexagonRabbit

I've noticed there is definitely a sweet spot when OC'ing the Nano. I'm using the Radeon software, and so far I like it. I got it somewhat stable at 1025 MHz, but I'm having a hard time getting back there, as I can't seem to stop messing with things that aren't broken. The highest I hit was 1112 MHz, but it crashed pretty quickly. My temps don't go above 33°C.


----------



## looncraz

Quote:


> Originally Posted by *Alastair*
> 
> nope. HBM is taller than core.


The stock cooler has a completely flat base.

http://www.legitreviews.com/amd-radeon-r9-fury-x-complete-teardown_166851

Here, they say the memory and the core are effectively the same height...

If there's a difference, it's positively minuscule.


----------



## hyp36rmax

Anyone have CrossFire Nanos with the standard 1.6-inch spacing (mATX)? Curious about the heat and performance.


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> I did even more testing with the voltage based throttling. It is 100% power based. The bad news is that it is not affected by BIOS power settings. I've also gone out and gotten all my power measurement gear which is why it became so obvious what's happening.
> 
> I don't have a table for the new results since they aren't worth a table.
> Here's what I got on the new round of testing:
> 
> Fury X at 500/500mhz 1.05V 165FPS about 110W power draw
> Fury X at 500/500mhz 1.39V 165FPS about 230W power draw
> 
> Fury X at 1050mhz 1.2V 302FPS about 310W power draw
> Fury X at 1050mhz 1.4V *287FPS* about 420W power draw
> 
> To remove the throttling I've tried taking the 350W Sapphire Fury Tri-x BIOS unlocking it to 4096 cores and using that. That did absolutely nothing to the "micro" throttle(that's what I'm gonna call it since GPU-z doesn't pick it up). However based on the Tri-X BIOS I made my own 511W 511A, 768W 768A, and 65000W 65000A BIOS. None of those did anything either. I tried doing BIOS mods for more voltage but as far as I can tell that didn't work at all.(I'm probably doing it wrong)
> 
> In theory physical power and voltage mods should not be affected because what ever software is screwing with the core clocks will be fed false information as to what the GPU is actually doing and therefore won't kick in. There's a good chance that even the simple physical volt mod on it's own will be enough to trick the power management. There is also a chance that a combination of physical voltmod + power hacked BIOS will work.
> 
> Now I just need to get all my volt modding supplies again(I moved from CZ to UK).


I would be willing to test the power-hacked BIOS in combination with my hard volt mods. How would I go about modifying an Asus Fury X BIOS, or are all the BIOSes cross-compatible?
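As a sanity check on the quoted numbers, the efficiency collapse buildzoid describes is easy to see as FPS per watt (plain Python; the figures are exactly the ones quoted above):

```python
# Fury X results quoted above: (core MHz, core volts, FPS, approx. watts)
runs = [
    (500, 1.05, 165, 110),
    (500, 1.39, 165, 230),
    (1050, 1.20, 302, 310),
    (1050, 1.40, 287, 420),
]

for mhz, volts, fps, watts in runs:
    print(f"{mhz:>4} MHz @ {volts:.2f} V: {fps / watts:.2f} FPS/W")
```

At 1050 MHz, going from 1.2 V to 1.4 V adds roughly 110 W of draw yet *loses* 15 FPS, consistent with a power-based limiter pulling clocks down faster than GPU-Z can sample.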


----------



## buildzoid

The Fury X cards all use the same PCB, so the BIOSes are interchangeable (my PowerColor card is currently running a Sapphire Fury BIOS unlocked to 4096 cores). I'll give you a 500 W, an 800 W and a 65000 W BIOS. Nothing else in them will be changed except the version identification string, so that it reflects which BIOS you're using; instead of the BIOS version being a string of random numbers, it will be something like "Buildzoid.testBIOS.500". Also, if anything explodes, it's not my fault.


----------



## gupsterg

Quote:


> Originally Posted by *buildzoid*
> 
> I tried doing BIOS mods for more voltage but as far as I can tell that didn't work at all.(I'm probably doing it wrong).


Upload the modded ROM and the one you based it on, and I'll do a compare.

I have acquired a Sapphire Fury Tri-X; it should be with me by the weekend (hopefully it will unlock).

I won't be hard volt modding it, but I will have a play with the BIOS.


----------



## Alastair

Quote:


> Originally Posted by *buildzoid*
> 
> The Fury X all use the same PCB so the BIOSs are interchangeable(my Powercolor card is right now running a Sapphire Fury BIOS unlocked to 4096 cores). I'll give you a 500W 800W and a 65000W BIOS. Nothing else in them will be changed except the version idetntification string so that ir reflects which BIOS you're using. So instead of the BIOS version being a string of random numbers it will be something like: "Buildzoid.testBIOS.500" or something. Also if anything explodes it's not my fault.


Have any ideas why the performance drops when the voltage gets high?


----------



## buildzoid

It seems to be some kind of power management problem.


----------



## Wickedtt

Yeah, I have the issue where, whenever I add more than 30 mV, I get black screens; the picture comes back for a second and then it black-screens again. Is it a known issue, or should I just RMA and hope for a better sample?


----------



## Alastair

So we have no solutions to the problem thus far. Do you think this is caused by the overclocking programs, or is it a BIOS issue? Has anyone tried talking to AMD about it?


----------



## looncraz

Quote:


> Originally Posted by *Wickedtt*
> 
> Yeah i have the issue whenever i add more than 30mv i get blackscreens and it comes back for a second and black screens again. is it a known issue or should i just rma and hope for a better sample?


Does this problem *only* occur when adding voltage?

If so, an RMA might be a bit tricky. Almost sounds like a driver reset, which could indicate a software-only issue, or a voltage regulator issue, or a BIOS issue, or...


----------



## Otterfluff

Quote:


> Originally Posted by *buildzoid*
> 
> The Fury X all use the same PCB so the BIOSs are interchangeable(my Powercolor card is right now running a Sapphire Fury BIOS unlocked to 4096 cores). I'll give you a 500W 800W and a 65000W BIOS. Nothing else in them will be changed except the version idetntification string so that ir reflects which BIOS you're using. So instead of the BIOS version being a string of random numbers it will be something like: "Buildzoid.testBIOS.500" or something. Also if anything explodes it's not my fault.


I don't mind blowing up one of my cards; I was prepared to kill at least one when I first did my hard volt mods. If anything, I love tinkering and often buy hardware/tech with the intention of modifying it from day one.

I appreciate you supplying your hacked BIOS.


----------



## Wickedtt

Quote:


> Originally Posted by *looncraz*
> 
> Does this problem *only* occur when adding voltage?
> 
> If so, an RMA might be a bit tricky. Almost sounds like a driver reset, which could indicate a software-only issue, or a voltage regulator issue, or a BIOS issue, or...


From what I've seen so far, I get random shadow and picture distortions even at stock, but adding voltage makes the problem worse; even at stock clocks, adding voltage black-screens instantly. I might return it and get another, hoping I don't have the same problem.


----------



## looncraz

Quote:


> Originally Posted by *Wickedtt*
> 
> from what ive gotten so far i get random shadow and picture distortions from stock but adding volts makes the problem worse even stock clocks add voltage blackscreen instantly i might return it and get another hoping i dont have the same problem.


Interesting - definitely sounds like a card issue (I assume you already did the normal driver uninstall, with DDU, and reinstall).

Out of curiosity, have you tried undervolting? Leave the clocks stock and just move the voltage down. This almost sounds like what happens when you start pushing voltage too hard and things get unstable (yeah, that's a thing). It probably isn't, but it's worth a shot, and it would probably point to a single failed passive on the card's VRM.


----------



## Wickedtt

Quote:


> Originally Posted by *looncraz*
> 
> Interesting - definitely sounds like a card issue (I assume you did the normal driver uninstall - with DDU - and reinstall already).
> 
> Out of curiosity, have you tried undervolting? Leave clocks stock, just move the voltage down. Almost sounds like what happens when you start pushing voltage too hard and things get unstable (yeah, that's a thing). Probably isn't, but worth a shot (and would probably point to a single failed passive on the card's VRM).


Yeah, I reinstalled 16.1.1 and 15.12 to see if there was a difference, and checked both BIOSes. Same issue, but I will try undervolting and see if I still have problems. Either way, it's going back, and I'm hoping for a more cooperative one. This experience had me looking at a 980 Ti for 4K, but then I realized I couldn't do it no matter what.


----------



## gupsterg

@buildzoid

Cheers, got the ROM, will view it ASAP. At present I've started to study @Xtreme Addict's LN2 ROMs done by Asus R&D (Link to ROMs).

If anyone else has time or wants to give input, use AtomDis to create table listings for the ROMs; see the OP of the Hawaii BIOS mod thread. Do compares of like-CU ROMs, i.e. stock vs 3564 VGPU vs 3564 LN2, otherwise you will also see changes related to the CU unlock. The CU unlock is located in what AtomDis's table list labels the *TV1OutputControl* command table.

Currently working on stock FuryStrix *vs* 3564 / VGPU CONTROL *vs* FIXED LN2 3564. My Pastebin has the AtomDis table lists for these ROMs.

I wouldn't advise flashing the Strix ROMs to a ref PCB Fury/X as VRM design differs.


Spoiler: Sapphire Fury Tri-X OC PCB









Spoiler: Asus Fury Strix PCB







I believe the XA ROMs can be made to support the ref-PCB Fury/X VRM by porting the VoltageObjectInfo table from the ref ROMs into the XA ROMs, as this is what "we" have been doing on Hawaii ROMs.
Quote:


> Note that LN2 Fixed bioses have *disabled monitoring of temperature*, you can flash it and try on air and it won't kill you card, but you will be playing "blind".


This is another thing we must be aware of when using those ROMs, whether as-is or modded to support the ref PCB. Hoping to get through enough of the ROMs before my Fury arrives so I can start testing ASAP.

*** edit ***

Started a BIOS mod thread, so I hope you and others will join in there.


----------



## toluun

Hey all, I just got a Fury X last month and have some questions regarding pump noise.

I'm curious if anyone has video of the current pump noise levels after the fix; my card seems to be really loud.

I have two videos documenting the noise my card emits.


----------



## Kana-Maru

Easy: return it ASAP and get a replacement. You have video proof, plus the initial launch Fury Xs had well-known pump noise issues. I have zero noise coming from my Fury X, and I haven't heard the noise yours is emitting before.


----------



## Flamingo

Anyone running the Nano on its stock cooler? I want to see fan curves.

Here is mine; it keeps the GPU under 70°C, but damn, the fan is loud at 3400 RPM lol
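For anyone comparing curves, a fan curve is just a set of (temperature, RPM) points with linear interpolation between them. A minimal sketch; the points below are hypothetical, except that ~3400 RPM near 70°C matches what's reported above:

```python
# Hypothetical piecewise-linear fan curve; the (temp C, RPM) points are
# illustrative, NOT the Nano's actual firmware table.
POINTS = [(40, 1000), (60, 2000), (70, 3400), (80, 4000)]

def fan_rpm(temp_c):
    """Linearly interpolate the target RPM between curve points."""
    if temp_c <= POINTS[0][0]:
        return POINTS[0][1]
    if temp_c >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (t0, r0), (t1, r1) in zip(POINTS, POINTS[1:]):
        if t0 <= temp_c <= t1:
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

print(fan_rpm(70))  # 3400.0 -- matches the RPM reported near 70C above
```

Plotting `fan_rpm` over 30-90°C gives the kind of curve monitoring tools display.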


----------



## hyp36rmax

Quote:


> Originally Posted by *Flamingo*
> 
> Anyone running the Nano on its stock cooler? I want to see fan curves.
> 
> Here is mine, keeps the GPU under 70C, but damn the fan is loud at 3400 RPM lol


I wonder if any aftermarket GPU air coolers would fit it.


----------



## 98uk

Can someone help? I installed the new 16.2 hotfix drivers today and it gave me some notification saying something like "Radeon settings has detected that one or more high DPI monitors may be connected...", I clicked the notification to read more, but it disappeared and instead brought up Radeon Settings.

Does anyone know what it was telling me? I think it said I should enable something.


----------



## looncraz

Quote:


> Originally Posted by *98uk*
> 
> Can someone help? I installed the new 16.2 hotfix drivers today and it gave me some notification saying something like "Radeon settings has detected that one or more high DPI monitors may be connected...", I clicked the notification to read more, but it disappeared and instead brought up Radeon Settings.
> 
> Do you know what it was telling me, I think it said I should enable something?


It wanted you to enable Virtual Super Resolution.

Radeon Settings -> Display -> Virtual Super Resolution


----------



## 98uk

Quote:


> Originally Posted by *looncraz*
> 
> It wanted you to enable Virtual Super Resolution.
> 
> Radeon Settings -> Display -> Virtual Super Resolution


Ahhh... I did suspect that. It brought up some settings I had no idea about - things like Virtual Super Resolution, GPU scaling, etc.

I read up on VSR, but I suspect it'll be of no help to me, as I struggle to play my games at 1440p, let alone any higher.

Perhaps Civ5 will benefit.


----------



## Flamingo

Quote:


> Originally Posted by *hyp36rmax*
> 
> I wonder if there are any aftermarket gpu air coolers that will fit it.


None that I could find.

Also, speaking of VSR, I tried the following games:

Saints Row IV at 3840x2160: 100% GPU usage and around 45 FPS
Tomb Raider: Underworld: 60% GPU usage, but with ENB it goes to 100% and around 55 FPS

I couldn't get ROTR to run at 1440p or "4K" with VSR enabled.

Also, do ReShade patches favour any specific vendor? Like, is AMD hit harder than Nvidia, or vice versa?


----------



## gupsterg

@buildzoid

See this info The Stilt posted :-
Quote:


> Originally Posted by *The Stilt*
> 
> If the temperature is below the throttling temperature (75°C) and the power draw stays below the TDP (270W, PowerTune / Power Control @ 0%), it is definitely a driver issue and not directly related to PowerTune or PowerPlay. Disabling PowerPlay is definitely not the correct approach to solve it, since it will mess up the power management completely.


I have found the 3 values I believe we need to edit for the PowerLimit; see the appropriate heading in the Fury BIOS mod thread. My Fury should be in my hands today, and I plan to start testing to see what effect the edits have. I'm aware your edits have not had an effect, but the 3rd value was not being modded; on Hawaii I found that if the 3rd value wasn't modded I'd get downclocking even if the other 2 had been upped.
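For anyone following along, the mechanics of such an edit are simple once the offsets are known. A minimal sketch in Python: the offsets below are PLACEHOLDERS (the real three PowerLimit locations must be found from a table dump first), and it assumes the usual AMD VBIOS convention that the byte at 0x21 makes the image sum to zero mod 256. Flashing a badly patched ROM can brick a card; this is for study only.

```python
# Sketch of patching three power-limit dwords in a VBIOS image and fixing
# the checksum. Offsets are hypothetical; a dummy in-memory "ROM" is used
# for demonstration instead of a real dump.
import struct

LIMIT_OFFSETS = [0x40, 0x44, 0x48]   # hypothetical positions of the 3 limits
NEW_LIMIT_W = 400                    # desired power limit, watts

# Build a dummy 512-byte image with stock 270 W limits.
rom = bytearray(512)
rom[0x02] = 1                        # image size in 512-byte blocks
for off in LIMIT_OFFSETS:
    struct.pack_into("<I", rom, off, 270)

def fix_checksum(img: bytearray) -> None:
    """Assumed AMD VBIOS convention: byte 0x21 makes the image sum to 0 mod 256."""
    size = img[0x02] * 512
    img[0x21] = 0
    img[0x21] = (-sum(img[:size])) & 0xFF

for off in LIMIT_OFFSETS:
    struct.pack_into("<I", rom, off, NEW_LIMIT_W)
fix_checksum(rom)

print([struct.unpack_from("<I", rom, o)[0] for o in LIMIT_OFFSETS])  # [400, 400, 400]
```

On a real dump you would read the file into the bytearray, patch all three values (not just two, per the downclocking observation above), fix the checksum, and only then flash.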


----------



## HexagonRabbit

So far, I love my nano.


----------



## Gomi

Hi guys.

I am about to pull the trigger on a Nano, which I will be watercooling. I am, however, slightly worried about the initial coil whine problems; can anyone comment on how the coil whine is?

Unfortunately I am super sensitive to coil whine, watercooling does not exactly help (with no fans to drown out the whine), and the card will be mounted in an open environment (not inside a case).


----------



## xTesla1856

From Titan X SLI to 390X Crossfire. Glad to finally try this AMD thing out


----------



## looncraz

Quote:


> Originally Posted by *98uk*
> 
> Ahhh... I did suspect that. It brought up some settings that I had no idea about. Things like virtual super resolution, gpu scaling etc...
> 
> I read up on VSR, but I suspect it'll be of no help to me as I struggle to play my games at 1440p let alone any higher
> 
> 
> 
> 
> 
> 
> 
> Perhaps Civ5 will benefit.


I prefer VSR to AA. Try using VSR without any AA and you may find you get the same image quality with higher frame rates - that's what works best for me in quite a few games (such as Hitman: Absolution).

I get better framerates in BF4 that way, but I have a 144 Hz monitor, and VSR's 60 Hz limit is a restriction I don't like for FPS games.


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> From Titan X SLI to 390X Crossfire. Glad to finally try this AMD thing out


Welcome to the club! Glad to have you here, sir.

It's a fun thread. Good group of guys. Lots of possible mods, BIOS tweaks, etc. Enjoy yourself.


----------



## HexagonRabbit

Quote:


> Originally Posted by *Gomi*
> 
> Hi guys.
> 
> I am about to pull the trigger on a Nano which I will be watercooling. I am however slightly worried about the initial coilwhine problems, can anyone comment on how the coilwhine is?
> 
> Unfortunately I am super sensitive to coilwhine, watercooling does not exactly help (with no fans to drown the whine) and the card will be mounted in an open enviroment (not inside a case).


It goes away... at least it did for me. I can't hear it over my rad fans, if it's even still there.


----------



## xTesla1856

Quote:


> Originally Posted by *battleaxe*
> 
> Welcome to the club! Glad to have you here sir.
> 
> It's a fun thread. Good group of guys. Lots of possible mods, BIOS, etc... enjoy yourself.


Thanks for the warm welcome.

One question I do have though: How big is the performance delta between the 390X in CF and the Fury (non X) in CF?


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> Thanks for the warm welcome.
> 
> One question I do have though: How big is the performance delta between the 390X in CF and the Fury (non X) in CF?


As a guess, and this is strictly a guess... maybe 35%. But the 390X overclocks better as a whole, so some of that can be made up. The Fury X is probably closer to 40-45%; my understanding is that two Fury Xs are roughly equal in power to three 390Xs, if that makes sense. Forgive me if my math is off as I didn't use a calculator or anything, just kind of throwing a stab at it.
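A quick sanity check on that rule of thumb, treating the "two Fury Xs roughly equal three 390Xs" claim as given. The numbers are illustrative arithmetic only, not benchmark data:

```python
# Rough sanity check on the rule of thumb quoted above
# ("two Fury Xs ~= three 390Xs"). Illustrative only, not benchmarks.

def percent_delta(faster: float, slower: float) -> float:
    """Performance delta of `faster` over `slower`, in percent."""
    return (faster / slower - 1.0) * 100.0

# If 2x Fury X == 3x 390X, then one Fury X == 1.5x one 390X:
fury_x_vs_390x = percent_delta(3.0, 2.0)
print(fury_x_vs_390x)  # 50.0
```

By that yardstick a single Fury X would be +50% over a 390X, a touch above the 40-45% guess, which is roughly where the "my math is off" caveat lands.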


----------



## keikei

Quote:


> Originally Posted by *xTesla1856*
> 
> Thanks for the warm welcome.
> 
> One question I do have though: How big is the performance delta between the 390X in CF and the Fury (non X) in CF?


These were the early benchmarks: http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/14


----------



## SuperZan

Quote:


> Originally Posted by *xTesla1856*
> 
> Thanks for the warm welcome.
> 
> One question I do have though: How big is the performance delta between the 390X in CF and the Fury (non X) in CF?


As said above, it's quite close after tweaks. My Fury X / Fury Xfire in Firestrike: http://www.3dmark.com/fs/7497493 and realmacciv's 390s: http://www.3dmark.com/3dm/9672931. He gets everything out of those 390s and I'm decent at tweaking my cards.

Apologies for terrible formatting, hashing this post out on the mobile.


----------



## xTesla1856

Thanks for the answers guys; upon further reflection I decided to change my order to two Sapphire R9 Fury Tri-X cards. Money isn't an issue and I want the most power I can get without water cooling. The Sapphire card seems really cool and quiet (which can't be said for Titans). I will post pictures once my system is fully upgraded. I also have a 5820K and an ASUS RVE on the way.


----------



## gupsterg

Wow, a little disappointed with Fury vs my Hawaii card (note the test is 1080p, and Hawaii with MAX bios mods vs an unmodded Fury).

Link:- http://www.3dmark.com/compare/fs/7689856/fs/7447383

CPU is @ 4.9GHz / 4.4GHz cache for both tests. Don't know why the physics score took such a hit with Fury; never seen that kind of variation with 4x differing Hawaii cards.


----------



## p4inkill3r

Quote:


> Originally Posted by *gupsterg*
> 
> Wow, little disappointed with Fury vs my Hawaii card (note test is 1080P and Hawaii MAX bios mods vs no mod Fury).
> 
> Link:- http://www.3dmark.com/compare/fs/7689856/fs/7447383
> 
> CPU is @ 4.9GHz / 4.4GHz cache for both tests. Don't know why the physics score took such a hit with Fury; never seen that kind of variation with 4x differing Hawaii cards.


I think you can get a bit more out of your Fury tbh: http://www.3dmark.com/3dm/10951388?


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> Wow, little disappointed with Fury vs my Hawaii card (note test is 1080P and Hawaii MAX bios mods vs no mod Fury).
> 
> Link:- http://www.3dmark.com/compare/fs/7689856/fs/7447383
> 
> CPU is @ 4.9GHz / 4.4GHz cache for both tests. Don't know why the physics score took such a hit with Fury; never seen that kind of variation with 4x differing Hawaii cards.


You're not alone; I'm noticing varying physics scores as well. I thought it was because of Windows 10, now I'm not so sure... I think I'm having some major driver malfunctions with Fury and Crimson. Gonna roll back to 15.10 and see what happens.


----------



## Arizonian

Quote:


> Originally Posted by *gupsterg*
> 
> Wow, little disappointed with Fury vs my Hawaii card (note test is 1080P and Hawaii MAX bios mods vs no mod Fury).
> 
> Link:- http://www.3dmark.com/compare/fs/7689856/fs/7447383
> 
> CPU is @ 4.9GHz / 4.4GHz cache for both tests. Don't know why the physics score took such a hit with Fury; never seen that kind of variation with 4x differing Hawaii cards.


Quote:


> Originally Posted by *p4inkill3r*
> 
> I think you can get a bit more out of your Fury tbh: http://www.3dmark.com/3dm/10951388?


Yeah, it can do better. My Fury on air:

http://www.3dmark.com/3dm/10006411?


----------



## Kana-Maru

Quote:


> Originally Posted by *dagget3450*
> 
> You're not alone; I'm noticing varying physics scores as well. I thought it was because of Windows 10, now I'm not so sure... I think I'm having some major driver malfunctions with Fury and Crimson. Gonna roll back to 15.10 and see what happens.


I had some issues as well. Make sure you run DDU and AMD's driver removal tool, then reinstall the Crimson 16.2 drivers.

1) DDU = http://www.guru3d.com/files-details/display-driver-uninstaller-download.html
2) AMD Driver Removal Tool = http://support.amd.com/en-us/kb-articles/Pages/AMD-Clean-Uninstall-Utility.aspx
3) Install and run benchmarks.

I was having issues with my Fury X; I previously had an Nvidia GTX card installed prior to upgrading. AMD has been pushing out drivers rapidly lately. I finally decided to run DDU and the AMD tool, and it fixed all of the issues I was having; my scores greatly increased in FireStrike and other gaming benchmarks. Hopefully that helps.


----------



## Arizonian

Don't forget Bradley's removal guide for leftover stuff in the registry. A worthwhile pain to go through.

http://www.overclock.net/t/988215/how-to-remove-your-amd-gpu-drivers-new-2016


----------



## gupsterg

@p4inkill3r

Cheers for the score link; OK, starting to get a bit more "get up and go".

Fury out of box *Vs* Unlock to 3776 *Vs* [email protected] *Vs* [email protected] *Vs* Hawaii 290X

I'm up to about page 80 of this thread and had some questions:-

q1) I'm getting voltage control in MSI AB; I thought there was none? (no bios mods yet other than the unlock to 3776)

q2) I was reading several of the TPU reviews on Fury and noting the voltage readings; what do you guys get as VID per DPM? And, say, i2cdumps?



In an i2cdump the IR3567B is responding on the same bus/address as on Hawaii cards (a lot is looking familiar, strangely).

Even though GPU-Z is not reading an ASIC quality (aka LeakageID), I'm reckoning some profiling is going on to set DPM voltages, so I need info from you guys on what your VIDs are.

@dagget3450

I'm on Win 7 x64, Crimson 16.2 Hotfix. Physics seems better/consistent now after the unlock to 3776.

@Kana-Maru

I know your post wasn't directed at me, but just thought I'd state I did do the DDU, etc. process prior to the change from Hawaii to Fury.

@Arizonian

I'm on air as well, Sapphire Fury Tri-X (STD).

I thought only the OC edition came with increased PL ROMs, but the STD does as well (towards the IO file there is an increased PL on my card from the factory).

Fury_Tri-X_STD.zip 207k .zip file


Cheers for score link, are you adding voltage?


----------



## p4inkill3r

Extend official overclocking limits in AB, give it some voltage, raise the power limit, take the core to 1100 and memory to 545, see what you get.


----------



## Wickedtt

Quote:


> Originally Posted by *p4inkill3r*
> 
> I think you can get a bit more out of your Fury tbh: http://www.3dmark.com/3dm/10951388?


Pain, my Fury X at 1160/560 only gets a 17.6k graphics score, how in the hell, haha. Just wondering if you've got some tricks or if my old X58 mobo is holding me back: X5660 @ 4.7, 1666MHz RAM. Also, here is my welcome to the club: http://www.techpowerup.com/gpuz/details.php?id=mpzkc


----------



## p4inkill3r

Quote:


> Originally Posted by *Wickedtt*
> 
> Pain my furyX at 1160/560 only get 17.6k graphics score how in the hell haha just wondering if you got some tricks or my old X58 mobo holding me back. X5660 @ 4.7 1666mhz ram. also here is my welcome to the club. http://www.techpowerup.com/gpuz/details.php?id=mpzkc


Here are a couple of scores with an 8320 and a g3258 from a while back:
http://www.3dmark.com/fs/5551553
http://www.3dmark.com/fs/5587570

Both were at stock settings on the Fury, but they both came in around 16k. Driver maturation and a healthy overclock could be all the difference, I think.


----------



## dagget3450

for funsies...

http://www.3dmark.com/3dm/10953772?



16.2 Crimson, only 4GHz on the CPU; Furys are at stock clocks with tess off.

One of these days we'll have a half-decent WHQL driver so we can post actual scores online.


----------



## Wickedtt

Hmm, mine stock with the 16.2 drivers, fresh install: http://www.3dmark.com/3dm/10954609 and at 1180/550: http://www.3dmark.com/3dm/10954709. I wish I could get the massive gains you're getting; might be chipset/PCIe 3.0, not sure.


----------



## JunkaDK

Quote:


> Originally Posted by *Wickedtt*
> 
> Hmm mine stock with 16.2 drivers fresh install. http://www.3dmark.com/3dm/10954609 at 1180/550 http://www.3dmark.com/3dm/10954709 so i wish i could get the massive gains your getting, might be chipset/pcie 3.0 not sure.


Did you turn tessellation off like he did?


----------



## gupsterg

Quote:


> Originally Posted by *p4inkill3r*
> 
> Extend official overclocking limits in AB, give it some voltage, raise the power limit, take the core to 1100 and memory to 545, see what you get.


Cheers, pretty much like Hawaii then. What's the norm for the GPU core voltage offset members are adding?


----------



## Flamingo

Quote:


> Originally Posted by *p4inkill3r*
> 
> I think you can get a bit more out of your Fury tbh: http://www.3dmark.com/3dm/10951388?


Quote:


> Originally Posted by *Arizonian*
> 
> Yeah it can do better, my fury on air
> 
> http://www.3dmark.com/3dm/10006411?


Fury Nano = http://www.3dmark.com/fs/7693824


----------



## Semel

1180Mhz...1175Mhz.... I envy you guys lol

I need a whopping +72mV (unofficial OC mode!) to make it work stable at 1120 (all games) or 1140MHz max (all except for Witcher 3 and Rise of the Tomb Raider). When I was experimenting with unlocked voltage via TriXX I could push it to 1150 (+100mV), but after that (1160-1180MHz), regardless of the voltage used, the card performed as if it wasn't OCed at all.

And I couldn't even fully unlock my Fury... (3840 stream processors)

I read how someone in this thread flashed the Tri-X OC edition bios and it helped for some reason to get a higher OC, but it didn't work out in my case.


----------



## p4inkill3r

Quote:


> Originally Posted by *JunkaDK*
> 
> Did you turn tessellation off like he did?


That should be the first thing anyone does when using an AMD card in 3DMark.


----------



## JunkaDK

Quote:


> Originally Posted by *Semel*
> 
> 1180Mhz...1175Mhz.... I envy you guys lol
> 
> I need a whopping +72 mV (unofficial OCing mode!) to make it work stable at 1120 (all games) or 1140Mhz max (all except for witcher 3 , rise of the tomb raider). WHen I was experimenting with unlocked voltage via trixx I could push it to 1150(+100+mV) but after that (1160-1180Mhz) regardless of voltage used the card performed as if it wasn't OCed at all
> 
> And I couldn't even fully unlock my fury.. (3840 stream processors)
> 
> I read how someone in this thread flashed trixx OC edition bios and it helped for some reason to get higher OC but it didn't work out in my case,


Well... tbh I don't think a lot are running 1175 stable in gaming. I can do 3DMark at 1180MHz / 545MHz mem, but my max stable gaming clock atm is also around 1120MHz at 545MHz RAM. This is at +12mV / 30% extra power.


----------



## JunkaDK

Quote:


> Originally Posted by *p4inkill3r*
> 
> That should be the first thing anyone does when using an AMD card in 3DMark.


Why is that, if you care to elaborate? I mean the score is a lot better, but the result is also invalid.


----------



## dagget3450

Quote:


> Originally Posted by *Wickedtt*
> 
> Hmm mine stock with 16.2 drivers fresh install. http://www.3dmark.com/3dm/10954609 at 1180/550 http://www.3dmark.com/3dm/10954709 so i wish i could get the massive gains your getting, might be chipset/pcie 3.0 not sure.


I don't think you're replying to my posted score, but if you were I can see the easy confusion. I forgot to say mine was 4x GPU and FSU, not FS. The scores are so close too, lol. I never realized that.


----------



## Flamingo

Eh, noob question, but what do target fan speed and target GPU temperature achieve in the Overdrive tab?


----------



## p4inkill3r

Quote:


> Originally Posted by *JunkaDK*
> 
> Why is that, if you care to elaborate? I mean the score is a lot better, but the result is also invalid.


Going back to Radeon 5000 series, tessellation has been a way for software to choke AMD's cards.
When we first got the option to modify it in Catalyst 11.1, everyone called it a driver hack, a cheat, etc.
Fast forward to 2016 and there are many examples of tessellation for tessellation's sake being used (Gameworks, et al) and the ability to turn it down/off is an intrinsic part of AMD's driver suite, for very good reason.


----------



## JunkaDK

Quote:


> Originally Posted by *p4inkill3r*
> 
> Going back to Radeon 5000 series, tessellation has been a way for software to choke AMD's cards.
> When we first got the option to modify it in Catalyst 11.1, everyone called it a driver hack, a cheat, etc.
> Fast forward to 2016 and there are many examples of tessellation for tessellation's sake being used (Gameworks, et al) and the ability to turn it down/off is an intrinsic part of AMD's driver suite, for very good reason.


Thanks a lot for clarifying that. But don't you miss out on some graphic details, or is that not noticeable?


----------



## p4inkill3r

Quote:


> Originally Posted by *JunkaDK*
> 
> Thanks a lot for clarifying that. But don't you miss out on some graphic details, or is that not noticeable?


I suggest you run Firestrike with and without tessellation enabled and find out for yourself.

This isn't a new phenomenon by any means.


----------



## Semel

Quote:


> Originally Posted by *JunkaDK*
> 
> . This is at +12mV / 30% extra power.


Just +12mV (vs my +72mV)? Damn... I wish my card was like yours, lol. I bet you could OC it considerably more if you added voltage.


----------



## HexagonRabbit

What are the opinions of you guys on AMD's overclocking software included in crimson?


----------



## Kana-Maru

I don't use AMD's overclocking utility, but you can. There's plenty of OC software to choose from.


----------



## fjordiales

Quote:


> Originally Posted by *JunkaDK*
> 
> Well... tbh I don't think a lot are running 1175 stable in gaming. I can do 3DMark at 1180MHz / 545MHz mem, but my max stable gaming clock atm is also around 1120MHz at 545MHz RAM. This is at +12mV / 30% extra power.


With these OC settings, what voltage are you getting while gaming/firestrike?


----------



## flopper

Quote:


> Originally Posted by *JunkaDK*
> 
> Why is that, if you care to elaborate? I mean the score is a lot better, but the result is also invalid.


Tessellation won't add any more image quality past a certain point; Nvidia had some developers add tessellation en masse to choke AMD cards and make them seem worse.
I find 8x tessellation seems to be a good fit when gaming.


----------



## JunkaDK

Quote:


> Originally Posted by *fjordiales*
> 
> With these OC settings, what voltage are you getting while gaming/firestrike?


At 1120/545 I'm running +12mV / 30%, and when benchmarking at 1185, which is the max it has completed, I'm running +96mV / 50%... if that's what you wanted to know?


----------



## Jesse36m3

Is anybody running a single Nano on an ultrawide?

What kind of framerates are you getting?

Is it worth the extra $100 to get a Fury X if I'm going to strip the cooler off it anyway? (Rig is already watercooled, so I'd be adding in new blocks).


----------



## Radox-0

Quote:


> Originally Posted by *Jesse36m3*
> 
> Is anybody running a single Nano on an ultrawide?
> 
> What kind of framerates are you getting?
> 
> Is it worth the extra $100 to get a Fury X if I'm going to strip the cooler off it anyway? (Rig is already watercooled, so I'd be adding in new blocks).


Well, my Nano is in my HTPC, but I do have a 3440 x 1440 ultrawide (ASUS ROG PG348Q), so if you've got a few games in mind I can hook it up and run off some benchmarks. It's also under water, so it runs at 1125 / 545 MHz for gaming normally.

I will say, from when I was using it for testing, the Nano did perform well; at typically high (rather than ultra / maxed out) settings it maintained 60 fps in most cases.


----------



## Jesse36m3

Quote:


> Originally Posted by *Radox-0*
> 
> Well my Nano is in my HTPC but do have a 3440 x 1440 ultrawide (ASUS ROG PG348Q) so if you got a few games in mind, can hook it up and run off some benchmarks. Its also under water so runs at 1125 / 545 mhz for gaming normally.
> 
> I will say from when I was using it when testing, the nano did perform well, typically high (rather then ultra / maxed out settings) maintained 60 fps in most cases.


Cool, thanks for the info. GTA V and Assetto Corsa / Project Cars are my go-to games for killing time.

Also really enjoy FPS games like Bioshock, Metro, BF, Crysis, etc.

How much of an overclock is that? Stock clocks are 1Ghz / 500Mhz right?


----------



## Noirgheos

Does anyone notice that forcing tess to 16x for some games causes more stutters? Average FPS is slightly higher, but there are more stutters. Put it back to "use application settings" and it's all smooth again, but with a slightly lower average. Witcher 3 and Fallout 4, for example. Should I set them to AMD optimized instead? Contrary to popular opinion, I do notice a minor difference between 8x and 16x tessellation...


----------



## Radox-0

Quote:


> Originally Posted by *Jesse36m3*
> 
> Cool, thanks for the info. GTAV and Assetto Corsa / Project cars is my go to game for killing time.
> 
> Also really enjoy FPS games like Bioshock, Metro, BF, Crysis, etc.
> 
> How much of an overclock is that? Stock clocks are 1Ghz / 500Mhz right?


Cool, got some of them, so will get some of the benchmarks done. Yeah, stock is 1000 / 500; I normally sit at 1125 / 545. I will notch it down to 1100 / 545 and see what I get.


----------



## fjordiales

Quote:


> Originally Posted by *JunkaDK*
> 
> At 1120/545 im running 12mV/30% and benchmarking at 1185 which is the max is has completede im running 96mV/50%.. if thats what you wanted to know?


My bad, lemme change the wording: does your Fury Strix go past 1.2V with +12mV?

Most Fury Strix are 1.69v stock.


----------



## xTesla1856

My Nitro R9 Furies are arriving tomorrow; can't wait to finally get them in my system. Gonna give the Titans one last beating before sending them off to their new home tomorrow.


----------



## Radox-0

Quote:


> Originally Posted by *Jesse36m3*
> 
> Cool, thanks for the info. GTAV and Assetto Corsa / Project cars is my go to game for killing time.
> 
> Also really enjoy FPS games like Bioshock, Metro, BF, Crysis, etc.
> 
> How much of an overclock is that? Stock clocks are 1Ghz / 500Mhz right?


Not got all the games, but here are a few of the ones I do. For reference, it's paired with a 4690K @ 4.0GHz and the Nano was at 1100MHz / 545MHz.

Tomb Raider - Ultra Preset
Avg - 108
Max - 126
Min - 90

Metro Last Light Redux - Quality: High, SSAO: on, 16x AF, Tessellation high
Avg - 45
Max - 134
Min - 17 (though that's due to a random frame in the sequence)

Arkham Knight - All maxed but no Nvidia Gameworks features on
Avg - 65
Max - 108
Min - 81

Shadow Of mordor - Ultra Preset
Avg - 74
Max - 137
Min - 36

Bioshock Infinite
Highest settings - never under 60 fps

GTA 5
Averaged these results in the 4 tests:
Test 1: 82
Test 2: 91
Test 3: 104
Test 4: 90

Settings: TXAA, no MSAA, no water MSAA, max vegetation, max population, max line of sight, long shadows maxed. So basically decent settings; you can notch up the settings as there is some fps to play with.
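For what it's worth, a single overall GTA 5 figure from those four runs is just their mean. A throwaway sketch using the numbers quoted above:

```python
# Each GTA 5 test number above is an average FPS for that run; a single
# overall figure is the mean across the four tests. Numbers from the post.

tests = {"Test 1": 82, "Test 2": 91, "Test 3": 104, "Test 4": 90}

overall = sum(tests.values()) / len(tests)
print(round(overall, 2))  # 91.75
```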


----------



## Jesse36m3

Quote:


> Originally Posted by *Radox-0*
> 
> Not got all the games, but here is a few of them I do. For refrence its paired with 4690k @ 4.0 Ghz and the nano was at 1100 Mhz / 545 Mhz
> 
> Tomb Raider - Ultra Preset
> Avg - 108
> Max - 126
> Min - 90
> 
> Metro Last Light Redux - Quality: High, SSAO: on, 16x AF, Tessellation high
> Avg - 45
> Max - 134
> Min - 17 ( though that due to a random frame in the sequence)
> 
> Arkham Knight - All maxed but no Nvida gamework features on
> Avg - 65
> Max - 108
> Min - 81
> 
> Shadow Of mordor - Ultra Preset
> Avg - 74
> Max - 137
> Min - 36
> 
> Bioshock Infinate
> Highest settings - never under 60 fps
> 
> GTA 5
> Averaged these results in the 4 tests:
> Test 1: 82
> Test 2: 91
> Test 3: 104
> Test 4: 90
> 
> Settings: TXAA, no MSAA, No water MSAA, Max vegetation, max population, max line of sight, long shadows maxed. So basically decent settings can notch up the settings as there is some fps to play with.


Thanks, I appreciate you taking the time to gather these results.


----------



## Flamingo

I was getting only 37fps on average at 1080p in the starting scene of ROTR and was kinda disappointed, since I figured 4K would be unplayable at that rate. And that was at 8x tess.

Turns out I had SSAA 4X on the whole time, which meant the game was basically rendering at 4K while at 1080p, and at 4 times the resolution at 4K.

Any other features I should be wary of besides tessellation and HBAO+?
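The pixel arithmetic behind that: 4x SSAA shades four samples per output pixel, so 1080p with SSAA 4X is the same shading load as native 4K. A minimal sketch; `rendered_pixels` is just an illustrative helper:

```python
# 4x SSAA shades four samples per output pixel, so 1080p with SSAA 4X
# pushes the same pixel count as native 4K. Illustrative sketch only.

def rendered_pixels(width: int, height: int, ssaa_factor: int = 1) -> int:
    """Total samples shaded per frame with ordered-grid SSAA."""
    return width * height * ssaa_factor

p_1080_ssaa4 = rendered_pixels(1920, 1080, 4)   # 8,294,400
p_4k_native  = rendered_pixels(3840, 2160)      # 8,294,400

print(p_1080_ssaa4 == p_4k_native)  # True
```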


----------



## keikei

Quote:


> Originally Posted by *Flamingo*
> 
> I was getting only 37fps on average at 1080p on the starting scene of ROTR and was kinda disappointed since I thought 4K would be unplayable at this rate. And it was at 8tps
> 
> Turns out I had SSAA 4X on all the time which meant the game was basically rendering at 4K at 1080p and 4 times the resolution at 4K
> 
> Any other features I should be vary off besides tessellation and hbao+


It depends on your taste; some impact performance more heavily than others. http://www.geforce.com/whats-new/guides/rise-of-the-tomb-raider-graphics-and-performance-guide


----------



## gupsterg

Well, got the 3840 CU unlock on the card (tested a 1hr 45min loop of 3DM FS, all tests; gonna leave the card [email protected] overnight to see if I get an error).

Some 3DM FS scaling results, Link:- Out of box (3584CU) *vs* 3776CU unlock *vs* 3840CU unlock

I think the lower physics score for the 3584CU result is an anomaly, as I have not been able to reproduce the same low result again.

One thing that is amazing me is how inaudible the card is! The only thing bugging me is that when the fans spin up from the off position there is a distinct, faint little click.

My whole system is on air: I have 2x 140mm TY-143 as intakes, 2x 140mm TY-143 on an Archon SB-E X2, 1x 120mm exhaust, 2x 92mm exhausts and 1x 135mm in the CM V850 as an exhaust (all of them spin lazily).

Just started marking up the "PowerPlay" section of the ROM and hoping to do some bios fan profile mods tomorrow.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Well, got the 3840 CU unlock on the card (tested a 1hr 45min loop of 3DM FS, all tests; gonna leave the card [email protected] overnight to see if I get an error).
> 
> Some 3DM FS scaling results, Link:- Out of box (3584CU) *vs* 3776CU unlock *vs* 3840CU unlock
> 
> I think the lower physics score for the 3584CU result is an anomaly, as I have not been able to reproduce the same low result again.
> 
> One thing that is amazing me is how inaudible the card is! The only thing bugging me is that when the fans spin up from the off position there is a distinct, faint little click.
> 
> My whole system is on air: I have 2x 140mm TY-143 as intakes, 2x 140mm TY-143 on an Archon SB-E X2, 1x 120mm exhaust, 2x 92mm exhausts and 1x 135mm in the CM V850 as an exhaust (all of them spin lazily).
> 
> Just started marking up the "PowerPlay" section of the ROM and hoping to do some bios fan profile mods tomorrow.


OOOOoooo.ooo I see you moved forward, and now u have a fully unlocked, custom PCB Fury X. Nice! (or is it not a custom PCB?)

Got bored with the 290X?


----------



## gupsterg

Hi Fat4l,

No, not a custom PCB; it's the ref PCB Sapphire Fury Tri-X STD edition. The only custom one on the market is the Asus Strix Fury, AFAIK.

Fury X is 4096 SP; mine, sadly, artifacts badly at stock clocks in 3D loads if unlocked to all CUs. TBH not bad getting 3840SP FOC; you could say it's between the 2, as exactly 4 out of the 8 CU difference unlocked.
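The CU arithmetic behind those SP counts, assuming GCN's 64 stream processors per compute unit (which holds for Fiji):

```python
# Fiji (GCN) packs 64 stream processors per compute unit, so the SP
# counts in the post map directly onto CU counts.

SP_PER_CU = 64

fury_sp, fury_x_sp, unlocked_sp = 3584, 4096, 3840

full_gap_cus = (fury_x_sp - fury_sp) // SP_PER_CU    # CUs disabled on Fury
unlocked_cus = (unlocked_sp - fury_sp) // SP_PER_CU  # CUs re-enabled here

print(full_gap_cus, unlocked_cus)  # 8 4
```

So exactly half of the 8 disabled CUs came back, which is the "4 out of the 8 CU difference" in the post.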

Did I need Fury? Hell no... why did I get it? ... to do the bios mods....

When I saw a deal on one I thought get it and you can first hand test mods, instead of advising others ....

Today I modded the fan profile in Advanced mode (aka Fuzzy Logic, the stock method for Fury); even though temps are now under ~65C (was about ~75C), it's still way quieter than the Vapor-X 290X or Tri-X 290. I keep using a torch to check through the mesh side panel whether the fans are on or not! LOL

Dunno if I'm gonna keep it; see how it goes really...


----------



## Radox-0

Quote:


> Originally Posted by *gupsterg*
> 
> Hi Fat4l,
> 
> No, not a custom PCB; it's the ref PCB Sapphire Fury Tri-X STD edition. The only custom one on the market is the Asus Strix Fury, AFAIK.
> 
> Fury X is 4096 SP; mine, sadly, artifacts badly at stock clocks in 3D loads if unlocked to all CUs. TBH not bad getting 3840SP FOC; you could say it's between the 2, as exactly 4 out of the 8 CU difference unlocked.
> 
> Did I need Fury? Hell no... why did I get it? ... to do the bios mods....
> 
> When I saw a deal on one I thought get it and you can first hand test mods, instead of advising others ....
> 
> Today I modded the fan profile in Advanced mode (aka Fuzzy Logic, the stock method for Fury); even though temps are now under ~65C (was about ~75C), it's still way quieter than the Vapor-X 290X or Tri-X 290. I keep using a torch to check through the mesh side panel whether the fans are on or not! LOL
> 
> Dunno if I'm gonna keep it; see how it goes really...


Sweet, loved that card for the short time I had it. Looks like your reason is epic also: BIOS modding.

FYI, the Nitro variant of the Sapphire card actually uses a custom PCB which differs from the Sapphire Tri-X model, which as you rightly pointed out is a ref PCB.


----------



## nadja92

So today I have become an owner.

Now to wait for the water block to come...


----------



## Noirgheos

Quote:


> Originally Posted by *gupsterg*
> 
> Hi Fat4l,
> 
> No, not a custom PCB; it's the ref PCB Sapphire Fury Tri-X STD edition. The only custom one on the market is the Asus Strix Fury, AFAIK.
> 
> Fury X is 4096 SP; mine, sadly, artifacts badly at stock clocks in 3D loads if unlocked to all CUs. TBH not bad getting 3840SP FOC; you could say it's between the 2, as exactly 4 out of the 8 CU difference unlocked.
> 
> Did I need Fury? Hell no... why did I get it? ... to do the bios mods....
> 
> When I saw a deal on one I thought get it and you can first hand test mods, instead of advising others ....
> 
> Today I modded the fan profile in Advanced mode (aka Fuzzy Logic, the stock method for Fury); even though temps are now under ~65C (was about ~75C), it's still way quieter than the Vapor-X 290X or Tri-X 290. I keep using a torch to check through the mesh side panel whether the fans are on or not! LOL
> 
> Dunno if I'm gonna keep it; see how it goes really...


I wouldn't mind your Fury to CF with mine.... lol


----------



## gupsterg

Quote:


> Originally Posted by *Radox-0*
> 
> Looks like your reason is epic also: BIOS modding

....
Quote:


> Originally Posted by *Radox-0*
> 
> FYI, the Nitro variant of the sapphire card actually uses a custom PCB which differs from the Sapphire Tri-X model, which as you rightly pointed out is a Ref PCB


Cheers for the info; just viewed the PCB images in the KitGuru review. To me it seems so sparsely populated with components that it looks like they cheaped out on the board.

The Strix to me seems like the better PCB, more phases and from past experience of their chokes, I reckon they must be coil whine free.

I owned an Asus DCUII 290X for a while, coil whine was non existent. IMO the Vapor-X 290X with Sapphire Black diamond chokes is between the DCUII and Tri-X 290 with ref PCB.
Quote:


> Originally Posted by *Noirgheos*
> 
> I wouldn't mind your Fury to CF with mine.... lol


LOL, will let you know if it comes on the market.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Hi Fat4l,
> 
> No, not a custom PCB; it's the ref PCB Sapphire Fury Tri-X STD edition. The only custom one on the market is the Asus Strix Fury, AFAIK.
> 
> Fury X is 4096 SP; mine, sadly, artifacts badly at stock clocks in 3D loads if unlocked to all CUs. TBH not bad getting 3840SP FOC; you could say it's between the 2, as exactly 4 out of the 8 CU difference unlocked.
> 
> Did I need Fury? Hell no... why did I get it? ... to do the bios mods....
> 
> When I saw a deal on one I thought get it and you can first hand test mods, instead of advising others ....
> 
> Today I modded the fan profile in Advanced mode (aka Fuzzy Logic, the stock method for Fury); even though temps are now under ~65C (was about ~75C), it's still way quieter than the Vapor-X 290X or Tri-X 290. I keep using a torch to check through the mesh side panel whether the fans are on or not! LOL
> 
> Dunno if I'm gonna keep it; see how it goes really...


What I'm tempted by is getting another 290X, but this time a MATRIX one!
There's one on eBay atm. The 0.95V rail is making me want it.


----------



## buildzoid

The Nitro PCB looks like a slightly upgraded reference PCB. I wouldn't be surprised if the Nitro worked with the Tri-X BIOS.


----------



## gupsterg

I would agree that the Nitro ROM should work on the Tri-X; I'd be surprised if it differs at all.

Currently I still prefer the Strix TBH; a 12-phase VRM and solid chokes have gotta be better IMO, based on my DCUII experience.

I wonder if they'll ever do a Vapor-X... now that I would want... or a Lightning Fury... maybe Asus will do a Matrix Fury, as that one on Hawaii had so many controls like MVDDC / fSW / the 0.95V rail...


----------



## Radox-0

Yep, doesn't look too different from the ref PCB. Not seen too much information on how the Strix variant works and how well it can OC and whatnot. Seems to be a limited number of them, sadly.


----------



## nyk20z3

Well, my 980 Matrix is a total loss, so what do you guys think about an R9 Nano under water in my 05S build?

I've been using Nvidia since 2007 but I am willing to give AMD a try; I game at 1440 so the Nano should hold its own for the foreseeable future.

As you can see the Matrix was a monster GPU, so the Nano would free up a ton of space and look more natural in this case.

It's either I go with the Nano or wait until AMD and Nvidia drop their new cards.


----------



## Semel

I would wait for new cards tbh.


----------



## Radox-0

Quote:


> Originally Posted by *nyk20z3*
> 
> Well my 980 Matrix is a total loss so what do you guys think about a R9 Nano under water in my 05S build ?
> 
> Its either i go with the Nano or wait until amd and nvidia drop there new cards.


Well, pretty sure you know my opinion, as that would pretty much describe my build.

The Nano in the PC-05s works a treat IMO, and under water the Nano performs flawlessly and would pretty much outpace the 980.

Like you said, it frees up a ton of space and looks sweet on its riser card like that. While I would also say wait, I expect we may not see the Nano form factor in the next-gen cards for a while yet.


----------



## p4inkill3r

Quote:


> Originally Posted by *nyk20z3*
> 
> Well my 980 Matrix is a total loss so what do you guys think about a R9 Nano under water in my 05S build ?
> 
> Ive been using nvidia since 2007 but i am willing to give AMD a try, i game on 1440 so the Nano should hold its own for the foreseeable future.
> 
> As you can see the Matrix was a monster gpu so the nano would free up a ton of space and look more natural in this case.
> 
> Its either i go with the Nano or wait until amd and nvidia drop there new cards.


Nano @ $479 is a pretty good deal. http://www.amazon.com/dp/B0159EWJO6


----------



## nyk20z3

Quote:


> Originally Posted by *Radox-0*
> 
> Well pretty sure you know my opinion as that would pretty much describe my build
> 
> Nano in the PC-05s works a treat IMO and under water the Nano performs flawlessly and would pretty much outpace the 980.
> 
> Like you said free's up a ton of space and looks sweet on its riser card like that. While I would say wait also, I expect we may not see the Nano form factor in the next gen cards for a while yet.


That's what I was thinking as well; these mini cards don't come around often.

I will consider everyone's advice!


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> I would agree that the Nitro ROM should work with Tri-X, I'd be surprised if it differs at all.
> 
> Currently still prefer the Strix TBH, 12 phase VRM and solid chokes gotta be better IMO based on DCUII experience.
> 
> I wonder if they'll ever do Vapor-X ..... now that I would want .... or Lightning Fury .... maybe Asus will do Matrix Fury as that on Hawaii had so many controls like MVDDC / fSW / 0.95V rail ...


The stock Fury X VRM is kick-ass. I suspect the Nitro is actually a little less powerful but more efficient. The STRIX VRM is definitely excellent.

I suspect the Fury X ref PCB has an overly noisy VRM; however, neither the Nitro nor the STRIX seems to clock much better.

I did recently buy a basic oscilloscope, and once I learn to use it I will do voltage ripple measurements on the Fury X.


----------



## gupsterg

Quote:


> Originally Posted by *buildzoid*
> 
> The Stock Fury X VRM is kick ass. I suspect the Nitro to actually be a little less powerful but more efficient.


As you know, I'm learning about VRMs very slowly.

TBH I'm pretty happy with how cool the VRM is even on air after the fan profile mod. IIRC I was touching 90C with the stock BIOS fan profile (max ~900ish RPM); now, depending on room ambient, <65C (~1200-1500 RPM).
Quote:


> Originally Posted by *buildzoid*
> 
> I did recently by a basic oscilloscope and once I learn to use it I will do voltage ripple measurements on the Fury X.


Ahhh great news ....

I know how to do the fSW mod in the BIOS for Hawaii; as I do not have an oscilloscope (or know how to use one), I'm waiting on some test results from @MihaStar.

When I viewed the i2cdump for Fury it was near identical (quick visual view), and as you know they use the IR3567B.

Once you're ready, let me know and test whether fSW is 490kHz on Fury, as the i2cdump suggests it is.

I'm planning on lowering fSW. The Stilt did this in the mining/MLU build ROMs; it improved VRM temps & efficiency. I wasn't losing any OC headroom on the Tri-X 290 (ref PCB) by using 290kHz.


----------



## buildzoid

fSW is the MOSFET switching frequency?


----------



## gupsterg

Yes.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Yes.


Any voltage BIOS mods? Or are we not that far yet?

BTW, why didn't you go with the Fury Nano? The price is


----------



## escksu

Is there any way to reduce coil whine on the Nano? I sold my Nano because of the coil whine. If it can be reduced by undervolting, I will get another Nano till Polaris is out.


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> Any voltage bios mods ? Or we are not this far yet ?
> 
> Btw why didnt u go with Fury nano ? The price is


I've seen the VID go up when the GPU frequency per DPM was raised; the ROM was in EVV mode. This result is strange compared with Hawaii. Doing further tests today.

Why I got the Tri-X Fury was:

a) a total knee-jerk purchase to do BIOS mods

b) at the time it was only £10 more than the Nano; yes, I was losing SP count, but gaining 2 extra VRM phases plus better air cooling for OC tests

c) I also wanted to "experience" the unlock for real, if you get what I mean
Quote:


> Originally Posted by *escksu*
> 
> Is there anyway to reduce coil whine on the nano? I sold my nano because of the coil whine. If it can be reduced by undervolting, I will get another nano till polaris is out.


TBH I'm not experiencing any huge coil whine at present, very very slight at the moment. Mostly I use the PC with no sound output in a silent room; if I had slight noise in the room I reckon I would not even notice it. I'm using the Fury at 1080p 120Hz over DP.

It's early days in BIOS modding to test undervolting by ROM; did you not test an undervolt with MSI AB?

The VRM switching frequency mod would probably change the character of the coil whine, to a degree, I'm guessing.


----------



## 98uk

How are prices for Fury cards changing in the USA?

Since I bought mine, the prices have gone up a good €50 per card; I've never seen it lower than what I paid. Seems I got lucky, as I bought at the lowest ever recorded price.


----------



## Flamingo

Has anyone tried overclocking using only Crimson?

I was playing around with the Target GPU Temp and Target Fan settings to see how they work.

Using Target GPU Temp results in throttling though.

1) Set temp to 64C, target fan untouched (which is around 60%).
Ran Heaven > temps stayed at 64C but at 500MHz, a lot of throttling.

2) Set temp to 64C, fan @ 100%.
Ran Heaven > temps stayed at 64C, clocks hovering around 883 to 930MHz, fans @ 100%.

So I increased the temp limit to 70C; it stayed at 1000MHz until it reached 70C - which took a minute or three - then clocks were anywhere between 920 and 1000MHz.

Seems good enough. I kinda prefer this to Afterburner, as the fan doesn't increase or drop suddenly. Fan speed goes down, but really slowly, which I guess is okay.

How would I go about overclocking (increasing power limit and clock) and preventing throttling while staying below 70-80C (i.e. setting Target GPU Temp to 80C, for example)?


----------



## Flamingo

Quote:


> Originally Posted by *98uk*
> 
> How are prices for Fury cards changing in the USA?


https://jet.com/search?term=R9%20nano

The Sapphire R9 Nano was around 499 USD; the price has jumped a bit.

Also, idk what's up with the VisionTek brand, why is it so expensive everywhere? I have a VisionTek which has no coil whine - is that because it's a good/recent batch where there is no coil whine?


----------



## gupsterg

Target temp in the Overdrive panel is the max ASIC temp, an attribute of PowerLimit, so yes, it will make the card throttle if you're getting close to that temp.

What I would do is the fan profile BIOS mod; there is a temp target in there which, when adjusted, makes the stock method of cooling, aka Advanced mode (fuzzy logic), adjust to that target. I set mine to 55C; depending on room ambient it's hitting anywhere between 1000-1400 RPM under load (still very quiet). It's also improved VRM temp by a lot, and you won't get GPU throttle, as it ramps the fan as high as required to maintain the temp target in the ROM. The max the fan will ramp to under FL is determined by the OD fan target, which can also be modded in the BIOS; it becomes the new default setting (still adjustable in OD).

At idle the fans still switch off; it also improves when the fans come on, and they run a little longer after the GPU is unloaded.


----------



## Flamingo

Quote:


> Originally Posted by *gupsterg*
> 
> Target temp in overdrive panel is Max ASIC temp an attribute of powerlimit, so yes will make card throttle if your getting close to the temp.
> 
> What I would do is fan profile bios mod, there is a temp target in there when adjusted makes stock method of cooling aka advanced mode(fuzzy logic) adjust to that target. I set mine to 55C, depending on room ambient hitting anywhere between 1000-1400rpm under load (still very quiet). It's also improved VRM temp by alot and you won't get GPU throttle as it ramps fan as high required to maintain temp target in rom.


One thing I've noticed is that the fan ramps up and slows down kinda slowly.

Running LuxMark, it reached 70C (target) immediately and kept throttling down to 700MHz until fan speed reached 70%. Then it throttled within the 900MHz range.

Is there any way to improve the response time? Would using Afterburner (which has a much faster response time compared to Crimson) affect the fans' life?

Crimson:
Increase speeds are like 1% per second until it hits the target temp, then it increases 2-3% per second.
Decrease speeds are like 1% per second (so it can take 60s+ for the fans to slow down, while the GPU is at room temps within 10-15s).
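For a rough feel of why the ramp-down drags on, the rates observed above can be turned into times. This is only a back-of-envelope sketch; the 1%/s and 2-3%/s figures are the observations from this post, not documented driver behaviour, and the start/end duty levels are invented examples:

```python
# Estimate Crimson fan ramp times from the rates observed above.
# The rates are forum observations, not documented driver values.

def ramp_time(start_pct, end_pct, rate_pct_per_s):
    """Seconds to move between two fan duty levels at a fixed rate."""
    return abs(end_pct - start_pct) / rate_pct_per_s

# Ramp down from a ~70% load speed to a ~20% idle speed at ~1%/s:
down = ramp_time(70, 20, 1.0)   # 50 seconds

# Ramp up over the same range at the faster 2.5%/s post-target rate:
up = ramp_time(20, 70, 2.5)     # 20 seconds

print(f"ramp down: ~{down:.0f}s, ramp up: ~{up:.0f}s")
```

Even with made-up endpoints, the asymmetry matches the complaint: the 1%/s decrease rate means the fan lags the GPU's cool-down by a large margin.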


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Is there anyway to improve the response time?


Yes, there is a sensitivity/granularity setting in the ROM for the fan under FL mode. I adjusted mine to a 150% increase.

FL mode works by monitoring GPU load and rise in temp; based on that it ramps the fan plus the other values. You can adjust the profile it applies by:

a) sensitivity to temperature

b) target temp for the GPU to be achieved

c) max fan speed limit

As long as the max fan speed is not too low for the GPU target temp set, you will see no throttling.

I adjusted sensitivity +150%, GPU target 55C, max fan 100% in the ROM. When running the 3DM FS demo looped ~30min with room ambient ~18C, IIRC max fan was 30%; at room ambient ~23C, max 40%. Still very quiet in my rig, and the whole behaviour of when the fan comes on/off and ramps became so much better.

If you want, for this instant I will mod your ROM for you if you attach it to a post, then you can experience the mod first hand. For any further adjustment you can follow the BIOS mod thread to adjust it yourself.
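The a/b/c knobs above can be pictured with a toy controller loop. To be clear, the actual fuzzy-logic algorithm in the ROM is AMD's and isn't public; this is just an illustration of how a target-temperature fan scheme with a sensitivity multiplier and a max-fan clamp behaves, with invented function names and gains:

```python
def fan_step(current_fan, gpu_temp, target_temp=55, sensitivity=1.5, max_fan=100):
    """One control tick: nudge fan duty toward holding target_temp.

    sensitivity scales how aggressively the fan reacts to the temp error
    (the +150% BIOS mod corresponds to a larger multiplier here).
    Purely illustrative - not the ROM's real fuzzy-logic algorithm.
    """
    error = gpu_temp - target_temp          # positive when running too hot
    step = sensitivity * error * 0.5        # invented gain, illustration only
    new_fan = current_fan + step
    return max(0.0, min(float(max_fan), new_fan))  # clamp to 0..max_fan

# Hotter than target -> fan rises; cooler than target -> fan eases off.
fan = 30.0
fan = fan_step(fan, gpu_temp=65)   # 10C over a 55C target: fan increases
```

The "max fan speed not too low for the target temp" caveat shows up here too: if `max_fan` clamps the output before the error reaches zero, the only remaining way to hold the target is clock throttling.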


----------



## Nameless101

Hi all!
I just recently put together my new rig with an R9 Nano, since I found a really good deal on it. However, the coil whine is really unbearable. The pitch varies, and in some games it is less audible than in others, but I can always hear it above the fan noise. I have tried a lot of suggested fixes as well (folding, benching, over/undervolting...), but the sound just doesn't go away. This is very unfortunate, as the performance is great and I love the form factor, but I don't like using headphones when gaming, so the coil whine just really ruins it for me.

Now, I will be sending it back to Amazon this week and getting a refund. So my question is this: does anyone know if newer batches of the Nano are still plagued with excessive coil whine? Does the Fury X suffer from coil whine as well? That would of course be a significant step up in price, but the form factor is still appealing and there would of course be some performance gains as well. So what would you all recommend I do?


----------



## liljonpsp

Does anyone know where I can get a white backplate for the Fury X?


----------



## Flamingo

Quote:


> Originally Posted by *liljonpsp*
> 
> Does anyone know where I can get a white backplate for the Fury X?


1. Coldzero.eu
2. Artisan Store at OCN
3. V1 Tech


----------



## Alastair

Quote:


> Originally Posted by *Nameless101*
> 
> Hi all!
> I just recently put together my new rig with an R9 Nano, since I found a really good deal on it. However, the coil whine is really unbearable. The pitch varies and in some games it is less audible than in other others, but I can always hear it above fan noise. I have tried a lot of suggested fixes as well (folding, benching, over/undervolting...), but the sound just doesn't go away. This is very unfortunate, as the performance is great and I love the form factor, but I don't like using headphones when gaming, so the coil whine just really ruins it for me.
> 
> Now, I will be sending it back to Amazon this week and getting refund. So my question is this: does anyone know if newer batches of the Nano are still plagued with excessive coil whine? Does the Fury X suffer from coil whine as well? That would of course be significant step up in price, but the form factor is still appealing and there would of course be some performance gains as well. So what would you all recommend I do?


Let the Nano run overnight in a very high FPS environment, like the Heaven bench credits screen or something similar. You will find the coil whine will be greatly reduced.


----------



## Nameless101

I tried getting the Heaven/Valley credits-screen technique to work, but the credits close after a few seconds. What's the trick to keeping the credits up indefinitely?


----------



## Flamingo

Quote:


> Originally Posted by *gupsterg*
> 
> Yes, there is a sensitivity/granularity setting in ROM for fan under FL mode. I adjusted mine to 150% increase.
> 
> FL mode works on monitoring GPU load and rise in temp, based on that it ramps fan plus the other values. You can adjust the profile it applies by:-
> 
> a) sensitivity to temperature
> 
> b) target temp for gpu to be archieved
> 
> c) max fan speed limit
> 
> As long as max fan speed is not too low for target temp for gpu set you will see no throttling.
> 
> I adjusted sensitivity +150% gpu target 55c max fan 100% in ROM. When running 3DM FS demo looped ~30min with room ambient ~18c iirc max fan was 30%, room ambient ~23c max 40%. Still very quiet in my rig, total aspect of when fan came on/off and ramping became so much better.
> 
> If you want for this instant I will mod your ROM for you if you attach to post, then you can experience mod 1st hand. Also any further adjustment you can follow bios mod thread to adjust yourself.


I ran two tests, the Heaven benchmark and the LuxMark stress test.

You can see the effect of throttling when the GPU temp reaches 70C and the fan is still speeding up early on. Once the fan is at 60-ish %, the throttling is less severe and clocks begin to stabilize.

So the fan's ramp-up response is slow, and the early throttle will affect benchmark results at least.

Next, when the bench is off, it's really slow to ramp down speeds: ~4 min for Heaven and 8 min for LuxMark. Both are undesirable, considering the GPU cools down in about 1min 30s.

So yea... the ROM from GPU-Z is dumped and attached.

If you don't mind, please also tell me the hex values for the following sensitivities and what you'll be editing them to, I just need to play with that: 100, 125 and 150%.

Fiji.zip 46k .zip file


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> So yea... ROM from GPUZ is dumped and attached
> 
> If you dont mind, also pls tell me the hex values for the following sensitivities and what youll be editing it at, just need to play with that: 100, 125 and 150%


You have 3 ROMs + the ATiFlash Windows command-line version:

Sensitivity: +100%, +125%, +150%
Temp target for GPU using FL: 55C

After flashing and rebooting, do ...

If you compare the 3 ROMs using a hex editor you will know the offset locations/values changed; ref also heading *How to edit fan mode* > *Editing Advanced fan mode (FL)* in the OP of the Fiji BIOS mod thread.

I will soon be adding links to the hex editors I use under heading *Useful links*, and creating a checksum fix, ref heading *ATiFlash* > *Using Hawaiireader to fix checksum on Fiji ROM*.

May I ask something of you? When you have time, can you provide registers/i2c dumps for your card? Ref heading *Gaining per DPM VID information and i2cdump* for info.

Most of all, enjoy your new ROMs and welcome to Fiji BIOS modding.

Flamingo.zip 368k .zip file


----------



## Flamingo

Thanks! Here are the dumps.

i2cdump.txt 25k .txt file


atigpureg.txt 43k .txt file


----------



## xkm1948

Pre-ordered the HTC Vive VR set today. After months of owning my Fury X, I finally have something to utilize the full potential of the Fury X. Anyone else getting on the VR train?


----------



## xTesla1856

Guys, the Titans are gone, I'm officially an owner now.

They're absolutely gorgeous and some of the best-built cards I've ever come across. They are kind of a tight fit in the Air 540, though. How would you recommend going about getting the most out of these? I'm a complete noob to AMD cards and how they overclock.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Thanks! Here are the dumps.


Many thanks for them, +rep.

To answer the questions you sent via PM, and perhaps to help others understand what I did or what they can do, view the image below.

First off, I know more values in the sections we are discussing, but I'm only showing what needs to be shown, so it's easier to digest (right-click the image and open it in a new tab to see it at best res).

Now, the values I've changed you would not have access to via any software's normal GUI, AFAIK. Even with this BIOS mod you can carry on making adjustments via Overdrive as you like, or use MSI AB, etc., to create your own custom curve.

Regarding the sensitivity value +%, it is as stated, i.e. default is 4836+100%=9672.

I hope this explains what I did; do not hesitate to ask any further questions, and if it is within my capability to explain, I will gladly do so.
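The "+%" arithmetic above can be sketched in a few lines. This only encodes the stated relation (default 4836, +100% = 9672); the 16-bit little-endian rendering at the end is an assumption about how such a value would typically sit in a ROM dump, not a verified offset or layout:

```python
def sensitivity_value(base=4836, increase_pct=100):
    """Scale the FL fan sensitivity by +increase_pct, per the
    relation stated above: 4836 + 100% = 9672."""
    return round(base * (1 + increase_pct / 100))

for pct in (100, 125, 150):
    v = sensitivity_value(increase_pct=pct)
    # 16-bit little-endian byte order is an assumption, for illustration only.
    print(f"+{pct}% -> {v} (0x{v:04X}, LE bytes {v.to_bytes(2, 'little').hex()})")
```

So the three ROMs mentioned earlier would carry 9672, 10881 and 12090 respectively; comparing them in a hex editor (as suggested above) is how you'd confirm the real offsets.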


----------



## keikei

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys, the Titans are gone, I'm officially an owner now
> 
> They're absolutely gorgeous and some of the best built cards I've ever come across. They are kind of a tight fit in the Air 540 though. How would you recommend me going about getting the most of these? I'm a complete noob to AMD cards and how they overclock.
> 
> 
> Spoiler: Warning: Spoiler!


Welcome to the club! Dat glass....


----------



## Semel

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys, the Titans are gone, I'm officially an owner now


TBH I don't understand at all what the point was of getting rid of the two most powerful GPUs (Titan X) on the planet and getting two Nanos that are considerably slower... plus have much less VRAM.
Quote:


> Originally Posted by *xkm1948*
> 
> After months of owning my Fury X now I finally have something to utilize the full potential of the FuryX.


We already have games where the Fury struggles to get even a stable 60 fps at 1080p..


----------



## xTesla1856

Quote:


> Originally Posted by *Semel*
> 
> Tbh I don't understand at all what was the point of getting rid of the two most powerful GPUs (titan x) on the planet and getting two nano's that are considerably slower..+have much less VRAM.


I sold them for a pretty good profit. Also, I didn't wanna support nV anymore with all the douchebaggery going on behind the scenes.


----------



## keikei

Quote:


> Originally Posted by *Semel*
> 
> Tbh I don't understand at all what was the point of getting rid of the two most powerful GPUs (titan x) on the planet and getting two nano's that are considerably slower..+have much less VRAM.


Two Nanos is a great choice for 1080/1440 res, and he gets to keep $1000.
Quote:


> Originally Posted by *Semel*
> 
> We have games available already where fury struggles to get even stable 60 fps at 1080p..


Those same games, Nvidia also struggles in. These are large, graphically intense open-world games, e.g. GTA V, TW3.


----------



## Flamingo

Quote:


> Originally Posted by *gupsterg*
> 
> Many thanks for them
> 
> 
> 
> 
> 
> 
> 
> , +rep.
> 
> To answer your questions you sent via PM and perhaps to help others understand what I did or they can do view image below
> 
> 
> 
> 
> 
> 
> 
> .
> 
> First off I know more values in the sections we are discussing but only showing what needs to be shown, so easier to digest (right click image and open in new tab to see at best res).
> 
> 
> 
> Now the values I've changed you would not have access to by any software via normal GUI AFAIK. Even with this bios mod you can carry on making adjustments via Overdrive as you like or use MSI AB, etc to create your own custom curve.
> 
> Regarding sensitivity value +%, it is as stated ie default is 4836+100%=9672 .
> 
> I hope this explains what I did, do not hesitate to ask any further questions, if it is within my capability to explain I will gladly do so
> 
> 
> 
> 
> 
> 
> 
> .


Okay, so in the BIOS mod, please confirm which are changeable through Crimson. According to my understanding:

Crimson Target Temp = PowerLimit section orange box = changeable value shown in Crimson = 85C
Crimson Target Fan = purple box hex value = changeable value in Crimson

Fan sensitivity = green box (how fast it ramps up and slows down) = not available/changeable in Crimson
Light green box = temp when fan should start increasing and no throttle = not available/changeable in Crimson

What is the default (stock ROM) value of the light green box? I'm bad at conversions.

Ok, understood it all.

Light and dark green are part of the advanced fuzzy-logic fan control. By default, the Nano has a value of 75C (4B). Why do I see throttling before that?


----------



## xTesla1856

They're actually R9 Furies, not Nanos, but yeah, getting to keep 1000 bucks in addition to only marginally worse performance is a pretty sweet deal. The Titans were overkill for triple 1080p anyway.


----------



## keikei

Quote:


> Originally Posted by *xTesla1856*
> 
> They're actually R9 Furies, not Nanos, but yeah getting to keep 1000 bucks in addition to only marginally worse performance is a pretty sweet deal. The TItans were overkill for triple 1080p anyways.


Apologies.

Interestingly enough, both cards have near-identical performance.


----------



## xTesla1856

Happy to report that there's no audible coil whine from either of my cards, where there was some with the titans. Also, the cards are freaking SILENT during heaven bench. Fan speed is at 40% while temps are well below 70. So far, MILES happier than with the Titans on Air. On to some games now


----------



## Radox-0

Quote:


> Originally Posted by *xTesla1856*
> 
> They're actually R9 Furies, not Nanos, but yeah getting to keep 1000 bucks in addition to only marginally worse performance is a pretty sweet deal. The TItans were overkill for triple 1080p anyways.


Strip that Titan X owners club from your sig, you're not worthy enough anymore.

On the other hand, welcome to the Fiji club.

Yeah, the Furys are awesome cards. It was always fun to stack my pair up against my Titan X's before moving to the Nano. Like yourself, the noise difference is crazy, although I guess that's the pros and cons of the ref design. Congrats on the buy, they look great.


Spoiler: Warning: Spoiler!



Also, the TX comment was a joke.


----------



## looncraz

Quote:


> Originally Posted by *escksu*
> 
> Is there anyway to reduce coil whine on the nano? I sold my nano because of the coil whine. If it can be reduced by undervolting, I will get another nano till polaris is out.


Coil whine is just the wires in an inductor (coil) vibrating against each other at a high frequency. This vibration will, in time, wear flat spots into the wires and the problem will change pitch, then go away.

You can speed this process up by finding the situation which causes the most whine and leaving it running overnight. Sometimes it takes less time.

Some high-end coils are 'potted' to prevent this from happening, so you can sometimes even put a drop of superglue into an exposed coil and get it to stop (yes, I've done it).


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Crimson Target Temp = PowerLimit section Orange box = value shown in crimson = 85C (max value or default value?)


Yes, 85C is default in your ROM.
Quote:


> Originally Posted by *Flamingo*
> 
> Crimson Target Fan = Purple box hex value = changeable in crimson right? or its the max value?


Yes, changeable in Crimson; as stated in the image, its only function is to limit the maximum fan %.
Quote:


> Originally Posted by *Flamingo*
> 
> Fan sensitivity = Green box (how fast it ramps up and slows down) = Not available/changeable in crimson


Correct.
Quote:


> Originally Posted by *Flamingo*
> 
> Light Green box = Temp when fan should start increasing and no throttle = Not available/changeable in crimson


Not changeable in Crimson. The fan will start ramping before that temp so it can maintain the GPU at that specified temp, plus it will ramp more if needed so you never go over that value. If the fan reaches its limit speed (purple box) and the temp rises to 85C (orange box), the clocks throttle to protect the GPU.
Quote:


> Originally Posted by *Flamingo*
> 
> What is the default(stock rom) value of the light green box?


4B hex = 75 decimal = 75C
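The conversion takes one line of Python each way. A trivial check: the single-byte reading matches the 4B = 75C value quoted above, and it shows what byte a 55C target (as used in the modded ROMs earlier in the thread) would be; the exact ROM offset where the byte lives is out of scope here:

```python
# Fan-table temp targets in the ROM dumps discussed above are single bytes.
stock_target = int("4B", 16)       # stock Nano FL target: 0x4B -> 75 (C)
modded_byte = format(55, "02X")    # a 55C target would be written as byte 0x37

print(f"0x4B -> {stock_target}C; 55C -> 0x{modded_byte}")
```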


----------



## nyk20z3

Which Nano do you guys recommend? I am probably going with the Gigabyte, because it's the only brand I would trust outside of Asus and EVGA. I am also ready to order the EK block, which works with the reference design, but from what I've seen all the Nanos are reference anyway, just with different cooling solutions.


----------



## Radox-0

Quote:


> Originally Posted by *nyk20z3*
> 
> Which Nano do you guys recommend? I am prob going with the gigabyte because its the only brand i would trust outside of asus and evga. I am also ready to order the EK block which works with the reference design but from what ive seen all the Nano's are reference anyway with just different cooling solutions.


99.999% sure all the Nanos are exactly the same anyway. Gigabyte, Asus, etc. just slap their branding on the card, put it in their own box and manage the warranty policy.

EDIT: Yep, to add, the EK Nano block will work on any Nano you grab; they're all identical.


----------



## baii

Well, Asus has one in white, but it won't make any difference if you're going on water.


----------



## Thoth420

Quote:


> Originally Posted by *liljonpsp*
> 
> Does anyone know where I can get a white backplate for the Fury X?


I had my black EK one painted white. I don't think any white ones are available still; they weren't when I got mine. In the pic of my sig rig you can see what it looks like. I went glossy to see stuff on it, but matte would look better.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Light and dark green are part of advanced fuzzy logic fan control. By default, Nano has value of 75C (4B). Why do I see throttling before that?


Due to "PowerLimit"; do as highlighted in this post if you want "PowerLimit" in the ROM as required.

@members

The new beta of HWiNFO should not require ADL to be enabled for VRM info support (that method can cause instability); please test the beta, guys, and report to Martin so he may improve support.

Link:- http://www.hwinfo.com/forum/Thread-Enabling-ADL-makes-the-AMD-Fury-crash?pid=11058#pid11058


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Flamingo*
> 
> Light and dark green are part of advanced fuzzy logic fan control. By default, Nano has value of 75C (4B). Why do I see throttling before that?
> 
> 
> 
> Due to "PowerLimit" , do as highlighted in this post if you want "PowerLimit" in ROM as required.
> 
> @members
> 
> New beta of HWiNFO should not require ADL to be enabled for VRM info support (this method can cause instability), please test beta guys and report to Martin so he may improve support
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Link:- http://www.hwinfo.com/forum/Thread-Enabling-ADL-makes-the-AMD-Fury-crash?pid=11058#pid11058
Click to expand...

What is HWiNFO telling us about the VRMs?


----------



## gupsterg

VRM temps, A & W. The beta is not showing as much info as when ADL support is enabled (less stable), but Martin will probably be furthering support without ADL (more stable).


Spoiler: [email protected]



*Note:-* Some min readings for the card are not normal (i.e. temp), as I had been folding prior to monitoring with HWiNFO.


----------



## Mumak

Just to clarify - the new values you see in the latest HWiNFO beta are not obtained via I2C - they use the GPU's own telemetry methods.
The VRMs are accessible, but the problem is that on Fiji any attempt to access the GPU I2C bus can result in a system crash (black screen + fan @ max). I have discussed this with AMD and followed their recommendations on how to access the I2C safely. That seemed to improve the situation, but it is still not stable enough. It seems that on the (non-X) Fury the crash is less likely to occur; however, on the Fury X I can bring the system down within a few seconds when using GPU I2C. My understanding is that the GPU (most probably its SMC or drivers) is communicating heavily over the I2C, and even trying to synchronize with it can still lead to a crash.
So while HWiNFO32/64 v5.20 had GPU I2C support enabled on Fiji (you might need to do "Reset GPU I2C Cache" first), I have decided to disable this support in the default mode in later versions. All later versions will only attempt to access the GPU I2C when you enable the "GPU I2C via ADL" option. But I don't recommend this because of the mentioned problems.
I'm not sure if AMD will fix this (nor whether it's possible to fix at all), so I recommend relying on the current alternate values, which seem to be quite reliable and stable. Though they won't be able to offer all the details read directly from the GPU VRMs.
The new beta can now safely report:

GPU VR VDDC Temperature
GPU VRM VDD Temperature
GPU Liquid Temperature (Fury X only)
GPU HBM Temperature - doesn't seem to be supported by current GPUs/drivers
GPU Core Voltage (VDDC)
GPU Memory Voltage (MVDDC)
GPU Core Current [Amps]
GPU Memory Current [Amps]
GPU Core Power [Watts]
GPU Memory Power [Watts]
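As a quick sanity check on telemetry like the list above, a rail's reported power should roughly equal its reported voltage × current. A hedged sketch of that arithmetic only - the variable names are invented labels for the readings listed above, not HWiNFO's API (HWiNFO exposes sensors through its GUI/shared memory, not a Python library), and the figures are illustrative, not measured:

```python
# Cross-check a rail's reported power against voltage * current,
# allowing some slack for sampling jitter between sensor reads.
def power_consistent(volts, amps, watts, tolerance=0.10):
    """True if the reported watts is within tolerance of volts * amps."""
    expected = volts * amps
    return abs(watts - expected) <= tolerance * expected

# Example figures (illustrative only): GPU Core Voltage (VDDC),
# GPU Core Current, GPU Core Power from the sensor list above.
vddc, core_amps, core_watts = 1.20, 150.0, 180.0
print(power_consistent(vddc, core_amps, core_watts))  # 1.2 * 150 = 180 -> True
```

If the three values disagree badly and persistently, that usually points to a stale or misattributed sensor rather than real behaviour.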


----------



## gupsterg

Thanks for the info.

I'm not using "GPU I2C via ADL" on the Fury (Fiji Pro) with the latest beta, I'm just using defaults (ADL is unchecked in settings). Whatever is going on "behind the scenes" to support Fiji in the above screenie for the new beta, I don't know.

I'm guessing from your post / a quick look at your past posts that you work on HWiNFO? (Sorry, first time I've met you on the forum.)

I have just ordered a Sapphire Fury X (Fiji XT), so I will be testing the beta on that as well; so far on the Fury (Fiji Pro), no issues with 5hrs or so of folding with HWiNFO monitoring.


----------



## Mumak

Yep, I'm Martin - the author of HWiNFO


----------



## gupsterg

Ahhh! LOL , nice to meet you again


----------



## provost

I guess I have never returned anything to Newegg, but this Premier membership seems like a bit of a scam… lol. I tried to return my Fury, as I wanted to get two Nanos just for giggles, but even with Premier you can't return, you can only replace. Now, I did not tell them that I wanted to buy two more cards from them, so not sure if that would have made any difference.

But I did want to see whether this Premier thing was real, or just fluff, before making any bigger purchases in the future… lol. I guess I am too spoiled by Amazon Prime. I asked the rep what the point of the Premier 30-day return was, and then asked him to find any items that would qualify for the 30-day return "privilege"… the rep tried but could not find any… LMAO.

Anyway, still loving my Fury, but two Nanos would have fit the build that I am thinking of a bit better… oh well, one less impulse buy… on to Polaris then…


----------



## xTesla1856

Hey guys, I was doing some testing and some weird stuff is happening with my cards. I will attach a screencap of my Afterburner metrics during GTA V gameplay. Everything seems all over the place. I was used to the Titans just sticking to their max boost speeds and that's it. How do I get the most out of these Furies and get them to stay "stable"?


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> Hey guys, I was doing some testing and some weird stuff is happening with my cards. I will attach a screencap of my Afterburner metrics during GTA V gameplay. Everything seems all over the place. I was used to the Titans just sticking to their max boost speeds and that's it. How do I get the most out of these Furies and get them to stay "stable"?


I assume the power limit is set to +50%? If not, that's the first thing you should do.

Are you running 1080p or some other res?

It almost looks like vsync is on, so the clocks are just adjusting to stay at 60 FPS?


----------



## xTesla1856

Quote:


> Originally Posted by *battleaxe*
> 
> I assume power limit is set to +50%? If not that's the first thing you should do.
> 
> Are you running 1080p or some other res?
> 
> It almost looks like vsync is on. So the clocks are just adjusting to stay at 60fps?


Running at 5760x1080. Everything in AB untouched except a fan curve. It's weird: when I disable vsync, the FPS shoot up to like 75-90, but when I enable vsync it struggles to maintain 60. And with vsync off, the gameplay is all stuttery and full of tearing.


----------



## Zealon

Hmm, I was having a similar issue with my Fury X. I have been using ClockBlocker to mitigate that problem and it has been working fine since.

edit: well, I guess it's not similar lol. Really not sure now.


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> Running at 5760x1080. Everything in AB untouched except a fan curve. It's weird: when I disable vsync, the FPS shoot up to like 75-90, but when I enable vsync it struggles to maintain 60. And with vsync off, the gameplay is all stuttery and full of tearing.


Quote:


> Originally Posted by *Zealon*
> 
> hmm, i was having a similar issue with my fury x. i have been using clockblocker to mitigate that problem and it has been working fine since.
> 
> edit: well i guess its not similar lol. really not sure now.


Still, might be a good idea to try ClockBlocker and see what happens.

Edit: And set the power limit to +50% for sure. This just lets the card use whatever power it needs. Won't hurt anything.


----------



## xTesla1856

Question: If I set the frame target control to 60 in Radeon settings, does this remove the need for vsync?

EDIT: Nope, still stuttery and the cards are downclocking all the time.


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> Question: If I set the frame target control to 60 in Radeon settings, does this remove the need for vsync?
> 
> EDIT: Nope, still stuttery and the cards are downclocking all the time.


Power limit at +50% ???


----------



## xTesla1856

Yup, power limit at +50%


----------



## Thoth420

Quote:


> Originally Posted by *provost*
> 
> I guess I have never returned anything to NewEgg, but this Premiere membership seems like a bit of a scam... lol. I tried to return my Fury, as I wanted to get two Nanos just for giggles, but even with Premiere you can't return, you can only replace. Now, I did not tell them that I wanted to buy two more cards from them, so not sure if that would have made any difference.
> 
> But I did want to see whether this Premiere thing was real or just fluff before making any bigger purchases in the future... lol. I guess I am too spoiled by Amazon Prime. I asked the rep what the point of the Premiere 30-day return was, and then asked him to find any items that would qualify for the 30-day return "privilege"... the rep tried but could not find any... LMAO
> 
> Anyway, still loving my Fury, but two Nanos would have fit the build that I am thinking of a bit better... oh well, one less impulse buy... on to Polaris then...


So it is with GPUs and drives. With monitors, however, it saved me a ton of money playing the panel lottery... definitely not a scam if used right.


----------



## dagget3450

Pretty sure this is the same issue many people, including myself, experienced with Furies and the Crimson drivers. ClockBlocker is one thing to try, or you can try pre-Crimson drivers as a test? I use 15.10 beta currently. I don't think they have fixed this issue on Fury yet. At 120 Hz, vsynced or unlimited, I experienced clock reductions in certain games that caused FPS drops below 120. The issue is not present in the 15.10 drivers for me, only once I use any of the Crimson drivers.


----------



## battleaxe

Quote:


> Originally Posted by *xTesla1856*
> 
> Yup, power limit at +50%


Try clock blocker then. See what that does.


----------



## xTesla1856

Tried ClockBlocker; clocks stay at 1050 MHz, but GPU usage still wildly fluctuates. I know these cards have more in them, I can feel their power so to speak, but I can't seem to fully get to it. Might try older drivers tomorrow. As for now, I'm going to bed.


----------



## Randomdude

Quote:


> Originally Posted by *xTesla1856*
> 
> Tried clockblocker, clocks stay at 1050mhz, GPU usage still wildly fluctuates. I know these cards have more in them, I can feel their power so to speak, but I can't seem to fully get to it. Might try older drivers tomorrow. As for now, I'm going to bed


Exactly how I feel about the card as well.


----------



## Medusa666

I'm getting my Sapphire R9 Fury Nitro OC as a replacement for the Nano I currently own.

I love this little card to bits, but lately the guy has been giving me artefacts and various visual glitches, so sadly I have to RMA it.

They offered me a new Nano, but the Nitro looks exciting so I got that instead.


----------



## Semel

Quote:


> Originally Posted by *xTesla1856*
> 
> Running at 5760x1080..


Quote:


> still stuttery


Did you check VRAM usage? The resolution is really high, plus if you have everything enabled/maxed out... the Nano has only 4GB of VRAM, and Crossfire won't magically turn it into 8GB.

Check this out:

http://www.hardocp.com/article/2015/10/06/amd_radeon_r9_fury_x_crossfire_at_4k_review/5 (GTAV section)


----------



## waltercaorle

Quote:


> Originally Posted by *Mumak*
> 
> Yep, I'm Martin - the author of HWiNFO


thank you for all your hard work


----------



## Radox-0

Quote:


> Originally Posted by *Semel*
> 
> Did you check VRAM usage? The resolution is really high, plus if you have everything enabled/maxed out... the Nano has only 4GB of VRAM, and Crossfire won't magically turn it into 8GB.
> 
> Check this out:
> 
> http://www.hardocp.com/article/2015/10/06/amd_radeon_r9_fury_x_crossfire_at_4k_review/5 (GTAV section)


The memory use does indeed look high. Appears to be 6GB? So maybe it's spilling into system RAM?


----------



## dagget3450

It's unlikely that the VRAM limit is an issue at 5760x1080, as 4K is still a fair bit higher in resolution.


----------



## Radox-0

Quote:


> Originally Posted by *dagget3450*
> 
> It's unlikely that VRAM limit is an issue at 5760x1080, as 4k is still a ways higher in resolution.


Well, about 2 megapixels apart, but it does depend on what settings are being used. With a pair of Furys you do have a decent amount of grunt to push higher settings, but some settings do use quite a bit of VRAM. Just looking at his MSI AB graph suggests more is being used. But I dunno.
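The pixel-count gap is easy to check; a quick sketch of the arithmetic:

```python
# Pixel counts: triple-1080p Eyefinity vs 4K UHD.
eyefinity = 5760 * 1080   # 6,220,800 pixels
uhd_4k = 3840 * 2160      # 8,294,400 pixels

diff_mp = (uhd_4k - eyefinity) / 1e6
print(f"5760x1080 = {eyefinity / 1e6:.2f} MP")
print(f"3840x2160 = {uhd_4k / 1e6:.2f} MP")
print(f"difference = {diff_mp:.2f} MP")   # about 2 megapixels, as noted
```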


----------



## xkm1948

Crimson 16.2.1 Hotfix has just been released, go get it!

http://www2.ati.com/drivers/beta/non-whql-64bit-radeon-software-crimson-16.2.1-win10-win8.1-win7-feb27.exe


----------



## Mumak

I'm running the Fury X in BOINC ([email protected]) and while the clock is stable @ 1050 MHz (stock), the GPU usage fluctuates wildly and the CPU usage for each task is unusually high (>85%). I too believe the GPU is able to do better; it might still be an issue with drivers that need more optimization...


----------



## Alastair

I was amazed by my machine the other day. I was extracting some folders, and the files were being extracted amazingly fast compared to my friend's older Core 2 Q8400 next to me. I also noticed the tach LEDs on one of my Fury cards were flashing away furiously. I opened up Afterburner and noticed the GPU1 usage was fluctuating between 0-100% quite rapidly.

It looked like the GPGPU functions of the cards were in use! So cool!


----------



## gupsterg

Quote:


> Originally Posted by *waltercaorle*
> 
> thank you for all your hard work


Made my first donation to Martin yesterday via his site, it was not much 10USD but will be doing another once I sell some of the GPUs I seem to have collected to do Fiji bios mod







.

His support is phenomenal, I reckon we should all give something in appreciation.


----------



## Flamingo

Quote:


> Originally Posted by *Alastair*
> 
> I was amazed by my machine the other day. I was extracting some folders. And the files were being extracted amazingly fast compared to my friends older core 2 q8400 next to me. I also noticed my tach LED's on my one Fury card was flashing away furiously. I opened up afterburner and noticed the GPU1 usage was fluctuating between 0-100% quiet rapidly.
> 
> It looked like the GPGPU functions of the cards were in use! So cool!


Yeah, it seems WinRAR uses the GPU to compress and extract; can't seem to find options for that anywhere though. My Nano was crashing when using WinRAR and HWiNFO lol.


----------



## Mumak

Quote:


> Originally Posted by *Flamingo*
> 
> My nano was crashing when using WinRaR and hwinfo lol.


That should no longer happen with the latest HWiNFO Beta.


----------



## Alastair

Is there a way I can use ClockBlocker to let one card stay at full speed and the other downclock for non-intensive games, say CS:GO, etc.?


----------



## xTesla1856

From the "Known issues" for 16.2.1:



AMD seems to know about this issue.


----------



## dagget3450

Quote:


> Originally Posted by *xTesla1856*
> 
> From the "Known issues" for 16.2.1:
> 
> 
> 
> AMD seems to know about this issue.


I think it's been listed ever since the first Crimson release. I sometimes wish we had separate drivers, because this issue seems to be related to Fury only. Which suggests they did something to make older cards work well, and Fury is suffering. I almost wonder if they are just addressing Hawaii and lower first because the Fury market is rather small.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> I think its been listed right after first crimson release. I wish we had separate drivers sometimes because this issue seems like its related to fury only. Which suggests they did something for older cards to work well, and fury is suffering. I almost wonder if they are just addressing hawaii and lower first because fury market is rather small.


I run the 390 thread and we have seen it over there too.... not sure why this series of drivers has these clock issues.

I will admit it appears to be MUCH MORE prevalent on Fiji than Hawaii/Grenada, but there are several reports on both that I have seen.

I personally do not get any clock throttling on my 390 with the new driver at all, though. My Fury got it in GTA V some with the second Crimson release (that supposedly fixed it), and it was really bad in Dirt Rally and BF4, but Crysis 3 kept it pegged.... not sure what's going on at this point....


----------



## xTesla1856

In BF4 there is no issue for me with ClockBlocker enabled. I played for about 2 hours, and even with vsync enabled, the cards never dropped from 1050 MHz and FPS was pegged at 60 almost all the way through. And that's with everything at Ultra. With vsync off, FPS shot up to 90-110. So these cards definitely seem to be beasts, but yeah, some games just don't seem to fully utilize them IMO.


----------



## gupsterg

Quote:


> Originally Posted by *xTesla1856*
> 
> From the "Known issues" for 16.2.1:
> 
> 
> 
> AMD seems to know about this issue.


Even the 16.2 release notes had the same text. I haven't gamed much yet on the Fury. Last night Crysis 3 didn't seem choppy, etc. Just did 1 hr of SWBF and, just like on Hawaii, the menu and intro cutscenes are a bit choppy but in-game it's smooth. I haven't yet FRAPS-tested games or used MSI AB to log clocks, and like you say it can be game dependent.
Quote:


> Originally Posted by *dagget3450*
> 
> I wish we had separate drivers sometimes because this issue seems like its related to fury only.


Even if the driver package is the same, each card is ID'd and a different driver path is selected. This was the gist of what I got from a post by The Stilt, where he explained that if we flash a 290/X to 390/X, the 390/X still uses a different driver path, as it's ID'd by a "fused ID" in the ASIC.


----------



## Medusa666

I got the option to get a new R9 Nano or R9 Fury Nitro OC as a replacement for my broken Nano.

I'm wondering if the full Fiji chip in the Nano will ever have an advantage over the cut-down Fury Pro chip. Is there any game or scenario where core clock speed is secondary and the extra shaders etc. in the Nano come into full use and give it an advantage over the Fury?

It bothers me a bit to trade down from a full Fiji to a cut down version.


----------



## Radox-0

Quote:


> Originally Posted by *Medusa666*
> 
> I got the option to get a new R9 Nano or R9 Fury Nitro OC as a replacement for my broken Nano.
> 
> I'm wondering if the full Fiji chip that Nano has ever will have an advantage over the cut down Fury Pro chip? Is there any game or scenario where core clock speeds are secondary and the number of shaders etc and whatever there is more of in Nano comes to full use and has an advantage over the Fury?
> 
> It bothers me a bit to trade down from a full Fiji to a cut down version.


Having owned both: on the stock cooler the Fury will typically run quicker, as the default fan profile will usually cause the Nano to throttle to between 900-950 MHz, so the Fury's higher sustained clock makes up for the cut-down chip. Now you can of course ramp up the Nano's fan, and in that case, depending on the chip, you can overclock it to 1050 easily and up to 1110 or more. In that case the Fury will need to run 50 or so MHz higher, I found.

The Fury also seems to top out slightly higher overall in overclocking ability compared to what my Nano samples could do, even under water. My preference: if you're going to keep it on air, go Fury, as the monster cooler will keep it very cool and quiet. Nano if you put it under water, IMO.


----------



## gupsterg

Quote:


> Originally Posted by *Radox-0*
> 
> My preference: if you're going to keep it on air, go Fury, as the monster cooler will keep it very cool and quiet. Nano if you put it under water, IMO.


+1.

The Nano has a 4-phase VRM, @Medusa666, so on air I'd reckon it's gonna run hotter than the Fury's 6-phase VRM with the better cooler.

I modded my Fury Tri-X fan profile via ROM; it can sustain 55C very, very quietly. Consequently the VRM is cooler as well vs the stock fan profile, and I would guess the HBM is too. I'm using a torch to see through the mesh side panel whether the fans are actually running when gaming. I'm currently @ 1075MHz / 500MHz with a VID of 1.250V using EVV in ROM; the PC is currently running [email protected]. Will update the post with a screenie of the data if you like?

Plus you have the option of the unlock lottery? Mine, even though it said "you can attempt unlock at possible risk" (link), gained 3840 SPs. Exactly between a Fury & Fury X; it has been running fine and dandy in [email protected] / desktop use / 3DM FS loops of 30-40 min several times over (testing bios mods), and finally some gaming use yesterday and today.

*** edit ***


Spoiler: [email protected] 2hrs @ [email protected] VID
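The VRM-phase point above is mostly arithmetic: the same load current split across fewer phases means more current (and heat) per phase. A rough sketch with a purely hypothetical load current:

```python
# Same load current split across fewer phases = more current per phase.
# 180 A is a hypothetical GPU core load current, purely for illustration.
total_amps = 180.0

for card, phases in [("Nano (4-phase)", 4), ("Fury Tri-X (6-phase)", 6)]:
    per_phase = total_amps / phases
    print(f"{card}: {per_phase:.1f} A per phase")
```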


----------



## baii

The Nitro isn't supposed to unlock? On the other hand, all 3 Tri-X cards I tried could get 3840.


----------



## Medusa666

Quote:


> Originally Posted by *Radox-0*
> 
> Having owned both: on the stock cooler the Fury will typically run quicker, as the default fan profile will usually cause the Nano to throttle to between 900-950 MHz, so the Fury's higher sustained clock makes up for the cut-down chip. Now you can of course ramp up the Nano's fan, and in that case, depending on the chip, you can overclock it to 1050 easily and up to 1110 or more. In that case the Fury will need to run 50 or so MHz higher, I found.
> 
> The Fury also seems to top out slightly higher overall in overclocking ability compared to what my Nano samples could do, even under water. My preference: if you're going to keep it on air, go Fury, as the monster cooler will keep it very cool and quiet. Nano if you put it under water, IMO.


Quote:


> Originally Posted by *gupsterg*
> 
> +1.
> 
> The Nano is a 4 phase VRM @Medusa666 so on air I'd reckon it's gonna be hotter than the 6 phase Fury VRM + better cooler.
> 
> I modded my Fury Tri-X fan profile via ROM, it can sustain 55C very very quietly. Consequentially VRM is cooler as well vs stock fan profile plus I would guess HBM RAM is also. I'm using a torch to see through mesh side panel if the fans are actually running when gaming. I'm currently @ 1075MHz / 500MHz with VID of 1.250V using EVV in ROM, currently PC running [email protected] will update post with screenie of data if you like?
> 
> Plus you have got option of unlock lottery? mine even though said "you can attempt unlock at possible risk" (link) I gained 3840SP. Exactly between a Fury & Fury X, it has been running fine and dandy in [email protected] / desktop use / 3DM FS loops of 30-40min several times over due to test bios mods and finally some gaming use yesterday and today.
> 
> *** edit ***
> 
> 
> Spoiler: [email protected] 2hrs @ [email protected] VID


Thank you for your quick and informative replies.

I'll be clear about my question; here are the specifications of the full Fiji vs the cut-down Fiji, comparing only the differences:

4096 vs 3584 stream processors, 64 vs 56 GCN compute units, 256 vs 224 texture mapping units.

I understand that the performance difference shown by you and various reviewers is due to core clock speed, with the air-cooled Fury being able to maintain a higher average clock than the Nano.

What I really need to know is whether there is any instance where the extra units give the Fiji XT an advantage, or whether core speed is always the deciding factor despite the XT having more of everything than the Fiji Pro. Are there engines / scenarios where the increased number of GCN units et al. can improve performance despite the lower clock speed?


----------



## gupsterg

Friday my Sapphire Fury X (stock water cooler) arrives (I wanna see if the cooling profile mods work on that).

I can give you a data compare of, say, Fury at 3584 vs 3840 vs Fury X 4096: FRAPS tests / 3DM, etc.

Plus my own opinion on them.


----------



## JunkoXan

I picked up my Nano last week.... have yet to open the box on it, along with the other parts I got over that week. I like the small size...


----------



## Radox-0

Quote:


> Originally Posted by *Medusa666*
> 
> Thank you for your quick and informative replies.
> 
> I'l be clear about my question, here are the specifications of the Full Fiji vs cut down Fiji, and I only compare the differences;
> 
> 4096 stream processors / 3584 stream processors, GCN units 64 units / 56 units, Texture mapping units 256 units / 224 units.
> 
> I understand that the performance difference shown by you and various reviewers are due to the core clock speed, with the air cooled Fury being able to maintain a higher avg. clock than the Nano.
> 
> What I really need to know is if there is any instance where the increase of the above units is giving the Fiji XT an advantage, or is core speed always the deciding factor, despite the XT having more of everything than the Fiji Pro, or are there engines / scenarios where the increased amount of GCN units et.al can improve the performance despite the lower clock speed?


Did not notice a difference in the few games or benchmarks that could not be solved by adding 50 MHz to the Fury's clock to close the delta.


----------



## Semel

Guys, how do I remove the +96mV voltage limit in Afterburner? I would be obliged if you could provide a more or less detailed mini-guide.

Cheers.

PS: Btw, do you lot use normal OC mode or unofficial OC mode in Afterburner? I noticed that normal mode has totally different voltage delivery compared to unofficial mode (which TriXX uses); the values differ, so, say, +72mV in normal mode is different from +72mV in unofficial mode.

My OC was unstable even at +96mV (normal mode) and 100% stable at +72mV (unofficial mode). When you check the voltage numbers, normal mode tends to push a lower voltage than unofficial mode.


----------



## BoloisBolo

Hey guys, I was wondering if anybody has some benchmarks of Fury Nanos in Crossfire for Heaven and Valley?


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Friday my Sapphire Fury X (stock water cooler) arrives (I wanna see if the cooling profile mods work on that).
> 
> I can give you a data compare of, say, Fury at 3584 vs 3840 vs Fury X 4096: FRAPS tests / 3DM, etc.
> 
> Plus my own opinion on them.


Looking forward to seeing this.


----------



## Alastair

So, do any of you guys have Furys and play Star Citizen? Please can you help me out with performance? Mine is poor at best: 30 FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. ClockBlocker isn't helping me at all.

I was also able to force Crossfire on with my 6850s, but I can't seem to get it right with my Furys and the new 2.2.2 (I think it's that version) of Star Citizen.


----------



## SuperZan

Quote:


> Originally Posted by *Alastair*
> 
> So any of you guys have Fury's and plays Star Citizen? Please can you guys help me out with performance. My performance is poor at best. 30FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. Clock blocker isn't helping me at all.
> 
> I was also able to Force Crossfire on with my 6850's but I can't seem to get it right with my Fury's and the new 2.2.2 (I think it's that version) of Star Citizen.


If nobody beats me to the punch I'll try and have it patched later on today to try my hand. Haven't had it patched in months.


----------



## Alastair

Quote:


> Originally Posted by *SuperZan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So any of you guys have Fury's and plays Star Citizen? Please can you guys help me out with performance. My performance is poor at best. 30FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. Clock blocker isn't helping me at all.
> 
> I was also able to Force Crossfire on with my 6850's but I can't seem to get it right with my Fury's and the new 2.2.2 (I think it's that version) of Star Citizen.
> 
> 
> 
> If nobody beats me to the punch I'll try and have it patched later on today to try my hand. Haven't had it patched in months.
Click to expand...

Sorry, what don't you have patched?


----------



## SuperZan

Quote:


> Originally Posted by *Alastair*
> 
> sorry what don't you have patched?


Star Citizen? I'm like four updates behind or some such.

edit: oh my, what a download :O

yay fibre


----------



## Flamingo

Quote:


> Originally Posted by *Radox-0*
> 
> Having owned both, on normal cooler the fury will typically run quicker as the default fan profile will usually cause the Nano to throttle to between 900-950 Mhz so the fury's quicker speed makes up for cut down chip. Now you can of course ramp up the nano fan and in that case depending on the chip you can over clock it to usually 1050 easy and up to 1110 and typically more. In this case the fury will need to run 50 or so Mhz higher I found.
> 
> The Fury also seems to be able to top out slightly higher overall in terms of over clocking ability compared to what my Nano samples could do, even when under water. My preference, if your going to keep it on air, go Fury as the monster cooler will keep it very cooler and quiet. Nano if you put it under water IMO


I'm trying to collect Nano overclock results from everywhere. Review sites have it as such:



I think setting the power limit to +30% is the safest bet for the Nano (175W x 1.3 = 227.5W, roughly the 150W 8-pin + 75W PCIe slot = 225W); anything more is probably going over the power delivery capacity.
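The headroom arithmetic behind that, using the figures from the post (175 W typical board power, one 150 W 8-pin plus the 75 W PCIe slot):

```python
# Nano power-limit headroom vs connector spec, per the figures above:
# 175 W typical board power, one 8-pin (150 W) plus the PCIe slot (75 W).
board_power_w = 175.0
connector_spec_w = 150.0 + 75.0   # 225 W

for limit_pct in (0, 30, 50):
    effective = board_power_w * (1 + limit_pct / 100)
    over = "over" if effective > connector_spec_w else "within"
    print(f"+{limit_pct:>2}% -> {effective:.1f} W ({over} the 225 W spec)")
```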


----------



## Maximization

Quote:


> Originally Posted by *Alastair*
> 
> So any of you guys have Fury's and plays Star Citizen? Please can you guys help me out with performance. My performance is poor at best. 30FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. Clock blocker isn't helping me at all.
> 
> I was also able to Force Crossfire on with my 6850's but I can't seem to get it right with my Fury's and the new 2.2.2 (I think it's that version) of Star Citizen.


I have had no problems, so I can't really help you; don't forget the game is still in beta and has a lot of work to be done yet. I was surprised that the Gaming Evolved app actually recognizes it now.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> Guys, how do I remove +96mV voltage limit in afterburner?


As the Fury also uses the IR3567B (and from a quick view of i2cdumps), I would assume it is the same as highlighted in this; here is another guide.

Note the difference between /wi4 & /wi6: the number states which i2c bus the IR3567B is on. Then 30 is the ID (reference an i2cdump to find the bus/ID for your card), 8D is the register, and then you have the data value. Will also perhaps ask Unwinder on Guru3D.

I will test as soon as I can and report back.


----------



## Semel

Quote:


> Originally Posted by *gupsterg*
> 
> I will test as soon as can and report back
> 
> 
> 
> 
> 
> 
> 
> .


Awesome

Once you are done, and if you succeed in unlocking the voltage, could you post a step-by-step mini-guide here?

Thanks


----------



## Flamingo

Is using HWiNFO readings a reliable method of finding out the power usage of the card? Like GPU core power + memory power = total usage by the card?


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> Looking forward to see this


Feels like Christmas again!

I'll just have to hide the boxes and bank statements from the wife!

Quote:


> Originally Posted by *Semel*
> 
> Awesome
> 
> Once you are done, and if you succeed in unlocking the voltage, could you post a step-by-step mini-guide here?
> 
> Thanks


No worries. Post an i2cdump of your card, and follow the info under the heading *Gaining per DPM VID information and i2cdump* in the Fiji bios mod thread.
Quote:


> Originally Posted by *Flamingo*
> 
> Is using hwinfo readings a reliable method of finding out power usage by the card? like gpu core + memory power = total usage by card?


I'd say pretty much yes.



I have "PowerLimit" values of TDC 297A, TDP 330W, MPDL 330W in the ROM for the above [email protected] run; I will be reducing each one to see its effect.
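A rough sketch of the "core + memory = roughly total" idea from the question above; the overhead figure for fan and VRM losses not captured by the two rails is purely an assumption, as are all the numbers:

```python
# Rough total-board-power estimate from the two rail readings.
# overhead_w is an assumption: fan, VRM losses, and anything the
# core/memory rails don't capture. All numbers are illustrative.
core_w = 218.0
memory_w = 16.0
overhead_w = 20.0

total_w = core_w + memory_w + overhead_w
print(f"Estimated board power: ~{total_w:.0f} W")
```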


----------



## BoloisBolo

Anybody in for CF Nanos on an ITX mobo? If so, linky here.


----------



## Semel

i2cdump.txt 26k .txt file

Quote:


> Originally Posted by *gupsterg*
> 
> No worries. Post an i2cdump of your card


Here you go


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> i2cdump.txt 26k .txt file
> 
> Here you go


Viewing the i2cdump, the voltage control chip responded at bus 6, address 30; therefore this guide is valid IMO.

Instead of going straight to 100mV or higher, test it by adding /wi6,30,8d,01 to the Target box in the shortcut properties for MSI Afterburner; you should see +6 on the GPU core voltage slider.

An example of how your shortcut's "Target" box should look:

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,01

Note the space between the closing quote and the /.
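For illustration, a small helper that builds the /wi argument from a desired offset. It assumes each step of register 0x8D is ~6.25 mV (based on the +6 reading for a value of 01 reported above); bus 6 / address 30 come from this particular card's i2cdump, so check your own dump before reusing them:

```python
# Build an MSI Afterburner /wi argument (i2c write) for a desired
# voltage offset. Assumes one step of register 0x8D on the IR3567B is
# ~6.25 mV (a value of 01 shows up as roughly +6 mV on the slider).
# Bus 6 / address 30 come from this particular card's i2cdump; check
# your own dump before using different values.
STEP_MV = 6.25

def wi_argument(offset_mv, bus=6, address=0x30, register=0x8D):
    steps = round(offset_mv / STEP_MV)
    return f"/wi{bus},{address:02x},{register:02x},{steps:02x}"

print(wi_argument(25))     # -> /wi6,30,8d,04
print(wi_argument(6.25))   # -> /wi6,30,8d,01
```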


----------



## Semel

I launched AB with /wi6,30,8d,01... but nothing seems to have happened. I don't even see AB.exe in my task manager. Do I have to wait XX minutes or something?


----------



## gupsterg

No, you need to launch MSI Afterburner afterwards (using a normal shortcut).

View the video; I just tested +25mV.


----------



## dagget3450

I apologize if I missed this, but I know a while back Fury voltage was discussed. I have an issue where once I add over, say, +30mV, my benchmark scores go down. For instance, at max TriXX voltage (I think it was +100mV) my scores are worse than running lower clocks with no added voltage.

So my question is: will a bios mod affect this, or is it a hardware limitation? I seem to recall someone did hard mods but still had this issue?

The silly thing is, if I up the voltage to the point it reduces performance, it ironically makes it look like the higher clocks are stable when it's actually just throttling like mad.


----------



## Semel

But it is different... I thought this command would allow me to move the voltage slider to +106 instead of +100, but all it did was raise my overall voltage (idle included) to 906 instead of 900. So if I increase the voltage further, my GPU will get hotter while idling... and that is definitely no good.

PS: How do I reset to the default voltage limits? /wi6,30,8d,00?

PPS: Yeah, /wi6,30,8d,00 reverted settings to the defaults.

So my question remains the same: how do we remove the +100mV (slider) limit? Same as in TriXX.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> I apologize if I missed this, but I know a while back Fury voltage was discussed. I have an issue where once I add over, say, +30mV, my benchmark scores go down. For instance, at max TriXX voltage (I think it was +100mV) my scores are worse than running lower clocks with no added voltage.
> 
> So my question is: will a bios mod affect this, or is it a hardware limitation? I seem to recall someone did hard mods but still had this issue?
> 
> The silly thing is, if I up the voltage to the point it reduces performance, it ironically makes it look like the higher clocks are stable when it's actually just throttling like mad.


My first aim of bios mod was to get a good fan profile set to OC and make sure VRM was cool, that is done.

Second aim was to set GPU frequency by ROM, that has been tested and done.

I'm now trying to set / test manual VID setup via ROM.

Trust me, so far Fiji bios modding is moving way faster than Hawaii did back when me, @ddsz and @OneB1t plus others were originally investigating it on Guru3D. This is thanks to all the experience gained from the Hawaii bios mod.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> But it is different... I thought this command would allow me to move the voltage slider to +106 instead of +100, but all it did was raise my overall voltage (idle included) to 906 instead of 900. So if I increase the voltage further, my GPU will get hotter while idling... and that is definitely no good.


If you don't use the +xxmV shortcut and just move the slider in MSI AB, your idle voltage should increase; this is what occurs on my card. So it is no different in my experience. Perhaps you have not noticed this before, or my card is behaving differently.

Quote:


> Originally Posted by *Semel*
> 
> So my question remains the same.. How do we remove +100mV (slider) limit? Same as in Trixx..


No idea.

All you said earlier was:-
Quote:


> Guys, how do I remove +96mV voltage limit in afterburner?


AB does have this: if, say, you use the +25mV shortcut and then use the slider to add +96, you end up higher than +100 overall.


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> My first aim of bios mod was to get a good fan profile set to OC and make sure VRM was cool, that is done.
> 
> Second aim was to set GPU frequency by ROM, that has been tested and done.
> 
> I'm now trying to set / test manual VID setup by ROM
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Trust me so far Fiji bios modding is moving way faster than hawaii, when originally me, @ddsz and @OneB1t plus others were investigating it on Guru3D. This is due to all the experienced gained from hawaii bios mod
> 
> 
> 
> 
> 
> 
> 
> .


Is there anything I can do to help?


----------



## Alastair

Quote:


> Originally Posted by *dagget3450*
> 
> I apologize if I missed this, but I know a while back Fury and voltage were discussed. I have an issue where once I add over, say, +30mV my benchmark scores go down. For instance, if I use max Trixx voltage, I think it was +100mV, my scores are worse than running lower clocks with no extra voltage.
> 
> So my question is: is a bios mod going to affect this, or is it a hardware limitation? I seem to recall someone did hard mods but still had this issue?
> 
> The silly thing is, if I up the voltage to the point it reduces performance, it ironically makes it look like the higher clocks are stable when it's actually just throttling like mad.


I think there is some sort of power limit throttling happening. But wait till the BIOS mods are out. I am sure we will see some good things.


----------



## Semel

Quote:


> Originally Posted by *gupsterg*
> 
> If you don't use the +xxmV shortcut and just move slider in MSI AB your idle voltage should increase, this is what occurs on my card. So it is no different in my experience, perhaps you have not noted this before? .


You are right. My bad


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> Is there anything I can do to help?


In this posted image is a section of PowerPlay.

We either have to edit the virtual VDDC sections, one or the other or both in the image. A possible third thing could be the DPM pointers in the SCLK table section.

In Hawaii there were 6 sections within PowerPlay with a pair of voltage values per DPM per section, so about 6x2x8 = 96 hex values to set the 8 DPM voltages manually.

There is nothing more I can share to help until I test the VID mod.
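For anyone wanting to script this, here is a rough sketch of what setting those per-DPM voltage values could look like, assuming a Hawaii-style layout. The table offset, the 16-bit little-endian field size and the function name are my own illustrative assumptions, not real Fiji offsets (and any real edit would also need the ROM checksum fixed afterwards):

```python
# Hypothetical sketch of patching Hawaii-style PowerPlay DPM voltages in a ROM.
# Offsets and field sizes are illustrative assumptions, NOT real Fiji offsets.
import struct

SECTIONS = 6          # voltage sections inside PowerPlay (per the Hawaii notes)
VALUES_PER_DPM = 2    # pair of voltage values per DPM per section
DPM_STATES = 8        # DPM 0..7

def patch_dpm_voltages(rom: bytearray, table_offset: int, new_mv: list) -> None:
    """Write the same per-DPM millivolt values into every section.

    table_offset is a hypothetical start of the voltage area; each value is
    assumed to be a little-endian 16-bit millivolt field.
    """
    assert len(new_mv) == DPM_STATES
    pos = table_offset
    for _ in range(SECTIONS):
        for dpm in range(DPM_STATES):
            for _pair in range(VALUES_PER_DPM):
                struct.pack_into("<H", rom, pos, new_mv[dpm])
                pos += 2

# 6 sections x 8 DPMs x 2 values = 96 fields, matching the "96 hex values" count
rom = bytearray(0x10000)
patch_dpm_voltages(rom, 0x100, [900, 950, 1000, 1050, 1100, 1150, 1200, 1250])
```

The nesting order (sections outer, DPM states inner) is also a guess; whatever the real layout is, the total field count should still come out at 6x8x2 = 96.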
Quote:


> Originally Posted by *Semel*
> 
> You are right. My bad


No worries.


----------



## Semel

Has anyone tried to find a way to hack AB to allow more than a +100 voltage increase via the slider? It's just that I don't like this shortcut method, TBH... I took a glance at AB.exe in a hex editor, but I'm not that good at this; perhaps someone who knows what he is doing could check and see if it is possible to hack it.
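A typical first step when poking at a binary like that is scanning for the suspect constant. This is purely illustrative; whether Afterburner actually stores its +100mV limit as a plain little-endian immediate is speculation on my part:

```python
# Illustrative sketch only: scan a binary image for a candidate 32-bit constant.
# No guarantee the +100mV limit is stored this way in AB.exe.
import struct

def find_u32(data: bytes, value: int) -> list:
    """Return every offset where `value` appears as a little-endian uint32."""
    needle = struct.pack("<I", value)
    hits, start = [], 0
    while True:
        pos = data.find(needle, start)
        if pos == -1:
            return hits
        hits.append(pos)
        start = pos + 1

# Demo on a synthetic buffer; on the real thing you'd feed in the bytes of
# AB.exe (open(path, "rb").read()) and inspect each hit in a disassembler.
blob = b"\x00" * 16 + struct.pack("<I", 100) + b"\xff" * 8
print(find_u32(blob, 100))  # -> [16]
```

Expect plenty of false positives for a small value like 100, so each candidate offset would still need checking in a disassembler before patching anything.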


----------



## gupsterg

I have just completed the first test of manual VID control by ROM.

With this method we should be able to set any DPM to whatever VID we like, within the VDDC limit of the voltage control chip, *but* I know how to mod that limit as well.

We will not be using offsets like when we add voltage with MSI AB, which affects DPM 0 voltage, i.e. the lowest state = idle.

Also, after today's testing of writing registers to the IR3567B using MSI AB, I have a very strong feeling we can make PT1 & PT3 ROMs for Fury!

Please be patient with me on publishing details, guys, as I wish to a) test fully and b) write the best guide possible.
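On the VID side, the voltages quoted in this thread line up with the SVI2-style encoding commonly cited for IR3567B-driven AMD cards (1.55V ceiling, 6.25mV steps). A quick sketch, with the caveat that the formula is my assumption and should be checked against your own HWiNFO readings before trusting it:

```python
# SVI2-style VID encoding as commonly cited for AMD cards driven by the
# IR3567B: VID 0x00 = 1.55 V, each step subtracts 6.25 mV.
# This mapping is assumed, not confirmed from Fiji documentation.

def vid_to_mv(vid: int) -> float:
    """Convert a VID code to millivolts."""
    return 1550.0 - vid * 6.25

def mv_to_vid(mv: float) -> int:
    """Convert millivolts to the nearest VID code."""
    return round((1550.0 - mv) / 6.25)

print(vid_to_mv(0x30))   # 0x30 -> 1250.0 mV, matching the 1.250V DPM 7 EVV value
print(mv_to_vid(1243))   # 1.243V -> VID 49 (0x31)
```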


----------



## SuperZan

Quote:


> Originally Posted by *Alastair*
> 
> So any of you guys have Fury's and plays Star Citizen? Please can you guys help me out with performance. My performance is poor at best. 30FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. Clock blocker isn't helping me at all.
> 
> I was also able to Force Crossfire on with my 6850's but I can't seem to get it right with my Fury's and the new 2.2.2 (I think it's that version) of Star Citizen.


So after testing a bit: you're not alone. I can't get my second card to work properly and got wild fluctuations on the first, with the accompanying choppy performance. I tested in the same area as well as the hangar. I tried Force Crossfire, each of the default profiles, and the Crysis 3 profile, with and without ClockBlocker. I'm not sure what the issue is, but now I feel I'll probably be poking at it the rest of the evening to see if there is a fix of some sort.


----------



## Alastair

Quote:


> Originally Posted by *SuperZan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So any of you guys have Fury's and plays Star Citizen? Please can you guys help me out with performance. My performance is poor at best. 30FPS running around Crusader. Afterburner shows GPU1 fluctuating between 0% and 100% furiously. Clock blocker isn't helping me at all.
> 
> I was also able to Force Crossfire on with my 6850's but I can't seem to get it right with my Fury's and the new 2.2.2 (I think it's that version) of Star Citizen.
> 
> 
> 
> So after testing a bit, you're not alone. Can't get my second card to work properly and got wild fluctuations on the first, with the accompanying choppy performance. I tested in the same area as well as the hangar. I tried Force Crossfire, each of the default profiles, and Crysis3 profile, with and without ClockBlocker. I'm not sure what the issue is but now I feel I'll probably be poking at it the rest of the evening to see if there is a fix some sort.
Click to expand...

Thanks for your efforts!


----------



## Noirgheos

You guys should report any issues we have en masse. It seems AMD ignores just a few people.


----------



## JunkaDK

So guys,

I'm looking for some advice on GPU cooling. I am trying to get the most out of my R9 Fury STRIX by optimizing airflow.

I found that by tipping the front fan down towards the backplate, it really makes a huge difference in how cool the card feels, and I hope it adds a lot more cooling to the VRMs.

I moved it down so that it was closer to the 2 bottom fans. Has anyone tried anything like this before? The fans are Corsair AF140s, but I am thinking about trying some Noctua Industrial fans, as my theory is they will push more air towards the GPU. The airflow is minimal, so I guess the AF fans are not the best.

Please hit me up with thoughts and ideas.


----------



## Medusa666

Want to thank you guys in this thread. I got my Sapphire Fury Nitro OC yesterday and I'm as happy as can be. I loved the Nano and was a bit skeptical at first due to the sheer size of this card, but I like it: silent, cool, and it performs like a beast.


----------



## keikei

Quote:


> Originally Posted by *Medusa666*
> 
> Want to thank you guys in this thread, I got my Sapphire Fury Nitro OC yesterday and I'm happy as can be, I loved the Nano and was a bit skeptical at first due to the sheer size of this card, but I like it, silent cool and performs like a beast.
> 
> 
> Spoiler: Warning: Spoiler!


My case fans are louder than the card (when gaming). It's sweet as hell. I love the auto settings for temp and fan speed; I don't have to touch anything once the settings are set.


----------



## fat4l

So @gupsterg, have you received your Fury X yet?
Any voltage mods? Shame u don't have custom water cooling; it would be fun to play with.
BTW, do these cards have that blackout / rail voltage issue?


----------



## NBrock

It was taken out of the payment for the item. Also, try to figure out what shipping is going to cost; you may want to mark up your item a little if you think shipping is gonna be super expensive.

Another thing I just thought about. You don't get the money right away. Amazon has designated cycles they send everyone's money out on. I don't remember what it is but they do list it when you set up your account and add banking info.


----------



## Noirgheos

Quote:


> Originally Posted by *NBrock*
> 
> It was taken out of the payment for the item. Also try and figure out what shipping is going to cost you may want to mark up your item a little if you think shipping is gonna be super expensive.
> 
> Another thing I just thought about. You don't get the money right away. Amazon has designated cycles they send everyone's money out on. I don't remember what it is but they do list it when you set up your account and add banking info.


Can I tell them to send it to paypal? I don't have a bank account. Yes, hard to believe.


----------



## NBrock

I am not sure to be honest. I do remember there being multiple options for setting up payment method but I don't remember if PayPal was an option. I would assume so...but we all know what happens when one assumes lol.


----------



## NBrock

To stay on topic lol. Does anyone else seem to get worse overclocking capability on the Fury series with each driver release? Overall performance does seem to be improving with each driver...but I can't overclock as far.

Haha forgot that was the reason I checked this thread today.


----------



## xTesla1856

Both of my Furies hold 1100MHz game-stable at +12mV. So far, I'm in love with the cards.


----------



## p4inkill3r

Quote:


> Originally Posted by *NBrock*
> 
> To stay on topic lol. Does anyone else seem to get worse overclocking capability on Fury series with each driver release? Overall performance does seem to be improving with each driver...but I can't over clock as far.
> 
> Haha forgot that was the reason I checked this thread today.


I cannot say that I've seen anything like that occurring.


----------



## keikei

Quote:


> Originally Posted by *xTesla1856*
> 
> Both of my Furies hold 1100mhz game stable at +12mV. So far, I'm in love with the cards


Noice. What sort of frames are you getting with the oc?


----------



## Flamingo

Guys about the whole clockblocker screen corruption thing...

Is it affecting certain cards only? If it's a driver issue related to the low power state, then it should affect all Fiji users, right? Just wanna know the details cos I'm wondering if I'm safe from it or not (or can avoid doing stuff that would trigger it).


----------



## xTesla1856

Quote:


> Originally Posted by *keikei*
> 
> Noice. What sort of frames are you getting with the oc?


I've only tested BF4 so far. Everything on Ultra settings at 5760x1080 with 2xMSAA enabled, I'm getting a rock solid 60 with Vsync enabled. No frame drops or stutters or anything. Gonna try some other games as well tonight.

What really amazes me though are the temps and the noise. My PC is sitting about 3ft away from me on my desk, and with my Titans I'd almost go deaf every time without a headset. Now, with 2 overclocked Furies, max temps were 71 degrees on the top card and the fan never went above 55%. There's nothing but a slight hum coming from the case under load.


----------



## keikei

Quote:


> Originally Posted by *xTesla1856*
> 
> I've only tested BF4 so far. Everything on Ultra settings at 5760x1080 with 2xMSAA enabled, I'm getting a rock solid 60 with Vsync enabled. No frame drops or stutters or anything. Gonna try some other games as well tonight.
> 
> What really amazes me though are the temps and the noise. My PC is sitting about 3ft away from me on my desk and with my Titans, I'd almost go deaf every time without a headset. Now, with 2 overclocked Furies, Max temps were 71 degrees on the top card and the fan never went above 55%. There's nothing but a slight hum coming from the case under load.


I've read many experiences like yours, and I start to wonder just how much Nvidia spends on marketing and sponsorship to sell their cards versus actual R&D. AMD makes solid cards, plain and simple. The disparity in the market saddens me, but with the new group of up-and-coming cards I think we will see an evening-out of sorts. Nvidia has been mute on what Pascal has to offer so far. The hype for Polaris has gone up, and I think if AMD delivers, we will see them in a better position market-wise.


----------



## xTesla1856

Yeah, I agree with what you're saying. Nvidia is a corporate puppet master. I have had 6 Nvidia cards prior to the Fury and on paper, AMD cards always _seemed_ more powerful than Nvidia cards (i.e shader count and now memory bandwidth etc). But the benchmarks always told a different story. Then I learned about gameworks and all the behind-the-scenes-douchebaggery that Nvidia has been doing the last few years. They know their hardware is inferior at the moment, so they do everything they can to gimp their competition software-wise. DX12/Vulkan and Polaris should be a riot !


----------



## Kana-Maru

Quote:


> Originally Posted by *NBrock*
> 
> To stay on topic lol. Does anyone else seem to get worse overclocking capability on Fury series with each driver release? Overall performance does seem to be improving with each driver...but I can't over clock as far.
> 
> Haha forgot that was the reason I checked this thread today.


My Fury X still performs the same. I get the same OCs with the Crimson drivers as I did last summer with the Catalyst drivers. I can still get 1125MHz on the core with no power limit or voltage increases. My highest OC is still the same as well.

*Edit:*
Here are some benchmarks I posted in another topic. I've been messing around with my benchmarking program for better results.

I was playing Ryse: Son of Rome tonight at 4K. Instead of using the high 4.8GHz overclock that I usually run for benchmarks, I decided to run my 24/7 daily 4GHz + DDR3-1400MHz [9-9-9-24-1T]. I normally play games at 4GHz, but I never benchmark at 4GHz. I ran Ryse: Son of Rome at its max settings @ 4K. The gameplay was great and the game is beautiful; that CryEngine 3 is gorgeous. I'm using the Crimson 16.1 drivers.

*Ryse: Son of Rome [100% Maxed] - 4K*
AMD R9 Fury X @ *Stock* Settings [Crimson 16.1 Drivers]
CPU @ 4GHz
RAM: DDR3-1400Mhz [9-9-9-24-1T]
Gameplay Duration: 25 minutes
*FPS Avg: 33.4fps*
FPS Max: 47.4fps
*FPS Min: 24.3fps
Frame time Avg: 29.9ms*

*Fury X Info:*
GPU Temp Avg: 41.5c
GPU Temp Max: 42c
GPU Temp Min: 34c

*CPU info:*
*CPU Temp Avg: 38.5c*
CPU Temp Max: 46c
CPU Temp Min: 36c
*CPU Usage Avg: 17.77%*
CPU Usage Max: 30.60%

The test above uses stock settings. I did run my Fury X overclocked as well:

I ran my Fury X with overclocked settings, pushing the core from 1050MHz to 1150MHz: a measly +100MHz, which is only a 9.52% increase.
After running the benchmark I saw an 11.1% increase in FPS and roughly an 11% improvement in average frame time.
That's not bad: a 9.52% GPU overclock = an 11% performance increase in the game @ 4K resolution. Ryse: Son of Rome is a very demanding title. I'm still rocking the X58 platform, by the way.
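The scaling numbers above check out; a quick sketch of the arithmetic (the implied OC average is derived from the reported gain, not a measured figure):

```python
# Quick sanity check of the overclock scaling numbers quoted above.
core_stock, core_oc = 1050, 1150
clock_gain = (core_oc - core_stock) / core_stock * 100
print(f"clock gain: {clock_gain:.2f}%")          # -> 9.52%

fps_stock = 33.4                    # stock FPS average from the run above
fps_oc = fps_stock * 1.111          # applying the reported ~11.1% FPS gain
print(f"implied OC average: {fps_oc:.1f} fps")   # -> 37.1 fps
```

Better-than-linear scaling like this usually just means the stock run was spending part of its time clock- or power-throttled.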

Next up is The Witcher 3 100% [no Nvidia GameWorks] @ 4K.


----------



## BIGTom

Quote:


> Originally Posted by *Flamingo*
> 
> Guys about the whole clockblocker screen corruption thing...
> 
> is it affecting certain cards only? if its a driver issue - related to low power state, then it should be affect all fiji users right? just wanna know the details coz im wondering if im safe from it or not (or prevent doing stuff that would trigger it)


Hey Flamingo,

I've not experienced the screen corruption issue with my Fury X and I've had it since launch. I exclusively use a DisplayPort connection with a certified cable, and from most of my reading it seems to impact HDMI users the most. EDIT: it does seem to impact DP connections as well. I use an LG 34" 3440x1440, FWIW.

However, I do have the downclocking problem with any Crimson driver. The Clockblocker program and registry item additions posted earlier worked well enough for me, but I will use Catalyst 15.11 until AMD pushes the official fix in a Crimson update.


----------



## xTesla1856

The Witcher 3 seems to hate overclocked AMD GPUs even more than Nvidia ones. The game bluescreens on me about 10 minutes in at 1100MHz with +12mV. When I remove the OC, everything plays fine. I can crank the settings to ultra minus the GameWorks crap; the only things on high are foliage visibility and grass density. At triple 1080p I'm very impressed; those are the same settings I had on my Titans, and the Furies keep a rock-solid 60fps.

Sorry if keep going on about games, but I'm still struck by how awesome these cards are


----------



## p4inkill3r

I haven't gotten TW3 stable on any significant overclock.


----------



## nyk20z3

I received my EK Nano block today, but I noticed the jet plate is not centered; it's actually tilting down on the right side. I guess it moved a little bit when they were putting the front cover on. My question is: will this affect performance, and should I contact EK and see what they think?


----------



## Thoth420

I haven't overclocked anything on this new rig yet. DDR4 is all new to me, never owned an i7, mobo feature set is daunting and scary and the Fury X seems to work just fine at stock clocks so I haven't even considered it. I tend to wait til I am on the driver I will be staying on for a long while before messing with GPU OC software. I am also waiting on a replacement pump as mine doesn't ramp up due to a dead sensor(D5 not the stock Fury X pump) so it would be foolish to start before that is fixed.

I often wonder what this loop is capable of but alas too busy(and a little scared as this is by far my most expensive rig) to find out lately.


----------



## Medusa666

Quote:


> Originally Posted by *keikei*
> 
> My case fans are louder than the card (when gaming). Its sweet as hell. I love the auto settings for temp and fan speed. I dont have to touch anything once the settings are set.


The max temperature of this card is supposed to be 80c with the OC BIOS enabled according to Sapphire? ( BIOS switch pressed in /w blue Sapphire logo )

My card is showing 84c while playing Elite Dangerous Horizons, that is 4c over the reported max temp by the manufacturer, would this be considered a problem?


----------



## xTesla1856

Quote:


> Originally Posted by *Medusa666*
> 
> The max temperature of this card is supposed to be 80c with the OC BIOS enabled according to Sapphire? ( BIOS switch pressed in /w blue Sapphire logo )
> 
> My card is showing 84c while playing Elite Dangerous Horizons, that is 4c over the reported max temp by the manufacturer, would this be considered a problem?


I set a custom fan curve on mine. The card stays below 70 and the fan speed never goes above 60%.
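For anyone curious what such a curve amounts to, it is just piecewise-linear interpolation between (temp, duty) points; a sketch with made-up points, not xTesla1856's actual curve:

```python
# Sketch of a piecewise-linear fan curve like the custom one described above.
# The (temp_C, fan_%) points below are illustrative, not anyone's real curve.

CURVE = [(40, 25), (55, 40), (65, 55), (70, 60), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin to max duty above the last point

print(fan_percent(60))  # -> 47.5, between the 55C/40% and 65C/55% points
```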


----------



## gupsterg

Quote:


> Originally Posted by *Medusa666*
> 
> Want to thank you guys in this thread, I got my Sapphire Fury Nitro OC yesterday and I'm happy as can be, I loved the Nano and was a bit skeptical at first due to the sheer size of this card, but I like it, silent cool and performs like a beast.
> 
> 
> Spoiler: Warning: Spoiler!


I'll be honest, I'm getting more and more tempted to keep my Fury Tri-X as each day goes by. It runs way cooler than my Hawaii cards and quieter, and that's comparing it with the Tri-X / Vapor-X editions of Hawaii.

The Fury X seemed to be a right pain to handle with the AIO cooler, which again makes me think I'd rather have the Fury Tri-X. One thing I also wonder: if the hosing gets damaged, can it be replaced with off-the-shelf stuff? (I've got no WC experience.) Also, is the fluid non-electrically-conductive?
Quote:


> Originally Posted by *fat4l*
> 
> So @gupsterg have you received your fury X yet ?
> Any voltage mods ? Shame u don't have custom wcooling. Would be fun to play with.
> BTW do these cards have that blackout/rail volrage issue ?


Fury X has an AIO which, with the cooling profile mod in bios, I think should definitely perform better than the reviews show IMO. Yep, got manual VID on Fiji, and I also believe VDDCI can be adjusted; hoping to nail MVDDC as well. On Fiji the IR3567B controls RAM voltage and you see a value for it in HWiNFO, so no DMM is required to know if the bios mod works for MVDDC.

TBH, the way the cooling on the Fury Tri-X performs on air, I'd see no point in the cost of WC so far. I'd rather use that saving to offset the purchase of the card or a future GPU.

Not had any blackouts yet; the card is now up to 1090MHz using the stock VID calculated under EVV mode, 1.250V. Compared with Hawaii, which lowered the VID when changing DPM 7 frequency, Fiji seems to recalculate it when the bios mod is done. The max I've seen is 1.250V; I used to have 1.243V for 1000MHz in ROM under EVV.

Now that manual VID is sussed, I plan to test clocks more, just taking things slow / in stages to better learn "things" about Fiji bios mods.
Quote:


> Originally Posted by *Medusa666*
> 
> The max temperature of this card is supposed to be 80c with the OC BIOS enabled according to Sapphire? ( BIOS switch pressed in /w blue Sapphire logo )
> 
> My card is showing 84c while playing Elite Dangerous Horizons, that is 4c over the reported max temp by the manufacturer, would this be considered a problem?


In Fury Tri-X STD or OC edition bios:-

In both the stock PL and increased PL ROMs the Max ASIC Temp is 85C; this value makes the card throttle clocks when the GPU temp gets close to it, and in my testing it did not help the fan work more effectively in the way I wanted.

The cooling profile Target GPU Temperature is 75C in the stock PL ROM and 80C in the increased PL ROM, *but* as the sensitivity value for the fan is still the same, you're not seeing a great increase in fan speed and thus no GPU / RAM / VRM temps improvement.

The cooling profile Target GPU Temperature and the fan sensitivity value *can not be modified with any normal software* like Overdrive / MSI AB; only a bios mod does, so far. This info is a few pages back in this thread, plus in the Fiji bios mod thread.

IMO they could have easily made those 2 values accessible in Overdrive, and I don't know why they haven't. IMO they have let down "enthusiasts" by this, but at least the bios mod succeeds. I also think that if this were available, perhaps the card would clock better vs the stock profile in reviews / for users. I will be testing what max OC I get with my modded fan profile ROM vs stock.

Gotta work out how to stop the automatic data stamp being added on a bios flash by ATiFlash, as that is the only tell-tale sign of flashing a card, which would void the warranty IMO. On Hawaii this did not occur as the flash was done in DOS; on Fiji, even when you flash back your stock ROM from the factory, a new data stamp is added to the ROM (comparing the saved ROM with a dump taken after flashing).
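Finding where that stamp lands is just a byte-diff of the ROM saved before flashing against a dump taken afterwards; a small sketch (the demo buffers here are synthetic):

```python
# Byte-diff two equal-length ROM images to locate where a flash tool such as
# ATiFlash altered bytes (e.g. an added stamp).

def diff_roms(a: bytes, b: bytes) -> list:
    """Return the offsets at which the two images differ."""
    return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

# Synthetic demo; for the real thing, read both .rom files with open(..., "rb")
before = bytes([0x55, 0xAA, 0x00, 0x10])
after = bytes([0x55, 0xAA, 0xFF, 0x10])
print(diff_roms(before, after))  # -> [2]
```

Note that `zip` silently stops at the shorter image, so check the two dumps are the same length first.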


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Fury X has an AIO which, with the cooling profile mod in bios, I think should definitely perform better than the reviews show IMO. Yep, got manual VID on Fiji, and I also believe VDDCI can be adjusted; hoping to nail MVDDC as well. On Fiji the IR3567B controls RAM voltage and you see a value for it in HWiNFO, so no DMM is required to know if the bios mod works for MVDDC.
> 
> TBH, the way the cooling on the Fury Tri-X performs on air, I'd see no point in the cost of WC so far. I'd rather use that saving to offset the purchase of the card or a future GPU.
> 
> Not had any blackouts yet; the card is now up to 1090MHz using the stock VID calculated under EVV mode, 1.250V. Compared with Hawaii, which lowered the VID when changing DPM 7 frequency, Fiji seems to recalculate it when the bios mod is done. The max I've seen is 1.250V; I used to have 1.243V for 1000MHz in ROM under EVV.
> 
> Now that manual VID is sussed, I plan to test clocks more, just taking things slow / in stages to better learn "things" about Fiji bios mods.
> In Fury Tri-X STD or OC edition bios:-


Nice! Do u see it ever really hitting 1.2k?


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> Nice! DO u see it any real hitting 1.2k?


Sorry, elaborate; that did not make sense to me.

I've yet to install the Fury X as I've been busy, plus that AIO unit is being a pain to get into an appropriate place in my case; I'm finding it restrictive a) to handle the card b) the hoses, etc. I may just test it with the rad sorta loosely sitting about.

You did see my other posts about the Fury Tri-X on air being able to do 55C quietly with the fan profile modded? A Fury X with the AIO profile modded may, IMO, make custom WC even less attractive for me. If GPU / RAM / VRM temps become phenomenally better I might be tempted to take on the hassle of accommodating the AIO solution. Another idea that has been in my mind is to swap the Tri-X cooler onto the Fury X and flog the Fury with the AIO solution.

We know 290 vs 290X, with the 10% extra SPs and TUs, didn't really equate to a 10% gain in all scenarios; IIRC at most 5%.

So I reckon the 3840SP Fury is gonna be real close, clock for clock, to the Fury X.


----------



## JunkoXan

I put my system together last night _(7 hours it took to put it all together)_ and the Nano is really good.... even at 1080p _(plan on doing 1440 in the coming few weeks)_


----------



## Arizonian

Quote:


> Originally Posted by *JunkoXan*
> 
> I put my system together last night _(7 hours it took to put it all together)_ and the Nano is really good.... even at 1080p _(plan on doing 1440 in the coming few weeks)_


Nice - congrats. Yea once I start I don't like to stop until completely finished and booted up.

But no pics? LOL jk. Post'em so we can see your work of art when you get time.

Heads up.....Viewsonic has the *XG2703* 27" IPS 144 Hz Adaptive Sync monitor coming out this month. There are two versions XG2703 (freesync) and XG2703 GS (gsync). Both will be in the 700-800 range to compete with ASUS finally.


----------



## JunkoXan

Far from a masterpiece, but it has function. The second one is my little world, where my Xbone and my PC are both being used and so on, and my chair is comfortable too despite being ripped up as it is.


----------



## keikei

Quote:


> Originally Posted by *JunkoXan*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Far from a masterpiece, but it has function. The second one is my little world, where my Xbone and my PC are both being used and so on, and my chair is comfortable too despite being ripped up as it is.


Nice setup. Have you considered some cable management? Buy some zip ties and go to town; I find it makes things less cluttered. It's great when you have to vac under the table and not see a jungle of cables.


----------



## JunkoXan

I have plenty of zip ties; I even used a few twist ties the cables themselves had around them, but ran into a bit of a length issue. Overall there are no obstructions in terms of airflow, and the cable in front of the front fan is actually pushed back now. The vac is mostly useless but is used. I don't like cable messes; one burned out my 486.

All I have to do now is take care of the cabling mess you see so obviously. Played a little NFS MW2 and it played well; I noticed the GPU usage was erratic, though the GPU did 1000MHz 99% of the time, and VRAM usage was 2GB. It provided 45fps at 1080p, which I've got no problems with.


----------



## nyk20z3

The nano block and accessories arrived -


----------



## 98uk

Has anyone had to RMA because of stability issues?

The card was stable at 1100MHz; now it's starting to throw driver crashes and occasionally a full-screen freeze from which I have to reboot.

I tried reverting drivers with no luck, and today it did the same at stock clocks.

The final thing to try would be a Win 10 reinstall... but I'm wondering if anyone has had a card go bad on them yet?


----------



## xTesla1856

Quote:


> Originally Posted by *98uk*
> 
> Has anyone had to rma because of stability issues?
> 
> Card was stable at 1100mhz, now it's starting to throw driver crashes and occasionally a full screen freeze from which I have to reboot.
> 
> Tried reverting drivers with no luck and today it did the same at stock clocks.
> 
> Final thing to try would be win 10 reinstall... But I'm wondering if anyone has had a card go bad on them yet?


Is this happening in every game you play or just some specific ones? For example, The Witcher 3 bluescreens my entire PC with any sort of overclock; it's only stable at stock speeds.


----------



## gupsterg

Crysis 3 and SWBF, all maxed settings, no clock drops (the only drops are in menus); tested with 1095MHz set in MSI AB (plus the mods below in ROM).

Crysis_3SWBF_HML.zip 20k .zip file


Previously tested 3DM FS, any test looped, no clock drops, plus Folding@Home for 20hrs+ with no clock drops (other than during upload/download of a unit), 1090MHz in ROM (plus the mods below).

Crimson 16.2.1 driver at stock; my ROM only has the cooling profile mod, so the GPU stays at 55C, and the PowerLimit upped in ROM.

Stock PL in ROM is TDC 270A / TDP 300W / MPDL 350W; I mod those to 297A / 330W / 330W.

ROM uses EVV mode for VID (stock method), 1.250V for DPM 7.
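For reference, those modded limits are just the stock TDC/TDP raised 10%, with MPDL pulled down to match TDP (my reading of the numbers, not gupsterg's stated method):

```python
# The modded power limits above: stock TDC/TDP raised by 10%, MPDL set to TDP.
stock = {"TDC_A": 270, "TDP_W": 300, "MPDL_W": 350}
modded = {"TDC_A": round(stock["TDC_A"] * 1.10),
          "TDP_W": round(stock["TDP_W"] * 1.10)}
modded["MPDL_W"] = modded["TDP_W"]   # cap max power delivery at the new TDP
print(modded)  # -> {'TDC_A': 297, 'TDP_W': 330, 'MPDL_W': 330}
```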

I'd say Fury is growing on me.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Sorry, elaborate; that did not make sense to me.


1.2k....k=1000 so 1.2*1000=1200MHz


----------



## Flamingo

Quote:


> Originally Posted by *BIGTom*
> 
> Hey Flamingo,
> 
> I've not experienced the screen corruption issue with my Fury X and I've had it since launch. I exclusively use DisplayPort connection with a certified cable, and most of my reading it seems to impact HDMI users the most? EDIT it does seem to impact DP connections as well. I use an LG 34" 3440x1440 FWIW
> 
> However, I do have the downclocking problem with any Crimson driver. The Clockblocker program and registry item additions posted earlier worked well enough for me, but I will use Catalyst 15.11 until AMD pushes the official fix in a Crimson update.


Thank you for explaining it. Is the downclocking under load or at idle? If under load, is it because of temperatures? If at idle, does it matter, since there is no display corruption (in your case)?


----------



## Semel

What's the difference between the GPU VR VDDC temperature and the GPU VRM VDD temperature (HWiNFO)?


----------



## AndreDVJ

Quote:


> Originally Posted by *Flamingo*
> 
> thank you for explaining it. is the down clocking under load or idle? if load, is it because of temperatures? if idle, does it matter because there is no display corruption (in your case)?


All Furys can experience display corruption if a certain scenario is hit. From my testing it's caused by collisions between monitoring software instances running in parallel, plus clock speed fluctuation. HWiNFO + MSI Afterburner caused lots and lots of corruption for me. Newer versions of HWiNFO can co-exist more peacefully with MSI AB.


----------



## Mumak

That's right - latest versions of HWiNFO and MSI AB are synchronized when accessing AMD GPUs. AIDA64 and CPU-Z/HWMonitor should also follow shortly.


----------



## Semel

Is it possible to completely disable AB monitoring yet keep "hardware control" enabled (I presume that means changing clocks / voltages)?


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> 1.2k....k=1000 so 1.2*1000=1200MHz


Hmm, dunno if it will get there. Slowly creeping up to 1100MHz at stock voltage; just been too busy to sit down and OC.
Quote:


> Originally Posted by *Semel*
> 
> Is it possible to completely disable AB monitoring yet keep "hardware control" enabled (I presume it means changing clocks\voltage)?


Try pausing it and see if that helps: right-click the graph area and pause.

You do know that when you apply clocks / voltage through MSI AB and then close it without clicking reset, they stay applied? That happens on Hawaii and Fiji on my rig. Usually they also stay applied after a restart; dunno about a full shutdown and restart.


----------



## bluezone

Guess what arrived this morning.

Sweet factory-fresh goodness.



Before:



Good bye old friends.

Thank You PCPER and AMD.

Now to try it out.


----------



## Neon Lights

Quote:


> Originally Posted by *bluezone*
> 
> Guess what arrived this morning.
> 
> Sweet factory-fresh goodness.
> 
> 
> 
> Before:
> 
> 
> 
> Good bye old friends.
> 
> Thank You PCPER and AMD.
> 
> Now to try it out.


Why exactly did you buy a Nano? Could you not have fit a larger graphics card with more power in the case you are using?


----------



## NBrock

He didn't buy it...he won it. That's why he thanked PCPER and AMD in his post.


----------



## bluezone

Yes, easily in the Raven 3 case. I was waiting for the 14nm AMD cards for an upgrade, but I won an R9 Nano during a Crimson preview podcast. It just arrived this morning.
But free is good.


----------



## NBrock

Quote:


> Originally Posted by *bluezone*
> 
> Yes, easily in the Raven 3 case. I was waiting for the 14nm AMD cards for an upgrade, but I won an R9 Nano during a Crimson preview podcast. It just arrived this morning.
> But free is good.


If you don't like it for whatever reason I'll take it off your hands lol


----------



## bluezone

I'll keep that in mind if I feel any disappointment.


----------



## Neon Lights

Quote:


> Originally Posted by *NBrock*
> 
> He didn't buy it...he won it. That's why he thanked PCPER and AMD in his post.


Quote:


> Originally Posted by *bluezone*
> 
> Yes, easily in the Raven 3 case. I was waiting for the 14nm AMD cards for an upgrade, but I won an R9 Nano during a Crimson preview podcast. It just arrived this morning.
> But free is good.


Ah ok


----------



## bluezone

At stock clocks my Nano doesn't beat my overclocked HD 7950s in crossfire benchmarks, but it is close; about 130 points lower in 3DMark 11.
In gameplay, though, it's faster and smoother.

When I attempted a minor overclock on the HBM I had major temp problems: it hit 90°C on a 50MHz increase, about a 15°C rise over what I had with all other settings the same. Might have to check the TIM.
Anyone else run into this?


----------



## NBrock

Make sure your fan is spinning up with the load increase. Some people had issues with fans not speeding up as the load increased...but this was on older drivers. Did you do a clean install using DDU or the manual way?


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> Make sure your fan is spinning up with the load increase. Some people had issues with fans not speeding up as the load increased...but this was on older drivers. Did you do a clean install using DDU or the manual way?


Yes, I did a clean uninstall with DDU, plus software removal, and I tried my custom fan profile (which I prefer being aggressive for testing).
I only get high temps when I try to overclock the HBM, so it doesn't seem likely it's software related. But you never know.

This is an R9 Nano; perhaps I'm expecting too much from the stock cooling solution.

Power is set to +50%, +100mV, +100MHz on the core clock and +50MHz on the HBM, with a custom fan curve that ramps from 0 to 50% at 50°C and then up to 100% at 75°C.

That's what I had worked up to when I stopped; the HBM overclock was the last thing I had changed.

PS: very slight coil whine, pretty much unnoticeable from a couple feet away. My ears are getting old though.


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> Yes, I did a clean uninstall with DDU, plus software removal, and I tried my custom fan profile (which I prefer being aggressive for testing).
> I only get high temps when I try to overclock the HBM, so it doesn't seem likely it's software related. But you never know.
> 
> This is an R9 Nano; perhaps I'm expecting too much from the stock cooling solution.
> 
> Power is set to +50%, +100mV, +100MHz on the core clock and +50MHz on the HBM, with a custom fan curve that ramps from 0 to 50% at 50°C and then up to 100% at 75°C.
> 
> That's what I had worked up to when I stopped; the HBM overclock was the last thing I had changed.
> 
> PS: very slight coil whine, pretty much unnoticeable from a couple feet away. My ears are getting old though.


You can use Afterburner for a custom fan profile, or edit the BIOS to tell the fans when to speed up. In the BIOS, the fan actually starts increasing speed when the GPU reaches 75C (assuming you don't touch Overdrive settings). The problem however is the rate at which the fan increases its speed: I find the default BIOS too slow and Afterburner too fast (crude). Here is a test:

Method 1: Use an Afterburner fan profile
Method 2: Use 100% TFS and 75 or 80C TGT in Overdrive
Method 3: Use a BIOS edit to increase fan sensitivity and the target temperature at which the fan actively ramps up speed, but with no throttle (TGT involves throttling, apparently).
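For anyone scripting their own comparison of these, all three methods boil down to a piecewise-linear map from GPU temperature to fan duty. A minimal sketch of that idea (the curve points below are hypothetical, not the Nano's BIOS defaults):

```python
def fan_speed(temp_c, points=((50, 0), (60, 50), (75, 100))):
    """Piecewise-linear fan curve: (temperature C, fan %) points.

    Below the first point the fan stays at its floor; above the
    last point it pins at 100%. The points are illustrative only.
    """
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if temp_c <= t1:
            # Linear interpolation between adjacent curve points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

print(fan_speed(55))  # 25.0, halfway along the 50C->60C segment
```

Shifting the middle point right gives the slow BIOS-style late ramp; shifting it left gives the cruder, more aggressive Afterburner behaviour Flamingo describes.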


----------



## gupsterg

@Medusa666

May I trouble you, for some data?







.

1) BIOS dumps of both switch positions, done via GPU-Z. Looking at a GPU-Z screenie in the Kit Guru review of the Nitro, it has an updated BIOS. It'd be interesting to see if there are any changes to PowerPlay which may help us with the FPS drop @buildzoid has experienced.

2) Can you provide registers and an i2cdump for the card at stock? Ref the heading *Gaining per DPM VID information and i2cdump* in the OP of the Fiji BIOS mod thread.

@other members

ASIC profiling is occurring on Fiji like on Hawaii: the ROM calculates VIDs based on GPU properties. This means:

a) the ROM does not require tailoring per GPU on the PCB in a production situation

b) when comparing OC results and stating what voltage we used, we may be comparing not like for like.

For example:

One card has a VID for DPM 7 of 1.250V (using EVV); add +25mV and it equals a VID of 1.275V.

Another has a VID for DPM 7 of 1.243V (using EVV); add +25mV and it equals a VID of 1.268V.
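Put another way, a fixed software offset shifts each card's EVV-derived baseline, so the resulting absolute voltage differs card to card. A trivial sketch of the arithmetic (the card names are made up):

```python
# Same +25mV offset applied to two different EVV-derived DPM7 VIDs:
# the absolute voltages differ, so "+25mV stable" is not comparable
# across cards unless the baseline VID is stated as well.
cards = {"card_a": 1.250, "card_b": 1.243}  # factory DPM7 VIDs in volts
offset_v = 0.025

for name, vid in cards.items():
    print(f"{name}: {vid:.3f}V + 25mV = {vid + offset_v:.3f}V")
# card_a: 1.275V, card_b: 1.268V
```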


----------



## Asus11

had the r9 nano increased in performance since the first reviews etc?

cant lie.. I am tempted


----------



## NBrock

Quote:


> Originally Posted by *bluezone*
> 
> Yes I did a clean uninstall with DDU, plus soft ware removal and I tried my custom fan profile(which I prefer being aggressive on testing).
> I only get high temps when I try to overclock the HBM, So I doesn't seem likely its software related. But you never know.
> 
> This is a R9 Nano, perhaps I'm expecting too much from the stock cooling solution.
> 
> Power is set to +50%., +.10 Mv, +100 MHz on the clock and plus +50 Hrz on HBM. With a custom fan curve That ramps from 0 to 50% at 50C and then ramps up to 100% at 75 C.
> 
> That's what I had work up to when I stopped. the HBM overclock was the last thing I had changed.
> 
> PS very slight coil whine. pretty much unnoticeable from a couple feet away. My ears are getting old though.


Might be worth checking the TIM. I believe I saw that a person or two who switched to water blocks found poor TIM coverage on the HBM.


----------



## elmonen

Hi! Just bought a new graphics card, and just a little question. There is a plastic film on the backplate which I immediately took off (like it says on it), but there is also some plastic film on the red part (red part in the pic)..

http://img.clubic.com/0226000008105536-photo-radeonr9fury-9.jpg

Is it supposed to come off? I tried to take it off but it seems it's fitted around the screws there, so I wasn't sure.. maybe a stupid question but I just wanted to make sure.


----------



## huzzug

Yup, shouldn't be a problem if you took them off.


----------



## elmonen

Quote:


> Originally Posted by *huzzug*
> 
> Yup. shouldn't be a problem if you took them off.


I was more concerned about whether I can leave it on, as I currently still have it there. Or should I just rip it off?


----------



## huzzug

Better to take it off. Not sure how it'd react if it gets toasty.


----------



## Alastair

Spoiler: So here are my Furys. Some of you may have seen them earlier, closer to when I got them. But I decided to break out the camera and get some glory shots.












Excuse the scuffs and scratches, the case is three years old now.


----------



## p4inkill3r

Looks great.


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> Might be worth checking the TIM. I believe I saw someone that a person or two that switched to water blocks saw poor TIM coverage on the HBM.


Turns out it was the TIM. There was sooo much TIM it looked like they were trying to spackle a hole in the wall; the entire interposer was filled in to the metal ring surrounding the chip. Idle temps even dropped 5°C.

On a side note, after disassembly I've picked up coil whine. I need to figure out why that happened.

This particular Nano has a serial number in the mid 800s, so it appears to be an early card even though it's 2016.


----------



## NBrock

Quote:


> Originally Posted by *bluezone*
> 
> Turns out it was the TIM. There was sooo much TIM it looked like they were trying to spackle a hole in the wall; the entire interposer was filled in to the metal ring surrounding the chip. Idle temps even dropped 5°C.
> 
> On a side note, after disassembly I've picked up coil whine. I need to figure out why that happened.
> 
> This particular Nano has a serial number in the mid 800s, so it appears to be an early card even though it's 2016.


Well I'm glad you sorted out your temps. Some people here have said the coil whine will get better if you run the card on a very high FPS screen for a while to "burn" them in.


----------



## xTesla1856

How's everyone's performance with The Division? So far, I'm not very happy: no matter what settings I use, I can't get over 45fps. Gonna try the new hotfix driver to see if it helps.


----------



## p4inkill3r

Quote:


> Originally Posted by *xTesla1856*
> 
> How's everyone's performance with The Division? So far, I'm not very happy: no matter what settings I use, I can't get over 45fps. Gonna try the new hotfix driver to see if it helps.




Me @ 4k


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> Well I'm glad you sorted out your temps. Some people here have said the coil whine will get better if you run the card on a very high FPS screen for a while to "burn" them in.


That's just it though. I had almost zero coil whine before I replaced the TIM; now I've got the Flight of the Bumblebee playing very loudly when I try to game. I'm going to guess that maybe I slightly warped the circuit board when I tightened the cooler back down. I'm going to loosen the tension screws half a turn and see if that helps.


----------



## xTesla1856

The Division is a stuttery, flickering, choppy mess. I dunno if it's just terrible AMD optimization or my hardware. Really pissed ATM


----------



## SuperZan

Quote:


> Originally Posted by *xTesla1856*
> 
> The Division is a stuttery, flickering, choppy mess. I dunno if it's just terrible AMD optimization or my hardware. Really pissed ATM


I haven't heard much good performance-wise from anybody, red or green. If RS:S is anything to go by with regards to Ubisoft releases, it's going to take some post-release optimisation before the game plays acceptably for most.


----------



## xTesla1856

What do I do if all text in-game is blurry and fuzzy? My RivaTuner OSD is all fuzzy. Scaling and overscan are off.


----------



## p4inkill3r

Quote:


> Originally Posted by *xTesla1856*
> 
> What do I do if all text in-game is blurry and fuzzy? My RivaTuner OSD is all fuzzy. Scaling and overscan are off.


Are you playing on an Eyefinity setup? Try backing down to one monitor if so.


----------



## xTesla1856

Yes, I play with Eyefinity. This only started when I installed the new Hotfix driver today.


----------



## p4inkill3r

I'd try reverting or going to one monitor for the interim.


----------



## xTesla1856

Made a video, this is kinda ridiculous:


----------



## p4inkill3r

Quote:


> Originally Posted by *xTesla1856*
> 
> Made a video, this is kinda ridiculous:


Looks like classic Crossfire issues.


----------



## SuperZan

Yeah, you can mess around with triple-buffering and other settings to try to get a more stable Crossfire experience, but it may not work very well until an improved profile is released. Far Cry Primal got one around day-of, so hopefully the next driver/profile release won't be too long in coming.


----------



## Radox-0

Guru3D have an article up on GPU scores; it does seem Crossfire causes some issues. They tested on the Fury and saw the flickering you mentioned, Tesla.

Article here: http://www.guru3d.com/articles_pages/the_division_pc_graphics_performance_benchmark_review,8.html


----------



## Thoth420

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Alastair*
> 
> 
> 
> Spoiler: So here are my Fury's. Some of you may have seen them earlier closer to when I got them. But I decided to break out the camera and get some glory shots.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Excuse the scuffs and scratches, the case is three years old now.






The artwork on the PSU is awesome and ties it all together perfectly!


----------



## Nameless101

Quote:


> Originally Posted by *bluezone*
> 
> That's just it though. I had almost zero coil whine before I replaced the TIM; now I've got the Flight of the Bumblebee playing very loudly when I try to game. I'm going to guess that maybe I slightly warped the circuit board when I tightened the cooler back down. I'm going to loosen the tension screws half a turn and see if that helps.


I used my Nano for around two weeks before finally giving up and sending it back to Amazon today. For me at least the whine seemed to be getting worse, if anything, alternating between a lower buzz and a high squeak and everything in between. I didn't want to risk being stuck with a card like that... Now I just need to decide whether to try my luck with another Nano or go with the more expensive Fury X, which by all accounts is much less susceptible to coil whine.


----------



## bluezone

Quote:


> Originally Posted by *Nameless101*
> 
> I used my Nano for around two weeks before finally giving up and sending it back to Amazon today. For me at least the whine seemed to be getting worse, if anything, alternating between a lower buzz and a high squeak and everything in between. I didn't want to risk being stuck with a card like that... Now I just need to decide whether to try my luck with another Nano or go with the more expensive Fury X, which by all accounts is much less susceptible to coil whine.


Well, it won't be necessary to return the Nano. I backed off the screws and heat cycled the card, and now it's back to being its unobtrusive self.
To be more colourful: it suffered hive collapse syndrome and the bees are gone.


----------



## Radox-0

Quote:


> Originally Posted by *bluezone*
> 
> Well, it won't be necessary to return the Nano. I backed off the screws and heat cycled the card, and now it's back to being its unobtrusive self.
> To be more colourful: it suffered hive collapse syndrome and the bees are gone.


That's pretty interesting. So releasing the pressure reduced the whine. Nice.


----------



## gupsterg

@fat4l

Fitted the Fury X today; it was a mare to get where I wanted in my case, even though it's a pretty large case as they come. Yes, it's an old timer







SilverStone TJ06, but it has some mods to bring it up to date







.

Anyhow, a quick little result. Link:- left Fury X vs right Fury with 3840SP

Note it's a clock-for-clock compare







.

Both cards had the cooling profile mod so the GPU is maintained at <=55°C, plus the PL mod in ROM: TDC 297A / TDP 330W / MPDL 330W.

Will do some more compares over the next few days.

Besides handling the AIO being a pain, it distinctly sounds whinier: a) the coil noise, b) I think the pump (but this is a new / not early model), c) perhaps the single 120mm fan vs the three fans on the Tri-X!? (will have to look at RPM data).

Personally I don't think Fury Tri-X / Nitro owners are missing out on much on the cooling aspect vs the Fury X AIO, and with an unlock you're pretty much there on performance. Even the VRM temps seem equal or better on the Fury Tri-X in some quick compares I did (this is with the PL mod on both).

Do luv the RADEON logo for sure, looks great through a mesh panel.


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Well, it won't be necessary to return the Nano. I backed off the screws and heat cycled the card, and now it's back to being its unobtrusive self.
> To be more colourful: it suffered hive collapse syndrome and the bees are gone.


Glad to hear that!


----------



## Arizonian

Quote:


> Originally Posted by *p4inkill3r*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Me @ 4k


You inspired me @ 4k

Nitro Fury, 1150MHz core, stock memory; 4790 @ 4.6GHz



Running Ultra default at 4K

Min *31* Max *61* Avg *43.7*


----------



## xTesla1856

Any word on whether or not Crimson 16.3 is gonna add a proper CFX profile for The Division?


----------



## Nameless101

Quote:


> Originally Posted by *bluezone*
> 
> Well, it won't be necessary to return the Nano. I backed off the screws and heat cycled the card, and now it's back to being its unobtrusive self.
> To be more colourful: it suffered hive collapse syndrome and the bees are gone.


Looks like there would have been one last trick to try to fix my card! Good on you! Must keep this in mind when I get my new card.


----------



## p4inkill3r

Quote:


> Originally Posted by *Arizonian*
> 
> You inspired me @ 4k
> 
> Nitro Fury, 1150MHz core, stock memory; 4790 @ 4.6GHz
> 
> 
> 
> Running Ultra default at 4K
> 
> Min *31* Max *61* Avg *43.7*


I'm finding the game very playable at these values. Freesync makes a huge difference.


----------



## p4inkill3r

Quote:


> Originally Posted by *xTesla1856*
> 
> Any word on whether or not Crimson 16.3 is gonna add a proper CFX profile for The Division?


I'd be willing to bet that it will.


----------



## Alastair

Guys, with Firestrike Basic, are you seeing your scores go up at all? I'm talking just the graphics score here. At 1100MHz +0mv / 545MHz my score did not change a single bit over 1000MHz.


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> Guys, with Firestrike Basic, are you seeing your scores go up at all? I'm talking just the graphics score here. At 1100MHz +0mv / 545MHz my score did not change a single bit over 1000MHz.


I got major point increases by using Win7; Windows 10 did horribly for me in CF results. Not sure, but I wouldn't be surprised if Win8 suffers as well.

I am going to go back to dual boot myself; Win10 is wearing thin on my patience.


----------



## Alastair

Quote:


> Originally Posted by *dagget3450*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys, with Firestrike Basic, are you seeing your scores go up at all? I'm talking just the graphics score here. At 1100MHz +0mv / 545MHz my score did not change a single bit over 1000MHz.
> 
> 
> 
> I got major point increases by using Win7; Windows 10 did horribly for me in CF results. Not sure, but I wouldn't be surprised if Win8 suffers as well.
> 
> I am going to go back to dual boot myself; Win10 is wearing thin on my patience.
Click to expand...

I'm benching from my Win7 dual boot.


----------



## xTesla1856

I also reverted back to Win7 after I started getting nothing but trouble with SLI and Surround. So far, I'm very happy, also in regards to Win10 essentially being spyware disguised as an OS. For maximum safety, I recommend SpyBot anti-beacon for blocking all connections to MS servers. Also disabled/blocked the Win10 update.

I'm riding this Win7 wave until the very bitter end


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> I'm benching from my Win7 dual boot.


Sorry, I was going by your sig rig. I would say try an older driver? I had issues with Crimson and CF, but I think mine were more specifically to do with 2+ GPUs.


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> Guys, with Firestrike Basic, are you seeing your scores go up at all? I'm talking just the graphics score here. At 1100MHz +0mv / 545MHz my score did not change a single bit over 1000MHz.


Yes I am, on Fury Tri-X and Fury X. Here are Fury X results:- 1110 vs 1100 vs 1050

i5 rig in my sig but with the Fury X, Win 7 Pro x64, Crimson v16.2.1 (default settings in driver).

Also, I urge people to state the VID from an AIDA64 registers dump; see the first NBrock Fury X data in this post.

His Fury X :-

DPM7: GPUClock = 1050 MHz, VID = 1.23100 V

Mine :-

DPM7: GPUClock = 1050 MHz, VID = 1.25000 V

My result is from stock factory ROM.

My_Fury_X_dumps.zip 9k .zip file


As I said before, ASIC profiling is going on; all stock ROMs are EVV, thus VID is calculated from LeakageID and other GPU properties.

Also below some results from registers when upping GPU frequency in MSI AB.



Spoiler: Stock registers dump GPU Freq & VID per DPM



Code:





------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  512 MHz, VID = 0.92500 V
DPM2: GPUClock =  724 MHz, VID = 0.93700 V
DPM3: GPUClock =  892 MHz, VID = 1.05000 V
DPM4: GPUClock =  944 MHz, VID = 1.10600 V
DPM5: GPUClock =  984 MHz, VID = 1.15600 V
DPM6: GPUClock = 1018 MHz, VID = 1.20000 V
DPM7: GPUClock = 1050 MHz, VID = 1.25000 V







Spoiler: 1090MHz set by MSI AB



Code:





------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  512 MHz, VID = 0.92500 V
DPM2: GPUClock =  746 MHz, VID = 1.05000 V
DPM3: GPUClock =  919 MHz, VID = 1.10600 V
DPM4: GPUClock =  972 MHz, VID = 1.15600 V
DPM5: GPUClock = 1014 MHz, VID = 1.20000 V
DPM6: GPUClock = 1049 MHz, VID = 1.25000 V
DPM7: GPUClock = 1090 MHz, VID = 1.25000 V







Spoiler: 1110MHz set by MSI AB



Code:





------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  512 MHz, VID = 0.92500 V
DPM2: GPUClock =  760 MHz, VID = 1.05000 V
DPM3: GPUClock =  937 MHz, VID = 1.10600 V
DPM4: GPUClock =  991 MHz, VID = 1.20000 V
DPM5: GPUClock = 1033 MHz, VID = 1.25000 V
DPM6: GPUClock = 1069 MHz, VID = 1.25000 V
DPM7: GPUClock = 1110 MHz, VID = 1.25000 V





There is an advantage *plus disadvantages* to what occurs when we use MSI AB to set the DPM 7 GPU frequency and the lower states are automatically upped. When the card does drop to a lower state it uses a higher frequency (advantage), but as the VID has also increased (which may not technically be needed, a disadvantage), the "PowerTune" tech may elect to go further down the DPM list to select one with a lower VID = lower power usage to stay within the "PowerLimits", and *we see a bigger clock drop = performance / FPS drop (disadvantage).*

With editing the DPM 7 GPU frequency in ROM, the lower DPM VIDs & GPU frequencies are not affected







. Also with a ROM mod we can make the lower DPMs' GPU frequency higher but with a lower VID, which could mean that when "PowerTune" elects to select a lower-VID DPM it may not go down as far as with the way MSI AB raises VID alongside the GPU frequency increase.

Will be testing these aspects very soon







.

*Note:-* I added no voltage in ROM or MSI AB; the ROM was stock other than the cooling profile mod and the PL increase to 297A / 330W / 330W vs 270/300/300.
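To illustrate the drop described above, here's a rough model (my own sketch, not AMD's actual algorithm) of PowerTune stepping down from the top DPM to the first state with a lower VID, using the 1110MHz MSI AB table from the spoiler:

```python
# Hypothetical model of the DPM-drop behaviour described above:
# assume PowerTune steps down from the top DPM to the first state
# whose VID is strictly lower, so raising the lower DPMs' VIDs
# (as MSI AB does) forces a deeper clock drop than a ROM edit
# that leaves them untouched.
def first_lower_vid_dpm(dpms):
    """dpms: list of (clock_mhz, vid_v), index = DPM number."""
    top_vid = dpms[-1][1]
    for i in range(len(dpms) - 2, -1, -1):
        if dpms[i][1] < top_vid:
            return i, dpms[i]
    return 0, dpms[0]

# MSI AB @ 1110MHz: DPM5-7 all sit at 1.250V, so the first
# lower-VID state is DPM4, a drop from 1110MHz to 991MHz.
ab_1110 = [(300, 0.900), (512, 0.925), (760, 1.050), (937, 1.106),
           (991, 1.200), (1033, 1.250), (1069, 1.250), (1110, 1.250)]
print(first_lower_vid_dpm(ab_1110))  # (4, (991, 1.2))
```

With the ROM-edit approach the lower DPMs keep their stock VIDs, so the first lower-VID state stays at DPM 6 (1018MHz @ 1.200V in the stock table) and the clock drop is much smaller.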


----------



## Pintek

Asus is giving me the runaround, but are there any issues with the HBM on PCIe 2.0 x16?


----------



## Alastair

Quote:


> Originally Posted by *Pintek*
> 
> Asus is giving me the runaround, but are there any issues with the HBM on PCIe 2.0 x16?


Nope. Both my Furys run on PCIe 2.0 x16. Remember 2.0 @ x16 is equal to 3.0 @ x8, so really there won't be any hassles. I pull over 28000 as a graphics score in Firestrike Basic, which seems to be about what two cards get around here in Basic.
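The equivalence comes from the per-lane signalling rates and encodings: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s with 128b/130b, so 2.0 x16 and 3.0 x8 land within about 2% of each other. A quick back-of-envelope check:

```python
# Per-direction bandwidth per lane, in MB/s (1 MB = 1e6 bytes):
#   PCIe 2.0: 5 GT/s * 8/10 encoding overhead / 8 bits = 500 MB/s
#   PCIe 3.0: 8 GT/s * 128/130 encoding overhead / 8 bits ~= 984.6 MB/s
gen2_lane = 5e9 * (8 / 10) / 8 / 1e6
gen3_lane = 8e9 * (128 / 130) / 8 / 1e6

print(f"PCIe 2.0 x16: {gen2_lane * 16 / 1000:.2f} GB/s")  # 8.00 GB/s
print(f"PCIe 3.0 x8:  {gen3_lane * 8 / 1000:.2f} GB/s")   # 7.88 GB/s
```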


----------



## Pintek

OK thanks! I think I got an Asus rep that didn't know much beyond spec sheets.


----------



## bluezone

Quote:


> Originally Posted by *Radox-0*
> 
> That's pretty interesting. So releasing the pressure reduced the whine. Nice.


Yes, much better now. Don't forget I accidentally caused my coil whine during a TIM replacement; it would be good if this solution can help anyone else.


> Originally Posted by *Radox-0*
> 
> That's pretty interesting. So releasing the pressure reduced the whine. Nice.


Now my power supply sings solo. lol
Quote:


> Originally Posted by *Nameless101*
> 
> Looks like there would have been one last trick to try to fix my card! Good on you! Must keep this in mind when I get my new card.


There was a warranty sticker over one of the screws, so this definitely voided any returns or RMA. Not that I had that option with no sales slip.


----------



## Alastair

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Radox-0*
> 
> That's pretty interesting. So releasing the pressure reduced the whine. Nice.
> 
> 
> 
> Yes, much better now. Don't forget I accidentally caused my coil whine during a TIM replacement; it would be good if this solution can help anyone else.
> Quote:
> 
> 
> 
> Originally Posted by *Radox-0*
> 
> That's pretty interesting. So releasing the pressure reduced the whine. Nice.
> 
> Click to expand...
> 
> Now my power supply sings solo. lol
> Quote:
> 
> 
> 
> Originally Posted by *Nameless101*
> 
> Looks like there would have been one last trick to try to fix my card! Good on you! Must keep this in mind when I get my new card.
> 
> Click to expand...
> 
> There was a warranty sticker over one of the screws, so this definitely voided any returns or RMA. Not that I had that option with no sales slip.
Click to expand...

A lot of people still seem to be able to RMA even with the sticker removed. I mean, we need to be able to change TIM because these guys make such a damn mess with it.


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> A lot of people still seem to be able to RMA even with the sticker removed. I mean, we need to be able to change TIM because these guys make such a damn mess with it.


Well, that's good to know.









Anyone else waiting for the Crimson 16.3 to drop today?


----------



## Alastair

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> A lot of people still seem to be able to RMA even with the sticker removed. I mean, we need to be able to change TIM because these guys make such a damn mess with it.
> 
> 
> 
> Well that's good to know,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else waiting for the Crimson 16.3 to drop today?
Click to expand...

Is it dropping today?


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> is it dropping today?


It's out just now.

http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.aspx

I see they have added Vulkan support. Now hopefully no more having to double-install drivers for smoother gameplay in RotTR.


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> It's out just now.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.aspx
> 
> I see they have added Vulkan support. Now hopefully no more having to double-install drivers for smoother gameplay in RotTR.


Yay! Hitman support! I can't wait... I even have a backup copy purchased on the Xbone just in case the PC release is a broken pile.









Also, I know a lot of you guys have been waiting for this:
*Core clocks may not maintain sustained clock speeds resulting in choppy performance and or screen corruption (Fixed with new Power Efficiency feature toggled to off)*


----------



## xTesla1856

Sadly no fix for The Division. Apparently, GameWorks was implemented just before launch, which might explain why the beta ran flawlessly but the launch version is a nightmare mess.


----------



## dagget3450

Quote:


> Originally Posted by *Thoth420*
> 
> Yay! Hitman support! I can't wait....even have a backup copy purchased on the Xbone just in case the PC release is a broken pile.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I know a lot of you guys have been waiting for this:
> *Core clocks may not maintain sustained clock speeds resulting in choppy performance and or screen corruption (Fixed with new Power Efficiency feature toggled to off)*


If this is true I will $%^&$%&$#%^^#%$^#%^$#%@^%$#@^$%#@^@$#%^%#^%$#@^[email protected]%^$%#^@$%^@# x infinity

PLEASE FOR THE LOVE OF GOD LET THIS BE TRUE *clicks download*


----------



## Noirgheos

Quote:


> Originally Posted by *dagget3450*
> 
> if this is true i will $%^&$%&$#%^^#%$^#%^$#%@^%$#@^$%#@^@$#%^%#^%$#@^[email protected]%^$%#^@$%^@# x infinty
> 
> PLEASE FOR THE LOVE OF GOD LET THIS BE TRUE *clicks download*


IT IS IT IS

New issue though: games don't pop up and the scan doesn't work, so add them manually. Please report this, all of you.


----------



## SuperZan

Quote:


> Originally Posted by *xTesla1856*
> 
> Sadly no fix for The Division. Apparently, GameWorks was implemented just before launch, might explain why the Beta ran flawlessly but the launch version is a nightmare mess.


That is almost certainly the issue. Introducing gimpworks into the mix inevitably increases driver optimisation time for AMD.


----------



## bluezone

These are my new favorite drivers.

http://www.3dmark.com/3dm11/11040821

Wish I could afford a water block and loop right now.

For reference.

http://www.3dmark.com/3dm11/10372801


----------



## buildzoid

HBM overclocking with Trixx causes a crash for me on the new drivers. Does it also do that for anyone else?


----------



## Thoth420

Quote:


> Originally Posted by *Noirgheos*
> 
> IT IS IT IS
> 
> New issue though, games don't pop up, and scan doesn't work. Add them manually. All of you guys report this please.


Le sigh... I had that bug in 16.1 and it was gone in 16.2. Oh well.


----------



## bluezone

Quote:


> Originally Posted by *buildzoid*
> 
> HBM overclocking with Trixx causes a crash for me on the new drivers. Does it also do that for anyone else?


I'm fine overclocking with TriXX, but I've lost VSR.


----------



## Thoth420

Quote:


> Originally Posted by *buildzoid*
> 
> HBM overclocking with Trixx causes a crash for me on the new drivers. Does it also do that for anyone else?


Did you install the new driver with it still installed? Perhaps running? Those programs are often set to start with Windows by default, which can be problematic... I always dump my OC software and profiles before a driver swap.


----------



## buildzoid

I never save any profiles; it crashes when I try to change the HBM clock after opening TriXX manually.

It might be a problem with the custom BIOS I'm running.


----------



## Spock121

The Fury X is just such a cute card.


----------



## SuperZan

Adorable! And looking great in the build.


----------



## Semel

Quote:


> Originally Posted by *bluezone*
> 
> These are my new favorite drivers.
> 
> http://www.3dmark.com/3dm11/11040821
> 
> wish I could afford a water block and loop right now.
> 
> For reference.
> 
> http://www.3dmark.com/3dm11/10372801


Wow, this is 3K more than my Fury 3840 at 1120/560.









PS: I didn't see any major increase in benchmarks with the new driver. Games, however, run smoother.


----------



## 98uk

-


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> These are my new favorite drivers.
> 
> http://www.3dmark.com/3dm11/11040821
> 
> wish I could afford a water block and loop right now.
> 
> For reference.
> 
> http://www.3dmark.com/3dm11/10372801


1150MHz on air? Nice









What are your settings like? Power +50%? Any voltage increase from MSI?


----------



## gupsterg

So what do you guys think about my OCs using stock VID?

Fury Tri-X unlocked to 3840SP, 1090/525, VID 1.250V.

Fury X 1095MHz, VID 1.250V, not yet upped the RAM.

All had the same 55°C cooling profile plus the 297A/330W/330W PowerLimit in ROM. VDDC is lower as I'm stating VID; LLC is stock. Stock coolers and TIM.

I'm finding higher OCs are game / bench stable but not F@H stable; I get "bad state" and "max retries" for a work unit. So the above OCs are like 8-9hr F@H passes.

I have a 3rd card to test as well, a Fury X.

I'm finding the Fury X stock AIO fan has a whine; it's not the pump. Anyone else had this? Hoping the 2nd Fury X does not. All cards were purchased new only a week or so ago.


----------



## Mumak

I can see slightly improved performance with Crimson 16.3 and the new "Power Efficiency" option switched off when running Einstein@Home BRP tasks.


----------



## lordymosh

Hi,

I built a PC about 2 months ago. I have a BitFenix Prodigy mini-ITX case, a 650W PSU, an i5-4690K and a Sapphire R9 390. I was originally planning on getting an R9 Nano but thought it cost too much; the price dropped a few weeks later, but I'd already built my PC. :/

Would it be worth it to sell my 390 and get the Nano instead? I currently have a 1080p monitor but will be upgrading to 1440p in the next year. I had also liked the thought of a custom water loop for the GPU down the line, but EK doesn't have a full block for Sapphire's R9 390. Alphacool does, but I'd rather have EK.

Unfortunately I've already activated my warranty and saw that Sapphire doesn't allow warranties to be transferred; I did e-mail them about it today. Also, I'm not sure how much money I could get for the 390.

Any suggestions? Is it worth the hassle to change to an R9 Nano?


----------



## Agent Smith1984

Quote:


> Originally Posted by *lordymosh*
> 
> Hi,
> 
> I recently built a pc about 2 months ago now. I have a bitfenix prodigy mini-itx case, 650 W psu, i5-4690k and a sapphire r9 390. I was originally planning on getting an R9 Nano but thought it cost too much. It had the price drop a few weeks later but already had my pc built. :/
> 
> Would it be worth it to sell my 390 and get the nano instead? I currently have a 1080p monitor but will be upgrading to 1440p in the next year. I had also liked the thought of a cystom water loop for the gpu down the line but Ek doesn't have a full block for sapphire's r9 390. Alphacool do but would rather Ek.
> 
> Unfortunately I've already activated my warranty and saw that sapphire doesn't allow warranties to be transferred. I did e-mail them about it today. Also I'm not sure how much money I could get for the 390.
> 
> Any suggestions? Is it worth the hassle to change to a r9 nano?


If getting 3-5 more FPS matters to you, then it's worth it... if not, just do some overclocking on the 390 and call it a day until the next gen. I actually went from a Fury BACK TO a 390X because of the VRAM limitations I was experiencing at 4K resolution.

http://www.tweaktown.com/reviews/7335/amd-radeon-r9-nano-video-card-review-fury-dead/index7.html


----------



## gupsterg

Quote:


> Originally Posted by *lordymosh*
> 
> Would it be worth it to sell my 390 and get the nano instead? I currently have a 1080p monitor but will be upgrading to 1440p in the next year.


*My mini review*

I can't share real experience of the Nano as I've not had one, but one thing I'm noting from others' posts is that on air it throttles more than the Fury or Fury X. So if going WC the Nano is a good buy, as currently in the UK you can get one for ~£350 vs £470 for a Fury X.

Next, the Nano PCB has a 4-phase rear VRM vs 6 on the Fury & Fury X. I'd count that as a plus for the Fury & Fury X, as a) the load is shared over more components, so they run cooler, and b) when you OC you have a better VRM to cope with it. Again, if going WC on the Nano, the cooler VRM will cope with higher loading. I would ask @buildzoid as he has good insight on these things.

I have owned 4 Hawaii cards since last year: Tri-X 290 STD (SOLD), Asus DCUII 290X STD (SOLD), Vapor-X 290X STD (HAVE) and Tri-X 290 OC (HAVE). Even with heavy bios mods + OC I don't believe they can match a Fury or Fury X when those are OC'd as well.

I originally bought a Fury Tri-X and Fury X just to do Fiji bios mod.

Here are some 3DM FS full run scores, so 1080P, gives some indication of performance.

Link:- Fury X 1090/500 (4096SP) vs Fury Tri-X 1090/500 (3840SP) vs Fury Tri-X "out of box" (3584SP) vs Vapor-X 290X 1100/1525 (2816SP)

Notes:-

i) SP = Stream processors
ii) All Fury / X benches *without RAM OC*
iii) the Vapor-X 290X is very very close to MSI 390X Gaming (1100/1525 out of box)
iv) the Vapor-X has heavy bios mod going on the Fury / Fury X don't.
v) all cards stock coolers / TIM
vi) all tested in same i5 4690K rig as in my sig

The Vapor-X 290X has the coolest-running VRM of all the Hawaii cards I've had, <60C on air. The GPU fan profile would need a little modifying to cope with 1.3V (VID) plus 1100/1525 and still maintain <=75C. The Tri-X & Vapor-X stock coolers are well regarded, *but* the Fury Tri-X is quieter even when the card is OC'd. With a room ambient of 18-24C the Fury Tri-X can easily maintain 55C on the GPU with, again, a bit of a fan profile mod; *never on air did I experience that on Hawaii*, and I doubt Grenada (390/X) can either. Comparing the Vapor-X 290X's 10-phase rear VRM to the 6 phases on the Fury Tri-X, I'm seeing <=60C, which in my books is crazy cool. If you compare the 6-phase VRM on the Tri-X 290 with the Fury Tri-X, the 290 would reach 85C+.

Also, I must state I am using the stock 1.25V VID on the Fury Tri-X & Fury X; they may OC further with the same 1.3V VID I gave the Vapor-X 290X.

Next time the Fury Tri-X is in my rig I'll lock it down to 3584SP and OC it so that can be compared. Taking all aspects into consideration I regard the Fury as the best bang for buck at the mo; with the "unlock" lotto you could end up with a Fury X for the price of a Fury, or, like I did, something pretty much the same with 3840SP.

I have not had time to run FRAPS tests on games; due to bios mod investigations I've not had a chance to game much, but when I have, I believe FPS is better and gameplay smoother.

From hanging around on the Sapphire forum I have noted mods highlight that if an owner can prove the purchase date, the warranty is valid regardless of whether the invoice is in their name. Fire that at support and they should be able to verify, or PM VaporX, the Sapphire hardware rep on OCN.


----------



## fat4l

Nice finds.
To be honest, I don't really see the power of the Fury X.

I'm kind of disappointed... I'm leaning towards the Nano more and more.


----------



## gupsterg

As you have great WC and experience, fat4l, I'd say 2x Nanos would be a cracking buy for you.

Also read this post, the section for buildzoid.

Power Efficiency has been hidden and defaulted to On in all past Crimsons; I think that's why people have been seeing better benches with CCC 15.7.1 (IIRC).

I'd also bet that if I lower the PL back to stock in the ROM with PE off, the GPU clock in Heaven is gonna be a flat line.

You see, when I was upping the PL to get Heaven to stabilise on drivers below 16.3 (which had PE hidden, default = On), the "PowerTune" tech got more aggressive about downclocking the GPU. The only way to make the clock line flatter (still not as good as 16.3 with PE = Off) was to find a PL value that was just right, so that "PowerTune" did not intervene but the card got as close as possible to the PL it needed for the OC.

Another thing: you may recall the headache on Hawaii with, say, YouTube running in Firefox with hardware acceleration on, where clocks would rise to max OC/DPM7? At first on Fiji all was fine (i.e. a flat 300MHz) and I thought, great, they've nailed it on the new card. *But* when you disable PE in 16.3, guess what? The bouncing clocks return (not getting to DPM 7, though).

Also, all those little things on Hawaii where the clocks bounced when we opened windows / the GPU came under slight load at the desktop: they also return with PE = Off. Still not as bad as Hawaii, *but* it seems excessive when with PE = On it's a flat 300MHz.
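For anyone on Linux wanting to watch this DPM-state bouncing directly, the amdgpu driver exposes the sclk DPM table via sysfs (`/sys/class/drm/card0/device/pp_dpm_sclk`), with a `*` marking the active state. A minimal parsing sketch; the sample table below is illustrative, not a real Fiji state table:

```python
import re

# Illustrative pp_dpm_sclk-style output (values are made up, not a real
# Fiji table); the '*' marks the currently active DPM state.
SAMPLE = """\
0: 300Mhz *
1: 516Mhz
2: 608Mhz
3: 700Mhz
4: 810Mhz
5: 900Mhz
6: 1000Mhz
7: 1050Mhz
"""

def parse_dpm(text):
    """Return ({state_index: MHz}, active_state_index) from pp_dpm_sclk-style text."""
    states, active = {}, None
    for line in text.strip().splitlines():
        m = re.match(r"(\d+):\s*(\d+)\s*Mhz(\s*\*)?", line, re.IGNORECASE)
        if not m:
            continue
        idx, mhz = int(m.group(1)), int(m.group(2))
        states[idx] = mhz
        if m.group(3):
            active = idx
    return states, active

states, active = parse_dpm(SAMPLE)
print(active, states[active])  # -> 0 300
```

Polling this file in a loop (or watching the clock graph in HWiNFO/GPU-Z on Windows, as in this thread) shows the same "bounce" between DPM 0 and higher states under light desktop load.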


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> 1150mhz on air? nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what are your settings like? Power +50%? any voltage increase from msi?


That was just a benchmarking run (TriXX); too warm for gaming on air. That's why I wished for a water block. 1150 core, +25 mV VDDC, 550 MC and a 30% power limit.
For 24/7 it's starting to look like 1100-1125 core, -25 to 0 mV GPU voltage and a 35-40% power limit. That is with the fan ramping up to 78% and a 70-75C running temp, and likely no memory overclock, because despite everything I've read there is very little to be gained with my card; it's running about 200 points lower in 3DMark 11.
My HD 7950s were the same way, and they were Sapphires too. My GTX 560 Ti and GTX 650 Ti both liked memory overclocks.

I've got to do one final curing run on the TIM before settling where it will be set.


----------



## gupsterg

If your GPU is at 70-75C then, going by info I've read, the RAM will not OC so well due to temps.

I have seen some scaling with RAM clocks, Link:- Fury Tri-X (3840SP unlock) 1090/500 vs 1090/525

Not a huge increase, but it's there consistently; the RAM just has so much bandwidth.

I reckon tighter RAM timings (if ever possible) would also show very little benefit, again due to the bandwidth.
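The bandwidth point is easy to quantify: Fiji's HBM sits on a 4096-bit bus at double data rate, so peak bandwidth is bus width in bytes times the effective transfer rate. A quick back-of-envelope calc:

```python
def hbm_bandwidth_gbs(mem_clock_mhz, bus_width_bits=4096):
    """Peak HBM bandwidth in GB/s: bus width in bytes x DDR effective rate."""
    bytes_per_transfer = bus_width_bits // 8          # 4096-bit bus -> 512 bytes
    effective_rate_gtps = mem_clock_mhz * 2 / 1000.0  # DDR: 2 transfers per clock
    return bytes_per_transfer * effective_rate_gtps

print(hbm_bandwidth_gbs(500))                  # stock Fiji HBM -> 512.0 GB/s
print(round(hbm_bandwidth_gbs(525), 1))        # the 525MHz OC  -> 537.6 GB/s
```

So the 500 to 525 bump only adds ~25GB/s on top of an already huge 512GB/s, which fits with the small scaling seen above.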


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> As you have great WC and experience fat4l , I'd say 2x Nanos be cracking buy for you.
> 
> Also read this post the section for buildzoid.
> 
> Due to Power Efficiency being hidden and default = On in all past Crimsons I think that's why people been seeing better benches with CCC 15.7.1 (IIRC).
> 
> I also bet you if I lower PL back to stock in ROM with PE off Heaven gonna be flat line for GPU clock.
> 
> You see when I was upping PL to get Heaven to stabilise with drivers below 16.3 as they had PE hidden (default=On) the "PowerTune" tech got more aggressive to downclock GPU. Only way to make it flatter (not as good as 16.3 with PE=Off) was to find a point of PL which was just right so "PowerTune" did not intervene but was enough to allow card to get as close to what PL it needed for OC.
> 
> Another thing you may recall the headache on Hawaii with say "YouTube" run in FireFox with hardware acceleration on where clocks would rise to max OC/DPM7? Well first on Fiji all was fine (ie 300MHz flat) and I thought great new card they've nailed it. *But* when you disable PE in 16.3, guess what? bouncing clocks return
> 
> 
> 
> 
> 
> 
> 
> (not getting to DPM 7 though).
> 
> Also all those little things in Hawaii where when we opened windows / GPU became under slight load at desktop and clocks bounced, well they also return with PE = Off
> 
> 
> 
> 
> 
> 
> 
> , still not as bad as Hawaii *but* seems excessive when with PE = On it's flat 300MHz.


Does this new feature in 16.3, PE, work for 290X cards too? I remember seeing 300 series and Fury only (patch notes).


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> If your GPU is at 70-75C going by info I read the RAM will not OC so well due to temps.
> 
> I have seen some scaling with RAM clocks, Link:- Fury Tri-X (3840SP unlock) 1090/500 vs 1090/525
> 
> Not huge deal of increase but there consistently, the RAM just has so much bandwidth.
> 
> I reckon tighter RAM timings (if ever possible) would also show very little benefit, again due to the bandwidth.


Very good info to know.

If you reread my post you will notice I said with my card, not IMO. The Idea was that I was on the train of thought about temps. This is a Nano on air. I'm trying to avoid putting energy into the system faster than I can remove it. Well there are gains with OC'n Ram, my observed benefits were not worth it. It comes down to heat. I defiantly should of been clearer about the 200 point difference in 3D Mark 11. This being in reference to the difference between the hi OC ( with 550 MC) run vs. 1100 Mhz at -25 VDDC no ram overclock. The difference is not the scores (negligible), but the heat output I was seeing.


----------



## Semel

*Gupsterg*

I think it's fine as it is, I mean in how PE on/off is done.

You enable it only when gaming; it's practically the same as performance mode in the Nvidia control panel.


----------



## gupsterg

@fat4l

Dunno, will check.

@bluezone

My post was just for info. I did say it's not a huge increase, as you stated in your post above my earlier one, so we're in agreement. I'd do the same as you if I had a Nano / the same situation.

TBH even with the Fury Tri-X or Fury X I'm upping the GPU first; if gaining RAM clock means I've got to lower the GPU clock, I'd take the GPU clock, as it probably gives more gain. And like you I'm checking voltage scaling, etc., for what I end up with overall in terms of temps / power draw, etc.

@Semel

Maybe I'm lazy, or maybe I just don't want a SW setting for each situation. This is also the reason I did the bios mod on Hawaii and now Fiji: I'd just rather not be setting up SW for each scenario.

My thinking is, how about an "Auto" setting? For example, if you're at the desktop, efficiency is on; under heavy GPU load at the desktop / gaming, it defaults to off. Perhaps even being able to set profiles which, once set, work as is, instead of repeated on/off.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> My post was just as info, I did say it's not a huge increase like you stated in your post above my earlier one, as agreement to your way of thinking. I'd do the same as you are if had Nano / same situation.
> 
> TBH even with Fury Tri-X or Fury X I'm upping GPU first to see if it then means to gain RAM I gotta lower GPU clock and as GPU clock is gonna give probably more gain I'd take that over RAM. And also like you am checking voltage scaling, etc for what I then end up with on whole as temps / power draw / etc, etc.


Sorry, my bad. I should have used emojis to express my mood.

Thanks to our conversation you got me thinking about alternate heat paths.

I have seen FLIR images of the backside of the Nano; they show heat concentrated around the VRMs. So I ghetto-mounted (zip tie and TIM) a small heat sink and immediately saw a 15C lower temp in that area. Operating temp dropped 2C as well. I have a Raven 3 case with an inverted board, so very good air flow.

Right now I'm waiting to hear back from someone on Kijiji about a very old aftermarket GPU cooler, just to see if more air flow across the back of the card will improve things even more. I don't mind trying weird things if I think they will help.


----------



## gupsterg

I have a Silverstone TJ06, also an inverted ATX case. I have always luv'd it since buying it donkeys ago; had some real good times with what it has housed over the many, many years I've owned it.

Last year I modded in 2x 140mm fans / 2x 92mm @ the HDD cage / 2x SSD mounts. I did a mesh side panel mod donkeys ago when IIRC I had a GTX 280 or 285 (it needed more air); the reason I never went lower with it was to keep the air flow going front to back with the front / HSF / rear fans, using the side panel as a barrier.



Spoiler: Stock case






Never used the plastic tunnel ever with the case.





Spoiler: My Vapor- X 290X inside it.










Spoiler: Fury X through the mesh





Currently been too busy to give it an air dusting session!


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> I have Silverstone TJ06 also an inverted ATX case
> 
> 
> 
> 
> 
> 
> 
> . I have always luv'd it since buying it donkeys ago
> 
> 
> 
> 
> 
> 
> 
> . Had some real good times with what it has housed over the many many years I've owned
> 
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Last year modded 2x 140mm fans / 2x 92mm @ HDD cage / 2x SSD mounts. I did a mesh side panel mod donkeys ago when IIRC I had a GTX 280 or 285 (it need more air) why I never went lower with it was to keep sort of air flow going front to back with front / HSF / rear fans using the side panel as barrier.
> 
> 
> 
> Spoiler: Stock case
> 
> 
> 
> 
> 
> 
> Never used the plastic tunnel ever with the case.
> 
> 
> 
> 
> 
> Spoiler: My Vapor- X 290X inside it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Fury X through the mesh
> 
> 
> 
> 
> 
> Currently been too busy to give it an air dusting session!


That is a very nice looking build. You must have a small hurricane going on in your case.

I should have said that my case is a vertical ATX: panel at the top, air flow bottom to top.


----------



## 98uk

Does anyone know how the new "power efficiency" option in 16.3 is affecting overclocks or performance on Fury?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> That is a very nice looking build.


Cheers.
Quote:


> Originally Posted by *bluezone*
> 
> You must have a small hurricane going in in your case.


At times.

The TY-143s are really great fans IMO: quiet during desktop use / gaming but with the headroom for stress testing. It was @doyll that pointed me to them, via his post comparing them to Noctua. IMO a bargain at £7 on Amazon.co.uk when compared to similarly specced PWM fans.
Quote:


> Originally Posted by *bluezone*
> 
> I should of said that my case is a vertical ATX. Panel at the top. Air flow bottom to top.


Seen some great builds in them as well.
Quote:


> Originally Posted by *98uk*
> 
> Does anyone know how the new "power efficiency" option in 16.3 is affecting overclocks or performance on Fury?


Not tested whether I gain more of an OC, or checked performance stats, but for the same card setup via ROM, the GPU clock results in Heaven are below.



Spoiler: PE : On









Spoiler: PE : Off







HML files in attached zip.

Power_eff_check.zip 10k .zip file


*Note:-* With PE = Off you will see more clock bounce at low loads vs PE = On.


----------



## JunkaDK

Just got my second Asus R9 Fury STRIX yesterday.

Haven't tested it much yet, but got a Fire Strike Extreme score of 14641 with my 2 cards, which was better than 99%.

This was with an 1100MHz core clock and a 545MHz mem clock. More testing to come.


----------



## NBrock

It seems so far with the Power Efficiency setting turned off and the new drivers that Folding at Home performance is a bit better.


----------



## Pintek

Have to thank everyone for giving so much helpful information; at least somewhere on the Internet people know what they're doing! AMD and Asus email support are being vague and unhelpful x_x


----------



## fat4l

Can anyone post a screenshot of that new "power" feature?
@gupsterg can't see it on the 290X


----------



## NBrock

Quote:


> Originally Posted by *fat4l*
> 
> Cna anyone post the screenshot from that new "power" feature ?
> @gupsterg cant see it on 290X


That's because it isn't for 290x. I am pretty sure just 390, 390x, and Fury series.


----------



## fat4l

Quote:


> Originally Posted by *NBrock*
> 
> That's because it isn't for 290x. I am pretty sure just 390, 390x, and Fury series.


Exactly... but there's no difference between the 290X and 390X.
Again, it's AMD's brain...


----------



## gupsterg

Looks like the 2xx series is not getting it, mate.

Saw a poster in the 285 bios mod thread flash his card to a 380 and then get the option.

I also concur with what NBrock said; it seems 390/X and Fury/X + Nano peeps get the PE option.

Open the panel, select Gaming > Global Graphics, and the option is there for the above cards.

I'm guessing peeps with a 290/X could perhaps flash to 390/X and get it, but as there is no 395X2 you're out of luck, I guess.

TBH all it does, like I said in my earlier post, is unlock PowerTune so it does not:-

a) force the idle clock to stick to DPM 0
b) throttle DPM 7 in scenarios where it does when PE = On

Link to post

We 290/X & 295X2 owners were not getting the same throttling as the Fury/X + Nano guys.


----------



## NBrock

To be fair, we don't know for sure yet whether it's coming or not... this is just a beta release. While it does suck that they haven't (or may not) release it, they do have a reason (it may not seem like a good one, but it is a business after all): it's another "feature" to help differentiate the 200 series from the 300 series.

I can't speak for the 300 cards, but I didn't have the problems with Crimson on my 290X, 290 and 295X2 that I have with my Fury X due to the "power features". The only issue I had with the 295X2 was that I couldn't turn off CrossFire.


----------



## Flamingo

ROTR got the DX12 update; the DX12 numbers are looking meh:

R9 Nano @ 1080p (all max, no HBAO, FXAA)
DX11 63.50fps
DX12 52.31fps

R9 Nano @ 1080p (all max, no HBAO, SSAA 4x)
DX11 35.93fps
DX12 28.38fps
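For what it's worth, those figures work out to roughly a 17-21% regression going DX11 to DX12 (just the arithmetic on the numbers above, nothing new):

```python
def pct_change(dx11_fps, dx12_fps):
    """Percentage FPS change going DX11 -> DX12 (negative = regression)."""
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

# R9 Nano @ 1080p figures from the post above
print(round(pct_change(63.50, 52.31), 1))  # FXAA case    -> -17.6
print(round(pct_change(35.93, 28.38), 1))  # SSAA 4x case -> -21.0
```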


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> Looks like 2xx series not getting it mate
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Saw a poster in 285 bios mod thread flash his card to 380 and then get option.
> 
> Also concur with what NBrock said, seem 390/X and Fury/X + Nano peeps get PE option.
> 
> Open panel select Gaming > Global Graphics and option is there for above cards.
> 
> I'm guessing peeps with 290/X could flash to 390/X and get it perhaps, but as there is no 395X2 your out of luck I guess.
> 
> TBH all it does like I said in my earlier post is unlock PowerTune to not:-
> 
> a) effect idle clock to stick to DPM 0
> b) DPM 7 is not throttled for scenarios where it is when PE = ON
> 
> Link to post
> 
> We 290/X & 295X2 were not getting the same throttling as Fury/X + Nano guys.


Thanks! I can still use ClockBlocker, so...


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> That's because it isn't for 290x. I am pretty sure just 390, 390x, and Fury series.


For me it's not available on the Nano. I believe they list it for the Fury X only. Anyone with an AIB Fury seeing any sign of the "power feature"?


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> For me it's not available on the Nano. I believe they list it for the Fury x only. Anyone with a Fury AIB with any sign of the "power feature".


Sapphire Nitro Fury owner here, I get the option in Radeon Settings under the "Games" tab.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers.
> 
> At times.
> 
> The TY-143 are really great fans IMO, quiet when desktop use / gaming but got the headroom for stress testing. It was @doyll that pointed me to them, his post comparing them to Noctua. IMO a bargain at £7 on Amazon.co.uk when compared to like spec of PWM fans.
> 
> Seen some great builds in them as well.
> 
> Not tested if gain more of an OC, or checked performance stats, but for same setup of card via ROM result for GPU clock in Heaven below.


Well, my ghetto mod to remove heat worked better than expected. If I wanted, I could run 1150 24/7 at a 30% power limit at 70-71C. The only problems are poor contact and zip-ties everywhere. I think I'll pass on leaving the extra cooler installed.


----------



## bluezone

Rise of the Tomb Raider has now added DX12 features. I'm updating now on Steam.

Dev notes


__
https://140859222830%2Fdev-blog-bringing-directx-12-to-rise-of-the-tomb


----------



## Kana-Maru

I'm going to be posting my DX12 results from Hitman: actual in-game benchmarks, not the built-in benchmarking tool. I guess I need to go ahead and get Tomb Raider as well since it has been updated to DX12. I'm running the Fury X @ stock, by the way.


----------



## dagget3450

Can't wait to see how this all pans out, with Vulkan and DirectX 12 finally showing up.


----------



## Thoth420

Quote:


> Originally Posted by *Kana-Maru*
> 
> I'm going to be posting my DX12 results from Hitman. Actual in-game benchmarks [not the built in benchmarking tool]. I guess I need to go ahead and get Tomb Raider as well since it has been updated to DX12. I'm running the Fury X @ stock by the way.


Very interested to see your results.

My rig is down for a pump/res replacement and maybe a monoblock too. I've been playing Hitman on my Xbone... couldn't manage to connect today, but last night I had a blast.


----------



## Semel

RotTR seems to have a "special" DX12 Gimpworks Edition, performance-wise.

Hitman performs quite well, but DX12 mode is buggy (just read the Steam forums).


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> Well my ghetto mod to remove heat worked better than expected. If I wanted I could 1150 24/7 at 30% power limit at 70-71C. Only problems are- not very good contact and zip-ties every where. I think I'll pass on leaving the extra cooler installed.


Ahhh, any pics? I love "ghetto" mods at times.

@Fiji owners

HBM Memory voltage control


----------



## Alastair

Have we managed to sort out the issue of Furys losing performance at high volts yet?


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> Rise of the Tomb Raider has now added DX 12 features. I'm updating now on Steam.
> 
> Dev notes
> 
> 
> __
> https://140859222830%2Fdev-blog-bringing-directx-12-to-rise-of-the-tomb


Crimson 16.3 keeps my frames capped at 60 under DX12 mode, DX11 is fine. Had to roll back to 16.2.1 to fix the issue.


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> Have we managed to sort out the issue with Fury's loosing performance with high volts yet?


Posted my current thoughts here.


----------



## Ragsters

I am seriously considering purchasing one of these bad boyz for my new rig, but I have a concern: do you guys know if the card will work with Korean monitors, specifically the Achieva Shimian?


----------



## buildzoid

Quote:


> Originally Posted by *Alastair*
> 
> Have we managed to sort out the issue with Fury's loosing performance with high volts yet?


I've managed to get a Fire Strike score over 18.5K with an 1190MHz core clock using a custom BIOS. Before that BIOS, 1190MHz would score much lower due to needing a ton of core voltage to work. However, you can't reliably go over 1.3V with that BIOS, so volt mods seem to be the only way forward once you max out the BIOS.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Have we managed to sort out the issue with Fury's loosing performance with high volts yet?
> 
> 
> 
> Posted my current thoughts here.
Click to expand...

That sounds great. Now two questions:

1. Is it good for a standard Fury (Fiji Pro)?
2. Have voltages beyond 1.3V with the +25 been tried? I am just curious how much more performance we can get with maybe 1.35V or even 1.4V.


----------



## buildzoid

Quote:


> Originally Posted by *Alastair*
> 
> That sounds great. Now two questions.
> 
> 1. Good for a standard Fury (Fiji Pro)
> 2. Have voltages beyond 1.3 with +25 been tried. I am just curious how much more performance we can get with maybe 1.35v or maybe even 1.4v.


1.4V will get all 3 of my Fury Xs to 1200MHz+ core clocks. However, that was using software voltage, and the end result was lower FPS than at 1125MHz with no extra voltage.


----------



## fat4l

Quote:


> Originally Posted by *buildzoid*
> 
> 1.4V will get all 3 of my FuryXs into 1200mhz+ core clocks. However that was using software voltage and the end result was FPS lower than 1125mhz with no voltage.


Why is it behaving like that?


----------



## buildzoid

Quote:


> Originally Posted by *fat4l*
> 
> Why is it behaving like that ??


Because AMD's power management is broken.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Because AMD's power management is broken.


I concur; without modding anything I can do the same, sadly. Are you on the stock AIO or waterblocks?


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> I concur without modding anything i can do the same sadly. Your on stock aio or waterblocks?


Stock AIO for now. I might get some higher-RPM radiator fans, but I'm not going to do a full custom loop since it's far too expensive and most of the gain is from the larger radiators, not the water blocks themselves. I might try hooking the stock Fury X waterblock up to larger radiators, though.


----------



## Kana-Maru

Quote:


> Originally Posted by *Thoth420*
> 
> Very interested to see your results.
> 
> 
> 
> 
> 
> 
> 
> 
> My rig is down for a pump/res replacement and maybe a monoblock too. I been playing Hitman on my Xbone...can't manage to connect today but last night I had a blast.


I completely forgot to post a link last night. Here is the link to my Hitman Fury X DX12 benchmarks.

*Hitman DX12 Fury X Benchmarks*
http://www.overclock-and-game.com/news/pc-gaming/42-hitman-directx-12-fury-x-benchmarks

I will be updating the article today with more info.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Stock AIO for now. I might get some higher RPM radiator fans but I'm not gonna do a custom loop since it's far too expensive and most of the gain is from the larger radiators not the water blocks themselves. I might try hook the stock Fury X waterblocks into larger radiators but I'm not going to do a full custom loop.


Yeah, I am rather sure even people using full waterblocks are having the same issues. Someone in here has done a hardware mod and I think it's doing the same to some degree.


----------



## Semel

Quote:


> Originally Posted by *buildzoid*
> 
> However you can't reliably go over 1.3V with that BIOS .


What about actually getting 1.3V? I can only set 1300 in the bios, and most of the time I get only 1.25V, sometimes a tiny bit more. Setting more than 1.3V to achieve a real 1.3V leads to a BSOD.

Have you found a workaround for this or something?


----------



## gupsterg

"I can only set 1300 in bios and most of the time I get only 1.25V"

I have added a FAQ section to the Fiji bios mod thread to cover this aspect.

Please understand you are setting the VID in the bios; what you see in MSI AB / TriXX is VDDC, and the two differ due to:-

a) LLC

b) how varying apps create different loads; the PowerTune tech will vary the VDDC.
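As a toy illustration of why the reading sits below the set VID (this is not AMD's actual regulator behaviour; the loadline resistance, load current and LLC percentage below are all made-up numbers): the VRM drops voltage roughly in proportion to load current, and LLC compensates away part of that droop.

```python
def vddc_under_load(vid, i_load_amps, r_loadline_mohm, llc_pct=0.0):
    """Toy loadline model: VDDC = VID - I_load * R_effective.

    llc_pct is how much of the loadline LLC compensates away;
    100% would hold VDDC at the VID regardless of load.
    All parameter values used below are illustrative, not measured.
    """
    r_eff_ohm = r_loadline_mohm * (1.0 - llc_pct / 100.0) / 1000.0
    return vid - i_load_amps * r_eff_ohm

# e.g. a 1.250V VID, 200A load and 0.25 mOhm loadline with no LLC compensation
print(round(vddc_under_load(1.250, 200, 0.25), 3))  # -> 1.2
```

This is also why the gap between set VID and read VDDC varies from app to app: different loads draw different current, so the droop changes.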


----------



## Semel

You already explained that; it's not what I was asking him.
Quote:


> you can't *reliably go over 1.3V* with that BIOS .


This sentence somewhat presumes that his card can go up to 1.3V via the bios. And if that is so, I was just wondering what he did to achieve it (a workaround?).

It could just be "poor" wording that confused me.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Ahhh, any pics
> 
> 
> 
> 
> 
> 
> 
> , love "ghetto" mods at times
> 
> 
> 
> 
> 
> 
> 
> .
> 
> @Fiji owners
> 
> HBM Memory voltage control


Sorry, for some reason I cannot get the pic to upload.

Okay, fixed.

I'm just about to remove it. Don't blame me if you hurt yourself falling down laughing.


----------



## nyk20z3

Picked up an open-box Asus Nano from Micro Center today; now I can put my EK block on and let the games begin.


----------



## SuperZan

Nice grab; I do love the look of the Nanos under water. Lots of power with a small physical footprint.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> You already explained it. This is not what I was asking him.


Oops.
Quote:


> Originally Posted by *Semel*
> 
> this sentence somewhat presumes that his card can go up to 1.3V via bios..And if it is so I was just wondering what he did to achieve that.(workaround?)
> 
> It could be just "poor" wording that made me confused.


Ahhh, ok.
Quote:


> Originally Posted by *bluezone*
> 
> Don't blame me if you hurt yourself falling down laughing.


Not at all; always good to see another's mods to perhaps gain ideas.


----------



## NBrock

Soooo... I took apart my Fury X to swap the TIM; now I get odd artifacts on my screen in games and occasionally in Windows. I have done this on many other graphics cards without issues. I pulled it apart again to double-check everything... it all looks good as far as I can tell, but the same thing happens.


----------



## buildzoid

Quote:


> Originally Posted by *NBrock*
> 
> Soooo.....I took apart my Fury X to swap the TIM...now I get odd artifacts on my screen in games and occasionally in windows. I have don't this on many other graphics cards without issues. I pulled it apart again to double check everything....it all looks good as far as I can tell but the same thing happens.


You probably did something to upset the interposer. It's a bunch of exposed 65nm wires, so even something as slight as touching it is enough to potentially damage it.


----------



## ht_addict

Quote:


> Originally Posted by *NBrock*
> 
> Soooo... I took apart my Fury X to swap the TIM; now I get odd artifacts on my screen in games and occasionally in Windows. I have done this on many other graphics cards without issue. I pulled it apart again to double-check everything; it all looks good as far as I can tell, but the same thing happens.


Can you post a picture?


----------



## NBrock

That's exposed? It had TIM on it from the factory, since it was applied sloppily. When I cleaned it off, it looked like it had something over it, kind of like glass. So how are people not killing their cards when they put water blocks on them?


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> That's exposed? It had TIM on it from the factory, since it was applied sloppily. When I cleaned it off, it looked like it had something over it, kind of like glass. So how are people not killing their cards when they put water blocks on them?


Yes, it's exposed. You have to be very, very careful with it. I didn't even try to remove all of my TIM because of this. Keep away from the interposer.

Maybe we need a sticky on this point, just so others don't mess up their cards over this.


----------



## NBrock

Well, eff me. I guess I probably messed it up then. I thought I was being super careful, but I guess it wasn't enough. I have never killed hardware before; this really sucks. Guess I won't be gaming for a few months.

I seriously doubt there is any point in trying to get warranty on this since it was my own stupid fault. This has been a really ****ty week. Anyway, thanks for the info.


----------



## bluezone

Quote:


> Originally Posted by *NBrock*
> 
> Well, eff me. I guess I probably messed it up then. I thought I was being super careful, but I guess it wasn't enough. I have never killed hardware before; this really sucks. Guess I won't be gaming for a few months.
> 
> I seriously doubt there is any point in trying to get warranty on this since it was my own stupid fault. This has been a really ****ty week. Anyway, thanks for the info.


Don't give up yet. I've had a few problems since Crimson 16.3; try rolling back your driver and see if you still have the problem.

Besides, you couldn't have messed up as badly as the guy at a tech site who cleaned it all up with a screwdriver. He lost his job.

Edited because it's late and I cannot type worth a damn when I'm tired.


----------



## buildzoid

Quote:


> Originally Posted by *NBrock*
> 
> Well, eff me. I guess I probably messed it up then. I thought I was being super careful, but I guess it wasn't enough. I have never killed hardware before; this really sucks. Guess I won't be gaming for a few months.
> 
> I seriously doubt there is any point in trying to get warranty on this since it was my own stupid fault. This has been a really ****ty week. Anyway, thanks for the info.


The manufacturer will probably replace it for you if you file an RMA claim. However, you might have a guilty conscience over it.


----------



## Alastair

Quote:


> Originally Posted by *NBrock*
> 
> That's exposed? It had TIM on it from the factory, since it was applied sloppily. When I cleaned it off, it looked like it had something over it, kind of like glass. So how are people not killing their cards when they put water blocks on them?




A pic of my cards' chips before I water-blocked them. Let me tell you, I was exceedingly careful when putting them together.

So the exposed interposer is that lovely-looking orange die around the edge of the core and the HBM.


----------



## huzzug

Quote:


> Originally Posted by *NBrock*
> 
> Well, eff me. I guess I probably messed it up then. I thought I was being super careful, but I guess it wasn't enough. I have never killed hardware before; this really sucks. Guess I won't be gaming for a few months.
> 
> I seriously doubt there is any point in trying to get warranty on this since it was my own stupid fault. This has been a really ****ty week. Anyway, thanks for the info.


Put it in the box after reassembling the cooler, claim an RMA, and keep your tongue behind your teeth. Ethically it's wrong, but it'll never feel more right than this.


----------



## Alastair

Quote:


> Originally Posted by *huzzug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NBrock*
> 
> Well, eff me. I guess I probably messed it up then. I thought I was being super careful, but I guess it wasn't enough. I have never killed hardware before; this really sucks. Guess I won't be gaming for a few months.
> 
> I seriously doubt there is any point in trying to get warranty on this since it was my own stupid fault. This has been a really ****ty week. Anyway, thanks for the info.
> 
> 
> 
> Put it in the box after assembling the cooler and claim RMA and keep your tongue behind the teeth. Ethically, it's wrong, but it'll never feel more right than this.
Click to expand...

I agree. I know it is ethically wrong, but how many "ETHICAL" corporations are there anyway? You give an inch and they take a mile.

I won't claim an RMA against a retailer if it's my fault, though. My local shops are good people and I wouldn't want to screw them over.


----------



## fat4l

Quote:


> Originally Posted by *buildzoid*
> 
> Because AMD's power management is broken.


What in particular? Is it not getting enough amps, or is it limited by TDP, or something else?


----------



## buildzoid

Quote:


> Originally Posted by *fat4l*
> 
> What in particular? Is it not getting enough amps, or is it limited by TDP, or something else?


It seems that the GPU just micro-throttles more and more as you give it more voltage. This effect is completely unaffected by power or current limits (I tried a 65,000 A current limit and a 65,000 W power limit), and it does not occur as long as you don't use software voltage control like Trixx or AB offer. BIOS mods work just fine, but Windows will BSOD if you set a VID greater than 1.3V. Hard mods should also work, because whatever power management crap is causing the FPS loss will not actually know about the change in voltage and so shouldn't kick in.
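For anyone who wants to put a rough number on that micro-throttling from their own monitoring logs, here's a small illustrative sketch (just a demonstration, not an official tool; the `throttle_loss` helper and the sample clock values are made up):

```python
# Estimate how much performance micro-throttling costs by comparing the
# average sampled core clock against the clock you actually set.
# The sample values below are invented for demonstration only.

def throttle_loss(samples_mhz, target_mhz):
    """Return the fraction of clock speed lost to throttling dips."""
    avg = sum(samples_mhz) / len(samples_mhz)
    return 1.0 - avg / target_mhz

# Hypothetical once-per-second samples from a monitoring log:
samples = [1100, 1100, 300, 1100, 1100, 300, 1100, 1100]
loss = throttle_loss(samples, 1100)
print(f"average clock: {sum(samples) / len(samples):.0f} MHz, "
      f"~{loss:.0%} lost to throttling")
```

A bad micro-throttle shows up exactly like this: short clock dips that barely register in peak-clock readouts but drag the average (and your FPS) down noticeably.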


----------



## Kriant

Got to say... those DX12 benchmarks make me wonder whether I should return to the red camp and expunge the Titans from my rig while I still can, lol.


----------



## huzzug

At most it would be a sidegrade. I'd wait just a few months until new cards drop and MS pulls its head out of the rear to give proper DX12 support to games on Win10. Besides, Nvidia is not gonna let AMD run away with DX12.


----------



## fat4l

Quote:


> Originally Posted by *buildzoid*
> 
> It seems that the GPU just micro-throttles more and more as you give it more voltage. This effect is completely unaffected by power or current limits (I tried a 65,000 A current limit and a 65,000 W power limit), and it does not occur as long as you don't use software voltage control like Trixx or AB offer. BIOS mods work just fine, but Windows will BSOD if you set a VID greater than 1.3V. Hard mods should also work, because whatever power management crap is causing the FPS loss will not actually know about the change in voltage and so shouldn't kick in.


Ah, that's sad. Let's see what gupsterg brings with his BIOS mods...

Anyway, +rep


----------



## ht_addict

Quote:


> Originally Posted by *Alastair*
> 
> 
> 
> A pic of my cards chip before I waterblocked them. Let me tell you. I was exceedingly careful when putting them together
> 
> So the exposed Interposer is that lovely looking orange die around the edge of the core and the HBM.



So what's the best way to clean the old TIM off?


----------



## Alastair

Quote:


> Originally Posted by *ht_addict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> 
> 
> A pic of my cards chip before I waterblocked them. Let me tell you. I was exceedingly careful when putting them together
> 
> So the exposed Interposer is that lovely looking orange die around the edge of the core and the HBM.
> 
> Click to expand...
> 
> So what's the best way to clean the old TIM off?
Click to expand...

Very gently. I used a toothpick to pull off the plastic protection that covered the interposer. Because Sapphire's TIM application is appalling, TIM had gotten squeezed underneath this layer of protection, so I peeled it off carefully, using a toothpick to lift the corners. Once the plastic was off, I just cleaned everything up gently with a cloth.

PLEASE: DO NOT CONFUSE THE INTERPOSER'S SHINY SURFACE WITH THE PLASTIC PROTECTION THAT SAPPHIRE USES. CHECK AND DOUBLE-CHECK BEFORE YOU TRY REMOVING ANYTHING FROM AROUND THE INTERPOSER AREA.


----------



## Otterfluff

I used cotton Q-tips to clean off my die and interposer, and for the fine stuff I used the edges of a paper towel borrowed from the kitchen. Very gently took off a little at a time.


----------



## SuperZan

Quote:


> Originally Posted by *Otterfluff*
> 
> I used cotton Q-tips to clean off my die and interposer, and for the fine stuff I used the edges of a paper towel borrowed from the kitchen. Very gently took off a little at a time.


+1 on the paper edges. I used paper-towel corners and corner bits of cardstock/construction paper to get negligently applied product out from against the edges of raised surfaces. My card is still in good shape, so I suppose I was careful enough.


----------



## bluezone

Polaris and VR announcement today at 6:30 (EST).


----------



## Jflisk

Are you considering the interposer to be that thin layer of orange? Is it like a membrane? I am thinking about water-blocking my Furys. I wonder if just 90% alcohol, a toothpick, and a Q-tip will get it clean.


----------



## looncraz

Quote:


> Originally Posted by *Jflisk*
> 
> Are you considering the interposer to be that thin layer of orange? Is it like a membrane? I am thinking about water-blocking my Furys. I wonder if just 90% alcohol, a toothpick, and a Q-tip will get it clean.


Don't clean the interposer at all. Soak the die in 89~91 octane for 30 seconds or so to soften the goop, and wipe the tops of the dies only. Next, put some more 89~91% 2-propanol/DHMO on the remaining goop and use compressed CO2 to push the goop away from the interposer. Any leftover goop that doesn't interfere with the mounting of new hardware should be left in place. Be careful not to run the equipment with any DHMO remaining; that stuff is nasty!


----------



## nyk20z3

For you guys that have water-cooled a Nano: what type of screwdriver did you use to remove the four tiny screws on the PCI bracket covering the HDMI and DisplayPorts? I have a tiny set and managed to get one screw out, but the rest are refusing, as I am not able to get enough torque on the tiny driver to get them out.


----------



## xTesla1856

It's finally happening guys!

http://videocardz.com/58547/amd-launches-radeon-pro-duo


----------



## Pintek

How many 8-pin power connectors is that using? o.O Almost looks like it's using a full 24-pin motherboard connector.


----------



## SuperZan

Quote:


> Originally Posted by *Pintek*
> 
> How many 8-pin power connectors is that using? o.O Almost looks like it's using a full 24-pin motherboard connector.


 




At least it's all standardised this time, as opposed to the 295X2's requirements. Normal 8-pins, just... more! (Three, I believe.)


----------



## Arizonian

^^^that's my fav^^^


----------



## Butthurt Beluga

Hey guys, I'm looking to buy a Radeon R9 Fury GPU.
The one I was looking at specifically was the GIGABYTE R9 Fury card on Newegg, but I've never known Gigabyte to be a well-known AMD GPU vendor, this one is priced very low relative to other Fury vendors, and I noticed it has very few reviews, both on Newegg and in general.
Any suggestions?


----------



## dagget3450

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Hey guys, I'm looking to buy a Radeon R9 Fury GPU.
> The one I was looking at specifically was the GIGABYTE R9 Fury card on newegg
> But I've never known Gigabyte to be a well-known AMD GPU vendor, and this one is priced very low relative to other Fury vendors, and I noticed it had very few reviews on both Newegg and just in general.
> Any suggestions?


If I am not mistaken, the Fury is for the most part the same PCB. I know Gigabyte has had some Hawaii cards that were voltage-locked; given the limited control currently with Fiji, it's probably not a big issue if that's the case. Outside of that, I have had a few Gigabyte GPUs, the latest being a 290, and it was fine (it was reference, so not much to say there).

On a side note, as a few others have mentioned, we could use some Fiji submissions for the green-vs-red thread. Not many Fiji runs are showing up there; it would be nice to catch up, as we're falling behind again.
http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


----------



## SuperZan

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Hey guys, I'm looking to buy a Radeon R9 Fury GPU.
> The one I was looking at specifically was the GIGABYTE R9 Fury card on newegg
> But I've never known Gigabyte to be a well-known AMD GPU vendor, and this one is priced very low relative to other Fury vendors, and I noticed it had very few reviews on both Newegg and just in general.
> Any suggestions?


I had a Giga 7970 Windforce that was just lovely, but I haven't heard a whole lot about the Gigabyte Furies. I suppose this is probably a "no news is good news" type of scenario. If you can get your hands on the Nitro, I've heard nothing but glowing reviews of that card; @Arizonian could tell you much more about it.

Otherwise the more-or-less reference designs seem to do well; my XFX model didn't have any unlockable cores despite a perfect two-per-row all-right-side lockout, but it clocks to a playable 1150/545. I can bench it at 1175, but stability in gaming won't hold.


----------



## looncraz

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Hey guys, I'm looking to buy a Radeon R9 Fury GPU.
> The one I was looking at specifically was the GIGABYTE R9 Fury card on newegg
> But I've never known Gigabyte to be a well-known AMD GPU vendor, and this one is priced very low relative to other Fury vendors, and I noticed it had very few reviews on both Newegg and just in general.
> Any suggestions?


If I were in the market for one, that'd actually be my top choice. I have Gigabyte's WF3 R9 290 and it is, by far, the best video card I've owned. The cooler is just amazing.

And Gigabyte's choice of connectors is impeccable... AMD doing away with dual DVI by default was just stupid. If Polaris GPUs don't have it by default, AMD could lose a great deal of potential sales.


----------



## Kana-Maru

^ DVI is old tech, and 90% of the companies I work with have been pushing away from VGA and DVI for literally YEARS now. I'm going to miss VGA and DVI, though. DisplayPort is just so much better, especially when installing equipment. Do you know how frustrating it can get adding and removing VGA/DVI cables? It feels like carpal tunnel after a while. Twist, twist, twist... even worse when one gets jammed and you have to get a tool.

None of the latest laptops and desktops I've installed in the past year come with DVI or VGA. DisplayPort dongles came with the computers for the older tech [VGA/DVI]. Intel and AMD have clearly stopped supporting VGA; DVI "probably" has another year or two in the commercial field. Outside of businesses, the tech is moving on. Some newer TVs are missing DVI and VGA.

HDMI and DP are the future. Come join us. I prefer DP.


----------



## Butthurt Beluga

Quote:


> Originally Posted by *dagget3450*
> 
> If I am not mistaken, the Fury is for the most part the same PCB. I know Gigabyte has had some Hawaii cards that were voltage-locked; given the limited control currently with Fiji, it's probably not a big issue if that's the case. Outside of that, I have had a few Gigabyte GPUs, the latest being a 290, and it was fine (it was reference, so not much to say there).
> 
> On a side note, as a few others have mentioned, we could use some Fiji submissions for the green-vs-red thread. Not many Fiji runs are showing up there; it would be nice to catch up, as we're falling behind again.
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


Quote:


> Originally Posted by *SuperZan*
> 
> I had a Giga 7970 Windforce that was just lovely, but I haven't heard a whole lot about the Gigabyte Furies. I suppose this is probably a "no news is good news" type of scenario. If you can get your hands on the Nitro, I've heard nothing but glowing reviews of that card; @Arizonian could tell you much more about it.
> 
> Otherwise the more-or-less reference designs seem to do well; my XFX model didn't have any unlockable cores despite a perfect two-per-row all-right-side lockout, but it clocks to a playable 1150/545. I can bench it at 1175, but stability in gaming won't hold.


Quote:


> Originally Posted by *looncraz*
> 
> If I were in the market for one, that'd actually be my top choice. I have Gigabyte's WF3 R9 290 and it is, by far, the best video card I've owned. The cooler is just amazing.
> 
> And Gigabyte's choice of connectors is impeccable... AMD doing away with the dual-DVI by default was just stupid. If Polaris GPUs don't have them by default, AMD could lose a great deal of potential sales.


Thank you guys, I really appreciate the help.
Going to be buying the R9 Fury tomorrow.


----------



## looncraz

Quote:


> Originally Posted by *Kana-Maru*
> 
> ^ DVI is old tech and 90% of the companies I work with have been pushing away from VGA & DVI for literally YEARS now. I'm going to miss VGA and DVI though. Display Port is just so much better. Especially when installing equipment. Do you know how frustrating it can get adding and removing VGA\DVI cables? Feels like carpal tunnel after awhile. Twist twist twist........even worse when it gets jammed and you have to get a tool.
> 
> New computers no longer come with a DVI or VGA from the latest laptops and desktops I've installed the past year. Display Port dongles came with the computers for older tech [VGA\DVI]. Intel and AMD has clearly stopped supporting VGA. DVI "probably" has another year or two in the commercial field. Outside of businesses the tech is moving on. Some newer TVs are missing DVI and VGA.
> 
> HDMI and DP are the future. Come join us. I prefer DP.


The problem with that mentality is that 90%+ of all monitors are NOT going to be replaced for years. They are among the longest-lived components a computer has... and there is no downside to using or providing support for DVI-D; it's the same signal tech as HDMI. In addition, the majority of new monitors are still DVI-only.

I have two high-quality DL-DVI-D monitors. If AMD does not give me a video card with at least one DL-DVI-D and one HDMI I can use to power the other, I will jump ship and buy nVidia in one second flat.

One of ATi's big draws was unfaltering backwards compatibility; they even created breakout boxes to keep it going (I probably still have one or two in my closet). If I have to buy a $100 adapter to use my 144Hz 1080p monitor with AMD, but don't with nVidia... guess who wins the price war?

AMD can't afford to throw away customers.


----------



## Kana-Maru

Quote:


> Originally Posted by *looncraz*
> 
> The problem with that mentality is that 90%+ of all monitors are NOT going to be replaced for years. They are one of the longest-lived components a computer has... and there is no downside to using or providing support for DVI-D - it's the same signal tech as for HDMI. In addition, the majority of new monitors are also only DVI.
> 
> I have two high quality DL-DVI-D monitors. If AMD does not give me a video card with at least one DL-DVI-D and one HDMI I can use to power my other, I will jump ship and buy nVidia in one second flat.
> 
> One of ATi's big draws was unfaltering backwards compatibility - they even created breakout boxes to keep it going (I probably still have one or two in my closet). If I have to buy a $100 adapter to use my 144Hz 1080p monitor with AMD, but don't with nVidia... guess who wins the price war?
> 
> AMD can't afford to throw away customers.


Hey man, I understand where you are coming from. I'm just letting you know that the tech has moved on from mid-1980s and late-1990s technology, so you can take that up with Intel and AMD. I've known this for at least four years now, which is why my monitor supports HDMI, DisplayPort, and dual-link DVI-D as well. No VGA, though, but there are super-cheap adapters if I really needed one. I still have an old VGA monitor lying around somewhere. Anyway, I understand that monitors have a much longer lifespan as well.

So if you choose to go with Nvidia, that's your call; it's your money. If you do decide to switch to Nvidia, be careful: I hear their drivers are killing people's OSes at the moment, and some people are claiming dead GPUs from some horrible recent drivers. Hopefully you don't run into that IF you have to switch companies. If they win the price war and you switch, congratulations.


----------



## bborokee

Hey guys,

I'm looking to buy a new GPU to replace my R9 380, since it really can't keep up with the latest titles without making compromises.
I know that Polaris will be released soon, but I don't plan on upgrading to any other card until Vega is released with HBM2.

Therefore, I'm in the market for a Fury. However, as I was browsing through a website, I saw that the XFX Fury was being sold for about 100 euros less than cards from other vendors such as Gigabyte, Sapphire, etc.
If I'm correct, except for the Sapphire Nitro, all the Fury cards use the reference PCB from AMD, right? If that's the case, shouldn't their performance be nearly identical, since they're using the same parts?

Has anyone had experience with the XFX variant? Should I just go the safe route with Sapphire cards?

Thanks in advance,


----------



## antonis21

Quote:


> Originally Posted by *bborokee*
> 
> Hey guys,
> 
> I'm looking to buy a new GPU to replace my R9 380, since it really can't keep up with the latest titles without making compromises.
> I know that Polaris will be released soon, but I don't plan on upgrading to any other card until Vega is released with HBM2.
> 
> Therefore, I'm in the market for a Fury. However, as I was browsing through a website, I saw that the XFX Fury was being sold for about 100 euros less than cards from other vendors such as Gigabyte, Sapphire, etc.
> If I'm correct, except for the Sapphire Nitro, all the Fury cards use the reference PCB from AMD, right? If that's the case, shouldn't their performance be nearly identical, since they're using the same parts?
> 
> Has anyone had experience with the XFX variant? Should I just go the safe route with Sapphire cards?
> 
> Thanks in advance,


Also, the ASUS Fury Strix is a custom PCB, like Sapphire's Nitro. I don't know about the XFX Fury, but if I were you I would go for the Sapphire Nitro edition.


----------



## xTesla1856

More information and pictures of the Pro Duo:

http://videocardz.com/58547/amd-launches-radeon-pro-duo


----------



## Semel

They want $1,500 for this card? Thanks, but no thanks. It's much cheaper to get two cards in CrossFire and get higher performance (judging by their Fire Strike numbers).


----------



## Jflisk

I was going to say I could sell one of my cards for $500 or so and still be out another $1,000. At $1,200, maybe; at $1,500 they are on something.

Think I will hold out for the next gen at this point.


----------



## dagget3450

To be honest, I am tired of having four GPUs; I have been wanting to make a build with dual GPUs on one PCB to save space. But it's already a tough sell due to 4GB of VRAM being limiting, unless they come up with something to merge memory pools soon.


----------



## SuperZan

Quote:


> Originally Posted by *bborokee*
> 
> Hey guys,
> 
> I'm looking to buy a new GPU to replace my R9 380, since it really can't keep up with the latest titles without making compromises.
> I know that Polaris will be released soon, but I don't plan on upgrading to any other card until Vega is released with HBM2.
> 
> Therefore, I'm in the market for a Fury. However, as I was browsing through a website, I saw that the XFX Fury was being sold for about 100 euros less than cards from other vendors such as Gigabyte, Sapphire, etc.
> If I'm correct, except for the Sapphire Nitro, all the Fury cards use the reference PCB from AMD, right? If that's the case, shouldn't their performance be nearly identical, since they're using the same parts?
> 
> Has anyone had experience with the XFX variant? Should I just go the safe route with Sapphire cards?
> 
> Thanks in advance,


I've had a great experience with the XFX model. Holds 1150/545 in gaming without issue for me and the cooler is excellent for an air solution. I really haven't got anything bad to say about it.


----------



## bborokee

Quote:


> Originally Posted by *SuperZan*
> 
> I've had a great experience with the XFX model. Holds 1150/545 in gaming without issue for me and the cooler is excellent for an air solution. I really haven't got anything bad to say about it.


Alright-o, thanks for your input!

My mind is telling me that their performance should be about equal, since they use the same components, but my body is telling me to go with the Sapphire or Asus since they're more expensive (and don't we all know that more expensive = better! lol).

I might as well just save the extra 100 euros and put that towards more RAM or another SSD...

thanks again


----------



## gupsterg

Ref PCB

Sapphire Fury Tri-X STD or OC
XFX Fury Triple Dissipation

Custom PCB

Asus Fury Strix
Gigabyte Fury Windforce OC
Sapphire Fury Nitro

AFAIK currently all Fury Xs use the same reference PCB/AIO, but it seems there have been three revisions of the water pump.
Quote:


> We removed nine screws and opened up our Fury X to see what version of the pump we got since we were getting excessive pump noise. Evidently, there are three pump versions - (1) the version the reviewers got with the Cooler Master sticker, (2) another pump version with an embossed Cooler Master logo, and (3) this one with the plainly embossed Cooler Master logo.


Quote link

Personally I'm liking the Sapphire Fury Tri-X over the Sapphire Fury X (x2) I own; it's much easier to handle/house the air-cooled card than the AIO. My Fury unlocked to 3840 SPs, and in the quick tests I did it is very close or equal to the Fury X.


----------



## Agent Smith1984

Hmmm, so AMD convinced me to pay $200 over a 390 for ~10% more performance (for the record, I sold off the Fury and went to a 390X @ $360 after rebate, with Hitman and a cool mouse)...
Now they want $1,500 for the Pro Duo??? They should have stuck $1,199 on it and called it a day (and even that's a hard sell, in my opinion).
This card will be discontinued before it ever gains any traction in the market... they waited this long to launch it, only to turn around and give us Polaris in a few more months.

Am I missing something? I have seen first-hand the limitations that 4GB can still pose, so unless DX12 is not only going to be used in most new titles but also implement dual-RAM pooling, I see this one as a miss...


----------



## bborokee

Quote:


> Originally Posted by *gupsterg*
> 
> Ref PCB
> 
> Sapphire Fury Tri-X STD or OC
> XFX Fury Triple Dissipation
> 
> Custom PCB
> 
> Asus Fury Strix
> Gigabyte Fury Windforce OC
> Sapphire Fury Nitro
> 
> AFAIK currently all Fury Xs use the same reference PCB/AIO, but it seems there have been three revisions of the water pump.
> Quote link
> 
> Personally I'm liking the Sapphire Fury Tri-X over the Sapphire Fury X (x2) I own; it's much easier to handle/house the air-cooled card than the AIO. My Fury unlocked to 3840 SPs, and in the quick tests I did it is very close or equal to the Fury X.


Do you think there will be any performance edge on Ref PCB vs Custom PCB?


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmmm, so AMD convinced me to pay $200 over a 390 for ~10% more performance (for the record, I sold off the Fury and went to a 390X @ $360 after rebate, with Hitman and a cool mouse)...
> Now they want $1,500 for the Pro Duo??? They should have stuck $1,199 on it and called it a day (and even that's a hard sell, in my opinion).
> This card will be discontinued before it ever gains any traction in the market... they waited this long to launch it, only to turn around and give us Polaris in a few more months.
> 
> Am I missing something? I have seen first-hand the limitations that 4GB can still pose, so unless DX12 is not only going to be used in most new titles but also implement dual-RAM pooling, I see this one as a miss...


First, the Polaris cards that show up will be mid-range; second, it appears that they will use 4GB of HBM again on those? Like someone else said, they wouldn't launch it this late if they thought it would be eliminated so fast (not looking at Nvidia's Pascal, but at their own offerings). /speculation

I do agree that 4GB at 4K is an issue; yet now they are pushing VR? Too much stuff pulling in different directions.


----------



## Laquel

Did a fan mod on my nano

Result:
Runs cooler and quieter than stock fan -> win$$


----------



## SuperZan

Quote:


> Originally Posted by *Laquel*
> 
> Did a fan mod on my nano
> 
> Result:
> Runs cooler and quieter than stock fan -> win$$


Ha! Neat mod.


----------



## bluezone

Quote:


> Originally Posted by *Laquel*
> 
> Did a fan mod on my nano
> 
> Result:
> Runs cooler and quieter than stock fan -> win$$

I had wondered if a better fan would help. What fan is that? Does it come with the correct plug?


----------



## gupsterg

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Am I missing something? I have seen first-hand the limitations that 4GB can still pose, so unless DX12 is not only going to be used in most new titles but also implement dual-RAM pooling, I see this one as a miss...


GTA V was what you mentioned in the 390 owners' thread as being limited by 4GB @ 4K.

I can't see it (Bit-Tech Fury Tri-X OC review). I'm a very new Fiji owner, so I'm looking for guidance; if any other Fiji owner can help me understand the 4GB limitations, I'm all ears.

My post is aimed at my being educated by info you or another can share, and in no way am I saying you're wrong.
Quote:


> Originally Posted by *bborokee*
> 
> Do you think there will be any performance edge on Ref PCB vs Custom PCB?


Ref PCB Fury and Fury X are the same, 6 phases on the rear; the Nano is 4 phases. Personally I think the stock Fury/X VRM is great for us normal overclockers (I have no extreme OC experience). With a little fan profile mod you'll see a good decrease in VRM temps for very little gain in noise. Consequently the GPU and HBM will also be cooler; I have set the fan profile in the ROM so the GPU is maintained at 55C.

The air cooler on the Fury Tri-X is on a par with the AIO Fury X; IIRC from my own logs, VRM temps are actually better on the Fury Tri-X vs the Fury X (by a few degrees). TBH when I compare VRM temps on my Sapphire Tri-X 290 OC (ref PCB, 6 phase), they're lower on the Fury Tri-X by ~20C IIRC. Early on I'd be checking if the fans were running on the Fury Tri-X (they do stop at idle). Both the Fury Xs I have seem to have a little fan whine at low RPM (they do not stop at idle), one more than the other; I prefer the Tri-X cooler over the AIO.

The ref PCB can seem to have more coil whine under load (and depending on load), but it's not that bad IMO; the cards are so silent that I think that's why it can be more noticeable (in a silent room; otherwise I doubt you'd know). I have a Sapphire Vapor-X 290X with Black Diamond chokes (i.e. coils); they are pretty quiet, but with my ear to the case I can hear them. I'd think the Sapphire Fury Nitro will be quieter for coil whine vs the Fury Tri-X.

I also owned an Asus DCUII 290X at one point; the solid core chokes were silent, IIRC even with my ear to the case I couldn't hear them. Thus I would assume the Strix would be the quietest regarding whine. The other thing about the Strix is it has 10 phases on the rear and 2 up front (ref PCB is 6+1 IIRC). As the load is shared over more components, I'd reckon it would run cooler than the Sapphire Tri-X / Nitro; perhaps a Strix owner will share data with us all. For extreme OC (WC/LN2) I reckon the Strix has a better VRM than the ref PCB / Nitro / WF.

I've not seen a review or PCB image of the Gigabyte Fury Windforce, but the PCB is the full length of the cooler (per a hi-res side photo). Some prefer the shorter PCB on the Fury/X (I do); plus IIRC some say that because the last fan on the Tri-X / Nitro has no PCB under it, airflow is better (dunno though). The Nitro has a slightly longer PCB than the Tri-X.

As I like doing BIOS mods on cards, I like having the dual BIOS feature; IIRC the Strix has only one BIOS, and I think the Windforce does as well (I see no switch in the side image). The other benefit of the ref PCB is that if you ever decide to go with a custom WC loop, you're gonna find more blocks available (AFAIK).

I hope the above info helps you decide in the sense of PCB; you can probably tell I'm happy with the ref PCB and prefer it for my purposes.


----------



## bborokee

Quote:


> Originally Posted by *gupsterg*
> 
> SNIP (had to cut it down so as not to clutter up the forum)


Whoa, that's a load of information!

Thank you so much on the ref PCB vs custom PCB. I think I've solidified my choice to go with the XFX Fury. I don't think paying 100 euros more for a Sapphire Tri-X, which uses the same ref PCB, is worth it in my case. Although it may give better thermal performance, I don't think I can stay away from the XFX.

It's rather strange, since most of the other Furys are priced in their normal range of the early-to-mid 500s, but the XFX seems to be priced in the mid 400s. At that price range I have the choice of going 390X or the XFX Fury, but it's pretty much a given that I'll go with the better card, the Fury.

Once again, thank you for taking your time and walking me through my inability to make up my mind, haha.


----------



## Laquel

Quote:


> Originally Posted by *bluezone*
> 
> I had wondered if a better fan would help. What fan is that? Does it come with the correct plug?

It's a be quiet! Silent Wings 2, which fits the stock heatsink perfectly. You'll need an adapter like this: http://www.moddiy.com/product_images/y/047/4-Pin_PWM_Fan_Connector_(Male)_to_4-Pin_Mini_GPU_Fan_Connector_(Female)__85503_zoom.jpg


----------



## gupsterg

Quote:


> Originally Posted by *bborokee*
> 
> Whoa, that's a load of information!
> 
> Thank you so much on the ref pcb vs custom pcb.


No worries.
Quote:


> Originally Posted by *bborokee*
> 
> I think i've solidified my choice to go with the XFX Fury. I don't think pulling 100 euros more for a sapphire tri-x which uses the same ref pcb is worth for my case. Although it may give better thermal performance, i don't think i can stay away from the XFX.


May not be much difference TBH, especially taking the 100 euros into context.


Spoiler: XFX



(link)

Quote:


> Shown from the end you get an eagle's eye view of the six 8 mm heatpipes, now heatpipes are measured from an outside diameter so the inside diameter of these pipes is approx 6 mm. So you take a 6 mm heatpipe the interior is 4 mm which means that the 8 mm heatpipe design has about 50% more actual capacity than the 6 mm design used on most video cards.
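
As an aside on the quoted arithmetic: taking the review's figures at face value (6 mm and 4 mm interiors, implying a ~1 mm wall), the "about 50% more capacity" lines up with the interior-diameter ratio; measured by interior cross-sectional area, the gap would be even larger. A quick sketch of that comparison (the 1 mm wall thickness is inferred from the quoted diameters, not stated outright):

```python
# Interior diameters implied by the quoted figures:
# an 8 mm pipe with ~6 mm interior, a 6 mm pipe with ~4 mm interior.
wall = 1.0                     # mm, inferred wall thickness (assumption)
inner_8mm = 8.0 - 2 * wall     # 6.0 mm
inner_6mm = 6.0 - 2 * wall     # 4.0 mm

# The review's "about 50% more capacity" matches the diameter ratio:
diameter_gain = inner_8mm / inner_6mm - 1        # 0.5 -> +50%

# By interior cross-sectional area the difference is bigger:
area_gain = (inner_8mm / inner_6mm) ** 2 - 1     # 1.25 -> +125%

print(f"diameter: +{diameter_gain:.0%}, area: +{area_gain:.0%}")
```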








Spoiler: Sapphire



(link 1 link 2)

Quote:


> The cooler is comprised of a multi heatpipe array with 1x 10mm, 2 x 8mm and 4x 6mm heatpipes.


Quote:


> The heatpipes in turn run through both a smaller copper baseplate that covers the VRM MOSFETs, and a larger copper baseplate that covers the Fiji GPU itself.






Dunno if the XFX cooler heatpipes attach to the VRM, but the coolers look so similar if you ignore the shrouds/fans/backplate design that I'd assume they may. Perhaps an owner of an XFX Fury will chime in.
Quote:


> Originally Posted by *bborokee*
> 
> At that price range, i have the choice of going 390x or the XFX fury, but it's pretty much a given answer that i'll go with the better card, the Fury.


With my Hawaii cards I never saw GPU temps of 55C sustained very quietly on air, and those had non-ref coolers. VRM temps on the 6 phase ref PCB Fiji are like the 12 phase design on my Vapor-X 290X (2 front, 10 rear). A 390/X is gonna be very similar on temps to a 290/X; to me Fiji rocks on that point.

So far many reviews I've read I see no issue with the 4GB HBM vs 8GB GDDR5.

OC'ing is never known until we have the hardware, so when I see in the Bit Tech review a Fury (3584 SP) @ 1040/500MHz beating an MSI 390X Gaming @ 1100/1525, plus taking other aspects of Fury into consideration, I'd have the Fury.

Originally I was a bit like "What!?" about Fiji; now that I have 2 of them running [email protected] and have had more time with them, they have grown on me so much that I'm selling my Hawaii cards.


----------



## SuperZan

Quote:
Originally Posted by *gupsterg* 


> Dunno if the XFX cooler heatpipes attach to VRM, but the coolers look so similar if you ignore the shrouds/fans/backplate design, so I'd assume they may. Perhaps an owner of a XFX Fury will chime in
> 
> 
> 
> 
> 
> 
> 
> .


Indeed I did just a few posts back!







My XFX card has been a great overclocker and runs nice and cool (45-60C) whilst gaming, a bit hotter if I take her to 1175/545 for benchmarking purposes what with the consistent high load.


----------



## Awsan

So is the Gigabyte r9 fury at 469$ a good buy? and does any one know if it will fit in the 250D as a Tri-x can fit


----------



## gupsterg

Quote:


> Originally Posted by *SuperZan*
> 
> Indeed I did just a few posts back!


Yep, saw that and was hoping you'd pop back.
Quote:


> Originally Posted by *SuperZan*
> 
> runs nice and cool (45-60C) whilst gaming


Now this is the juicy data I was after that was lacking in your earlier post (+rep).

Mind if I ask: is that GPU temp or VRM? Stock fan profile or custom? Any idea on room ambient temp?


----------



## SuperZan

Quote:


> Originally Posted by *gupsterg*
> 
> Yep, saw that and was hoping you'd pop back.
> 
> Now this is the juicy data I was after (+rep).
> 
> Mind if I ask: is that GPU temp or VRM? Stock fan profile or custom? Any idea on room ambient temp?












18-19C ambient consistent, and the 45-60C number is for GPU temp with a barely modified fan profile (ramping up about 5% faster). VRM temps will top 90C gaming or benching but don't throttle the card at the 1175/545 I can bench at or the 1150/545 I can run gaming-stable. Trying to push 1200 that changes quickly.


----------



## nyk20z3

EK Nano block mounted -


----------



## gupsterg

Quote:


> Originally Posted by *SuperZan*
> 
> 18-19C ambient consistent, and the 45-60C number is for GPU temp with a barely modified fan profile (ramping up about 5% faster). VRM temps will top 90C gaming or benching but don't throttle the card at the 1175/545 I can bench at or the 1150/545 I can run gaming-stable. Trying to push 1200 that changes quickly.


+rep.

Do you know if the heatpipes touch the VRM baseplate?


Spoiler: Sapphire Fury Tri-X







Sorry for more questions.
Quote:


> Originally Posted by *nyk20z3*
> 
> EK Nano block mounted


Nice, luv the white glove used to present this small yet mighty card!


----------



## SuperZan

Quote:


> Originally Posted by *gupsterg*
> 
> +rep.
> 
> Do you know if heatpipes touch VRM baseplate?
> 
> 
> Spoiler: Sapphire Fury Tri-X
> 
> 
> Sorry for more questions.


Cheers!

And it's no bother, I'm multi-tasking so I'm glad somebody's reminding me about what I'd forgotten to add. And the pipes do make contact neatly, no ASUS 290 series issues here.


----------



## gupsterg

+rep.


----------



## bluezone

Quote:


> Originally Posted by *Laquel*
> 
> It's a be quiet Silent Wings 2 which fits the stock heatsink perfectly. You'll need an adapter like this: http://www.moddiy.com/product_images/y/047/4-Pin_PWM_Fan_Connector_(Male)_to_4-Pin_Mini_GPU_Fan_Connector_(Female)__85503_zoom.jpg


Excellent. I'll have to hunt down that adapter.

How much did your temps drop?


----------



## Laquel

Quote:


> Originally Posted by *bluezone*
> 
> Excellent. I'll have to hunt down that adapter.
> 
> How much did your temps drop?


It's hard to say exactly because I swapped my case at the same time, but comparing at the same noise level, the new fan is much cooler and quieter even at full RPM.


----------



## Flamingo

Quote:


> Originally Posted by *Laquel*
> 
> Did a fan mod on my nano
> 
> Result:
> Runs cooler and quieter than stock fan -> win$$


do you have a mod log for this?

1500RPM vs 3000RPM and it does a better cooling job? :0


----------



## Laquel

Quote:


> Originally Posted by *Flamingo*
> 
> do you have a mod log for this?
> 
> 1500RPM vs 3000RPM and it does a better cooling job? :0


Well, it's 1900rpm, but still, yeah. I think the stock fan was quite unbearable at >2500 or so, and it was there often. I didn't do a mod log, but I do have a couple more pics.


Spoiler: Pic1









Spoiler: Pic2







I used the cardboard from the package to make the shroud and an adapter from gelid to attach the fan. The fan cable is tucked between the card and my mobo since it's quite long. The temps aren't that much better but I've raised the temp and power limit anyways so it's almost always between 950 and 1000mhz while gaming. And of course it keeps those temps much quieter than the stock fan.


----------



## gupsterg

It's a nice neat mod.


----------



## Shogon

Quote:


> Originally Posted by *Laquel*
> 
> Well it's 1900rpm but still yeah. I think the stock fan was quite unbearable at >2500 or so and it was there often. I didn't do a mod log but I do have a couple more pics
> 
> 
> Spoiler: Pic1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pic2
> 
> 
> 
> 
> 
> 
> 
> I used the cardboard from the package to make the shroud and an adapter from gelid to attach the fan. The fan cable is tucked between the card and my mobo since it's quite long. The temps aren't that much better but I've raised the temp and power limit anyways so it's almost always between 950 and 1000mhz while gaming. And of course it keeps those temps much quieter than the stock fan.


Nice mod! I might try this out if everything works well with Ashes Multi-GPU use. Otherwise I would prefer to get a waterblock for the card and alter the bios so it has more than enough power to reach Fury X speeds. I do have plenty of fans doing nothing though, and it looks like it would do a much better job over the stock cooler.
Quote:


> Originally Posted by *Awsan*
> 
> So is the Gigabyte r9 fury at 469$ a good buy? and does any one know if it will fit in the 250D as a Tri-x can fit


I have no idea if it will fit in your case or not. I was extremely tempted to go for that R9 Fury as well considering the price. Sadly I'm also limited on space (due to the water pump), so I opted for a Nano, and as far as I can tell there are no waterblocks for that Gigabyte card either. Hopefully it'll do some good in Ashes.


----------



## Alastair

Quote:


> Originally Posted by *Laquel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Flamingo*
> 
> do you have a mod log for this?
> 
> 1500RPM vs 3000RPM and it does a better cooling job? :0
> 
> 
> 
> Well it's 1900rpm but still yeah. I think the stock fan was quite unbearable at >2500 or so and it was there often. I didn't do a mod log but I do have a couple more pics
> 
> 
> Spoiler: Pic1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pic2
> 
> 
> 
> 
> 
> 
> 
> I used the cardboard from the package to make the shroud and an adapter from gelid to attach the fan. The fan cable is tucked between the card and my mobo since it's quite long. The temps aren't that much better but I've raised the temp and power limit anyways so it's almost always between 950 and 1000mhz while gaming. And of course it keeps those temps much quieter than the stock fan.

I thought that was genuinely the shroud that particular Nano came with. But damn, it looks good for cut cardboard!


----------



## huzzug

Quote:


> Originally Posted by *Alastair*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Laquel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Flamingo*
> 
> do you have a mod log for this?
> 
> 1500RPM vs 3000RPM and it does a better cooling job? :0
> 
> 
> 
> Well it's 1900rpm but still yeah. I think the stock fan was quite unbearable at >2500 or so and it was there often. I didn't do a mod log but I do have a couple more pics
> 
> 
> Spoiler: Pic1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pic2
> 
> 
> 
> 
> 
> 
> 
> I used the cardboard from the package to make the shroud and an adapter from gelid to attach the fan. The fan cable is tucked between the card and my mobo since it's quite long. The temps aren't that much better but I've raised the temp and power limit anyways so it's almost always between 950 and 1000mhz while gaming. And of course it keeps those temps much quieter than the stock fan.
> 
> 
> i thought that was genuinely the shroud that particular nano came with. But damn. It looks good for cut cardboard!

He did a better job at designing a shroud than AMD themselves.


----------



## Laquel

Well thanks guys, you're being far too kind! I appreciate it!

I think all Nanos come with the reference cooler, and maybe a sticker that identifies the vendor.


----------



## gupsterg

Besides your Nano mod, I like your rig pics as well.


----------



## hyp36rmax

Sooooo I just made an order for two FURY X for some crossfire action... ABOUT FREAKING TIME! Especially as the OP haha.

Next are a couple EK blocks


----------



## xTesla1856

Guys, I have an emergency and I'm super-panicking right now:

I was playing Witcher 3, and about 25 minutes into the game I got a black screen followed by a bluescreen, and my PC now fails to boot.

When trying to reboot, I get these weird blue artifacts in the middle of the screen:

Whenever it gets near the OS, the top card gets stuck at 100% fan speed and just stays there and I have to kill the PC.
Booting safe mode works, although I still get artifacts:

I uninstalled the driver via DDU and reinstalled, and I still get artifacts. Is it possible that my top card just, out of the blue, took a **** on me during gameplay, non-OC'ed and at 65°C? This has never happened to me, and I don't think I'd have the nerves or patience for RMAing this and explaining it to my retailer, since I live in Switzerland and RMAs suck here.

BTW, I'm using the rig in my sig, all brand-new components.

EDIT: Booting with my top card alone, I still get artifacting, the PC fails to load Windows, and the card gets stuck at 100% fan speed.
EDIT 2: Booting with my bottom card alone, the PC gets to Windows just fine, no artifacts, and Eyefinity gets picked up instantly.
EDIT 3: Barely managed to read what the bluescreen said: atikmpag.sys, which I know is driver related. But why? The cards were working perfectly mere minutes ago.
EDIT 4: Switched the two cards around (bottom is now top and vice versa). The artifacting card is now stuck at 1050MHz and won't downclock, even though usage is at 0.


----------



## p4inkill3r

Looks like classic card failure to me, unfortunately.


----------



## Pintek

Sounds similar to my experience with the Asus R9 Nano. Just got my replacement Nano and it's been behaving itself so far; hell, it's even 20°C cooler than the first one at idle!

If you're under warranty, get that thing taken care of!


----------



## xTesla1856

The cards are barely two weeks old, who would've thought my AMD adventure would be so short-lived


----------



## xTesla1856

Reinstalled the bad card in the top slot again: artifacting galore, with an atikmpag.sys bluescreen on boot. I'm about to throw this PC out the window.


----------



## Pintek

Seriously, just keep calm and ask the manufacturer or the retailer you got the card from to send you a new one. I think there are just gonna be a few hiccups with the new tech that's in the Fiji cards, but they really should catch these in QA.


----------



## Thoth420

Quote:


> Originally Posted by *xTesla1856*
> 
> The cards are barely two weeks old, who would've thought my AMD adventure would be so short-lived


My first Fury X did the same thing. It went a week, then started artifacting like nuts; the next day I couldn't get a signal out of any of the 3 DP ports at all. RMA'd for a new one and it's been fine since. Quite frustrating, but frankly I would rather hardware die early than later.


----------



## SuperZan

Indeed. If it must quit early, let it quit under warranty.


----------



## dagget3450

Quote:


> Originally Posted by *hyp36rmax*
> 
> Sooooo I just made an order for two FURY X for some crossfire action... ABOUT FREAKING TIME! Especially as the OP haha.
> 
> Next are a couple EK blocks


Look forward to your experience.

Quote:


> Originally Posted by *xTesla1856*
> 
> Reinstalled the bad card in the top slot again, artifacting galore, with an atikmpag.sys bluescreen on boot. I'm about to throw this pc out the window


That sucks man, hope it gets fixed quickly by RMA or return.


----------



## Alastair

Does anyone get stutter with CFX Furys in Skyrim? I'm not talking about microstutter. When playing Skyrim everything is normally smooth as butter, frame pacing on or off, doesn't matter. But I get these weird stutters when I get to walls, particularly cave walls and stone walls. I can't explain it.


----------



## Alastair

I'll note it also happens in single card mode.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> I'll note it also happens in single card mode.


Try capping fps a few FPS below 60. Does it go away?

Are you playing in fullscreen or borderless windowed mode?


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I'll note it also happens in single card mode.
> 
> 
> 
> Try capping fps a few FPS below 60. Does it go away?
> 
> Are you playing in fullscreen or borderless windowed mode?

Well my monitor is 75hz and I've been using V sync

I play full screen, I think. I'll have to check.

It actually occurs less with VSR and 1440P than my screens native, 1080P.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Well my monitor is 75hz and I've been using V sync
> 
> I play full screen, I think. I'll have to check.
> 
> It actually occurs less with VSR and 1440P than my screens native, 1080P.


Try not to exceed 60Hz when playing Gamebryo titles; resolution doesn't matter as much, even with up- or downsampling, as long as your hardware can handle it. Also, Ultra shadows tax the hell out of the CPU; if you instead set them to High (and then fiddle with them in the config to your taste), chances are you will ironically net better performance and IQ. Aside from that, I haven't bothered to play it lately, and never on a 75Hz monitor, only 60Hz and 144Hz (which I set to 60 to play Skyrim).

Also give fullscreen borderless windowed mode a try.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Well my monitor is 75hz and I've been using V sync
> 
> I play full screen, I think. I'll have to check.
> 
> It actually occurs less with VSR and 1440P than my screens native, 1080P.
> 
> 
> 
> Try not to exceed 60hz when playing Gamebyro titles resolution doesn't matter as much even with up or downsampling as long as your hardware can handle it. Also Ultra shadows tax the hell out of the CPU and if you instead set them to High (and then fiddle with them in the config to your taste) chances are you will net better performance and IQ ironically. Aside that I haven't bothered to play it lately and never on a 75hz monitor only 60hz and 144hz(which I set to 60 to play Skyrim).
> 
> Also give fullscreen borderless windowed mode a try.

Well, I don't seem to be dropping frames, so I can definitely handle the Ultra shadows. I seem to maintain 75fps quite consistently; it even reads 75fps when looking at the things causing the choppy performance. I'll try setting frame rate target control to 60fps and see if that makes a difference.


----------



## NBrock

Quote:


> Originally Posted by *xTesla1856*
> 
> Guys, I have an emergency and I'm super-panicking right now:
> 
> I was playing Witcher 3 and about 25 minutes in to the game, I get a black screen followed by a bluescreen and my PC now fails to boot.
> 
> When trying to reboot, I get these weird blue artifacts in the middle of the screen:
> 
> Whenever it gets near the OS, the top card gets stuck at 100% fan speed and just stays there and I have to kill the PC.
> Booting safe mode works, although I still get artefacts:
> 
> I uninstalled the driver via DDU and reinstalled, and I still get artifacts. Is it possible that my top card just out of the blue took a **** on me during gameplay, NON OC'ed and at 65°C ? This has never happened to me and I don't think I'd have the nerves or patience for RMAing this and explaining it to my retailer since I live in Switzerland and RMA's suck here.
> 
> BTW, I'm using the rig in my sig, all brand-new components.
> 
> EDIT: Booting with my top card alone, I still get artifacting and the PC fails to load Windows and the card gets stuck at 100% fan speed.
> EDIT 2: Booting with my bottom card alone, the PC gets to WIndows just fine, no artefacts, Eyefinity gets picked up instantly.
> EDIT 3: Barely managed to read what the Bluescreen said: atikmpag.sys, which I know is driver related. But why? The cards were working perfectly mere minutes ago.
> EDIT 4: Switched the two cards around (bottom is now top and vice versa). The artifacting card is now stuck at 1050MHz and won't downclock, even though usage is at 0.


Did you remove the coolers and replace the thermal paste at all? You could have done what I did and damaged the card by touching the exposed interposer.


----------



## Semel

*xTesla1856*

Have u tried switching to the second bios?


----------



## Papa Emeritus

I've been lurking here for years, but finally I made an account!

First thing that comes to mind? Post a sloppy pic of my two Fury Xs in the owners club. Got the first one right after the release last summer, and the second one in October. They have been under water for most of the time, without any problems.


----------



## hyp36rmax

Quote:


> Originally Posted by *Papa Emeritus*
> 
> I've been lurking here for years, but finally I made an account!
> 
> First thing that comes to mind? Post a sloppy pic of my two Fury Xs in the owners club. Got the first one right after the release last summer, and the second one in October. They have been under water for most of the time, without any problems.


Welcome to the club and OCN!


----------



## Papa Emeritus

Thanks


----------



## xTesla1856

Quote:


> Originally Posted by *NBrock*
> 
> Did you remove the coolers and replace thermal paste at all? You could have done what I done and messed up the card by touching the exposed interposer.


Nope, never took em apart, temps are great.
Quote:


> Originally Posted by *Semel*
> 
> *xTesla1856*
> Have u tried switching to the second bios?


Yep, same thing happens on both BIOSes.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Well I don't seem to be dropping frames. So I can definitely handle the Ultra shadows. I seem to maintain 75fps quite consistently. It even reads 75fps when looking at the things that are causing choppy performance. I'll try target frame rate control to 60Fps. And see if that makes a difference.


It's more the game engine trying to periodically resync with the panel... poorly. Google the 64Hz bug and Skyrim if you aren't already aware. If you try frame rate target control, try 58 and 56 FPS caps as well. I remember someone having a similar problem with rock textures specifically.

If you pan the camera diagonally upward, for instance when looking at a wall or rock in question, is that when you experience more stutter, as opposed to just horizontal or vertical panning? (This is easy for me to test because I play Skyrim with a controller; also not a terrible idea, as I hear input lag from the mouse can be horrendous in this game.)


----------



## Agent Smith1984

Anybody in Fiji land got Hitman working at all?

Definitely broken for the 290/390 folks!!!

Game locks, artifacts, crashes, what a heap of garbage!!


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody in Fiji land got Hitman working at all?
> 
> Definitely broken for the 290/390 folks!!!
> 
> Game locks, artifacts, crashes, what a heap of garbage!!


Sigh... the release was less than stable on Xbone, but personally all I have had were connection issues. Glad to know that when I get my rig up and running I can potentially expect worse from the PC copy. $120 well spent.


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody in Fiji land got Hitman working at all?
> 
> Definitely broken for the 290/390 folks!!!
> 
> Game locks, artifacts, crashes, what a heap of garbage!!


It works on my R9 290X with no Crossfire support yet though... I'll check my FURYX's as soon as they arrive next week.


----------



## Agent Smith1984

Less than stable is an understatement for me right now, lol

The game crashes every 2-10 minutes in DX12. I played for about 30 minutes on DX11 yesterday and it was okay, but then when I went to play again last night the game would lock up on the intro graphics every time. The built-in benchmarks won't run at all (though the DX11 one did the first 2 or 3 times). I see people all over Steam and elsewhere reporting the same issues; YouTube has videos of the exact same things too.

Ridiculous, GARBAGE porting... Thank goodness I got this game for free with my video card, but it would be nice to be able to play it.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Well I don't seem to be dropping frames. So I can definitely handle the Ultra shadows. I seem to maintain 75fps quite consistently. It even reads 75fps when looking at the things that are causing choppy performance. I'll try target frame rate control to 60Fps. And see if that makes a difference.
> 
> 
> 
> It's more the game engine trying to periodically resync with the panel....poorly. Google 64hz bug and Skyrim if you aren't already aware. If you try target frame rate control try 58 and 56 FPS clamps as well. I remember someone having a similar problem with rock textures specifically.
> 
> If you pan the camera diagonally upward for instance when looking at a wall or rock in question...is that when you experience more stutter? As opposed to just horizontal or vertical(this is easy for me to test because I play Skyrim with a controlller...also not a terrible idea as I hear input lag from the mouse can be horrendous in this game).

Funny thing, this. I checked if my crossfire was enabled and saw it wasn't. However, according to Afterburner, during stutters I am still maintaining 75fps. So I was like, OK, let's disable and re-enable crossfire. = fixed. No stutters. Maybe crossfire wasn't working right.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> funny thing this. I checked if my crossfire was enabled and I saw it wasn't. However according to afterburner during stutters I am still maintaining 75fps. So I was like. Ok. Let's disable and re-enable crossfire. =fixed. No stutters. Maybe crossfire wasn't working right.


Sweet! Most fickle game ever, so once you've got it running smooth, set the configs to read-only and enjoy. At this stage I also back up my entire game and then start pushing up uGrids, because the pop-in is just so terrible I can't stand it. This can lead to broken NPC pathing and such, but in the world of Bugrim it's a worthy tradeoff.


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Less than stable is an understatement for me right now, lol
> 
> The game crashes every 2-10 minutes in DX12. I played for about 30 minutes on DX 11 yesterday and it was okay, but then when I went to play again last night the game would lock up on the intro graphics every time. The built in benchmarks won't even run at all (thiough the DX11 did the first 2 or 3 times.) I see people all over Steam and elsewhere seeing the same issues. Youtube has videos of the exact things also.
> 
> Ridiculous, GARBAGE, porting...... Thank goodness I got this game for free with my video card, but it would be nice to be able to play it.....


Funny thing is I tried the benchmark too, and I got a long black screen the first time. It worked the second time, after a black screen for a few seconds.


----------



## BIGTom

Quote:


> Originally Posted by *Papa Emeritus*
> 
> I've been lurking here for years, but finally I made an account!
> 
> First thing that comes to mind? Post a sloppy pic of my two Fury Xs in the owners club. Got the first one right after the release last summer, and the second one in October. They have been under water for most of the time, without any problems.
> 
> 
> Spoiler: Warning: Spoiler!


Beautiful! Welcome to the club Papa


----------



## Agent Smith1984

Quote:


> Originally Posted by *hyp36rmax*
> 
> Funny thing is I tried the benchmark also and I get a long black screen the first time. It worked after the second time after a black screen for a few seconds.


Same black screen crap with me too!


----------



## Awsan

Am I going crazy, or is there really no info about the dimensions of the Gigabyte R9 Fury???


----------



## BIGTom

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody in Fiji land got Hitman working at all?
> 
> Definitely broken for the 290/390 folks!!!
> 
> Game locks, artifacts, crashes, what a heap of garbage!!


I had a CTD when I first launched the game in DX12 on release day. After that, I've not had any issues at all and the game runs buttery smooth.









EDIT: Actually I do have one issue. Using DX12, the game caps at 60 fps on Fiji and I cannot find a way to remove it. Tried Exclusive Fullscreen, Windowed Fullscreen, Vsync toggles and more. This behavior is not exhibited running the game on DX11. It's not a terrible inconvenience because I have a 3440x1440 60hz panel, but it did take the fun away from benchmarking in DX12.


----------



## Semel

Windowed mode fixes the 60 fps lock on Fury, but benchmarking in windowed mode is not a good idea, because performance will always be lower than in exclusive fullscreen mode.

This is what I got:

1920x1080 dx12, render target reuse disabled, windowed, all maxed out except for supersampling
---- CPU ----
99.79fps Average
---- GPU ----
100.54fps Average

1920x1080 dx12, render target reuse enabled, windowed, all maxed out except for supersampling
--- CPU ----
95.45fps Average
---- GPU ----
96.14fps Average


----------



## Chris1504

Hello guys
Yesterday I got my R9 Nano and I have some trouble with it. While playing games (The Witcher 3, Ark, The Division) I have a red dot on the screen like in the picture below.

It doesn't change its position unless I restart the game. In The Division it looks like a light source, because it reflects on shiny surfaces. I also have some flickering like
Quote:


> Originally Posted by *xTesla1856*
> 
> Made a video, this is kinda ridiculous:


and in The Division, in some areas, massive fps drops below 5 fps.

I have just one R9 Nano, so no CrossFire.

Is this a driver issue, or should I send the card back?


----------



## dagget3450

Maybe we should call CrossFire something like... cross-flickering. Well, except in the above case, unless you have an AMD integrated GPU too and it auto-enabled CrossFire, errr, cross-flicker.


----------



## xTesla1856

Quote:


> Originally Posted by *Chris1504*
> 
> Is this a driver issue or should i send the card back?


I would RMA that card. One of my R9 Furies also just died on me yesterday. Seems to be a lot more prevalent with Fiji cards than any other series.


----------



## p4inkill3r

Quote:


> Originally Posted by *Chris1504*
> 
> Hello guys
> Yesterday I got my R9 Nano and I have some trouble with it. While playing games (The Witcher 3, Ark, The Division) I have a red dot on the screen like in the picture below.
> 
> It doesn't change its position unless I restart the game. In The Division it looks like a light source, because it reflects on shiny surfaces. I also have some flickering like
> and in The Division, in some areas, massive fps drops below 5 fps.
> 
> I have just one R9 Nano, so no CrossFire.
> 
> Is this a driver issue, or should I send the card back?


That's a pretty strange problem.









----------



## Agent Smith1984

Quote:


> Originally Posted by *Papa Emeritus*
> 
> I've been lurking here for years, but finally i made an account!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First thing that comes to mind? Post a sloppy pic of my two Fury X's in the owners club
> 
> 
> 
> 
> 
> 
> 
> . Got the first one just right after the release last summer, and the second one in October. They have been under water for most of the time, without any problems.


Nice rig!

And even better handle/avatar!

Big Ghost fan here


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> funny thing this. I checked if my crossfire was enabled and I saw it wasn't. However according to afterburner during stutters I am still maintaining 75fps. So I was like. Ok. Let's disable and re-enable crossfire. =fixed. No stutters. Maybe crossfire wasn't working right.
> 
> 
> 
> Sweet! Most fickle game ever so once you got it running smooth set configs to read only and enjoy. I also at this stage back up my entire game and then start pushing up U Grids because the pop in is just so terrible I just can't stand it. This can lead to broken NPC pathing and such but in the world of Bugrim a worthy tradeoff.

Please, how do I get rid of texture and item pop-in? It's irritating, and I've set all the sliders to max on the launcher menu. If I could set the graphics any higher I would. I'm seriously gonna get mods for Skyrim. Some of those pictures. Damn.....


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Please, how do I get rid of texture and item pop-in? It's irritating, and I've set all the sliders to max on the launcher menu. If I could set the graphics any higher I would. I'm seriously gonna get mods for Skyrim. Some of those pictures. Damn.....


Here is an extensive post about bumping the U Grids which will load more cells in the exterior world:
http://forums.bethsoft.com/topic/1274926-ugridstoload-skyrimini-comparisons-and-explanation-default-57911/
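For reference, the uGrids bump that thread walks through boils down to a couple of Skyrim.ini edits. A minimal sketch, using the commonly cited one-step bump (values are illustrative, not a recommendation; back up your configs first, since uGridsToLoad cannot safely be lowered again on an existing save):

```ini
; Skyrim.ini - [General] section (back the file up before editing)
[General]
; Default is 5; 7 loads more exterior cells, reducing pop-in at the
; cost of performance and possible broken NPC pathing.
uGridsToLoad=7
; Commonly set to (uGridsToLoad + 1)^2 when bumping uGrids: (7+1)^2 = 64.
uExterior Cell Buffer=64
```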


----------



## Agent Smith1984

Just an update on Hitman....

I got the DX12 bench to work at 1080p in windowed mode.

14 fps over DX11 at max settings... (67 vs 81)

That's awesome! I'll be happy to see a fix for the crashing, and also happy to see many more DX12 games to come!

Mind you, I'm on an AMD CPU with a 390X; I would love to see AMD CPU + Fury results!

Thanks


----------



## Spock121

I'm running an [email protected] and a Fury X, but I can't get the Hitman DX12 bench to work, no matter what the settings.


----------



## Semel

Quote:


> Originally Posted by *Spock121*
> 
> I'm running an [email protected] and a Fury X, but I can't get the Hitman DX12 bench to work, no matter what the settings.


What's the problem?


----------



## Spock121

Quote:


> Originally Posted by *Semel*
> 
> What's the problem?


Never mind, apparently it was FRAPS causing the issue. Though now when the benchmark seems to end I get no results window or output file.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Spock121*
> 
> Never mind, apparently it was FRAPS causing the issue. Though now when the benchmark seems to end I get no results window or output file.


Me neither, I just watch the last number! Lol

Bottom line is, the game is broken for PC... Period


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Me neither, I just watch the last number! Lol
> 
> Bottom line is, the game is broken for PC... Period


You aren't missing much, as there is basically no content yet: 1 mission (plus 2 training ops).
Essentially this game is a beta in its current state, and you can see it when you play. Personally I am shelving it until all of the content is out.


----------



## Semel

Quote:


> Originally Posted by *Spock121*
> 
> Though now when the benchmark seems to end I get no results window or output file.


x:\Users\xxx\hitman\profiledata.txt


----------



## Alwrath

Just ordered a Radeon Fury Asus Strix off the egg, open box, for $391.99. It was $420-something out the door. What a steal! Upgrading from a Radeon 290 Tri-X, so it should be sweet.







I will be ready for DOOM once it's released, that's for sure.


----------



## BIGTom

Quote:


> Originally Posted by *Semel*
> 
> Windowed mode fixes 60 fps lock on fury. But benchmarking in a windowed mode is not a good idea coz performance will always be lower than in an exclusive fullscreen mode..
> 
> This is what I got:
> 
> 1920x1080 dx12, render target reuse disabled, windowed, all maxed out except for supersampling
> ---- CPU ----
> 99.79fps Average
> ---- GPU ----
> 100.54fps Average
> 
> 1920x1080 dx12, render target reuse enabled, windowed, all maxed out except for supersampling
> --- CPU ----
> 95.45fps Average
> ---- GPU ----
> 96.14fps Average


AMD just released driver version 16.3.1, and this release fixes the frame rate being locked to the monitor refresh rate. I can now run the game and benchmark at higher than 60 fps in Windowed Fullscreen on my 3440x1440 60Hz panel.
I still cannot get Exclusive Fullscreen to work, however...


----------



## Shogon

Stupid me didn't think about which brands support removing the stock cooler. From what I've researched, Asus and Sapphire are a negatory. Kinda sucks I didn't consider the right brand before getting this Asus Nano. Unless Asus emails me back and allows water blocks, I'm probably going to go for a Fury X with Gigabyte or MSI.


----------



## JunkaDK

Hey guys.. hoping for some suggestions/help here from anyone with R9 Furys.

My issue is that I cannot get FreeSync to work (Asus MG279Q FreeSync). I activated it in the OSD and in Crimson 16.3, but whenever I check the OSD it just stays steady at 144Hz, even though I can see in benchmarks it is sometimes running at 60-80 FPS; the Hz does not sync with the FPS. I've tested with 3DMark Fire Strike, Unigine Heaven and Battlefield 4, all fullscreen. Tried with V-sync and CrossFire on and off, but no difference.

I am running 2 x R9 Fury STRIX in CrossFireX. I tried uninstalling with DDU and reinstalling several times. The only thing I haven't removed yet is RadeonPro.. could that have anything to do with it?

Mainly I just want to know where to focus: is it a GPU problem or a screen problem? Or maybe both?









Anyways..

ANY Tips and tricks will be MUCH appreciated on this matter.


----------



## spyshagg

CrossFire + FreeSync only works with the "frame pacing" option activated. Suffice it to say, all software that interferes with the drivers should be removed before drawing conclusions.


----------



## JunkaDK

Quote:


> Originally Posted by *spyshagg*
> 
> crossfire + freesync only works with the option "frame-pacing" activated. Suffice to say all software that interferes with the drivers should be removed prior to making conclusions


As I remember it was activated, but I will double-check when I get home.







thanks.


----------



## Papa Emeritus

Quote:


> Originally Posted by *BIGTom*
> 
> Beautiful! Welcome to the club Papa


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice rig!
> 
> And even better handle/avatar!
> 
> Big Ghost fan here


Thanks!







Yeah they're a great band


----------



## xTesla1856

Spoke to my retailer, showed them pictures and a detailed description. They accepted the RMA right away. New Fury should be here next week.


----------



## dagget3450

Quote:


> Originally Posted by *xTesla1856*
> 
> Spoke to my retailer, showed them pictures and a detailed description. They accepted the RMA right away. New Fury should be here next week.


Wootwoot!


----------



## Awsan

For real, no one on this planet knows the exact dimensions of the Gigabyte R9 Fury?? Or the max GPU cooler height allowed in the Obsidian 250D????


----------



## Roaches

Quote:


> Originally Posted by *Awsan*
> 
> For real, no one on this planet knows the exact dimensions of the Gigabyte R9 Fury?? Or the max GPU cooler height allowed in the Obsidian 250D????


Which one are you talking about? The WindForce 3 model, or reference?

If the WF3 model, I'd say it should fit if the case allows a maximum of 12" card length, since the cooler/shroud of all WF3 models is roughly the same in length.
The PCB is standard height, so vertical dimensions shouldn't be a concern compared to a tall card like the EVGA Classified cards.
Quote:


> Originally Posted by *Roaches*
> 
> Wow that card is 309mm long (from Gigabyte's website), it won't fit in the FT02 (with AP181/182 fans) since that case has a maximum clearance of about 12-1/8 inches maximum for expansion cards length.
> 
> Here's Guru 3D's measurement of the card's length.
> 
> 
> 
> Should fit since it's roughly over 11 inches but not over 12. I would trust that over Gigabyte's measurement.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though Gigabyte dimensions usually include the full lip bracket dimensions, which may explain the exaggerated length callout.
> 
> EDIT: Newegg's measurements : Card Dimensions (L x H)
> 11.61" x 5.08"
> 
> Yeah I'd trust this card will fit in your case.
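As a quick sanity check on the numbers quoted above (assuming the FT02's clearance really is about 12-1/8"), converting Gigabyte's 309mm figure to inches shows why the two sources disagree:

```python
# Convert the quoted card lengths to inches and compare against the
# FT02's ~12-1/8" expansion-card clearance mentioned above.
MM_PER_INCH = 25.4
clearance_in = 12 + 1 / 8            # 12.125"

gigabyte_in = 309 / MM_PER_INCH      # Gigabyte's spec-sheet length
newegg_in = 11.61                    # Newegg's listed length

print(round(gigabyte_in, 2))         # 12.17 -> just over the limit
print(gigabyte_in < clearance_in)    # False
print(newegg_in < clearance_in)      # True
```

So Gigabyte's figure (which likely includes the bracket lip) fails the check by only about a millimetre, while Newegg's measurement clears it comfortably, consistent with the "should fit" call above.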


----------



## Jflisk

Anyone try disabling ULPS with the new 16.3.1 driver? For some reason mine will not disable. Thanks.


----------



## SLK

Got my new VisionTek R9 Nano installed. Coil whine isn't too bad, and it maxed out at 65°C last night, hardly audible and no throttling. Did I get lucky, or do these run relatively quiet?


----------



## Awsan

Sorry double post!


----------



## Awsan

Quote:


> Originally Posted by *Awsan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roaches*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Awsan*
> 
> For real no one on this plant knows the exact dimensions of the Gigabyte R9 Fury?? or the max GPU cooler hight allowed in the Obsidian 250D????
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which one you talking about? windforce 3 model? or reference?
> 
> If WF3 model I'd say it should fit it the case allows a maximum of 12" length cards since the cooler/shroud of all WF3 models are roughly the same in length.
> the PCB is standard height so vertical dimensions shouldn't be a concern compared to a tall card like EVGA classified cards.
> Quote:
> 
> 
> 
> Originally Posted by *Roaches*
> 
> Wow that card is 309mm long (from Gigabyte's website), it won't fit in the FT02 (with AP181/182 fans) since that case has a maximum clearance of about 12-1/8 inches maximum for expansion cards length.
> 
> Here's Guru 3D's measurement of the card's length.
> 
> 
> 
> Should fit since its roughly over 11 inches but not over 12. I would trust that over Gigabyte's measurement
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though Gigabyte dimensions usually include the full lip bracket dimensions, which may explain the exaggerated length callout.
> 
> EDIT: Newegg's measurements : Card Dimensions (L x H)
> 11.61" x 5.08"
> 
> Yeah I'd trust this card will fit in your case.
> 
> 
> My issue the depth of the card and its distance from the top of the GPU fans to the side panel
> 
> This is that card you posted the G1 Gaming 980
> 
> 
> And this is the R9 fury

Nobody was able to give me the max depth for a card in the 250D. That card's size isn't even on Gigabyte's official website, while other cards' dimensions are mentioned.

These are the specs for the 980


These are the specs for the fury


----------



## Roaches

Quote:


> Originally Posted by *Awsan*
> 
> No body was able to give me the max depth for a card in the 250D, even that card's size is not on gigabyte's official website while other cards its mentioned


Roughly 2-1/2 slots wide but not 3 slots. A 250D can fit a Titan Z which was also 2.5 slots wide.


----------



## Awsan

Quote:


> Originally Posted by *Roaches*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Awsan*
> 
> No body was able to give me the max depth for a card in the 250D, even that card's size is not on gigabyte's official website while other cards its mentioned
> 
> 
> 
> Roughly 2-1/2 slots wide but not 3 slots. A 250D can fit a Titan Z which was also 2.5 slots wide.

It's impossible to fit it in there without modding?

Oh sorry, I see that the third bracket doesn't need to be screwed in. Thanks for the info, mate.


----------



## bluezone

A new driver already?

http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.1.aspx

that was fast.


----------



## Tgrove

Quote:


> Originally Posted by *JunkaDK*
> 
> Hey guys.. hoping for some suggestions/help here from anyone with R9 Fury's.
> 
> My issue is that I cannot get FreeSync to work (Asus MG279Q FreeSync). I activated it in the OSD and in Crimson 16.3, but whenever I check the OSD it just stays steady at 144Hz, even though I can see in benchmarks it is sometimes running at 60-80 FPS; the Hz does not sync with the FPS. I've tested with 3DMark Fire Strike, Unigine Heaven and Battlefield 4, all fullscreen. Tried with V-sync and CrossFire on and off, but no difference.
> 
> I am running 2 x R9 Fury STRIX in CrossFireX. I tried uninstalling with DDU and reinstalling several times. The only thing I haven't removed yet is RadeonPro.. could that have anything to do with it?
> 
> Mainly I just want to know where to focus: is it a GPU problem or a screen problem? Or maybe both?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways..
> 
> ANY Tips and tricks will be MUCH appreciated on this matter.


Disable CrossFire and reboot; disable FreeSync in Crimson and reboot. Enable CrossFire, then reboot; then enable FreeSync and reboot. Also, check to make sure the correct refresh rate is selected in Windows.


----------



## Awsan

Sorry, but one last stupid question: since the Fury and the Nano are the same price, what is my best option?


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> A new driver already?
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD_Radeon_Software_Crimson_Edition_16.3.1.aspx
> 
> that was fast.


And still no FlickerFire fix for The Division. This is starting to get on my nerves. Gimpworks seems to have completely ****ed the game from Beta to release


----------



## SuperZan

Quote:


> Originally Posted by *Awsan*
> 
> Sorry but one last stupid question as the Fury and the Nano are the same price what is my best option?


If you're not using SFF and you're not concerned about squeezing in a larger card, the Fury is, IMO, easier to work with from an OC standpoint. That's not to say that Nanos can't clock well, but they tend to prefer a home under water to best resist throttling. The Fury models have air coolers that can stand up to some rather solid overclocks without a lot of fussing about. If you are planning on going with a water-cooled solution to integrate into an existing loop, the Nano is a full chip without anything cut or locked. The Fury still does great under water, but you roll the dice as to whether or not you'll have unlockable CUs, and beyond that, whether you'll get a symmetrical number of them.

With your case and no pre-existing loop I'd try to fit a Fury, but that's just my opinion.


----------



## JunkaDK

I found a solution. In Crimson I had to set a fixed framerate of 144.. Now it's working! FreeSync up to 144Hz.. Looks incredible!


----------



## EnthusiastMe

Hello,

I recently acquired my Nano. I'm quite pleased with it.

One game I like to play is Dirt Rally. I ran the game's benchmark at 4K on Ultra (no AA) and got an average FPS of 60.24. The minimum FPS never dropped below 52. This little card packs quite the punch.

And between the sale of my 290X and the Nano's price drop, I didn't have to pay very much.


----------



## Pintek

Well, this happened when I logged in to my computer today with my Asus R9 Nano White...

Rebooted and it went back to normal, but it makes me... rather paranoid. The card was at 35°C, so heat shouldn't be an issue...


----------



## xTesla1856

Could be a driver issue. Once my card started artefacting, it never went away


----------



## dagget3450

The 16.3 drivers are working wonders for me in Windows 7, not so much in Windows 10 though. I am going to do a fresh reload of Windows 10, as mine is about six months old, and see. I am getting roughly a 10-20 fps deficit in Windows 10 CrossFire. It's in everything, too, not just benches. If this happens on a fresh copy of Win10, I am just gonna revert back to Win7 or set up dual boot; DX12 isn't looking worth worrying about for now.

Fire Strike Ultra: just shy of an 18k GPU score with 4 Furies and a small OC.


----------



## JunkaDK

Hmm, interesting.. Looking forward to your results!


----------



## Jflisk

Quote:


> Originally Posted by *Pintek*
> 
> Well, this happened when I logged in to my computer today with my Asus R9 Nano White...
> 
> Rebooted and it went back to normal, but it makes me... rather paranoid. The card was at 35°C, so heat shouldn't be an issue...


That happens from time to time; AMD is working on it. Just left-click Personalize > change the screen resolution > change it back: fixed. In other words, it's not your card. Also, install the latest driver; 16.3.1 seems to help, as does anything after 16.3.


----------



## Semel

Quote:


> Originally Posted by *Pintek*
> 
> Well this happened today when I logged in my computer today with my asus r9 .


When it happens, change the resolution and revert it back, or take out your DisplayPort cable and put it back in. It's display corruption that happens only under low load, due to AMD's bugged power-efficiency tricks..


----------



## Noufel

Got my Fury STRIX cards two days ago and installed them in the first and third PCIe slots of my ME6 (enabled CFX with the PLX chip, not the native one) to give them some fresh air.








I got maximum temps of 80°C for the upper GPU and 70°C for the other one under heavy load. Are those good temps? My side panel fan is exhaust, btw (should I make it intake?)


----------



## xTesla1856

Seems kinda high, my top card (Nitro R9 Fury) never exceeds 70. Maybe the Strix cooler isn't as good as Sapphire's design. I'd put a custom fan curve on yours!


----------



## SuperZan

Quote:


> Originally Posted by *Noufel*
> 
> got my furies strix two days ago, installed them in the first and third PCIe slot of my ME6 ( enabled cfx with the plx chip not native one ) to give them some fresh air
> 
> 
> 
> 
> 
> 
> 
> 
> i got maximum temps for the upper gpu 80C and the other one 70C in heavy load usage, are those good temps ? my side panel fan is exhaust btw (should i put it intake ?)


I prefer side intake, brings cooler air over the GPU's and the board, and as @xTesla1856 says a nice custom fan curve could be very helpful.
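For anyone curious what a "custom fan curve" amounts to under the hood, it is just linear interpolation between (temperature, fan-speed) points. A rough sketch with invented points, purely illustrative, not a tuning recommendation for any particular card:

```python
# Linear interpolation between (temp in degrees C, fan %) points,
# the way Afterburner-style custom fan curves behave.
# The points below are made up for illustration.
CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Fan duty for a given temperature, clamped at the endpoints."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    # Find the segment containing temp_c and interpolate linearly.
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(60))  # 47.5 -> halfway between the 50C and 70C points
```

The practical takeaway from the posts above is simply to move the mid-range points up so the fans spin harder before the card reaches its high-temp plateau.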


----------



## Noufel

Quote:


> Originally Posted by *xTesla1856*
> 
> Seems kinda high, my top card (Nitro R9 Fury) never exceeds 70. Maybe the Strix cooler isn't as good as Sapphires design. I'd put a custom fan curve on yours!


Quote:


> Originally Posted by *SuperZan*
> 
> I prefer side intake, brings cooler air over the GPU's and the board, and as @xTesla1856
> says a nice custom fan curve could be very helpful.


Thanks a lot, I'll make the side fan intake and set up a custom curve tomorrow, then post the results.


----------



## Thoth420

Quote:


> Originally Posted by *Semel*
> 
> When it happens change resolution and revert it back or take out your display port cable and put it back. It's a display corruption that happens only under low load due to bugged power efficiency AMD's tricks..


I could have sworn that was on the resolved issues list for Crimson 16.3. Is it still occurring?


----------



## pdasterly

is there a waterblock available for gigabyte fury?
http://www.gigabyte.com/products/product-page.aspx?pid=5680#kf


----------



## SuperZan

Looking at the EK, XSPC, and Swiftech sites, the WindForce is *not* listed as compatible with existing Fury blocks, due to the length of the PCB. I've not yet found a specific block for the Giga WF Fury.


----------



## pdasterly

No wonder it was the cheapest Fury.


----------



## Jflisk

Quote:


> Originally Posted by *Thoth420*
> 
> I could have sworn that was on the resolved issues list for Crimson 16.3. Is it still occurring?


I am on 16.3.1 and have not seen it yet. This problem takes time to occur and is as random as the lottery. I don't remember it happening on 16.3, but 16.3 was not on my system that long.


----------



## Thoth420

Quote:


> Originally Posted by *Jflisk*
> 
> I am on 16.3.1 I have not seen it yet - This problem takes time to occur and is as random as the lottery. I don't remember seeing it happen in 16.3 but 16.3 was not on my system that long.


I'm aware it takes a long idle time to manifest; just curious if anyone has had it happen after the 16.3 "fix". Cheers.


----------



## rx7racer

Quote:


> Originally Posted by *Thoth420*
> 
> I'm aware it takes long idle time to manifest just curious if anyone had it happen post the 16.3 "fix". Cheers


Well, I can't say I am on 16.3.1, but I am running 16.3 and I still have my display #2 doing it from time to time. For some odd reason it hasn't done it on the DP out that I use for monitor #1.

It usually happens when waking, or when launching a game after a long downtime. Don't get me wrong, it's not like it's every day, but a couple of times a week.


----------



## Thoth420

Quote:


> Originally Posted by *rx7racer*
> 
> Well I can't say I am on 16.3.1 but I am running 16.3 and I still have my display #2 doing it from time to time. For some odd reason it hasn't done it on my DP out that I use for monitor #1.
> 
> It usually happens when waking or when launching a game after long down time. Don't get me wrong it's not like everyday but couple times a week though.


Interesting... thanks.


----------



## Tobiman

Quote:


> Originally Posted by *EnthusiastMe*
> 
> Hello,
> 
> I recently acquired my Nano. I'm quite pleased with it.
> 
> One game I like to play is DIrt Rally. I ran the game's benchmark at 4K on Ultra (no AA) and got an average FPS of 60.24. The minimum FPS never dropped below 52. This little card packs quite the punch.
> 
> And between the sell of my 290X and the Nano's price drop, I didn't have to pay very much.


Do you mind sharing your scores with the 290X? I play quite a few racing sims, and my 290 keeps my frames around 90 in Assetto Corsa and Dirt Rally at 1440p. I wouldn't mind a boost into the 120s, if a Nano can achieve that.


----------



## Tobiman

Quote:


> Originally Posted by *xTesla1856*
> 
> And still no FlickerFire fix for The Division. This is starting to get on my nerves. Gimpworks seems to have completely ****ed the game from Beta to release


The Division flickers on Nvidia cards as well, but not constantly.


----------



## xTesla1856

Yeah, it really seems that multi-GPU support is downright nonexistent so far, even though during the beta they promised it would be implemented at release.


----------



## Willius

Just ordered my R9 Nano for my HTPC. Can't wait to get my hands on it


----------



## Metalhead79

I couldn't resist the $470 price on Amazon for the Gigabyte R9 Fury. I debated between that and the R9 Nano, but decided the Fury would be easier to overclock with its bigger cooler, and since they perform very close to the Fury X, I'm not losing much potential performance.

It'll be here today and I'm excited! I haven't had a high-end GPU since the HD 5870.

Any specific tips for OCing the Gigabyte Furys?


----------



## Jesse36m3

I seem to have hit a wall while overclocking my Fury X. Can't really get her past 1160MHz core, 550MHz memory, and that's at +75mV.

Even with a full EK block I was expecting a bit more.

Anybody else have overclocking results? Very hard to find solid info just on Google, so I thought I'd come here and see what the community has discovered.

Also, MSI Afterburner is acting weird with voltage control. When I select a value and click apply, it jumps back ~4mv. What's the deal with that?


----------



## SuperZan

Quote:


> Originally Posted by *Jesse36m3*
> 
> I seem to have hit a wall while overclocking my fury x. Can't really get her past 1160Mhz Core, 550Mhz memory, and that's +75mv.
> 
> Even with a full EK block I was expecting a bit more.
> 
> Anybody else have overclocking results? Very hard to find solid info just on Google, so I thought I'd come here and see what the community has discovered.
> 
> Also, MSI Afterburner is acting weird with voltage control. When I select a value and click apply, it jumps back ~4mv. What's the deal with that?


Have you selected unofficial overclocking in the Afterburner settings? At the stock settings it dials back the voltage a touch. Also it's worth noting that the HBM runs only at set intervals of 500, 545, 600, 666, so if you set it higher or lower than those values it will round to the nearest one. You can probably top 1200 with additional voltage from Afterburner's unofficial overclocking limits but the HBM is tricky to get stable. Mine really likes 545 but 600 has been no dice.
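If that claim is accurate (note it is disputed further down the thread), the snapping behaviour would amount to "round to the nearest supported step". A toy sketch using the step values quoted above:

```python
# Toy model of the claimed HBM clock snapping: a requested memory
# clock rounds to the nearest supported step (values from the post above).
HBM_STEPS = (500, 545, 600, 666)

def effective_hbm_clock(requested_mhz: float) -> int:
    """Nearest supported HBM step to the requested clock, in MHz."""
    return min(HBM_STEPS, key=lambda step: abs(step - requested_mhz))

print(effective_hbm_clock(560))  # 545 (15 MHz away, vs 40 to 600)
print(effective_hbm_clock(575))  # 600 (25 MHz away, vs 30 to 545)
```

Under this model, dialing in small HBM bumps between steps would do nothing until the request crosses a midpoint, which would explain why some in-between settings appear not to stick.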


----------



## Jesse36m3

Quote:


> Originally Posted by *SuperZan*
> 
> Have you selected unofficial overclocking in the Afterburner settings? At the stock settings it dials back the voltage a touch. Also it's worth noting that the HBM runs only at set intervals of 500, 545, 600, 666, so if you set it higher or lower than those values it will round to the nearest one. You can probably top 1200 with additional voltage from Afterburner's unofficial overclocking limits but the HBM is tricky to get stable. Mine really likes 545 but 600 has been no dice.


Thanks for the info, I had no idea about the HBM.

I did not check that box for "unofficial overclocking" before starting.

Will report back.

Edit: do I select WITH or WITHOUT PowerPlay?


----------



## SuperZan

Quote:


> Originally Posted by *Jesse36m3*
> 
> Thanks for the info, I had no idea about the HBM.
> 
> I did not check that box for "unofficial overclocking" before starting.
> 
> Will report back.
> 
> Edit: do I select WITH or WITHOUT PowerPlay?


No worries, let us know how you get on. And without Power play, I'd advise.


----------



## Metalhead79

I got my Fury. I love the triple-slot cooler. But... idle temps on my Fury are pretty high. It's sitting between 46-48°C. By comparison, my GTX 970 would idle at 24-26°C. It is downclocking to 300MHz at idle. Is this normal?


----------



## Spartoi

Hello, I'll soon be getting a Sapphire Fury Tri-X. I previously had a 390x that I used with a Kraken X41 AIO water cooler via G10. On the 390x, I had to buy separate heatsinks for the VRMs because they got very hot. I assume the same will happen on Fury and so I was wondering where are the VRMs located on the Tri-X Fury and if anyone knows a heatsink kit I can buy for them somewhere?


----------



## buildzoid

Quote:


> Originally Posted by *Metalhead79*
> 
> I got my Fury. I love the triple slot cooler. But.....Idle temps on my Fury are pretty high. It's sitting between 46-48c. By comparision my GTX 970 would idle at 24-26c. It is down clocking to 300mhz at idle. Is this normal?


Yeah that's normal for the stock settings.
Quote:


> Originally Posted by *SuperZan*
> 
> Have you selected unofficial overclocking in the Afterburner settings? At the stock settings it dials back the voltage a touch. Also it's worth noting that the HBM runs only at set intervals of 500, 545, 600, 666, so if you set it higher or lower than those values it will round to the nearest one. You can probably top 1200 with additional voltage from Afterburner's unofficial overclocking limits but the HBM is tricky to get stable. Mine really likes 545 but 600 has been no dice.


This isn't true at all. My card will refuse to run a single MHz over 570, and 571 drops back to 545 rather than up to 600. Also, my Fire Strike scores are better at 570MHz than at 545MHz. Really, I haven't seen any solid proof for this statement, but I've seen plenty of benchmark scores that show it isn't true. Also, all the LN2 overclocking on Fiji cards shows better and better scores going well over 666MHz. So there have to be HBM clock options up to 1GHz (no one, AFAIK, has gone past that even with LN2).


----------



## SuperZan

Quote:


> Originally Posted by *Metalhead79*
> 
> I got my Fury. I love the triple slot cooler. But.....Idle temps on my Fury are pretty high. It's sitting between 46-48c. By comparision my GTX 970 would idle at 24-26c. It is down clocking to 300mhz at idle. Is this normal?


My Fiji cards idle down to 300MHz, so I think that number is the standard. As for temps, mine sits at 25°C idle in a ~20°C ambient environment. Have you got any intake on the GPU/motherboard directly, whether side or front? Also, you may wish to have a look at the fan curve; a custom setting, even only subtly tweaked, can make a rather large difference. It did for my XFX Fury.


----------



## SuperZan

Quote:


> Originally Posted by *buildzoid*
> 
> This isn't true at all. My card will refuse to run a single MHz over 570, and 571 is closer to 545 than to 600. Also, my Firestrike scores are better at 570MHz than at 545MHz. Really, I haven't seen any solid proof for this statement, but I've seen plenty of benchmark scores that show it isn't true. Also, all the LN2 overclocking on Fiji cards shows better and better scores going well over 666MHz, so there have to be HBM clock options up to 1GHz (no one AFAIK has gone past that, even with LN2).


It came from an AMD rep and experientially it's turned out to be true for many people. Let's have a look at those FS scores. Are they within the margin of error?

http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2477&cores=1#start=0#interval=20 A cursory glance seems to corroborate what the AMD fellow had to say as well. He could very well be wrong but I haven't seen anything that convinces me otherwise.

Here's the original post that introduced the concept: http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-pro-duo-fiji-owners-club/6650#post_24834799


----------



## Spartoi

Quote:
Originally Posted by *SuperZan* 

Quote:


> Originally Posted by *Spartoi*
> 
> 
> Hello, I'll soon be getting a Sapphire Fury Tri-X. I previously had a 390x that I used with a Kraken X41 AIO water cooler via G10. On the 390x, I had to buy separate heatsinks for the VRMs because they got very hot. I assume the same will happen on Fury and so I was wondering where are the VRMs located on the Tri-X Fury and if anyone knows a heatsink kit I can buy for them somewhere?
> 
> 
> 
> 
> My Fiji cards idle down to 300MHz so I think that number is the standard. As to temps, mine is sat at 25C idle in a ~20C ambient setting. Have you got any intake on the GPU/motherboard directly, whether side or front? Also you may wish to have a look at the fan curve - a custom setting only subtly tweaked can make a rather large difference - it did so for my XFX Fury.


I don't have the Fury yet, so I don't know what the temps will be, but when I get it I will take off the stock cooler and install only an AIO on the core, leaving the VRMs naked. That's why I want to know where the VRMs are located, so I can put a heatsink on them when I get the card.


----------



## SuperZan

Quote:



> Originally Posted by *Spartoi*
> 
> I don't have the Fury yet, so I don't know what the temps will be, but when I get it I will take off the stock cooler and install only an AIO on the core, leaving the VRMs naked. That's why I want to know where the VRMs are located, so I can put a heatsink on them when I get the card.


My mistake, quoted the wrong person.  Here you are: http://cdn.wccftech.com/wp-content/uploads/2015/07/front.jpg


----------



## Spartoi

Quote:


> Originally Posted by *SuperZan*
> 
> My mistake, quoted the wrong person. Here you are: http://cdn.wccftech.com/wp-content/uploads/2015/07/front.jpg


I'm still lost.

What card is in the picture? I'll be getting the Sapphire Tri-X Fury, whose PCB looks like the one I link in my picture (based on the bottom of this page). Based on that picture, can you circle or box the area of the VRM location? For example, I think these are the VRMs, but I'm not sure.


----------



## SuperZan

Quote:



> Originally Posted by *Spartoi*
> 
> I'm still lost.
> 
> What card is in the picture? I'll be getting the Sapphire Tri-X Fury, whose PCB looks like the one I link in my picture (based on the bottom of this page). Based on that picture, can you circle or box the area of the VRM location? For example, I think these are the VRMs, but I'm not sure.


It's the Sapphire Fury Tri-X, all nekkid. Here's a nicely-done encirclement (rectanglement, rather) of the Fury VRM. You'll see the Tri-X VRM coincides with the graphic; they only look a bit fancier.

Fury Tri-X

VRM locations


----------



## buildzoid

These are the MOSFETs (the parts of a VRM that run hot):


----------



## SuperZan

^ guy knows his hardware mods. One of the first to foray into volt-mods on Fury if I'm not greatly mistaken.


----------



## Metalhead79

Quote:


> Originally Posted by *SuperZan*
> 
> My Fiji cards idle down to 300MHz so I think that number is the standard. As to temps, mine is sat at 25C idle in a ~20C ambient setting. Have you got any intake on the GPU/motherboard directly, whether side or front? Also you may wish to have a look at the fan curve - a custom setting only subtly tweaked can make a rather large difference - it did so for my XFX Fury.


I have a Fractal Define R4. HDD cages have been removed. I have two 140mm XSPC radiator fans as intake and one of the Fractal case fans as side intake. I have removed the PCI brackets. CPU cooler is a Thermalright TRUE Power 140. No exhaust fan.

With my GTX 970 this gave nice low idle temps and load temps around 65-70c. I found that a bottom intake fan increased load temps by about 3c, so I didn't use one for that GPU. I will test it out with the Fury. I haven't touched the fan curves yet. I was just shocked to see 20c higher idle. I was expecting 5-10c more.


----------



## Spartoi

Thanks for the help everyone.


----------



## SuperZan

Quote:


> Originally Posted by *Metalhead79*
> 
> I have a Fractal Define R4. HDD cages have been removed. I have two 140mm XSPC radiator fans as intake and one of the Fractal case fans as side intake. I have removed the PCI brackets. CPU cooler is a Thermalright TRUE Power 140. No exhaust fan.
> 
> With my GTX 970 this gave nice low idle temps and load temps around 65-70c. I found that a bottom intake fan increased load temps by about 3c, so I didn't use one for that GPU. I will test it out with the Fury. I haven't touched the fan curves yet. I was just shocked to see 20c higher idle. I was expecting 5-10c more.


It does seem a bit high with your setup. Not much I could think of beyond what you've done save playing with the fan curve. Best of luck getting it sorted.


----------



## Spartoi

I've read that software cannot read the VRM temperature on the Fury series. Is that still true?


----------



## Mumak

Quote:


> Originally Posted by *Spartoi*
> 
> I've read that software can not read the VRM temperature on Fury series. Is that still true?


HWiNFO can


----------



## Metalhead79

It turned out to be the drivers. I noticed that Hitman in DX11 had a lot of weird graphical issues, so I used AMD's driver cleaner utility and then did a clean install. It fixed those issues and my idle temps dropped to 30c. I did adjust the fan curves a little, too.

I'm pretty happy with it. It's much, much faster than my GTX 970.


----------



## Agent Smith1984

Anybody got any recent Heaven runs with latest drivers?

1080 / 8x / ultra of course.

Thanks


----------



## gupsterg

i5 rig as per my sig, daily OS (ie not bench tweaked), Crimson 16.3.1 (PE=Off, Tess.=Off), updated factory ROM via Sapphire support.



Spoiler: ROM mods



- GPU 1110 RAM 535 in ROM
- Overdrive Limit for RAM matched to RAM clock in ROM
- All DPM VIDs fixed manually, as stock EVV was calculated for 1050/500 (as this is VID, VDDC is lower).


Spoiler: Warning: Spoiler!



[ GPU PStates List ]

DPM0: GPUClock = 300 MHz, VID = 0.90000 V
DPM1: GPUClock = 512 MHz, VID = 0.92500 V
DPM2: GPUClock = 724 MHz, VID = 0.93700 V
DPM3: GPUClock = 892 MHz, VID = 1.01800 V
DPM4: GPUClock = 944 MHz, VID = 1.06800 V
DPM5: GPUClock = 984 MHz, VID = 1.11800 V
DPM6: GPUClock = 1018 MHz, VID = 1.16200 V
DPM7: GPUClock = 1110 MHz, VID = 1.21200 V



- PowerTune mod in ROM 350W/325A/350W (stock = 270W/300A/300W)
- Fan Table mod for "Fuzzy Logic" (55C GPU Temp target for cooling, +150% fan sensitivity, 75C GPU Throttle temp)
- RAM Timings mod: 400MHz timings in the 500MHz & 600MHz straps (again, due to the 4096-bit interface width, a negligible boost, but I still do the mod)





Heaven_PEoff_Tessoff.zip 1k .zip file


As more of my time is being spent on BIOS investigations, I have no idea if this is good for Fiji or not.
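For anyone unfamiliar with how the DPM table above is used: the card steps between those states and applies the matching VID. A minimal sketch, with the clock/VID pairs taken from the post; the selection rule here is a simplification for illustration, not AMD's actual PowerPlay arbitration:

```python
# DPM state table from the modded ROM above (GPU clock MHz -> VID volts).
# The selection rule is an illustration, not AMD's real PowerPlay logic.
DPM_STATES = [
    (300, 0.900), (512, 0.925), (724, 0.937), (892, 1.018),
    (944, 1.068), (984, 1.118), (1018, 1.162), (1110, 1.212),
]

def vid_for_clock(target_mhz: int) -> float:
    """Return the VID of the lowest DPM state that can satisfy target_mhz."""
    for clock, vid in DPM_STATES:
        if clock >= target_mhz:
            return vid
    # Demand beyond DPM7: the card tops out at the highest state's VID.
    return DPM_STATES[-1][1]

print(vid_for_clock(900))  # falls in DPM4 (944 MHz) -> 1.068 V
```

This also makes clear why fixing every DPM's VID in ROM matters: each intermediate state gets hit during normal clock transitions, not just DPM7.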


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> i5 rig as per my sig, daily OS (ie not bench tweaked), Crimson 16.3.1 (PE=Off, Tess.=Off), updated factory ROM via Sapphire support.
> 
> 
> 
> Spoiler: ROM mods
> 
> 
> 
> - GPU 1110 RAM 535 in ROM
> - Overdrive Limit for RAM matched to RAM clock in ROM
> - All DPM VIDs fixed manually, as stock EVV was calculated for 1050/500 (as this is VID, VDDC is lower).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> [ GPU PStates List ]
> 
> DPM0: GPUClock = 300 MHz, VID = 0.90000 V
> DPM1: GPUClock = 512 MHz, VID = 0.92500 V
> DPM2: GPUClock = 724 MHz, VID = 0.93700 V
> DPM3: GPUClock = 892 MHz, VID = 1.01800 V
> DPM4: GPUClock = 944 MHz, VID = 1.06800 V
> DPM5: GPUClock = 984 MHz, VID = 1.11800 V
> DPM6: GPUClock = 1018 MHz, VID = 1.16200 V
> DPM7: GPUClock = 1110 MHz, VID = 1.21200 V
> 
> 
> 
> - PowerTune mod in ROM 350W/325A/350W (stock = 270W/300A/300W)
> - Fan Table mod for "Fuzzy Logic" (55C GPU Temp target for cooling, +150% fan sensitivity, 75C GPU Throttle temp)
> - RAM Timings mod 400MHz timings in 500MHz & 600MHz strap (again due to interface width of 4096Bit negligible boost but still do mod)
> 
> 
> 
> 
> 
> Heaven_PEoff_Tessoff.zip 1k .zip file
> 
> 
> As more of my time is being spent on BIOS investigations, I have no idea if this is good for Fiji or not.


Well, it's certainly a good score. Do you have one with tess on? Just for comparison's sake.

Thanks


----------



## gupsterg

No worries, I did 3 runs; from the date stamps you'll see the order.

Other_Heaven_runs.zip 2k .zip file


Even with the upped PowerLimit in ROM, PE = On (the default setting in the driver) kills min FPS; from past monitoring of the GPU clock with PE = On, it throttles the card regardless of temps, etc.

TBH I never use Heaven or Valley as benchmarks; I just found too much variation between runs, IIRC. I use them really as 2nd-line artifact testers; 1st line for me is 3DM FS (looped). For performance testing I use 3DM and in-game benchmarks (FRAPS for some).


----------



## Awsan

Tell me why I shouldn't buy a nano for $429.99

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150754&cm_re=r9_fury-_-14-150-754-_-Product


----------



## p4inkill3r

Quote:


> Originally Posted by *Awsan*
> 
> Tell me why I shouldn't buy a nano for $429.99
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150754&cm_re=r9_fury-_-14-150-754-_-Product


Buy it!


----------



## Agent Smith1984

Yeah, that's a pretty good deal!


----------



## SuperZan

Quote:


> Originally Posted by *Awsan*
> 
> Tell me why I shouldn't buy a nano for $429.99
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150754&cm_re=r9_fury-_-14-150-754-_-Product


I hope it's already in your cart.


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody got any recent Heaven runs with latest drivers?
> 
> 1080 / 8x / ultra of course.
> 
> Thanks


I know it doesn't help, but with 4x Fury X I was able to get 3rd place in the Top 30 Heaven thread here (1080p).

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores

When I get time I'll play with it more, as it's not using all 4 at 100%, sadly.


----------



## pdasterly

Quote:


> Originally Posted by *dagget3450*
> 
> I know it doesn't help, but with 4x Fury X I was able to get 3rd place in the Top 30 Heaven thread here (1080p).
> 
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores
> 
> When I get time I'll play with it more, as it's not using all 4 at 100%, sadly.


Nice, are they under water? If not, where did you mount the 4 radiators?


----------



## dagget3450

Quote:


> Originally Posted by *pdasterly*
> 
> Nice, are they under water? If not, where did you mount the 4 radiators?


Right now it's a horrible mess of 2 PSUs, wires, and rads lying on a desk. I am working on the case while being consumed with wasting time benching, lol. I am almost burnt out, so now it's a matter of getting things done. My Furies are on the stock AIO; I haven't been convinced, outside of aesthetics, that water blocks will help with overclocking.


----------



## pdasterly

Quote:


> Originally Posted by *dagget3450*
> 
> Right now it's a horrible mess of 2 PSUs, wires, and rads lying on a desk. I am working on the case while being consumed with wasting time benching, lol. I am almost burnt out, so now it's a matter of getting things done. My Furies are on the stock AIO; I haven't been convinced, outside of aesthetics, that water blocks will help with overclocking.


Big case: two up front and two on top. I think they would fit in my Corsair Air 540 nicely; might be able to squeeze an AIO CPU cooler in there too.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> No worries, I did 3 runs; from the date stamps you'll see the order.
> 
> Other_Heaven_runs.zip 2k .zip file
> 
> 
> Even with the upped PowerLimit in ROM, PE = On (the default setting in the driver) kills min FPS; from past monitoring of the GPU clock with PE = On, it throttles the card regardless of temps, etc.
> 
> TBH I never use Heaven or Valley as benchmarks; I just found too much variation between runs, IIRC. I use them really as 2nd-line artifact testers; 1st line for me is 3DM FS (looped). For performance testing I use 3DM and in-game benchmarks (FRAPS for some).


We're pretty close in scores.

C:\Users\brian\Unigine_Heaven_Benchmark_4.0_20160323_1842.html


----------



## gupsterg

Link to your HDD not gonna work, my online buddy.

Zip file: use the "paperclip" icon to attach to post, or open the HTML file on your PC, get a screenshot, and insert it as an image.

What clocks is your Nano at?


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Link to your HDD not gonna work, my online buddy.
> 
> Zip file: use the "paperclip" icon to attach to post, or open the HTML file on your PC, get a screenshot, and insert it as an image.
> 
> What clocks is your Nano at?


Sorry, wasn't thinking.

1125 MHz, "0" offset voltage, 545 HBM and power limit 36%

Edit: Forgot to add, this is while maintaining 65 deg C.


----------



## gupsterg

Very interesting.

In drivers, switch "Power Efficiency" to Off; it'd be interesting to see how a Nano reacts. It defo helps Fury / X.

Then do a run with PE = Off still *and* change "Tessellation mode" from "AMD Optimized" to "Override ..."; a box will appear, "Maximum Tessellation Level": set it to Off. Even if you have tessellation "Extreme" in Heaven, it will be overridden, like in my testing.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Very interesting.
> 
> In drivers, switch "Power Efficiency" to Off; it'd be interesting to see how a Nano reacts. It defo helps Fury / X.
> 
> Then do a run with PE = Off still *and* change "Tessellation mode" from "AMD Optimized" to "Override ..."; a box will appear, "Maximum Tessellation Level": set it to Off. Even if you have tessellation "Extreme" in Heaven, it will be overridden, like in my testing.


The PE switch isn't present in Crimson for me, just like VSR will not latch on for me. I'll try disabling PE in Radeon Mod.
Same results in Crimson on a secondary PC I have (3750k). Other people are having the same problem.

Give me 5 mins, to do another run.


----------



## bluezone

Turning off PE in Radeon Mod caused a BSOD; as far as I can tell the drivers are now corrupted. I'll have to do a clean reinstall.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Very interesting.
> 
> In drivers, switch "Power Efficiency" to Off; it'd be interesting to see how a Nano reacts. It defo helps Fury / X.
> 
> Then do a run with PE = Off still *and* change "Tessellation mode" from "AMD Optimized" to "Override ..."; a box will appear, "Maximum Tessellation Level": set it to Off. Even if you have tessellation "Extreme" in Heaven, it will be overridden, like in my testing.


Radeon Mod worked OK with 16.3, but for me it doesn't like 16.3.1. Weird.

So these are runs with PE on.

For reference run - Tess x64:

Tess set to OFF:

What do you think?


----------



## dagget3450

I must have been under a rock, 'cause RadeonMod was off my radar. This is the first I've heard of it and I feel dumb. I had tried RadeonPro a few times and gave up. I am curious to see if this will work better than the Radeon Settings panel in Crimson.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> 1125 Mhz, "0" offset voltage, 545 HBM and power limit 36%
> 
> Edit: Forgot to add this is while maintaining 65 deg C


To me that's a nice OC. How are you managing 65C on the GPU? Did you do a fan mod to your Nano?
Quote:


> Originally Posted by *bluezone*
> 
> Tess set to OFF:
> 
> 
> 
> what do you think?:


To me a good run, very comparable with my run PE = Off, Tess. = Off @ 1110 / 535.

I will do a PE = On, Tess. = Off run, as I didn't do one yesterday.


----------



## huzzug

To divert from these benches, has anyone tried mining with these cards? I know the 290X managed 700-800 Mhash. How much do these cards manage, considering HBM? For giggles, though!!


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> 1125 Mhz, "0" offset voltage, 545 HBM and power limit 36%


Also it would be interesting to see your registers dump (ref OP of Fiji bios mod thread). You will see in this linked post data from my 3x cards plus others.

VID per DPM is set due to GPU properties, stock ROMs use EVV (Electronic Variable Voltage), this is so each ROM does not need to be tailored per GPU.

I found using MSI AB when OC'ing card per DPM VID & GPU clock will change. Depending on how far we push highest DPM clock(DPM 7), see this linked post.

I'm now totally using ROM-based OC'ing for 24/7 use and only use MSI AB to set an OC for initial testing. Once set in ROM, I see no variation in set VID per DPM or lower DPM GPU clock.


----------



## fjordiales

Question for those who use Crossfire: is there a trick to make the bottom card the primary card? Every time I disable xfire, the bottom card works, but once I activate it, nothing is displayed. I know it's a dumb question, but it seems like xfire is different from SLI. Thanks!


----------



## Alastair

Is there a reason why my one GPU runs at 1.175V stock and the other at 1.195V stock? Back in my HD6850 days both cards of mine were shipped at 1.1V.


----------



## dagget3450

Quote:


> Originally Posted by *fjordiales*
> 
> Question for those who use crossfire, is there a trick to make the bottom card the primary card? Everytime I disable xfire, bottom works but once i activate it, nothing is displayed. I know it's a dumb question but it seems like xfire is different from SLI. Thanks!


Try this: disable Crossfire, then connect the monitor to the card you want primary, then re-enable Crossfire. Just note the primary card is also used by the BIOS, which is usually the top card, so on a reboot you may not see anything until you're in Windows. My mainboard BIOS allows me to assign a PCIe slot as primary if I want.


----------



## fjordiales

Quote:


> Originally Posted by *dagget3450*
> 
> Try this: disable Crossfire, then connect the monitor to the card you want primary, then re-enable Crossfire. Just note the primary card is also used by the BIOS, which is usually the top card, so on a reboot you may not see anything until you're in Windows. My mainboard BIOS allows me to assign a PCIe slot as primary if I want.


+ rep for tip.

Same thing. It works when I go plug it in after disabling xfire but the moment I go xfire, screen goes blank. I might just end up reinstalling crimson.


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> Is there a reason why my one GPU runs at 1.175V stock and the other at 1.195V stock? Back in my HD6850 days both cards of mine were shipped at 1.1V.


Possible reasons:

i) differing LeakageID / GPU Properties = different calculated EVV VID

ii) due to CF/ULPS

I'm hoping to do CF setup soon, so far been using single cards at a time to assess OC limit.


----------



## dagget3450

Quote:


> Originally Posted by *fjordiales*
> 
> + rep for tip.
> 
> Same thing. It works when I go plug it in after disabling xfire but the moment I go xfire, screen goes blank. I might just end up reinstalling crimson.


Maybe try rebooting after you disable crossfire and hook up card you want as primary. Then once in windows try turning crossfire on?

You may have an option in your bios for setting a primary slot/gpu


----------



## fjordiales

Quote:


> Originally Posted by *dagget3450*
> 
> Maybe try rebooting after you disable crossfire and hook up card you want as primary. Then once in windows try turning crossfire on?
> 
> You may have an option in your bios for setting a primary slot/gpu


I have maximus vi formula and reading up on where to find the pcie primary settings. No dice on the reboot thing then xfire, same result. Will figure out the pcie settings since that might work better.


----------



## bluezone

Quote:


> Originally Posted by *fjordiales*
> 
> I have maximus vi formula and reading up on where to find the pcie primary settings. No dice on the reboot thing then xfire, same result. Will figure out the pcie settings since that might work better.


This is back before I learned the importance of manually removing old Catalyst driver files after uninstalling them. So this may not be applicable.

When I first started cross-firing my HD 7950s, I originally had a single card. Crossfire would not enable for me, or would work very badly. The cause seemed to be that the two cards were concurrently running both the new and the old Catalyst drivers; the primary card had updated to the new driver and the secondary card was using the old version.

So my question is are both cards on the same driver?


----------



## fjordiales

Quote:


> Originally Posted by *bluezone*
> 
> This is back before I learned the importance of manually removing old Catalyst driver files after uninstalling them. So this may not be applicable.
> 
> When I first started cross-firing my HD7950's; I originally had a single card. Cross-fire would not enable for me or would work very badly. The cause seemed to be that the two card were running concurrently both the new and the old Catalyst drivers. Meaning the primary card had updated to the new driver and the secondary card was using the old version of the driver.
> 
> So my question is are both cards on the same driver?


I will double check when I get home but there was a link regarding how to uninstall drivers and I followed it. A lot of steps but I got rid of all the ccc leftovers. I will check if I do have both drivers and the mobo pcie primary thing.

I have a phobya nb eloop 120mm under the 2nd card and it would be great if I can make it my primary card.


----------



## SuperZan

Quote:


> Originally Posted by *fjordiales*
> 
> I will double check when I get home but there was a link regarding how to uninstall drivers and I followed it. A lot of steps but I got rid of all the ccc leftovers. I will check if I do have both drivers and the mobo pcie primary thing.
> 
> I have a phobya nb eloop 120mm under the 2nd card and it would be great if I can make it my primary card.


You should be able to do it on the MaxVI; I've got the option available on my Sabretooth and my GA Z170 Gaming7. Just look under NB settings in your peripherals or whichever equivalent tab in your UEFI. If you've successfully cleared out the drivers, selected your primary PCI-e device in BIOS/UEFI , and connected your monitor cable to the card you want in the hot seat, you should be golden. Oh, and love the Harley Quinn rig. Obviously.


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> I must have been under a rock, cause radeonmod was off my radar. This is the first i heard of it and i feel dumb . I had tried radeonpro a few times and gave up. I am curious to see if this will work better than the radeon settings panel in crimson.


I ran across Radeon Mod only a little while ago. It's used to enable and disable some hidden features, and it also performs some automated registry hacks. In the latest version, one of those hacks is to enable/disable PE.








Quote:


> Originally Posted by *gupsterg*
> 
> To me that's a nice OC. How are you managing 65C on the GPU? Did you do a fan mod to your Nano?
> 
> To me a good run, very comparable with my run PE = Off, Tess. = Off @ 1110 / 535.


Note: I do see spikes up to 70C, but it stays around 65C for most of the Heaven run. For gaming I still prefer running an 1100 clock so I don't have to keep an eye on it, not to mention I can undervolt then (lower thermals).

I'm doing several different things to keep temperatures down.

1) Cool floor air. My PC is in the basement and a window is cracked open, so the floor is cold and my case intakes from the bottom.

2) An even, thin application of Gelid GC-Extreme TIM that has been cured and broken in.

3) Custom fan curve set to just audible at 68% @ 65C, and ramping up aggressively from there. That way I know to watch my temps via audio cue when it starts getting hot.

4) A medium-sized heatsink placed on the reverse side of the card, with a fan blowing directly on it. (If anyone decides to try this, be careful!!! There is exposed electrical componentry that can be shorted. Try this at your own peril.)
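The fan curve in point 3 is essentially piecewise-linear interpolation between a few (temperature, duty) points. A quick sketch; only the 68% @ 65C anchor comes from the post, the other points are made up for illustration:

```python
# Piecewise-linear fan curve, anchored on the 68% @ 65C point from the post.
# The 40C and 75C points are invented for illustration.
CURVE = [(40, 30), (65, 68), (75, 100)]  # (temp C, fan duty %)

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(65))  # 68.0, the "just audible" point
print(fan_duty(70))  # 84.0, ramping aggressively toward 100% at 75C
```

The steep second segment is what produces the audible cue: a few degrees past 65C makes the fan noticeably louder.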
Quote:


> Originally Posted by *gupsterg*
> 
> Also it would be interesting to see your registers dump (ref OP of Fiji bios mod thread). You will see in this linked post data from my 3x cards plus others.
> 
> VID per DPM is set due to GPU properties, stock ROMs use EVV (Electronic Variable Voltage), this is so each ROM does not need to be tailored per GPU.
> 
> I found using MSI AB when OC'ing card per DPM VID & GPU clock will change. Depending on how far we push highest DPM clock(DPM 7), see this linked post.
> 
> I'm totally now using ROM based OC'ing for 24/7 use and only use MSI AB to set an OC for intial testing. Once set in ROM I see no variation in set VID per DPM or lower DPM GPU clock.


I've been following that thread with great interest.

I'll have a go at the dump once I've re-read up on how to do it.

Have you had any luck in accessing the on-die controller? I prefer the ROM method you're using, but the hard-modders have a point due to the on-die controller.

QUOTE: "I found using MSI AB when OC'ing card per DPM VID & GPU clock will change. Depending on how far we push highest DPM clock(DPM 7), see this linked post."

I'm taking a shot in the dark, but could this be due to power delivery noise as VID is increased?


----------



## fjordiales

Quote:


> Originally Posted by *SuperZan*
> 
> You should be able to do it on the MaxVI; I've got the option available on my Sabretooth and my GA Z170 Gaming7. Just look under NB settings in your peripherals or whichever equivalent tab in your UEFI. If you've successfully cleared out the drivers, selected your primary PCI-e device in BIOS/UEFI , and connected your monitor cable to the card you want in the hot seat, you should be golden. Oh, and love the Harley Quinn rig. Obviously.


Thanks! Will definitely check when I get home. Will do everyone's tips here. I know I overlooked something. Especially the primary pcie slot.


----------



## gupsterg

@bluezone

Cheers for the info on your cooling setup.
Quote:


> Originally Posted by *bluezone*
> 
> Have you had any luck in accessing the on die controller?


No, main focus is ROM. I think there is SMC firmware in the ROM, but I'm not sure; I reckon it's the huge section from 0x38000 onwards.

AFAIK SMC manipulation is gonna be through messaging it; see the link by Semel to the Guru3D post by @Unwinder. Someone is using that info on Tonga but it's not working.

I'm gonna roll with ROM edits, as the information on messaging the SMC a) doesn't make sense to me, and b) on reboot, etc. you'd be setting up the card again, IMO, whereas with ROM you've set the parameters (that we know) and they're set regardless of reboot/powerdown, etc. You can still use MSI AB, etc. to change voltage / clocks with a custom ROM.
Quote:


> Originally Posted by *bluezone*
> 
> I prefer the ROM method your using, but the hard mod'ers (?) have a point due to the on die controller.


Do you mean that with a hard mod the SMC is fooled? If so, I think so, but the current BIOS mods are not causing an issue with the SMC. The only snag we have is the limit of 1300mV manual VID per DPM.
Quote:


> Originally Posted by *bluezone*
> 
> I'm taking a shot in the dark but could this be due amount power delivery noise as VID is increased?


IMO, why the VID / GPU clocks are changing when we OC through software (ie MSI AB) is the vastly "dynamic" nature of Fiji compared with past gens. The Linux driver structures files I'm using to translate the BIOS have, IIRC, some info relating to this.

I was virtually bowled over when recently aiding the Tonga BIOS modders with some info share learnt from Fiji: they have truly "dynamic VDDCI".

For example, Fiji has 1 RAM clock state, 500MHz with 1000mV VDDCI (I did create a ROM with 2 states), while Tonga ROMs have a minimum of 2 RAM / VDDCI states and some even 4. Then the VDDCI changes per RAM state for GPU clock/VID, from what I can make out from the ROM and some result data a member posted. So it really has more than the number of VDDCI states shown in the ROM; I hope I'm making sense.
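A rough way to picture the difference described above: Fiji's single RAM state versus Tonga's multiple states, each carrying its own VDDCI. The Fiji pair (500MHz / 1000mV) comes from the post; the Tonga values and the lookup rule are hypothetical, just to show the shape of such a table:

```python
# Fiji: one RAM state (500MHz @ 1.000V VDDCI, per the post above).
# Tonga: 2-4 states, each with its own VDDCI; these values are hypothetical.
FIJI_MEM_STATES = [(500, 1.000)]            # (RAM MHz, VDDCI V)
TONGA_MEM_STATES_EXAMPLE = [
    (150, 0.900), (400, 0.950), (625, 1.000), (750, 1.050),
]

def vddci_for_ram(states, ram_mhz):
    """Pick the VDDCI of the lowest state whose RAM clock covers the request."""
    for mhz, vddci in states:
        if mhz >= ram_mhz:
            return vddci
    return states[-1][1]

print(vddci_for_ram(FIJI_MEM_STATES, 500))           # Fiji's only state: 1.0 V
print(vddci_for_ram(TONGA_MEM_STATES_EXAMPLE, 500))  # covered by the 625MHz state
```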


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> @bluezone
> 
> IMO, why the VID / GPU clocks are changing when we OC through software (ie MSI AB) is the vastly "dynamic" nature of Fiji compared with past gens. The Linux driver structures files I'm using to translate the BIOS have, IIRC, some info relating to this.
> 
> I was virtually bowled over when recently aiding the Tonga BIOS modders with some info share learnt from Fiji: they have truly "dynamic VDDCI".
> 
> For example, Fiji has 1 RAM clock state, 500MHz with 1000mV VDDCI (I did create a ROM with 2 states), while Tonga ROMs have a minimum of 2 RAM / VDDCI states and some even 4. Then the VDDCI changes per RAM state for GPU clock/VID, from what I can make out from the ROM and some result data a member posted. So it really has more than the number of VDDCI states shown in the ROM; I hope I'm making sense.


If I am following you correctly (my limitation of knowledge, not yours): it would be interesting if someone could ferret out the reference table used, just to see what is in the various fields that trip the different dynamic states of the VDDCI.


----------



## pdasterly

can the nano xfire with a fury?


----------



## SuperZan

Quote:


> Originally Posted by *pdasterly*
> 
> can the nano xfire with a fury?


It certainly can. All Fiji cards can Crossfire with the others.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> If I am following you correctly (my limitations of knowledge, not yours).


Even with the "meddling" of Hawaii BIOS modding, this is new territory for me.
Quote:


> Originally Posted by *bluezone*
> 
> It would be interesting if someone could ferret out the reference table used; just to see what is in the various fields: that trip the different dynamic states of the VDDCI.


Best to discuss either in the Fiji BIOS mod thread or the Tonga one (links to translations are in the Tonga thread, 1st then 2nd).


----------



## Spartoi

New Tri-X Fury owner here. I was wondering what kind of overclock is common, bad, and good for this card?


----------



## Willius

Received my XFX Nano yesterday; the card looks awesome. But it sounds like someone is welding when it's under load. Coil whine. I'm gonna stress it a bit tonight to see if I can reduce it.


----------



## hyp36rmax

Finally!


----------



## dagget3450

Quote:


> Originally Posted by *hyp36rmax*
> 
> Finally!


Good luck and congrats!

I still don't understand why so many are water cooling these Fury cards given they overclock about as well as dirt. I guess I can see where the AIO isn't as attractive as a full-cover water block. I guess it's all aesthetics?


----------



## pdasterly

Quote:


> Originally Posted by *dagget3450*
> 
> Good luck and congrats!
> 
> I still don't understand why so many are water cooling these Fury cards given they overclock about as well as dirt. I guess I can see where the AIO isn't as attractive as a full-cover water block. I guess it's all aesthetics?


AMD was supposed to offer the Fury X with a waterblock, like the Hydro Copper.


----------



## hyp36rmax

Quote:


> Originally Posted by *dagget3450*
> 
> Good luck and congrats!
> 
> I still don't understand why so many are water cooling these Fury cards given they overclock about as well as dirt. I guess I can see where the AIO isn't as attractive as a full-cover water block. I guess it's all aesthetics?


Thanks... Some people add aftermarket water blocks to bring these GPUs into their existing loops and cool them a little more efficiently, all in a mATX chassis. It's not always about overclocking them. I just happen to have a 4K monitor and TV at home that I want to drive.


----------



## NBrock

Just be real careful when removing the stock thermal paste. You don't want to mess up the interposer.


----------



## nyk20z3

Quote:


> Originally Posted by *NBrock*
> 
> Just be real careful when removing the stock thermal paste. You don't want to mess up the interposer.


What exactly is this interposer? I was gentle removing the excess TIM on my Asus Nano, but I don't recall anyone ever mentioning this about Nvidia cards. I can't imagine I could damage a component while removing TIM, and if so, that's a bad design by the manufacturer.


----------



## Tgrove

Quote:


> Originally Posted by *nyk20z3*
> 
> What exactly is this interposer? I was gentle removing the excess TIM on my Asus Nano, but I don't recall anyone ever mentioning this about Nvidia cards. I can't imagine I could damage a component while removing TIM, and if so, that's a bad design by the manufacturer.


I believe it's the orangeish/goldish part at the sides of where you put the TIM. I read somewhere that a guy bricked his card because he cleaned the interposer.


----------



## Forceman

Quote:


> Originally Posted by *nyk20z3*
> 
> What exactly is this interposer? I was gentle removing the excess TIM on my Asus Nano, but I don't recall anyone ever mentioning this about Nvidia cards. I can't imagine I could damage a component while removing TIM, and if so, that's a bad design by the manufacturer.


It is the shiny gold thing that sits between the PCB and the GPU die and HBM stacks (and what they are mounted on). The Fury is the first card to use one, so that's why you haven't seen it before.


----------



## looncraz

Quote:


> Originally Posted by *NBrock*
> 
> Just be real careful when removing the stock thermal paste. You don't want to mess up the interposer.


The best way would probably be to soak it in 91% isopropyl alcohol for a few minutes (you'll have to keep adding a little as it evaporates), then use compressed air (15 psi max, such as from canned air) to blow much of it away from the chips. Then just wipe off the tops of the dies, repeating until any compound is at least 0.5mm below the top of the dies, so new applications won't be disturbed during mounting.

I've used this method before to remove compound from areas where I could not use a brush safely.


----------



## looncraz

Quote:


> Originally Posted by *Willius*
> 
> Received my XFX Nano yesterday; the card looks awesome. But it sounds like someone is welding when it's under load. Coil whine. I'm gonna stress it a bit tonight to see if I can reduce it.


Can you try setting Frame Rate Target Control to 144Hz to see if that gets rid of it? I'm curious if it'll be that easy.


----------



## hyp36rmax

Quote:


> Originally Posted by *looncraz*
> 
> The best way would probably be to soak it in 91% isopropyl alcohol for a few minutes (you'll have to keep adding a little as it evaporates), then use compressed air (15 psi max, such as from canned air) to blow much of it away from the chips. Then just wipe off the tops of the dies, repeating until any compound is at least 0.5mm below the top of the dies, so new applications won't be disturbed during mounting.
> 
> I've used this method before to remove compound from areas where I could not use a brush safely.


Thanks for the tips


----------



## Scottland

Quote:


> Originally Posted by *Laquel*
> 
> Well it's 1900rpm but still yeah. I think the stock fan was quite unbearable at >2500 or so and it was there often. I didn't do a mod log but I do have a couple more pics
> 
> 
> Spoiler: Pic1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pic2
> 
> 
> 
> 
> 
> 
> 
> I used the cardboard from the package to make the shroud and an adapter from gelid to attach the fan. The fan cable is tucked between the card and my mobo since it's quite long. The temps aren't that much better but I've raised the temp and power limit anyways so it's almost always between 950 and 1000mhz while gaming. And of course it keeps those temps much quieter than the stock fan.


Hi,

Do you reckon you could shoehorn 2 92mm fans or 1 120mm fan on there?


----------



## bluezone

Quote:


> Originally Posted by *Willius*
> 
> Received my XFX Nano yesterday; the card looks awesome. But it sounds like someone is welding when it's under load. Coil whine. I'm gonna stress it a bit tonight to see if I can reduce it.


Make sure it's your card, and not some other system component, that has the coil whine, e.g. the power supply.


----------



## bluezone

Quote:


> Originally Posted by *hyp36rmax*
> 
> Finally!


Awesome!!


----------



## fjordiales

@dagget3450
@bluezone
@SuperZan

I got the 2nd card to work as primary. Thanks for the tips. For some reason, my mobo (Maximus VI Formula) doesn't have the option to pick which PCIe slot is primary. Well, I saw on the ROG forums that it doesn't have that option. A bummer, but it's probably because it's a Z87 mobo. Anyway, I shut off the PC, unplugged the power cable from the 1st card, and plugged HDMI into the 2nd card.

Booted from there and it read it as primary. Shut off, plugged the power cables back into the 1st card; no signal at first, then it went straight to Windows, which is expected. Reset Crimson, activated xfire, restarted, and it works! Thanks for the help. Still doing more research about the mobo not having the primary PCIe option...


----------



## SuperZan

Very nice. How curious though that the M VI doesn't have the option. Glad you're able to get it to work through display switching at least.


----------



## fjordiales

Quote:


> Originally Posted by *SuperZan*
> 
> Very nice. How curious though that the M VI doesn't have the option. Glad you're able to get it to work through display switching at least.


Unfortunately I went back to the top card after the bios flash. I had to repeat the process every time I do a flash, and it's inconvenient. It's a bummer, since I googled "maximus vi primary pcie" and it pointed me to the ROG forums: it's auto PCIe on the top card unless not occupied. What Sabertooth do you have? I think on my next build I will go MSI mobo, Sapphire GPU.

My wife's old and current mobos are MSI and they have great support. I called them about an issue; they put me on hold, then verified my shipping address and sent a replacement part. I even asked if they sell the decal and they gave it for free. The Sapphire R9 Fury, either Nitro or Tri-X, has been phenomenal as far as cooling, especially performance to noise, and they have dual bios. Seems like even though the Strix PCB is custom, it's all looks.


----------



## SuperZan

Quote:


> Originally Posted by *fjordiales*
> 
> Unfortunately I went back to the top card after the bios flash. I had to repeat the process every time I do a flash, and it's inconvenient. It's a bummer, since I googled "maximus vi primary pcie" and it pointed me to the ROG forums: it's auto PCIe on the top card unless not occupied. What Sabertooth do you have? I think on my next build I will go MSI mobo, Sapphire GPU.
> 
> My wife's old and current mobos are MSI and they have great support. I called them about an issue; they put me on hold, then verified my shipping address and sent a replacement part. I even asked if they sell the decal and they gave it for free. The Sapphire R9 Fury, either Nitro or Tri-X, has been phenomenal as far as cooling, especially performance to noise, and they have dual bios. Seems like even though the Strix PCB is custom, it's all looks.


I've been very happy with the XFX triple-dissipation Fury; it keeps quite cool as well. Overall the Fury series has some well-designed coolers from multiple manufacturers. As far as the Sabertooth, it sits in my backup rig with an FX-8320, so it's the 990FX Sabertooth R2.0. It's been a great board: tons of options and great OC support. For my Intel rigs, after some trial and error I've almost always gone with Gigabyte, and I'm extremely pleased with their product quality and customer service, at least for Intel processors. For Haswell, my Z97M-D3H does have PCI-e slot selection as an option.

I've only owned one each of MSI and ASRock, but the MSI is still going strong with a 2600 as a folder (it's a cheaper H61M-P31/W8), so I have no issues with their quality either. Ultimately I've been very happy with ASUS boards on the AMD side, but I've had a lot more success with Gigabyte's Intel offerings. MSI provides good results with multiple chipsets though, so I don't think you can go wrong there.


----------



## Willius

Anyone have experience with water cooling the Nano? I really like the size of it, and it would fit perfectly in my main rig, a Caselabs Mercury S3, swapping its 970 Strix out.
Does it perform better when watercooled?


----------



## battleaxe

Hey guys, we need all of you, if you can, to go over and run the benches shown in the OP of this thread: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd

I'm sure others have mentioned it, but the competition is heating up and it's becoming quite fun over there. Go join in the fun.

Just follow the OP for submission; it's very easy to do.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Hey guys, we need all of you, if you can, to go over and run the benches shown in the OP of this thread: http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd
> 
> I'm sure others have mentioned it, but the competition is heating up and it's becoming quite fun over there. Go join in the fun.
> 
> Just follow the OP for submission; it's very easy to do.


Second that! Come on fury owners, let's get a win for big red!

Really easy to enter!


----------



## dagget3450

Also, note that all the benching software is free; you can use the demo versions if you don't own the 3DMark apps. Second, it's GPU scores only, so even if you're lacking CPU somewhat you can still compete.


----------



## gupsterg

Come on Fiji owners, Red Team needs you! Even with relatively mild OCs, Fiji is rocking in my books.


----------



## buildzoid

I have Vmod components coming Monday. Should be fun to see what a Fury X can do when it isn't being held back.


----------



## gupsterg

*1x special offer*

To whoever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course), I will offer my ROM mod service.

*Only 2 conditions:-*

i) do an entry.

ii) the ROM mod is done after the entry, within my own time constraints (which usually is not a long wait).


----------



## xTesla1856

Quote:


> Originally Posted by *gupsterg*
> 
> *1x special offer*
> 
> To whoever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course), I will offer my ROM mod service.
> 
> *Only 2 conditions:-*
> 
> i) do an entry.
> 
> ii) the ROM mod is done after the entry, within my own time constraints (which usually is not a long wait).


I'd enter, but I'm still waiting for one of my Furies to return from RMA...


----------



## gupsterg

No worries, I'll be happy to help when you're sorted.


----------



## dagget3450

Yeah, the contest opened my eyes to the performance gap I'm getting in Win10 vs Win7. I am finding I am consistently down 10fps or more on average in Windows 10. Now, I may have another issue I've run into: I did a fresh Win7 load and ran updates first, then loaded drivers, etc. My quad fps was almost 100 fps less!!! So now I am wondering if I got a bad load with updates, or if somehow Win7 is getting gimped by updates. So here I am on a 3rd Win7 install, and I will NOT do any updates, to see how performance is. If it works fine, I am going to do updates in small segments and test.

I cannot even begin to understand what's going on, but I am going to find out. My best submissions were on a Win7 Pro disc with SP1 and no other updates. So I am going to match that first and see where to start. I don't know if it's the AMD drivers or the OS causing such fps variations, but it's ridiculous.

my proof below
Quote:


> Alright, I know I'm not going to throw much data out here to support this, but I also don't want to hijack this thread with my issue. On the other hand, it may be relevant for people using AMD GPUs (Fiji mainly?) in Windows 10 vs Win8/7.
> 
> I get really bad scores in Firestrike in Win8, almost comparable to Win10; Win7 on the other hand performs absurdly better. I may be wrong about Win8 vs Win7, and I don't have much to compare in data form, such as logs. That said, here are some Heaven benches to back up my Win10 vs 7/8 claim. If I can get some time, I can probably dig up some old Firestrike runs to compare Win7/Win10.
> 
> Win7, same hardware except on DDR4 2666 and CPU @ 4.5
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> Win8, same hardware except DDR4 3200 and CPU @ 4.6
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> Win10, same hardware and settings exactly as the Win8 run above
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> What I suspect is that 16.3/16.3.1 Crimson, and most likely previous versions of Crimson, is not applying global settings when it comes to frame pacing and/or the power efficiency toggle. This is just a guess though. Right now, for me, Windows 10 blows absolute chunks. I will be formatting and going back to Windows 7, and leave Win10 on my spare drive for now.
> 
> I currently have 2 Windows 10 installations and both perform equally badly; one is about 6 months old, and the other is an upgrade from Win8 I just did a couple of hours ago.
> 
> For anyone on mobile who can't read the images, a summary:
> 
> win7 245 fps avg
> win8 247.2 fps avg
> win10 237 fps avg
> 
> i7 5960X / 4x Fury X @ 1100/500

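Taking those averages at face value, the relative deltas work out like this (a throwaway sketch; the fps numbers are the ones quoted above, nothing else is assumed):

```python
# Heaven averages quoted above (fps), per OS.
results = {"win7": 245.0, "win8": 247.2, "win10": 237.0}

# Percentage change of each OS relative to the Win7 baseline.
baseline = results["win7"]
deltas = {name: round((fps - baseline) / baseline * 100, 1)
          for name, fps in results.items()}

print(deltas)  # Win10 lands at roughly -3.3% vs Win7, Win8 at about +0.9%
```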
----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> Yeah, the contest opened my eyes to the performance gap I'm getting in Win10 vs Win7. I am finding I am consistently down 10fps or more on average in Windows 10. Now, I may have another issue I've run into: I did a fresh Win7 load and ran updates first, then loaded drivers, etc. My quad fps was almost 100 fps less!!! So now I am wondering if I got a bad load with updates, or if somehow Win7 is getting gimped by updates. So here I am on a 3rd Win7 install, and I will NOT do any updates, to see how performance is. If it works fine, I am going to do updates in small segments and test.
> 
> I cannot even begin to understand what's going on, but I am going to find out. My best submissions were on a Win7 Pro disc with SP1 and no other updates. So I am going to match that first and see where to start. I don't know if it's the AMD drivers or the OS causing such fps variations, but it's ridiculous.
> 
> my proof below


Apparently Win8 is best for 3D benchmarks.

I did a test of Win10 vs 7, and 7 was getting me the exact same scores in 3-way, so I just stuck with 7. 10 is a pain to work with.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Apparently Win8 is best for 3D benchmarks.
> 
> I did a test of Win10 vs 7, and 7 was getting me the exact same scores in 3-way, so I just stuck with 7. 10 is a pain to work with.


Do you know by chance what drivers you tested both os with?


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> Do you know by chance what drivers you tested both os with?


16.1B IIRC


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> 16.1B IIRC


Welp, on Win7 SP1 right off the disc without updates, using 16.3, I'm back to good FS scores. I don't know why it's like this for me; I have been using 16.3/16.3.1 and the same hardware. Anyways, I'll get back to benching and hope for some new high scores lol. I will just do a backup of this Windows, then try updates and see if something gets nerfed.


----------



## kittysox

Just ordered a Sapphire Nano as my triumphant return to team red. It's replacing the worst video card I've ever owned (GTX 770), and I am wondering how badly my current Ivy Bridge i5 3570K will hold it back? I'm really not wanting to do a full rebuild until Zen is released, but if need be I will.


----------



## bluezone

Quote:


> Originally Posted by *kittysox*
> 
> Just ordered a Sapphire Nano as my triumphant return to team red. It's replacing the worst video card I've ever owned (GTX 770), and I am wondering how badly my current Ivy Bridge i5 3570K will hold it back? I'm really not wanting to do a full rebuild until Zen is released, but if need be I will.


This is a Sandy Bridge i7; does that look held back? (The i5 3570K has better IPC per core.)

http://www.3dmark.com/3dm11/11042507

If I have time I'll do a comparison on a similar rig I have with I5 3570k.


----------



## kittysox

Quote:


> Originally Posted by *bluezone*
> 
> This is a Sandy Bridge i7; does that look held back? (The i5 3570K has better IPC per core.)
> 
> http://www.3dmark.com/3dm11/11042507
> 
> If I have time I'll do a comparison on a similar rig I have with I5 3570k.


Awesome, ty, that is exactly what I wanted to hear. By Tuesday I should have the Nano and my SFX power supply to shove into my new Lian Li PC-O5SX. I am beyond excited.


----------



## SuperZan

Quote:


> Originally Posted by *kittysox*
> 
> Awesome, ty, that is exactly what I wanted to hear. By Tuesday I should have the Nano and my SFX power supply to shove into my new Lian Li PC-O5SX. I am beyond excited.


Yeh, you're golden. 3770k to 6700k was a vanity improvement. My CPU scores are a bit better in benchmarks but gaming-wise it's not much of an FPS increase or frame-time improvement for my use-case. Certainly not a £270 difference; the fact that I was able to turn it into a £55 deal after selling the 3770k was really the only reason I did upgrade.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> *1x special offer*
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Whomever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course
> 
> 
> 
> 
> 
> 
> 
> ) I will offer ROM mod service.
> 
> *Only 2 conditions:-*
> 
> i) do a entry .
> 
> ii) ROM mod done after entry, within my own time constraints (which usually is not long wait
> 
> 
> 
> 
> 
> 
> 
> ) .


I already have two entries. Can I has service for my Fury Tri-X?


----------



## gupsterg

Happy to help, but please do the Vantage X2 class plus all tests in the X1 class; it will boost our scores. I wanna see green team crushed!

Hoping to get my "ghetto" Fiji CF setup done in a day or so to add X2 subs.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Happy to help, but please do the Vantage X2 class plus all tests in the X1 class; it will boost our scores. I wanna see green team crushed!
> 
> Hoping to get my "ghetto" Fiji CF setup done in a day or so to add X2 subs.


I'll do Vantage as soon as I can get it to work. It keeps instantly hard-locking when I try to open it.


----------



## gupsterg

Are you on Win 10? I'm on Win 7 Pro x64.

If you do the X1 class, that will add 100K points AFAIK.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Are you on Win 10? I'm on Win 7 Pro x64.
> 
> If you do the X1 class, that will add 100K points AFAIK.


No, I have a dual boot: 10 for gaming and everyday use, and 7 for benching. I'll do it all.


----------



## gupsterg

Cheers, I will do whatever I know so far to help with the bios mod for you.


----------



## Wagnelles

Do we have a release date for the Radeon Pro Duo?


----------



## xTesla1856

We're leading by 160k points; the Titan X owners' club is fuming.


----------



## dagget3450

It only takes a few subs to cancel that lead out; also, we've got a whole week before it's over.


----------



## Semel

Quote:


> Priorities in my real life have changed, cannot give any ETA. (c) Unwinder


It's about the unofficial OCing mode without PowerPlay support in AB. I guess we won't see it any time soon. It's a pity... I bet this mode would fix the performance loss when increasing VDDC voltage.

What I don't understand, though: it's the official MSI tool, it's on their site. Why do they rely on only one man to develop it? Couldn't they get him some help or something? I dunno...


----------



## dagget3450

Quote:


> Originally Posted by *Semel*
> 
> It's about the unofficial OCing mode without PowerPlay support in AB. I guess we won't see it any time soon. It's a pity... I bet this mode would fix the performance loss when increasing VDDC voltage.
> 
> What I don't understand, though: it's the official MSI tool, it's on their site. Why do they rely on only one man to develop it? Couldn't they get him some help or something? I dunno...


Honestly, it would be nice if AMD had a way to do this instead of the god-awful OverDrive. Sapphire TriXX doesn't control fan speed for me on the newest Crimsons. Are these guys having to reverse engineer their utilities? Because it seems like it's more complicated than it should be. I don't know, I lack the brain power to understand this on an intimate level.


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> Honestly, it would be nice if AMD had a way to do this instead of the god-awful OverDrive. Sapphire TriXX doesn't control fan speed for me on the newest Crimsons. Are these guys having to reverse engineer their utilities? Because it seems like it's more complicated than it should be. I don't know, I lack the brain power to understand this on an intimate level.


Are you using version 5.2.1 of TRIXX?


----------



## dagget3450

Quote:


> Originally Posted by *bluezone*
> 
> Are you using version 5.2.1 of TRIXX?


That is correct v5.2.1


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> That is correct v5.2.1


Ok, that's not the problem. Did you hit reset and uninstall, or reset and deactivate, before installing the new Crimson driver?
If not, the TriXX utility may not have latched on to the hooks in the Crimson driver. Try a reset of TriXX, then uninstall and reinstall.

EDIT: It might be a good idea to delete any installed TriXX-related program files on your C: drive.
If you don't know how, I'll walk you through it. Easy peasy.


----------



## diggiddi

Quote:


> Originally Posted by *gupsterg*
> 
> Are you on Win 10? I'm on Win 7 Pro x64.
> 
> If you do X1 Class that will add 100K point AFAIK
> 
> 
> 
> 
> 
> 
> 
> .


YHPM


----------



## dagget3450

Quote:


> Originally Posted by *bluezone*
> 
> Ok, that's not the problem. Did you hit reset and uninstall, or reset and deactivate, before installing the new Crimson driver?
> If not, the TriXX utility may not have latched on to the hooks in the Crimson driver. Try a reset of TriXX, then uninstall and reinstall.
> 
> EDIT: It might be a good idea to delete any installed TriXX-related program files on your C: drive.
> If you don't know how, I'll walk you through it. Easy peasy.


I'm on a fresh install of Win7 Pro 64, Crimson 16.3 in this case, but it was also the same result on a fresh Win10 install, and on a previous Win7 install that was also fresh, on 16.3.1 I think.

Not sure if that matters.

Edit: OverDrive is not activated, and the issue is when I try to use fixed or custom. It just stays on auto. One thing I didn't try was to set it to fixed, and then reboot and test?


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> I'm on a fresh install of Win7 Pro 64, Crimson 16.3 in this case, but it was also the same result on a fresh Win10 install, and on a previous Win7 install that was also fresh, on 16.3.1 I think.
> 
> Not sure if that matters.
> 
> Edit: OverDrive is not activated, and the issue is when I try to use fixed or custom. It just stays on auto. One thing I didn't try was to set it to fixed, and then reboot and test?


Try activating OverDrive first.
You could still end up with corrupted DLLs in the TriXX package. But now I am wondering if you could run the System File Checker to check for operating-system file corruption.


----------



## bluezone

Crimson 16.3.2 is supposed to be incoming today. I tried a link, but it came up page not found.


----------



## Dirgeth

Hey guys!

Has anyone tested the results of exchanging the thermal paste on an R9 Nano? For MX-4 or CLU... just on the stock cooler.
Ty!


----------



## nyk20z3

Finally got the block mounted in the case and the loop filled, but of course I run into more issues. The block came with the jetplate not centered from the factory, so I contacted EK and they said I can open the block and re-center it. I did so and then tightened the block back down as tight as I could, but now it looks like coolant is escaping the chambers somehow. There are no leaks since the o-ring is secure, but this doesn't look normal to me.


----------



## Dirgeth

So when you have water cooling on the Nano, do you get a stable 1000MHz boost with PT +50%?
And a stable boost after OC?
Temp of the card?


----------



## Radox-0

Quote:


> Originally Posted by *Dirgeth*
> 
> So when you have water cooling on the Nano, do you get a stable 1000MHz boost with PT +50%?
> And a stable boost after OC?
> Temp of the card?


Under water there is no throttling; it can maintain a smooth 1000 no issue. The couple of samples I have under water all maintain 1100MHz no issue as well, once you remove the thermal part from the equation.


----------



## Dirgeth

So do you think exchanging the thermal paste for MX-4, or maybe CLU, helps a little on the stock cooler?
If you have it with the stock cooler on air, every little change helps with throttling.


----------



## bluezone

Here's the link for the new driver. Mainly VR additions from the look of it.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16-3-2.aspx

Hmmmm... no download from that page, just release notes.

Try this instead:

http://support.amd.com/en-us/download/desktop?os=Windows%207%20-%2064


----------



## Radox-0

Quote:


> Originally Posted by *Dirgeth*
> 
> So do you think exchanging the thermal paste for MX-4, or maybe CLU, helps a little on the stock cooler?
> If you have it with the stock cooler on air, every little change helps with throttling.


I repasted my card with Thermal Grizzly Kryonaut out of curiosity; it helped shave about 3 degrees for a given fan profile. It may help a bit, but it will not overcome the throttling. On air, only upping the fan profile and the power limit will help in that regard. Having said that, the fan runs fairly quiet even then, so it's worth it if you want to maintain 1000MHz.


----------



## bluezone

Ok, the new drivers offer a very small uptick in performance. What's nice about them is that I have a larger range for undervolting (I can undervolt at 1125 now): I can undervolt to a lower setpoint, and 600MHz on memory is no longer crashing, though it's still not stable (artifacts). This is a step in the right direction.

A 1200 clock on the core at a +0.048 offset voltage seems to run without crashing too. I don't have the thermal headroom to confirm full stability though.


----------



## bluezone

Quote:


> Originally Posted by *kittysox*
> 
> Just ordered a sapphire nano as my triumphant return to team red. It's replacing the worst video card I've ever owned (gtx770) and I am wondering how badly my current ivy bridge i5 3570k will hold it back? I'm really not wanting to do a full rebuild until zen core is released but if needs be I will.


As promised, here's my max run on the other PC. 3570K, no tessellation though. It put me in 2nd place for team red in 3DMark 11 in Red vs Green.

http://www.3dmark.com/3dm11/11119474


----------



## kittysox

Mine's all going together tonight. Hopefully I can get some benchmarks run tomorrow.


----------



## Dirgeth

Do you know if any kind of custom cooler exists for the Nano, on air or AIO? (No watercooling system.)


----------



## xkm1948

Question: If I turn off tessellation in AMD settings, will the results still be valid? I got a huge jump in Firestrike Extreme, from 7500 to 8800, with tessellation off.


----------



## xkm1948

Quote:


> Originally Posted by *Dirgeth*
> 
> Do you know if any kind of custom cooler exists for the Nano, on air or AIO? (No watercooling system.)


Strap a bigger fan on it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *xkm1948*
> 
> Question: If I turn off tessellation in AMD settings, will the results still be valid? I got a huge jump in Firestrike Extreme, from 7500 to 8800, with tessellation off.


They are viable as a "no tess" run, and you can compare your no-tess results to others, but they aren't really "valid" FireStrike runs.

I run no tess for bench scores sometimes, like now for example in the fanboy comp; for daily benching and testing, I leave it on for apples-to-apples against other cards...


----------



## Alastair

Guys, why is my GPU usage doing this? I am getting stuttering all over the place. I don't feel like I am getting the performance I deserve from two Furys at the moment. It happens in a few games, but this particular image was taken from Skyrim. CPU usage is not hitting 100% on any of the cores.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> Guys, why is my GPU usage doing this? I am getting stuttering all over the place. I don't feel like I am getting the performance I deserve from two Furys at the moment. It happens in a few games, but this particular image was taken from Skyrim. CPU usage is not hitting 100% on any of the cores.


AB goes wonky sometimes on these. It's likely AB.

Set the refresh interval on AB to a couple of seconds and see if it persists.


----------



## Alastair

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys, why is my GPU usage doing this? I am getting stuttering all over the place. I don't feel like I am getting the performance I deserve from two Furys at the moment. It happens in a few games, but this particular image was taken from Skyrim. CPU usage is not hitting 100% on any of the cores.
> 
> 
> 
> 
> AB goes wonky sometimes on these. It's likely AB.

AB, NZXT CAM and HWiNFO are all wonky? Besides, the stuttering I seem to be eating suggests otherwise.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> AB, NZXT CAM and HWiNFO are all wonky? Besides, the stuttering I seem to be eating suggests otherwise.


Try making the refresh interval in AB longer; this helped on mine. Set any other monitoring software the same way, so it has more time between refreshes. Mine is set to 5000 ms.


----------



## ronaldoz

Hi! Helping out over here would be really awesome! Just to be sure things will be fine: use 3 x CPU-Z windows (CPU, Mainboard, Memory) and 1 x GPU-Z.









http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


----------



## kittysox

Finally got my parts in. Micro Center ran the Skylake i7 on sale this week for $299, so I went ahead and took the plunge.




Hope to get some benchmarks run tomorrow afternoon.


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> AB, NZXT CAM and HWiNFO are all wonky? Besides, the stuttering I seem to be eating suggests otherwise.


This looks like the issues I had with certain Crimson drivers. Which drivers are you using?
I would suggest either something old like 15.7 or the last Catalyst drivers before Crimson, or maybe try 16.3 or later.


----------



## xkm1948

Quote:


> Originally Posted by *ronaldoz*
> 
> Hi! Helping out over here would be really awesome! Just to be sure things will be fine: use 3 x CPU-Z windows (CPU, Mainboard, Memory) and 1 x GPU-Z.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


Just joined the cause.


----------



## BIGTom

Crimson 16.3.2 gives a nice performance increase from what I can tell. Firestrike Graphics score went up 145 points but DX12 gains were even better. HITMAN DX12 increased FPS by 7%.


----------



## JunkaDK

Quote:


> Originally Posted by *BIGTom*
> 
> Crimson 16.3.2 gives a nice performance increase from what I can tell. Firestrike Graphics score went up 145 points but DX12 gains were even better. HITMAN DX12 increased FPS by 7%.


When I launch Hitman in DX12 it insta-crashes.







Running 2 x R9 Fury Strix, stock settings.

Got any tips or tricks I could try? I uninstalled the drivers with DDU before installing the latest Crimson.


----------



## Metalhead79

What's a safe voltage level? I can hit 1150 MHz with +48 mV in MSI Afterburner. Temps stay around 66-68°C at that voltage.


----------



## Maximization

Quote:


> Originally Posted by *BIGTom*
> 
> Crimson 16.3.2 gives a nice performance increase from what I can tell. Firestrike Graphics score went up 145 points but DX12 gains were even better. HITMAN DX12 increased FPS by 7%.


Agreed, nice improvements.


----------



## dagget3450

You Fiji guys pumping volts, can you compare stock volts vs. a voltage increase? I'm finding tampering with volts just ruins scores.

Anyone with 2 or more Furys, can you try the following?

Firestrike runs:
1. Stock clocks and 0 mV
2. Stock clocks, 0 mV and +50% power limit
3. Stock clocks, +25 mV and +50% power limit
4. Stock clocks, +50 mV and +50% power limit
5. Stock clocks, +75 mV and +50% power limit

If that's too much, at least run 1, 2 and 5; I would like to see the differences in scores. For me anything over 0 mV is a decrease in FPS.
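For anyone collecting those runs, here is a throwaway sketch for tabulating the scores. The run labels, the baseline-delta framing and all the numbers are mine as placeholders, not real results:

```python
# Compare Firestrike graphics scores for the suggested runs against run 1.
def deltas(scores):
    """Return (label, score, delta vs. the first completed run) tuples."""
    done = [(label, s) for label, s in scores.items() if s is not None]
    baseline = done[0][1]
    return [(label, s, s - baseline) for label, s in done]

runs = {
    "1: stock, 0mV": 17200,            # placeholder scores, fill in your own
    "2: stock, 0mV, +50% PL": 17350,
    "5: stock, +75mV, +50% PL": 16600,
}

for label, score, delta in deltas(runs):
    print(f"{label}: {score} ({delta:+d} vs run 1)")
```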


----------



## Radox-0

Quote:


> Originally Posted by *dagget3450*
> 
> You Fiji guys pumping volts, can you compare stock volts vs. a voltage increase? I'm finding tampering with volts just ruins scores.
> 
> Anyone with 2 or more Furys, can you try the following?
> 
> Firestrike runs:
> 1. Stock clocks and 0 mV
> 2. Stock clocks, 0 mV and +50% power limit
> 3. Stock clocks, +25 mV and +50% power limit
> 4. Stock clocks, +50 mV and +50% power limit
> 5. Stock clocks, +75 mV and +50% power limit
> 
> If that's too much, at least run 1, 2 and 5; I would like to see the differences in scores. For me anything over 0 mV is a decrease in FPS.


Not an X but a Nano: my FPS / Firestrike scores decrease with anything over +12 mV; even when it allows me to stabilise a higher clock, overall performance decreases. The sweet spot is 1130 MHz and +12 mV. Adding some more mV allows me to stabilise up to 1150, but performance takes a hit.


----------



## dagget3450

Quote:


> Originally Posted by *Radox-0*
> 
> Not an X but a Nano: my FPS / Firestrike scores decrease with anything over +12 mV; even when it allows me to stabilise a higher clock, overall performance decreases. The sweet spot is 1130 MHz and +12 mV. Adding some more mV allows me to stabilise up to 1150, but performance takes a hit.


My reason for asking this is more sinister than just numbers. I honestly believe that Fury only ever operates on stock voltage and the voltage adjustments are a sham. I am thinking that, whether through drivers or hardware, AMD has it locked down. I would love to be proven wrong.

I want to see someone show proof that their voltage-adjusted overclock beats:

your voltage overclock vs. +50 MHz core, +50 MHz memory, 0 mV, +50% power limit.

Would be nice to have screenshots and links for the runs.


----------



## Radox-0

Quote:


> Originally Posted by *dagget3450*
> 
> My reason for asking this is more sinister than just numbers. I honestly believe that Fury only ever operates on stock voltage and the voltage adjustments are a sham. I am thinking that, whether through drivers or hardware, AMD has it locked down. I would love to be proven wrong.
> 
> I want to see someone show proof that their voltage-adjusted overclock beats:
> 
> your voltage overclock vs. +50 MHz core, +50 MHz memory, 0 mV, +50% power limit.
> 
> Would be nice to have screenshots and links for the runs.


Will get some screenies once I have my monitor back. These are the typical settings I use to stabilise things:

1075 MHz / +50% power target / -12 mV
1085 MHz / +50% power target / -6 mV
1100 MHz / +50% power target / +0 mV
1110 MHz / +50% power target / +6 mV
1120 MHz / +50% power target / +12 mV
1130 MHz / +50% power target / +24 mV
1140 MHz / +50% power target / +48 mV

Here is an example of some of my runs, showing the best performance with the core clock at around the 1115 mark and memory at 500 (it shows 508 / 505, but Fiji only overclocks in discrete steps of 500 / 545 / 600 / 666 etc). It also shows a run at 1130 / 550, but the graphics score is nearly 1000 lower, as I needed about +24 mV to stabilise it, whereas the lower runs were on stock mV or +6.

http://www.3dmark.com/compare/fs/7317452/fs/7127963/fs/7128062/fs/7133926#


----------



## dagget3450

Quote:


> Originally Posted by *Radox-0*
> 
> Will get some screenies once I have my monitor back. These are the typical settings I use to stabilise things:
> 
> 1075 MHz / +50% power target / -12 mV
> 1085 MHz / +50% power target / -6 mV
> 1100 MHz / +50% power target / +0 mV
> 1110 MHz / +50% power target / +6 mV
> 1120 MHz / +50% power target / +12 mV
> 1130 MHz / +50% power target / +24 mV
> 1140 MHz / +50% power target / +48 mV
> 
> Here is an example of some of my runs, showing the best performance with the core clock at around the 1115 mark and memory at 500 (it shows 508 / 505, but Fiji only overclocks in discrete steps of 500 / 545 / 600 / 666 etc). It also shows a run at 1130 / 550, but the graphics score is nearly 1000 lower, as I needed about +24 mV to stabilise it, whereas the lower runs were on stock mV or +6.
> 
> http://www.3dmark.com/compare/fs/7317452/fs/7127963/fs/7128062/fs/7133926#


Thank you, +rep. So this is a Nano, correct? My optimal performance is 1100 / 500-570, 0 mV, +50% PL.
Even if I add 6 mV it affects one of the GPU tests negatively. I can reproduce this in MSI AB or TRIXX.
It's possible that for me CF is somehow bugged and the additional cards aren't getting the settings passed on. I don't know how or why, but once I leave 0 mV it's bad news in Firestrike. I will try these tests on a single GPU tonight. It would be awesome if we had some crossfire results for these types of tests.


----------



## Radox-0

Quote:


> Originally Posted by *dagget3450*
> 
> Thank you, +rep. So this is a Nano, correct? My optimal performance is 1100 / 500-570, 0 mV, +50% PL.
> Even if I add 6 mV it affects one of the GPU tests negatively. I can reproduce this in MSI AB or TRIXX.
> It's possible that for me CF is somehow bugged and the additional cards aren't getting the settings passed on. I don't know how or why, but once I leave 0 mV it's bad news in Firestrike. I will try these tests on a single GPU tonight. It would be awesome if we had some crossfire results for these types of tests.


Yep, indeed a Nano. Yes, something similar for myself: Tri-X and MSI AB produced a similar effect. It will be interesting how your single-card runs go.

I note in your earlier comment you increased the mV in increments of 25. It may be worthwhile trying smaller increments, if you have not already, to see how +6 / +12 etc. behave.


----------



## gupsterg

Quote:


> Originally Posted by *Radox-0*
> 
> Fiji only overclocks in discrete steps of 500 / 545 / 600 / 666 etc


A few times people have posted this info in the Fiji bios mod thread, but I don't see this in my bench compares.

I see that on my card 535MHz is stable but 540 or 545 isn't.

On another card I have, 525MHz is stable but 530 or 535 isn't.

In both cases the increased RAM clock benches faster than 500MHz. If either of those clocks were locking to a discrete step, why is the bench faster? And why does going a few MHz higher cause instability?

Next RAM timings / straps in ROM.



There are RAM straps at 100MHz, 400MHz, 500MHz & 600MHz.

The 400MHz strap has tighter timings than the 500MHz one; 500MHz & 600MHz have the same timings.

So why are there no timing straps matching those steps, ie 500 / 545 / 600 / 666?

Applying how straps worked on Hawaii, we have:

The 100MHz strap works for RAM frequencies up to 100MHz; a 101MHz to 400MHz RAM clock would engage the 400MHz strap timings. When we set a 401MHz RAM clock it would engage the 500MHz strap timings, and as 500MHz & 600MHz have the same timings the range = 401MHz to 600MHz.

Then on Hawaii, when you went past the last strap in RAM frequency, the last strap's timings were used, ie anything over 600MHz would use the 600MHz strap.

@The Stilt, this information about discrete stepping of the HBM clock posted by AMD Matt, is it true? Cheers.
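The Hawaii-style strap selection described above can be sketched as a simple lookup. This is a sketch of the logic as described in the post, not AMD's actual firmware; the strap values are the ones listed above:

```python
# RAM straps present in the Fiji ROM per the post above. A strap's timings
# apply to clocks above the previous strap, up to the strap's own frequency;
# past the last strap, the last strap's timings keep being used (Hawaii rule).
STRAPS_MHZ = [100, 400, 500, 600]

def strap_for(freq_mhz: int) -> int:
    """Return the strap whose timings a given RAM clock would engage."""
    for strap in STRAPS_MHZ:
        if freq_mhz <= strap:
            return strap
    return STRAPS_MHZ[-1]

# Per the post, the 500 and 600 straps carry identical timings, so 401-600MHz
# is effectively one timing range even though two straps cover it.
```

Under this reading there is nothing that would snap the clock itself to 545 or 666; only the timing set changes at the strap boundaries.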


----------



## Dirgeth

1070/560MHz + 0mV +50PT


----------



## BIGTom

Quote:


> Originally Posted by *JunkaDK*
> 
> When I launch Hitman in DX12 it insta-crashes.
> 
> Running 2 x R9 Fury Strix, stock settings.
> 
> Got any tips or tricks I could try? I uninstalled the drivers with DDU before installing the latest Crimson.


I had a crash in DX12 on launch day. I have Shader Cache on AMD Optimized and disabled Render Target Reuse in HITMAN's settings.
Not sure this will help, but I do hope so. DX12 in HITMAN is so good on Fury. Good luck.


----------



## Radox-0

Quote:


> Originally Posted by *gupsterg*
> 
> A few times people have posted this info in the Fiji bios mod thread, but I don't see this in my bench compares.
> 
> I see that on my card 535MHz is stable but 540 or 545 isn't.
> 
> On another card I have, 525MHz is stable but 530 or 535 isn't.
> 
> In both cases the increased RAM clock benches faster than 500MHz. If either of those clocks were locking to a discrete step, why is the bench faster? And why does going a few MHz higher cause instability?
> 
> Next RAM timings / straps in ROM.
> 
> There are RAM straps at 100MHz, 400MHz, 500MHz & 600MHz.
> 
> The 400MHz strap has tighter timings than the 500MHz one; 500MHz & 600MHz have the same timings.
> 
> So why are there no timing straps matching those steps, ie 500 / 545 / 600 / 666?
> 
> Applying how straps worked on Hawaii, we have:
> 
> The 100MHz strap works for RAM frequencies up to 100MHz; a 101MHz to 400MHz RAM clock would engage the 400MHz strap timings. When we set a 401MHz RAM clock it would engage the 500MHz strap timings, and as 500MHz & 600MHz have the same timings the range = 401MHz to 600MHz.
> 
> Then on Hawaii, when you went past the last strap in RAM frequency, the last strap's timings were used, ie anything over 600MHz would use the 600MHz strap.
> 
> @The Stilt, this information about discrete stepping of the HBM clock posted by AMD Matt, is it true? Cheers.


I usually find up to 575 is stable but produces similar results to 545 in games, which is really within margin of error. Sadly, none of my Fiji-based cards have been stable at 600 MHz, which would let me test properly; really I should test the memory on its own.

In regards to AMD Matt's comments: they were posted on another forum, and I initially posted them here as it was an interesting point and to see what others found. I did follow up with a question on said forums about the readings and was informed that OSDs are simply reporting back false information and Fiji memory will round accordingly. Here is the initial post for reference: https://forums.overclockers.co.uk/showthread.php?t=18678073&highlight=username_AMDMatt&page=233 Post # 6977, and the follow-up about the false OSD readings: https://forums.overclockers.co.uk/showthread.php?t=18678073&highlight=username_AMDMatt&page=234 Post # 6996

I would love to try and test it myself, but alas I am not tuned in to the behind-the-scenes info, and my cards do not seem to OC well enough on the memory to let me see a pattern.

Can't comment on the issue of it not working at 535 as I am not sure how it rounds; my theory is that if 535 rounds to 545, which is not stable for the card, it will just cause issues. I should make a point of asking how the rounding is calculated on that note.


----------



## gupsterg

Yep, seen the info on OCuk; he also posted it on Guru3D.

Let's take for example the 500 and 545 steps, as my cards don't go any higher (well, they do to a degree but not long-term stable, I mean like 1hr+ of 3DM FS looped or up to 12hrs [email protected]).

If the card were rounding 525 or 535 up to 545, then 545 would be stable on my cards, which it is not.

Then I'd think it must be rounding down to 500, but when 525 or 535 consistently performs better in benches I can't deem it to be rounding down to 500 either.

Thoroughly confused in a way when I look at my data / VRAM_Info in the ROM *vs* AMD Matt's info.
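To frame the argument: if the claimed discrete stepping were real, the natural reading is nearest-step rounding. The step list comes from AMD Matt's quoted claim; the rounding rule itself is an assumption for illustration:

```python
# Claimed discrete HBM steps per AMD Matt's quoted post; rounding rule assumed.
STEPS_MHZ = [500, 545, 600, 666]

def nearest_step(mhz: int) -> int:
    """Round a requested HBM clock to the closest claimed step."""
    return min(STEPS_MHZ, key=lambda step: abs(step - mhz))

# Under this rule both 525 and 535 land on 545, so a card that is stable at
# 535 but unstable at 545 is hard to square with the claim.
```

If the hardware instead always rounded down, 525 and 535 would both behave exactly like 500, which the consistent bench gains also contradict.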


----------



## Radox-0

Yes, thoroughly confusing. Think I will drop him a message and see if I get a response as to how it rounds exactly.


----------



## buildzoid

Well, I tried cap modding one of my cards.

The results are unsurprising and unimpressive. I added 8mF to the output side of the VRM. Stability-wise, I went from the card passing 3DMark at 1160MHz to passing at 1170MHz. This is on stock volts, so maybe it will have more of an impact at a higher core voltage. I'm still trying to figure out how to add caps to the HBM VRM, since I have to keep the caps in places that will let the cards fit in crossfire.

Here are some pics of what the mod looks like:




The soldering is pretty bad because I didn't manage to find a store selling flux and because it's my first time using lead-free solder.


----------



## gupsterg

Quote:


> Originally Posted by *Radox-0*
> 
> Yes, thoroughly confusing. Think I will drop him a message and see if I get a response as to how it rounds exactly.


Now the other thing is this; forget what MSI AB does when we want to OC HBM.

In the ROM's PowerPlay section (this is the main area for modding) there is a thing called OverDrive Limits.

In the stock ROM the RAM OD Limit is 500MHz; if I increase that value to, say, 600MHz, you will start seeing a RAM slider in the OD page of the driver.

The example image above was where I set it to 525MHz, hence you see the slider bar limited to that. The bar increments in 5MHz steps, not 500 / 545 / 600 / 666.

You can see some other images / test results in the OP of the Fiji bios mod thread under the heading *How to edit OverDrive Limits in PowerPlay*.
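The slider behaviour described above (5MHz detents, capped by the PowerPlay OD limit) can be sketched as follows. The clamp-then-snap order and the function name are assumptions for illustration, not the driver's actual code:

```python
def od_slider_value(requested_mhz: int, od_limit_mhz: int = 500,
                    step_mhz: int = 5) -> int:
    """Snap a requested RAM clock to the OverDrive slider's 5MHz steps,
    clamped to the OD limit read from the ROM's PowerPlay table."""
    clamped = min(requested_mhz, od_limit_mhz)
    return (clamped // step_mhz) * step_mhz  # slider moves in 5MHz detents
```

With the limit raised to 525MHz as in the example above, a request for 600MHz would pin the slider at 525, matching the capped bar described.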


----------



## Alastair

So I reckon I have added about 200K to the Red Team for the fanboy comp.
I did X1 and X2 in both Vantage and Firestrike. I would have done 3DMark 11 too, but I do not have enough data cap left at the end of the month to download it.

EDIT: A few weeks back I still had an X2 validation I added for 11, so that one is still in the system!


----------



## xkm1948

Does it matter if I use, say, a "cracked" version of 3DMark 11? I don't want to pay for the CD key just for an entry since I already submitted a Firestrike score.


----------



## dagget3450

Quote:


> Originally Posted by *xkm1948*
> 
> Does it matter if I use, say, a "cracked" version of 3DMark 11? I don't want to pay for the CD key just for an entry since I already submitted a Firestrike score.


You don't need that; the benches are available in the free demo versions. Just make sure you save the HTML link with your run. Use the 3DMark demo versions, my man, and save yourself the trouble.


----------



## BIGTom

Submitted x1 FuryX at 1130/545 for Team Red.


----------



## edmwxyz

Hi all! I just bought the Sapphire Fury Tri-X today and decided to unlock it. cuinfo16 looked bad because one of the Xs was not in the right place, but I still decided to unlock everything to try. Quite surprised that it actually works: all 4096 shaders are working fine (tried War Thunder, 3DMark, FurMark and Unigine Heaven).


----------



## dagget3450

Lucky you


----------



## Maximization

AMD, looks like we are ahead:

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd

Anyone left with some Fury power, let's nail these Titans to the wall.


----------



## bluezone

Do we need to add another card to the page title?

http://www.extremetech.com/computing/225817-amd-announces-new-dual-gpu-firepro-designed-for-hpc


----------



## dagget3450

Passive cooling at 300W, wow.. it doesn't have a fan?


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> Passive cooling at 300W, wow.. it doesn't have a fan?


I'm not sure if it has a fan or not. If everyone here pools their resources, we should be able to come up with the $4999 needed to purchase one of these cards.

I promise when the funds are raised that I will give a full report on its capabilities.


----------



## kittysox

Guys, my 3DMark Vantage crashes every time on the physics test. Any idea what I need to adjust to get it to finish the benchmark?


----------



## Radox-0

Quote:


> Originally Posted by *kittysox*
> 
> Guys, my 3DMark Vantage crashes every time on the physics test. Any idea what I need to adjust to get it to finish the benchmark?


The physics test is CPU-based, so I would imagine lowering any CPU overclock you may have will help.


----------



## MrKoala

Quote:


> Originally Posted by *dagget3450*
> 
> Passive cooling at 300W, wow.. it doesn't have a fan?


It needs a fan, just not inside itself.

It would go into something like this: http://b2b.gigabyte.com/MicroSite/395/images/G190-H44-airflow-image.png

If you want to use one in your desktop, you would have to water cool it or ghetto-rig a cooling setup: https://software.intel.com/sites/default/files/100_4154.JPG

Passive cooling is actually beneficial in this case, as you don't have to fit a fan into the dual slot. All the space within the shell can be used for pipes and fins, and the fan sits outside the PCIe area where space is less constrained.


----------



## SLK

Looks like the Nano throttles in Ashes of the Singularity with DX12 at stock speeds (power limit +50). Can anyone else confirm this?


----------



## Alastair

Quote:


> Originally Posted by *MrKoala*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dagget3450*
> 
> Passive cooling at 300W, wow.. it doesn't have a fan?
> 
> 
> 
> It needs a fan, just not inside itself.
> 
> It would go into something like this: http://b2b.gigabyte.com/MicroSite/395/images/G190-H44-airflow-image.png
> 
> If you want to use one in your desktop, you would have to water cool it or ghetto-rig a cooling setup: https://software.intel.com/sites/default/files/100_4154.JPG
> 
> Passive cooling is actually beneficial in this case, as you don't have to fit a fan into the dual slot. All the space within the shell can be used for pipes and fins, and the fan sits outside the PCIe area where space is less constrained.
Click to expand...

In that second image, what cards are those? Xeon Phi? They aren't GPUs from the looks of it; no PCIe power connectors.


----------



## ogow89

So just got the R9 Nano and here are some benchmarks,

Stock clocks:

Valley:


Spoiler: Warning: Spoiler!







Heaven:


Spoiler: Warning: Spoiler!







Firestrike:


Spoiler: Warning: Spoiler!







Overclocked 1050mhz on the core:

Valley:


Spoiler: Warning: Spoiler!







Heaven:


Spoiler: Warning: Spoiler!







Firestrike:


Spoiler: Warning: Spoiler!







1070MHz on the core crashed in Dying Light, so I just took it as unstable. As far as the memory goes, I haven't really tried to overclock it, since it sits so close to the core that it would just raise the temps, which might cause throttling. Anyway, I'd like to know how my scores compare to a stock Fury X.









So how do they look?

Also, I wanted to note that after every couple of hours of constant use while overclocked to 1050MHz, The Division crashes on me and the screen goes black. So is my overclock unstable for extended hours of use?

I feel like I got a terrible overclocker, seeing that some can go as high as 1100MHz on just +50% power limit.


----------



## Dirgeth

For me, 1055 looks stable on my Nano. What is your voltage under load? Mine is 1.181-1.193V under load in The Division (that's default). Ty!


----------



## ogow89

I will look into that right now. Do you have any issues while the card is overclocked? I would also like to see your Firestrike graphics score, and your Valley and Heaven scores, just to see where I fit.


----------



## bborokee

I posted a reply a couple of pages ago saying how I was contemplating between an R9 390 and a Fury....

Well, now I'm a proud owner of an XFX Fury!

...Just gotta wait for it to arrive on my doorstep. So excited, heh.


----------



## ogow89

So the max VDDC shown by GPU-Z is 1.2375V, while overclocked with +50% power limit. The same number was shown in MSI Afterburner.


----------



## Dirgeth

I only have 1.181-1.193V under load :/

Here is my compare vs. Nano and R9 390 overclocked:


----------



## Flamingo

Quote:


> Originally Posted by *SLK*
> 
> Looks like the Nano throttles in Ashes of the Singularity with DX12 at stock speeds (power limit +50). Can anyone else confirm this?


What resolution, and version of AoTS?


----------



## ogow89

Quote:


> Originally Posted by *Dirgeth*
> 
> I have under-load 1.181-1.193v only :/
> 
> here is my compare vs Nano and R9 390 overclocked


God damn, how were you able to keep the fan that low, and the GPU temp low? Mine hits 80% fan speed to keep it at 70°C max.


----------



## Dirgeth

Idk.. I have a CM Stryker case..

After 3 hours of gameplay in The Division, the max temp on my Nano was 75°C.


----------



## ogow89

Quote:


> Originally Posted by *Dirgeth*
> 
> Idk.. I have a CM Stryker case..
> 
> After 3 hours of gameplay in The Division, the max temp on my Nano was 75°C.


What about the GPU fan speed, how high did it go?


----------



## Dirgeth

You can see it in the vid.. it goes from 70°C to 83°C at 65% fan, so my max is 65% fan speed and I don't get more than 75°C under full load.


----------



## ogow89

Quote:


> Originally Posted by *Dirgeth*
> 
> You can see it in the vid.. it goes from 70°C to 83°C at 65% fan, so my max is 65% fan speed and I don't get more than 75°C under full load.


Does the card ever throttle? How did you overclock it, and did you set up a custom fan profile to allow the fan to go higher, or is it on stock?


----------



## Dirgeth

Just this.. I played The Division for 3 hours and the card didn't throttle.. same for Battlefield 4.


----------



## ogow89

Quote:


> Originally Posted by *Dirgeth*
> 
> 
> 
> Just this.. i play Division for 3 hours and card didnt throttle.. same for Battlefield 4


How did you unlock memory overclocking in MSI Afterburner? The slider won't move for me.


----------



## Dirgeth

Settings: "Extend official overclocking limits".
But you can't change voltage in Afterburner.


----------



## ogow89

My card throttles at 74°C, so you must have done something that allows your card to go higher.


----------



## Dirgeth

Well, I have a 1080p 144Hz screen..
If you have 1440p, that is why you have throttling at 74°C.


----------



## ogow89

Also 1080p, and I play The Division at 1080p maxed-out settings, but in other games I use either 1440p VSR or higher.


----------



## ogow89

Just tested 560 on the memory and it was stable, but no benefit to performance whatsoever, so I'll just leave it at stock.


----------



## Dirgeth

I had to downclock my core to 1055MHz at 1.181V under load.. 1065MHz is not stable in BF4.


----------



## ogow89

Quote:


> Originally Posted by *Dirgeth*
> 
> I had to downclock my core to 1055MHz at 1.181V under load.. 1065MHz is not stable in BF4.


It depends on the game, actually. The Division just gave me 1075/570 stable for 1:30, and Dying Light didn't even launch. So I figured Dying Light is the best game to test stable clocks with; it was also the game I used to see how stable my R9 290 PCS+ was. So ultimately 1050 on the core is the most stable for me, and HBM is useless to overclock as I see it.


----------



## dagget3450

Quote:


> Originally Posted by *ogow89*
> 
> It depends on the game, actually. The Division just gave me 1075/570 stable for 1:30, and Dying Light didn't even launch. So I figured Dying Light is the best game to test stable clocks with; it was also the game I used to see how stable my R9 290 PCS+ was. So ultimately 1050 on the core is the most stable for me, and HBM is useless to overclock as I see it.


Oddly enough, for me I am finding HBM gives different results in different programs. Sometimes I get more FPS and sometimes less, depending on the item I'm testing.


----------



## ogow89

Quote:


> Originally Posted by *dagget3450*
> 
> Oddly enough, for me I am finding HBM gives different results in different programs. Sometimes I get more FPS and sometimes less, depending on the item I'm testing.


Which games give you better performance when overclocking the VRAM?


----------



## SLK

Quote:


> Originally Posted by *Flamingo*
> 
> What resolution, and version of AoTS?


Newest... 1.00.18769.

1440p uncapped frame rate


----------



## gupsterg

Are people finding that if they don't OC the GPU they can clock the RAM higher? I seem to be finding that with my card.


----------



## ogow89

Quote:


> Originally Posted by *gupsterg*
> 
> Are people finding that if they don't OC the GPU they can clock the RAM higher? I seem to be finding that with my card.


If you are talking about GPU core and VRAM, then yes, you can clock the VRAM higher if you leave the core at stock, and it works both ways. If you mean system RAM, then no: I can clock my RAM higher if I downclock my CPU.


----------



## gupsterg

Thanks, yes, I meant GPU clock and HBM clock.


----------



## MAMOLII

I have found a stable OC with The Stilt's BIOS!! But now on the latest Crimson drivers it does not work :doh: no display when loading Windows! And if I try to install Crimson it gets stuck, and again no display! Tried two different Windows installations and had the same problem; flashed back the stock BIOS and the crazy artifacts + RSoD are back.


----------



## Alastair

Guys, please, for the love of god, can someone explain something to me before I break something. Every time I log into Windows, while I am IN THE DESKTOP, my screen goes blank, I get the Windows sound that plays when a new device is detected, and then the screen comes back. Why? And I am not overclocked on my cards at the moment. Even with only 50% fan speed my cards do not break 40°C, yet the driver crashes. I know this because the tacho LEDs on my top card go from 7 down to 1. And then the PC reboots. Why is this happening?


----------



## p4inkill3r

Does it happen with just one card? Try changing display ports?


----------



## Alastair

Quote:


> Originally Posted by *p4inkill3r*
> 
> Does it happen with just one card? Try changing display ports?


Well, I am using crossfire, and it happens in intensive games.

It's the same symptoms I get with an unstable overclock; however, it is happening at stock clocks too. The driver just crashes, then BSOD. This is on the new 16.3.2. And obviously there is this HIGHLY annoying thing of the cards re-initializing after Windows login.


----------



## gupsterg

Not seen this with my cards on 2 differing rigs, but I'm not saying it won't happen to me.

Just to rule out an "iffy" driver, I'd uninstall the driver via the usual Control Panel method, then run DDU to clean the system, and then reinstall the driver.

So far I've used 3 differing Crimson drivers in my ownership, doing the above when updating drivers, and have had no issues.

I have been playing with unlocked SPs on 1 card and modded ROMs on all 3 cards.

I've yet to reinstall the driver at any point, either a) due to an issue or b) to make a ROM mod apply to the cards.


----------



## Alastair

Do you think it has got anything to do with VSR and GPU scaling being turned on?


----------



## gupsterg

Dunno.

I have no issues with or without any combo of those features.

I'm on Win7 Pro x64, Crimson 16.3.2, custom ROMs on both rigs, one rig on a DP connection and the other on HDMI; I checked on both rigs.

I'd class all I see as normal behaviour on my cards (ie no issue with GPU tach / POST / OS load / transition from logon to desktop).

I'd go the driver-clean route and perhaps check what you have loading with the OS at startup.


----------



## BIGTom

Quote:


> Originally Posted by *gupsterg*
> 
> Dunno.
> 
> I have no issues with or without any combo of those features.
> 
> I'm on Win7 Pro x64, Crimson 16.3.2, custom ROMs on both rigs, one rig on a DP connection and the other on HDMI; I checked on both rigs.
> 
> I'd class all I see as normal behaviour on my cards (ie no issue with GPU tach / POST / OS load / transition from logon to desktop).
> 
> I'd go the driver-clean route and perhaps check what you have loading with the OS at startup.


I am with gupsterg on the clean reinstallation of drivers, but I'd also uninstall any overclocking software and test without it.


----------



## nyk20z3




----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> Do you think it has got anything to do with VSR and GPU scaling being turned on?


I had a few issues similar to this. I tried the miniDP port and it fixed it; that doesn't mean it will work for you, but it's something to try. I used the DP-to-miniDP adapter that came with one of my old cards.
Quote:


> Originally Posted by *nyk20z3*


Dang that is tiny.


----------



## Elmy

http://www.3dmark.com/fs/6704407 Fastest AMD 3DMark Extreme Score in 3DMark Hall of fame


----------



## xkm1948

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> http://www.3dmark.com/fs/6704407 Fastest AMD 3DMark Extreme Score in 3DMark Hall of fame


Now imagine having 4 of the Fury Pro Duo. Mind blowing.


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> http://www.3dmark.com/fs/6704407 Fastest AMD 3DMark Extreme Score in 3DMark Hall of fame


Oh, now you show up! We could have used you in the red vs. green fanboy thread. We still won, but we would have loved to see another quad Fury in there.
I only have a beta-driver / tess-modded run to compare: http://www.3dmark.com/3dm/11453565


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> http://www.3dmark.com/fs/6704407 Fastest AMD 3DMark Extreme Score in 3DMark Hall of fame


Also, what I can't figure out is why overclocking Fury affects certain parts of the bench positively and others negatively. For instance, if I OC the VRAM I suffer more in the first GPU test and the combined test, but the GPU 2 test is usually faster than if I don't OC the VRAM. Throw in voltage and it's wonky as hell, usually resulting in a negative score across the board.


----------



## bluezone

I got bored and decided to do Vantage, 3DMark 11 and Firestrike benchmarks at a 1175 core clock (1200 locks up) with a +50 offset.

http://www.3dmark.com/3dmv/5438456

http://www.3dmark.com/3dm/11505897 (tops out at 74°C)

http://www.3dmark.com/3dm11/11142495

Not as good as my 1100 @ -32 offset, but not too bad.


----------



## ogow89

Quote:


> Originally Posted by *bluezone*
> 
> I got bored and decided to do Vantage, 3DMark 11 and Fire Strike benchmarks at 1175 core clock (1200 locks up) with a +50 offset.
> 
> http://www.3dmark.com/3dmv/5438456
> 
> http://www.3dmark.com/3dm/11505897 (tops out at 74C)
> 
> http://www.3dmark.com/3dm11/11142495
> 
> Not as good as my 1100 @ -32 offset but not too bad.


How come it says 1075 MHz on the Fire Strike run? And how did you increase the voltage? Also, how are you cooling the card?


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> How come it says 1075mhz on the firestrike? And how did you increase the voltage? Also how are you cooling the card?


I don't know what happened there. Give me 10 min; I'll have to rerun that one. I'm using Sapphire Trixx.


----------



## bluezone

Well, I thought I had cracked 1175. I must not have reset the clocks correctly after a crash.

I can get good runs of Vantage and 3DMark 11 @ 1175 with a +54 mV offset, but Fire Strike is a no-go. It crashes about 2 seconds before the end of the demo, when the real test begins.

http://www.3dmark.com/3dmv/5438477

http://www.3dmark.com/3dm11/11142674

EDIT: However, Fire Strike @ 1163 will run without issues.

http://www.3dmark.com/3dm/11506716


----------



## one80

Seems that Trixx won't let me change my core clock past 1050 MHz, but will still let me change my memory.

Afterburner will let me change the clock, but not the memory.

Have updated both - any ideas?


----------



## ogow89

Quote:


> Originally Posted by *bluezone*
> 
> Well, I thought I had cracked 1175. I must not have reset the clocks correctly after a crash.
> 
> I can get good runs of Vantage and 3DMark 11 @ 1175 with a +54 mV offset, but Fire Strike is a no-go. It crashes about 2 seconds before the end of the demo, when the real test begins.
> 
> http://www.3dmark.com/3dmv/5438477
> 
> http://www.3dmark.com/3dm11/11142674
> 
> EDIT: However, Fire Strike @ 1163 will run without issues.
> 
> http://www.3dmark.com/3dm/11506716


Don't see any performance increase there compared to the one before.


----------



## ogow89

Quote:


> Originally Posted by *one80*
> 
> Seems that Trixx won't let me change my clock past 1050mhz, but will still let me change my memory.
> 
> Afterburner will let me change the clock, but not the memory.
> 
> Have updated both - any ideas?


MSI Afterburner allows memory changes, but you need to enable the extended overclocking limits option in its settings.


----------



## Papa Emeritus

Quote:


> Originally Posted by *one80*
> 
> Seems that Trixx won't let me change my clock past 1050mhz, but will still let me change my memory.
> 
> Afterburner will let me change the clock, but not the memory.
> 
> Have updated both - any ideas?


Quote:


> Originally Posted by *ogow89*
> 
> MSI Afterburner allows memory changes, but you need to enable the extended overclocking limits option in its settings.


Maybe you also need to reinstall AB; that fixed it for me.


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> Don't see any performance increase there compared to the one before.


I prefer lower clocks and undervolting, i.e. a negative offset.

Less heat is produced and I often get slightly better results. Plus, I can overclock the memory then too.

http://www.3dmark.com/3dm11/11142569

My best guess is it's either power- or temperature-related throttling, due to this card being a Nano on air.

I see you have gotten your Nano up to 1070 core. Have you tried Crimson 16.3.2? I could not run over 1150 before the new Crimson.

Also, have you replaced the TIM yet?


----------



## ogow89

1070 for a minute or two of benchmarking is okay, but I can't get past 1050 stable for gaming with the +50% power limit, and I don't think it is a good idea to overvolt as the card gets hot too fast.


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> 1070 for a minute or two of benchmarking is okay, but I can't get past 1050 stable for gaming with the +50% power limit, and I don't think it is a good idea to overvolt as the card gets hot too fast.


I do not like overvolting myself. I had high temperature problems until I replaced the TIM with Gelid GC. There is better stuff than that to use, but it's what I had on hand.

Just be careful of the interposer if you do replace the TIM.


----------



## ogow89

Quote:


> Originally Posted by *bluezone*
> 
> I do not like overvolting myself. I had high temperature problems until I replaced the TIM with Gelid GC. There is better stuff than that to use, but it's what I had on hand.
> 
> Just be careful of the interposer if you do replace the TIM.


How much difference in temps did it make?


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> How much difference in temps did it make?


When I first received the Nano I was not paying enough attention to the temps during a test run. I think I was @ 1075 core and a +.25 offset. I suddenly realized I had hit 90C, so I quickly aborted the run and decided to replace the TIM. I rarely see over 75C now.

EDIT: I just remembered it was my first attempt at overclocking the HBM memory. I had been peaking at 75C; now I'm usually in the 65-70C range depending on what I'm doing.

That is @ 1100 core, -32 offset, HBM 545 and a custom fan profile.


----------



## ogow89

Quote:


> Originally Posted by *bluezone*
> 
> When I first received the Nano I was not paying enough attention to the temps during a test run. I think I was @ 1075 core and a +.25 offset. I suddenly realized I had hit 90C, so I quickly aborted the run and decided to replace the TIM. I rarely see over 75C now.
> 
> EDIT: I just remembered it was my first attempt at overclocking the HBM memory. I had been peaking at 75C; now I'm usually in the 65-70C range depending on what I'm doing.
> 
> That is @ 1100 core, -32 offset, HBM 545 and a custom fan profile.


I don't know how you managed to get the card to run at 1100 with a -32 offset, but I'll give it a shot once I get a stable CPU overclock. What about the power limit?


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> I don't know how you managed to get the card to run at 1100 with a -32 offset, but I'll give it a shot once I get a stable CPU overclock. What about the power limit?


Probably just the silicon lottery. For me, the highest clock I can run without adding voltage (1115) is very close to my best undervolt combination (1100 @ -32). I see no gains, and/or instability, below that voltage offset.
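For anyone wondering why an undervolt plus a modest core bump can still run cooler than stock: dynamic power scales roughly with f·V². A back-of-the-envelope sketch; the 1.2 V nominal voltage is an assumption for illustration, actual Fiji VIDs vary per card:

```python
def relative_dynamic_power(freq_mhz, vcore_v, ref_freq_mhz=1050, ref_vcore_v=1.2):
    """Dynamic power ~ C * f * V^2; returns power relative to the reference point."""
    return (freq_mhz / ref_freq_mhz) * (vcore_v / ref_vcore_v) ** 2

# 1100 MHz with a -32 mV offset from an assumed 1.2 V stock, vs 1050 MHz at stock voltage
undervolted = relative_dynamic_power(1100, 1.2 - 0.032)
print(f"{undervolted:.3f}x reference dynamic power")
```

Under these assumed numbers, the +50 MHz undervolted point draws slightly less dynamic power than stock, which matches the "less heat, same or better results" experience above.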


----------



## ogow89

What about the power limit slider, do you use it?


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> what about the powerlimit slider, do you use it?


Yes it's at +50%.


----------



## ogow89

Okay, then I'll try it. Do you use only Trixx for the voltage and MSI for the rest, or just mainly Trixx?


----------



## bluezone

Quote:


> Originally Posted by *ogow89*
> 
> Okay, then I'll try it. Do you use only Trixx for the voltage and MSI for the rest, or just mainly Trixx?


I use Trixx only. It's just simpler for me and I've had better luck with it (stability).

Strangely, these are the exact same GPU settings I used on my HD 7950s. If I remember correctly, if you take an HD 7970 and double its core count, it's exactly the same as a Fury X or Nano.
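For what it's worth, the doubling claim matches the public specs: Tahiti (HD 7970) has 2048 stream processors and Fiji (Fury X / Nano) has 4096, so Fiji really is two Tahitis' worth of shaders (clocks and memory differ, though):

```python
# GCN stream-processor counts from AMD's public specs
tahiti_sp = 2048  # Radeon HD 7970 (Tahiti XT)
fiji_sp = 4096    # R9 Fury X / Nano (Fiji)
print(fiji_sp == 2 * tahiti_sp)
```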


----------



## bborokee

I've received my XFX Fury today.

However, when the gpu goes under load, it's making this buzzing sound (coil whine)

time to send it back for a replacement....

What luck I have, eh?


----------



## ogow89

Quote:


> Originally Posted by *bborokee*
> 
> I've received my XFX Fury today.
> 
> However, when the gpu goes under load, it's making this buzzing sound (coil whine)
> 
> time to send it back for a replacement....
> 
> What luck I have, eh?


Had that for the first 2 or 3 days, and now I don't hear it anymore. I suggest using it for 2 or 3 days and seeing if it goes away.


----------



## bborokee

Quote:


> Originally Posted by *ogow89*
> 
> Had that for the first 2 or 3 days, and now I don't hear it anymore. I suggest using it for 2 or 3 days and seeing if it goes away.


Alright, I'll continue with benchmarks and load tests for the next couple of days to see if the whine goes away; if not I'll just have to RMA it :/

Thanks for the advice


----------



## ogow89

Try also gaming for extended hours to keep the GPU active. Someone also suggested a while back that if you leave Crysis 1 on the menu overnight, the coil whine would eventually disappear since the FPS goes over 5000.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bborokee*
> 
> I've received my XFX Fury today.
> 
> However, when the gpu goes under load, it's making this buzzing sound (coil whine)
> 
> time to send it back for a replacement....
> 
> What luck I have, eh?


Every one you get is probably going to do that, bud...


----------



## pdasterly

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Every one you get is probably going to do that bud...


Guess I got lucky; I have the Gigabyte Fury (non-X) and it's whisper quiet. The fans in my water-cooled setup are louder. The backplate does get pretty warm to the touch.


----------



## Thoth420

Quote:


> Originally Posted by *pdasterly*
> 
> Guess I got lucky; I have the Gigabyte Fury (non-X) and it's whisper quiet. The fans in my water-cooled setup are louder. The backplate does get pretty warm to the touch.


With that PSU I believe you.


----------



## bluezone

Another new driver already. I cannot keep up with them.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.1.aspx


----------



## diggiddi

Quote:


> Originally Posted by *bluezone*
> 
> another new driver already. I cannot keep up with them.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.1.aspx


IKR


----------



## looncraz

Quote:


> Originally Posted by *bborokee*
> 
> I've received my XFX Fury today.
> 
> However, when the gpu goes under load, it's making this buzzing sound (coil whine)
> 
> time to send it back for a replacement....
> 
> What luck I have, eh?


As was mentioned, run the card - don't return it right away.

The buzzing is just small wires in the coils vibrating at certain frequencies. If you keep them at that frequency long enough, the wires will abrade against each other and the sound will change and, eventually, go away.

The worst card I've ever had for that was my 7870 XT. It doesn't make a peep now.


----------



## bborokee

Quote:


> Originally Posted by *looncraz*
> 
> As was mentioned, run the card - don't return it right away.
> 
> The buzzing is just small wires in the coils vibrating at certain frequencies. If you keep them at that frequency long enough, the wires will abrade against each other and the sound will change and, eventually, go away.
> 
> Worst card I've ever had that did that was my 7870XT. Doesn't make a peep now.


Yeah, I plan to keep a close watch on the card for the next couple of days while loading it.
I was just surprised/disappointed, but also didn't know what to expect since this was my first high-end card... heh

Was scared for my life when I heard it for the first time, cuz my R9 380 never did this, haha


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> another new driver already. I cannot keep up with them.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.1.aspx


----------



## xTesla1856

Sad times: it looks like my RMA'd Fury will get here on the 25th, which is just torture. Apparently my retailer is waiting for Sapphire to process the RMA before sending me a new one. Is this common practice, or is my retailer just crap? They had the cheapest price on Nitro Furies in Switzerland.


----------



## Alastair

How good are Furys at Bitcoin mining? I am moving to uncapped internet soon (YAY ME!!!!) and might look at leaving my machine to mine while I am working. I also plan to put my old HD 6850s to work.


----------



## ogow89

Quote:


> Originally Posted by *xTesla1856*
> 
> Sad times, it looks like my RMA'd Fury will get here on the 25th, which is just torture. Apparently my retailer is waiting for Sapphire to RMA the card, before sending me a new one. Is this common practice, or is my retailer just crap? They had the cheapest price on Nitro Furies in Switzerland.


Nope, just your retailer. If the card is faulty, the retailer tests it; if it reproduces the issue you had, they send the card to the manufacturer. In the meantime, if they have a replacement card in stock, they can just send it to you, so you don't have to wait for Sapphire to send a new card. Your retailer is just being cheap. Not sure where you got it from, but here in Germany I buy my parts from Mindfactory, and that is usually how it goes.

Only if there is none in stock do you have to wait for the replacement card to come in. And if the card is sold out, no longer in production, and still under warranty, you get store credit for the full price you paid. That is how it went for me with the R9 290 PCS+. The first card burned, and I got a replacement within 7 days, which is the time it took for the card to arrive, get sorted out and tested, and for the replacement card to be packaged and sent. The second time the cooler got busted; I sent it in last month, and they gave me store credit for the full price since there are no R9 290s anymore.


----------



## ogow89

Quote:


> Originally Posted by *Alastair*
> 
> How good are Fury's at Bitcoin mining? I am moving to uncapped internet soon (YAY ME!!!!) and yeah. Might look at leaving my machine to mine while I am working. Also plan to put my old HD6850's to work as well.


You are better off buying an R9 295X2:

https://tech4gamers.com/amd-radeon-r9-fury-x-favorite-card-mining-3d-rendering-bitcoin-benchmarks/


----------



## Alastair

Quote:


> Originally Posted by *ogow89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> How good are Fury's at Bitcoin mining? I am moving to uncapped internet soon (YAY ME!!!!) and yeah. Might look at leaving my machine to mine while I am working. Also plan to put my old HD6850's to work as well.
> 
> 
> 
> you are better off buying an r9 295x2
> 
> https://tech4gamers.com/amd-radeon-r9-fury-x-favorite-card-mining-3d-rendering-bitcoin-benchmarks/

I am not buying new stuff to mine. If I mine, it will be with what I already have lying around, which is my HD 6850s and the Furies in my main rig.


----------



## pdasterly

For Sapphire, you can contact Athlon Micro directly for RMA:
[email protected]


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> I am not buying new stuff to mine. If I mine it will be with what I already have lying around. Which are my HD6850's and my Furies in my main rig


Alastair - Bitcoin mining difficulty is so high now that it is not worth it at all to even try with cards. I had to give up my mining ASIC because the difficulty is just too far gone.









You may, however, find an altcoin that is still minable with cards and make it worth your while.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jflisk*
> 
> Alastair - Bitcoin mining difficulty is so high now that it is not worth it at all to even try with cards. I had to give up my mining ASIC because the difficulty is just too far gone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You may however find an alt coin that is still minable with Cards and make it worth your while.


Yeah, I thought people were done with it, to be honest... I mean, that is why we were getting used 7900-series cards for $75-100 and 290 cards for $175-200 a year ago...

Hell, I bought my son's 7950 for $60 on Craigslist, and my first Tri-X 290 for $160 on Craigslist, both second-hand mining cards from people closing shop...

Then after the liquidation phase ended, the values went back up...


----------



## Alastair

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I am not buying new stuff to mine. If I mine it will be with what I already have lying around. Which are my HD6850's and my Furies in my main rig
> 
> 
> 
> Alastair - Bitcoin mining difficulty is so high now that it is not worth it at all to even try with cards. I had to give up my mining ASIC because the difficulty is just too far gone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You may however find an alt coin that is still minable with Cards and make it worth your while.

Can you trade altcoins for bitcoins or money?


----------



## Jflisk

Quote:


> Originally Posted by *Alastair*
> 
> Can you trade altcoins for bitcoins or money?


You can trade up from altcoins to Bitcoin, then cash out.

Bittrex is an exchange; there are others out there:
https://bittrex.com/

I would not leave too much coin in any exchange for too long.

You might be better off renting your hash rate here:
https://www.nicehash.com/

Exchange rates and difficulty can be found here:
https://bitcoinwisdom.com/

Good luck.


----------



## pdasterly

Having an issue with my Fury: I have my monitor on the DP port and my soundbar on HDMI. It was working great; now the machine loses signal after the Windows splash screen. I can unplug the HDMI and everything works fine again, but then I'm unable to get sound via HDMI.


----------



## rv8000

Quote:


> Originally Posted by *pdasterly*
> 
> Having an issue with my Fury: I have my monitor on the DP port and my soundbar on HDMI. It was working great; now the machine loses signal after the Windows splash screen. I can unplug the HDMI and everything works fine again, but then I'm unable to get sound via HDMI.


Did you just update your driver?

If you did, do another clean reinstall or rollback; could be a driver issue.


----------



## pdasterly

Quote:


> Originally Posted by *rv8000*
> 
> Did you just update your driver?
> 
> If you did, do another clean reinstall or rollback; could be a driver issue.


I was on the last driver before the one just released a few days ago; did a clean uninstall with DDU and installed the latest driver.

I did swap monitors; that's the only change. Went from an Acer X34 to an XR341CK.


----------



## rv8000

Quote:


> Originally Posted by *pdasterly*
> 
> I was on the last driver before the one just released a few days ago; did a clean uninstall with DDU and installed the latest driver.


So the issue just started, but happened with either driver, or exclusively with 16.4.1?


----------



## pdasterly

Quote:


> Originally Posted by *rv8000*
> 
> So the issue just started, but happened with either driver, or exclusively with 16.4.1?


Both. Now that I think about it, it happened after swapping monitors.


----------



## rv8000

Quote:


> Originally Posted by *pdasterly*
> 
> Both. Now that I think about it, it happened after swapping monitors.


Sounds like the issue is either with the monitor or the cables. Try some other cables if you have spares; if not, try the same setup with a different monitor/TV and see what happens.


----------



## pdasterly

Quote:


> Originally Posted by *rv8000*
> 
> Sounds like the issue is either with the monitor or cables. Try testing some other cables if you have spares, if not try the same setup with a different monitor/tv and see what happens.


Tried 3 different DP cables and 2 different HDMI cables.
Tried a few things: when I plug the HDMI into the card, both ports lose signal; unplug the HDMI and the signal comes right back.

Tried all the DP ports on the card as well.

Plugged the HDMI into a TV and it works. Now I'm really confused.


----------



## NBrock

Mining on video cards isn't worth the electricity spent, even with altcoins. If you don't mind the power bill, you have to be able to dedicate time to day-trading coins and watching which coin is trending, so you aren't just mining something that's worthless a few days after you started. There aren't as many new altcoins as there used to be, since everything started getting more and more difficult to mine and a bunch of people lost interest.
If you really want to run your cards you can always do some Folding@home and join the OC.net team. There are competitions and prizes, and it's all in the name of Science!
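To put numbers on why GPU Bitcoin mining stopped making sense: your expected coins per day are just your share of the network hashrate times the daily issuance (25 BTC per block, roughly 144 blocks a day at the time). A sketch; both hashrates below are rough, illustrative figures, not measurements:

```python
def btc_per_day(my_hashrate, network_hashrate, block_reward=25.0, blocks_per_day=144):
    """Expected BTC/day = your share of the network hashrate * daily issuance."""
    return my_hashrate / network_hashrate * block_reward * blocks_per_day

# A GPU doing ~1 GH/s SHA-256 (generous) against a ~1.2 EH/s network (early-2016 scale)
gpu_yield = btc_per_day(1e9, 1.2e18)
print(f"{gpu_yield:.8f} BTC/day")
```

Even before counting electricity, that works out to a few millionths of a coin per day, which is the "difficulty is too far gone" point in a nutshell.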


----------



## bluezone

Anyone else read this?

http://www.hardocp.com/article/2016/04/01/ashes_singularity_day_1_benchmark_preview#.VwWPfWfmo7J


----------



## vieuxchnock

_Did Fire Strike tonight with my Sapphire Tri-X Fury

i7 [email protected]
Tri-X 1125/570
+96mv
+50 Power limit
fans 100%
Temperature 65







_


----------



## rv8000

Quote:


> Originally Posted by *vieuxchnock*
> 
> _Did Fire Strike tonight with my Sapphire Tri-X Fury
> 
> i7 [email protected]
> Tri-X 1125/570
> +96mv
> +50 Power limit
> fans 100%
> Temperature 65
> 
> 
> 
> 
> 
> 
> 
> _










nice!

I've been away from benching and messing with my GPU for too long. Last time I did, there were no extra volts to play with, and I had flashed my card back to 56/64 CUs. Time to break 17k.


----------



## dagget3450

Quote:


> Originally Posted by *vieuxchnock*
> 
> _Did Fire Strike tonight with my Sapphire Tri-X Fury
> 
> i7 [email protected]
> Tri-X 1125/570
> +96mv
> +50 Power limit
> fans 100%
> Temperature 65
> 
> 
> 
> 
> 
> 
> 
> _


Can you do a regular Fire Strike run at stock to compare to your regular Fire Strike OC run?


----------



## vieuxchnock

_The results at stock

1000/500






Next week, I will have my second Fury and test in XFire.

_


----------



## Willius

I will be blocking my R9 Nano tonight with an EK block.

Aside from following the instructions, of course. I've read it's very easy to damage the HBM memory; what should I look out for specifically?


----------



## Radox-0

Quote:


> Originally Posted by *Willius*
> 
> I will be blocking my R9 Nano tonight with an EK block.
> 
> Aside from the instruction of course, as I've read it's very easy to damage HBM memory. What should I look out for specifically?


You just need to be gentle cleaning the interposer (the orange-looking area around the core and between the HBM stacks). Aside from that, it's straightforward waterblocking the Nano. I just use a cotton bud with TIM cleaner on it and gently clean away the existing stuff.

Look forward to getting great results with it blocked.


----------



## Willius

Quote:


> Originally Posted by *Radox-0*
> 
> You just need to be gentle cleaning the interposer (the orange-looking area around the core and between the HBM stacks). Aside from that, it's straightforward waterblocking the Nano. I just use a cotton bud with TIM cleaner on it and gently clean away the existing stuff.
> 
> Look forward to getting great results with it blocked.


Luckily it ain't the first GPU I'll be blocking, but having read about that interposer here and there made me curious.

Thanks for the advice! +Rep. Will post results when I have my main rig up and running again!


----------



## Atomagenesis

Does anyone know if there is a waterblock available for the Gigabyte R9 Fury: http://www.gigabyte.com/products/product-page.aspx?pid=5680#kf

I just ordered one, can't find a waterblock anywhere.


----------



## Alastair

Quote:


> Originally Posted by *Atomagenesis*
> 
> Does anyone know if there is a waterblock available for the Gigabyte R9 Fury: http://www.gigabyte.com/products/product-page.aspx?pid=5680#kf
> 
> I just ordered one, can't find a waterblock anywhere.


That would most likely be a no. It is not a reference board.


----------



## Atomagenesis

Oh well, I got it for ridiculously cheap, I can't complain too much.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Atomagenesis*
> 
> Oh well, I got it for ridiculously cheap, I can't complain too much.


And now you know why, lol









Giga brought this thing to the table so late, and with Fury being a "transition" GPU, you likely won't see full-cover blocks from EK. But, and this is a big but, you may see some hybrid blocks from Alphacool; those guys seem to throw something together for every black-sheep card there is. When the 390X launched, only a few boards were reference and there were no blocks; then Alphacool launched their hybrid water blocks for the Sapphires and the MSIs. After a few months EK caught on and began to produce full-cover blocks for those cards. Oddly, per the CEO of EK, they had ZERO plans of ever making a full-cover block for the aftermarket 390Xs, but the demand pretty much led them to it. In your case that demand may not be there, with the majority of Fury customers buying reference-based cards.


----------



## Papa Emeritus

Quote:


> Originally Posted by *Atomagenesis*
> 
> Oh well, I got it for ridiculously cheap, I can't complain too much.


Fun fact: the Gigabyte card is the most expensive one here in Sweden, and the Tri-X the cheapest.

Gigabyte: $739
Tri-X: $588


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> tried 3 different dp cables and 2 different hdmi cables.
> tried a few things, when i plug the hdmi into card both ports lose signal, unplug hdmi then signal comes right back
> 
> tried all dp ports on card also
> 
> plugged hdmi into tv and it works, now im really confused


Check the Device Manager to see if an error flag is present or changes status when you're having the problem, because it sounds like the PC is not sensing the HDMI under certain conditions.

Also, check this out:

http://support.amd.com/en-us/kb-articles/Pages/GPU70NoAudiofromHDTV.aspx


----------



## bluezone

Quote:


> Originally Posted by *Thoth420*


I missed this. LOL. Yes a very good problem


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> Check the Device Manager to see if an error flag is present or changes status when you're having the problem, because it sounds like the PC is not sensing the HDMI under certain conditions.


As soon as I plug in the HDMI, the screen goes blank (monitor is on DP).


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> soon as i plug in hdmi, the screen goes blank(monitor is on DP)


OK, I get you now. So what you are saying is that the HDMI is trying to become the master display when you only want audio.


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> OK, I get you now. So what you are saying is that the HDMI is trying to become the master display when you only want audio.


I don't think so; when I plug that same HDMI into my TV I get the same results. Weird that it worked with the X34 monitor but not the XR341CK.


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> I don't think so; when I plug that same HDMI into my TV I get the same results. Weird that it worked with the X34 monitor but not the XR341CK.


I'm headed out to a movie right now. If you have multi-monitor support in your PC BIOS, would you mind toggling it on to see if it helps? If not, try the 16.1 Crimson driver. I'll check when I get back to see what happened.


----------



## pdasterly

I have a Maximus VI Hero, which has onboard video (HDMI). It auto-disables when a PCIe GPU is used. Will try an older driver.


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> I have a Maximus VI Hero, which has onboard video (HDMI). It auto-disables when a PCIe GPU is used. Will try an older driver.


It's too bad the onboard HDMI auto-disables. I was going to suggest adding a 2nd monitor on it and setting it as the home screen, to check the Device Manager from there.
Did the driver change help?

If that didn't help, try this:

http://windows.microsoft.com/en-US/windows-vista/Troubleshoot-monitor-and-video-card-problems


----------



## bluezone

I was looking through the manual for your board and found this.



Enable that.


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> I was looking through the manual for you board and found this.
> 
> 
> 
> enable that.


That option is not available; they must have taken it out with later BIOS updates.
It should look like this, but I only have the primary display option.

Edit: I figured out how to get the hidden menu in the BIOS; it still doesn't work. I'm plugging the HDMI into the GPU, not the mobo.

Tried plugging the HDMI into the mobo and I get nothing; I enabled multi-monitor in the BIOS.


----------



## hyp36rmax

*From a pair of Sapphire R9 290X VAPOR-X 8GB GPU's in Crossfire to R9 FURY X in Crossfire







*


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> That option is not available; they must have taken it out with later BIOS updates.
> It should look like this, but I only have the primary display option.
> 
> Edit: I figured out how to get the hidden menu in the BIOS; it still doesn't work. I'm plugging the HDMI into the GPU, not the mobo.
> 
> Tried plugging the HDMI into the mobo and I get nothing; I enabled multi-monitor in the BIOS.


"Im plugging hdmi into gpu card not the mobo"That was a work around I was going to have you try plus a second monitor can be used to trouble shoot. How about Optical S/PDIF out to your sound bar, Its compressed audio 5.1(DTS) out but if I remember correctly 2.1 is not compressed. you have a very decent audio chip onboard to use for this. (I use my on board audio for blue ray playback).

I'm just thinking your main goal is how to get audio out.

"tried plugging hdmi into mobo and i get nothing" You need a Intel video driver for it to work plus it need to be plugged into a monitor it to work. Get it either off your installation disk or:

https://downloadcenter.intel.com/ or from asus.

I working several different things(angles) at the same time for you with this. Admittedly this drives anyone watching me work rather nuts. If they try to figure out what I'm doing. I played chess a lot as a teen against the reigning Ontario champ. So I got used to planning many moves ahead. LOL


----------



## bluezone

Quote:


> Originally Posted by *hyp36rmax*
> 
> *From a pair of Sapphire R9 290X VAPOR-X 8GB GPU's in Crossfire to R9 FURY X in Crossfire
> 
> 
> 
> 
> 
> 
> 
> *


----------



## Papa Emeritus

Quote:


> Originally Posted by *hyp36rmax*
> 
> *From a pair of Sapphire R9 290X VAPOR-X 8GB GPU's in Crossfire to R9 FURY X in Crossfire
> 
> 
> 
> 
> 
> 
> 
> *


Beautiful! Would love to build in that case


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> "Im plugging hdmi into gpu card not the mobo"That was a work around I was going to have you try plus a second monitor can be used to trouble shoot. How about Optical S/PDIF out to your sound bar, Its compressed audio 5.1(DTS) out but if I remember correctly 2.1 is not compressed. you have a very decent audio chip onboard to use for this. (I use my on board audio for blue ray playback).
> 
> I'm just thinking your main goal is how to get audio out.
> 
> "tried plugging hdmi into mobo and i get nothing" You need a Intel video driver for it to work plus it need to be plugged into a monitor it to work. Get it either off your installation disk or:
> 
> https://downloadcenter.intel.com/ or from asus.
> 
> I working several different things(angles) at the same time for you with this. Admittedly this drives anyone watching me work rather nuts. If they try to figure out what I'm doing. I played chess a lot as a teen against the reigning Ontario champ. So I got used to planning many moves ahead. LOL


Yes, I gave up. I loaded the Intel GPU drivers. Still, as soon as I plug in the HDMI everything goes blank. I'm already using the optical output. I believe the DP monitor is the culprit.


----------



## MrKoala

Quote:


> Originally Posted by *Alastair*
> 
> in that second image? What cards are those? Xeon Phi? They aren't GPU's from the looks of it. No PCI-E power connectors.


Xeon Phi. Those things have power connectors at the back.


----------



## ogow89

Should I keep my R9 Nano or return it?

The card works fine, but it would suck if newer cards with better performance than this 500-euro card get released for 300 euros in 2 months. So should I return the card and wait for the next GPUs?


----------



## Radox-0

Quote:


> Originally Posted by *ogow89*
> 
> Should I keep my R9 Nano or return it?
> 
> The card works fine, but it would suck if newer cards with better performance than this 500-euro card get released for 300 euros in 2 months. So should I return the card and wait for the next GPUs?


That's technology for you: always something shinier and faster around the corner. Really, it's a decision only you can make; if you can live without a GPU, then sure, why not. Best case, the newer cards are faster and cheaper. Worst case, the current gen of cards gets a small notch down in price.

If you're after the Nano form factor itself, my gut feeling is we will need to wait a bit for a similar form factor card that is also quite a powerhouse.


----------



## ogow89

Quote:


> Originally Posted by *Radox-0*
> 
> That's technology for you: always something shinier and faster around the corner. Really, it's a decision only you can make; if you can live without a GPU, then sure, why not. Best case, the newer cards are faster and cheaper. Worst case, the current gen of cards gets a small notch down in price.
> 
> If you're after the Nano form factor itself, my gut feeling is we will need to wait a bit for a similar form factor card that is also quite a powerhouse.


I have a midtower, so basically I can fit any card out there; I just saw the chance to buy Fury X-like performance (when clocked at 1050 with the power limit at 50%) for around 120 euros less, with the R9 Nano selling for 480 € and the Fury X for 620 €. And I like the way the card looks and how light it is. But I don't want to be pulling my hair out when newer cards hit the market with more affordable prices and similar or better performance. As far as patience goes, it would be irritating, but I could wait. I just wanted to get some opinions first. I got the card a week ago, and I still have one week left to return it for a full refund (two-week agreement and whatnot).


----------



## Agent Smith1984

I would think you'd have contemplated that some before purchasing the card, but in any case, nano is a good card and will continue to be good for some time. You can definitely expect the next gen of $350-400 cards to perform just as well or better though...


----------



## ogow89

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I would think you'd have contemplated that some before purchasing the card, but in any case, nano is a good card and will continue to be good for some time. You can definitely expect the next gen of $350-400 cards to perform just as well or better though...


I did actually; I just wanted to take the card for a spin and see how much better it is than my R9 290. In some games the improvement was marginal, and at most up to 15 more fps at 1440p, even compared to my overclocked R9 290. And while I could say I find the performance really great, I still don't want to have regrets later on.

My R9 290, which I purchased at the end of 2014, had fan issues, and I RMA'd it after stocks were gone in the hope of getting an R9 390 as a replacement, but I got a full refund instead. So I ordered this card with the intention of first trying it to see how much better it is. Now that I'm seeing GPU prices dropping, with the R9 390X Devil from PowerColor costing 350 €, I'm kind of worried that the new GPUs will drop sooner than I thought.


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> Yes, I gave up. I loaded the Intel GPU drivers, but still, as soon as I plug in HDMI everything goes blank. I'm already using the optical output. I believe the DP monitor is the culprit.


Yes, I also believe it's the DP monitor. Other than setting up a remote desktop, I'm out of ideas for observing what is going on when you plug into the HDMI. Everything seems to be a dead end.


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> Yes, I also believe it's the DP monitor. Other than setting up a remote desktop, I'm out of ideas for observing what is going on when you plug into the HDMI. Everything seems to be a dead end.


Would have never guessed it was the monitor, but I just swapped monitors and everything worked fine. Weird one.


----------



## Radox-0

Quote:


> Originally Posted by *ogow89*
> 
> I did actually; I just wanted to take the card for a spin and see how much better it is than my R9 290. In some games the improvement was marginal, and at most up to 15 more fps at 1440p, even compared to my overclocked R9 290. And while I could say I find the performance really great, I still don't want to have regrets later on.
> 
> My R9 290, which I purchased at the end of 2014, had fan issues, and I RMA'd it after stocks were gone in the hope of getting an R9 390 as a replacement, but I got a full refund instead. So I ordered this card with the intention of first trying it to see how much better it is. Now that I'm seeing GPU prices dropping, with the R9 390X Devil from PowerColor costing 350 €, I'm kind of worried that the new GPUs will drop sooner than I thought.


If you still have the 290, I would just keep that, to be honest; send the Nano back and see what next gen brings. The 290 is still a solid card, so it's not as if you would suffer, and it sounds like your case has no restriction that would preclude you from using the larger, typical form factor I imagine the next gen will initially release in.


----------



## kittysox

My Nano just impresses me more every day. Such a small package, and I had real heat concerns when I started my build, it being air cooled in the 05sx case. Noise level, performance, temperatures: everything is much, much better than I really thought it would be. I haven't overclocked a thing on it, and temps seem to stay mostly in the mid 60s at 3440x1440. This thread was invaluable in my decision to pick it up, and I want to thank you all. I can't wait to see what the next gen of Radeon cards can do, because this thing is just beyond impressive.


----------



## RatPatrol01

New R9 Nano showed up today, gotta say this thing is pretty nuts! Was tempted to try and wait for all the new cards coming but I have a feeling we aren't gonna see anything quite like the nano


----------



## Atomagenesis

Quote:


> Originally Posted by *RatPatrol01*
> 
> New R9 Nano showed up today, gotta say this thing is pretty nuts! Was tempted to try and wait for all the new cards coming but I have a feeling we aren't gonna see anything quite like the nano


Nice man, I bet it is awesome. My R9 Fury should be here tomorrow, I'm stoked.


----------



## RatPatrol01

Quote:


> Originally Posted by *Atomagenesis*
> 
> Nice man, I bet it is awesome. My R9 Fury should be here tomorrow, I'm stoked.


I actually considered a Fury myself for a bit, then I remembered it had to fit in an EVGA Hadron lol


----------



## Alastair

Quote:


> Originally Posted by *RatPatrol01*
> 
> New R9 Nano showed up today, gotta say this thing is pretty nuts! Was tempted to try and wait for all the new cards coming but I have a feeling we aren't gonna see anything quite like the nano


I do not think we will see flagships like the Fijis for at least another year. I'm pretty sure Vega and whatnot are going to replace the aging entry and mid-level Pitcairn-based chips. But looking at NVIDIA only being able to get a partially disabled big die working on their Teslas with HBM2, a big Pascal is still a ways out from NVIDIA's side, and I do not see AMD adding anything on that sort of scale for a while either. I rate Fiji was a good investment.


----------



## RatPatrol01

Quote:


> Originally Posted by *Alastair*
> 
> I do not think we will see flagships like the Fijis for at least another year. I'm pretty sure Vega and whatnot are going to replace the aging entry and mid-level Pitcairn-based chips. But looking at NVIDIA only being able to get a partially disabled big die working on their Teslas with HBM2, a big Pascal is still a ways out from NVIDIA's side, and I do not see AMD adding anything on that sort of scale for a while either. I rate Fiji was a good investment.


Yeah that's what I'm banking on by buying now, and I'm certainly enjoying the performance, but man is the nano a loud little thing! No coil whine yet though, so that's nice.


----------



## Arizonian

Quote:


> Originally Posted by *Alastair*
> 
> I do not think we will see flagships like the Fijis for at least another year. I'm pretty sure Vega and whatnot are going to replace the aging entry and mid-level Pitcairn-based chips. But looking at NVIDIA only being able to get a partially disabled big die working on their Teslas with HBM2, a big Pascal is still a ways out from NVIDIA's side, and I do not see AMD adding anything on that sort of scale for a while either. I rate Fiji was a good investment.


When I was looking at a new GPU back in December, members were advising others to wait for Polaris or Pascal... Since then I've had six months of fun with my Fury and moved to 4K UHD gaming. I feel sorry for those that did wait half a year until now and still are.

I feel the Fiji performance for $499 was the best buy I've had since 2010. I sold a 780 Ti in its place, which was the worst loss in value to date.

Glad I didn't play the waiting game. I would have missed out on a lot and wouldn't be 4K UHD gaming like I am today.


----------



## Alastair

Well, I do not have a UHD screen or even a QHD screen; I am still on 1080p. However, I can still enjoy features like VSR while I wait to see what these new High Dynamic Range screens bring to the table.


----------



## RatPatrol01

Older game on a goofy engine but man, never thought I'd be able to max out settings on something then vsync _down_ to 144fps!


----------



## Thoth420

Quote:


> Originally Posted by *RatPatrol01*
> 
> 
> 
> Older game on a goofy engine but man, never thought I'd be able to max out settings on something then vsync _down_ to 144fps!


Probably my favorite game in the past 5 years easy. I replay it at least twice a year along with DX and Invisible War.


----------



## gupsterg

Quote:


> Originally Posted by *hyp36rmax*
> 
> *From a pair of Sapphire R9 290X VAPOR-X 8GB GPUs in Crossfire to R9 FURY X in Crossfire*


WOW nice rig

Quote:


> Originally Posted by *Alastair*
> 
> I do not think we will see flagships like the Fiji's yet for at least another year. Im pretty sure Vega and what not are going to replace the aging entry and mid level Pitcairn based chips. But I mean looking at NVidia only able to get a partially disabled big die to work on their Teslas with HBM 2 means that from NVidia's side a big Pascal is still a ways out, I do not see AMD adding anything on that sort of scale either for a while. I rate Fiji was a good investment.


My thoughts exactly at present.

@subscribers

Anyone got some results for push pull fan config on Fury X stock AIO unit?


----------



## ogow89

So i have been playing with my r9 nano card for a week now, and i noticed that when overclocked and with powerlimit at 50%, games crash and i get a box with ''No signal'' Is anyone with an r9 nano seeing the same thing when overclocked?


----------



## Metalhead79

Quote:


> Originally Posted by *RatPatrol01*
> 
> Yeah that's what I'm banking on by buying now, and I'm certainly enjoying the performance, but man is the nano a loud little thing! No coil whine yet though, so that's nice.


Same here. My Fury ended up being $423 after a 10% refund from Amazon (they shipped the GPU in the retail box, just slapped a shipping sticker on it). Polaris may or may not be as fast, but I'm playing my games NOW on a high-end GPU, not waiting another 3+ months for something that may or may not be better to come out... and then waiting even longer for the custom-cooler cards to come out... and then being told to wait for Vega since it'll be just around the corner by then.


----------



## hyp36rmax

Quote:


> Originally Posted by *bluezone*


Quote:


> Originally Posted by *Papa Emeritus*
> 
> Beautiful! Would love to build in that case


Quote:


> Originally Posted by *gupsterg*
> 
> WOW nice rig
> 
> My thoughts exactly at present.
> 
> @subscribers
> 
> Anyone got some results for push pull fan config on Fury X stock AIO unit?


Thanks guys!! I can't wait to start her up! She's my 4k Beast!


----------



## buildzoid

Been slowly working on maxing out the Fury Xs. Managed to get a 38.8K graphics score in Firestrike today.

http://hwbot.org/submission/3185463_

The second card seems to be my best clocker, but I messed up the volt mod on it, so it's still on the stock BIOS as a safety precaution. Once I fix that, the card will be on a high-power BIOS and should do well over 1210MHz with better efficiency.



I strapped 10K µF to the Vcore and 4K µF to the HBM. It went from crashing in seconds at 1170/570 to barely passing at 1170/600MHz. The primary card is unmodded since the mod wouldn't fit next to my TC-14PE. Once I get H2O for the 5960X, I'll mod that card too. It seems to be my second best and does 1190/570MHz rock solid on my latest high-power BIOS.

My third card is currently not working because I'm a lazy idiot and decided to mod it even though I still haven't bought better wire. The card's fine, but the VRM won't power on until I fix it. It's also my worst card, so I'm not too fussed about it not functioning right now.

I should be able to get top 20 in Firestrike dual and triple GPU once I get everything 100% working.


----------



## p4inkill3r

Quote:


> Originally Posted by *gupsterg*
> 
> Anyone got some results for push pull fan config on Fury X stock AIO unit?


I am sure it would shave a few degrees off, but my stock setup is still ultra quiet and performing fantastically.

I'm considering selling my Fury X, though. All of a sudden, I have absolutely zero desire to play games.


----------



## gupsterg

I agree the AIO gives good performance; you probably don't need to go push/pull, TBH.

My current 24/7 ROM: 1135/540, VID 1.250V (+38mV over stock VID), MVDDC 1.3V, with fan profile / PowerLimit mods.

Approx. 21C room ambient; below is data from a cold boot followed by 25min of idle.



Then HWiNFO timers/stats were reset and 3DM FS GT1 was looped for 26min.



Attached below is the 3DM data file for the above run. As HWiNFO's average column showed the GPU as 1056MHz, I did another run, but with MSI AB logging the data (22min under load).

1135_540_VID38mv_FT_PL.zip 141k .zip file


In a silent room I'd say my fan profile is audible but pretty quiet IMO; with some background noise there's no chance of hearing the system.

My initial feelings on the Fiji upgrade were not exactly "gobsmacked", *but* now I'm very happy with the upgrade.


----------



## Flamingo

Quote:


> Originally Posted by *SLK*
> 
> Newest... 1.00.18769.
> 
> 1440p uncapped frame rate


Yep, it throttles alright. This was at 20% power, 85C temp limit (never went above 79C in tests), max fan settings, crazy settings, 1440p.


----------



## SLK

Quote:


> Originally Posted by *Flamingo*
> 
> Yep throttles alright. This was at 20% power, 85C temp limit (never went above 79C in tests), max fan settings, crazy settings, 1440p.


Reduce the voltage. I can do -30 on the core voltage and sustain clocks. It's definitely the board power limit, alright.


----------



## Willius

I've got a little problem with my R9 Nano. It could be a driver issue too. I don't know.

Whenever I stress/bench at stock clocks / 50% power target with the Heaven benchmark, the screen goes black. I'm running the latest Crimson driver.
The card is cooled with an EK waterblock.
I do not have this problem when I play games like Heroes of the Storm or Call of Duty: Black Ops 3.

Any ideas?

Edit:

After some more googling, I found that it might be because of temp monitoring programs.


----------



## ogow89

Quote:


> Originally Posted by *Willius*
> 
> I've got a little problem with my R9 Nano. It could be a driver issue too. I don't know.
> 
> Whenever I stress/bench at stock clocks / 50% power target with the Heaven benchmark, the screen goes black. I'm running the latest Crimson driver.
> The card is cooled with an EK waterblock.
> I do not have this problem when I play games like Heroes of the Storm or Call of Duty: Black Ops 3.
> 
> Any ideas?


Replug the PCIe power connector to the card, switch the GPU BIOS, and/or swap the RAM sticks' places.

I had a similar issue, except it was also happening in demanding games like The Division; I did all three of those and the issue is gone.

Great card, but I decided to send it back and wait for the next GPUs. I read somewhere that the R9 490s coming out in June, clocked at 800MHz, will be faster than the GTX 980 Ti and Fury X.


----------



## Willius

Quote:


> Originally Posted by *ogow89*
> 
> Replug the PCIe power connector to the card, switch the GPU BIOS, and/or swap the RAM sticks' places.
> 
> I had a similar issue, except it was also happening in demanding games like The Division; I did all three of those and the issue is gone.
> 
> Great card, but I decided to send it back and wait for the next GPUs. I read somewhere that the R9 490s coming out in June, clocked at 800MHz, will be faster than the GTX 980 Ti and Fury X.


I will try that, with probably the exception of switching the RAM. I'm on ITX and only have 2 slots, and the card has a waterblock on it. The RAM is working fine too, since I'm LinX stable.


----------



## ogow89

Quote:


> Originally Posted by *Willius*
> 
> I will try that, with probably the exception of switching the RAM. I'm on ITX and only have 2 slots, and the card has a waterblock on it. The RAM is working fine too, since I'm LinX stable.


A lot of people suggested switching the places of the RAM, so I would consider it. Even if you have 2 slots, just switch them with each other.


----------



## Willius

Quote:


> Originally Posted by *ogow89*
> 
> A lot of people suggested switching the places of the RAM, so I would consider it. Even if you have 2 slots, just switch them with each other.


I think it might be my PSU. When I bench, the fan goes bananas, although 520W should be enough to power my system.

I will try switching the RAM.


----------



## kittysox

Quote:


> Originally Posted by *Willius*
> 
> I think it might be my PSU. When I bench, the fan goes bananas, although 520W should be enough to power my system.
> 
> I will try switching the RAM.


I had a 520-watt Seasonic in a build with a slightly overclocked Ivy Bridge i5 and a 970, and it constantly black-screen rebooted until I swapped the PSU for a higher wattage.


----------



## Willius

It ran my 4670k @ 4.4 with the 970 at 1528MHz like a charm. But since then I've redone my main build, added a second DDC and an Aquaero 6, and changed the GPU to the R9 Nano. I might have overdone it all a bit, to say the least. For now I will undervolt the R9 Nano and change the PSU when I have the funds.

Also, it doesn't reboot. The screen just goes black and the sound keeps running. But at this point I'm fairly sure it's the PSU.


----------



## Jesse36m3

Quote:


> Originally Posted by *Willius*
> 
> Also, it doesn't reboot. The screen just goes black. Sound keeps running. But at this point I'm fairly sure it's the PSU.


This happens to my Fury X as well when I hit an unstable overclock. This is my first AMD card in a looooong time. My NVIDIA cards would just crash to desktop and then reset the clocks back to default; now I have to manually restart my PC because the screen is black. Is that normal? The PSU is only a year old [EVGA 750 G2]. CPU and GPU are both in a custom loop.


----------



## DedEmbryonicCe1

Even with a 1050W PSU I still get the system partially rebooting to a black screen with unstable overclocks. It happens dozens of times when I'm trying to beat a personal best.


----------



## Radox-0

Yep, I get it with unstable clocks at times when pushing it hard. I've had the screen go black even on a 1200-watt HXi, now in my main rig with a 4690k and the Nano, so the PSU is barely breaking a sweat. Most often it will crash to desktop, but it seems pushing it hard sometimes causes a black screen.


----------



## ogow89

Not sure how far many of you are pushing the R9 Nano when overclocking, but 1050MHz on the core with no HBM overclock and a 50% power limit should be stable as long as it is kept cool.

All of the black screen issues and no-signal crashes while gaming were solved in my case by switching the places of the RAM sticks, switching the GPU BIOS, and replugging the PCIe power connector. I also suggest uninstalling MSI Afterburner and/or Sapphire TRIXX and deleting their settings, followed by uninstalling the AMD driver and Crimson settings and wiping everything clean with DDU to ensure no leftovers exist.

Also, for anyone with an i5 CPU: if the CPU is overclocked, I suggest checking that the overclock is 100% stable. While many stress tests will pass, I had games crashing on me. The R9 Nano, while maybe not impressive in terms of raw power at stock, is an enthusiast-level GPU when overclocked to Fury X stock clocks. In that case, the CPU is maxed out in the latest games and stressed to 100%, causing reboots, red screens, or other issues due to instability.

As the owner of an i5 4690k and brief owner of an R9 Nano (returned; decided to wait for Polaris), I ran into all of the aforementioned issues and hope any or all of my suggestions work for you.


----------



## pdasterly

Bad motherboard in the monitor.
Quote:


> Originally Posted by *bluezone*
> 
> Yes, I also believe it's the DP monitor. Other than setting up a remote desktop, I'm out of ideas for observing what is going on when you plug into the HDMI. Everything seems to be a dead end.


----------



## bluezone

Quote:


> Originally Posted by *pdasterly*
> 
> bad motherboard in monitor


Hopefully it's under warranty.


----------



## pdasterly

Quote:


> Originally Posted by *bluezone*
> 
> Hopefully it's under warranty.


Recertified, but only 1 week old.


----------



## battleaxe

Quote:


> Originally Posted by *buildzoid*
> 
> Been slowly working on maxing out the Fury Xs. Managed to get a 38.8K graphics score in Firestrike today.
> 
> http://hwbot.org/submission/3185463_
> 
> The second card seems to be my best clocker, but I messed up the volt mod on it, so it's still on the stock BIOS as a safety precaution. Once I fix that, the card will be on a high-power BIOS and should do well over 1210MHz with better efficiency.
> 
> I strapped 10K µF to the Vcore and 4K µF to the HBM. It went from crashing in seconds at 1170/570 to barely passing at 1170/600MHz. The primary card is unmodded since the mod wouldn't fit next to my TC-14PE. Once I get H2O for the 5960X, I'll mod that card too. It seems to be my second best and does 1190/570MHz rock solid on my latest high-power BIOS.
> 
> My third card is currently not working because I'm a lazy idiot and decided to mod it even though I still haven't bought better wire. The card's fine, but the VRM won't power on until I fix it. It's also my worst card, so I'm not too fussed about it not functioning right now.
> 
> I should be able to get top 20 in Firestrike dual and triple GPU once I get everything 100% working.


Is there a way to fix a card with a fried VRM? I assume it's the VRM, as one of the caps on the back of the card is badly burnt.


----------



## p4inkill3r

Quote:


> Originally Posted by *battleaxe*
> 
> Is there a way to fix a card with a fried VRM? I assume it's the VRM, as one of the caps on the back of the card is badly burnt.


An RMA normally fixes that.


----------



## buildzoid

Quote:


> Originally Posted by *battleaxe*
> 
> Is there a way to fix a card with a fried VRM? I assume it's the VRM, as one of the caps on the back of the card is badly burnt.


All the cards in the photo are fine. The bad card is currently sitting disassembled on my desk. I know what's wrong with it; it just takes a while to fix.


----------



## battleaxe

Quote:


> Originally Posted by *buildzoid*
> 
> All the cards in the photo are fine. The bad card is currently sitting disassembled on my desk. I know what's wrong with it; it just takes a while to fix.


Yeah, I have a dead card. A burnt cap or something on the back of the card. I'm pretty handy with things, but I've never tried to replace something like this, so I was just curious. There are several of them on the back, so it's one of those little things right behind the VRM.

I have it out on RMA, but just in case they decide not to honor it, I would like to try to get it running again. It was a good card. What do you think? Is it possible?
Quote:


> Originally Posted by *p4inkill3r*
> 
> A RMA normally fixes that.


Yes. I'm just really hoping they fix my card and don't give me a different one. I had a good clocker before she blew up on me.


----------



## buildzoid

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I have a dead card. A burnt cap or something on the back of the card. I'm pretty handy with things, but I've never tried to replace something like this, so I was just curious. There are several of them on the back, so it's one of those little things right behind the VRM.
> 
> I have it out on RMA, but just in case they decide not to honor it, I would like to try to get it running again. It was a good card. What do you think? Is it possible?
> Yes. I'm just really hoping they fix my card and don't give me a different one. I had a good clocker before she blew up on me.


If you blew out one of the tantalum caps on the back, you just need to desolder it and it should work again. If one of the FETs blew up, you probably won't be able to fix it without some very expensive PCB repair equipment.


----------



## battleaxe

Quote:


> Originally Posted by *buildzoid*
> 
> If you blew out one of the tantalum caps on the back, you just need to desolder it and it should work again. If one of the FETs blew up, you probably won't be able to fix it without some very expensive PCB repair equipment.


It's one of the black items that stick up off the card the furthest. It's all black and you can see the contacts coming off its sides. There are several of them and one is badly burnt. I imagine they will replace the card, but I'm trying to have a plan just in case.

It's about 1/8 of an inch long and slightly narrower than wide, a little less than 1/8" maybe. I'll try to get a pic. This is a 390X BTW, not a Fury or Nano, but you seemed like you had replaced some things before, so I figured I would ask. Seems like it would be kinda fun to replace if they don't fix it.

Edit: I am really hoping I get this exact card back. She was a good one before she popped.


----------



## jprovido

My GPU usage is all over the place, and I already reinstalled Windows. Is this normal for AMD cards? I have an R9 Fury (been an NVIDIA user for many years).


----------



## p4inkill3r

Quote:


> Originally Posted by *jprovido*
> 
> My GPU usage is all over the place, and I already reinstalled Windows. Is this normal for AMD cards? I have an R9 Fury (been an NVIDIA user for many years).


You're going to need to show some examples; nobody can diagnose an issue from this scant information alone (other than your temporary insanity being lifted due to moving away from NVIDIA).


----------



## jprovido

Quote:


> Originally Posted by *p4inkill3r*
> 
> You're going to need to show some examples, nobody can diagnose an issue just from this scant information (other than your temporary insanity being lifted due to moving away from nvidia).


I'm out now; I will post GPU usages later.

Also, I noticed I'm getting worse performance in Dota 2 compared to my GTX 970. I can't hit 144fps anymore (144Hz monitor) and my GPU load is maxed out at 100%, with my Fury overclocked at 1110MHz core / 550MHz memory. With my GTX 970 I'd see the GPU load at 70-80%. I'm running FreeSync at 60Hz to 144Hz with my Fury; dunno if that makes a difference.

Both ran on my sig rig: i7 5820k @ 4.6GHz.


----------



## jprovido

Isn't this a bit low? It's overclocked too. I think there's something horribly wrong.


----------



## Awsan

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> Isn't this a bit low? It's overclocked too. I think there's something horribly wrong.


Try running it at stock; I have seen several cards that score considerably lower when overclocked. [Fury magic]


----------



## dagget3450

Quote:


> Originally Posted by *Awsan*
> 
> Try running it at stock; I have seen several cards that score considerably lower when overclocked. [Fury magic]


Yep, many are seeing negative effects when adding voltage. Try running at stock as mentioned, or even a small OC on stock voltage and a 50% power limit.


----------



## SuperZan

I get my best Fire Strike scores with my Fijis at max power limit with an overclock, 545 on the HBM, and no additional voltage. I've seen some people score higher with additional voltage, some without. It's really a YMMV situation; I haven't yet seen a simple formula for maxing benchmarks with Fiji. In the Novice Nimbles I had to come up with four different overclocks for four different benchmarks.


----------



## jprovido

Stock clocks, stock everything. I'm getting really annoyed now. Can you guys post your Fury benchmarks, just for reference?


----------



## SuperZan

When I'm home I'll see about trying to do one with a single GPU for your reference. This one is with Crossfire, from when I still had my 3770k; it's the most recent FS Extreme I've run: http://www.3dmark.com/fs/6875677


----------



## dagget3450

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> Stock clocks, stock everything. I'm getting really annoyed now. Can you guys post your Fury benchmarks, just for reference?


Can you try +50MHz core / +40MHz RAM / +50 power limit and 0 voltage (default voltage)? Curious to see if you'll match your OC run from earlier.


----------



## jprovido

Quote:


> Originally Posted by *dagget3450*
> 
> Can you try +50MHz core / +40MHz RAM / +50 power limit and 0 voltage (default voltage)? Curious to see if you'll match your OC run from earlier.


It crashes right away without the additional voltage. I'm getting bad stutters in Far Cry 3 too, and this is coming from a fresh install. I will try non-Crimson drivers; maybe it will perform better.


----------



## dagget3450

Quote:


> Originally Posted by *jprovido*
> 
> It crashes right away without the additional voltage. I'm getting bad stutters in Far Cry 3 too, and this is coming from a fresh install. I will try non-Crimson drivers; maybe it will perform better.


Gotcha.

We have different setups and I don't want to pull my cards out, but running without Crossfire I get this on a Fury X. Tessellation was also on for my run, at stock GPU clocks.


----------



## SuperZan

Quote:


> Originally Posted by *jprovido*
> 
> It crashes right away without the additional voltage. I'm getting bad stutters in Far Cry 3 too, and this is coming from a fresh install. I will try non-Crimson drivers; maybe it will perform better.


I had good luck with 15.7 and 15.11 both before I picked up my second card. I haven't run the latest Crimson drivers with a single GPU so I can't speak to their quality but those two editions of Catalyst worked well for me.


----------



## jprovido

I've had this card for 2 days and I've had nothing but problems. I don't want to return it because FreeSync is freakin' awesome. I will try other drivers. Wish me luck.


----------



## SuperZan

Best of luck to you. It's a great card, sorry to hear that it's been so troublesome for you. Hopefully a different driver will set it right.


----------



## jprovido

Quote:


> Originally Posted by *SuperZan*
> 
> Best of luck to you. It's a great card, sorry to hear that it's been so troublesome for you. Hopefully a different driver will set it right.


I'm not being biased or anything, but my experience with my gtx 970 compared to this is night and day. I've only had the gtx 970 for less than 3 weeks, so I was planning on returning it tomorrow, but now I'm having second thoughts. I hope the other drivers will be better; it's really hard to let go of freesync


----------



## SuperZan

Quote:


> Originally Posted by *jprovido*
> 
> i'm not being biased or anything but my experience with my gtx 970 compared to this is night and day. I've only had the gtx 970 for less than 3 weeks so I was planning on returning it tomorrow but now I'm having second thoughts. I hope the other drivers will be better It's really hard to let go of freesync


It's funny because I had the exact opposite, kept a 970 for all of a week coming from 770 SLI. It's odd what you've got going on though and it's hard to pin down what it might be. For one, in Crimson, disabling Power Efficiency under Gaming -> General could help. Otherwise the performance should be a bit better than what you had, so maybe under 15.7 or 15.11 you can find the performance and stability you want.


----------



## jprovido

Quote:


> Originally Posted by *SuperZan*
> 
> It's funny because I had the exact opposite, kept a 970 for all of a week coming from 770 SLI. It's odd what you've got going on though and it's hard to pin down what it might be. For one, in Crimson, disabling Power Efficiency under Gaming -> General could help. Otherwise the performance should be a bit better than what you had, so maybe under 15.7 or 15.11 you can find the performance and stability you want.


when it works it's awesome but it's hit or miss for me. this destroys GTA V compared to my 970, but in a measly game like Dota 2 I can't maintain 144fps; I'm getting 100-120fps and GPU load is 100%, which is really weird. I have a far more consistent experience with my gtx 970.

I've tried 15.11 and I still get stutters in Far Cry 3. I will focus on fixing this for now and will update you guys; I will try every bit of information I can google online


----------



## dagget3450

Quote:


> Originally Posted by *jprovido*
> 
> when it works it's awesome but it's like hit or miss for me. this destroys GTA V compared to my 970 but a measly game like dota 2 I can't maintain 144fps and gpu load is 100% which is really weird. I have a far more consistent experience with my gtx 970.
> 
> I've tried 15.11 I still get stutters with far cry 3. I will focus on fixing this for now will update you guys I will try every bit of information I can google online


Someone else was having similar issues, I think... well, theirs was crossfire-related, never mind.

http://www.overclock.net/t/1597252/r9-fury-crossfire-fps-drops-lags-unstable-framerate


----------



## jprovido

after trying to fix it for hours I decided to just return my R9 Fury. I guess it's having a problem with my system or something I can't really explain; I've tried EVERYTHING and I can't get this guy to work properly.

Freesync will be missed


----------



## SuperZan

Just kidding of course! Sorry to hear that the Fury wouldn't work for you, but enjoy the 970.


----------



## gupsterg

Are there members on OCN experiencing display corruption on Fury / Fury X / Nano like highlighted in this AMD Community thread?


----------



## xTesla1856

Quote:


> Originally Posted by *gupsterg*
> 
> Are there members on OCN experiencing display corruption on Fury / Fury X / Nano like highlighted in this AMD Community thread?


I experienced it once or twice right when I got my cards, but since then never. I think a driver might've fixed it, or my defective second card was causing it.


----------



## gupsterg

Cheers for the info. When home I'll link the straw poll which is in that thread. IIRC about 30 to 40 people answered yes to having the issue. I had 1 Fury X which did it twice: once on a HDTV via HDMI and once on a PC monitor via DP, IIRC on a 16.3.x Crimson driver.

Last night I read the whole thread and there are members repeatedly getting the issue, even with the latest drivers. I was surprised to read that ClockBlocker was created by a member as a workaround for the issue.

IIRC the "Power Efficiency" switch in the drivers was given to resolve display corruption but has not fixed it.

AMD had cards for testing that customers RMA'd but has not been able to reproduce the issue. Some members have had the issue even on replacement cards.

*** edit ***

Strawpoll link


----------



## DedEmbryonicCe1

This is the display corruption when idle on the desktop for very long periods of time? I haven't had that with 16.3.2 yet, but experienced it in older versions. For me the quickest fix was to open Trixx and hit Apply. Even if you didn't change any settings it would clear the corruption.


----------



## gupsterg

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> This is the display corruption when idle on the desktop for very long periods of time?


Thread's context is mainly light loads, some in other situations IIRC. Very random and intermittent: some get it every couple of hours of use, others days apart. For some it goes away for a while and then strikes back with a vengeance.

TBH, if I was in the boat some members in that thread are in with their cards, I'd either get a refund and buy a differing AMD card or go green (I haven't had a green card in 5-6 yrs).


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Thread's context is mainly light loads, some in other situations IIRC. Very random and intermittent: some get it every couple of hours of use, others days apart. For some it goes away for a while and then strikes back with a vengeance.
>
> TBH, if I was in the boat some members in that thread are in with their cards, I'd either get a refund and buy a differing AMD card or go green (I haven't had a green card in 5-6 yrs).


I have yet to have this occur to me and I wonder if it is due to cables and not the GPU. I also have my system set to never sleep or hibernate, but the monitor does shut down after 15 minutes of idle, so that may be why I haven't seen it yet. I also tend to leave my system on 24/7 and only shut it down if I am going to be away from home for over 48 hours, and only restart it if there is a reason to (software install, updates, etc.)

My first Fury X was a dud but my replacement has been flawless so far.


----------



## gupsterg

Members in that AMD thread have tried:-

i) differing cables
ii) differing ports
iii) some differing screens
iv) refresh rates
v) drivers
vi) roms
vii) fresh install of OS

In my case 2 different rigs (in my sig) /screens/cables/connection type/refresh rates, same card though. As it only happened twice over approx. 30 day period it didn't bother me _and_ I wasn't keeping that card (Fury X).

On a Fury Tri-X I had, I never saw that issue, again disposed of after playing with it (~30days use). I have 1x Fury X left, so far no issue like that or any other. Both these cards were moved about between rigs like the one which exhibited issue.

For some of the members it's been a 4-month+ rollercoaster.

My reason to post info was to see if there are other owners experiencing it on OCN and perhaps the members which come forward here can post there to add weight for a resolution from AMD.


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Members in that AMD thread have tried:-
> 
> i) differing cables
> ii) differing ports
> iii) some differing screens
> iv) refresh rates
> v) drivers
> vi) roms
> vii) fresh install of OS
> 
> In my case 2 different rigs (in my sig) /screens/cables/connection type/refresh rates, same card though. As it only happened twice over approx. 30 day period it didn't bother me _and_ I wasn't keeping that card (Fury X).
> 
> On a Fury Tri-X I had, I never saw that issue, again disposed of after playing with it (~30days use). I have 1x Fury X left, so far no issue like that or any other. Both these cards were moved about between rigs like the one which exhibited issue.
> 
> For some of the members it's been a 4-month+ rollercoaster.
> 
> My reason to post info was to see if there are other owners experiencing it on OCN and perhaps the members which come forward here can post there to add weight for a resolution from AMD.


Sounds like a nightmare... I'll update on my situation if it manifests. I don't have much time to game with school and work lately, but the rig is almost always on and in an idle state. I am kind of hoping to see it manifest at this point, as I have a ton of monitoring software running and absolutely nothing is overclocked, and on this site that can serve as a very useful data point, especially with new hardware.

Thanks for the info; I only browse this thread casually, as my Fury X problems have been fairly limited, and until all this new hardware passes the 6-month mark I refuse to OC anything as a general personal rule. Everything passed severe burn-in at stock, which is good enough for me to start with. I am a novice overclocker, so I tend to hold off on my newest rig for a bit.


----------



## AndreDVJ

Display corruption is caused by conflicts between monitoring software. Get rid of all monitoring software, especially FRAPS, TriXX, and older versions of MSI AB and HWiNFO. Stick with a single piece of software for all your monitoring needs, either MSI Afterburner or HWiNFO. I resolved this about three months ago.


----------



## BIGTom

Quote:


> Originally Posted by *AndreDVJ*
> 
> Display corruption is caused by conflicts between monitoring software. Get rid of all monitoring software, especially FRAPS, TriXX, and older versions of MSI AB and HWiNFO. Stick with a single piece of software for all your monitoring needs, either MSI Afterburner or HWiNFO. I resolved this about three months ago.


I believe monitoring software may also be the culprit. I have not experienced this issue at all on the Fury X that I purchased at launch last summer. I rarely use any monitoring software unless I am benching or testing an overclock.


----------



## Thoth420

I'm using AIDA64 as my primary monitoring software. I also have HWiNFO and CPU-Z... haven't run FRAPS in ages.
As far as GPU OC software, I prefer MSI AB, but I don't have it installed at the moment as my Fury X is in a custom loop and not overclocked.


----------



## waltercaorle

hi, I have a Tri-X with the official OC bios, connected via DP.

No problems after hibernation/inactivity. As soon as I got it I had several flickering problems at idle; now with 16.3 the problem seems to have vanished. Anyway, I just turn the monitor off and on to make everything work.
With >1.3v, the screen goes black but the GPU still works and sometimes the signal returns.
With >1.4v... a great electromagnetic shot in the headphones and everything goes black.


----------



## gupsterg

@Thoth420

Cheers for the info to say your card is all good; when I experienced display corruption the card was at stock clocks/rom, etc.

@AndreDVJ

I had no monitoring software running in the background when it occurred on my 2x rigs, same card. Perhaps post your resolution on the linked AMD thread to see if it helps others.

@BIGTom @waltercaorle

Cheers guys for the info to say your cards are all good.


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> @Thoth420
>
> Cheers for the info to say your card is all good; when I experienced display corruption the card was at stock clocks/rom, etc.
>
> @AndreDVJ
>
> I had no monitoring software running in the background when it occurred on my 2x rigs, same card. Perhaps post your resolution on the linked AMD thread to see if it helps others.
>
> @BIGTom @waltercaorle
>
> Cheers guys for the info to say your cards are all good.


No worries, and I will keep you all posted if it does manifest, as my whole system is new hardware and quite new. Once a rig passes the 6-month mark without any problems I start OCing, starting with the CPU, and while I could probably break my rule with this system since everything is running ice cold in the loop at stock clocks, time has been my enemy lately. School, work and girlfriend leave me mostly resorting to gaming on my console unless I feel the urge to play something like BF4 or The Witcher 3. I can't wait for Deus Ex (and for Hitman to drop all of its content), as that is really what I built this system for.


----------



## Perfect-Anubis

Hello everyone.

I plan on getting a new gpu very soon and am considering the Nano. I found one going for $450 on Amazon Warehouse Deals in "very good" condition. Is this a good deal price wise?


----------



## p4inkill3r

Quote:


> Originally Posted by *Perfect-Anubis*
> 
> Hello everyone.
> 
> I plan on getting a new gpu very soon and am considering the Nano. I found one going for $450 on Amazon Warehouse Deals in "very good" condition. Is this a good deal price wise?


That is an OK price IMO, but if you've waited this long, wait just a tad longer and see what you can get in a couple of months when Polaris releases.


----------



## Mumak

If using HWiNFO, make sure to use the latest version (even Beta).


----------



## Flamingo

Quote:


> Originally Posted by *SLK*
> 
> Reduce the voltage. I can do -30 core voltage and sustain clocks. Its definitely the board power limit alright.


Hmm did you mean -30mV from MSI afterburner?

This is the result:


----------



## SLK

Yes, and I noticed it started throttling as well, just not as badly. Power consumption on my UPS rang in at 445W total, which is high. I am afraid to undervolt it any more, but maybe it CAN undervolt even further. Even with -30mV the highest VCore recorded was 1.215V.


----------



## SLK

Quote:


> Originally Posted by *Perfect-Anubis*
> 
> Hello everyone.
> 
> I plan on getting a new gpu very soon and am considering the Nano. I found one going for $450 on Amazon Warehouse Deals in "very good" condition. Is this a good deal price wise?


Jet.com has a new XFX R9 Nano for $445.66 after the SHOP15 promo. XFX warranty service is worth it.


----------



## Elmy

I get to play with this all weekend.

Radeon Pro Duo FTW !!!!!

#GettobeKingNerdforaweekend


----------



## SuperZan

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> I get to play with this all weekend.
> 
> Radeon Pro Duo FTW !!!!!
> 
> #GettobeKingNerdforaweekend


Ooh! She's beautiful.


----------



## vieuxchnock

Does somebody know what is that switch on the backside of a Sapphire Fury? It's not the dual bios switch.


----------



## Elmy

Quote:


> Originally Posted by *vieuxchnock*
> 
> Does somebody know what is that switch on the backside of a Sapphire Fury? It's not the dual bios switch.


I think thats the one that controls the color of the led.


----------



## p4inkill3r

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> I get to play with this all weekend.
> 
> Radeon Pro Duo FTW !!!!!
> 
> #GettobeKingNerdforaweekend


Are you allowed to post benchmarks?


----------



## dagget3450

I don't see why not; it's 2 Fury chips on a single PCB. It's either slightly slower or faster than 2 Furies, so it's not like it's going to reveal a new world order. However, I guess it's possible they have it under some sort of NDA?


----------



## p4inkill3r

Quote:


> Originally Posted by *dagget3450*
> 
> I don't see why not, its 2 fury chips on a single pcb? It's either slightly slower or faster than 2 Furies so its not like its going to reveal a new world order. However i guess it's possible they have it under some sort of NDA?


Benchmark/review embargoes are a thing.


----------



## buildzoid

So adding caps to the HBM VRM on a Fury X gets huge gains for benching. My very worst card topped out at 550MHz HBM. With 4,000µF added it could do 570MHz fully stable and 600MHz with heavy artifacting and crashing. With 6,000µF the card does 600MHz, still with heavy artifacts, but it passes 3DMark and scores into the 20Ks even with stock core clocks. With the core clock at 1190MHz I'm getting over 21K. Mind you, this is all on stock HBM voltage. I bet you could easily clean up the artifacts by just raising the HBM voltage.

Also I think a better cap mix might do better than just piling on high-capacity electrolytics like I did. Mine are all 7mΩ ESR, so maybe with some lower-ESR caps added to the mix the stability would further improve.
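For anyone curious about the arithmetic behind the cap-mix idea: in parallel, capacitance adds while ESR combines like parallel resistors, so even a couple of low-ESR parts dominate the effective ESR of the bank. A quick sketch (all part values hypothetical, not the actual caps used above):

```python
def parallel_caps(caps):
    """caps: iterable of (capacitance_uF, esr_ohm) pairs.
    In parallel, capacitance adds; ESR combines like parallel resistors."""
    c_total = sum(c for c, _ in caps)
    esr_total = 1.0 / sum(1.0 / r for _, r in caps)
    return c_total, esr_total

# Six hypothetical 1000 uF / 7 mOhm electrolytics (a "pile-on" mix)
c1, esr1 = parallel_caps([(1000, 0.007)] * 6)   # 6000 uF, ~1.17 mOhm

# Same bank with two parts swapped for low-ESR polymers (illustrative values):
# less total capacitance, but much lower effective ESR
c2, esr2 = parallel_caps([(1000, 0.007)] * 4 + [(470, 0.002)] * 2)
```

This is why a mixed bank can beat simply stacking more electrolytics: the low-ESR parts set the effective ESR of the whole bank.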


----------



## Medusa666

Hi guys

I have a Sapphire R9 Fury Nitro OC, and the fans won't stop spinning in idle.

They are constantly at 25%, and when I game the RPM stays the same, 25%. Even if the temperature rises to 85c, the fans do not increase in RPM and the card throttles.

I have tried with both BIOS, i.e the switch both pressed ( blue light ) and non-pressed ( no light ).

Any ideas?

Running Windows 10, 64-bit.

Glad for any advice.


----------



## gupsterg

I noted that behavior when modded a ROM to be Lookup Table fan mode rather than default Advanced fan mode (aka Fuzzy Logic).

Have you set a custom fan profile via software?


----------



## Elmy

Quote:


> Originally Posted by *p4inkill3r*
> 
> Are you allowed to post benchmarks?


Nope :-/

I can tell you it's 10 and 1/2 inches long.

For some reason, I don't know if it's me or what, but it seems to run smoother than 2 Fury X's. (I didn't say faster... just smoother.)

It looks like it has the same fan and radiator as the Fury X.

It takes 3 8-pins to power it. But everyone knows that already.

It has the same connections as the Fury X: 3 full-size DisplayPort and one HDMI.

It comes in a big box and has what I assume is a dead Fiji chip as a souvenir.

I am not under an NDA. They never asked me not to post anything. But I know there is one, and I can't release any performance numbers until I see them released by someone else first.

I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour; temps never got above 55c and I never had one hiccup at all.

This will be hands down the best VR card for the time being. (my opinion)

AMD has had the fastest graphics card for the past 2 years with the 295X2. This thing will most likely take its place as the new fastest graphics card in the world. It's kind of funny: when I am at LAN parties people come up to me and ask what GPUs I have, because I have waterblocks on them and you can't really tell unless you are a graphics card aficionado. I tell them it's a pair of the fastest GPUs in the world, then I ask them what that is, and they usually respond with the Titan X. I then have to educate them.

Anyways, I will have more information in the future.


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> Nope :-/
> 
> I can tell you its 10 and 1/2" s long.
> 
> For some reason I don't if its me or what but it seems to run smoother than 2 Fury X's. ( I didn't say faster... Just smoother)
> 
> It looks like it has the same fan and radiator as the Fury X.
> 
> It takes 3 8 pin to power it. But everyone knows that already.
> 
> It has the same connections as the Fury X. 3 Full size displayport and one HDMI.
> 
> It comes in a big box and has a what I assume is a dead Fiji chip as a souvenir.
> 
> I am not under an NDA. They never asked me to not post anything. But I know there is one and I can't release any performance numbers until I see them released by someone else first.
> 
> I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour and temps never got above 55c and never had one hickup at all.
> 
> This will be hands down the best VR card for the time being. ( my opinion )
> 
> AMD has had the fastest Graphics card for the past 2 years with the 295X2. This thing will most likely take its place as the new fastest graphics card in the world. Its kind of funny when I am at LAN parties people come up to me and ask what GPU's I have because I have waterblocks on them and can't really tell unless you are a graphics card aficionado. I tell them that its a pair of the fastest GPU's in the world and then I ask them what that is and they usually respond with the Titan X. I then have to educate them.
> 
> Anyways I will have more information in the future.


On a side note, the fact they are releasing this now suggests to me that the Polaris parts launching in a couple of months will all be low/mid range. I don't see them doing this if they were going to replace it with Polaris in 2 months. So what I'm wondering is whether people are going to be super disappointed in what NVIDIA and AMD launch this summer (I see a lot of them suggesting 30% faster than the 980 Ti, etc.).


----------



## gupsterg

Quote:


> Originally Posted by *Elmy*
> 
> I get to play with this all weekend.
> 
> Radeon Pro Duo FTW !!!!!
> 
> #GettobeKingNerdforaweekend











Quote:


> Originally Posted by *Elmy*
> 
> I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour and temps never got above 55c and never had one hickup at all.










Really good for a 1x 120mm rad cooling 2x Fiji; for me much easier to house this setup than 2x Fury X. The only downside I guess is price.


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *buildzoid*
> 
> So adding caps to the HBM VRMon a Fury X gets huge gains for benching. My very worst card topped out at 550mhz HBM. With 4K uF it could do 570mhz fully stable and 600mhz with heavy artifacting and crashing. With 6K uF the card does 600mhz still with heavy artifacts but it passes 3Dmark and scores into the 20Ks even with stock core clocks. With core clock at 1190mhz I'm getting over 21K. Mind you this is all on stock HBM voltages. I bet you could easily clean up the artifacts by just raising HBM voltage.
> 
> Also I think a better cap mix might do better than just piling on high capacity electrolytics like I did. Mine are all 7m ohm ESR. So maybe with some lower ESR caps added to the mix the stability would further improve.


Are you going to post a guide when you are done?


----------



## buildzoid

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> Are you going to post a guide when you are done?


IDK, I don't think it's really guide-worthy material. I might do a quick video guide or something. I'll probably do a general cap mod guide, because it's mostly just knowing what caps to choose and then finding places where the caps can easily be attached to GND and the voltage rail you're trying to smooth.


----------



## p4inkill3r

Quote:


> Originally Posted by *Elmy*
> 
> Nope :-/
> 
> I can tell you its 10 and 1/2" s long.
> 
> For some reason I don't if its me or what but it seems to run smoother than 2 Fury X's. ( I didn't say faster... Just smoother)
> 
> It looks like it has the same fan and radiator as the Fury X.
> 
> It takes 3 8 pin to power it. But everyone knows that already.
> 
> It has the same connections as the Fury X. 3 Full size displayport and one HDMI.
> 
> It comes in a big box and has a what I assume is a dead Fiji chip as a souvenir.
> 
> I am not under an NDA. They never asked me to not post anything. But I know there is one and I can't release any performance numbers until I see them released by someone else first.
> 
> I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour and temps never got above 55c and never had one hickup at all.
> 
> This will be hands down the best VR card for the time being. ( my opinion )
> 
> AMD has had the fastest Graphics card for the past 2 years with the 295X2. This thing will most likely take its place as the new fastest graphics card in the world. Its kind of funny when I am at LAN parties people come up to me and ask what GPU's I have because I have waterblocks on them and can't really tell unless you are a graphics card aficionado. I tell them that its a pair of the fastest GPU's in the world and then I ask them what that is and they usually respond with the Titan X. I then have to educate them.
> 
> Anyways I will have more information in the future.












Thanks for the information.


----------



## djsatane

Hi, I just got an ASUS Fury X and installed it. My previous card was a 290X and I used older TriXX software for a custom fan profile. Now, with the latest Crimson drivers and the Fury X, my question is whether there is any way to set a target GPU temperature and target fan speed in the Crimson options so the fan actually speeds up already at 50C, or do I have to use the latest TriXX software if I want a nice custom fan profile? Thanks! Btw, I will read how to join the club and take pics a bit later.


----------



## gupsterg

@djsatane

The Crimson drivers are not gonna allow what you wish to achieve.

You've got 2 routes:-

i) custom fan curve via MSI AB, TriXX, etc.

ii) bios mod


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> @djsatane
> 
> Crimson drivers not gonna allow what you wish to achieve.
> 
> You got to 2 routes:-
> 
> i) custom fan curve via MSI AB, TriXX, etc
> 
> ii) bios mod


Thanks for the reply! I would use TriXX, but the new 5.xx TriXX utility cannot be minimized... they still haven't fixed that, so I cannot really use it; very annoying. I will try MSI AB. As for the bios mod, that's a last resort, as all I want is a custom fan profile.


----------



## gupsterg

Bios mod is the best IMO; you need no SW running or setting the fan.

i) save stock ROM using GPU-Z.

ii) use Fiji Bios editor to make your stock ROM like this:-



iii) flash modded ROM using ATiWinFlash (in op of that thread).


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> Bios mod is the best IMO; you need no SW running or setting the fan.
> 
> i) save stock ROM using GPU-Z .
> 
> ii) use Fiji Bios editor to make your stock ROM like this:-
> 
> 
> 
> iii) flash modded ROM using ATiWinFlash (in op of that thread).


Yes, I may do this. I wish these 2 extra options were in the Crimson drivers like in that picture of the bios editor... lol, but that's too much to ask from AMD I guess...


----------



## gupsterg

Totally agree; if they were in the driver it would be ACE. Instead they give you 2 options which are useless.


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> ii) use Fiji Bios editor to make your stock ROM like this:-
> 
> 
> 
> iii) flash modded ROM using ATiWinFlash (in op of that thread).


I have used your settings and modded my bios (I used the latest ASUS-provided bios for the Fury X off their website as the original before editing), with the bios that came with the card on the other switch position. So far all good, but I have to do some testing. Thanks for your help!

My build is simple, nothing flashy, but I am happy with my Fury X so far!


----------



## bluezone

Quote:


> Originally Posted by *djsatane*
> 
> Hi, I just got ASUS fury x and installed it, my previous card was 290x and I used older trixx software to use custom fan profile. Now with latest Crimson drivers and fury x my question is whether there is any way to set target gpu temperature and target fan speed in crimson options where I would have fan to actually speed up already at 50C or do I have to use latest Trixx software if I want to have nice custom fan profile? Thanks! Btw I will read how to join the club and take pics a bit later.


Does the temperature target feature in the Crimson driver not work?


----------



## gupsterg

Quote:


> Originally Posted by *djsatane*
> 
> I have used your settings and modded my bios(I used latest asus provided bios for fury x off their website as original before edit), and original that came with card on the other bios switch. So far all good but I have to do some testing, thanks for your help!
> 
> My build is simple, nothing flashy but I am happy with my fury x so far!


No worries.
Quote:


> Originally Posted by *bluezone*
> 
> Does the temperature target feature in the Crimson driver not work?


All that does is make your card throttle to keep to the target GPU temp; it's not the fan table portion of the target GPU temp which affects the fan.


----------



## Flamingo

How will Polaris 10 compare to the Nano? If it's ~2K stream processors with a 256-bit bus, it can't be faster than the Nano, right? I'm assuming that would be the R9 480.

But the Polaris lineup supposedly includes the 490 and 490X, and it supposedly launches in June... is it a good time to sell the Nano and wait for the 490? Of course, it has to be SFF like the Nano.

Also, considering the P10 ran Hitman DX12 at 1440p at 60fps: can anyone with a Nano and Hitman check whether it runs 1440p at 60fps?


----------



## dagget3450

I am selling off my Fury X, but not because of Polaris. I have a feeling that Polaris will disappoint if you're expecting high end. What I mean is, given the launch of the Pro Duo, I think they will launch low/midrange cards in June, then maybe higher end later in the year? This is just speculation on my part.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> All that does is make your card throttle to keep to target gpu temp, it's not the fan table portion of target gpu temp which effects fan.


I didn't know that. Hearing that now, I'll probably never try it and stick with Trixx minimised to the task bar.


----------



## bluezone

Anyone else read this spec leak about Polaris and Vega?

http://wccftech.com/amd-polaris-11-gpu-specifications-leaked-compubench/


----------



## DedEmbryonicCe1

Yesterday I managed to get the IR 3567B voltage controller to its max rated temp of 85°C (http://www.irf.com/product-info/datasheets/data/pb-ir3567b.pdf). There was a noticeable "hot electronics" smell (not burnt) that led me to back off on my voltage. It could be a temporary thing when it first gets to this temperature, but I'm being extra cautious.



I looked back through old reviews with teardowns of the Fury X cooler (https://www.techpowerup.com/reviews/AMD/R9_Fury_X/5.html) and realized it has no thermal pad connecting it to the aluminum frame. That seems like an incredibly simple fix that should have been included from the start, but oh well... I don't suppose anyone happens to know the height of the air gap at that location?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I didn't know that. Hearing that now, I'll probably never try it and stick with Trixx minimised to the task bar.


No worries. Why use SW for a fan profile when you can mod the ROM?

You do know the ROM uses Advanced fan control (aka Fuzzy Logic): based on load, temps, etc., it will adjust the fan profile to maintain the temp target. It is more "dynamic" than, say, a custom fan curve set with SW, which is basically a lookup table, i.e. for x degrees do y PWM.
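To illustrate the difference: a software fan curve really is just a lookup table with linear interpolation between its points, driven by temperature alone. A minimal sketch, with made-up curve points (not the actual ROM or TriXX values):

```python
# Hypothetical (temp_C, fan_PWM%) points of a software-style fan curve.
CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_pwm(temp_c):
    """Lookup-table fan control: linearly interpolate PWM between points.
    Unlike the ROM's fuzzy-logic mode, it ignores load -- only temp matters."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]        # clamp below the first point
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]       # clamp above the last point
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
```

The fuzzy-logic mode in the ROM instead adjusts the curve on the fly from load and temperature trends, which is why it can hold a temp target more smoothly than this fixed table.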
Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> Yesterday I managed to get the IR 3567B voltage controller to its max (http://www.irf.com/product-info/datasheets/data/pb-ir3567b.pdf) temp of 85°C.


The temps you see in HWiNFO are for the mosfets per loop, i.e. loop 1 = GPU, loop 2 = HBM. They are not temps of the IR3567B; it does not need cooling.



Basically the HBM VRM is cooler due to its lower load, and it also isn't getting heated coolant flowing over it like the GPU VRM.



Quote:


> Originally Posted by *bluezone*
> 
> Anyone else read this spec leak about Polaris and Vega?
> 
> http://wccftech.com/amd-polaris-11-gpu-specifications-leaked-compubench/


Yep, and other stuff; have you seen this?


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *gupsterg*
> 
> The temps you see in HWiNFO are for the mosfets per loop, ie loop 1 = GPU loop 2 = HBM. They are not temps of IR3567B, it does not need cooling.


So VR VDDC = GPU and VRM VDD = HBM?
1.275v on the core gave me 85°C and 1.25v was 80°C. I'm guessing the VRMs are rated to 100-105°C or thereabouts.


----------



## Visa Declined

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> So VR VDDC = GPU and VRM VDD = HBM?
> 1.275v on the core gave me 85°C and 1.25v was 80°C. I'm guessing the VRMs are rated to 100-105°C or thereabouts.


I personally did not see performance gains from adding voltage and clocking past 1100MHz with my Furys. I guess I'm getting too old to see the worth in stressing the card's components for relatively small gains. IMO these cards are already pushed to their efficiency limit when they leave the factory.


----------



## wesbluemarine

I have an R9 Nano and I'm using Catalyst 15.11.1 because it gives a gaming experience without coil whine (except a bit in The Witcher 3).
I'd like to use the Crimson drivers because they provide better performance (I've tested this myself), but they produce slightly louder coil whine.
Is it possible to eliminate coil whine by overclocking/overvolting? Can any R9 Nano/Fury X owner help me?
Thanks


----------



## Visa Declined

Quote:


> Originally Posted by *wesbluemarine*
> 
> I have an R9 Nano and I'm using Catalyst 15.11.1 because it gives a gaming experience without coil whine (except a bit in The Witcher 3).
> I'd like to use the Crimson drivers because they provide better performance (I've tested this myself), but they produce slightly louder coil whine.
> Is it possible to eliminate coil whine by overclocking/overvolting? Can any R9 Nano/Fury X owner help me?
> Thanks


I've read through a ton of Amazon reviews for Nanos, and it seems like every manufacturer's version has coil whine to some degree. I personally wouldn't let that stop me from using the best-performing driver, but it sounds like it's bugging you pretty badly. Unchecked/super-high frame rates will cause my Furys to whine sometimes too; the exit screen of the Unigine Heaven benchmark will make my cards scream like crickets on meth.


----------



## dagget3450

Quote:


> Originally Posted by *Visa Declined*
> 
> I've read through a ton of Amazon reviews for Nanos, and it seems like every manufacturer's version has coil whine to some degree. I personally wouldn't let that stop me from using the best-performing driver, but it sounds like it's bugging you pretty badly. Unchecked/super-high frame rates will cause my Furys to whine sometimes too; the exit screen of the Unigine Heaven benchmark will make my cards scream like crickets on meth.


Are the crickets on meth screaming, or are you on meth when you hear crickets scream?


----------



## gupsterg

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> So VR VDDC = GPU and VRM VDD = HBM?
> 1.275v on the core gave me 85°C and 1.25v was 80°C. I'm guessing the VRMs are rated to 100-105°C or thereabouts.


Ref this post: Fiji's PowerPlay has a VRTemp limit of 105°C.

http://www.irf.com/product-info/datasheets/data/irf6811spbf.pdf
http://www.irf.com/product-info/datasheets/data/irf6894mpbf.pdf
Quote:


> Originally Posted by *Visa Declined*
> 
> I personally did not see performance gains from adding voltage and clocking past 1100MHz with my Furys. I guess I'm getting too old to see the worth in stressing the card's components for relatively small gains. IMO these cards are already pushed to their efficiency limit when they leave the factory.


My stock VID for DPM7 is 1.212V; I can do 1135/535 with 1.243V (i.e. +31mV). That's GPU +8% over stock and RAM +7% over stock for a +2.6% VID increase over stock ((0.031/1.212)*100). Note I stated VID, not VDDC as shown in monitoring software.

Here is my 3DM compare: I gain 6.1%-6.8% in the graphics-oriented tests and 9.8% in the combined test where CPU & GPU are tested together. To me that's good scaling, especially for a +2.6% VID increase.
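For anyone wanting to check the arithmetic, here's the same math as a small sketch (assuming reference Fury X clocks of 1050MHz core / 500MHz HBM, which is what the +8%/+7% figures imply):

```python
# Quick check of the overclock-scaling figures quoted above.
# Assumed stock Fury X clocks: 1050 MHz core / 500 MHz HBM.
stock_core, stock_hbm, stock_vid = 1050, 500, 1.212
oc_core, oc_hbm, oc_vid = 1135, 535, 1.243

core_gain = (oc_core / stock_core - 1) * 100        # ~8.1%
hbm_gain = (oc_hbm / stock_hbm - 1) * 100           # 7.0%
vid_gain = (oc_vid - stock_vid) / stock_vid * 100   # (0.031/1.212)*100 ~ 2.6%

print(f"core +{core_gain:.1f}%, HBM +{hbm_gain:.1f}%, VID +{vid_gain:.1f}%")
```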

There is a phenomenon with Fiji whereby adding extra voltage can cause an FPS drop. I have noted it, as have Buildzoid and others; it's yet to be determined whether it is LeakageID based, VID, power draw, etc.



So far at +50mV I note no drop in FPS, 3DM compare.


----------



## Visa Declined

Those are really nice results gupsterg, what are your temps like with that large an overclock? Sorry if you typed it and I missed it, I'm on mobile inside a noisy club.


----------



## gupsterg

Cheers.

I will post screenies later; right now I'm on mobile.

From memory, the max I see is 77°C on the GPU VRM with the 3DM FS demo looped; the GPU is maintained at ~50°C. That is of course with a custom ROM. Mods are:

i) GPU DPM7 VID & clock mod.
ii) HBM clock mod in ROM with RAM Overdrive matched.
iii) GPU target temp in fan table of PowerPlay mod, hence GPU/HBM/VRM cooler.
iv) Powerlimit in PowerTune table of PowerPlay mod.

It blows away my best 24/7-use Hawaii card with a custom ROM. Folding@Home is much cooler IIRC than 3DM FS; multiple tests of up to 18hrs so far. Much quieter/cooler than any of the Hawaii aftermarket-cooler cards I've had (DCU2/Tri-X/Vapor-X). And still very close to stock AIO fan noise with the profile mod, IMO.

I originally had 3 cards (2x Fury X, 1x Fury Tri-X); I kept the best one. My other Fury X had a 1.25V VID at DPM7; the max it would do is 1090/[email protected] VID, and any VID increase up to 50mV would not help it at all. The Fury Tri-X was DPM7 1.243V, again 1090/525, with up to 50mV applied and no gain in OC. That one did unlock to 3840SP and was benching identical to a genuine Fury X clock-for-clock in 3DM FS.

I will also post some data later which IMO proves that HBM does not stick to discrete steps (i.e. 500, 545, ...) as AMD Matt has posted on several forums.

Do a search for Fiji bios mod on OCN; you'll find info that should help with OC'ing via ROM if you wish to implement it.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> No worries, why use SW for fan profile when you can mod the ROM?
> 
> You do know the ROM uses advanced fan control (aka fuzzy logic): based on load, temps, etc., it adjusts the fan profile to maintain the temp target. It's more "dynamic" than a custom fan curve in SW, which is basically a lookup table, i.e. for x degrees run y PWM.
> 
> Yep, and other stuff. Have you seen this?


Yes, it's one of the reasons I keep lurking on the BIOS mod thread.

I'm on the fence because I get decent performance now and I'm not versed in doing actual BIOS mods.

That was a very interesting read.

This is a YouTube video by a fellow who does a good aggregate of GPU and CPU information. I like the way he thinks.

I have insomnia again and I'm about to try going back to bed.


----------



## Stige

Asking this on behalf of a friend so bear with me, I know very little about these.

It seems there is only a single 8-pin connector on the Nano and it power throttles a lot because of that? Is there anything to do to remedy this apart from undervolting the card? The Nano seems rather... underwhelming due to this issue, I think.


----------



## Visa Declined

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers.
> 
> From memory, the max I see is 77°C on the GPU VRM with the 3DM FS demo looped; the GPU is maintained at ~50°C. That is of course with a custom ROM. Mods are:


That is totally awesome!


----------



## gupsterg

Cheers.

Stock TIM/pads, AIO unit, fan, etc.; the only thing not stock is the fan profile. My case has mods, but nothing I'd class as over the top or that would significantly improve the AIO's performance on the Fury X.

TBH I wish I had the Tri-X cooler on the Fury X. That card with the fan mod also maintained ~55°C (limited testing; it could do better IMO) while being not much louder than the stock profile. The GPU VRM runs cooler on it than on the Fury X, as it has cooling independent of the GPU and no heated coolant flowing to it. I also found that card easier to handle, plus it has better case compatibility.

Room ambient was up to ~24°C in all scenarios.


----------



## hyp36rmax

Hey guys a BIOS update for the NANO and FURY X

Quote:


> *AMD Outs Video BIOS Update for R9 Fury Series with Improved UEFI Support*
> 
> by
> 
> btarunr
> 
> Monday, April 18th 2016 02:07 Discuss (4 Comments)
> AMD released an official video-card BIOS update for the Radeon R9 Fury X and Radeon R9 Nano graphics cards, which improve UEFI BIOS support. End users on our forums are also reporting improved overclocking stability. UEFI support at the video-BIOS level is required for the card to run without CSM at the system-BIOS end, in turn enabling useful OS features such as Secure Boot. Several of AMD's add-in board (AIB) partners already ship their cards with UEFI-ready BIOS. AMD is distributing the BIOS as ROM images, and it takes thorough knowledge of how to flash your graphics card's BIOS, to make use of these ROM images.
> 
> *DOWNLOAD:* AMD Video BIOS Update for Radeon R9 Nano | Radeon R9 Fury X | From AMD Website


*Source: *Link


----------



## Alastair

Quote:


> Originally Posted by *hyp36rmax*
> 
> Hey guys a BIOS update for the NANO and FURY X
> Quote:
> 
> 
> 
> *AMD Outs Video BIOS Update for R9 Fury Series with Improved UEFI Support*
> 
> by
> 
> btarunr
> 
> Monday, April 18th 2016 02:07 Discuss (4 Comments)
> 
> AMD released an official video-card BIOS update for the Radeon R9 Fury X and Radeon R9 Nano graphics cards, which improve UEFI BIOS support. End users on our forums are also reporting improved overclocking stability. UEFI support at the video-BIOS level is required for the card to run without CSM at the system-BIOS end, in turn enabling useful OS features such as Secure Boot. Several of AMD's add-in board (AIB) partners already ship their cards with UEFI-ready BIOS. AMD is distributing the BIOS as ROM images, and it takes thorough knowledge of how to flash your graphics card's BIOS, to make use of these ROM images.
> 
> 
> 
> *DOWNLOAD:* AMD Video BIOS Update for Radeon R9 Nano | Radeon R9 Fury X | From AMD Website
> 
> 
> 
> *Source: *Link
Click to expand...

No love for Fury Tri-X owners?


----------



## dagget3450

Good lord, how long have these GPUs been out? I guess it's nice they are actively improving. I wonder what effect this will have on overclocking.


----------



## gupsterg

Quote:


> Originally Posted by *hyp36rmax*
> 
> Hey guys a BIOS update for the NANO and FURY


AIBs have also had updated ROMs for a while; for example, Asus and Gigabyte have had them on their sites. But the AMD Community forum ones are the latest version/compile date.

Sapphire gave me an updated one for my Fury X via their support ticket service ("out of box" the Fury X had a non-UEFI ROM); the AMD release is still newer than Sapphire's updated UEFI ROM. There are updated Fury Tri-X STD & OC editions as well; Szaby59 posted them in the Fiji BIOS mod thread.

The latest AMD Fury X and Nano ROMs I have modded to have a VDDC offset, to allow more than 1.3V if required, and I've also done an MVDDC offset; they are in the OP of the Fiji BIOS mod thread. Users who have tested the Fury X & Nano ROMs have given positive feedback in that thread as well.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hyp36rmax*
> 
> Hey guys a BIOS update for the NANO and FURY
> 
> 
> 
> AIBs have also had updated ROM for a while, for example Asus and Gigabyte have had them on their sites, but the AMD Community forum ones are the latest version/compile date.
> 
> Sapphire gave me an updated one for Fury X via support ticket service, "out of box" Fury X was Non UEFI ROM, the AMD release is still newer than the updated UEFI ROM by Sapphire. There are updated Fury Tri-X STD & OC edition as well, Szaby59 posted them in fiji bios mod thread.
> 
> The latest AMD Fury X and Nano ROMs I have modded to have a VDDC offset, to allow more than 1.3V if required, and I've also done an MVDDC offset; they are in the OP of the Fiji BIOS mod thread. Users who have tested the Fury X & Nano ROMs have given positive feedback in that thread as well.
Click to expand...

Do these "new" BIOSes still have the negative scaling as the offset voltage increases?


----------



## Maximization

Where do you get the new BIOS?


----------



## toncij

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> I get to play with this all weekend.
> 
> Radeon Pro Duo FTW !!!!!
> 
> #GettobeKingNerdforaweekend


In the lower right corner you get a replacement chip?


----------



## bluezone

Quote:


> Originally Posted by *hyp36rmax*
> 
> Hey guys a BIOS update for the NANO and FURY X
> 
> *Source: *Link


I flashed the newest Nano BIOS and it seems a little bit better than the one I was using. The weird thing is, I kept getting strange cooling fan speed spikes with it. So I monitored with GPU-Z, and it was showing 419°C spikes. Not possible, and no, there isn't a decimal point missing. LOL

Very strange.


----------



## dagget3450

Quote:


> Originally Posted by *toncij*
> 
> In the lower right corner you get a replacement chip?


They include a free replacement chip in case you kill one overclocking. Just pop open the shroud and voila!


----------



## bluezone

Quote:


> Originally Posted by *Stige*
> 
> Asking this on behalf of a friend so bear with me, I know very little about these.
> 
> It seems there is only a single 8-pin connector on the Nano and it power throttles a lot because of that? Is there anything to do to remedy this apart from undervolting the card? The Nano seems rather... underwhelming due to this issue, I think.


Undervolting on the Nano is used for two reasons: 1) to reduce temperatures, and 2) to allow the power limit to be set at +50% and still get the performance.
IMO it's roughly comparable to a GTX 980 (maybe a little better, depending on the game). Throw a water block ($$) on it and it looks even better.
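As a rough illustration of why undervolting buys power-limit headroom: dynamic power scales roughly with frequency times voltage squared, so a small voltage cut saves a disproportionate amount of power. The reference operating point below is made up for illustration, not a measured Nano figure:

```python
# Illustrative only: dynamic power scales roughly with f * V^2, which is
# why a modest undervolt frees up disproportionate power-limit headroom.
# The reference point (1000 MHz @ 1.15 V) is hypothetical, not measured.

def rel_dynamic_power(freq_mhz, vcore, ref_freq=1000.0, ref_v=1.15):
    """Dynamic power relative to a reference operating point (P ~ f * V^2)."""
    return (freq_mhz / ref_freq) * (vcore / ref_v) ** 2

# Same clock with a 75 mV undervolt: ~13% less dynamic power for the same work.
print(rel_dynamic_power(1000, 1.075))
```

That saved power is exactly what lets the card hold its boost clocks inside the power limit instead of throttling.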


----------



## Zealon

I've got it flashed to the new BIOS. Testing some games now.



http://www.techpowerup.com/gpuz/details/8bsgy


----------



## buildzoid

Quote:


> Originally Posted by *Alastair*
> 
> Do these "new" BIOS'es still have the negative scaling as the offset voltage increases?


As far as I can tell they do, but on the other hand they clock really well. I have the BIOS modded for 1.3V VID and all three of my cards are doing about 1190MHz core in 3DMark.


----------



## toncij

Quote:


> Originally Posted by *dagget3450*
> 
> They include a free replacement chip in case you kill one overclocking. Just pop open the shroud and voila!


No, really, what is it? It does look like a chip.

A sticker?


----------



## xkm1948

Quote:


> Originally Posted by *toncij*
> 
> No, really, what is it? It does look like a chip.
> 
> A sticker?


Defective Fiji core as decoration. They should have made it into a key-chain.


----------



## toncij

Quote:


> Originally Posted by *xkm1948*
> 
> Defective Fiji core as decoration. They should have made it into a key-chain.


LOL, that is a neat touch.
Are you under NDA, or can you tell us what you think of the card? How does it run compared to the available R9 Nano CF, etc.? Some 3DMark maybe?
Where did you buy it so early, anyway?


----------



## Agent Smith1984

He didn't buy it, lol

Pretty sure he is keeping things under wraps until someone else posts about the card first....

He has already given a nice report on temps and how "smooth" the card runs, though he said something to the equivalent of "not saying it's faster than Fury X CF, just seems smoother"

I am looking forward to a review, but I gotta be honest..... At $1500 there's no way this card stands a chance in the market except for people wanting to build smaller form factor systems, OR people wanting 4x CF in a smaller mid-tower setup (like my s340)

On the other hand, if this thing were $1000-1200, then I could easily see it having a place in any high-end rig. Just a little too rich for my blood.


----------



## Alastair

Quote:


> Originally Posted by *toncij*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Elmy*
> 
> 
> 
> I get to play with this all weekend.
> 
> Radeon Pro Duo FTW !!!!!
> 
> #GettobeKingNerdforaweekend
> 
> 
> 
> In the lower right corner you get a replacement chip?
Click to expand...

I will buy that chip and pay for shipping. The decoration chip! Please! Senpai! I beg you!


----------



## toncij

Quote:


> Originally Posted by *Agent Smith1984*
> 
> He didn't buy it, lol
> 
> Pretty sure he is keeping things under wraps until someone else posts about the card first....
> 
> He has already given a nice report on temps and how "smooth" the card runs, though he said something to the equivalent of "not saying it's faster than Fury X CF, just seems smoother"
> 
> I am looking forward to a review, but I gotta be honest..... At $1500 there's no way this card stands a chance in the market except for people wanting to build smaller form factor systems, OR people wanting 4x CF in a smaller mid-tower setup (like my s340)
> 
> On the other hand, if this thing were $1000-1200, then I could easily see it having a place in any high-end rig. Just a little too rich for my blood.


Did he get it for free for a review, or as an AMD partner?

Yes, $1500 is high, but if you have a reason to go for 4x CF on 2 slots, or 2x on 1, it's the only solution. Today, when PCIe slots are also needed for SSDs, that makes sense.


----------



## Agent Smith1984

Anyone interested in performance numbers..... here is the footnote on the product page at the AMD website:

Testing conducted by AMD Performance Labs as of March 7, 2016 on the AMD Radeon™ Pro Duo, AMD Radeon™ R9 295X2 and Nvidia's Titan Z, all dual GPU cards, on a test system comprising Intel i7 5960X CPU, 16GB memory, Nvidia driver 361.91, AMD driver 15.301 and Windows 10 using 3DMark Fire Strike benchmark test to simulate GPU performance. PC Manufacturers may vary configurations, yielding different results. At 1080p, 1440p, and 2160P, AMD Radeon™ R9 295X2 scored 16717, 9250, and 5121, respectively; Titan Z scored 14945, 7740, and 4099, respectively; and AMD Radeon™ Pro Duo scored 20150, 11466, and 6211, respectively, outperforming both AMD Radeon™ R9 295X2 and Titan Z. RPD-1

The problem with those numbers is that they appear to be overall scores, so there's no actual graphics-score information.
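For anyone who wants the footnote's scores as relative leads rather than raw numbers, here's a quick sketch computed straight from the figures quoted above:

```python
# The AMD footnote's overall Fire Strike scores, restated as percentage
# leads for the Pro Duo (numbers copied straight from the footnote above).
scores = {  # resolution: (Pro Duo, R9 295X2, Titan Z)
    "1080p": (20150, 16717, 14945),
    "1440p": (11466, 9250, 7740),
    "2160p": (6211, 5121, 4099),
}
for res, (duo, x2, tz) in scores.items():
    print(f"{res}: +{(duo / x2 - 1) * 100:.1f}% vs 295X2, "
          f"+{(duo / tz - 1) * 100:.1f}% vs Titan Z")
```

These remain overall scores, so the CPU portion of the combined test still dilutes the GPU-only picture.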


----------



## bluezone

On an off-topic note: I see the possible new PS4 seems to be quite an upgrade. It looks like they will be using an R9 490 equivalent for the GPU; the internal name is NEO.
Project Morpheus was PlayStation VR. Any bets on Project Trinity or Project Agent Smith?
http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluezone*
> 
> On an off-topic note: I see the possible new PS4 seems to be quite an upgrade. It looks like they will be using an R9 490 equivalent for the GPU; the internal name is NEO.
> Project Morpheus was PlayStation VR. Any bets on Project Trinity or Project Agent Smith?
> 
> http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/


I read about it and discussed in the 390 thread earlier.

It's still only going to have 36 CUs in the 900MHz range, so on paper it's only going to be about as fast as maybe a 380X. However, if it's the new GCN architecture and gets the 2306 shaders forecast for Polaris 10, it could be pretty strong for a console... Based on the bandwidth improvement, it may also use GDDR5X instead of GDDR5...

I'm pretty sure they will use Polaris for this project to save on power consumption and a console's worst enemy... HEAT.

Here is Microsoft's second chance to make a worthy Xbox One.
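A back-of-envelope sanity check on those rumored specs (assuming standard GCN with 64 SPs per CU and 2 FLOPs per SP per clock; 900MHz is taken as the low end of the quoted range):

```python
# Back-of-envelope GCN throughput for the rumored console specs above.
# Assumes standard GCN: 64 stream processors per CU, 2 FLOPs per SP per clock.

def gcn_tflops(cus, clock_mhz, sp_per_cu=64, flops_per_clock=2):
    return cus * sp_per_cu * flops_per_clock * clock_mhz / 1e6

print(gcn_tflops(36, 900))    # ~4.1 TFLOPS for the rumored 36 CU @ 900 MHz GPU
print(gcn_tflops(64, 1050))   # ~8.6 TFLOPS for a stock Fury X, for scale
```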


----------



## bluezone

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I read about it and discussed in the 390 thread earlier.
> 
> It's still only going to have 36 CU's in the 900MHz range, so on paper it's only going to be as fast as maybe the 380x, however if it's new GCN architecture, and gets the 2306 shaders forecasted for Polaris 10, it could be pretty strong for console.... Based on the bandwidth improvement, it may also use GDDR5x instead of GDDR5....
> 
> I'm pretty sure that they will definitely use polaris for this project to save on power consumption and console's worst enemy... HEAT
> 
> Here is Microsoft's second chance to make a worthy Xbox One.


Here's a link to that discussion. Interesting read.

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/8880#post_25088761

This might be it here:

http://www.redgamingtech.com/amd-polaris-10-11-gpu-specs-revealed-shader-power-draw/


----------



## dagget3450

Any updates on new fiji bios and overclocking yet?


----------



## baii

Can I flash the uefi fury x bios on a full unlocked fury?

I have some issues with a 4K display: intermittent black screens, not waking up properly, and sometimes driver crashes that force a restart. Although I should probably blame my long DP cable for the first problem.


----------



## Maximization

Quote:


> Originally Posted by *dagget3450*
> 
> Any updates on new fiji bios and overclocking yet?


I did get my highest score with same overclock.

http://www.3dmark.com/fs/8238446

Set aside some time: it broke my CrossFire during the BIOS upgrade and Win 10 had to do a repair routine. I'm lucky I have DIP switches to kill my PCIe slots; I had to switch cables around and do each card by itself. AMD loaded the most recent driver. I'm using MSI Afterburner and haven't pushed further yet.


----------



## n64ADL

Do you guys think I should buy an R9 Nano now, or wait for Polaris? I have an R9 390 and it doesn't cut it for most games at 3440x1440. I have a 750W power supply, so getting two R9 390s, plus the Core i7 5820K's limited bandwidth, makes that impractical. Microcenter, if you get their warranties, offers a two-month return policy on their products. Should I just jump now? Two weeks away seems so far...


----------



## toncij

Well, I doubt any Polaris chip released now will match Fury line, including Nano (which is actually a Fury X with thermal limits). Watercool a Nano and you're set until Vega next year.


----------



## Elmy

Quote:


> Originally Posted by *toncij*
> 
> LOL, that is a neat touch.
> Are you under NDA, or can you tell us what you think of the card? How does it run compared to the available R9 Nano CF, etc.? Some 3DMark maybe?
> Where did you buy it so early, anyway?


Didn't buy it. It's hard to explain how I got it, but I am sponsored by AMD and have a relationship with them. I do have a 2nd one coming too.

If you are super curious about what I do, go check out my YouTube page: https://www.youtube.com/channel/UCYBxPCnM13zEWgNbixq84jA ... I have a whole bunch of stuff there. I need to make some new videos soon though. If you really like my stuff... subscribe.

You can also follow me on twitter https://twitter.com/elmnator I post a lot more there.

I have been on the cover of CPU magazine twice with my builds, Oct 2011 and March 2015, if you want to go look them up too.

I never signed an NDA, but I did ask if I could release any performance numbers and I was told not until the NDA is up. I can't say when the NDA is up either. That's part of an NDA.

Quote:

> Originally Posted by *Elmy*
> 
> Nope :-/
> 
> I can tell you it's 10 and 1/2 inches long.
> 
> For some reason, I don't know if it's me or what, but it seems to run smoother than 2 Fury X's. (I didn't say faster... just smoother.)
> 
> It looks like it has the same fan and radiator as the Fury X.
> 
> It takes three 8-pins to power it. But everyone knows that already.
> 
> It has the same connections as the Fury X: 3 full-size DisplayPort and one HDMI.
> 
> It comes in a big box and has what I assume is a dead Fiji chip as a souvenir. I am not selling this, JFYI.
> 
> I am not under an NDA; they never asked me not to post anything. But I know there is one, and I can't release any performance numbers until I see them released by someone else first.
> 
> I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour; temps never got above 55°C and I never had one hiccup at all.
> 
> This will be hands down the best VR card for the time being. (My opinion.)
> 
> AMD has had the fastest graphics card for the past 2 years with the 295X2, and this thing will most likely take its place as the new fastest graphics card in the world. It's kind of funny: when I am at LAN parties, people come up to me and ask what GPUs I have, because I have waterblocks on them and you can't really tell unless you are a graphics card aficionado. I tell them that it's a pair of the fastest GPUs in the world, then I ask them what that is, and they usually respond with the Titan X. I then have to educate them.
> 
> Anyways, I will have more information in the future.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> He didn't buy it, lol
> 
> Pretty sure he is keeping things under wraps until someone else posts about the card first....
> 
> He has already given a nice report on temps and how "smooth" the card runs, though he said something to the equivalent of "not saying it's faster than Fury X CF, just seems smoother"
> 
> I am looking forward to a review, but I gotta be honest..... At $1500 there's no way this card stands a chance in the market except for people wanting to build smaller form factor systems, OR people wanting 4x CF in a smaller mid-tower setup (like my s340)
> 
> On the other hand, if this thing were $1000-1200, then I could easily see it having a place in any high-end rig. Just a little too rich for my blood.


This....

NDA is lifting sometime in 2016 guaranteed.... LoL .... I will post back here with information then.

There will be plenty of people to buy this card at whatever retail price just like the 295X2. Sold plenty of them too.

Quote:


> Originally Posted by *Alastair*
> 
> I will buy that chip and pay for shipping. The decoration chip! Please! Senpai! I beg you!


Not for sale at this time... Sorry :-(

It has been amazing playing on this card. Not a single hiccup at all. TBH I would rather play on this card than my quad Fury X setup. I will hopefully have a YouTube video up comparing Fury X in CrossFire to this card on release day.


----------



## Butthurt Beluga

hey guys, I'm looking to get an R9 Fury X to replace my R9 290X.
I know the new Polaris GPUs are supposedly coming out in June, but I'm not looking to upgrade again until the 'big' Polaris GPU (or Vega) arrives later this year/early next year.
Just had a few questions..

So should I do a complete driver wipe and reinstall even though I have a 290X?

Any preferred manufacturer or ones to avoid?

I was thinking about getting a Fury X second hand, but I'm afraid I'll get a low-overclocking chip or perhaps one of the earlier models that had issues with coil whine. At what price point does it make sense to get a new one over a used one?
Right now (at least from what I can see) there's a new Fury X on Newegg for as low as $599, and a couple of second-hand ones that might be at most $80 USD cheaper than a new one.

Any feedback is appreciated, thanks guys


----------



## toncij

Quote:


> Originally Posted by *Elmy*
> 
> It has been amazing playing on this card. Not a single hiccup at all. TBH I would rather play on this card than my quad Fury X setup. I will hopefully have a YouTube video up comparing Fury X in CrossFire to this card on release day.


Oh I see. Like your beautiful "White Lightning".

Well, I love hearing it's smooth. It's a great deal actually, especially since it takes a single slot, unlike two Furys. Regarding the method, I can't wait for AMD to put two GPUs on one interposer and give them access to the same frame buffer. That will be awesome.

It is the fastest card, but in DX11 it is not the fastest GPU. In DX12 there is a chance it could be, since they seem to perform seriously well there.

Anyway, nice to hear this. AMD is doing well lately. Hopefully that will continue since I somehow love that company.


----------



## Visa Declined

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Any preferred manufacturer or ones to avoid?


All of the Fury X cards are manufactured by AMD, so when choosing which one to buy, look at the warranties to see which one suits your needs. The revision 1 cards had a noisy pump, so keep that in mind if you are buying one used. As far as the coil whine goes, I personally wouldn't worry about it; these cards aren't anything like the Nanos.


----------



## Flamingo

Quote:


> Originally Posted by *toncij*
> 
> Well, I doubt any Polaris chip released now will match Fury line, including Nano (which is actually a Fury X with thermal limits). Watercool a Nano and you're set until Vega next year.


http://videocardz.com/59206/amds-official-gpu-roadmap-for-2016-2018

Well, then there is this graph released recently...

I'm more and more inclined to sell off my Nano now.

P10 (490 or 490X) will be lower wattage, lower power consumption, and maybe more RAM? Only size would be a concern for my SG13 case (I prefer Nano-sized cards).


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *Flamingo*
> 
> http://videocardz.com/59206/amds-official-gpu-roadmap-for-2016-2018
> 
> Well then there is this graph released recently...


I'm still not sold on Polaris 10 beating Fiji except in performance / watt. That is, unless it turns out to clock much better than the "leaks" thus far. If we get 1250-1300 MHz factory OC cards with more headroom to spare on the AMD side it's a whole different forecast.


----------



## dagget3450

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> I'm still not sold on Polaris 10 beating Fiji except in performance / watt. That is, unless it turns out to clock much better than the "leaks" thus far. If we get 1250-1300 MHz factory OC cards with more headroom to spare on the AMD side it's a whole different forecast.


I agree. I suspect that if anything beats Fiji performance-wise, it won't be released until late this year or even next year. Given the late launch of the Pro Duo and all, I can't see them dumping Fiji that fast. However, I also try to read into what Nvidia is doing; they might be poised to replace the 980 Ti with something better. That also has me thinking AMD is aiming for mid-to-low-end replacements first and the higher end later. I wonder if AMD is banking on DX12 gains to help them along on performance during this phase as a way to compete with the "GTX 1080". I don't see this working out well for them, really, so I'm hoping my speculation is wrong.


----------



## MrKoala

Quote:


> Originally Posted by *Elmy*
> 
> I played Black Ops 3 last night for about 2 hours straight and Rainbow Six Siege for about an hour and temps never got above 55c and never had one hickup at all.


How is the VRM temperature this time? On 295X2 core temperature was fine but the VRMs were cooked hard.


----------



## dagget3450

Quote:


> Originally Posted by *MrKoala*
> 
> How is the VRM temperature this time? On 295X2 core temperature was fine but the VRMs were cooked hard.


That only mattered due to the VRMs throttling the card. Fiji has a whole new batch of issues, and VRM temps don't seem to be one of them.


----------



## toncij

This year, unfortunately, I don't have access to engineering samples from either brand, but according to my calculations the highest Pascal this year (GP104) will match the 980 Ti/Titan X. I don't think any Pascal will match Fury/Fury X, so I don't hold my hopes up.


----------



## gupsterg

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> I'm still not sold on Polaris 10 beating Fiji except in performance / watt.


I'm thinking the same, but if Polaris 10 did beat Fiji, I'd be pleasantly surprised.

I've been monitoring how many SKUs of Fiji cards etailers are stocking in the UK (plus stock levels where shown) and they are disappearing. As to if Fiji will remain available, I don't know, but some etailers are showing discontinued for an out of stock SKU.

I've currently not seen any huge price drops; the best prices have been about £350 for a Nano, £360 for a Fury and £410 for a Fury X, and those were just promos over the course of the last 6 months (HUKD search).

As I didn't lose any £ selling my Hawaii cards and upgrading to Fiji was shockingly cheap for me, I'm in a good position IMO, so I'm perhaps not too fussed what occurs with Polaris. Now, Vega is a different story…


----------



## SuperZan

I'm in no rush to move from Fiji to Polaris myself. Vega will have the real upgrades that I'm keen on seeing (HBM2, refined architecture, etc.).

I will pick up Polaris 10 for my LANbox in a heartbeat if its best SKU delivers 390X-or-better performance in a more energy- and thermal-friendly package.


----------



## gupsterg

I've become so hooked on Fiji now (mainly due to BIOS modding) that I went and ordered my 4th Fiji card yesterday (a Fury X).


----------



## SuperZan

Niice!


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> I've become so hooked on Fiji now (mainly due to BIOS modding) that I went and ordered my 4th Fiji card yesterday (a Fury X).


Isn't Fiji a bit shy on overclocking headroom? What are the realistic results for an average sample? Any info or links (3DMark possibly)? Not including hardware voltage mods, only BIOS ones. Thanks.


----------



## gupsterg

I do no hard mods (I don't have the capability); all cards are on stock coolers/TIM, etc.

I bought a Fury Tri-X a) to experience the unlock and b) to experience an air cooler on it. I gained 3840 SPs and liked the air cooler a lot, 3DM FS compare. The Fury 3840SP vs Hawaii (genuine 290X 1100/[email protected] VID) 3DM FS compare; pretty happy with the results.

I also had a Fury X at that time; it was good to compare against the SP unlock on the Fury, 3DM FS compare. That card had a stock VID of 1.250V. Neither the Fury Tri-X nor that Fury X would OC past 1090/525; even with up to +50mV added to the stock VID I'd get artifacts.

Next I got another Fury X; this one had a VID of 1.212V. With a VID of 1.243V in the ROM I get 1135/535, *but* the most optimal OC IMO is 1115/535 using the stock 1.212V VID, 3DM compare. Between 1115/535 and 1135/535 there is only a 1.79% GPU clock difference, which as you can see translates to a similar level of scaling depending on the test, 3DM compare.

Technically the 1115/535 is superior, as it doesn't require the +31mV (~2.6%) VID increase and the performance difference vs 1135/535 is so small. Note that in all cases I mean VID, not the VDDC seen in monitoring apps; depending on how an app loads the card, VDDC will differ even when the GPU is in the same DPM/clocks, so I always state the VID as set for DPM 7. The 1115/535 OC beats my heavily modded Hawaii OC ROM for 24/7 use across the board (exc. physics), 3DM compare.
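For anyone who wants to sanity-check those percentages, here's a quick sketch; the clock and VID figures are taken from the post above, and the variable names are mine:

```python
# Relative differences between the two Fury X OC profiles discussed above:
# 1115 MHz @ 1.212 V (stock VID) vs 1135 MHz @ 1.243 V (+31 mV).
base_clock, oc_clock = 1115, 1135        # MHz, GPU core (DPM 7)
base_vid, oc_vid = 1.212, 1.243          # V, DPM 7 VID

clock_gain = (oc_clock - base_clock) / base_clock * 100
vid_cost = (oc_vid - base_vid) / base_vid * 100

print(f"Clock gain: {clock_gain:.2f}%")  # ~1.79%
print(f"VID cost:   {vid_cost:.2f}%")    # ~2.56%
```

A ~1.79% clock gain for a ~2.56% voltage increase is exactly the trade-off being weighed.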

I owned a Tri-X 290 STD that max-tested at the time at 1100/1475 with +25mV; next, an Asus 290X DCUII STD that struggled to get past 1070/1340 even with up to +75mV. At the time of those cards, a) BIOS modding wasn't possible due to drivers checking the ROM for "signing" and b) I didn't know about it anyway. Next came a Vapor-X 290X at 1100/1525 @ 1.3V (approx. +56mV above the EVV VID in the stock ROM), then a Tri-X 290 OC at 1140/1495 @ 1.256V (1.243V EVV VID in the stock ROM). I never did much more testing on the Tri-X 290 OC as I dedicated my time to Fiji; the reason I didn't pitch it against Fiji was that its 3DM FS results were lower than the VX290X's.

For me, only the 290s seemed like good clockers, the 290Xs not so much. Comparing my Fiji cards to the Hawaii ones, I don't think they clock badly. I'd say my samples of both Hawaii and Fiji are average, and the voltage increases are very sane and usable daily in both generations IMO, so I'd call it a balanced comparison.

Where the Fiji cards have really shone over the Hawaii cards I owned is the coolers: much quieter than the Hawaii aftermarket coolers, and easily able to sustain the OCs I've done with very little perceptible noise increase. The Vapor-X 290X cost me £225 (new, Q2 '15); if I wanted similar quietness at that OC versus the Fury/Fury X, I reckon I'd need to spend at least another £50 (2nd hand) to £100 for, say, a Corsair HG10 or Kraken G10 with an AIO. No idea if they'd be as quiet, and I don't believe such a setup could sustain an OC close enough to Fury/Fury X performance to warrant buying or keeping it.


----------



## n64ADL

Does the R9 Nano have that much coil whine? I've heard a lot of complaints about it on Newegg. I'll most likely get the R9 Nano and just liquid-cool it.


----------



## Pintek

My first defective Nano was like listening to crickets and locusts on a loudspeaker. After the RMA, the new card is about as loud as hummingbirds from 75ft away.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> I do no hard mods (I don't have the capability); all cards are on stock coolers/TIM, etc.
> 
> I bought Fury Tri-X a) to experience unlock b) to experience air cooler on it, I gained 3840SP and liked the air cooler a lot, 3DM FS compare. The Fury 3840SP vs Hawaii (genuine 290X 1100/[email protected] VID) 3DM FS compare, so pretty happy with results.
> 
> I also had a Fury X at that time; it was good to compare against the SP unlock on the Fury, 3DM FS compare. That card had a stock VID of 1.250V. Neither the Fury Tri-X nor that Fury X would OC past 1090/525; even with up to +50mV added to the stock VID I'd get artifacts.
> 
> Next I got another Fury X, this had VID of 1.212V, with VID of 1.243V in ROM I get 1135/535 *but* the most optimal OC IMO is 1115/535 using stock VID 1.212V, 3DM compare. Between 1115/535 vs 1135/535 is only a 1.79% GPU clock difference, you can see translates to similar level of scaling depending on test, 3DM compare.
> 
> Technically the 1115/535 is superior as it doesn't require +31mV (~2.6%) VID increase plus so small a performance difference vs 1135/535. Note all cases I speak of VID not VDDC as seen in monitoring apps. Depending on app loading card VDDC will differ, even when GPU is in same DPM/clocks so I always state VID as set for DPM 7. The 1115/535 OC beats my highly modded Hawaii OC ROM for 24/7 use across the board (exc. physics), 3DM compare.
> 
> I owned a Tri-X 290 STD max tested at the time 1100/1475 with +25mV, next an Asus 290X DCUII STD struggled to get past 1070/1340 with even upto +75mV. At the time of those cards a) bios mod not possible due to drivers checking ROM for "signing" b) didn't know about it. Next Vapor-X 290X 1100/1525 @ 1.3V (approx. +56mV above EVV VID in stock ROM), then Tri-X 290 OC 1140/1495 @1.256V (1.243V EVV VID in stock ROM). The Tri-X 290 OC I never did much more testing as dedicated time to Fiji, why I didn't pitch this against Fiji was the results in 3DM FS were lower than VX290X.
> 
> Now for me only the 290s seemed like good clockers, the 290Xs not so. Comparing my Fiji cards to the Hawaii I don't think they are clocking badly. I'd say my samples for both Hawaii and Fiji are average, the voltage increases are very sane and usable daily in both gens IMO. So I'd think a balanced comparative?
> 
> Where the Fiji cards have really shone above the Hawaii cards I owned is the coolers, much quieter than Hawaii aftermarket, easily able to sustain the OC I've done with very little perceptible noise increase. The Vapor-X 290X I paid £225 (new Q2 15), if I wanted similar quietness for the OC vs Fury/Fury X I reckon I'd need to spend at least another £50 (2nd hand) to £100 for say a Corsair HG10 or Kraken G10 with AIO. No idea if they'd be as quiet, I also don't believe on such a setup I'd be able to sustain an OC which would match closely Fury/Fury X performance to not warrant buying/keeping.


These 82/70 results in GT1 and GT2 on the Fury X are at the level of a Titan X overclocked by 15%. Very, very nice. Impressive. Is that achieved with only a software/BIOS-modded Fury X at 1135MHz?


----------



## Alastair

I have not done much overclocking yet, but I have reached 1150/545 fairly easily. I still need to go the route of a custom BIOS; I just haven't gotten to it yet.


----------



## gupsterg

@toncij

Cheers.

Those results are without the tess. tweak; with the tweak, near 88/77. I can do 1135/565 via MSI AB or ROM. The reason I choose the ROM: when I use a SW OC, the lower-DPM GPU clocks/VIDs are dynamically increased (I see this in register dumps). I don't believe MSI AB etc. is doing this, but rather the driver, in conjunction with the ROM's default PowerPlay being EVV (Electronic Variable Voltage). When I manually set the VID per DPM in the ROM, the lower-DPM GPU clocks/VIDs are unaffected by an "on the fly" SW OC (to, say, test a new OC); only DPM 7 is OC'd.
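For readers unfamiliar with DPM tables, here's a toy model of the point being made; the clock/VID values below are illustrative only, not dumped from a real Fiji ROM:

```python
# Toy model of a Fiji PowerPlay DPM table: 8 states, each (GPU clock MHz, VID V).
# Values are illustrative, not from a real ROM dump.
dpm_table = [
    (300, 0.900), (508, 0.950), (626, 0.987), (727, 1.025),
    (806, 1.050), (900, 1.100), (1000, 1.150), (1050, 1.212),
]

def rom_oc(table, new_clock, new_vid):
    """ROM-style OC with manually fixed VIDs: only the top state (DPM 7) changes."""
    table = list(table)
    table[7] = (new_clock, new_vid)
    return table

oc_table = rom_oc(dpm_table, 1115, 1.212)
assert oc_table[:7] == dpm_table[:7]     # lower states untouched
print(oc_table[7])                       # (1115, 1.212)
```

With EVV plus a software OC, by contrast, the lower states may also be raised by the driver, which is the behaviour described above.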

The other reason for choosing a ROM OC over a SW OC is the SMC on Fiji. Apps have become better programmed to "message" it via the driver for adjustment/telemetry, but I don't take the chance of an issue occurring on an OC because of this.

My 4th card has been dispatched. As I'm a bit tight I paid for the cheapest delivery, so it's up to a 3-day wait until it's in my hands. Meanwhile I may see if I can OC my current best Fury X further. I have done some testing at 1175MHz with minimal stability testing, and my stability requirement is high: for example, my VX290X saw numerous F@H runs of up to 70hrs continuous. I've been lighter on the Fury/Fury X, with numerous F@H runs of up to 18hrs, and the OCs must pass without a single GPU bad state. I also do up to 3hrs of looped 3DM FS, then multiple gaming tests.

In the 3D Fanboy competition: @Fyzzz's 290 vs @Vellinious's 290X vs my Fury X, 3DM compare. Note the clocks of their GPUs: firstly, those are very good clockers; secondly, there is no way they get those clocks/performance at stock VID (which I'm using), so their voltages are going to be very high. I know Fyzzz runs a very sane ~1100MHz for daily use, so the compare of my 290X vs Fury X would be similar to his daily clocks, and there the Fury X beats the 290X by a large margin; bear in mind my 290X ROM had a lot of mods.



Spoiler: My VX290X ROM mods



- Stock GPU/RAM 1030/1325 boosted to 1100/1525.
- GPU Freq. DPMs 2 to 7 boosted per % as in OP heading Making OC bios like factory pre OC'd card/ROM.
- VRM Controller reprogrammed to remove +31.25mV VDDC offset and - 6.25mV VDDC/VDDCI offset.
- 3 States VDDCI ([email protected] [email protected] [email protected]).
- GPU Clock 2 matched to DPM 2 GPU Freq.
- Mem Freq. DPM 1 & 2 @ 1250MHz, DPM 3 to 7 1525MHz.
- All DPMs manually voltage fixed and tested.
- Efficiency @ idle table matched to DPM0.
- 390/X Memory Controller Timings.
- Timings improved for straps 1250MHz & 1375MHz to Stilt's AFR.
- Timings improved for straps 1500MHz & 1625MHz to Stock 1250MHz AFR.
- Standard fan mode Profile improved for clocks/voltage being run.
- PowerLimit mod (stock: 214W/206A/214W; mod: 238W/229A/238W).



In comparison, I'd say for Fiji I'm doing very little ROM modding and it still delivers very good performance.

@Alastair

As you qualified for the "service" with your sub to the 3D Fanboy compo, you only need to reply to the PM where I requested the stock VID per DPM and then we move along.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> [...]
> 
> @Alastair
> 
> As you qualified for the "service" with your sub to the 3D Fanboy compo, you only need to reply to the PM where I requested the stock VID per DPM and then we move along.


Yeah I know. I just haven't gotten to it yet. I might do it this weekend.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> @toncij
> 
> Cheers.


This is you running the Fire Strike 1080p test at default settings, right? I'd like to test this against my current Titan X machine. It would be very interesting to see the results.


----------



## gupsterg

@toncij

Yep, 3DM FS 1080P defaults.

Yep, I'd be interested in your results as well.

@Alastair

Cool, and no worries.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> @toncij
> 
> Yep, 3DM FS 1080P defaults.
> 
> Yep, I'd be interested in your results as well.
> 
> @Alastair
> 
> Cool, and no worries.


OK, so joshy is obviously on LN2 too. Look at his HBM clock!

http://www.3dmark.com/compare/fs/8259868/fs/6009556/fs/8008822#
http://www.3dmark.com/compare/fs/8259868/fs/8008822#

I'm the left one...

I'm actually pleasantly surprised at how well the Fury X does. That's only 4-11% faster in the graphics tests for a Titan X running at a boost of 1467.

I'll have to get one and see for myself how it goes. Which brands have unlocked firmware for this? (My understanding is that, although all are made equal, some come at 1050MHz, some at 1000MHz, some unlocked, some not...?)


----------



## Alastair

Quote:


> Originally Posted by *toncij*
> 
> I'll have to get one and see for myself how it goes. Which brands have unlocked firmware for this? (My understanding is that, although all are made equal, some come at 1050MHz, some at 1000MHz, some unlocked, some not...?)

Sapphire R9 Fury Tri-X: 3584 cores, 224 TMUs, 64 ROPs, 1000MHz core clock. Same with the Gigabyte G1 Fury, XFX Fury and PowerColor Fury. The Sapphire R9 Fury Tri-X OC and Nitro Edition are both OC'ed to 1040MHz, I think. Some of these cut-down Furys can be unlocked to 3840 shaders and 240 TMUs, closing the performance delta to a Fury X to less than 5%. Some even stand a chance of unlocking to a full Fury X, but I wouldn't bank on it.

The Fury X is 4096 shaders, 256 TMUs, 64 ROPs at 1050MHz.
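As a rough back-of-the-envelope check on those deltas, here's theoretical shader throughput (shader count x core clock) for the SKUs listed above. This deliberately ignores ROPs, memory bandwidth and real-game scaling, so actual gaps are smaller than these figures:

```python
# Theoretical shader throughput (SP count x core clock in MHz) for Fiji SKUs.
fury          = 3584 * 1000   # stock Fury
fury_unlocked = 3840 * 1000   # partially unlocked Fury
fury_x        = 4096 * 1050   # Fury X

print(f"Stock Fury deficit vs Fury X:    {(1 - fury / fury_x) * 100:.1f}%")           # ~16.7%
print(f"Unlocked Fury deficit vs Fury X: {(1 - fury_unlocked / fury_x) * 100:.1f}%")  # ~10.7%
```

Games rarely scale linearly with shader count, which is how an unlocked Fury can land within the ~5% real-world delta mentioned above.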

All cards are dual-BIOS except for the Asus Strix Fury; avoid that one IMO if you want to experiment.


----------



## toncij

I'm looking to go for a Nano and water-block it, or a Fury X directly. I guess I don't need any hardware modding and can use Afterburner to overclock either?


----------



## djsatane

So, my ASUS Fury X is working great. Initially it was very quiet; I didn't hear any whine or buzzing at all. However, after about 3 days of use I hear a small buzzing noise or whine that sounds like it's coming from the card, so most likely the pump. It's not loud, but it did not occur initially; now I can hear it just a bit at idle, and under load it's just a tad louder. It's not very noticeable, and the system is still way quieter than with my 290X before, but I'm wondering what other Fury X owners' experiences are. I have read that some had pump whine/noise initially and then it went away with usage...

I checked this article: http://wccftech.com/amd-fury-x-pump-silent-solves-noise-issue/. Based on that, my card seems to have the revised pump.


----------



## Alastair

Quote:


> Originally Posted by *djsatane*
> 
> So, my ASUS Fury X is working great. Initially it was very quiet; I didn't hear any whine or buzzing at all. However, after about 3 days of use I hear a small buzzing noise or whine that sounds like it's coming from the card, so most likely the pump. It's not loud, but it did not occur initially; now I can hear it just a bit at idle, and under load it's just a tad louder. It's not very noticeable, and the system is still way quieter than with my 290X before, but I'm wondering what other Fury X owners' experiences are. I have read that some had pump whine/noise initially and then it went away with usage...
> 
> I checked this article: http://wccftech.com/amd-fury-x-pump-silent-solves-noise-issue/. Based on that, my card seems to have the revised pump.


Just shake your card around. Maybe an air bubble got trapped in there.


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> Just shake your card around. Maybe an air bubble got trapped in there.


I was thinking something similar; maybe try moving the radiator above the GPU while it's running?


----------



## gupsterg

@toncij

Yep, that is an LN2 result. I concur the Fiji GPUs are superb "bang for $" compared with high-end nVidia.

Fury Xs are all 1050MHz; all are made by AMD and provided to the AIBs to brand with stickers/boxes as they wish. The only things that would make me pick one brand over another are mainly price and secondly warranty; all are dual-BIOS with the same AIO, etc.

The Nano has the full Fiji GPU, but the VRM is cut back and there's a single 8-pin PCI-E power plug. AFAIK all are made by AMD and the AIBs just brand them as their own; all are the reference AMD Nano PCB as well.



Spoiler: Ref PCB Fury X









Spoiler: Ref PCB Nano







WC'ing the Nano is great from what I've read, but I'd think the VRM/PCI-E plug may hold it back a little compared with the Fury X; then again, it also depends on the roll of the silicon-lottery dice. Considering what I've paid for all the Fury Xs I've bought, I wouldn't consider a Nano at all, and that's accounting for the £40-£50 difference between the two, which gets me the plus of the PCB and AIO.



Spoiler: Some HWiNFO stats on 3DM FS GT1









Spoiler: 3DM FS GT1 3hr run









Spoiler: F@H ~18hr run







I reckon 1150/535 is within my grasp/comfort zone for VID, 3DM compare.


----------



## Radox-0

Quote:


> Originally Posted by *toncij*
> 
> I'm looking to go for a Nano and water block it or a FuryX directly. I guess I don't need any hardware modding and can use Afterburner to overclock any?


Nano here, under water. For its size it does great: fire up Afterburner and OC away. As mentioned prior, water does the Nano wonders. Generally both my samples, including my current one, manage 1100/545MHz fine for daily gaming clocks, no issue, while running very cool and fully stable. It can go to 1125, but a few titles crash. The card itself can actually hit 1150MHz, but you need extra mV; as mentioned elsewhere in the thread, overall performance sometimes reduces with high mV, which happens on both my cards.

Having said that, Fury Xs do generally appear to reach a higher stable overclock, so if space is not an issue that may be an option.


----------



## djsatane

Quote:


> Originally Posted by *dagget3450*
> 
> I was thinking something similar; maybe try moving the radiator above the GPU while it's running?


I have tried both moving the card and shaking it a bit (system off, obviously) and moving the radiator, but it made no difference. I think what I'm hearing is very faint pump-operation noise, a small regular buzz; I just hope it won't get any worse over time.


----------



## buildzoid

Well, I've hit my limit for 3-way CrossFire in Fire Strike: 33970 points with HWBOT rules.

Needed 2 PSUs because the EVGA G2 1600W couldn't keep up; I put my worst card on my Cooler Master V1000. Pretty sure I hit a CPU bottleneck, because I was seeing basically no gain from raising GPU clocks.

CrossFire supports running cards without matched clocks; however, if your HBM clocks differ between cards it sends performance to hell. Different core clocks work just fine.

Also, I suspect I'm really inefficient, but there aren't enough 3-way Fury X scores for me to know.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Well I've hit my limit for 3 way crossfire in Firestrike. 33970 points with HWbot rules
> 
> Needed 2 PSUs because the EVGA G2 1600W couldn't keep up. Put my worst card onto my CoolerMaster V1000. Pretty sure I hit a CPU bottleneck because I was seeing basically no gain from raising GPU clocks.
> 
> Crossfire supports running cards without matched clocks however if your HBM clocks are different between it sends performance to hell. Different core clocks work just fine.
> 
> Also I have a suspicion that I'm really inefficient but there aren't enough scores with 3 way Fury Xs for me to know.


If it helps, I have a score you can compare against. It's 3-way Fury and, if I recall, stock voltage on the GPUs; the CPU was on air. Looks close, I think, on platform and settings used during the run?

http://www.3dmark.com/3dm/11493585

I think this is the compare link:
http://www.3dmark.com/compare/fs/8250938/fs/8084609


----------



## Powerslave85

Sapphire Radeon Fury X 4GB HBM, rev. 2 pump (engraved, no sticker). Love her. Like an old Porsche 911: loud as hell and tries to kill you. Thinking about 2-way CrossFire when the price drops below $300.

Windows 10 Pro N
Asus Maximus VIII Hero
Intel i5 6600K @ 4x 4.6GHz OC
Noctua NH-D15 Cooler
4x G.Skill Ripjaws IV @ 8GB and 2400 MHz DDR4
Sapphire Fury X 4G HBM with latest UEFI Video BIOS and OC VMod BIOS as alternative BIOS
Blue GPU Tach
750W CoolerMaster PSU
Some CoolerMaster Centurion Case from Office Junkyard

Runs like hell and keeps me warm. Built it for HTC Vive Gaming


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> If it helps i have a score you can compare, It's 3 way fury and if i recall stock voltage on gpu's. cpu was on air, looks close i think on platform and settings used during the run?
> 
> http://www.3dmark.com/3dm/11493585
> 
> i think this is the compare list
> http://www.3dmark.com/compare/fs/8250938/fs/8084609


I just don't get it. Why is my GPU score so atrociously low? Do I need to go and "bin" drivers or something? When I was testing the cards separately, every single one managed a GPU score greater than 20K, and that was with a 4790K at stock clocks. So I'm inclined to think the drivers are fine.


----------



## gupsterg

I'm sticking to Crimson drivers; so far v16.3.2 WHQL has been best for me, and v16.4.1 benches slightly less consistently. My OS is configured for daily use with no tweaks to aid benching, as I'm after something that just performs well day to day in gaming/F@H.


----------



## buildzoid

Quote:


> Originally Posted by *gupsterg*
> 
> I'm sticking to Crimson drivers; so far v16.3.2 WHQL has been best for me, and v16.4.1 benches slightly less consistently. My OS is configured for daily use with no tweaks to aid benching, as I'm after something that just performs well day to day in gaming/F@H.


Well, I need to use 16.3.2 for Futuremark to acknowledge my results, so that's not really an issue. I guess I'm going to run some tests to see whether it might be a clock/voltage problem, though it shouldn't be, since I basically copy-pasted my best single-GPU clocks across all three cards.


----------



## gupsterg

Do you have links to the single-card 3DM benches? I just wanna compare with 1135/535. I'm hoping to get 1150/535 stable with about 1.262V VID on my current Fury X, but surprisingly my MSI Fury X arrived today, so I may start testing that.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> I just don't get it. Why is my GPU score so atrociously low? Do I need to go and "bin" drivers or something? When I was testing the cards separately, every single one managed a GPU score greater than 20K, and that was with a 4790K at stock clocks. So I'm inclined to think the drivers are fine.


Win7 did way better for me than Win10 on CF results. The second part for me was noticing that overclocking VRAM and adding voltage had very odd effects on the GPU tests in FS. For example, GPU test 2 would score less when I added voltage or ran too-high VRAM clocks, if I recall correctly. I do know that my results were only consistent when I did not add voltage on the GPUs. I also think this changed when using FSE or FSU.


----------



## gupsterg

So what do you guys make of this?


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> So what do you guys make of this?


Polaris chips will replace the 3xx series; Fury stays where it is.
Pascal will replace everything, with the 1080 at 980 Ti level.

Both companies come out even, except for a slight lead in DX11 for Nvidia and in DX12 for AMD.


----------



## Cool Mike

I am surprised. No Pro Duo reviews yet?


----------



## toncij

Quote:


> Originally Posted by *Cool Mike*
> 
> I am surprised. No Pro Duo reviews yet?


26th of April is the release date. Even those that got media bundles can't publish yet.


----------



## gupsterg

Quote:


> Originally Posted by *toncij*
> 
> Fury stays where it is


Now, this is not what I'm seeing when I view stocked SKUs.

ebuyer now has only 1 SKU of Fury (i.e. Fiji Pro) and 1 SKU of Nano; out of 8 possible Fury X SKUs they have 4.

OCuk: 1 SKU of Fury (i.e. Fiji Pro) in stock; the XFX has been on pre-order status for coming up on 2 weeks IIRC; 1 Nano SKU in stock, 5 Fury X SKUs.

The etailer my card arrived from today = discontinued; they have 1 SKU of Fury, 2 SKUs of Nano and 3 of Fury X.

Anyone reckon Fiji is gonna get a rebrand? Is anyone watching the SKUs stocked in the US?


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> Now this when I view stocked SKUs I'm not seeing.
> 
> ebuyer now have only 1 sku of Fury (ie Fiji Pro) and 1 sku of Nano, out of 8 possible Fury X skus they have 4.
> 
> OCuk 1 sku of Fury (ie Fiji Pro) instock, the XFX has been pre-order status coming up 2 weeks IIRC, Nano 1 sku in stock, 5 Fury X skus.
> 
> What I received today = discontinued, they have 1 sku Fury, 2 skus Nano and 3 Fury X.
> 
> Anyone reckon Fiji gonna get a rebrand? is anyone watching skus stocked in US?


Hmm, very strange. Why would AMD give up on an HBM model? Also, there are only 3 relevant models: Fury, Nano and Fury X. I mean, there are no alternatives to the Nano when it comes to small cards; without HBM they simply can't make a card that small, and HBM won't be back until HBM2 on Vega.

Also, Polaris won't be available before late summer, the "back to school" period, so why would they cut supply now? Fiji yields are great too (according to my source), so that's not the problem either.


----------



## gupsterg

Yes, it is strange what I'm seeing; I will definitely be keeping an eye on SKUs.

As to why the SKUs are dwindling? No idea; perhaps they're making a loss on the cards?

I agree with your post.

The Fury Tri-X has gone from the etailers I linked; even Amazon.co.uk doesn't have it, and ebuyer states discontinued. Another etailer has not one Nano at their location, only 1 Fury, and no Fury X.


----------



## fat4l

Pro Duo EK waterblock incoming ...


----------



## p4inkill3r

Quote:


> Originally Posted by *fat4l*
> 
> Pro Duo EK waterblock incoming ...


----------



## Elmy

Quote:


> Originally Posted by *fat4l*
> 
> Pro Duo EK waterblock incoming ...


Mine! .....

Getting 2 of these....


----------



## hyp36rmax

Quote:


> Originally Posted by *fat4l*
> 
> Pro Duo EK waterblock incoming ...


Nice!


----------



## Spock121

Quote:


> Originally Posted by *fat4l*
> 
> Pro Duo EK waterblock incoming ...




I don't get it.


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> Win7 did way better for me than Win10 on CF results. The second part for me was noticing that overclocking VRAM and adding voltage had very odd effects on the GPU tests in FS. For example, GPU test 2 would score less when I added voltage or ran too-high VRAM clocks, if I recall correctly. I do know that my results were only consistent when I did not add voltage on the GPUs. I also think this changed when using FSE or FSU.


Well, my test 2 score is fine at 236FPS; it's test 1 where I'm doing badly. Oh well, I just gotta do more testing.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Well my test 2 score is fine at 236FPS. It's test 1 where I'm doing badly. Oh well I just gotta do more testing.


It's possible I have it backwards, as I didn't document it, but I know results were negative on one of those tests when I added voltage. My biggest gains came from not using Windows 10, followed by not adjusting voltages.

Edit: YMMV


----------



## Givenchy

My second Sapphire Fury X is making massive noise on boot-up now for some reason... Didn't they fix this issue a long way back? I RMA'd mine months ago as well. It's pretty annoying to hear the buzzing sound, whether at boot or when you play a game for a while and then get this annoying sound even with headphones on.

Ideas?

I'm honestly at the point of selling this card due to the massive sound it makes.


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *Givenchy*
> 
> My second Sapphire Fury X is making massive noise on boot-up now for some reason... Didn't they fix this issue a long way back? I RMA'd mine months ago as well. It's pretty annoying to hear the buzzing sound, whether at boot or when you play a game for a while and then get this annoying sound even with headphones on.
> 
> Ideas?
> 
> I'm honestly at the point of selling this card due to the massive sound it makes.


If you are willing to void your warranty, you can usually fix coil whine by surrounding the chokes on the PCB with some soft material, like hot glue, which prevents or reduces the vibrations. Some people play the RMA lottery until they get back a card with no whine at all. The manufacturers can't really "fix the issue" per se; I suppose they could desolder the components and put new ones on the same card, but I haven't heard of any companies doing that.


----------



## buildzoid

OK, so I did another run of 3-way, but in FSE. Used Windows 7 with 16.3.2. BIOS set to 1.3V core, 1.36V HBM, 600W TDP and 400A current limit.

Cards did 1180/600MHz. Got a score of 22422. I really need to work on my GPU efficiency. I'm thinking of trying a new set of BIOSes with less Vcore to see if that helps. I will also try 1.3V on card 1 and 1.2V on the other 2 cards. I might also end up binning drivers.

Windows 7 does get me much higher Physics scores, but the GPU scores are roughly in line with Win 10.


----------



## toncij

Quote:


> Originally Posted by *buildzoid*
> 
> OK, so I did another run of 3-way, but in FSE. Used Windows 7 with 16.3.2. BIOS set to 1.3V core, 1.36V HBM, 600W TDP and 400A current limit.
> 
> Cards did 1180/600MHz. Got a score of 22422. I really need to work on my GPU efficiency. I'm thinking of trying a new set of BIOSes with less Vcore to see if that helps. I will also try 1.3V on card 1 and 1.2V on the other 2 cards. I might also end up binning drivers.
> 
> Windows 7 does get me much higher Physics scores, but the GPU scores are roughly in line with Win 10.


"binning drivers"? You plan on testing multiple versions to find ideal? Isn't it that latest should perform the best in general unless a bug occurs?


----------



## buildzoid

Quote:


> Originally Posted by *toncij*
> 
> "binning drivers"? You plan on testing multiple versions to find ideal? Isn't it that latest should perform the best in general unless a bug occurs?


Pretty much. The thing is, the guy who is beating me is on the very first Crimson driver, whereas I can't get close on 16.3.2.


----------



## toncij

Quote:


> Originally Posted by *buildzoid*
> 
> Pretty much. The thing is, the guy who is beating me is on the very first Crimson driver, whereas I can't get close on 16.3.2.


It might be thermal throttling unless you fixed the clocks.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> Pretty much. The thing is, the guy who is beating me is on the very first Crimson driver, whereas I can't get close on 16.3.2.


It just occurred to me that I remember something else about Win7 vs Win10 in my testing. I am using an old Win7 Pro disc; it's Win7 Pro SP1. I did not do any updates, as I recall that when I did before, it caused my GPU scores to tank. I can't be sure, but I think an update somewhere was causing my Win7 performance to be as poor as my Win10. Just a thought, but right now I don't have Furys installed; I'm testing/blocking 390Xs and selling off my Fury Xs.
(need more VRAM for 4K and beyond)


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> It just occurred to me that I remember something else about Win7 vs Win10 in my testing. I am using an old Win7 Pro disc; it's Win7 Pro SP1. I did not do any updates, as I recall that when I did before, it caused my GPU scores to tank. I can't be sure, but I think an update somewhere was causing my Win7 performance to be as poor as my Win10. Just a thought, but right now I don't have Furys installed; I'm testing/blocking 390Xs and selling off my Fury Xs.
> (need more VRAM for 4K and beyond)


That can't be it, since I'm on SP1 with no updates. Do you know what your best 3-way was? I only get 27260 on the GPU tests, whereas the guy I'm aiming to beat gets 28337.


----------



## rubenlol2

I noticed that you get a lower score with the newer drivers; I think there was a change in power management in them.
Old drivers didn't throttle with FurMark, and I could pull almost 400W with a Fury, but the newer drivers throttle pretty heavily.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> That can't be it, since I'm on SP1 with no updates. Do you know what your best 3-way was? I only get 27260 on the GPU tests, whereas the guy I'm aiming to beat gets 28337.


I assume you mean with tess on? I don't think I did any runs with tess on.

Have you tried rerunning your benchmarks at least a few times at the same clocks to see if you get a variation in scores?

Edit: I still don't understand how people are getting identical scores with Win10/7 and I wasn't. Win10 was really bad for me, as in 10-20 FPS average lower in many benchmarks... I tried a lot of tweaks on Win10 and power management. I wish I could have pinpointed it, but I did try 2 Win10 installs and both suffered badly.


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> I assume you mean with tess on? I don't think I did any runs with tess on.
> 
> Have you tried rerunning your benchmarks at least a few times at the same clocks to see if you get a variation in scores?
> 
> Edit: I still don't understand how people are getting identical scores with Win10/7 and I wasn't. Win10 was really bad for me, as in 10-20 FPS average lower in many benchmarks... I tried a lot of tweaks on Win10 and power management. I wish I could have pinpointed it, but I did try 2 Win10 installs and both suffered badly.


No, without tess. As for Win10 vs 7: FPS on the GPU tests was the same for me, but Win7 scored better on Physics.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> No, without tess. As for Win10 vs 7: FPS on the GPU tests was the same for me, but Win7 scored better on Physics.


Ahh, I missed the part where you said FSE. I don't know if this is my best run, but I have it bookmarked. GPU score was 27856.
http://www.3dmark.com/3dm/11479863

I did an OS wipe not long ago and I think I lost all my screens and links.


----------



## buildzoid

Quote:


> Originally Posted by *dagget3450*
> 
> Ahh, I missed the part where you said FSE. I don't know if this is my best run, but I have it bookmarked. GPU score was 27856.
> http://www.3dmark.com/3dm/11479863
> 
> I did an OS wipe not long ago and I think I lost all my screens and links.


That's on stock volts?


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> That's on stock volts?


It should have been on stock volts; after a while I gave up on using voltage adjustments. Sadly I don't have any screenshots of the voltage, but I do have the screenshot of the desktop stuff.

If I did use any voltage it would have been really low, +16mV to +30mV at most.


----------



## djsatane

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> If you are willing to break your warranty, you can usually fix the coil whine by surrounding the chokes on the PCB with some soft material, like hot glue, which will prevent or reduce the vibrations. Some people play the RMA lottery until they get back a card with no whine at all. They can't really "fix the issue" per se. I suppose they could desolder the components and put new ones on the same card, but I haven't heard of any companies doing that.


My card has a steady low buzzing now as well, which seems to be coil whine. It does kind of make sense that it's like a vibration. The card's performance and temps are great, but this low buzzing coil whine... I'm not sure I can take it for long. In hot weather, when the fans/AC are running in my house, the noise may be masked by that, but when it's all quiet this is killing my brain.

I've read a few people talking about surrounding the chokes on the PCB with glue etc., but I wish there was more info on this, or at least examples. I am afraid of doing it, though. I already have my PC about 6 feet away from the monitor; if I had longer cables for DP/USB/sound I would try to move the PC even further away, but that's not an option atm. What I may do is get plexiglass and build a half box dividing the PC from me, with some sound cushioning, lol.

BTW, those Pro Duo boards: I wouldn't be surprised if they have coil whine/buzzing from board vibration as well, and maybe even more!


----------



## bluezone

This is the second part of the video I posted earlier. It's interesting what he has to say about (I'm Canadian, so "aboot") AMD and the new console refreshes possibly using dual GPUs.


----------



## Willius

Quote:


> Originally Posted by *bluezone*
> 
> This is the second part of the video I posted earlier. It's interesting what he has to say about (I'm Canadian, so "aboot") AMD and the new console refreshes possibly using dual GPUs.


The guy makes nice videos, and it makes a lot of sense. He could possibly be onto something.


----------



## toncij

Quote:


> Originally Posted by *Willius*
> 
> The guy makes nice videos, and it makes a lot of sense. He could possibly be onto something.


Well, we know that's a logical next step.
The point is, though, nothing prevents Nvidia from doing the same thing. The video is somehow stuck on the idea that Nvidia cannot or will not move to the very same way of manufacturing GPUs. It will. Nothing prevents Nvidia from manufacturing smaller chips on an interposer.
AMD is still in a bad position, since they don't have as much money as Nvidia to push their platform.
The war we'll be facing now is the software one: GameWorks vs OpenGPU, G-Sync vs FreeSync on the hardware side, and being first to market like with HBM and HBM2.

What brings money now, and will for some time, is DX11. The games the author mentioned, like Quantum Break, are no longer broken since the last Nvidia driver update (the 980 Ti and Fury X now have the same performance).


----------



## Cool Mike

I am locked and loaded.

Ready to purchase the Pro Duo. I hope Newegg has availability after midnight tonight.


----------



## n64ADL

Do you think the new AMD GPU they'll announce will most likely be a more mainstream R9 Fury or R9 Nano? Gosh, why do they have to wait until late summer to release the good GPUs? Wouldn't they get an advantage by announcing much sooner than E3?


----------



## bluezone

Quote:


> Originally Posted by *toncij*
> 
> Well, we know that's a logical next step.
> The point is, though, nothing prevents Nvidia from doing the same thing. The video is somehow stuck on the idea that Nvidia cannot or will not move to the very same way of manufacturing GPUs. It will. Nothing prevents Nvidia from manufacturing smaller chips on an interposer.
> AMD is still in a bad position, since they don't have as much money as Nvidia to push their platform.
> The war we'll be facing now is the software one: GameWorks vs OpenGPU, G-Sync vs FreeSync on the hardware side, and being first to market like with HBM and HBM2.
> 
> What brings money now, and will for some time, is DX11. The games the author mentioned, like Quantum Break, are no longer broken since the last Nvidia driver update (the 980 Ti and Fury X now have the same performance).


He does do videos for Nvidia and Intel as well, looking at each as a separate entity.

IMO, when it comes down to it, the question is when you pivot to DX12. Right now, at the cusp of 14-16 nanometer, what is the strong suit of each company? Nvidia's footing is firmly in both DX11 and DX12 (DX12 supporting DX11 features). From what I've read, Nvidia may be making a slower, incremental change of direction. For RTG (Radeon Technologies Group) the best move is forward into DX12. RTG hoped DX11 would be more of what DX12 is turning out to be, so they had already (maybe accidentally) started shifting with the HD 7900 series. That leaves consumers to choose between what meets their needs and what may meet their needs. But shiny stuff sells.
Quote:


> Originally Posted by *n64ADL*
> 
> Do you think the new AMD GPU they'll announce will most likely be a more mainstream R9 Fury or R9 Nano? Gosh, why do they have to wait until late summer to release the good GPUs? Wouldn't they get an advantage by announcing much sooner than E3?


Very likely the Fury line replacement (upgrade) will be "Vega", and it looks like 2017.


----------



## toncij

I think Nvidia, at the moment, has a higher DX12 feature level: 12_1 instead of 12_0 like Fiji. That might change with Polaris.

My comment was more along the lines of: Nvidia can compensate and redesign in the worst-case scenario much more easily than AMD ever could, simply a question of cash to burn, which Nvidia has a lot of. Async shaders are not something Nvidia can't make. They might even be able to do it atm, since CUDA, unlike DX12, supports the thing.


----------



## bluezone

Quote:


> Originally Posted by *toncij*
> 
> I think Nvidia, at the moment, has a higher DX12 feature level: 12_1 instead of 12_0 like Fiji. That might change with Polaris.
> 
> My comment was more along the lines of: Nvidia can compensate and redesign in the worst-case scenario much more easily than AMD ever could, simply a question of cash to burn, which Nvidia has a lot of. Async shaders are not something Nvidia can't make. They might even be able to do it atm, since CUDA, unlike DX12, supports the thing.


Yes, when you get down to it, we are talking about compute (CUDA is compute) right now, and how and whether you feed the beast. Compute is extra work to be done if you have "idle" assets on the GPU. Nvidia's architecture is very efficient right now and has very high utilization of fewer shader components. RTG's R9 Fury series dies are huge, with huge and different resources (floating-point performance on Fury clobbers the 980 Ti). Huge things tend to be less efficient. So ACEs make a lot of sense right now for RTG, not so much for Nvidia.

Both companies bring different things to the table. Both are taking notes from the other.

You have very good points.









Side note: ExtremeTech quote -

"The issue has been further confused by claims that Maxwell is the only GPU on the market to support "full" DirectX 12. While it's true that Maxwell is the only GPU that supports DirectX 12_1, AMD is the only company offering full Tier 3 resource binding and asynchronous shaders for simultaneous graphics and compute. That doesn't mean AMD or Nvidia is lying - it means that certain features and capabilities of various cards are imperfectly captured by feature levels and that calling one GPU or another "full" DX12 misses this distinction. Intel, for example, offers ROV at the 11_1 feature level - something neither AMD nor Nvidia can match."

REP +1


----------



## djsatane

The Pro Duo is like the 6990 in the sense that it's internal CrossFire, right? I'm just curious: since one Fury X chip's HBM limit was 4GB, if someone has a Pro Duo and disables CrossFire so only one GPU is used, does that mean only 4GB of HBM is available? That would be different from the 6990 as far as onboard VRAM usage.


----------



## bluezone

Crimson 16.4.2 announcement right here.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.2.aspx


----------



## wesbluemarine

Quote:


> Originally Posted by *bluezone*
> 
> Crimson 16.4.2 announcement right here.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.4.2.aspx


I think FRTC is not working with the Crimson drivers and Windows Store games, because my GPU runs warmer than when I use Catalyst 15.11.


----------



## pdasterly

Where can I buy a Radeon Pro Duo?


----------



## Butthurt Beluga

I just got my PowerColor Radeon R9 Fury X today.
So stoked, even if I am late to the party by quite some time.





I'll try not to blow up the thread with my potato cam quality pictures, but definitely can't help the fact that I feel like a kid in a candy store


----------



## SuperZan

Cheers! You'll enjoy I'm sure, it's a great card.









Nice get with the PowerColor by the by, I've been well pleased with their build quality of late.


----------



## gupsterg

Quote:


> Originally Posted by *djsatane*
> 
> The Pro Duo is like the 6990 in the sense that it's internal CrossFire, right? I'm just curious: since one Fury X chip's HBM limit was 4GB, if someone has a Pro Duo and disables CrossFire so only one GPU is used, does that mean only 4GB of HBM is available? That would be different from the 6990 as far as onboard VRAM usage.


Same PLX chip as 295X2.
Quote:


> To facilitate the XDMA PCI Express communication for the R9 295X2 a PLX bridge chip sits next to the primary GPU. This allows each GPU to get a full x16 allotment of bandwidth for chip to chip communication but only require a single x16 connection from the host PC. Again, this is common practice for dual-GPU cards in recent memory.


Link:- AMD Community - XDMA

Each GPU has just 4GB.

So in essence Fury X CF on single PCB = Radeon Pro Duo.


----------



## toncij

Quote:


> Originally Posted by *SuperZan*
> 
> Nice get with the PowerColor by the by, I've been well pleased with their build quality of late.


I'm afraid all FuryX are made equal.


----------



## SuperZan

Quote:


> Originally Posted by *toncij*
> 
> I'm afraid all FuryX are made equal.


By spec, yeah. I was speaking more to the range of PowerColor cards being of high quality lately. This gives me faith in their assembling a quality product. I've seen some very poorly-built reference cards.


----------



## toncij

Quote:


> Originally Posted by *SuperZan*
> 
> By spec, yeah. I was speaking more to the range of PowerColor cards being of high quality lately. This gives me faith in their assembling a quality product. I've seen some very poorly-built reference cards.


But all FuryX are manufactured by AMD. 3rd party companies just package and sell them. FuryX is special in that regard.


----------



## gupsterg

Plus Nano








I'm currently thinking I'm gonna keep the MSI Fury X I bought recently, mainly down to the 3yr warranty vs Sapphire's 2yrs; they also have a UK office I can deal with if the reseller mucks me around. IIRC it's the 1st year with the etailer and the 2nd & 3rd with MSI.

This has been the lowest VID-per-DPM card I've had so far, so I would say it's a high-leakage ASIC, or if GPU-Z supported the ASIC quality readout it would read as high ASIC quality. This is based on information The Stilt stated about Hawaii; for further info, anyone wishing to read can see the ref Hawaii BIOS mod thread OP, heading "What is ASIC Quality?". Posts 66 & 69 on TPU have some of my data regarding the 4th card, plus others.


----------



## xTesla1856

AMD finally fixed The Division flickering with crossfire enabled. Yesss!


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> Plus Nano
> 
> 
> 
> 
> 
> 
> 
> 
> I'm currently thinking I'm gonna keep the MSI Fury X I bought recently, mainly down to the 3yr warranty vs Sapphire's 2yrs; they also have a UK office I can deal with if the reseller mucks me around. IIRC it's the 1st year with the etailer and the 2nd & 3rd with MSI.
> 
> This has been the lowest VID-per-DPM card I've had so far, so I would say it's a high-leakage ASIC, or if GPU-Z supported the ASIC quality readout it would read as high ASIC quality. This is based on information The Stilt stated about Hawaii; for further info, anyone wishing to read can see the ref Hawaii BIOS mod thread OP, heading "What is ASIC Quality?". Posts 66 & 69 on TPU have some of my data regarding the 4th card, plus others.


Does "high ASIC" in GPU-Z mark high leakage or low leakage? Higher leakage should enable higher voltages and higher speed with a high cost in power and temperature, but that should not concern anyone with water cooling. From what I see, ASIC quality is increased with lower leakage, not higher? If you want to overclock, you'd want lower ASIC quality in that regard.


----------



## Cool Mike

As of now, I see no Pro Duos on Newegg. Thought I would this morning.









Release today, right?


----------



## gupsterg

Quote:


> Originally Posted by *toncij*
> 
> Does "high ASIC" in GPU-Z mark high leakage or low leakage?


In GPU-Z:-

High ASIC = High leakage
Low ASIC = Low leakage

I know this information is correct, as The Stilt has posted it numerous times in the context of AMD cards on multiple forums, and my own testing with 4 Hawaii cards / 4 Fiji cards follows what he has stated.
Quote:


> Originally Posted by *toncij*
> 
> Higher leakage should enable higher voltages and higher speeds, at a high cost in power and temperature, but that should not concern anyone with water cooling.


Yep, below is info from the Hawaii BIOS mod OP which I compiled from @The Stilt's posts.

*What is "ASIC Quality"?*


Spoiler: Warning: Spoiler!






Spoiler: ASIC Quality is LeakageID of GPU.



Quote:


> The Lkg value is the fused LeakageID of the GPU. Convert it to decimal and divide by 1023, and you'll have the GPU-Z ASIC "quality". Higher LeakageID (or ASIC "quality") means higher leakage, which is bad unless you're running on custom water cooling or LN2.


*Quote from link.*
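To make that conversion concrete, here is a minimal sketch of the arithmetic the quote describes. The LeakageID value `0x2F0` is a made-up example, not read from any real card:

```python
def asic_quality(leakage_id_hex):
    """Convert a fused LeakageID (hex string) to a GPU-Z style ASIC 'quality' percentage."""
    return int(leakage_id_hex, 16) / 1023 * 100

# Hypothetical LeakageID of 0x2F0 (752 decimal)
print(round(asic_quality("2F0"), 1))  # 73.5
```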





Spoiler: More info on LeakageID.



Quote:


> If you use over 1.3V for the GPU (which certainly isn't recommended), do yourself a favor and DO NOT run Furmark.
> 
> The VRM can provide around 360A of current without burning, and a GPU running at >1.3V might just exceed that in Furmark (depending on leakage). The higher your "ASIC Quality" (GPU-Z) is, the higher your leakage level is, and vice versa. Higher leakage means the GPU will require less voltage to operate; however, its maximum safe voltage level is lower at the same time. Lower leakage parts require higher voltages, however their breakdown voltage is slightly greater too.
> 
> Note that the VRM current capability is completely temperature dependent, so don't expect it to survive at high temperatures. It can provide 360A at 25°C, but it can still burn with a <150A load at 110°C.
> 
> High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due to higher losses)
> 
> Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient
> 
> Unless you are using LN2 you definitely want the leakage to be as low as possible. Even under LN2 the high leakage characteristics are only desired because of the difference in voltage scaling. All ASICs, whatever the leakage, have some sort of design-specific absolute voltage limit. A low leakage ASIC might run into this limit prior to reaching the maximum clocks.
> 
> You can use this software to check the default, leakage-dependent voltage of your GPU specimen: http://1drv.ms/1Hln01F
> 
> The software must be run at default settings (as it came from the factory; clocks, voltages), otherwise the results will be invalid.
> 
> Unless your default voltage is greater than 1.250V you should never exceed 1.300V.
> 
> Also never trust the VDDC voltage reading displayed by GPU-Z or Afterburner. Even at stock, the voltage reads around 56mV too low, and the discrepancy only increases with increased current draw (higher clocks and voltage).


*Quote from link.*





Spoiler: Effect of LeakageID on LL and current draw



Quote:


> Eventhou you can tell if the GPU has high or low leakage properties based on the "ASIC Quality", there is no way to tell how much you can reduce the VDDC from stock. It needs to be tested on each and every card as even the GPUs with identical LeakageID ("ASIC Quality") are never fully identical in terms of other properties. Once there is enough of ACCURATE test data it is possible to roughly estimate the usable VDDC levels (i.e. is the minimum VDDC closer to 1.2V or 1.05V for example).
> 
> Also it must be noted that GPUs with low and high leakage properties have different voltage under load eventhou the voltage would be set to the same level. A low leakage GPU might draw 150A of current in load while a high leakage part can draw 170A or even higher. Since the load-line resistance (RLL) is fixed instead of being dynamic based on the leakage current, the high leakage GPU will have a greater load-line effect (voltage drop).
> 
> E.G.
> 
> Low Leakage GPU: VDDC = 1.20000V, IDDMax = 150.0A, RLL 0.64mOhm
> 
> Idle (~ 5A IDD) = 1.20000 - (((1/10000)*6.4) * 5) == 1.1968V
> Load (~ 150A IDD) = 1.20000 - (((1/10000)*6.4) * 150) == 1.104V
> 
> High Leakage GPU: VDDC = 1.20000V, IDDMax = 170.0A, RLL 0.64mOhm
> 
> Idle (~ 5A IDD) = 1.20000 - (((1/10000)*6.4) * 5) == 1.1968V
> Load (~ 170A IDD) = 1.20000 - (((1/10000)*6.4) * 170) == 1.0912V


*Quote from link.*
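The quoted load-line numbers can be reproduced with a quick sketch; the values are taken straight from the example above, and this is just the V = Vset - I*R droop arithmetic, not a model of the actual VRM:

```python
R_LL = 0.64e-3  # load-line resistance in ohms (0.64 mOhm, per the quote)

def v_load(vddc_set, idd_a, r_ll=R_LL):
    """Effective voltage under load: set VDDC minus the I*R load-line drop."""
    return vddc_set - idd_a * r_ll

# Low-leakage GPU drawing 150A vs high-leakage GPU drawing 170A, both set to 1.2V
print(round(v_load(1.2, 150), 4))  # 1.104
print(round(v_load(1.2, 170), 4))  # 1.0912
```

Same set voltage, but the higher-leakage part lands roughly 13mV lower under load, purely because it pulls more current through the same fixed load-line resistance.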






As GPU-Z does not read LeakageID on Fiji, the other caveat of using the stock VID of DPM 7 is that the default voltage doesn't indicate the leakage level correctly, as there are bad and good quality ASICs at every leakage level. If you have two otherwise identical high leakage ASICs, one deemed good and one bad, the worse one will have a higher default voltage despite the leakage being identical. As you can tell, a fickle subject.


----------



## fat4l

I would say it's more likely to be 2x Fury Nano, not 2x Fury X.


----------



## djsatane

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I just got my PowerColor Radeon R9 Fury X today.
> So stoked, even if I am late to the party by quite some time.
> 
> 
> 
> 
> 
> I'll try not to blow up the thread with my potato cam quality pictures, but definitely can't help the fact that I feel like a kid in a candy store


Nice. Do you have any noise from the card itself, such as buzzing or any small vibration-like noise? I'm just curious.


----------



## Cool Mike

The Pro Duo is in stock at Newegg.

$1499.99

Sapphire: out of stock
XFX: in stock

They are identical, with the same warranty.

I purchased the XFX. Keep in mind they are shipping out of California; I'm on the east coast, so I did overnight. They are also requiring a signature at delivery.


----------



## Butthurt Beluga

Quote:


> Originally Posted by *djsatane*
> 
> Nice. Do you have any noise from the card itself, such as buzzing or any small vibration-like noise? I'm just curious.


Nope! Not as of yet, at least. Quiet as a mouse.


----------



## bluezone

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I just got my PowerColor Radeon R9 Fury X today.
> So stoked, even if I am late to the party by quite some time.
> 
> 
> 
> 
> 
> I'll try not to blow up the thread with my potato cam quality pictures, but definitely can't help the fact that I feel like a kid in a candy store


----------



## pdasterly

Picking up a Radeon Pro Duo; which should I get, XFX or Sapphire?
I had good luck with a previous Sapphire RMA.


----------



## SuperZan

I haven't had any issues with either AIB. "Flip a coin. When it's in the air, you'll know what side you're hoping for."


----------



## pdasterly

going to need this
http://techfrag.com/2016/04/25/ekwb-unveils-custom-water-block-amd-radeon-pro-duo/


----------



## xer0h0ur

Damn I would love to ditch the 295X2 for a Pro Duo but I already made my mind up to wait for the HBM2 generation GPUs next year.


----------



## SuperZan

The temptation is real but you're making the smart play waiting for Vega. I love Crossfire Fiji but that HBM2 14nm goodness on the horizon just can't be ignored.


----------



## xkm1948

Installed my HTC Vive today. Letting everyone know that Fury X is a beast in terms of VR rendering. Simply mind ******* awesome.


----------



## djsatane

The small buzzing/vibration noise is definitely not related to air bubbles trapped in the pump; I have tried all kinds of tests moving both the GPU and the radiator, even when on, and it makes no difference. At this point the only way to fix it would be to add some kind of padding where the pump is, like between the pump and the cover, or add super glue to the main joint locations on the card to reduce vibration, I guess.

It's not as noticeable on "noisy" days when I have to open the windows because it's very warm, versus cool days when I close all the windows and it's all quiet in my apt, lol. I already got an acrylic shield window that separates me from the PC, which is under another desk; while it cuts down some of the noise, it doesn't cut all of it. I have open-ear headphones, but now I'm also considering getting closed-ear ones for work on the PC. I did not consider RMA, as it's not as loud or annoying as some videos I have seen around the web, but it's definitely something I am going to try to remedy more in the future.


----------



## bluezone

I've been up to something and I'll post pictures after I'm done testing.









It seems to work very well.


----------



## buildzoid

I think I've found the absolute limit of what a Fury X will benchmark at on the stock cooler.

*HBM*
600MHz HBM is easy to do with a modded BIOS and extra caps on the HBM VRM. 666MHz HBM looks impossible; even with 1.42V on the HBM I wasn't able to reach 666MHz. I will be trying even more caps on the HBM, but I doubt it will help.

*Core*
Raising core voltages above stock in pretty much any way, shape or form decimates 3DMark scores. I got all my best scores running at stock volts around 1150-1170 core clock. The voltage needed for speeds greater than that killed off all the performance gain.

I've tried changing core voltage with volt mods, BIOS mods, Afterburner and TRiXX. All of them drop FPS when going above stock.

The only thing I haven't tried is messing with the Fury X's power sensing circuitry. The reason I haven't done this is that there is a large chance the VRM will burn out if I do. I'm also strongly considering the viability of an E-power mod, because that would make it impossible for the GPU to communicate with the VRM, bypassing any power limits.

My best Firestrike score in 3-way Crossfire with tessellation disabled is 34309 points, with a 56.6K graphics score. Without some way to raise volts without losing FPS, more is simply impossible, because I can't run higher clocks without more voltage.


----------



## xkm1948

So higher voltage equals a higher OC, but worse performance?


----------



## buildzoid

Quote:


> Originally Posted by *xkm1948*
> 
> So higher voltage equals a higher OC, but worse performance?


I could benchmark with the cards at 1200+ core clock and 1.35-1.4V core, but the scores at those clocks were terrible. Not even close to the 56.6K I managed at 1150MHz.


----------



## xkm1948

Quote:


> Originally Posted by *buildzoid*
> 
> I could benchmark with the cards at 1200+ core clock and 1.35-1.4V core, but the scores at those clocks were terrible. Not even close to the 56.6K I managed at 1150MHz.


So you would recommend just using Crimson to OC the core and a BIOS editor to OC the HBM?

I don't know about you, but my 1st-batch Fury X won't OC past 1100 core.


----------



## bluezone

So it's much warmer outside, and cracking a window open isn't helping much, either with cooling or with keeping the fan noise down.







Everything is cooler and much quieter now.


----------



## dagget3450

Quote:


> Originally Posted by *buildzoid*
> 
> I think I've found the absolute limit of what a Fury X will benchmark at on the stock cooler.
> 
> *HBM*
> 600MHz HBM is easy to do with a modded BIOS and extra caps on the HBM VRM. 666MHz HBM looks impossible; even with 1.42V on the HBM I wasn't able to reach 666MHz. I will be trying even more caps on the HBM, but I doubt it will help.
> 
> *Core*
> Raising core voltages above stock in pretty much any way, shape or form decimates 3DMark scores. I got all my best scores running at stock volts around 1150-1170 core clock. The voltage needed for speeds greater than that killed off all the performance gain.
> 
> I've tried changing core voltage with volt mods, BIOS mods, Afterburner and TRiXX. All of them drop FPS when going above stock.
> 
> The only thing I haven't tried is messing with the Fury X's power sensing circuitry. The reason I haven't done this is that there is a large chance the VRM will burn out if I do. I'm also strongly considering the viability of an E-power mod, because that would make it impossible for the GPU to communicate with the VRM, bypassing any power limits.
> 
> My best Firestrike score in 3-way Crossfire with tessellation disabled is 34309 points, with a 56.6K graphics score. Without some way to raise volts without losing FPS, more is simply impossible, because I can't run higher clocks without more voltage.


I feel your pain; I went through this, only with AB/TRiXX OC and the 3D fanboy comp thread. So what I was wondering is if you only have these losses in CF. Can you replicate the same issue on a single GPU?
For me, it seemed like the more GPUs I added to CF on Fiji, the more sensitive it was to voltage and FPS changes...

If you do the E-power mods or whatever, I definitely want to see your results. Something else I wonder is if you can compare your % differences in each resolution group if possible, 1080/1440/4K, and see if there are any bigger or smaller gaps with voltage. I didn't even bother with water blocks because of the way Fiji feels so locked out on controlling performance and voltages.


----------



## djsatane

I have put a piece of acrylic covered in padding on the backplate cover of my Fury X, so that when I screw it in it's tight against the pump, in hopes of reducing the pump vibration/buzzing. It helped a tad but not completely. At this point I contacted the place I got the Fury X from to see if I can RMA it. I am not sure if I want to, because chances are I may even get one that's noisier, so it's a risk.


----------



## dagget3450

Quote:


> Originally Posted by *djsatane*
> 
> I have put a piece of acrylic covered in padding on the backplate cover of my Fury X, so that when I screw it in it's tight against the pump, in hopes of reducing the pump vibration/buzzing. It helped a tad but not completely. At this point I contacted the place I got the Fury X from to see if I can RMA it. I am not sure if I want to, because chances are I may even get one that's noisier, so it's a risk.


Can you make a video of it for others to hear? All of mine are quiet to me, except one with a fan that's noisier than the other 3 at low RPMs. It kinda has a whine to it at low RPMs; it's not very noticeable while in use.


----------



## djsatane

Quote:


> Originally Posted by *dagget3450*
> 
> Can you make a video of it for others to hear? All of mine are quiet to me, except one with a fan that's noisier than the other 3 at low RPMs. It kinda has a whine to it at low RPMs; it's not very noticeable while in use.


I will do it sometime, but the worst part was a pulsating buzzing noise/whine, going up and down, which was very, very annoying. After my small attempt at fixing it there is still buzzing, but it's steady, and somehow this way I don't notice it as much; the worst part was when it was pulsating.


----------



## gupsterg

Fury X no 1 & no 2 I had were both rev.3 of the pump, ie like yours @djsatane; both had no issues with pump noise or coil noise. After asking Sapphire tech support I was told the production date was Oct/Nov 15; I bought Mar 16. The fan on each seemed somewhat whiny to me, but not overly so.

Fury X no 3 I haven't yet checked to see what rev of pump it has; I'd think it is rev.3. The MSI box has a xx/11/15 marking even though it was only bought a week ago. This seems the whiniest fan so far. I currently don't believe it's the pump. I've listened via a paper tube placed from my ear to the card and can't make anything out but coil whine; the coil whine in general use I can't hear without the paper tube. I'd also say this card is the quietest for coil whine out of the Fury / X's I've had. I've just ordered a 4-pin mini fan connector so I can take the PWM signal from the Fury X to a pair of fans of my choosing to try a push/pull config.

*** edit ***

Fury no 3 has the rev.3 pump; even with the cover removed I noted very little noise and vibration from it. Using the paper tube I noted coil whine more prominently than the pump (Heaven was running to load the card up).

*** edit 2 ***

Fixed whine on Fury no 3. 1st test was to remove all screws and just rest the fan on the rad: no whine. Next I cut some triangles out of spongy material (A/C insulation tape) and placed them on the 4 corners of the fan which mate with the rad, only tightening the fan screws lightly = no whine.


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> Fury X no 1 & no 2 I had were both rev.3 of the pump, ie like yours @djsatane; both had no issues with pump noise or coil noise. After asking Sapphire tech support I was told the production date was Oct/Nov 15; I bought Mar 16. The fan on each seemed somewhat whiny to me, but not overly so.
> 
> Fury X no 3 I haven't yet checked to see what rev of pump it has; I'd think it is rev.3. The MSI box has a xx/11/15 marking even though it was only bought a week ago. This seems the whiniest fan so far. I currently don't believe it's the pump. I've listened via a paper tube placed from my ear to the card and can't make anything out but coil whine; the coil whine in general use I can't hear without the paper tube. I'd also say this card is the quietest for coil whine out of the Fury / X's I've had. I've just ordered a 4-pin mini fan connector so I can take the PWM signal from the Fury X to a pair of fans of my choosing to try a push/pull config.
> 
> *** edit ***
> 
> Fury no 3 has the rev.3 pump; even with the cover removed I noted very little noise and vibration from it. Using the paper tube I noted coil whine more prominently than the pump (Heaven was running to load the card up).
> 
> *** edit 2 ***
> 
> Fixed whine on Fury no 3. 1st test was to remove all screws and just rest the fan on the rad: no whine. Next I cut some triangles out of spongy material (A/C insulation tape) and placed them on the 4 corners of the fan which mate with the rad, only tightening the fan screws lightly = no whine.


Hmm, I'll have to look at this myself for one of mine. TY for that info.
Quote:


> Originally Posted by *Cool Mike*
> 
> Pro Duo is in stock at NewEgg.
> 
> $1499.99
> 
> Sapphire out of Stock
> XFX in Stock
> 
> They are identical with the same warranty.
> 
> I purchased the XFX. Keep in mind they are shipping out of California. I'm on the east coast so I did overnight. They are also requiring a signature at delivery.


Any word yet? I was super curious about your power consumption.


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> *** edit 2 ***
> 
> Fixed whine on Fury no 3. 1st test was to remove all screws and just rest the fan on the rad: no whine. Next I cut some triangles out of spongy material (A/C insulation tape) and placed them on the 4 corners of the fan which mate with the rad, only tightening the fan screws lightly = no whine.


Well, the noise is coming from inside the card itself, in the area where the pump is, but it can also be vibration of the board itself, aka coil whine, so if I were to re-seat the fan on the rad I doubt that would make any difference in my case. I added more padding to the cover that's under the card (the one facing down, with the removable pump cover, when the card is installed in a regular board), so that when I screwed it in it went tight against the pump cover. This reduced some of the whine/buzz, in that it's hardly pulsating anymore, but it's still there a tad. Under load it's a tad louder, and I haven't tested whether any of the louder noise at load may be coming from the rad fan.

I haven't had a response from the person at the place I bought the card from (it's a friend actually), so there is still a chance I may replace it in future.





That's a very short clip; there is some fan wind at first in the sound, but then you can hear a bit of what I am talking about. It's not the best; I will try to get a better recording later. When I feel the card or the tubes I feel some of the vibration, so I think it may be either the pump causing it, or at least something adjacent to the pump on the board.

I haven't unscrewed the screws on the other side, as there is a warranty sticker there, to see if anything is loose or if I can possibly tighten something or test things, but I wonder how things look and sound on that side.


----------



## gupsterg

@dagget3450

No worries, I'm guessing my situation was possibly a mix of things. I know the mount for the rad is secure; it's a ~10mm acrylic panel cut to accommodate the rad in a 5.25" bay. I couldn't feel it move/resonate, but the fan/rad did, ever so slightly.

@djsatane

When comparing the cards I've had, say out of the box, they differ in whine. So I'd say we may each need a differing fix.

I have an inverted ATX case so the pump faces up; at one point one card was in a normal ATX case, with no other noise from the card but the same small whine from the fan, rad always vertical with pipes at the bottom end. Viewing your video, that seems pretty noisy; when home I'll do one of my setup.

Sapphire cards had a full cover sticker top & bottom, removal = warranty void (confirmed with tech support). I peeled it back carefully and replaced it after viewing inside. Thankfully the MSI Fury X has none; IIRC only a small sticker over 1 screw on the GPU backplate (ie holding the pump to the GPU, not the cover plate). Sapphire had this also, so I'd think AMD places it there.


----------



## DiceAir

Hi there

Have a question. Will an i5-6600K be enough for a Fury at 1080p in games like Black Ops 3, Guild Wars, Battlefield 4, Rainbow Six Siege?


----------



## p4inkill3r

Quote:


> Originally Posted by *DiceAir*
> 
> Hi there
> 
> Have a question. Will an i5-6600K be enough for a Fury at 1080p in games like Black Ops 3, Guild Wars, Battlefield 4, Rainbow Six Siege?


Plenty of power, yes.


----------



## DiceAir

Quote:


> Originally Posted by *p4inkill3r*
> 
> Plenty of power, yes.


Thanks for the info. Will it be enough if he doesn't overclock for now? I know it's an overclockable CPU. It's for a friend, and he doesn't want to overclock just yet, although he's already got the cooler and so on, and I'm not going to argue with him about overclocking. So I just wanted to know if it will be fine at stock.


----------



## battleaxe

Quote:


> Originally Posted by *DiceAir*
> 
> Thanks for the info. Will it be enough if he doesn't overclock for now? I know it's an overclockable CPU. It's for a friend, and he doesn't want to overclock just yet, although he's already got the cooler and so on, and I'm not going to argue with him about overclocking. So I just wanted to know if it will be fine at stock.


For one card, yes, easily. But for Xfire it's questionable; an i7 would be better in that case.


----------



## DiceAir

Quote:


> Originally Posted by *battleaxe*
> 
> For one card, yes, easily. But for Xfire it's questionable; an i7 would be better in that case.


Yes, he will be running 1 card. I remember my i5-2500K bottlenecking my R9 280X Crossfire rig in BF4, but after they patched the game a bit it was no longer the case, and only in certain areas. I upgraded to an i7-4790K, and I could see the game using all threads and the stutter was gone. If I were him I would rather spend the extra cash and get the i7 anyway, due to the higher stock speed and a better chance of not hindering performance, but I'm also sure the i5-6600K should be enough for gaming. With games starting to use DX12 it shouldn't be an issue, as AMD GPUs scale much better with DX12, so I'm sure he should be fine then.


----------



## Elmy

Here is my Pro Duo that I have had for the past 3 weeks or so. The NDA is lifted, of course, and I can now tell you guys anything you want to know.

Are there any benchmarks you guys would like to see?

Here are a couple I did in 3DMark Ultra and Extreme. Scored 6548 in Ultra and 11,980 in Extreme. 5960X was @ 4.5 GHz.

http://www.3dmark.com/fs/8310293

http://www.3dmark.com/fs/8310327


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Here is my Pro Duo that I have had for the past 3 weeks or so. The NDA is lifted, of course, and I can now tell you guys anything you want to know.
> 
> Are there any benchmarks you guys would like to see?
> 
> Here are a couple I did in 3DMark Ultra and Extreme. Scored 6548 in Ultra and 11,980 in Extreme. 5960X was @ 4.5 GHz.
> 
> http://www.3dmark.com/fs/8310293
> 
> http://www.3dmark.com/fs/8310327


I was hoping for more power consumption numbers, even if only a kilowatt-meter reading at load.


----------



## Alastair

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Here is my Pro Duo that I have had for the past 3 weeks or so. The NDA is lifted, of course, and I can now tell you guys anything you want to know.
> 
> Are there any benchmarks you guys would like to see?
> 
> Here are a couple I did in 3DMark Ultra and Extreme. Scored 6548 in Ultra and 11,980 in Extreme. 5960X was @ 4.5 GHz.
> 
> http://www.3dmark.com/fs/8310293
> 
> http://www.3dmark.com/fs/8310327


I will still buy the spare decoration chip if you are interested!


----------



## Elmy

Quote:


> Originally Posted by *dagget3450*
> 
> I was hoping for more power consumption numbers, even if only a kilowatt-meter reading at load.


I will get something to you by the end of the weekend.

Quote:


> Originally Posted by *Alastair*
> 
> I will still buy the spare decoration chip if you are interested!


Once I get my 2nd Pro Duo I will send you one of them for FREE. Might be a month or so... I don't know when AMD is sending the 2nd one yet. I'll keep in touch.

Here is the highest score I think I can get with the Pro Duo in Ultra (7,665), with my 5960X screaming along at 4.8 GHz. Anything higher on the clocks and it crashes. Might play some more this weekend and see if I can break 8,000.

http://www.3dmark.com/3dm/11830259?


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> I will get something to you by the end of the weekend.
> Once I get my 2nd Pro Duo I will send you one of them for FREE. Might be a month or so... I don't know when AMD is sending the 2nd one yet. I'll keep in touch.
> 
> Here is the highest score I think I can get with the Pro Duo in Ultra (7,665), with my 5960X screaming along at 4.8 GHz. Anything higher on the clocks and it crashes. Might play some more this weekend and see if I can break 8,000.
> 
> http://www.3dmark.com/3dm/11830259?


Wow, so you managed to get it clocked from 760 to 1145 on the stock AIO? What do the temps look like?


----------



## Elmy

Quote:


> Originally Posted by *dagget3450*
> 
> Wow, so you managed to get it clocked from 760 to 1145 on the stock AIO? What do the temps look like?


Temps running 3DMark with the Pro Duo clocked at 1125 on an open-air bench were 39c on one GPU and 41c on the other. I watched it the whole time and that was the highest it got.


----------



## spyshagg

amazing

no throttling at all on gpu-z graph?


----------



## p4inkill3r

Quote:


> Originally Posted by *Elmy*
> 
> Temps running 3DMark with the Pro Duo clocked at 1125 on an open-air bench were 39c on one GPU and 41c on the other. I watched it the whole time and that was the highest it got.


That's pretty sweet.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> Wow, so you managed to get it clocked from 760 to 1145 on the stock AIO? What do the temps look like?


I thought the Radeon Pro Duo is 1000MHz stock? Hence the reference to Nano in CF by @fat4l, where I said earlier Fury X in CF = Radeon Pro Duo. So technically similar OC headroom as other Fiji cards; besides the OC, nice result on the AIO for temps.

Several posts back the discussion was about FPS/performance drop with voltage applied. I didn't see it on Fury X no 2, but I'm not applying a whole lot of VID increase. I'm also not seeing it on Fury X no 3; my best results are with VID added on both cards.

I'd never compared on Hawaii whether the scaling of performance with OC was in line with the % clock increase, so I checked today. This 3DM FS compare is stock ROM with clock/vddc change only and no timings mod, so the Hawaii ROM-mod compare is similar to what I'm doing on the Fiji cards.

So far Fury X no 3 has reached 1130/545, 3DM compare (no scaling issue IMO). As the offset gives no context of VID, let me say this is at 1.225V for DPM 7 VID in ROM (1hr Heaven/Valley, 3hr run 3DM FS, ~6hrs [email protected] tested). It reached 1105/555 stable with the stock VID of 1.187V; to show there is a performance boost in 1130/545/+38mV vs 1105/545/+0mV, view this compare.

So:-

Fury X no 2 1115/535 @ stock DPM 7 (1.212V), 1135/535 @ 1.243V DPM 7 (ie +31mV)
Fury X no 3 1105/555 @ stock DPM 7 (1.187V), 1130/545 @ 1.225V DPM 7 (ie +38mV)

I have not yet finished seeing if I can gain more out of Fury X no 3 and will retest Fury X no 2 also.
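The scaling check described in that post (does the score rise in line with the % clock increase?) can be sketched roughly like this. The scores below are hypothetical placeholders, not gupsterg's actual 3DMark numbers; only the 1050 vs 1130 clocks come from the thread:

```python
# Sketch: comparing fractional clock gain against fractional score gain
# to spot throttling or the voltage-related FPS drop discussed earlier.

def clock_scaling(base_mhz, oc_mhz):
    return oc_mhz / base_mhz - 1.0      # fractional clock increase

def score_scaling(base_score, oc_score):
    return oc_score / base_score - 1.0  # fractional score increase

clk = clock_scaling(1050, 1130)         # Fury X stock vs the OC above
scr = score_scaling(15000, 16000)       # hypothetical graphics scores

print(f"clock +{clk:.1%}, score +{scr:.1%}")
# A score gain well below the clock gain would hint at throttling or
# a scaling problem; roughly matching gains = no scaling issue.
```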


----------



## Medusa666

Would buying an AMD Radeon Pro Duo be future-proof for the coming 2-3 years?

Does the dead Fiji chip come with every single Pro Duo, or only with selected ones?

How is Crossfire / dual-GPU support looking for the coming 2-3 years with DX12?


----------



## spyshagg

Buy two nanos


----------



## n64ADL

Quote:


> Originally Posted by *spyshagg*
> 
> Buy two nanos


I would, but what is the *point* with Polaris on the way soon?


----------



## djsatane

Quote:


> Originally Posted by *n64ADL*
> 
> I would, but what is the *point* with Polaris on the way soon?


life is short


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> @djsatane
> 
> When comparing the cards I've had, say out of the box, they differ in whine. So I'd say we may each need a differing fix.
> 
> I have an inverted ATX case so the pump faces up; at one point one card was in a normal ATX case, with no other noise from the card but the same small whine from the fan, rad always vertical with pipes at the bottom end. Viewing your video, that seems pretty noisy; when home I'll do one of my setup.
> 
> Sapphire cards had a full cover sticker top & bottom, removal = warranty void (confirmed with tech support). I peeled it back carefully and replaced it after viewing inside. Thankfully the MSI Fury X has none; IIRC only a small sticker over 1 screw on the GPU backplate (ie holding the pump to the GPU, not the cover plate). Sapphire had this also, so I'd think AMD places it there.


I bought some A/C insulation material (thin, rubber-like), vinyl insulation pads, and foam (a hardware store going out of business, 70% off). I may open the card up and try to fill/cover as much of it as I can inside and even outside, but I am wondering whether the vinyl would be OK in the compartment above the pump? I mean, it wouldn't melt in there, would it?


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> I thought the Radeon Pro Duo is 1000MHz stock? Hence the reference to Nano in CF by @fat4l, where I said earlier Fury X in CF = Radeon Pro Duo. So technically similar OC headroom as other Fiji cards; besides the OC, nice result on the AIO for temps.
> 
> Several posts back the discussion was about FPS/performance drop with voltage applied. I didn't see it on Fury X no 2, but I'm not applying a whole lot of VID increase. I'm also not seeing it on Fury X no 3; my best results are with VID added on both cards.
> 
> I'd never compared on Hawaii whether the scaling of performance with OC was in line with the % clock increase, so I checked today. This 3DM FS compare is stock ROM with clock/vddc change only and no timings mod, so the Hawaii ROM-mod compare is similar to what I'm doing on the Fiji cards.
> 
> So far Fury X no 3 has reached 1130/545, 3DM compare (no scaling issue IMO). As the offset gives no context of VID, let me say this is at 1.225V for DPM 7 VID in ROM (1hr Heaven/Valley, 3hr run 3DM FS, ~6hrs [email protected] tested). It reached 1105/555 stable with the stock VID of 1.187V; to show there is a performance boost in 1130/545/+38mV vs 1105/545/+0mV, view this compare.
> 
> So:-
> 
> Fury X no 2 1115/535 @ stock DPM 7 (1.212V), 1135/535 @ 1.243V DPM 7 (ie +31mV)
> Fury X no 3 1105/555 @ stock DPM 7 (1.187V), 1130/545 @ 1.225V DPM 7 (ie +38mV)
> 
> I have not yet finished seeing if I can gain more out of Fury X no 3 and will retest Fury X no 2 also.


I believe the Radeon Pro Duo looks to have 5 phases per GPU, compared to the Nano's 4.


----------



## Agent Smith1984

My issue with the Nano is that it's everything that little guy can do to maintain a 1000MHz core clock. In contrast, it seems the Pro Duo clocks just as well as an X.


----------



## seabiscuit68

Since no one answered your question -

I would avoid Crossfire / SLI like the plague. I've been using Crossfire for quite a while (I had Crossfire 6870s prior to the Crossfire 7970s), and I can tell you that it just seems to get worse and worse. I keep trying to tell myself that it was a good idea... it is not.

Actually, as of right now, Crimson has pretty well killed Crossfire. 85% of my games that used to at least function with Crossfire enabled now crash on start. The Division / Rainbow Six Siege are either negatively scaled or non-functional. Most new games don't work worth crap.

Do yourself a favor: buy the top-tier card, run it for 2 years, sell it on eBay, and rinse and repeat. The depreciation of top-tier cards is very low, so you will only be spending $75 or so a year to always play with the best card...


----------



## rdr09

Quote:


> Originally Posted by *seabiscuit68*
> 
> Since no one answered your question -
> 
> I would avoid Crossfire / SLI like the plague. I've been using crossfire for quite a while (had crossfire 6870s prior to the crossfire 7970s) and I can tell you that it just seems to get worse and worse. I keep trying to tell myself that it was a good idea...it is not.
> 
> Actually, as of right now, Crimson has pretty well killed Crossfire. 85% of my games that used to at least function with Crossfire enabled now crash on start. The Division / Rainbow Six Siege are either negatively scaled or non-functional. Most new games don't work worth crap.
> 
> Do yourself a favor: buy the top-tier card, run it for 2 years, sell it on eBay, and rinse and repeat. The depreciation of top-tier cards is very low, so you will only be spending $75 or so a year to always play with the best card...


I maxed out the cores on my i7 @ 4.5 with HT off (basically an i5) with Crossfire 7900-series cards; it was fine with HT on. Bottleneck was the first issue. A single 290 with HT off is fine; with 2 it's the same deal, I have to turn HT on.

For some of us there is no other way but to use two or more, until probably the next high-end cards.


----------



## Alastair

I have no issues running crossfire.


----------



## SuperZan

RS:S is the only game I'm playing at the moment that has any issue with Crossfire. I don't get any of the fabled stuttering in Witcher 3, save for some of the load-in graphic narration screens that I don't care about. Fallout 4 has been golden. No issues at all with any of the multiplayer games I frequent. I may just be lucky though; the games people complain about most often are games I have zero interest in, such as The Division or Quantum Break.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> I have no issues running crossfire.


Neither do many of us. But you know... Xfire is so terrible...

Mine works great. No stutter, and smooth as butter. Like I said before, most of it is setup problems from some not knowing how to set up Xfire properly. It's not always just plug and play. Sometimes you have to use the brain we were given to figure out what is causing said "stuttering" issues.


----------



## djsatane

@gupsterg

I have done some things to the card, but resisted opening the cover facing the board where the warranty sticker is.

It may look a bit stupid, but since this is water-cooled there is no need for air flow, right? Anyway, I am happy to say this lowered the coil whine/humming a lot. It's still there if you are in the room in total silence, but the pulsating and how loud it used to be is solved! Of course I am going to see if the sound changes over time. Further, this material is vinyl one-side-stick insulation typically used for A/C, windows, and car/truck covers; I assume it's safe for what I used it for? It's not touching any board parts.


----------



## Elmy

Quote:


> Originally Posted by *spyshagg*
> 
> amazing
> 
> no throttling at all on gpu-z graph?


The clocks are all over the place running 3DMark... I don't think it's throttling, though... could be wrong...

Quote:


> Originally Posted by *Medusa666*
> 
> Would buying an AMD Radeon Pro Duo be future-proof for the coming 2-3 years?
> 
> Does the dead Fiji chip come with every single Pro Duo, or only with selected ones?
> 
> How is Crossfire / dual-GPU support looking for the coming 2-3 years with DX12?


Well, the 295X2 was released April 2014, so it's been almost exactly 2 years. If you purchased the 295X2 at release, it's been one hell of a card for the past 2 years; it was still the fastest card up until the Pro Duo was released.

My understanding is every Pro Duo will come with the dead chip. I could be wrong, though. The ones I've seen so far all had them.

Crossfire support is hit and miss. Most games have no problems, and from what I have discussed with AMD, most of the time it's on the developer end if it doesn't work.

Quote:


> Originally Posted by *p4inkill3r*
> 
> That's pretty sweet.


I forgot to mention the fan was at 100% in that test..... I'll get some stuff tested this weekend and report back.

Any benchmarks you would like to see let me know and I will try to get them done this weekend.


----------



## toncij

Quote:


> Originally Posted by *seabiscuit68*
> 
> Since no one answered your question -
> 
> I would avoid Crossfire / SLI like the plague. I've been using crossfire for quite a while (had crossfire 6870s prior to the crossfire 7970s) and I can tell you that it just seems to get worse and worse. I keep trying to tell myself that it was a good idea...it is not.
> 
> Actually, as of right now, Crimson has pretty well killed Crossfire. 85% of my games that used to at least function with Crossfire enabled now crash on start. The Division / Rainbow Six Siege are either negatively scaled or non-functional. Most new games don't work worth crap.
> 
> Do yourself a favor: buy the top-tier card, run it for 2 years, sell it on eBay, and rinse and repeat. The depreciation of top-tier cards is very low, so you will only be spending $75 or so a year to always play with the best card...


Hmm. Why do you think that's so low? You lose a lot now. Check 980 Ti and Fury X prices when Polaris (and, contrary to my beliefs, it may happen) gets out with Fury-level performance for $350...

And yes, CrossFire several years ago worked fine. Don't know how it works now, haven't tried.

Quote:


> Originally Posted by *Elmy*
> 
> The clocks are all over the place running 3DMark... I don't think it's throttling, though... could be wrong...
> Well, the 295X2 was released April 2014, so it's been almost exactly 2 years. If you purchased the 295X2 at release, it's been one hell of a card for the past 2 years; it was still the fastest card up until the Pro Duo was released.
> 
> My understanding is every Pro Duo will come with the dead chip. I could be wrong, though. The ones I've seen so far all had them.
> 
> Crossfire support is hit and miss. Most games have no problems, and from what I have discussed with AMD, most of the time it's on the developer end if it doesn't work.
> I forgot to mention the fan was at 100% in that test..... I'll get some stuff tested this weekend and report back.
> 
> Any benchmarks you would like to see let me know and I will try to get them done this weekend.


Clocks should be stable under water, right? How is that not throttling?

No, the RDP comes with a dead chip only in the "media kits" you got. Retail will not. Not even press kits exist, only you lucky YouTubers.

One favor I'd like to ask: can you please test a game at 5K (5120x2880)? I'm really interested how an RDP copes with 5K games due to VRAM limits. Battlefield 4 and Shadow of Mordor support resolution scaling; The Division does too. If you could be so kind...


----------



## gupsterg

@bluezone

I agree the VRM on the Radeon Pro Duo is better than the Nano's. I think fat4l's post is in regards to the fact that the Nano is 4096SP @ 1000MHz and so is the Radeon Pro Duo. Therefore Nano CF = Radeon Pro Duo, whereas when I said it's like Fury X in CF that was wrong, as the Fury X is 4096SP @ 1050MHz.

@djsatane

Wow, that's a lot of insulating going on; I've never done anything like that to a card, dunno if it's appropriate or not.

I'll be honest, if I was in your shoes I'd RMA the card. How I see it, at some point I'd have to sell the card, and selling a card insulated like that may not appeal to a buyer. If it's removed to sell the card, then obviously the buyer isn't gonna be happy, as they've bought a noisy card.

@Elmy

So the Radeon Pro Duo throttles like the Nano; is this even with a PowerLimit increase? Can you try with Power Efficiency set to Off in the driver?

I've always used 3DM FS as a guide to know if the PL is OK for an OC. Heaven and Valley I found will always drop some clocks, but not when "Power Efficiency" is set to Off in the driver.



Attached is the HTML file of that screenie.

Fury_X_No3_1130_545_1225mV_VID.zip 11k .zip file


I have upped the PL in the ROM, but "PowerTune" still seems aggressive (depending on the app loading the card); only switching "Power Efficiency" Off really solves it, but then you have clock bounce at low loads. That small peak to 680MHz @ 12:38:23 is when I switch off PE at the desktop; note also the small blips after I exit Heaven.

Any chance of a BIOS dump of the Radeon Pro Duo? I'd think there would be 2 ROMs, like the 295X2.


----------



## spinejam

Two weeks ago, I purchased a Gigabyte R9 Fury X from Jet.com that made my computer sound like a fish tank. Had to RMA it, no doubt! Decided on an Asus R9 Fury X from Newegg, and what a difference!!! Whisper quiet now!

Don't settle for excessive pump noise on these cards; they cost way too much $$$ for such nonsense!


----------



## PentiumK

What should the default minimum fan speed at idle be for the Nano?
For some reason my card does about 1615 RPM at 19%, and doesn't go below that when manually adjusted from AB or SpeedFan.


----------



## djsatane

Quote:


> Originally Posted by *gupsterg*
> 
> @djsatane
> 
> Wow, that's a lot of insulating going on; I've never done anything like that to a card, dunno if it's appropriate or not.
> 
> I'll be honest, if I was in your shoes I'd RMA the card. How I see it, at some point I'd have to sell the card, and selling a card insulated like that may not appeal to a buyer. If it's removed to sell the card, then obviously the buyer isn't gonna be happy, as they've bought a noisy card.


So far temps are the same as without this, even under load, and no issues encountered. It may look like a lot of insulation, but it's actually not that much; in noise level, though, what a difference! It's perfect now. I could have RMA'd, or rather replaced it with the seller for another one, as it's a person I know who runs a small PC-building shop in Canada (Extreme-PC), but he did mention that in his experience chances are there will be some humming and coil whine, so there is also a chance I may get one with more coil whine than the one I had. Again, it wasn't very loud, but the pulsating part was what I did not like, and the insulation really filtered it out (I think the piece I put on the inside of that cover made the biggest difference).


----------



## gupsterg

As long as you're happy it's all good mate.

Yes, defo coil whine is something I've noted differs from card to card; pump noise I haven't noticed as different or an issue, and I'm not saying you're not experiencing it.


----------



## n64ADL

If I bought an R9 Pro Duo, would I be able to pair it with an R9 390 for DX12 titles? Just curious, has anyone tried this with titles like Hitman yet, and would it be worthwhile?


----------



## SuperZan

Quote:


> Originally Posted by *n64ADL*
> 
> If I bought an R9 Pro Duo, would I be able to pair it with an R9 390 for DX12 titles? Just curious, has anyone tried this with titles like Hitman yet, and would it be worthwhile?


AFAIK for tri-fire you'd need another Fiji card: Nano, Fury, Fury X, or a Pro Duo.

Edit: oh, I see what you're asking. TBH I'm not seeing enough support for that yet to warrant adding a card that can't be used in a Xfire config.


----------



## Elmy

I got of ton of data for you guys.

My system is running a 5960X @ 4.5 for these benchmarks. All gaming resolutions are at 3440X1440.

3DMark @ stock settings on the GPU = 6538, and the whole system draws 560 watts (this was the max I saw on my Killawatt in the last 3DMark bench) with temps @ 32c.

3DMark @ 1100, fan @ stock = 6682, system draw 645 watts (I think this was throttling).

3DMark @ 1100, fan @ 100%, voltage and power limit both at +50 = 7299; the system was drawing 770 watts max and temps were 39c and 42c max.

Running Tomb Raider at stock GPU and Ultimate settings in game gave me 52.2 average, 80 max, 31 min.

Running Tomb Raider at 1100 GPU with the same settings gave me 57.3 average, 78 max, 34 min.

Here is the cool part: the Radeon Pro Duo and the Fury X DO work in crossfire. Here is the 3DMark validation: http://www.3dmark.com/3dm/11846943?

Adding a Fury X increased my score by 3,000 on Ultra settings and GPU set @ 1100. The MAX wattage used was 1050 Watts I saw on the Killawatt.

I keep getting a random error pop up on my screen though. Just played about an hour of Black Ops 3 and everything was running just fine in Tri-fire.
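The scaling in those numbers is easy to sanity-check. A quick sketch (scores and wattages taken from the post above; the `pct_gain` helper is just illustrative, not anyone's actual tooling):

```python
def pct_gain(before, after):
    """Percentage gain of `after` over `before`."""
    return (after - before) / before * 100

# 3DMark scores and wall power from the post above (Killawatt readings)
stock_score, oc_score, oc_max_score = 6538, 6682, 7299

print(f"1100 MHz, stock fan: +{pct_gain(stock_score, oc_score):.1f}%")            # ~2.2% (likely throttling)
print(f"1100 MHz, 100% fan, +50 power/voltage: +{pct_gain(stock_score, oc_max_score):.1f}%")  # ~11.6%

# Perf-per-watt: the big OC pulls 770 W at the wall vs 560 W at stock,
# so efficiency drops even though the score rises
print(f"score/W stock: {stock_score / 560:.2f}, OC: {oc_max_score / 770:.2f}")
```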


----------



## bluezone

Looks like the Radeon Pro Duo has evolved since its introduction last year.

Lisa Su showing off the original, with 4 VRMs per GPU and 2 8-pin power connectors.



Now the released Radeon Pro Duo.


----------



## xkm1948

Would be awesome to see four of these Pro Duo XFiring together.


----------



## pdasterly

Quote:


> Originally Posted by *xkm1948*
> 
> Would be awesome to see four of these Pro Duo XFiring together.


I'm sure 2 is max; technically 2 Pro Duos would be 4-way xfire.


----------



## dagget3450

Quote:


> Originally Posted by *pdasterly*
> 
> I'm sure 2 is max; technically 2 Pro Duos would be 4-way xfire.


Yes, but this had me wondering about DX12 multi-GPU. I think it's unlimited?


----------



## pdasterly

Quote:


> Originally Posted by *dagget3450*
> 
> Yes, but this had me wondering about DX12 multi-GPU. I think it's unlimited?


Loan me $6500 and I'll test it out for you.

Edit: $6900 plus an extra PSU.


----------



## fat4l

Quote:


> Originally Posted by *Elmy*
> 
> 
> 
> Here is my Pro Duo that I have had for the past 3 weeks or so. The NDA is lifted of course and I can now tell you guys anything you want to know.
> 
> Is there any benchmarks you guys would like to see?
> 
> Here are a couple I did in 3DMark Ultra and Extreme. Scored 6548 in Ultra and 11,980 in Extreme. 5960X was @ 4.5 GHZ.
> 
> http://www.3dmark.com/fs/8310293
> 
> http://www.3dmark.com/fs/8310327


Lol. Nice to see the Ares 3 is not far from the Pro Duo. I know the Ares is OC'd, but still... the Pro Duo can never OC this high anyway.
FS U = 2.7% difference (Pro Duo wins): http://www.3dmark.com/compare/fs/8310293/fs/7157947
FS X = 10.9% difference (Ares 3 wins): http://www.3dmark.com/compare/fs/8310327/fs/8048093
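For reference, the "% difference" figures quoted here are just relative score deltas. A tiny sketch of the calculation (the Pro Duo FS Ultra score is from Elmy's earlier post; the Ares 3 score is a hypothetical placeholder, the real one is behind the 3dmark.com links):

```python
def score_delta(a, b):
    """Relative difference of score `a` over score `b`, as a percentage of `b`."""
    return (a - b) / b * 100

# 6548 is the Pro Duo FS Ultra score from earlier in the thread;
# 6376 is a hypothetical Ares 3 score chosen to illustrate the ~2.7% figure
duo_fs_ultra, ares_fs_ultra = 6548, 6376
print(f"FS U: Pro Duo ahead by {score_delta(duo_fs_ultra, ares_fs_ultra):.1f}%")
```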

Quote:


> Originally Posted by *gupsterg*
> 
> @bluezone
> 
> I agree the VRM on the Radeon Pro Duo is better than the Nano's. I think fat4l's post is in regards to the Nano being 4096SP @ 1000MHz, as is the Radeon Pro Duo. Therefore Nano CF = Radeon Pro Duo, whereas when I said it's like Fury X in CF that was wrong, as the Fury X is 1050MHz 4096SP.


I was pointing out the "cores" the Radeon Pro Duo uses: low-power binned chips, the same ones used in the Fury Nano.


----------



## gupsterg

Quote:


> Originally Posted by *fat4l*
> 
> Lol. Nice to see the Ares 3 is not far from the Pro Duo. I know the Ares is OC'd, but still... the Pro Duo can never OC this high anyway.
> 
> FS U = 2.7% difference (Pro Duo wins): http://www.3dmark.com/compare/fs/8310293/fs/7157947
> FS X = 10.9% difference (Ares 3 wins): http://www.3dmark.com/compare/fs/8310327/fs/8048093


I agree it can't OC as far.

Looking at a Fury X CF OC FS Ultra result with the same GPU/HBM clock (same CPU but not clock) and a differing driver, GT1 is 10% higher on Fury X CF.

Taking Elmy's RPD OC FS Ultra result (only the GPU OC'd) vs the Ares 3 OC'd, there's a ~20% gain over the Ares 3; I would assume there would be a gain on Extreme as well. IMO the % gain is misleading; the FPS gain is too low. GT1: 3.61, GT2: 6.94, Combined: 2.61.

When I looked at RPD pricing @ ~£1300 in the UK, I'm thinking "No way...". I could get, say, Nano CF and a custom loop to cool my whole system with plenty of change left out of £1300. The AIO on the Fury X is pretty good IMO, which has not made me wanna go custom loop, so Fury X CF @ ~£400-£450 per card is cheaper again than the RPD. If I wanted 2 GPUs on 1 PCB, a 2nd-hand 295X2 is the best bang for $ IMO.
Quote:


> Originally Posted by *fat4l*
> 
> I was pointing out the "cores" the Radeon Pro Duo uses: low-power binned chips, the same ones used in the Fury Nano.


----------



## toncij

The price makes sense only if you're limited to a mini-ITX case and need two GPUs in a single slot. I can't find any other reason to pay 500 EUR more than 2x Nano just to have it on a single card.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> I agree it can't OC as far.
> 
> Looking at a Fury X CF OC FS Ultra result with the same GPU/HBM clock (same CPU but not clock) and a differing driver, GT1 is 10% higher on Fury X CF.
> 
> Taking Elmy's RPD OC FS Ultra result (only the GPU OC'd) vs the Ares 3 OC'd, there's a ~20% gain over the Ares 3; I would assume there would be a gain on Extreme as well. IMO the % gain is misleading; the FPS gain is too low. GT1: 3.61, GT2: 6.94, Combined: 2.61.
> 
> When I looked at RPD pricing @ ~£1300 in the UK, I'm thinking "No way...". I could get, say, Nano CF and a custom loop to cool my whole system with plenty of change left out of £1300. The AIO on the Fury X is pretty good IMO, which has not made me wanna go custom loop, so Fury X CF @ ~£400-£450 per card is cheaper again than the RPD. If I wanted 2 GPUs on 1 PCB, a 2nd-hand 295X2 is the best bang for $ IMO.


You have a point there.

Let's see the X results.

@Elmy, could you please run FS X with default driver settings + OC'd CPU/Pro Duo, so we can compare with the Ares 3 OC?

I play at 1440p anyway, so X makes more sense to me.


----------



## gupsterg

@toncij

This is gonna be such a small number of people IMO. The RPD is just too late a release and way overpriced IMO. I was chatting to an eBayer only yesterday who's relisted a Fury X 3 times because winning bidders don't pay at auction end (he gets a pretty good price, ~£400, each time). I'm guessing the winners get cold feet thinking about what Polaris/Pascal will bring to the table and at what pricing. So buying an RPD seems not right until those have been released; I'm not saying this on a performance stance, but on the basis that pricing for current cards may shift.

@fat4l

A very little point only.

The 295X2 or Ares III is defo holding its presence against the RPD IMO.


----------



## toncij

From my own sources, there is a high probability that Polaris will match the Fury line's performance. But it will not be an HBM card, so all who actually like the Fury's size will probably make no mistake opting for the Nano.

There are many rumors about Polaris and Pascal beating the current flagship line. I still think that even if that happens, it won't be by a large margin. I'm sitting on a TitanX @ 1600MHz and a Fury X @ 1113MHz, so I'm fine unless Polaris or Pascal really kick 20% or more higher at stock. Given the large overclock headroom of Polaris and Pascal, it may as well be 50% under water, or even an AIO.


----------



## gupsterg

Indeed, interesting times in the GPU arena.

Not too sure Polaris will beat Fiji; match maybe, or be just under. All based on my opinion, with no credible source or info to support it. Not holding my breath on OC ability either, again just my opinion.

I was very lucky to find a current-gen CPU (binned only 2 CPUs) that gave me an OC buzz like the Q6600 (I did not upgrade between the two stated). Fiji for me has just been a buzz, as I sold my Hawaii cards at no loss and very surprisingly made a little profit. All samples of Fiji I've had, again no loss on disposal, and the one I've kept was a ridiculous price which I'm not gonna post (so defo very ridiculous); so I'm all set not to go for Polaris IMO. Vega I may possibly go for.

I haven't had an nVidia GPU in several years, not that I haven't owned them before. One of my faves was the 8800GTX; on the AMD front now Fiji, but my overall all-time fave was the Radeon 9800 Pro (I was so bowled over with it on Far Cry). 3dfx made other very notable video cards IMO, from many years back.


----------



## toncij

I'm running my second 5960X at 4.4GHz, cool, under an AIO. Very satisfied. It can benchmark and render at 4.6 and 1.315V, which I also find nice. So I'm afraid I won't be changing CPU very soon.
The GPU, on the other hand, depends. If Polaris shows a significant advantage over my current cards I'll buy it and put a bracket and AIO on it too. If not, I'll wait for Vega. I use the Fury X and TitanX interchangeably, for computing and gaming respectively.


----------



## fat4l

Quote:


> Originally Posted by *toncij*
> 
> From my own sources, there is a high probability that Polaris will match the Fury line's performance. But it will not be an HBM card, so all who actually like the Fury's size will probably make no mistake opting for the Nano.
> 
> There are many rumors about Polaris and Pascal beating the current flagship line. I still think that even if that happens, it won't be by a large margin. I'm sitting on a TitanX @ 1600MHz and a Fury X @ 1113MHz, so I'm fine unless Polaris or Pascal really kick 20% or more higher at stock. Given the large overclock headroom of Polaris and Pascal, it may as well be 50% under water, or even an AIO.


These are low-end and mainstream cards, mostly to replace the old Hawaii 3XX cards. "Old" cuz they use old chips/cores.

Polaris 11 = low end
Polaris 10 = mainstream, maybe with performance of ~Fury.

AMD is aiming for performance/watt, not for performance.

The 1080 will be faster than the 980Ti, so... it will be faster than the Polaris chips.


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> These are low end and mainstream cards, mostly to replace old hawaii 3XX cards. "Old" cuz they use old chips/cores.
> 
> Polaris 11 = low end
> Polaris 10 = mainstream , maybe with performance of ~Fury.
> 
> AMD is aiming for performance/watt not for performance.
> 
> The 1080 will be faster than the 980Ti, so... it will be faster than the Polaris chips.


I know what these are. Regarding the 1080, not sure. Judging by GP100's TFLOPS and core count, the 1080 will have 2560 cores; at the same core frequency it would be similar to a 980Ti. Albeit it could have a higher clock than GP100 and be even a bit faster.

Don't expect more than 10% faster than a 980Ti, stock for stock.

Polaris on paper looks worse than Fiji, and that's what I expect, but there are rumors out there that it will match Fiji.


----------



## gupsterg

@toncij

I have thoroughly enjoyed your participation in the thread; I like the balanced posts you have discussing nVidia/AMD, very refreshing.

Having seen your profile I did think you'd have more requirement for CPU power than me, so it comes as no surprise you have a 5960X. My whole i5 rig is built "bang for $"; the cooler I bought 2nd hand for £13.50. The mobo was new, but a crazy price to me via some buying tactics and selling a freebie that came with it; the PSU again a crazy price. I bought a Fractal Design Core 2300 to house the Q6600 and moved the i5 to a SilverStone case.

Putting aside performance, power consumption is where AMD have really been lacking. Now the node change/tech is making this happen for them, but at the back of my mind I can't help but think PowerTune may be too aggressive on Polaris and hamper OC'ing, like we see on Fiji; what are your thoughts on this?


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> @toncij
> 
> I have thoroughly enjoyed your participation in the thread; I like the balanced posts you have discussing nVidia/AMD, very refreshing.
> 
> Having seen your profile I did think you'd have more requirement for CPU power than me, so it comes as no surprise you have a 5960X. My whole i5 rig is built "bang for $"; the cooler I bought 2nd hand for £13.50. The mobo was new, but a crazy price to me via some buying tactics and selling a freebie that came with it; the PSU again a crazy price. I bought a Fractal Design Core 2300 to house the Q6600 and moved the i5 to a SilverStone case.
> 
> Putting aside performance, power consumption is where AMD have really been lacking. Now the node change/tech is making this happen for them, but at the back of my mind I can't help but think PowerTune may be too aggressive on Polaris and hamper OC'ing, like we see on Fiji; what are your thoughts on this?


Well, my machine is built for computing dev, albeit with no Teslas, no Quadros, no Xeons and no ECC RAM or high-queue-depth SSDs, just cheap consumer parts from the top shelf. I like and do some graphics dev and wish for AMD to succeed. Not only are their tactics fairer to consumers, but their tech has finally got its place in the sun, which brings us better hardware, tools and lower prices.

Fiji is a marvel on many levels. I enjoy using it for many reasons. A vastly different architecture from Nvidia's makes it so fun to explore.


----------



## pcrevolution

Quick question guys:

Given that the R9 Fury X and Nano have the same number of stream processors, ignoring thermal throttling, will they perform the same clock-for-clock?


----------



## fat4l

Quote:


> Originally Posted by *pcrevolution*
> 
> Quick question guys:
> 
> Given that the R9 Fury X and Nano have the same number of stream processors, ignoring thermal throttling, will they perform the same clock-for-clock?


Yes, but the Fury X may clock better/higher.


----------



## SuperZan

Quote:


> Originally Posted by *Elmy*
> 
> I got a ton of data for you guys.
> 
> My system is running a 5960X @ 4.5 for these benchmarks. All gaming resolutions are at 3440X1440.
> 
> 3DMark @ stock settings on the GPU = 6538, and the whole system draws 560 watts (this was the max I saw on my Killawatt in the last 3DMark bench) with temps @ 32c.
> 
> 3DMark @ 1100, fan @ stock = 6682, system draw 645 watts (I think this was throttling).
> 
> 3DMark @ 1100, fan @ 100%, voltage and power limit both at +50 = 7299; the system was drawing 770 watts max and temps were 39c and 42c max.
> 
> Running Tomb Raider at stock GPU and Ultimate settings in game gave me 52.2 average, 80 max, 31 min.
> 
> Running Tomb Raider at 1100 GPU with the same settings gave me 57.3 average, 78 max, 34 min.
> 
> Here is the cool part the Radeon Pro Duo and the Fury X DO work in crossfire. Here is the 3DMark validation. http://www.3dmark.com/3dm/11846943?
> 
> 
> 
> 
> 
> Adding a Fury X increased my score by 3,000 on Ultra settings and GPU set @ 1100. The MAX wattage used was 1050 Watts I saw on the Killawatt.
> 
> I keep getting a random error pop up on my screen though. Just played about an hour of Black Ops 3 and everything was running just fine in Tri-fire.


Beautiful graphics score. That scaling is nice; it's right where it should be compared to my old Fury X / Fury crossfire scores.


----------



## TheLAWNOOB

What is the max RPM for the fan on the Pro Duo?

Also, is it possible we'll see a huge price drop after the GTX 1080, since that's what happened to the R9 295X2? I know it has FirePro drivers and is aimed at VR, but it's still pretty much a gaming card.


----------



## fat4l

@Elmy u around mate?

Could you please run 3DMark Extreme with default driver settings and an overclocked Pro Duo and CPU?


----------



## Elmy

Quote:


> Originally Posted by *fat4l*
> 
> @Elmy u around mate?
> 
> Could you please run 3DMark Extreme with default driver settings and an overclocked Pro Duo and CPU?


What overclocks do you want me to set?


----------



## Elmy

Quote:


> Originally Posted by *fat4l*
> 
> You have a point there.
> 
> Let's see the X results.
> 
> @Elmy, could you please run FS X with default driver settings + OC'd CPU/Pro Duo, so we can compare with the Ares 3 OC?
> 
> I play at 1440p anyway, so X makes more sense to me.


Like this ? http://www.3dmark.com/fs/8327630


----------



## fat4l

Quote:


> Originally Posted by *Elmy*
> 
> Like this ? http://www.3dmark.com/fs/8327630


Yeah, nice.

Comparison, Ares 3 (290X CF) vs Pro Duo (Fury Nano CF):
http://www.3dmark.com/compare/fs/8327630/fs/8048093
The Pro Duo wins by 2%.

It's interesting to see the difference between GT1 and GT2.


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> Yeah, nice.
> 
> Comparison, Ares 3 (290X CF) vs Pro Duo (Fury Nano CF):
> http://www.3dmark.com/compare/fs/8327630/fs/8048093
> The Pro Duo wins by 2%.
> 
> It's interesting to see the difference between GT1 and GT2.


There must be an error of some kind there. GT1 is a LOL, but GT2 seems more realistic, albeit a bit disappointing.


----------



## gupsterg

The RPD may be suffering from the driver; I know on the Fury X I can lose performance depending on which driver, even at like clocks. Hawaii/Grenada have mature drivers. As the OC'd 3DM FS Ultra score showed a win over the Ares 3, I would really have thought the RPD would surpass the Ares 3 at Extreme too.

Elmy may also be getting an FPS/performance drop when OC'd, like we see on other Fiji cards. I currently believe this is due to VID. On my high-leakage ASIC I can add more VID before I see performance loss on an OC, as it has a lower VID by default. On a lower-leakage ASIC there is only a small range of VID I can raise before it impacts OC performance.

@Elmy

1. Are you willing to run AIDA64 and supply a registers dump for the RPD, so we can roughly determine leakage?

2. Can you run MSI AB logging card data whilst doing a 3DM run, and upload the HML in a zip so we can see if it's throttling (stock & OC)?

3. I know I asked before and maybe you missed it, but any chance of a ROM dump of the RPD? There should be 2 of them, master/slave; dump via GPU-Z, zip and attach to a post if possible?

Cheers.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> The RPD may be suffering from the driver; I know on the Fury X I can lose performance depending on which driver, even at like clocks. Hawaii/Grenada have mature drivers. As the OC'd 3DM FS Ultra score showed a win over the Ares 3, I would really have thought the RPD would surpass the Ares 3 at Extreme too.
> 
> Elmy may also be getting an FPS/performance drop when OC'd, like we see on other Fiji cards. I currently believe this is due to VID. On my high-leakage ASIC I can add more VID before I see performance loss on an OC, as it has a lower VID by default. On a lower-leakage ASIC there is only a small range of VID I can raise before it impacts OC performance.
> 
> @Elmy
> 
> 1. Are you willing to run AIDA64 and supply a registers dump for the RPD, so we can roughly determine leakage?
> 
> 2. Can you run MSI AB logging card data whilst doing a 3DM run, and upload the HML in a zip so we can see if it's throttling (stock & OC)?
> 
> 3. I know I asked before and maybe you missed it, but any chance of a ROM dump of the RPD? There should be 2 of them, master/slave; dump via GPU-Z, zip and attach to a post if possible?
> 
> Cheers.


What can he gain from an AIDA64 register dump in regard to leakage?


----------



## gupsterg

Default VID per DPM.

For example:

Sapphire Fury Tri-X DPM 7 1.243V
Sapphire Fury X DPM 7 1.250V
Sapphire Fury X DPM 7 1.212V
MSI Fury X DPM 7 1.187V

Above are cards I've owned; I also have ~14 Fiji card dumps from others (some are in the Fiji bios mod thread). I can probably wrangle more data from a member I've been helping with bios mods who has ~100 Fiji cards. I also have ~30 Hawaii/Grenada dumps.

I stated in post 8158 the relevance of VID per DPM to LeakageID.

Also, if people know what the VID is per DPM, then when they apply an offset in MSI AB etc. they know what VID the card is actually at. SW monitoring shows VDDC only, which is a drooped/variable value, so otherwise they really have no idea what VID the GPU is running.
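As a sketch of that last point: an offset applied in MSI AB shifts every DPM state's target VID by the same amount. The VIDs below are the DPM 7 values listed above; `offset_mv` is a hypothetical example offset, and the helper is illustrative only:

```python
# Default DPM 7 VIDs (volts) for the cards listed above
dpm7_vid = {
    "Sapphire Fury Tri-X": 1.243,
    "Sapphire Fury X (card 1)": 1.250,
    "Sapphire Fury X (card 2)": 1.212,
    "MSI Fury X": 1.187,
}

offset_mv = 48  # hypothetical MSI Afterburner voltage offset, in millivolts

for card, vid in dpm7_vid.items():
    # The GPU targets default VID + offset; the VDDC shown by monitoring
    # software droops below this target under load
    print(f"{card}: target VID at DPM 7 = {vid + offset_mv / 1000:.3f} V")
```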


----------



## bluezone

An interesting rig.


----------



## Cool Mike

Thought I would repost my sweet ITX/Pro Duo build from another thread, since this is the official thread. Maxing out COD Black OPS III @ 4K and getting 60+ FPS. Have COD set to sync at 60Hz with monitor. Haven't seen FPS drop below 60 FPS.

I have had the XFX Pro Duo in my hands for 6 days. The Phanteks Enthoo Evolv ITX case is a top choice for an ITX build if you need room for a couple of radiators. This case allowed me to install a CPU radiator and the Pro Duo radiator.

I didn't think the Pro Duo would be UEFI compatible, but sure enough it is. My sweet spot as far as overclocking goes is 1050 MHz core and 525 MHz memory on both GPUs. A -32mV GPU core voltage decrease keeps throttling to a minimum. Temps are in the low 60s at 100% load. The case is small, so the airflow is not the best.

Has to be one of the most powerful ITX builds right now.

A few pics and benches:


----------



## Elmy

Quote:


> Originally Posted by *Cool Mike*
> 
> Thought I would repost my sweet ITX/Pro Duo build from another thread, since this is the official thread. Maxing out COD Black OPS III @ 4K and getting 60+ FPS. Have COD set to sync at 60Hz with monitor. Haven't seen FPS drop below 60 FPS.
> 
> I have had the XFX Pro Duo in my hands for 6 days. The Phanteks Enthoo Evolv ITX case is a top choice for an ITX build if you need room for a couple of radiators. This case allowed me to install a CPU radiator and the Pro Duo radiator.
> 
> I didn't think the Pro Duo would be UEFI compatible, but sure enough it is. My sweet spot as far as overclocking goes is 1050 MHz core and 525 MHz memory on both GPUs. A -32mV GPU core voltage decrease keeps throttling to a minimum. Temps are in the low 60s at 100% load. The case is small, so the airflow is not the best.
> 
> Has to be one of the most powerful ITX builds right now.
> 
> A few pics and benches:


Very Nice!

Quote:


> Originally Posted by *gupsterg*
> 
> The RPD may be suffering from the driver; I know on the Fury X I can lose performance depending on which driver, even at like clocks. Hawaii/Grenada have mature drivers. As the OC'd 3DM FS Ultra score showed a win over the Ares 3, I would really have thought the RPD would surpass the Ares 3 at Extreme too.
> 
> Elmy may also be getting an FPS/performance drop when OC'd, like we see on other Fiji cards. I currently believe this is due to VID. On my high-leakage ASIC I can add more VID before I see performance loss on an OC, as it has a lower VID by default. On a lower-leakage ASIC there is only a small range of VID I can raise before it impacts OC performance.
> 
> @Elmy
> 
> 1. Are you willing to run AIDA64 and supply a registers dump for the RPD, so we can roughly determine leakage?
> 
> 2. Can you run MSI AB logging card data whilst doing a 3DM run, and upload the HML in a zip so we can see if it's throttling (stock & OC)?
> 
> 3. I know I asked before and maybe you missed it, but any chance of a ROM dump of the RPD? There should be 2 of them, master/slave; dump via GPU-Z, zip and attach to a post if possible?
> 
> Cheers.


I will see what I can do next weekend.


----------



## Metalhead79

Are there any aftermarket coolers available for the Fury that would outperform Gigabyte's Windforce cooler?


----------



## Alastair

Quote:


> Originally Posted by *Metalhead79*
> 
> Are there any aftermarket coolers available for the Fury that would outperform Gigabyte's Windforce cooler?


Doubt it. The Fury coolers are some of the best air coolers on the market. Also, I doubt any of the current aftermarket coolers would be compatible with the HBM stacks.


----------



## bluezone

Small Crimson update.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.5.1.aspx


----------



## gupsterg

Just saw this on Videocardz; here's my Fury X with the tess tweak result (OC bench stable tested). Will try it with my best stable 24/7 OC of 1135/535 as well, plus 3DM FS Extreme, as soon as I finish the [email protected] run.


----------



## Blotto80

So I've currently got a Fury Tri-X that can't unlock any cores. I've got it clocked at 1125/520. I'm pretty pleased with it overall. I've got a guy who picked up a Fury X and doesn't like the pump noise, who offered to trade me straight up. Would you guys take the trade?


----------



## dagget3450

Quote:


> Originally Posted by *Blotto80*
> 
> So I've currently got a Fury Tri-X that can't unlock any cores. I've got it clocked at 1125/520. I'm pretty pleased with it overall. I've got a guy who picked up a Fury X and doesn't like the pump noise, who offered to trade me straight up. Would you guys take the trade?


If you don't mind the noise it's a no-brainer imo.


----------



## fat4l

New results here ...


----------



## Agent Smith1984

Quote:


> Originally Posted by *fat4l*
> 
> New results here ...


Looks pretty strong, but why would anybody buy NVIDIA's initial launches anymore? I mean, everybody knows the Ti is coming a few months later; it happens every time like clockwork.

It'd be one thing if the 1080 were $550 and the Ti $650, but they tend to launch the *80 at the high price and then drop it when the Ti drops.


----------



## Alastair

What is a Fury/Fury X scoring in comparison to that as a graphics score?


----------



## toncij

Well, if this result is true, it will be very fun, because this is 25% faster than a TitanX at 1300-ish MHz. http://www.3dmark.com/fs/4619541
http://www.3dmark.com/fs/5779221


----------



## toncij

Quote:


> Originally Posted by *Alastair*
> 
> What is a Fury/furyx scoring in comparison to that as a graphics score?


Overclocked, it is faster: http://www.3dmark.com/fs/6009460


----------



## gupsterg

@Alastair

Some of my benches 3DM FS Extreme.

For a ~1300 GS increase, checking the scaling in these and applying it to the GTX 1080 @ 10733 GS, I'd say:

GT1: ~56 FPS
GT2: ~39 FPS


----------



## Agent Smith1984

The 1080 is scoring a 27k+ graphics score in standard Firestrike at a 1.88GHz core clock with 8GB of 10GHz GDDR5X. However, a 980Ti at the same clock speed (KP's run @ 1886) is much faster. So even though the 1080 will be faster at launch, its clock-for-clock performance is inferior to the current series... kinda weird.


----------



## gupsterg

I've only been reading some stuff on the green team the past few days, so I'm not up on it: does the 1080 have the same number of CUDA cores as the 980 Ti?


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> Only been reading some stuff on green team past few days, so not up on it, is 1080 same amount of CUDA cores as 980 Ti?


Not sure yet; I'm guessing slightly fewer, based on the 980 Ti being faster than the 1080 at the same clocks.


----------



## toncij

I seriously doubt the "leaked test" is real. 1860MHz? That is hardly a real stock clock.
Realistic: a 1300MHz or similar stock clock, and 2560 cores (fewer than the 980Ti's 2880 and the TitanX's 3072).


----------



## Alastair

Yeah. Also, 1800MHz on a brand-new process node? Could this just be a really overclocked 980Ti or 980, or something with a spoofed ID?


----------



## SuperZan

Quote:


> Originally Posted by *Alastair*
> 
> Yeah. Also 1800MHz on a brand new process node? Could this just be a really overclocked 980ti or 980 or something with a spoofed ID?


That's my assumption until we see real, verifiable proof. With new tech launches, "guilty until proven innocent" is my motto.


----------



## dagget3450

Yeah, I don't see a stock clock of 1800MHz or whatever. Like someone else said, 1300-ish sounds more like it.


----------



## buildzoid

The old Fermi cards had 1600MHz shaders. If Nvidia built Pascal specifically to run stupidly high clocks, this isn't that surprising.


----------



## fat4l

My source is saying a 30% increase, 1080 vs 980Ti, stock to stock.


----------



## 12Cores

Quote:


> Originally Posted by *dagget3450*
> 
> Yeah, I don't see a stock clock of 1800MHz or whatever. Like someone else said, 1300-ish sounds more like it.


I think the Pascal cards will hit some epic clocks, but I have been running AMD cards for a long time now, and I know that if the Polaris cards can hit 1400/1500 with the updated GCN architecture, even with this tiny die, we will see something really special from AMD. Usually a die shrink results in higher clocks; let's just hope AMD can deliver, because a 1070 at around $300 could spell their doom.


----------



## SuperZan

Quote:


> Originally Posted by *12Cores*
> 
> I think the Pascal cards will hit some epic clocks, but I have been running AMD cards for a long time now, and I know that if the Polaris cards can hit 1400/1500 with the updated GCN architecture, even with this tiny die, we will see something really special from AMD. Usually a die shrink results in higher clocks; let's just hope AMD can deliver, because a 1070 at around $300 could spell their doom.


I'd wager the 1070 will be $400+ USD. I think Polaris is meant to really sock in the low/low-mid range and Vega will shoot for the high-end performance crown. IMO AMD is probably going to leave the mid-high range to Nvidia initially. AMD needs market share ( < $400 ) and mind share ( > $650 ), and they have to compete for both with fewer resources than their competitor. Total speculation, but that's what it seems like after Famous Original Roy's tweets and the few charts/timelines AMD has published.


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> My source is saying 30% increase 1080 vs 980Ti stock to stock.


Your source?

Anyway, GP100 runs at ~1300 as I've said. Why wouldn't they run it at 1860 if Pascal could do that?


----------



## Radox-0

The Tesla GP100 specs listed on Nvidia's devblog show the card to have a boost clock of 1480MHz (https://devblogs.nvidia.com/parallelforall/inside-pascal/). Considering the boost clock is more often than not easily passed in the real world, in some cases by a fair bit, combined with this not being a workstation card, could it not be feasible that's a real figure?

Obviously the memory is different, but I imagine the core would be of similar design in terms of architecture.


----------



## Radox-0

Quote:


> Originally Posted by *toncij*
> 
> Your source?
> 
> Anyway. GP100 runs at 1300 as I've said. Why wouldn't they run it at 1860 if the Pascal could do that?


The quoted boost clock is always a conservative figure for what the card can actually do. Depending on the quality of the card, the actual boost clocks reached can be significantly higher. The base clock for the GP100 is given as 1328MHz and the boost as 1480MHz.

Also, given that it's a workstation card with a bigger chip, a listed boost of 1480MHz (I imagine in real life it would pass 1500MHz no issue) seems fairly reasonable, more so considering that stability rather than absolute speed is critical.

The proof is in the pudding of course, but the fact that a workstation card likely boosts to over 1500MHz suggests to me the smaller consumer cards could hit 1800MHz+. We shall see, I guess.


----------



## toncij

1328 to 1480 is the expected ~11% boost. 1860 would be a ~40% overclock, like 1400 is over the original TitanX's 1000. Currently the fastest 980Ti runs 1266MHz with a 1367 boost clock, which is still only ~8%.
Pascal is on a smaller process, but I seriously doubt they'd pass on a 1675MHz base with a 1860 boost if the silicon could do it. Keep in mind GP100 is the "all in" product; if it were possible, it would be out there. 1675 with a 1860 boost would be ~26% above GP100's clocks, which matches today's maximum gain from stock on the fastest GM200.

Manual overclock under water? Yes, I presume 1860 might be possible. Stock boost? I don't think so.

We'll see.
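The clock ratios being thrown around in this exchange are easy to verify; a quick check (clocks in MHz, taken from the posts above; the `boost_pct` helper is purely illustrative):

```python
def boost_pct(base, boost):
    """Boost clock expressed as a percentage gain over the base clock."""
    return (boost - base) / base * 100

print(f"GP100  1328 -> 1480 MHz: +{boost_pct(1328, 1480):.0f}%")   # ~11%
print(f"TitanX 1000 -> 1400 MHz: +{boost_pct(1000, 1400):.0f}%")   # 40%
print(f"980Ti  1266 -> 1367 MHz: +{boost_pct(1266, 1367):.0f}%")   # ~8%
```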


----------



## Radox-0

Not entirely unexpected. I have gone through 8 Titan X's (not at the same time, I only wish







) and on stock they boosted to 1275-1370 Mhz depending on the quality. In fact I just tested one of mine before writing this and it boosts to 1331 Mhz on stock profile with 0 changes. Point being I guess, what the actual cards reach with boost can be much higher then what's listed on paper.

Take into account we are comparing a Tesla card's paper specs in which case A) paper specs are conservative for boost and B) while Tesla being a large chip is not "all out" in the sense they typically uses lower / conservative clocks for stability as Tesla cards have a different use case rather then extracting the maximum performance all the time.

Anyways, apologies all, took this off track somewhat.

I guess we will see later this evening / weekend where things stand no doubt.

Now I can't wait till we see some details on AMD's cards


----------



## rubenlol2

The Tesla arch (G80, G92, GT200) and whatnot had a shader clock and core clock that you could adjust separately.
I think most if not all modern cards run the shaders at twice the core clock, if I'm not wrong.


----------



## fat4l

Why are you guys talking about Tesla?
Pascal is coming now


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> Why are you guys talking about tesla ?
> Pascal is coming now


You do realize Pascal is the architecture, Tesla is a product of Pascal... etc.?


----------



## fat4l

Quote:


> Originally Posted by *toncij*
> 
> You do realize Pascal is the architecture, Tesla is a product of Pascal... etc.?


Aha this is what it is coming from


----------



## buildzoid

Quote:


> Originally Posted by *rubenlol2*
> 
> The Tesla arch (G80, G92, GT200) and whatnot had a shader clock and core clock that you could adjust separately.
> I think most if not all modern cards run the shaders at twice the core clock, if I'm not wrong.


I'm pretty sure that with Kepler Nvidia went from 1:2 engine-to-shader clocks to 1:1. AFAIK Fermi was 1:2, Kepler is 1:1 and Maxwell is also 1:1.


----------



## dagget3450

In other Fury-related news, I managed to beat my best FSU graphics score on my 5960X with an AMD FX-9590, lol. Sure, the CPU portion was destroyed by Intel, but why a better GPU score, lol. We're talking light-years difference in platforms: DDR4-3200 vs DDR3-1600, PCIe 2.0 x8 vs PCIe 3.0 x8, and lesser GPU clocks to boot as well.
Quote:


> So just a quick run on FSU and a comparison of 5960X vs FX-9590. Why my GPU score is higher on FX I'll never understand. Lower GPU clocks, PCIe 2.0 and all that, and I end up with a higher GPU score than X99/5960X. All while getting murdered in the CPU portion of the benchmark......
> 
> I guess this helps gauge how much they count the CPU side of it in FSU as well; it looks a lot like FS/FSE now to me in terms of point distribution.
> 
> http://www.3dmark.com/compare/fs/7946403/fs/8383738
> 
> The CPUs were clocked almost identically; if you look at the results page you'll see. Also DDR4-3200 vs DDR3-1600 - I haven't done much with the AMD platform yet as I really wanted to see what it will do without much overclocking before going that route. I don't know if I'll invest that time either, as I've got a lot of PC projects waiting on me.
> 
> I know it should get worse on FSE and FS, but still I do not understand.
> 
> On the other hand I've only tested a few benches for quad Fiji on FX, and that was Valley/Heaven, and those have bad results so far. However I'm finding things like Crossfire profiles are not universal across both platforms for Valley/Heaven either. I.e. 1x1 CF works way better on the FX CPU and had very small or not much effect on the 5960X....
> I am going to test out GTA5 I guess. I really don't want to download a million games to test, but I do want to explore this mystery a little more before I finish building my other box


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> In other Fury-related news, I managed to beat my best FSU graphics score on my 5960X with an AMD FX-9590, lol. Sure, the CPU portion was destroyed by Intel, but why a better GPU score, lol. We're talking light-years difference in platforms: DDR4-3200 vs DDR3-1600, PCIe 2.0 x8 vs PCIe 3.0 x8, and lesser GPU clocks to boot as well.


Believe it or not, it's been shown in a few articles that for whatever reason the AMD CPUs do better at 4K resolution (the testing I saw was an 8350 vs a 4930K, I believe).

Not a clue why.... but that is strange.

here's that article:
http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html


----------



## dagget3450

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Believe it or not, it's been shown in a few articles that for whatever reason the AMD CPUs do better at 4K resolution (the testing I saw was an 8350 vs a 4930K, I believe).
> 
> Not a clue why.... but that is strange.
> 
> here's that article:
> http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html


Yeah, but when you take into consideration CPU overhead for AMD vs Nvidia, one would think the AMD GPU would do worse no matter what. We're talking quad Fury X as well (the link you show is 2-way SLI); I don't have quad Nvidia to try, but watching the test it is also way smoother on the AMD 9590. I didn't even do anything with the 990FX platform either, it's all stock. I wonder if it's more to do with timing and CF profiles/optimizations.


----------



## toncij

Quote:


> Originally Posted by *fat4l*
> 
> Aha this is what it is coming from


Ahh, you're talking about the old Tesla arch? Sorry, my bad. I thought you were calling GP100 a Tesla since, well, it is one.

Never mind. So, can anyone point out if there is any real difference between the Nano and Fury X except for the cooling? I'm thinking of getting several Nanos instead of Fury Xs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> Yeah, but when you take into consideration CPU overhead for AMD vs Nvidia, one would think the AMD GPU would do worse no matter what. We're talking quad Fury X as well; I don't have quad Nvidia to try, but watching the test it is also way smoother on the AMD 9590. I didn't even do anything with the 990FX platform either, it's all stock. I wonder if it's more to do with timing and CF profiles/optimizations.


No clue, lol.

But it does just reiterate the point for gamers building systems....

Keep the CPU/mobo cheap (of course without being so cheap it sacrifices performance no matter what), and dump every penny you have into graphics card(s)....


----------



## Radox-0

Quote:


> Originally Posted by *toncij*
> 
> Ahh, you're talking about the old Tesla arch? Sorry, my bad. I thought you were calling GP100 a Tesla since, well, it is one.
> 
> Never mind. So, can anyone point out if there is any real difference between the Nano and Fury X except for the cooling? I'm thinking of getting several Nanos instead of Fury Xs.


One is shorter.

Well, of course on stock profiles the Nano will typically throttle; you can up the power limit and increase the fan speed on the stock cooler and it gets past this no issue. Under water it is a different beast and can be pushed fairly nicely. My current Nano gets to about 1125 MHz, and while it can reach 1150 MHz, it seems that like other Fiji cards I need higher volts to stabilise at that figure, which results in a real-world loss of performance even if the core clock is higher. I did not have my Fury X for long, but from my results and looking around, the Fury X can stabilise at higher overclocks, I imagine in part thanks to the beefier power delivery system.

If you plan to keep the stock coolers on then go Fury X, I say. If under water, the Nano being considerably cheaper while offering most of the performance when both are OC'd would be a nice choice if you're not after absolute max performance. Being perfectly honest though, I say wait for these new cards


----------



## Alastair

So I updated my Furys with the updated BIOS from AMD's website. However I got an SSID mismatch. But I forced the update anyway and it seems to have worked. Anything I should be worried about? A Fury X BIOS can work on a reference Fury, right?


----------



## toncij

Quote:


> Originally Posted by *Radox-0*
> 
> One is shorter.
> 
> Well, of course on stock profiles the Nano will typically throttle; you can up the power limit and increase the fan speed on the stock cooler and it gets past this no issue. Under water it is a different beast and can be pushed fairly nicely. My current Nano gets to about 1125 MHz, and while it can reach 1150 MHz, it seems that like other Fiji cards I need higher volts to stabilise at that figure, which results in a real-world loss of performance even if the core clock is higher. I did not have my Fury X for long, but from my results and looking around, the Fury X can stabilise at higher overclocks, I imagine in part thanks to the beefier power delivery system.
> 
> If you plan to keep the stock coolers on then go Fury X, I say. If under water, the Nano being considerably cheaper while offering most of the performance when both are OC'd would be a nice choice if you're not after absolute max performance. Being perfectly honest though, I say wait for these new cards


Well, I know the technical differences, but wasn't really sure about the Nano BIOS and the power thing in a real situation. Planning on an EKWB solution.


----------



## djsatane

Quote:


> Originally Posted by *Alastair*
> 
> So I updated my Furys with the updated BIOS from AMD's website. However I got an SSID mismatch. But I forced the update anyway and it seems to have worked. Anything I should be worried about? A Fury X BIOS can work on a reference Fury, right?


I have an Asus Fury X and I had the same issue, but it's just a subsystem ID mismatch; it doesn't seem to cause any issues for operation. But did you say you used a Fury X BIOS on a Fury (non-X)?


----------



## Flamingo

Is now a good time to sell the R9 Nano and grab a 1070/1080 later? Or wait for the reviews? With Nvidia's move to 8GB, they will surely push devs to use as much VRAM as possible now to keep AMD's Furys at a disadvantage (for now, until AMD's offerings release).


----------



## SuperZan

Quote:


> Originally Posted by *Flamingo*
> 
> Is now a good time to sell the R9 Nano and grab a 1070/1080 later? Or wait for the reviews? With Nvidia's move to 8GB, they will surely push devs to use as much VRAM as possible now to keep AMD's Furys at a disadvantage (for now, until AMD's offerings release).


I'm riding out Fiji until Vega / big Pascal. If you're determined to sell, do it sooner rather than later, but all we have is Nvidia's info just yet, so you can be sure the gains they showed are their best-case picks. The 1070/1080 are still strong cards, but Vega / big Pascal is going to be the first big-big jump of 14/16nm.


----------



## pdasterly

Damn, GTX 1080. Should I still get the Radeon Pro Duo?


----------



## Alastair

Quote:


> Originally Posted by *djsatane*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> So I updated my Furys with the updated BIOS from AMD's website. However I got an SSID mismatch. But I forced the update anyway and it seems to have worked. Anything I should be worried about? A Fury X BIOS can work on a reference Fury, right?
> 
> 
> 
> I have Asus fury x and I had same issue but its just subsystem id mismatch, it doesnt seem to cause any issues for operation, but did you say you used furyx bios on a fury(non x)?
Click to expand...

Yes, I am using AMD's own updated UEFI-support Fury X BIOS on my Sapphire R9 Fury Tri-Xs. Seems that Fiji Pro did not get any BIOS-updating love.


----------



## pdasterly

There is a utility to show if your Fury has permanently locked cores. My Fury is unable to unlock additional cores.


----------



## Alastair

Quote:


> Originally Posted by *pdasterly*
> 
> There is a utility to show if your Fury has permanently locked cores. My Fury is unable to unlock additional cores.


I'm not talking about unlocking cores.

In other news, however: when attempting to make a .4 low and a .4 high BIOS to unlock my Furys, ATOMTOOL keeps telling me that the BIOS is invalid or unsupported or something?


----------



## gupsterg

@toncij

If you don't need SFF I'd go Fury X; besides thermals throttling the Nano, it's PowerLimit/PowerTune due to the cut-back VRM, plus the 1x 8-pin PCI-E connector on the Nano vs 2 on the Fury X.



Spoiler: Nano









Spoiler: Fury X







Grab a Nano/Fury X ROM from TPU Vbios DB, open it in Fiji bios editor and you will see in Limits section differing PowerTune aspects.

If I compare the best promo prices on HUKD for the Nano at ~£350 vs Fury X ~£400, it's a no-brainer IMO to go with the Fury X, use the AIO, plus better VRM.

For the 1135/535 @ 1.243V (stock 1.212V) I use 350W / 325A / 350W PowerLimit in the PowerTune section of PowerPlay in the ROM; view A / W for the GPU in the ~18hr Folding@home run below.



Spoiler: Fury X OC Folding@home ~18hr







IIRC on the high-leakage ASIC, 1130/545 @ 1.225V (stock 1.187V) was ~8% higher on A / W than the above sample.

@Alastair

Flashing a Fury X ROM to a Fury, you may be overvolting the GPU without even knowing it; in the unlock thread view post 1055.



Spoiler: Example 1



Let's say the stock Fury ROM calculates DPM 7 at 1.187V with a 1000MHz GPU clock; you flash to Fury X where DPM 7 is set as 1050MHz, and the ROM may decide to set the VID at 1.250V (+63mV).





Spoiler: Example 2



Let's say the stock Fury ROM calculates DPM 7 at 1.187V with a 1000MHz GPU clock; you mod that ROM so DPM 7 is set as 1050MHz, and the ROM may decide to set the VID at 1.250V (+63mV).



The only time your VID will stay the same as stock when the GPU clock is increased in the ROM is when it is fixed manually. VID per DPM is based on the default GPU clock per DPM + LeakageID + other GPU properties.

Regarding the issue with AtomTool: is the filename for the ROM in 8.3 format? Are you running the command prompt as administrator?


----------



## 0x00000000

Which would net me more performance underwater and overclocked? The powercolor r9 nano or the asus gtx980 strix?
I have 3 of each with waterblocks but I need to choose which to put into my main rig.


----------



## SuperZan

Nano's underwater.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> @toncij
> 
> If you don't need SFF I'd go Fury X; besides thermals throttling the Nano, it's PowerLimit/PowerTune due to the cut-back VRM, plus the 1x 8-pin PCI-E connector on the Nano vs 2 on the Fury X.
> 
> 
> 
> Spoiler: Nano
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Fury X
> 
> 
> 
> 
> 
> 
> 
> Grab a Nano/Fury X ROM from TPU Vbios DB, open it in Fiji bios editor and you will see in Limits section differing PowerTune aspects.
> 
> If I compare the best promo prices on HUKD for the Nano at ~£350 vs Fury X ~£400, it's a no-brainer IMO to go with the Fury X, use the AIO, plus better VRM.
> 
> For the 1135/535 @ 1.243V (stock 1.212V) I use 350W / 325A / 350W PowerLimit in PowerTune section of PowerPlay in ROM, view A / W for GPU in below ~18hr [email protected] run.
> 
> 
> 
> Spoiler: Fury X OC [email protected] ~18hr
> 
> 
> 
> 
> 
> 
> 
> IIRC on the high-leakage ASIC, 1130/545 @ 1.225V (stock 1.187V) was ~8% higher on A / W than the above sample.


Got any 3DMarks for that?


----------



## gupsterg

- 3DM FS GT1 looped (used to use the demo, but the new UI doesn't allow looping that; demo = higher power usage).
- Driver at defaults, no tweaks at all (same with the results link).
- OC via ROM.

Fury X (No 2) 1135/535 1.243/1.300 (stock 1.212/1.300)



Fury X (No 3) 1130/545 1.225/1.300 (stock 1.187/1.300)



http://www.3dmark.com/compare/fs/8247534/fs/8326238

The difference in 3DM score is IMO within run-to-run variance, so I'd say 1135/535 is pretty much the same performance as 1130/545. Seeing ~5% A / ~3.5% W more on the higher-leakage ASIC (No 3). If Fury X (No 3) were at 1.243V like Fury X (No 2), I reckon it would surpass it on A / W by a fair bit more.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> @toncij
> 
> If you don't need SFF I'd go Fury X; besides thermals throttling the Nano, it's PowerLimit/PowerTune due to the cut-back VRM, plus the 1x 8-pin PCI-E connector on the Nano vs 2 on the Fury X.
> 
> 
> 
> Spoiler: Nano
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Fury X
> 
> 
> 
> 
> 
> 
> 
> Grab a Nano/Fury X ROM from TPU Vbios DB, open it in Fiji bios editor and you will see in Limits section differing PowerTune aspects.
> 
> If I compare the best promo prices on HUKD for the Nano at ~£350 vs Fury X ~£400, it's a no-brainer IMO to go with the Fury X, use the AIO, plus better VRM.
> 
> For the 1135/535 @ 1.243V (stock 1.212V) I use 350W / 325A / 350W PowerLimit in PowerTune section of PowerPlay in ROM, view A / W for GPU in below ~18hr [email protected] run.
> 
> 
> 
> Spoiler: Fury X OC [email protected] ~18hr
> 
> 
> 
> 
> 
> 
> 
> IIRC on the high-leakage ASIC, 1130/545 @ 1.225V (stock 1.187V) was ~8% higher on A / W than the above sample.
> 
> @Alastair
> 
> Flashing a Fury X ROM to a Fury, you may be overvolting the GPU without even knowing it; in the unlock thread view post 1055.
> 
> 
> 
> Spoiler: Example 1
> 
> 
> 
> Let's say the stock Fury ROM calculates DPM 7 at 1.187V with a 1000MHz GPU clock; you flash to Fury X where DPM 7 is set as 1050MHz, and the ROM may decide to set the VID at 1.250V (+63mV).
> 
> 
> 
> 
> 
> Spoiler: Example 2
> 
> 
> 
> Let's say the stock Fury ROM calculates DPM 7 at 1.187V with a 1000MHz GPU clock; you mod that ROM so DPM 7 is set as 1050MHz, and the ROM may decide to set the VID at 1.250V (+63mV).
> 
> 
> 
> The only time your VID will stay the same as stock when the GPU clock is increased in the ROM is when it is fixed manually. VID per DPM is based on the default GPU clock per DPM + LeakageID + other GPU properties.
> 
> Regarding the issue with AtomTool: is the filename for the ROM in 8.3 format? Are you running the command prompt as administrator?


Running the command prompt as administrator, yes. As for the ROM 8.3 format, I have no clue.


----------



## gupsterg

If you google the term *8.3 filename* you will find info on it.

Basically the ROM filename needs to be 8 characters (max) before the "." (period) and 3 characters after it; if it does not conform to the convention, AtomTool throws a wobbly.
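If you want to check a ROM filename before feeding it to AtomTool, the rule is easy to script. A quick sketch (`is_8_3` is just an illustrative helper of my own, not part of AtomTool, and it assumes plain alphanumerics/underscores, which covers typical ROM names):

```python
import re

# DOS-style 8.3 check as described above:
# up to 8 name characters, a period, then exactly 3 extension characters.
def is_8_3(filename: str) -> bool:
    return re.fullmatch(r"[A-Za-z0-9_]{1,8}\.[A-Za-z0-9]{3}", filename) is not None

print(is_8_3("FURYX107.rom"))        # True  - 8 + 3, should be accepted
print(is_8_3("FuryX_unlocked.rom"))  # False - name part is longer than 8 chars
```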

In the recent PM I've also asked: are you trying to get AtomTool to unlock a Fury X ROM?


----------



## Medusa666

Guys, how future-proof would a Radeon Pro Duo be now that Polaris is coming?

Nvidia is not an option.

Thanks for any replies!


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> If you google the term *8.3 filename* you will find info on it.
> 
> Basically the ROM filename needs to be 8 characters (max) before the "." (period) and 3 characters after it; if it does not conform to the convention, AtomTool throws a wobbly.
> 
> In the recent PM I've also asked: are you trying to get AtomTool to unlock a Fury X ROM?


Still says incompatible BIOS image.

EDIT: And you may be right that my cards are now slightly overvolted with this new BIOS. I had VIDs of 1.196 and 1.2 for my cards; now I get 1.228V on both my cards according to GPU-Z. But I can do 1100MHz without adding voltage to my cards, so I guess that is a plus!


----------



## Gdourado

I sold my 390X for a good price to take advantage of a price drop and get an XFX Fury Triple Dissipation.
Now with the 1080 and 1070 announcement, I don't know if I should pull the trigger on the Fury or wait for the Nvidia cards.
How do you think the Fury will stack up against Nvidia's new models?
Is it a good buy? Or should I wait?
Anyway, I am out of a GPU since my 390X already has a new home...


----------



## toncij

Do people sell hw here or locally?


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> Still says incompatible BIOS image.


I have asked twice, once in a post and once in PM: are you trying to unlock the Fury X ROM? If *YES*, this will not work for you; the unlock tool only works on the Fury ROM, *not* the Fury X.



Quote:


> Originally Posted by *Alastair*
> 
> EDIT: And you may be right that my cards are now slightly overvolted with this new BIOS. I had VIDs of 1.196 and 1.2 for my cards; now I get 1.228V on both my cards according to GPU-Z. But I can do 1100MHz without adding voltage to my cards, so I guess that is a plus!


GPU-Z shows VDDC, not VID.

VID is what the GPU is set to.

VDDC is what the GPU is at in realtime; this varies due to:-

i) Load line calibration / effect

ii) Depending on which app is used to load the GPU, for like clocks/settings/DPM state you will see different VDDC; this is PowerTune/variable voltage at work.

*The* only way to know VID currently is a registers dump via AIDA64.

You may not be adding voltage, but it's already added; you can already see VDDC is raised, therefore VID is also raised.

Do an AIDA64 registers dump on the stock ROM and then on the Fury X ROM and you should see VID is raised = more VDDC.


----------



## Flamingo

Quote:


> Originally Posted by *Gdourado*
> 
> I sold my 390X for a good price to take advantage of a price drop and get an XFX Fury Triple Dissipation.
> Now with the 1080 and 1070 announcement, I don't know if I should pull the trigger on the Fury or wait for the Nvidia cards.
> How do you think the Fury will stack up against Nvidia's new models?
> Is it a good buy? Or should I wait?
> Anyway, I am out of a GPU since my 390X already has a new home...


Well, since you made the move already, wait for reviews and AMD announcements. Price drops for the Fury, or a 1070.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Still says incompatible BIOS image.
> 
> 
> 
> I have asked twice, once in a post and once in PM: are you trying to unlock the Fury X ROM? If *YES*, this will not work for you; the unlock tool only works on the Fury ROM, *not* the Fury X.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> EDIT: And you may be right that my cards are now slightly overvolted with this new BIOS. I had VIDs of 1.196 and 1.2 for my cards; now I get 1.228V on both my cards according to GPU-Z. But I can do 1100MHz without adding voltage to my cards, so I guess that is a plus!
> Click to expand...
> 
> GPU-Z shows VDDC, not VID.
> 
> VID is what the GPU is set to.
> 
> VDDC is what the GPU is at in realtime; this varies due to:-
> 
> i) Load line calibration / effect
> 
> ii) Depending on which app is used to load the GPU, for like clocks/settings/DPM state you will see different VDDC; this is PowerTune/variable voltage at work.
> 
> *The* only way to know VID currently is a registers dump via AIDA64.
> 
> You may not be adding voltage, but it's already added; you can already see VDDC is raised, therefore VID is also raised.
> 
> Do an AIDA64 registers dump on the stock ROM and then on the Fury X ROM and you should see VID is raised = more VDDC.
Click to expand...

Oh pooh, I didn't realise. OK, yes, I have been trying to use AMD's updated Fury X ROM to try to make unlock ROMs for my Fury.


----------



## baii

So the new Fury X ROM will put the unlockable cores on a Fury to locked, instead of using them?


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> Oh pooh, I didn't realise. OK, yes, I have been trying to use AMD's updated Fury X ROM to try to make unlock ROMs for my Fury.


OK.

I think I can make the AMD Fury X 107 ROM as a Fury ROM, which should then possibly be configurable to xxxx SP via manual mod or AtomTool. @Sgt Bilko was interested in it; due to lack of time on my part I have yet to make it and pass it to him. Am I correct in thinking you would be interested in it as well?


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Oh pooh, I didn't realise. OK, yes, I have been trying to use AMD's updated Fury X ROM to try to make unlock ROMs for my Fury.
> 
> 
> 
> OK.
> 
> I think I can make the AMD Fury X 107 ROM as a Fury ROM, which should then possibly be configurable to xxxx SP via manual mod or AtomTool. @Sgt Bilko was interested in it; due to lack of time on my part I have yet to make it and pass it to him. Am I correct in thinking you would be interested in it as well?
Click to expand...

Yes, I would be. I'll also give dumps of my old stock BIOS and this BIOS for the voltage mod as well.

This is a lot of work on your part. +1


----------



## gupsterg

No worries.

Yes, I would need the AIDA64 dump when you have the stock factory ROM @ stock settings. I don't need the dump for when you have the Fury X ROM flashed, but if you get the chance I'd be interested to see how much the VID was raised on your card.


----------



## Agent Smith1984

You guys happen to catch the specs on that new 1080? WOW......

I'm not interested in that card @ $600 though.... but what I am interested in seeing is the performance of the 1070 with 8GB (cough** 7.5GB) of VRAM, lol.

If it's 2048-2300+ CUDA cores and clocking in the 1800MHz+ range, then it should be around 980 Ti performance......

If it's just faster than Fury but with all that VRAM, it will be exactly what I am looking for at $379..... oh, I need AMD to counter quickly, cause the 390X is sold and I'm not wanting the main rig to be down for more than a month or so.

Hopefully Polaris 10 brings some Fury X+ performance at $349 with 8GB


----------



## Flamingo

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hopefully Polaris 10 brings some Fury X + performance at $349 with 8GB


The 490 will probably be the next 390. I want an 8GB card too, with Fury-level performance and a 300ish price. But then it won't be the same length as a Nano, because GDDR5X.

Need small form factor for my system.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Flamingo*
> 
> The 490 will probably be the next 390. I want an 8GB card too, with Fury-level performance and a 300ish price. But then it won't be the same length as a Nano, because GDDR5X.
> 
> Need small form factor for my system.


Yeah, seems like the Nano was a one-shot deal, and we may not get anything else like it, but who knows?


----------



## gupsterg

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hopefully Polaris 10 brings some Fury X + performance


Doesn't look like it, if this is legit and if it's 10 XT: link. In each table, if you click each driver name it takes you to the online bench.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> Doesn't look like it, if this is legit and if it's 10 XT: link. In each table, if you click each driver name it takes you to the online bench.


So I think I am seeing the 1080, the 1070 and the 1060 on there???

Polaris looks very unimpressive if that's their top model.

I am guessing we will see 390 performance at around $230?? meh...

Either way, the Fury still looks to hold its ground in 4K....

NVIDIA and their damn gimped memory bandwidth, I swear.

If they'd just give people a 512-bit bus, they could keep using GDDR5 for 3 more years and rely on their gimpwerks and dev payoffs to do the rest with CUDA cores...

Mind you, this is a title that tends to favor AMD because of the DX12, and this is showing us that again NVIDIA failed to bring their "A" sync game.

I'd be lying if I said I was not VERY intrigued by the 1070 though; at $379 it could be the best value for the next year or two.


----------



## spyshagg

All these arguments were healthy a generation ago, but today having to choose based on the monitor you have is the biggest catastrophe in PC gaming history.

I would have to be very hard-pressed to choose Nvidia considering their role in the issue, despite all the merits I could attribute to their GPU offerings.


----------



## Agent Smith1984

Quote:


> Originally Posted by *spyshagg*
> 
> All these arguments were healthy a generation ago, but today having to choose based on the monitor you have is the biggest catastrophe in PC gaming history.
> 
> I would have to be very hard-pressed to choose Nvidia considering their role in the issue, despite all the merits I could attribute to their GPU offerings.


Are you referring to FreeSync and G-Sync?

I think if you use either of those, then you have already signed up to be stuck with one brand or the other anyway......

I'm almost positive I am getting a 1070 at this point. Titan X performance for $380, and the damn thing will overclock to over 2GHz.... that sounds like fun to me.

I'll take a look back at AMD once Vega drops. I show no loyalty when it comes to GPUs, though I prefer to support AMD when it makes sense.


----------



## gupsterg

Again, if legit, I'm sorta not surprised at Polaris 11 / 10 performance; I did expect a bit more though.

AMD did state it's aiming for a differing segment with these cards compared with nVidia's offering. It would also then fit the bill of how Fiji is still going till Vega, plus releasing the RPD.

I'll be honest, nVidia seem to be going for the $$$ on the FE cards, and for those that think it will be better binned, here is the answer: link. On hexus.net it has also been confirmed in a GTX 1080 thread by one of their reviewers/site team members that FE = no better binning/components.

If I even take the GTX 1080 non-FE @ $599 or GTX 1070 non-FE @ $379, which in the UK are probably gonna be £599 / £379, I reckon I'm set, having got the Fury X for cheaper.

I think I'm gonna stick to the Fury X @ 1080P, and when Vega comes I'll jump to 1440P.


----------



## SuperZan

Quote:


> Originally Posted by *gupsterg*
> 
> Again, if legit, I'm sorta not surprised at Polaris 11 / 10 performance; I did expect a bit more though.
> 
> AMD did state it's aiming for a differing segment with these cards compared with nVidia's offering. It would also then fit the bill of how Fiji is still going till Vega, plus releasing the RPD.
> 
> I'll be honest, nVidia seem to be going for the $$$ on the FE cards, and for those that think it will be better binned, here is the answer: link. On hexus.net it has also been confirmed in a GTX 1080 thread by one of their reviewers/site team members that FE = no better binning/components.
> 
> If I even take the GTX 1080 non-FE @ $599 or GTX 1070 non-FE @ $379, which in the UK are probably gonna be £599 / £379, I reckon I'm set, having got the Fury X for cheaper.


Got my Furies for a little over £600, I can't complain. FE just isn't compelling; NV is making it easy to wait on Vega / big Pascal if you're already running well at your resolution.


----------



## Agent Smith1984

Yeah, but how can AMD let that ~$350 market get that far away from them??

I understand that the majority of folks will buy $200 cards, and I will certainly never buy $600+ cards, but I am often in that $350-400 range, and NVIDIA is basically going to dominate that market unless Furys come way down in price.


----------



## Gdourado

Quote:


> Originally Posted by *gupsterg*
> 
> Again, if legit, I'm sorta not surprised at Polaris 11 / 10 performance; I did expect a bit more though.
> 
> AMD did state it's aiming for a differing segment with these cards compared with nVidia's offering. It would also then fit the bill of how Fiji is still going till Vega, plus releasing the RPD.
> 
> I'll be honest, nVidia seem to be going for the $$$ on the FE cards, and for those that think it will be better binned, here is the answer: link. On hexus.net it has also been confirmed in a GTX 1080 thread by one of their reviewers/site team members that FE = no better binning/components.
> 
> If I even take the GTX 1080 non-FE @ $599 or GTX 1070 non-FE @ $379, which in the UK are probably gonna be £599 / £379, I reckon I'm set, having got the Fury X for cheaper.
> 
> I think I'm gonna stick to the Fury X @ 1080P, and when Vega comes I'll jump to 1440P.


Right.
I am European, so the 1070 might be 450 euros.
450 is 100 euros more than an air-cooled Fury currently.
Not to mention having to sell my FreeSync monitor at a loss and spend more on a G-Sync one.

My thought:
get a Fury now for 350, use it until late 2016 or Q1 2017, and then upgrade either to big Pascal or big Polaris.

Is this a good idea? Or is buying a Fury now just stupid?


----------



## Blotto80

I bought a Fury a few months ago for $680 CDN and just traded it for a Fury X last week. I'm set until Vega. I'm in the boat where, after dropping the cash on a 1440p FreeSync display, I won't consider nVidia until they change their tune and support the open standard.


----------



## gupsterg

@subscribers.

Some sites already have GTX review cards; hardwareluxx does, see the translated link (I didn't see this article on their English site). The first paragraph's last sentence, and also the last section of the article, have some info on the Ashes benches leak.

@Gdourado

Don't base everything regarding AMD or nVidia on these current leaks.

There may be a slight price shift on Fury, so I'd say hold off on Fury yet and see what occurs. I went Fiji in March '16 as I was in a position where I could sell my Hawaii cards for no loss and buy Fiji cheap, so a win-win situation.

I can't ever see myself going nVidia now or in future TBH, and paying for G-Sync?


----------



## Agent Smith1984

I'm on a 4K TV, so my monitor has no impact on my GPU selection... I gotta admit though, I am really surprised to see the number of people who have gone with these proprietary monitors. I figured plain ol' 1440p 120+Hz would be the standard moving forward. Tying monitors to GPUs just seems like a terrible idea to me...


----------



## Blotto80

Personally, I don't view the FreeSync monitors as proprietary. I've purchased the open standard and will no longer purchase from GPU vendors that insist on exclusively supporting their closed, more expensive method of obtaining the same results. The difference in smoothness that adaptive sync makes is more staggering than you might think. I'm getting anywhere between 37-60 FPS in The Witcher 3 with everything maxed; with FreeSync on it's smooth as butter, off and the slowdowns are very noticeable.


----------



## gupsterg

@Agent Smith1984

I did for a while use a 1080P plasma for gaming; some games made my eyes go funny, so I went back to a monitor.

I played the QA lottery with an Eizo FG2421; the 3rd time I got a decent screen. I wouldn't mind going to 1440P now, but I can't see anything on the market floating my boat as the Eizo did (ie VA / 120Hz / Turbo 240).


----------



## illies100

Hi,

I'm an R9 Fury Tri-X owner; I'm posting my result in Ashes of the Singularity in DX12 mode / Crazy preset. Sorry for my bad English:

Capture.PNG 583k .PNG file


----------



## gupsterg

Any chance of system specs? Clocks?


----------



## Thoth420

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You guys happen to catch the specs on that new 1080? WOW......
> 
> I'm not interested in that card @ $600 though... but what I am interested in seeing is the performance of the 1070 with 8GB (cough** 7.5GB) of VRAM, lol
> 
> If it's 2048-2300+ CUDA cores and clocking in the 1800MHz+ range, then it should be around 980 Ti performance...
> 
> If it's just faster than Fury but with all that VRAM, it will be exactly what I am looking for at $379... oh, I need AMD to counter quickly, cause the 390X is sold and I'm not wanting the main rig to be down for more than a month or so.
> 
> Hopefully Polaris 10 brings some Fury X+ performance at $349 with 8GB


Is Polaris 10 air cooled? Sorry, I haven't been following GPUs lately since I have been having serious motherboard issues (a 7-month odyssey that began with suspecting the ASUS board from the gate, and after tons of their finger pointing at basically everyone: EK, Windows 10, Realtek, RAM on the QVL set to 2133 manually at stock..., AMD (who replaced a probably totally fine Fury X...), and of course me). I refuse to go back to Nvidia, but with new cards on the horizon, planning on going back to a much more conservative air-cooled build with maybe just an EK Predator for the CPU cooler, and my current Fury X being fully blocked, I need to start thinking of a single-GPU sidegrade (or perhaps upgrade) at the least that is quiet and air cooled, and I don't want to give Nvidia my money.

I am not against a Fury (non-X) as long as it isn't one of those factory OC'd cards like my crazy unstable Lightning 7970 or even a 390X... as long as she is quiet, with a great cooler, and stable (even just at stock clocks).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Thoth420*
> 
> Is Polaris 10 air cooled? Sorry, I haven't been following GPUs lately since I have been having serious motherboard issues (a 7-month odyssey that began with suspecting the ASUS board from the gate, and after tons of their finger pointing at basically everyone: EK, Windows 10, Realtek, RAM on the QVL set to 2133 manually at stock..., AMD (who replaced a probably totally fine Fury X...), and of course me). I refuse to go back to Nvidia, but with new cards on the horizon, planning on going back to a much more conservative air-cooled build with maybe just an EK Predator for the CPU cooler, and my current Fury X being fully blocked, I need to start thinking of a single-GPU sidegrade (or perhaps upgrade) at the least that is quiet and air cooled, and I don't want to give Nvidia my money.
> 
> I am not against a Fury (non-X) as long as it isn't one of those factory OC'd cards like my crazy unstable Lightning 7970 or even a 390X... as long as she is quiet, with a great cooler, and stable (even just at stock clocks).


Polaris will likely be air cooled... it's going to be such a low TDP that water won't even help it much, I don't think. I just don't expect it to be a very powerful card; benchmark leaks suggest as much...

I would expect the GTX 1070 ($379) to go head to head with the Fury X, at least at 1080 and 1440... not sure on 4K yet, since it will be using GDDR5 on a 256-bit bus at 8GHz.

I'll know when I get one though!! lol


----------



## Medusa666

Given the supposed performance of Polaris, for how long will Radeon Pro Duo be a good card and deliver cutting edge performance?


----------



## pdasterly

Quote:


> Originally Posted by *Medusa666*
> 
> Given the supposed performance of Polaris, for how long will Radeon Pro Duo be a good card and deliver cutting edge performance?


The 295X2 is the 2nd fastest card in the world.


----------



## shadowxaero

So the 1080 Ashes benchmark at 1440p Crazy settings scores 4900 at 49.6 FPS

and...

My Fury (non-X) scores 4700 at 48.1 FPS at the same settings >.>... not impressed.


----------



## toncij

Quote:


> Originally Posted by *shadowxaero*
> 
> So 1080 Ashes benchmark at 1440p Crazy settings scores a 4900 at 49.6FPS
> 
> and...
> 
> My Fury (non x) scores a 4700 at 48.1FPS same settings >.>.......not impressed.


Funny results. Well, async shaders skew the picture a bit. We need 3DMark scores.


----------



## Kana-Maru

Quote:


> Originally Posted by *shadowxaero*
> 
> So 1080 Ashes benchmark at 1440p Crazy settings scores a 4900 at 49.6FPS
> 
> and...
> 
> My Fury (non x) scores a 4700 at 48.1FPS same settings >.>.......not impressed.


Even with the CPU difference [quad-core vs hexa-core], it appears the GTX 1080 is only 2.8% faster than your Fury [non-X], and even that number is a stretch. Was your Fury stock? I also see that you were running "*Ultra*" shadow quality while the GTX 1080 was only running "*High*" shadow quality, so that 2.8% difference could be even lower. The other issue is that the two results are from different versions of the game.

It appears Nvidia's fix for lacking async is to release their reference cards with massive core clocks. Well, I guess those drivers could still be on the way. It's been, what, 8 or 9 months now?


----------



## toncij

Your assumption is correct. Nvidia is battling AMD's advantage in async by using raw power. The 1080 is barely any faster than the Fury X in Ashes. So, if you're planning on playing DX12 games, the Fury X is still pretty much the king.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> Your assumption is correct. Nvidia is battling AMD's advantage in async by using raw power. The 1080 is barely any faster than the Fury X in Ashes. So, if you're planning on playing DX12 games, the Fury X is still pretty much the king.


Well, I come to that conclusion because I find it funny that none of the Nvidia PR people who are uploading these results ahead of launch are running or showing DX11 results. People are only posting DX12 results and saying things like _"YEAH! SEE Nvidia #1"_, and I'm looking at them like, "really?" The GTX 1080 has a 60% reference clock advantage over the stock Fury X and still can't manage to beat it by more than 7% in DX12, based on the results you are showing me. Some people are in denial at this point, but the Nvidia hype is not fazing me at all. So yeah, it's looking like Nvidia's answer is basically ramping up the reference clocks and allowing the AIBs to overclock them even more.


----------



## shadowxaero

Quote:


> Originally Posted by *Kana-Maru*
> 
> Even with the CPU difference [Quad Core vs Hexa Core] it appears the GTX 1080 is only 2.8% faster than your Fury [non X] and even that number is a stretch. Was your Fury stock? I also see that you were running "*Ultra*-Shadow Quality" and the GTX 1080 only running "*High*-Shadow Quality". So that 2.8% difference could be even lower. The other issue is that both results are using different versions of the game.
> 
> It appears Nvidia fix for lacking async is to release their reference with massive core clocks. Well I guess those drivers could still be on the way. It's been what 8 or 9 months now?


My Fury was clocked at 1150 during the benchmark, though I didn't even know the quality settings were different, lol. And the version on Steam hasn't updated to the 1.10 build yet for some reason; I am reinstalling it now.


----------



## Kana-Maru

Quote:


> Originally Posted by *shadowxaero*
> 
> My Fury was clocked at 1150 during the benchmark, though I didn't even know the quality settings were different, lol. And the version on Steam hasn't updated to the 1.10 build yet for some reason; I am reinstalling it now.


Well, the GTX 1080 was running the more updated version, so I expected better results, but that wasn't the case. Based on your core clock, the GTX 1080's reference boost is a whopping ~51% higher than your Fury's. If Vulkan and DX12 really kick into gear as expected, AMD has probably made the smartest choice by letting the Fury and Fury X remain its high-end cards until next year. There are plenty of DX12 games releasing and in development; EA and Crytek have already implemented DX12. It's going to be widely adopted, and I don't think ramping up the core clocks is the best approach, but fanboys might think differently from me.
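The clock math in these posts is easy to sanity-check; a quick sketch (the 1733 MHz GTX 1080 reference boost is my assumption, the 1150 MHz is shadowxaero's stated Fury clock):

```python
# Percent clock advantage: how much higher clock A is than clock B.
def pct_higher(clock_a_mhz: float, clock_b_mhz: float) -> float:
    return (clock_a_mhz / clock_b_mhz - 1.0) * 100.0

# Assumed clocks: GTX 1080 reference boost ~1733 MHz vs the 1150 MHz Fury above.
delta = pct_higher(1733, 1150)
print(f"GTX 1080 boost is {delta:.1f}% higher than the 1150 MHz Fury")  # ~50.7%
```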


----------



## toncij

These are my results with a TitanX SC and stock (last two are stock) http://www.ashesofthesingularity.com/metaverse#/personas/a907c72d-76c7-4311-8a2f-2fe75ce6e107?ladderId=x
This is Extreme 1440 preset http://www.ashesofthesingularity.com/metaverse#/personas/a907c72d-76c7-4311-8a2f-2fe75ce6e107/match-details/9a13aa65-6036-44f9-b03e-06f9bbdb7bb9
This is Crazy 4K http://www.ashesofthesingularity.com/metaverse#/personas/a907c72d-76c7-4311-8a2f-2fe75ce6e107/match-details/23c4b6c6-4f3c-46d4-b903-72311f4509f6
This is Crazy 5K http://www.ashesofthesingularity.com/metaverse#/personas/a907c72d-76c7-4311-8a2f-2fe75ce6e107/match-details/1b3808f8-ed13-4776-a9f7-066f7bab4567
This is Crazy 1440 http://www.ashesofthesingularity.com/metaverse#/personas/a907c72d-76c7-4311-8a2f-2fe75ce6e107/match-details/3794155d-559a-48a2-996c-f01a1b5f2732

I'd love to see yours at this setup (menu set to Crazy and Extreme).


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> I'd love to see yours at this setup (menu set to Crazy and Extreme).


I don't have the game yet. I finally got around to finishing Tomb Raider and now I'm still playing Hitman Episode 1. I have purchased Episode 2, but I'm not done with unlocking and doing everything in the first mission.

I see that AotS is on sale and I thought about picking it up, but I don't want to spend money unless I know I'm going to play it. Otherwise I'd be buying it just to run benchmarks.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> I don't have the game yet. I finally got around to finishing Tomb Raider and now I'm still playing Hitman Episode 1. I have purchased Episode 2, but I'm not done with unlocking and doing everything in the first mission.
> 
> I see that AotS is on sale and I thought about picking up, but I don't want spend money unless I know I'm going to play it. Otherwise I'll be buying it just to run benchmarks.


Well, AotS is bought for benchmarks. Play it or not, it doesn't matter.

Maybe *shadowxaero* can post his.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> Well, AotS is bought for benchmarks. Play it or not, doesn't matter.


No, just no. The same thing was said about every Crysis game to date: although they can stress the heck out of any GPU, I enjoyed playing those games. AotS just happens to be the only proper DX12 game on the market, which might make some people think its only purpose is benchmarking. The game actually looks fun, and I only like specific RTS games. I do plan on picking it up; I'm just hoping the websites give the Fury & Fury X a fair shake when it comes to benchmarking against the hype-train GTX 1080.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> No just no. The same thing was said about every Crysis game to date. Although they can stress the heck out of any GPU, I enjoyed playing those games. AotS just happens to be the only proper DX12 game on the market which might make some people think that it's only purpose is benchmarking. The game actually looks fun and only like specific RTS games. I do plan on picking it up, I'm just hoping the websites give the Fury & Fury X a fair share when it comes to benchmarking it against the hype train GTX 1080.


You got that wrong; I like the game, I play it, but I wanted to say: even if you don't like it, you can get it for benchmarking. The game has a great benchmarking suite and could easily be sold as one.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> You got that wrong; I like the game, I play it, but I wanted to say: even if you don't like it, you can get it for benchmarking - the game has a great benchmarking suite and it can be sold as one easily.


Oh well, that's another way to think of it. I'd be looking to play it though, lol. With Futuremark dragging their feet on their DX12 implementation, I see where you are coming from; I don't mean that crappy overhead test, either. Futuremark hasn't said a word about Vulkan support either. AotS, as I stated earlier, is the only proper DX12 benchmark on the market for now. They will probably charge everyone again for the DX12 version.


----------



## gupsterg

Well, got my hands on Radeon Pro Duo Master/Slave ROMs plus registers & i2cdump.



Spoiler: RPD VID per DPM



Code:

------[ ATI GPU #1 @ mem EFB00000 ]------

------[ ADL GPU Info ]------

Part Number  = 113-C88801MS-102
BIOS Version = 015.049.000.011
BIOS Date    = 2016/01/06 20:57
Memory Type  = HBM
GPU Clock    = 1000 MHz
Memory Clock = 400 MHz
VDDC         = 0 mV
DPM State    = 0
GPU Usage    = 100 %

------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  508 MHz, VID = 0.95000 V
DPM2: GPUClock =  717 MHz, VID = 0.95600 V
DPM3: GPUClock =  874 MHz, VID = 1.06800 V
DPM4: GPUClock =  911 MHz, VID = 1.10600 V
DPM5: GPUClock =  944 MHz, VID = 1.15000 V
DPM6: GPUClock =  974 MHz, VID = 1.18700 V
DPM7: GPUClock = 1000 MHz, VID = 1.22500 V

------[ ATI GPU #2 @ mem EFA00000 ]------

------[ ADL GPU Info ]------

Part Number  = 113-C88801SL-102
BIOS Version = 015.049.000.011
BIOS Date    = 2016/01/06 21:38
Memory Type  = HBM
GPU Clock    = 1000 MHz
Memory Clock = 400 MHz
VDDC         = 0 mV
DPM State    = 0
GPU Usage    = 100 %

------[ ADL PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  508 MHz, VID = 0.94300 V
DPM2: GPUClock =  717 MHz, VID = 0.96200 V
DPM3: GPUClock =  874 MHz, VID = 1.09300 V
DPM4: GPUClock =  911 MHz, VID = 1.13100 V
DPM5: GPUClock =  944 MHz, VID = 1.17500 V
DPM6: GPUClock =  974 MHz, VID = 1.21200 V
DPM7: GPUClock = 1000 MHz, VID = 1.25000 V





GPU 1 seems average leakage to me and GPU 2 low leakage. It seems to me they are not binning GPUs for low leakage on the RPD as tightly as I expected; otherwise both would be 1.250V at DPM 7, I reckon.

As stated before, I've had a Fury @ 1.243V and a Fury X @ 1.250V, plus several members' dumps of Fury/X/Nano have IMO shown lower leakage than GPU 1 on this RPD sample.
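If anyone wants to diff the two DPM tables above without squinting, here's a minimal sketch (VIDs copied from the dump; per the discussion, the higher-VID part is the lower-leakage bin):

```python
# Per-DPM VID (volts) from the RPD dump above: (clock MHz, GPU1 VID, GPU2 VID)
dpm_table = [
    (300, 0.900, 0.900),
    (508, 0.950, 0.943),
    (717, 0.956, 0.962),
    (874, 1.068, 1.093),
    (911, 1.106, 1.131),
    (944, 1.150, 1.175),
    (974, 1.187, 1.212),
    (1000, 1.225, 1.250),
]

# Delta = GPU2 VID minus GPU1 VID, in mV; positive means GPU2 is fused
# with the higher VID at that state (the lower-leakage bin, per the post above).
for clock, vid1, vid2 in dpm_table:
    delta_mv = round((vid2 - vid1) * 1000)
    print(f"{clock:4d} MHz: GPU1 {vid1:.3f} V, GPU2 {vid2:.3f} V, delta {delta_mv:+d} mV")
```

At DPM 7 the gap is +25 mV, which is what the binning comment above is pointing at.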



Spoiler: PowerLimit compare Fury / Fury X / Nano / Radeon Pro Duo







@Elmy I don't need the ROM from your card anymore, but would be interested in an AIDA64 registers dump, cheers.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Well, got my hands on Radeon Pro Duo Master/Slave ROMs plus registers & i2cdump.
> 
> *[DPM tables snipped]*
> 
> GPU 1 seems average leakage to me and GPU 2 low leakage. It seems to me they are not binning GPUs for low leakage on the RPD as tightly as I expected; otherwise both would be 1.250V at DPM 7, I reckon.
> 
> As stated before, I've had a Fury @ 1.243V and a Fury X @ 1.250V, plus several members' dumps of Fury/X/Nano have IMO shown lower leakage than GPU 1 on this RPD sample.
> 
> @Elmy I don't need the ROM from your card anymore, but would be interested in an AIDA64 registers dump, cheers.


If the GPUs are not matched then they will have to be volted and overclocked individually; I'm not sure if that is possible on a dual-GPU card.
My old HD 7950s were not matched and had to be volted at different levels due to ASIC quality; they did not play well together when synchronised.


----------



## gupsterg

I've never had a dual-GPU card, but yes, IIRC the 295X2 ROM modders do set things per GPU. I've seen this same variance in ASIC quality / LeakageID per GPU on the 295X2.

Just added the PL compare of the ROMs to post 8355. I can see why @Elmy's card was throttling when OC'd, given the A / W usage on my Fury X with an OC plus the PL raised so it doesn't throttle.


----------



## shadowxaero

Quote:


> Originally Posted by *toncij*
> 
> Well, AotS is bought for benchmarks. Play it or not, it doesn't matter.
> 
> Maybe *shadowxaero* can post his.


Lol, I bought AotS for benchmarks, and sure, I will run those settings and post scores.

Right this minute I am testing my card for stability, as I have it at 1180/570. This thing can eat some power though; depending on the accuracy of HWiNFO, my card is consuming a lot of wattage right now.


Edit:
AotS Extreme 1440p


Spoiler: Warning: Spoiler!




http://www.ashesofthesingularity.com/metaverse#/personas/8ce6b48c-1393-4315-b6b9-ef303709d80d/match-details/0ea49dea-d6f0-4f38-b74f-9b66cc0b04f6



AotS Crazy 1440p


Spoiler: Warning: Spoiler!




http://www.ashesofthesingularity.com/metaverse#/personas/8ce6b48c-1393-4315-b6b9-ef303709d80d/match-details/0c5acf44-5a9e-44ee-9809-56732f87e4ca



AotS Crazy 2160p


Spoiler: Warning: Spoiler!




http://www.ashesofthesingularity.com/metaverse#/personas/8ce6b48c-1393-4315-b6b9-ef303709d80d/match-details/ee6bc5ad-70ce-45cc-bff2-f562a6990314


----------



## Gdourado

I have a couple of questions.
How much faster is the Fury X than an XFX Fury Triple Dissipation?
How overclockable are the Fury X and the air-cooled Fury in general?

Thanks!


----------



## toncij

Thanks for your benchmarks! Do you have stored results for your Fury X at stock? I found your profile live; I presume the earlier benchmarks are at or close to stock.


----------



## Blotto80

I just traded up from a Fury Tri-X that could hit 1125/525 to a Fury X that can do 1160/550. It's faster, though not by much (maybe 5-7% tops), but it runs considerably cooler and is quieter once you factor in the custom fan curve I ran on the Fury. Here are some Firestrike Extreme scores:

Fury Stock = 6833
Fury X Stock = 7364
Fury Max OC = 7467 (1125/520)
Fury X Max OC = 7934 (1160/550)


----------



## Alastair

Quote:


> Originally Posted by *Gdourado*
> 
> I have a couple of questions.
> How much faster is the fury X against a XFX Fury triple dissipation?
> How overclockable is the fury X and the aircooled fury in general?
> 
> Thanks!


Fury vs. Fury X is around 5%-7% once clocks are the same.
Fury vs. Fury X overclocking seems about the same.


----------



## shadowxaero

Quote:


> Originally Posted by *toncij*
> 
> Thanks for your benchmarks! Do you have stored results for stock FuryX of yours? Found your profile live. I presume earlier benchmarks are closer to stock or stock.


Lol, it is a Fury Tri-X, but I can run some stock benchmarks; I don't think I have benched it stock.
Quote:


> Originally Posted by *Alastair*
> 
> Fury vs. Fury X is around 5%-7% once clocks are the same.
> Fury vs. Fury X overclocking seems about the same.


From what I have seen, even when clocks are the same, the extra shaders the Fury X has usually put it over the Fury, resulting in something like a 2-4 FPS difference.

For example, Blotto80 scored 7934 at 1160/550 in FS Extreme.

For me to score 8100, it took 1225 on the core.


----------



## Gdourado

Has anyone painted a Fury X to fit a white-themed build?


----------



## SuperZan

Quote:


> Originally Posted by *Gdourado*
> 
> Anyone painted a Fury X to fit a white themed build?


I -think- @Thoth420 has one, IIRC.


----------



## Thoth420

Quote:


> Originally Posted by *Gdourado*
> 
> Anyone painted a Fury X to fit a white themed build?


Just the EK backplate, and sadly the whole thing may have to go back for return...

Nothing to do with the backplate though, so paint away! The shop that did my cooling used automotive paint, which is glossy; I would have preferred a matte.


----------



## Thoth420

*Double Post*

Mark for delete


----------



## Blotto80

Here's my Ashes score on Fury X at 1130/530. I'm going to ramp the clocks back up for some testing.


http://www.ashesofthesingularity.com/metaverse#/personas/b90be4b6-9278-44bb-a81f-35ac5889c667/match-details/01421fe1-ca7e-494a-9ecc-0732d62f2921

New run at 1160/550, That GTX1080 score isn't looking so hot.


http://www.ashesofthesingularity.com/metaverse#/personas/b90be4b6-9278-44bb-a81f-35ac5889c667/match-details/8b49cda6-d479-4a6a-87ea-ecccad301c38


----------



## Kana-Maru

Quote:


> Originally Posted by *Blotto80*
> 
> Here's my Ashes score on Fury X at 1130/530. I'm going to ramp the clocks back up for some testing.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.ashesofthesingularity.com/metaverse#/personas/b90be4b6-9278-44bb-a81f-35ac5889c667/match-details/01421fe1-ca7e-494a-9ecc-0732d62f2921
> 
> New run at 1160/550, That GTX1080 score isn't looking so hot.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.ashesofthesingularity.com/metaverse#/personas/b90be4b6-9278-44bb-a81f-35ac5889c667/match-details/8b49cda6-d479-4a6a-87ea-ecccad301c38


Thanks for the results. This further proves my claim that Nvidia has no hardware architecture answer for DX12/Vulkan; their answer is to clock the heck out of their reference cards, but that defeats the purpose. Nvidia PR employees are still uploading GTX 1080 benchmarks, and DX12 isn't looking so hot. I'm still waiting on benchmarks, and we are getting "some" benchmark leaks. I guess Nvidia's async drivers are still on the way as well.


----------



## rv8000

Finally got around to benching on the unlocked BIOS. A bit sad that I lost about 20MHz on the core, but I destroyed my old GPU score by over 700 points; breaking 17k was easier than expected. 1140/550 +66mV (any higher voltage would cause negative scaling).

http://www.3dmark.com/3dm/11981251


----------



## Gdourado

Found this on the web:



More pictures:


http://imgur.com/qFLkL


Seems good.
Might be a good approach to Plasti Dip the card.


----------



## Gdourado

There is also the Asus Nano that matches the X99 Deluxe.


----------



## gupsterg

@Gdourado

Fury X looks sweet; IIRC Asus sell a white edition Nano.

@shadowxaero

Comparing the overall score takes the CPU into account; the graphics score is the one to use, plus GT1/GT2 FPS. You'll note when we compared our benches in the BIOS mod thread that the Physics score is higher on your system vs mine, due to i7 4790K vs i5 4690K. Even though the combined test FPS uses the CPU, you'll note we scored closely.

Have a view of this thread:-

https://rog.asus.com/forum/showthread.php?34363-GUIDE-3DMark-Score-Calculation-how-to-calculated-your-3DMark-Score
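For reference, the overall score described in the linked guide is a weighted harmonic mean of the sub-scores; here's a minimal sketch, assuming the commonly quoted Fire Strike weights (0.75 graphics / 0.15 physics / 0.10 combined; check the guide for the exact values):

```python
# Weighted harmonic mean, the form 3DMark uses for its overall score.
def overall_score(graphics, physics, combined,
                  w_g=0.75, w_p=0.15, w_c=0.10):
    # Assumed Fire Strike weights; see the linked ROG guide for the real ones.
    return (w_g + w_p + w_c) / (w_g / graphics + w_p / physics + w_c / combined)

# Graphics carries most of the weight, which is why a stronger CPU (Physics)
# moves the overall score far less than the graphics score does.
print(round(overall_score(20000, 12000, 8000)))  # 16000
```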

@rv8000

Nice result; the Furys that unlock to 3840SP are so good.

Any chance of a registers dump to see what the VID is per DPM for your card? Stock ROM / settings, using AIDA64.

My current voltage-increased OC is 1135/535 @ 1.243V VID, or as others would say, +31mV. I'm now planning on using my +0mV OC for daily use as it benches so close.


----------



## shadowxaero

Quote:


> Originally Posted by *gupsterg*
> 
> @Gdourado
> 
> Fury X looks sweet; IIRC Asus sell a white edition Nano.
> 
> @shadowxaero
> 
> Comparing the overall score takes the CPU into account; the graphics score is the one to use, plus GT1/GT2 FPS. You'll note when we compared our benches in the BIOS mod thread that the Physics score is higher on your system vs mine, due to i7 4790K vs i5 4690K. Even though the combined test FPS uses the CPU, you'll note we scored closely.
> 
> Have a view of this thread:-
> 
> https://rog.asus.com/forum/showthread.php?34363-GUIDE-3DMark-Score-Calculation-how-to-calculated-your-3DMark-Score
> 
> @rv8000
> 
> Nice result; the Furys that unlock to 3840SP are so good.
> 
> Any chance of a registers dump to see what the VID is per DPM for your card? Stock ROM / settings, using AIDA64.
> 
> My current voltage-increased OC is 1135/535 @ 1.243V VID, or as others would say, +31mV. I'm now planning on using my +0mV OC for daily use as it benches so close.


Oh, you're right. I started lowering my voltage, and subsequently my core clock, and my GPU score started going up by 200 points or so.

Today I am trying to find the spot at which I start getting negative scaling.


----------



## Agent Smith1984

Thought you all might like to know that open box Fury cards are only $399 on newegg right now.... They have the Strix and the Tri-X


----------



## gupsterg

Quote:


> Originally Posted by *shadowxaero*
> 
> Oh, you're right. I started lowering my voltage, and subsequently my core clock, and my GPU score started going up by 200 points or so.
> 
> Today I am trying to find the spot at which I start getting negative scaling.


Cool, replied in the Fiji BIOS mod thread as your data was there.


----------



## xTesla1856

My second card still isn't back from RMA. It's been a month and a half now, and with every day I just wanna get rid of both cards. I know it's not their fault and I adore the Fiji chip, but this is really annoying. I contacted the retailer twice and they said they'll send me a new one once Sapphire sends them one. Meanwhile they have over 40 in stock...


----------



## Thoth420

That painted Fury X looks awesome! ...but does it pump whine? One of the main reasons I blocked the Fury X was to avoid that... it took me over a year to solve coil whine (it was mostly my house power causing it, and an AVR helped get rid of it on my 780, 780 Ti and 7970). I couldn't stand listening to the nasty pump sound that my first card had... that said, my 2nd Fury X, even blocked, still coil whines (very lightly), but only under insanely high FPS / low GPU usage (menu screens with uncapped FPS); however, FreeSync solves this!


----------



## SuperZan

Quote:


> Originally Posted by *xTesla1856*
> 
> My second card still isn't back from RMA. It's been a month and a half now, and with every day I just wanna get rid of both cards. I know it's not their fault and I adore the Fiji chip, but this is really annoying. I contacted the retailer twice and they said *they'll send me a new one once Sapphire sends them one. Meanwhile they have over 40 in stock...*


That is a bit absurd, I would feel the same way. Hopefully they get a new card to you sooner rather than later.


----------



## gupsterg

Quote:


> Originally Posted by *xTesla1856*
> 
> My second card still isn't back from RMA. It's been a month and half until now, and with every day I wanna just get rid of both cards. I know it's not their fault and I adore the Fiji chip, but this is really annoying. I contacted the retailer twice and they said they'll send me a new one once Sapphire sends them one. Meanwhile they have over 40 in stock....


Try posting on the Sapphire forum, this section. A few times I've seen the mod contact Sapphire on members' behalf and get a resolution direct from Sapphire when the etailer is dragging its feet. Also perhaps contact Sapphire's hardware rep on OCN, @VaporX.


----------



## bluezone

New driver folks.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.5.2.aspx

Hopefully this isn't true.

http://www.pcper.com/news/General-Tech/Say-it-aint-so-AMD#comments


----------



## Alwrath

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thought you all might like to know that open box Fury cards are only $399 on newegg right now.... They have the Strix and the Tri-X


Just picked up an open-box Sapphire Fury X for $495. Should get it by next week; I'm so stoked, coming from a 290. It will be perfect for Doom @ 2560x1600.


----------



## Kana-Maru

Quote:


> Originally Posted by *bluezone*
> 
> New driver folks.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.5.2.aspx
> 
> Hopefully this isn't true.
> 
> http://www.pcper.com/news/General-Tech/Say-it-aint-so-AMD#comments


Apparently it's not:
http://www.eteknix.com/amd-debunks-recent-negative-rumours/


----------



## 12Cores

Has anyone upgraded from 7970 CrossFire to a single Fury X card? If so, what has been your experience?


----------



## bluezone

Quote:


> Originally Posted by *Kana-Maru*
> 
> It's not apparently:
> http://www.eteknix.com/amd-debunks-recent-negative-rumours/


I thought that smelled a little fishy.

REP+1
Quote:


> Originally Posted by *12Cores*
> 
> Has anyone upgraded from 7970 crossfire to a single Fury X card if so what has been your experience?


I had 7950s in CrossFire and am now using a single R9 Nano. Each setup has its strengths: the 7950s were better at 1080p and below, the R9 Nano at resolutions above that. A single card is smoother than two. Benchmarks were better on the 7950s (1080p and below), but gaming feels better on the Nano. I sold off the 7950s, so I cannot make any other comparisons beyond these.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bluezone*
> 
> I thought that smelled a little fishy.
> 
> REP+1
> I had 7950 crossfire and I am using a single R9 Nano. Each setup has its strengths. That said its the 7950's were better at 1080p and below, R9 Nano @ resolutions above that. Single card is smoother than 2. Benchmarks were better on 7950's (1080p and below), but gaming feels better on the Nano. I sold off the 7950's so I cannot make any other comparisons other than these.


I've run 7970s in CrossFire and yes, they were excellent at 1080p (I've also tested 290s and 390s in CrossFire). To be honest, 1440p was a dead heat with my Fury, but at 4K the single Fury did better....

However, I would never take the single Fury over two 290s... With the Fury at $500-650 (for the X) versus used 290s (around $200) and 390s ($300 new), you may want to consider that.

My 390s were 8GB too, so no VRAM limitations at 4K, where they really REALLY shined!

Mind you, I say this as someone who sold everything off and is waiting to try out a GTX 1070...


----------



## Gdourado

Has anyone tested a single air-cooled Fury against a pair of 380Xs in CrossFire, at both 1080p and 1440p?
How do they compare?

Cheers!


----------



## Alastair

Quote:


> Originally Posted by *Gdourado*
> 
> Has anyone tested a single Fury Aircooled against a pair of 380X's in crossfire both at 1080p and 1440p?
> How do they compare?
> 
> Cheers!


Well, in hardware spec alone a Tonga XT chip is exactly half of a Fiji XT, so on paper a pair of Tonga XTs is equal to a Fiji XT. But since CrossFire scaling is not 100%, you will probably find a pair of Tonga XTs is a bit slower than a single Fury.
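The reasoning above can be sketched as quick arithmetic. The 85% CrossFire-scaling figure below is purely an illustrative assumption, not a measured number:

```python
# Back-of-the-envelope: two half-Fiji (Tonga XT) cards vs one full Fiji.
tonga_xt = 0.5      # one 380X is half a Fiji XT in raw spec ("on paper")
cf_scaling = 0.85   # assumed CrossFire scaling efficiency (illustrative)

# First card runs at full speed; the second only adds cf_scaling of a card.
pair = tonga_xt * (1 + cf_scaling)
print(pair)  # 0.925 -> a bit slower than a single Fury at 1.0
```

With anything under 100% scaling the pair lands below 1.0, which is the point being made.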


----------



## xzamples

Hey guys, I'm selling a brand new, factory-sealed Sapphire R9 Nano.

http://www.overclock.net/t/1594178/sapphire-r9-nano-pny-gtx-960-xlr8

maybe somebody here would be interested - cheers!


----------



## Agent Smith1984

What are you guys who are overclocked getting for an FS Ultra score on your Fury cards???

Check this out...
http://www.overclock3d.net/articles/gpu_displays/gtx_1080_3dmark_performance_leaked/1


----------



## SuperZan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are you guys who are overclocked getting for an FS Ultra score on your Fury cards???
> 
> Check this out...
> http://www.overclock3d.net/articles/gpu_displays/gtx_1080_3dmark_performance_leaked/1


Best I ever managed on a Fury X was something like 4182; I'd have to dig up the score. I know a couple of forum members are in the Top 10 on HWBOT, with the highest being 4935, and Sgt. Bilko has one at 4506. If those 1080 numbers prove accurate, it's a good thing for the future of 4K to have a mid-range die capable of that kind of performance.


----------



## flopper

It's understandable that AMD chose to pitch the Duo card at VR, with a 2.4GHz watercooled 1080 offering the same ballpark numbers from a single card.


----------



## Agent Smith1984

Looks like Vega is coming sooner than planned also!!

http://videocardz.com/59808/amd-vega-gpu-allegedly-pushed-forward-to-october


----------



## Agent Smith1984

Quote:


> Originally Posted by *flopper*
> 
> its understandable that amd choose to put the duo card for VR with 2.4ghz watercooled 1080 offering the same ballpark numbers with a single card.


Yeah, but the Pro Duo is going to have to be in the $1000 range now to really compete.....


----------



## DedEmbryonicCe1

http://www.3dmark.com/fs/8107377
This is my best FS Ultra run. (4177)


----------



## gupsterg

I reckon you're getting negative performance scaling; compare my OC vs yours.

http://www.3dmark.com/compare/fs/8107377/fs/8415667


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> I reckon your getting negative performance scaling, view my OC vs yours.
> 
> http://www.3dmark.com/compare/fs/8107377/fs/8415667


Definitely looks that way. I noticed this a lot with my Fury.... if it ain't happy, it will not score well. I could run up near 1075/590 on stock voltage with my Fury, but it scored best when run at around 1060/560.... Had I been able to control voltage when I had mine, I could have gone further, but I see that even with additional voltage they still have a "happy spot" just a little higher up the clock spectrum.
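That "happy spot" can be found empirically by sweeping clocks and keeping whichever run scores best. A minimal sketch with made-up (clock, score) pairs, not real measurements:

```python
# Hypothetical (core clock MHz, FS Ultra score) pairs from repeated runs
# at fixed voltage. With negative scaling, the best score is NOT at the
# highest stable clock.
runs = [
    (1050, 4120),
    (1060, 4170),  # the sweet spot in this made-up data
    (1075, 4090),  # higher clock, lower score: negative scaling
]

best_clock, best_score = max(runs, key=lambda r: r[1])
print(best_clock, best_score)  # 1060 4170
```

Re-running each point two or three times helps separate real negative scaling from normal run-to-run noise.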


----------



## shadowxaero

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are you guys who are overclocked getting for an FS Ultra score on your Fury cards???
> 
> Check this out...
> http://www.overclock3d.net/articles/gpu_displays/gtx_1080_3dmark_performance_leaked/1


My highest score is 4457 (graphics score of 4480) on my Fury Tri-X with 3840 shaders unlocked, but I think I may be getting negative performance scaling.
http://www.3dmark.com/fs/7614719

This is a result from yesterday.
http://www.3dmark.com/fs/7614719

I will run a new one today, as I seem to have found where my card starts scaling negatively with voltage.


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are you guys who are overclocked getting for an FS Ultra score on your Fury cards???
> 
> Check this out...
> http://www.overclock3d.net/articles/gpu_displays/gtx_1080_3dmark_performance_leaked/1


I have some old synthetic benchmarks I ran last year. I don't really care for synthetic benchmarks, but I will once a DX12 synthetic benchmark drops.

*Updated*
*GTX 1080 @ 1886MHz vs my Fury X 1125Mhz\1170Mhz*
Graphics Scores:

*3DMark11 Performance smh 720p*
GTX 1080 @ 1886MHz = 27683
Fury X @ 1125Mhz = 23912
-67.6% Core clock difference = 15.8% difference

*3DMark11 Extreme smh 1080*
GTX 1080 @ 1886MHz = 9338
Fury X @ 1125Mhz = 7997
-67.6% Core clock difference = 16.8% difference

*FireStirke Performance smh 1080p*
GTX 1080 @ 1886MHz = 21828
Fury X @ 1170Mhz = 18860
-61.2% Core clock difference = 15.7% difference

*FireStirke Extreme*
GTX 1080 @ 1886MHz = 10367
Fury X @ 1170Mhz = 9173
-61.2% Core clock difference = 13% difference

*FireStirke Ultra*
GTX 1080 @ 1886MHz = 4998
Fury X @ 1170Mhz = 4581
-61.2% Core clock difference = 8.3% difference
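For anyone who wants to re-run the arithmetic, the quoted graphics scores can be checked with a few lines of Python. The scores and clocks are the ones cited in this post; note that the percentage convention in the list above is not always the same (some entries use the Fury score as the base, the Ultra entry uses the GTX score):

```python
# Graphics scores quoted above: (GTX 1080, Fury X).
pairs = {
    "3DMark11 Performance": (27683, 23912),
    "3DMark11 Extreme":     (9338, 7997),
    "FireStrike Performance": (21828, 18860),
    "FireStrike Extreme":   (10367, 9173),
    "FireStrike Ultra":     (4998, 4581),
}

# Clock lead of the 1080 over the Fury X FireStrike runs.
print(f"clock lead: {(1886 - 1170) / 1170 * 100:.1f}%")  # 61.2%

for name, (gtx, fury) in pairs.items():
    lead = (gtx - fury) / fury * 100  # 1080's lead, Fury score as base
    print(f"{name}: 1080 ahead by {lead:.1f}%")
```

The takeaway is the same as the post's: the score gap is far smaller than the clock gap.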

Even for DX11 performance I wouldn't go out and spend another $600-700 +tax.

Where are the DX12 leaks? Oh wait here is one:

*GTX 1080 Ashes of the Singularity benchmarks*
http://www.overclock3d.net/articles/gpu_displays/gtx_1080_ashes_of_the_singularity_benchmarks/1

The article claims a 10-12% fps average. What the article left out is that a different CPU and a different version of the game are being compared. Even then, the GTX 1080 results aren't "exciting" or enticing me to go out and purchase it.

Once you take all the data into account, you'll see that the GTX 1080 @ 1733MHz [? could be clocked higher ?] is only 5.66% faster than the Fury [X\Nano ?]. That's with the CPU difference. A user here showed that the highly clocked GTX 1080 was less than 3% ahead, but even then that's a stretch, and the Fury Nano might come out ahead.

Nvidia is living it up with DX11, but let's see how DX12 works out for them. I'm sure no matter the outcome people will be purchasing the heck out of the GTX 1080 regardless of stats and facts.

*Edit:*
Nvidia has no answer for DX12 I believe. Raw power seems to be their way out. Will the "hype" train prevail?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> I have some old synthetic benchmarks I ran last year. I don't really care for synthetic benchmarks, but I will once DX12 synthetic benchmarks drops.
> 
> *GTX 1080 @ 2114MHz vs my Fury X 1125Mhz\1170Mhz*
> Graphics Scores:
> 
> *3DMark11 Performance smh 720p*
> GTX 1080 @ 2114MHz = 26456
> Fury X @ 1125Mhz = 23912
> -88% Core clock difference = 10.63% difference
> 
> *3DMark11 Extreme smh 1080*
> GTX 1080 @ 2114MHz = 9338
> Fury X @ 1125Mhz = 7997
> -88% Core clock difference = 16.8% difference
> 
> *FireStirke Performance smh 1080p*
> GTX 1080 @ 2114MHz = 21828
> Fury X @ 1170Mhz = 18860
> -80.6% Core clock difference = 15.7% difference
> 
> *FireStirke Extreme*
> GTX 1080 @ 2114MHz = 10367
> Fury X @ 1170Mhz = 9173
> -80.6% Core clock difference = 13% difference
> 
> *FireStirke Ultra*
> GTX 1080 @ 2114MHz = 4998
> Fury X @ 1170Mhz = 4581
> -80.6% Core clock difference = 8.3% difference
> 
> Even for DX11 performance I wouldn't go out and spend another $600-700 +tax.
> 
> Where are the DX12 leaks? Oh wait here is one:
> 
> *GTX 1080 Ashes of the Singularity benchmarks*
> http://www.overclock3d.net/articles/gpu_displays/gtx_1080_ashes_of_the_singularity_benchmarks/1
> 
> The article claims 10-12% fps average. What the article left out is the fact that CPU difference and difference version of the game are being compared. Even then the GTX 1080 results aren't "exciting" or enticing me to go out and purchase it.
> 
> Once you take all the data into accordingly you'll see that the GTX 1080 @ 1733MHz [? could be higher ?] is only 5.66% faster than than the Fury [X\Nano ?]. That's with the CPU difference. A user here showed that the highly clocked GTX 1080 was less than 3%, but even then that's a stretch and the Fury Nano might come out ahead.
> 
> Nvidia is living it up with DX11, but let's see how DX12 works out for them. I'm sure no matter the outcome people will be purchasing the heck out of the GTX 1080 regardless of stats and facts.
> 
> *Edit:*
> Nvidia has no answer for DX12 I believe. Raw power seems to be their way out. Will the "hype" train prevail?


Where are you getting 2114MHz from, though? That article states the boost clocks were between 1860 and 1886MHz for those benches.... Not to disregard your point, though; I am also very curious to see real-world DX12 comparisons.


----------



## Kana-Maru

I'm updating everything now.

Ok, I updated the chart with the 1886MHz core clock and the percentage difference for the core clock. Another chart was just released for the GTX 1080 @ 2114MHz, and that's what I'm finishing up.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> I'm updating everything now.
> 
> Ok I updated the chart with the 1886MHz Core clock and the percentage difference for the core clock. Another chart was just released for the GTX 1080 @ 2114MHz and that's what I'm finishing up.


Just found it, wow, that thing is a monster at 2114 dude..... seriously!

And this is before voltage control and water.... my god man, we could see some 2300MHz+ cards floating around, and that's before KP gets his cold hands on one. I'm pro AMD, don't get me wrong, but you gotta admit that NVIDIA has really blown everyone away with this thing if this information is accurate, and it looks pretty legit....


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just found it, wow, that thing is a monster at 2114 dude..... seriously!


Yeah, that OC is super high! However, it's still the same old synthetic benchmarks in DX11. We all know Nvidia's drivers have low overhead for DX11; what happens when you add parallel queues and concurrency into the mix? I think the future [Vulkan\DX12] will decide that. All of this hype is great for Nvidia fans. 2.1GHz on the core is nothing short of amazing, heck, 1.8GHz is already a ton, but when the difference is less than 10% in some cases, that takes away from the wow factor.

This is the main reason I'm not too excited about the high overclocks without efficient hardware to back them up outside of DX11. Nvidia's answer appears to be brute force, and they know people will go crazy over the high clocks.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah that OC is super high! However, it's still the same old synthetic benchmarks in DX11. We all know Nvidia has drivers have low overhead for DX11. What happens when you add parallel queues and concurrency into the mix. I think the future [Vulkan\DX12] will decide that. All of this hype is great for Nvidia fans. 2.1Ghz on the core is nothing short of amazing, heck 1.8Ghz is already a ton, but when the different is less than 10% in some cases that takes away from the wow factor.
> 
> This is main reason I'm not to excited about the high overclocks without efficient hardware to back it up outside of DX11. Nvidia answer appears to be brute force and they know people will go crazy over the high clocks.


Yes, and I too am a sucker for Mhz, though I fully understand the need for efficiency and IPC performance, etc.....

AMD disappointed me with Fiji's clock ceiling, so hopefully Polaris and Vega give us something we can really crank the clocks up on....


----------



## shadowxaero

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just found it, wow, that thing is a monster at 2114 dude..... seriously!
> 
> And this is before voltage control and water.... my god man, we could see some 2300MHz+ cards floating around, and that's before KP gets his cold hands on one. I'm pro AMD, don't get me wrong, but you gotta admit that NVIDIA has really blown everyone away with this thing if this information is accurate, and it looks pretty legit....


I am still not too sure. I expect the 1080 to be faster in DX11, but aren't most of our new AAA releases moving to DX12? Again, if we look at the AotS benchmarks, the 1080 (presumably at stock clocks) is averaging 50FPS versus my Fury Tri-X that gets 48.1, and I can only imagine a full Fiji chip would actually be faster.

So we have the 1080 at 2.1GHz beating @Kana-Maru's OCed Fury X by 8% at 4K. That could very well turn into 1 or 2 percent, or it may end up losing outright to an OC Fury X in DX12.


----------



## Kana-Maru

Quote:


> Originally Posted by *shadowxaero*
> 
> I am still not to sure, I expect the 1080 to best faster in Dx11 but are most of out new AAA releases moving to Dx12? Again if we look at the AotS benchmarks the 1080 (presumably as stock clocks) is averaging 50FPS versus my Fury TriX that gets 48.1, and I can only imagine a full Fiji chip would actually be faster.
> 
> So we have the 1080 at 2.1Ghz beating out @Kana-Maru Fury X OCed buy 8% at 4k. That could very well turn into 1 or 2 percent or it may end up losing outright to an OC Fury X in Dx12.


Actually it was 1860MHz\1886MHz on the GTX 1080, still a crazy-high core clock nonetheless. You could be right, because in DX12 with Hitman my 4K score increased. Actually, just about everything increased a little, but the 4K increase was really nice.

*Apples to Apples - Fury X @ Stock Settings*
Day 1 - DX12 - 4.6Ghz - 3840x2160 [4K] = 36.02fps Average
Patch 1.1.0 [Fix] - DX12 - 4.6Ghz 3840x2160 [4K] = 43.82fps Average *+22%*
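The +22% figure follows directly from the two quoted averages:

```python
# Average fps at 4K on a stock Fury X, from the numbers above.
day1, patched = 36.02, 43.82

uplift = (patched - day1) / day1 * 100
print(f"{uplift:.1f}%")  # ~21.7%, i.e. roughly +22%
```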

All Hitman benchmarks [so far] are here:
http://www.overclock-and-game.com/news/pc-gaming/42-hitman-directx-12-fury-x-benchmarks

If those kinds of DX12 updates and AMD driver updates keep coming, then DX12\Vulkan will definitely keep AMD GPUs near Nvidia's most expensive GPUs.


----------



## Thoth420

Quote:


> Originally Posted by *Kana-Maru*
> 
> Actually it was 1860Mhz\1866Mhz on the GTX 1080, still crazy high core clock nonetheless. You could be right because in DX12 and with Hitman my 4K score increased. Actually just about everything increased a little anyways, but the 4K increase was really nice.
> 
> *Apples to Apples - Fury X @ Stock Settings*
> Day 1 - DX12 - 4.6Ghz - 3840x2160 [4K] = 36.02fps Average
> Patch 1.1.0 [Fix] - DX12 - 4.6Ghz 3840x2160 [4K] = 43.82fps Average *+22%*
> 
> All Hitman benchmarks [so far] are here:
> http://www.overclock-and-game.com/news/pc-gaming/42-hitman-directx-12-fury-x-benchmarks
> 
> If those types of DX12 updates + AMD driver updates keeps releasing then DX12\Vulkan will definitely keep AMD GPUs near Nvidia's most expensive GPUs.


That is some nice performance for Hitman at 4K. How is the experience? Smooth or stuttery? Any strange graphics bugs? I have been playing it on the Xbox One for now, as my rig is being a problematic pest, and frankly the game feels like a beta (the AI doesn't react to running or creep-running or much of anything outside the training maps, indicating something is amiss). I figured, why take a half-working game and try to play it on a half-working PC? But I have been curious as to the performance a single Fury X would net.
Also, the graphics on the console range from really impressive to terrible... I wish they went for a better balance. Either way it looks like concentrated butt, with a mishmash of high- and low-res textures.


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> *Updated*
> *GTX 1080 @ 1886MHz vs my Fury X 1125Mhz\1170Mhz*
> Graphics Scores:
> 
> *3DMark11 Performance smh 720p*
> GTX 1080 @ 1886MHz = 27683
> Fury X @ 1125Mhz = 23912
> -67.6% Core clock difference = 10.63% difference
> 
> *3DMark11 Extreme smh 1080*
> GTX 1080 @ 1886MHz = 9338
> Fury X @ 1125Mhz = 7997
> -67.6% Core clock difference = 16.8% difference
> 
> *FireStirke Performance smh 1080p*
> GTX 1080 @ 1886MHz = 21828
> Fury X @ 1170Mhz = 18860
> -61.2% Core clock difference = 15.7% difference
> 
> *FireStirke Extreme*
> GTX 1080 @ 1886MHz = 10367
> Fury X @ 1170Mhz = 9173
> -61.2% Core clock difference = 13% difference
> 
> *FireStirke Ultra*
> GTX 1080 @ 1886MHz = 4998
> Fury X @ 1170Mhz = 4581
> -61.2% Core clock difference = 8.3% difference


Those results = Tess.Tweak?
Quote:


> Originally Posted by *Kana-Maru*
> 
> *Edit:*
> Nvidia has no answer for DX12 I believe. Raw power seems to be their way out. Will the "hype" train prevail?


From things I've been reading I'd agree.


----------



## Kana-Maru

Quote:


> Originally Posted by *Thoth420*
> 
> That is some nice performance for Hitman at 4K. How is the experience? Smooth or stuttery? Any strange graphics bugs? I have been playing it on the xbox one for now as my rig is being a problematic pest and frankly the game feels like a beta(AI doesn't react to running or creep running or much of anything outside the training maps indicating something is amiss). I figured why take a half working game and try and play it on a half working PC but I have been curious as to the performance a single Fury X would net.
> Also the graphics on the console range from really impressive to terrible...wish they went for a better balance. Either way it looks like concentrated butt with a mish mash of high and low res textures.


Ah man, it's so smooth. It's never stuttery and there's no micro-stutter, at least there hasn't been since the last time I played it. I don't have a capture card for 4K, and I'll lose around 5-7fps if I try recording myself [haven't tried VCE yet]. The best I can do is record off-screen with my cell phone or something, and that's a stretch. The only problem is when the game is saving or when you reach a checkpoint. I believe they are making sure you are a legit purchaser and not a pirate, because the FPS dip can be serious for a split second when the game is saving. Plus, the game is uploading your data to the server as well. It can really skew the FPS min results. This behavior isn't new to games and varies from title to title. Checkpoints are known to cause FPS drops with random results, and this is why I came up with my own solution in my FPS benchmarks [FPS Min Caliber™].
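The "FPS Min Caliber" method itself isn't published, but the general idea of keeping a one-frame save/checkpoint hitch from dominating a minimum-FPS number can be sketched by reporting a low percentile instead of the absolute minimum. The frame-time numbers here are made up:

```python
# Frame times in ms; the single 90 ms frame stands in for a save/checkpoint hitch.
frame_times = [16.7] * 60 + [90.0] + [16.9] * 60

fps = sorted(1000.0 / t for t in frame_times)  # ascending: worst frame first

abs_min = fps[0]                       # dragged down to ~11 fps by one hitch
p1_low = fps[int(len(fps) * 0.01)]     # 1% low: ignores the isolated spike

print(f"{abs_min:.1f} vs {p1_low:.1f}")
```

The absolute minimum reports the hitch; the percentile figure reflects how the run actually felt.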

I haven't actually had a lot of the issues people are having on Steam. I had some small issues, like the music not playing after finishing a mission, which is minor. DX12 didn't work for about a week until they patched it. I caught the graphical settings not being saved after the first patch, but that was fixed in a week. I haven't played the game at all since the last small update for Elusive Targets. I have Episode 1 and Episode 2, but I'm still having fun with Episode 1, trying to complete all of the challenges and now taking out the targets on the hit list. The developers seem to have a lot going on with this game, but I'm sure things would be a lot easier if they focused on one API [DX12 instead of DX11] and didn't depend so much on those servers.

Quote:


> Originally Posted by *gupsterg*
> 
> Those results = Tess.Tweak?


Tessellation = AMD Optimized settings + a highly overclocked 12-logical-core CPU. I did have some issues where I had to ultimately run DDU, and that could be because I still had some old Nvidia files loading on my PC. There have been a lot of driver updates since my last synthetic run, and I probably need to run them again.

Quote:


> From things I've been reading I'd agree.


Let's see if Nvidia fans hold them accountable. Based on the things I've read some NV fans say about DX12 & Vulkan, I doubt it.


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> Tessellation = AMD Optimized settings + a highly overclock 12 logical core CPU.


Any chance of links to benches? It's a sweet setup you have.
Quote:


> Originally Posted by *Kana-Maru*
> 
> Let's see if Nvidia fans hold them accountable. Based on the things I've read some NV fans say about DX12 & Vulkan, I doubt it.


I'd agree. I'm just amazed at what the FE versions cost.


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> Any chance of links to benches? it's sweet setup you have.
> 
> I'd agree. I'm just amazed what price the FE versions cost.


No links, because I run the benches offline when overclocked and save the results. I could run it while connected to the net and let it upload; I purchased 3DMark on Steam some time ago. My main focus is usually 1440p and 4K in-game benchmarks over synthetic. I guess I never really cared about the online e-peen, even when I was running dual GTX 670s. Maybe I'll care more once Futuremark stops playing around and releases their DX12 benchmark [Time Spy].


----------



## gupsterg

I mainly use 3DM11/13 for:-

a) checking scaling with an OC, which sometimes isn't visible to the same degree in a real-world bench result.
b) artifact testing / getting some power-usage readings.
c) comparisons between systems.
d) consistency: I find other benches can at times have too much variance between runs at the same settings, whereas 3DM's run-to-run variance seems tighter.
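Point (d) can be made concrete by comparing the run-to-run coefficient of variation of repeated scores. The numbers below are placeholders, not measured results:

```python
import statistics

def cv_percent(scores):
    """Coefficient of variation (stdev / mean) as a percentage."""
    return statistics.stdev(scores) / statistics.mean(scores) * 100

# Placeholder numbers: five repeated runs of each benchmark at identical settings.
firestrike = [18860, 18840, 18875, 18855, 18866]  # tight spread
game_bench = [18900, 18500, 19200, 18300, 18950]  # much looser spread

print(f"3DM CV:  {cv_percent(firestrike):.2f}%")
print(f"game CV: {cv_percent(game_bench):.2f}%")
```

A lower CV means a small OC gain is less likely to drown in run-to-run noise, which is exactly why a tight benchmark is useful for scaling tests.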

When doing Hawaii OC'ing / BIOS modding I did a lot of FRAPS tests of the games I have; for Fiji I haven't yet got around to it due to time limitations.

As you state, your benches in 3DM are AMD Optimized (i.e. driver default = Tess. On); those are some fantastic results.

For example:-
Quote:


> *3DMark11 Performance smh 720p*
> GTX 1080 @ 1886MHz = 27683
> Fury X @ 1125Mhz = 23912


I get 20938 with PE = Off only (PE = On is same TBH) *but* tessellation is AMD Optimized in driver (ie default setting). Then with tessellation off = 26527.

Looking at the 3000-point loss in 3DM11 with the AMD Optimized setting, I'm wondering whether my i5 4690K is holding back the Fury X.

I may have some 1170MHz benches to compare with your 3DM, will have to check.


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> I've mainly use 3DM11/13 for:-
> 
> a) check scaling with OC, which sometimes isn't seeable in a real world bench result to the same degree.
> b) artifact testing / getting aspects of power usage results.
> c) comparative between systems.
> d) I find other benches can have at times too much of a variance on results between runs for same settings, where as 3DM seems tighter run to run variance.
> 
> When doing Hawaii OC'ing / bios mod I did do a lot of FRAPS tests of games I have, Fiji I haven't yet got around to due to time limitation.
> 
> As you state your benches in 3DM are AMD Optimized (ie driver default = Tess.On) those are some fantastic results.
> 
> For example:-
> I get 20938 with PE = Off only (PE = On is same TBH) *but* tessellation is AMD Optimized in driver (ie default setting). Then with tessellation off = 26527.
> 
> Looking at the 3000 point loss in 3DM11 with AMD Optimized setting I'm thinking is my i5 4690K holding back the Fury X?
> 
> I may have some 1170MHz benches to compare with your 3DM, will have to check.


Well, it is an i5, but I wouldn't know. I personally can't go lower than an i7, and now it's pretty much 6 cores or nothing.

I stopped using FRAPS many years ago; it's unreliable and crashes. I started programming my own benchmarking tools, and so far so good. FRAPS can still be used as a quick comparison, but it causes all kinds of random crashes when I'm trying to benchmark a game. I'll never go back to it for anything other than a quick test to make sure the frame times and frame rate are on point. Even then, I can get better results for comparisons through in-game benchmarks and synthetic benchmarks now anyway. I thought about using Nvidia FCAT, but that was lol-worthy years ago when I saw reviewers changing their results in favor of Nvidia instead of AMD based on Nvidia's FCAT results.

Your 26K graphics score is awesome. I wonder what I could get with no tessellation, but it might not matter because I'm running an older X58 platform. You have the best of both worlds: newer CPU architecture and a newer platform with all the bells and whistles. I only ran the synthetic benchmarks for completeness in my Fury X review last year. I really don't care for 3DMark 11 or FireStrike; however, it's still a big thing in the GPU world, and I use it for comparisons since the results are in nearly every review. I know Nvidia GPUs love the serial-like DX11 workflow, so of course they will lead there; DX12, not so much. I focus mostly on the games and graphical settings.

There's no need to compare scores unless you absolutely want to. I'm more focused on the GTX 1080 reviews' gameplay benchmarks, to see just how much performance has increased over a series of titles. I'm sure we are going to get more than enough Nvidia GameWorks titles. Now that I'm thinking about it, PCPer ran the AMD Radeon Pro Duo, and in the game-comparison category they ran all Nvidia TWIMTBP titles. You know they didn't turn off any of the NV tech, or at least they didn't say so. Every game was Nvidia-sponsored with Nvidia tech, and the reviewer complained about the frame times. Obviously NV tech is highly optimized for NV GPUs, smh. This is the main reason I run my own benchmarks and make the best decision when it's time to go high end.


----------



## gupsterg

My main uses are gaming / recreational benching / light office work, etc. As an i7 wasn't beating an i5 for gaming, I went for the i5. I was tempted to go i7 4790K with the deals about at crimbo '15, but then I deemed that HT wasn't worth it. If I went higher end I really wasn't getting "bang for $".

With savvy buying, the CPU/MOBO/RAM/HSF/PSU came to ~£360 at the time (Q1 15). I went Hawaii on the GPU front, then I got Fiji to do the BIOS mod. The plan was to dispose of Fiji after I did the BIOS mod testing, but then I couldn't go back.

I have lost count of how many times I've read, in discussions where fanboyism hasn't come into it, people stating that AMD GPUs age better and Nvidia's don't. That's not to say I haven't had Nvidia GPUs in the past. To me Hawaii has been a great GPU; Fiji may not be as overclockable, but it has some great aspects and also deserves a place in the "Hall of Fame" IMO. I have a mesh side panel, and due to the length of the Fury X I can now see an LED flash on the mobo for storage activity that I never knew existed, as my previous, lengthier cards obscured it. The size of the card is just crazy with HBM / the interposer.


----------



## Thoth420

Quote:


> Originally Posted by *Kana-Maru*
> 
> Ah man it's so smooth. It's never "stuttery" or there's no micro-stutter. At least it has been since the last time I've played it. I don't have a capture card for 4K and I'll lose around 5-7fps if I try recording myself [haven't tried VCE yet]. The best I can do is record off screen with my cell phone or something. That's a stretch though. The only problem is when the game is saving or when you reach a checkpoint. Basically I believe they are making sure you are a legit purchaser and not a pirate because the FPS dip can be serious for a split second when the game is saving. Plus the game is uploading your data the the server as well. It can really skew the FPS min results. This behavior isn't new to games and varies from title to title. Checkpoints are known to cause FPS drops with random results and this is why I've came up with my own solution in my FPS benchmarks [FPS Min Caliber™].
> 
> I haven't actually had a lot of issues people are having on Steam. I had some small issues like the music not playing after finishing a mission which is minor. DX12 didn't work for about a week until they patched it. I caught the graphical settings not being saved after the first patch, but it was fixed in a week. I haven't played the game at all since the last small update for Elusive Targets. I have Episode 1 and Episode 2, but I'm still having fun with Episode 1 by trying to complete all of the challenges and now taking out the targets on the hit list. The developers see, to have a lot going on with this game, but I'm sure things would be a lot easier if they focused on one API [DX12 instead of DX11] and depend so much on those servers.
> Tessellation = AMD Optimized settings + a highly overclock 12 logical core CPU. I did have some issues where I had to ultimately run DDU and that could be because I still had some old Nvidia files loading on my PC. There has been a lot of driver updates since my last synthetic run and I probably need to run them again.
> Let's see if Nvidia fans hold them accountable. Based on the things I've read some NV fans say about DX12 & Vulkan, I doubt it.


Cheers! I got that first elusive target in 19 minutes with a 5 star rating.


----------



## ibeat117

That sounds really awkward; I have a Fury X which draws 1.80V at 1050MHz and can do 1100MHz easily without adding voltage.


----------



## Givenchy

Flickering, black screens, and no-signal issues still occurring after 9 months with no fix? (Fury X)

1000 comments and nothing from AMD?
Honestly, what is this?

https://community.amd.com/thread/188642?start=990&tstart=0

I'm actually glad this thing is going to be sold soon.


----------



## dagget3450

Well, it's funny: after having my Fury Xs installed on 3 different platforms, I noticed a couple of things. First, I only ever get black screens when I am overclocked and unstable. Also, I remember people talking about a red screen as well. I was just able to replicate it on my EVGA SR-2 box when I was overclocking the CPU too far, whereas my Hawaiis seemed fine. So I think many people just have unstable systems or overclocks and assume that, since the previous GPU was fine, it must be the Fury's fault.

Even if you're not overclocked CPU-wise, it's possible your system isn't as stable as you think. I also see many threads of people having issues where the hardware was faulty before the GPU was installed. I am not doubting there are faulty video cards. But these Furys push your system harder, and after seeing this first-hand in my own testing, it just makes me wonder how many people are blaming the GPU because it's the only thing they changed, without accounting for other possibilities.


----------



## gupsterg

@dagget3450

The display issue highlighted in that thread is something down to the Fiji card / driver / firmware, and possibly a combo of OS / hardware. I've read the whole thread and no one has an answer; AMD stated they had the RMA'd cards back and couldn't replicate the issue.

I had it occur on 1 Fiji card out of the 4 I've had so far; the card was at stock settings / ROM. I was using a later'ish Crimson driver (v16 something).

It occurred once on my i5 rig just when doing light usage (i.e. office/web); then, when the same card was in my Q6600 rig, it did the same thing. The Q6600 is mainly used for [email protected] / displaying media on a plasma, connected via HDMI. The i5 was on my Eizo via DP. Both my rigs have been stress-tested for hundreds of hours, so it is definitely not an OC-instability scenario in my case, IMO.

I have never experienced this kind of screen corruption at stock GPU settings (light usage) on any card before.


----------



## Kana-Maru

Quote:


> Originally Posted by *Thoth420*
> 
> Cheers! I got that first elusive target in 19 minutes with a 5 star rating.


That's pretty good. I haven't even gotten around to it yet. I finally found the magician clothing and I still need to finish those challenges.

Quote:


> Originally Posted by *Givenchy*
> 
> Flickering, Black Screen, No Signal still occurring after 9 months with no fix? (Fury X)
> 
> 1000 comments and nothing from AMD?
> Honestly, what is this?
> 
> https://community.amd.com/thread/188642?start=990&tstart=0
> 
> I'm actually glad this thing is going to be sold soon.


There is a BIOS update for the Fury X, which you can download or have installed automatically, that might fix that problem. The BIOS was installed automatically by the Asus GPU Tweak II software for me. I would return that Fury X for a new one. I didn't read the entire topic, just the 1st page that described the issue. So AMD couldn't duplicate the problem; it could definitely be something up with his OS or registry.

The only time I had a black screen was when an overclock failed, but it recovered much quicker than my old Nvidia GTX 670 drivers did. I had the "No Signal" issue when Crysis 3 crashed on me; a simple reboot fixed that for me. I've never had flickering. It could be a faulty GPU or faulty hardware. I use dual monitors.


----------



## Givenchy

Quote:


> Originally Posted by *dagget3450*
> 
> Well, it's funny: after having my Fury Xs installed on 3 different platforms, I noticed a couple of things. First, I only ever get black screens when I am overclocked and unstable. I also remember people talking about a red screen. I was just able to replicate it on my EVGA SR-2 box when I was overclocking the CPU too far, whereas my Hawaiis seemed fine. So I think many people just have unstable systems or overclocks and assume that, since the previous GPU was fine, it must be the Fury's fault.
> 
> Even if you're not overclocked CPU-wise, it's possible your system isn't as stable as you think. I also see many threads where people having issues had faulty hardware before the GPU was installed. I am not doubting there are faulty video cards. These Furys push your system harder, and after seeing this first hand in my own testing, it makes me wonder how many people blame the GPU because it's the only thing they changed, without accounting for other possibilities.


It's been almost a year... plus other people have the same issue with FreeSync.

https://community.amd.com/thread/194556

I've changed my motherboard, got another Fury X from RMA, which resulted in the same thing (no surprise), and tried a few other things (RAM tests).
Everything is stock.

This occurs randomly when alt-tabbing, mostly after the second or third alt-tab, but it happens on the first alt-tab as well.


https://vid.me/P1W8
https://vid.me/oOqu

The only thing I can replace now is the PSU, which is a SeaSonic M12 Evo 850W Bronze edition.

What a joke.

Edit: I'm on the latest VBIOS from AMD.

People are also telling me that ClockBlocker should fix the issue, but it does not for me, especially in Source games (144Hz/fps).

Edit 2: if I have to return the Sapphire Fury X for the third time, I'm refunding it all together with the AOC FreeSync screen.

I wanted to note that my issue is not happening outside games, like the OP mentioned; mine is happening in games.

Throw every benchmark at it: no flickering, no crash.

Throw CS:GO at it, it flickers massively.
Throw TF2 at it, it flickers.

The thing I don't understand is: why is it not flickering in benchmarks? Heaven, 3DMark, no issues...

But then, when the fps goes all the way up to 144 frames (capped), I'm having issues?

I've been thinking about this for quite a while... but then again, when my fps is 150 in BF4, for example, there's no flickering. It happens in Source games; most other games don't have this issue.

And AOC does not reply back.







It's been 3 weeks.

"R9 Fury X Artifacting at above 60FPS"

https://community.amd.com/message/2705716

His video shows what I'm experiencing in most Source games, and some others.




Loads of video threads about this issue.


----------



## Thoth420

Quote:


> Originally Posted by *Kana-Maru*
> 
> That's pretty good. I haven't even gotten around to it yet. I finally found the magician clothing and I still need to finish those challenges.


For the vamp challenges, do most of them in the escalations, and start in the attic to make life easier; you can knock most down that way. The chandelier one (if I recall correctly) can be done in Story mode, but I won't spoil that. I'm around 85/100 in Paris and around 74/100 in Sapienza, and hope Morocco releases soon.


----------



## toncij

Quote:


> Originally Posted by *gupsterg*
> 
> I've mainly used 3DM11/13 for:-
> 
> a) checking scaling with an OC, which sometimes isn't visible to the same degree in a real-world bench result.
> b) artifact testing / getting aspects of power usage results.
> c) comparisons between systems.
> d) I find other benches can at times have too much variance in results between runs at the same settings, whereas 3DM seems to have tighter run-to-run variance.
> 
> When doing Hawaii OC'ing / BIOS mods I did a lot of FRAPS tests of the games I have; Fiji I haven't yet got around to due to time limitations.
> 
> As you state your benches in 3DM are AMD Optimized (ie driver default = Tess. On), those are some fantastic results.
> 
> For example:-
> I get 20938 with PE = Off only (PE = On is the same TBH) *but* tessellation is AMD Optimized in the driver (ie default setting). Then with tessellation off = 26527.
> 
> Looking at the ~3000-point loss in 3DM11 with the AMD Optimized setting, I'm wondering: is my i5 4690K holding back the Fury X?
> 
> I may have some 1170MHz benches to compare with your 3DM, will have to check.


1135MHz and you get so close to 1080?


----------



## flopper

Quote:


> Originally Posted by *toncij*
> 
> 1135MHz and you get so close to 1080?


the 1080 is a replacement for the 980.


----------



## toncij

Quote:


> Originally Posted by *flopper*
> 
> the 1080 is a replacement for the 980.


Yes, but it is a DX11 test, and the 1080 is also expected to be significantly faster than the Titan X, let alone the 980. The alleged 25% above the Titan X is a significant amount. The Fury X should, in theory, bite the dust there, DX11 not being its game.


----------



## gupsterg

Quote:


> Originally Posted by *toncij*
> 
> 1135MHz and you get so close to 1080?


Yep, seems that way.

I use Win 7 Pro SP1 UEFI with no OS tweaks; it's basically my everyday-use setup. I'm using no RAMDisk, nor do I have Samsung Turbo-Write / Rapid Mode on, which I've seen can impact RealBench; dunno about 3DM (wouldn't expect so, though). Anyhow, like I said before, I don't give GPUs a whole lotta VID increase. My initial testing had shown 1115/535 +0mV (1.212V VID) was good for 1hr each of Heaven / Valley / 3DM, but failed Folding@Home (ie I got 1-2 bad states on the GPU). So I set up 1110/530 +6mV (1.218V VID); this tested fine for 16hrs of Folding@Home, then I stopped the run.



Spoiler: 1110/530 16hrs run ~450000 PPD (with 1x CPU/GPU slots)







3DM11 Performance: driver defaults = 20712; Tess.Tweak+PE=Off = 26208
3DM13 FS: driver defaults = 17228; Tess.Tweak+PE=Off = 19210
3DM13 FS E: driver defaults = 8077; Tess.Tweak+PE=Off = 9233
3DM13 FS U: driver defaults = 4114; Tess.Tweak+PE=Off = 4615

I reckon 1110/530 @ 1.218V is more optimal on all fronts vs 1135/535 @ 1.243V. I recently tried lowering the VID at 1050/500 to see if I'd get a performance boost like Nano members do from undervolting. I saw none.
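As a rough sanity check on those numbers, the uplift from Tess.Tweak+PE=Off over driver defaults works out to between roughly 12% and 27% depending on the test. A quick Python sketch to do the arithmetic (scores copied from the results above; the names here are just illustrative):

```python
# Percentage gain of Tess.Tweak+PE=Off over driver-default scores,
# using the 3DMark results quoted above: (default, tweaked) per test.
scores = {
    "3DM11 Performance": (20712, 26208),
    "3DM13 FS": (17228, 19210),
    "3DM13 FS E": (8077, 9233),
    "3DM13 FS U": (4114, 4615),
}

for bench, (default, tweaked) in scores.items():
    gain = (tweaked - default) / default * 100
    print(f"{bench}: +{gain:.1f}%")
```

Unsurprisingly, the tessellation-heavy 3DM11 run benefits the most (roughly +26%), while the Fire Strike variants sit around +12-14%.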


----------



## Eliovp

Quote:


> Originally Posted by *gupsterg*
> 
> I recently did try lowering VID at 1050/500 to see if I got a performance boost like Nano members from undervolting. I saw none.


I also lowered memory clock to 300 (400 works as well) for stability and to fix throttling.


----------



## RatPatrol01

So my Nano waterblock shows up tomorrow; anyone have any tips for installation? I know I'm supposed to be really careful with the orange area at the base of the die.


----------



## bluezone

Quote:


> Originally Posted by *RatPatrol01*
> 
> So my Nano waterblock shows up tomorrow, anyone have any tips for installation? I know I'm supposed to be real careful with the orange area at the base of the die


Yes, be careful of the interposer, and use the best-quality TIM you can get; in the long run, cheap stuff doesn't cut it with GPUs.

I would suggest reading this Tom's Hardware post, or at least following its recommendations. No liquid metal TIM: you're working directly on the die.

http://www.tomshardware.com/reviews/thermal-paste-performance-benchmark,3616.html


----------



## Medusa666

Would the Radeon Pro Duo be good for future proofing the system?


----------



## dagget3450

Quote:


> Originally Posted by *Medusa666*
> 
> Would the Radeon Pro Duo be good for future proofing the system?


Given things like DOOM's 4GB VRAM cap for Nightmare settings, the main issues with a Pro Duo would be CF support and the 4GB of VRAM. Future-proofing is tough, considering things like Nvidia performance tanking on older-gen GPUs. Just my opinion; I don't think you can really future-proof right now, sadly.


----------



## 0x00000000

Quote:


> Originally Posted by *bluezone*
> 
> yes carful of the interposer and use the best quality TIM you can get. In the long run cheep stuff doesn't cut it with GPUS.
> 
> I would suggest reading this Tom's hardware post or at least follow the recommendations. No liquid Metal TIM your working directly on the die.
> 
> http://www.tomshardware.com/reviews/thermal-paste-performance-benchmark,3616.html.


Why does one have to be careful with the interposer? Are we supposed to cover it with thermal paste or not?
I have applied MX-4 under waterblocks on my GTX 980; is it any different now that there is the interposer?
This time, however, I will be using Gelid GC Extreme; will that work?


----------



## bluezone

Quote:


> Originally Posted by *0x00000000*
> 
> Why does one have to be careful with the interposer? Are we supposed to cover it with thermal paste or not?
> I have applied MX-4 under waterblocks on my GTX 980; is it any different now that there is the interposer?
> This time, however, I will be using Gelid GC Extreme; will that work?


You have to be very careful cleaning the interposer; it's very delicate. With most silicon hardware, the electrical components and traces face down: you are seeing the back side of the silicon, without any of the printed circuitry exposed. The interposer's circuitry is on both sides, AFAIK, so that it can make contact with the connections on the bottom of the HBM and the GPU die.

So do not use any hard or sharp cleaning tools to clear away the old TIM on the interposer. Harsh chemicals are probably not a good idea either; high-percentage alcohol is a good solvent to soften TIM. I would suggest applying alcohol liberally to the TIM, and afterward to yourself.

Thick applications of TIM act as a heat insulator, so you should probably avoid applying TIM to the interposer.

Gelid GC is my personal favorite TIM; I've had good luck with it on my Nano. When Nvidia switches to HBM2, these same suggestions should apply.


----------



## bluezone

New drivers.

http://support.amd.com/en-us/kb-articles/pages/amd-radeon-software-crimson-edition-16.5.2.1-release-notes.aspx

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## bluezone

Seems like a good driver.

3D11 Tess Graphics Score 20593: http://www.3dmark.com/3dm11/11253382

3D11 no-Tess Graphics Score 25766: http://www.3dmark.com/3dm11/11253377

As good as 16.3.2's 3D11 no-Tess Graphics Score of 25617: http://www.3dmark.com/3dm11/11116122


----------



## RatPatrol01

Waterblock installation went nice and smooth, haven't been able to push it past 42C so far, and it's a bit of a monster with a quick OC applied.

stock valley benchmark


stock firestrike extreme


valley with fury x clocks


full overclock valley


full overclock firestrike extreme


temps


----------



## bluezone

Very nice.

How does it compare to before the waterblock, and what sort of clocks are you using for your rough overclock?


----------



## RatPatrol01

Before the waterblock it would always run right up to 75C under load, or up to about 82C if I upped the power limit so it wouldn't power throttle.

current overclock is:

1100mhz core
525mhz memory
stock voltage
+50% power limit

Tried 1120MHz on the core, but that crashes the video driver as soon as I start a run in Valley. I could probably go higher on the memory clock, but I don't wanna mess with the HBM too much, and who knows how high I could get if I started fiddling with the voltage; might be something to play with over a weekend when I have time for long-term stability testing.


----------



## bluezone

Do me a favor and try,
1075 MHz core
550 memory
-24mv voltage offset
@ 50% power limit

then give a run on the valley bench


----------



## RatPatrol01

Quote:


> Originally Posted by *bluezone*
> 
> Do me a favor and try,
> 1075 MHz core
> 550 memory
> -24mv voltage offset
> @ 50% power limit
> 
> then give a run on the valley bench


Slightly less overall score, but higher max fps, temps were about the same


----------



## bluezone

Okay, my best guess is everything the same but 1120MHz and +24-36mV; I think that will get you stable. Try +24mV first. By the way, you beat my best score by about 250 points in Valley at a lower clock. Good job.


----------



## RatPatrol01

Yeah, I'll have to give that a go when I have time to stability test; should be a good time.

It's already clocking in with a higher graphics score in Firestrike Extreme than a stock 980 Ti.


----------



## Alastair

I'm managing 1100MHz at stock volts. By stock, I mean the stock voltage that the new Fury X BIOS from AMD applies. I know I can run 545 on the HBM no problem; I haven't tried going higher yet, 'cause I simply don't have the need to right now.

Something I seem to have figured out is that AMD has not made an updated "power" BIOS, only an updated standard BIOS for the Fury X.

What I mean is that the updated BIOS with the better UEFI support seems to be the 300W power limit / 75C temp limit BIOS.

I can't seem to find an updated 350W power limit / 85C temp limit "power" BIOS.


----------



## Alastair

@gupsterg
I'm downloading AIDA64 now for my BIOS dumps. For my stock BIOS, does it matter if it's the normal BIOS or the extended-power BIOS?


----------



## wesbluemarine

Quote:


> Originally Posted by *RatPatrol01*
> 
> yeah I'll have to give that a go when I have time to stability test, should be a good time
> 
> 
> 
> 
> 
> 
> 
> it's already clocking in with a higher graphics score in firestrike extreme than a stock 980ti


What waterblock do you have, EK or Aquacomputer?
Do you have any advice for cleaning the chip?
Did you have coil whine on your Nano before? And now?

I'm asking because I have a Nano too, and I've bought the stuff for making my first custom loop.

Thanks!


----------



## JunkaDK

Hey guys,

Just wanted to share a picture of my Asus R9 Fury STRIX painted white.

Wanna see more pics? Go here: https://pcpartpicker.com/b/ntPscf

Thanks.


----------



## Flamingo

I'm really thinking of jumping ship from the R9 Nano to the GTX 1080. It would be my first ever Nvidia card, lol.

First review out: http://www.pcpop.com/view/2/2763/2763166_all.shtml?r=17180620#p2


----------



## ManofGod1000

Quote:


> Originally Posted by *Flamingo*
> 
> I'm really thinking of jumping ship from the R9 Nano to the GTX 1080. It would be my first ever Nvidia card, lol.
> 
> First review out: http://www.pcpop.com/view/2/2763/2763166_all.shtml?r=17180620#p2


Let's wait for the real reviews. I looked at those Fire Strike scores, and they were both significantly lower than what I get with my EVGA 980 Ti FTW, and I do not overclock the card. Something tells me these cards are all being compared to bone-stock 1GHz 980 Ti cards, when my card runs at 1405MHz out of the box. Personally, I would just stick with the R9 Nano, and if you are not getting the performance you want, buy a second one for CrossFire.


----------



## RatPatrol01

Quote:


> Originally Posted by *wesbluemarine*
> 
> What waterblock do you have, EK or Aquacomputer?
> Do you have any advice for cleaning the chip?
> Did you have coil whine on your Nano before? And now?
> 
> I'm asking because I have a Nano too, and I've bought the stuff for making my first custom loop.
> 
> Thanks!


I have the EK Acetal + Nickel block. To clean the chip, I just used 99% isopropyl alcohol, Q-tips, and lint-free wipes, and went really slowly and gently. I couldn't get all the old TIM off the interposer, but I got rid of the bulk of it and got the chips themselves nice and clean.

My Nano has a little bit of whine, but it seems to only happen while I'm exiting a graphically intense program. The whine is pretty much the same under water as it was under air.


----------



## Agent Smith1984

So the 1080 has async compute... so much for AMD staying competitive in DX12... what an impressive card.

Can't wait to see what Vega brings...

Stock boost clocks of 1760MHz... the thing is nuts.

Even in Mordor @ 5K it shows no sign of letting up. I hope this is the same type of stuff we see from the next AMD card, because this is really impressive!


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So 1080 has Async compute...... so much for AMD staying competitive in DX12... what an impressive card.
> 
> Can't wait to see what Vega brings....
> 
> Stock clocks of 1760 boost.... the thing is nuts.
> 
> 
> Even in Mordor @ 5K it shows no sign of letting up. I hope this is the same type of stuff we see from the next AMD card, because this is really impressive!


Yeah, right. It all depends on which reviews you look at. The numbers are all over the place depending on the website, and Guru3D's numbers rarely ever matched my Fury X @ stock.

http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1080-review
Ashes of the Singularity, Extreme, 0x MSAA,

*GTX 1080:*
1080p DX11: - 76.5fps
1080p DX12: - 77.8fps +1.69%

1440p DX11: - 68.6fps
1440p DX12: - 68.9fps +0.43%

4K DX11: - 53.7fps
4K DX12: - 53.3fps -0.74%

*Fury X:*
1080p DX11: - 58.3fps
1080p DX12: - 69.9fps +20%

1440p DX11: - 54.0fps
1440p DX12: - 63.4fps +17.41%

4K DX11: - 43.6fps
4K DX12: - 49.0fps +12.38%

So Nvidia still can't do DX12 correctly, as I expected a few pages back. And when you compare the percentage increase over the Fury X:

GTX 1080 DX12 percentage increase over the Fury X according to Eurogamer:
1080p +11.30%
1440p + 8.67%
4K +8.77%

Obviously they didn't post the full results for both tests, so there's no way to calculate the actual percentage based on the CPU average and the FPS average; we just have to go by their blanket statement. There are plenty of GTX 1080 results in the AotS database that you can compare, and many of them are less than 5% ahead once you factor in the CPU difference.

I actually suggested that the GTX 1080 could be 7%-11% faster than the Fury / Fury Nano / Fury X, but that's only if you don't take the other factors into account [which I do].

These results are hilarious, because some websites are comparing GTX 1080 AotS DX11 results against AMD GPUs running DX12, which isn't fair. Still, I'm taking a lot of these websites' results with a grain of salt when it comes to the Fury X. I wish I knew where they were running their tests, or if they just cranked up the built-in benchmark and took the results from that.
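For anyone who wants to redo this arithmetic, here is a minimal Python sketch using the Eurogamer fps averages quoted above (the `pct_change` helper and the layout are mine, not from any review):

```python
# DX11 -> DX12 uplift per card, and the 1080's DX12 lead over the Fury X,
# using the Eurogamer Ashes of the Singularity averages quoted above.
def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# (DX11 fps, DX12 fps) per resolution
gtx_1080 = {"1080p": (76.5, 77.8), "1440p": (68.6, 68.9), "4K": (53.7, 53.3)}
fury_x   = {"1080p": (58.3, 69.9), "1440p": (54.0, 63.4), "4K": (43.6, 49.0)}

for res in gtx_1080:
    nv11, nv12 = gtx_1080[res]
    amd11, amd12 = fury_x[res]
    print(f"{res}: 1080 DX12 uplift {pct_change(nv11, nv12):+.2f}%, "
          f"Fury X DX12 uplift {pct_change(amd11, amd12):+.2f}%, "
          f"1080 lead over Fury X in DX12 {pct_change(amd12, nv12):+.2f}%")
```

The last column reproduces the 1080p +11.30%, 1440p ~+8.7%, and 4K ~+8.8% DX12 leads discussed above.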


----------



## Agent Smith1984

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah right. It all depends on which reviews you look. The numbers are all over the place depending on the website and Guru numbers rarely ever matched my Fury X @ stock.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1080-review
> Ashes of the Singularity, Extreme, 0x MSAA,
> 
> *GTX 1080:*
> 1080p DX11: - 76.5fps
> 1080p DX12: - 77.8fps +1.69%
> 
> 1440p DX11: - 68.6fps
> 1440p DX12: - 68.9fps +0.43%
> 
> 4K DX11: - 53.7fps
> 4K DX12: - 53.3fps -0.74%
> 
> *Fury X:*
> 1080p DX11: - 58.3fps
> 1080p DX12: - 69.9fps +20%
> 
> 1440p DX11: - 54.0fps
> 1440p DX12: - 63.4fps +17.41%
> 
> 4K DX11: - 43.6fps
> 4K DX12: - 49.0fps +12.38%
> 
> So Nvidia still can't do DX12 correctly as I expected a few pages back and When you compare the percentage increase over the Fury X:
> 
> GTX 1080 DX12 percentage increase over the Fury X according to Eurogamer:
> 1080p +11.30%
> 1440p + 8.67%
> 4K +8.77%
> 
> Obviously they didn't post the results for both test so there's no way to actually calculate the actual percentage based on the CPU average and the FPS average. We just have to go by their blanket statement. There's plenty of GTX 1080 results in the AotS database that you can compare and many of them are less than 5% once you factor in the CPU difference.
> 
> I actually suggested that the GTX 1080 could be 7%-11% faster than the Fury\Fury Nano\FuryX, but that's only if you don't take the other factors into account [which I do].
> 
> These results are hilarious because some websites are comparing GTX 1080 AotS DX11 to AMD GPUs running DX12, which isn't fair. Still I'm taking a lot of these website results with a grain of salt when it comes to the Fury X. I wish I knew where they were running their test or if they just cranked up the built in benchmark and got the results from that.


I'm assuming the reviewers use built-in benches when they can, to save time, but I have found most built-in benches offer little insight into how the game really runs.

All things being fair, the card is pretty impressive either way. My hope is that it's giving us a glimpse of what we will see from AMD soon; especially since NVIDIA has its hardware out before AMD, AMD knows what it's up against.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah right. It all depends on which reviews you look. The numbers are all over the place depending on the website and Guru numbers rarely ever matched my Fury X @ stock.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1080-review
> Ashes of the Singularity, Extreme, 0x MSAA,
> 
> *GTX 1080:*
> 1080p DX11: - 76.5fps
> 1080p DX12: - 77.8fps +1.69%
> 
> 1440p DX11: - 68.6fps
> 1440p DX12: - 68.9fps +0.43%
> 
> 4K DX11: - 53.7fps
> 4K DX12: - 53.3fps -0.74%
> 
> *Fury X:*
> 1080p DX11: - 58.3fps
> 1080p DX12: - 69.9fps +20%
> 
> 1440p DX11: - 54.0fps
> 1440p DX12: - 63.4fps +17.41%
> 
> 4K DX11: - 43.6fps
> 4K DX12: - 49.0fps +12.38%
> 
> So Nvidia still can't do DX12 correctly as I expected a few pages back and When you compare the percentage increase over the Fury X:
> 
> GTX 1080 DX12 percentage increase over the Fury X according to Eurogamer:
> 1080p +11.30%
> 1440p + 8.67%
> 4K +8.77%
> 
> Obviously they didn't post the results for both test so there's no way to actually calculate the actual percentage based on the CPU average and the FPS average. We just have to go by their blanket statement. There's plenty of GTX 1080 results in the AotS database that you can compare and many of them are less than 5% once you factor in the CPU difference.
> 
> I actually suggested that the GTX 1080 could be 7%-11% faster than the Fury\Fury Nano\FuryX, but that's only if you don't take the other factors into account [which I do].
> 
> These results are hilarious because some websites are comparing GTX 1080 AotS DX11 to AMD GPUs running DX12, which isn't fair. Still I'm taking a lot of these website results with a grain of salt when it comes to the Fury X. I wish I knew where they were running their test or if they just cranked up the built in benchmark and got the results from that.


This is a pathetic result from the 1080...

But I can't find any 1080 results in the AotS database?


----------



## Kana-Maru

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm assuming the reviewers use built-in benches when they can, to save time, but I have found most built-in benches offer little insight into how the game really runs.
> 
> All things being fair, the card is pretty impressive either way. My hope is that it's giving us a glimpse of what we will see from AMD soon; especially since NVIDIA has its hardware out before AMD, AMD knows what it's up against.


That's exactly what I've come to realize, which is why I run in-game benchmarks as well as the built-in benchmarks; I've done this for years now. I checked out Guru3D and a few other sites, and it appears Guru and some others are using older results from older drivers; they don't seem to have retested some games after patches and driver updates.

The GTX 1080's high stock clocks are impressive, but DX12 still paints a different picture in my eyes. Guru3D's Hitman results @ 1440p, compared to my results using the built-in benchmark, only put the GTX 1080 FE 12.5% ahead of the Fury X. Once you hit 4K, that GTX 1080 FE lead drops to 6.81%.

Then again, the numbers change from website to website, in some cases dramatically. Looks like 1920x1080 will continue to be a "thing" from now on.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> This is a pathetic result from 1080....
> 
> But, I can't find 1080 results in AotS database?


Well, there were a TON of them before the reviews came out. We still have some screenshots, so maybe they don't want people looking at and comparing results while the press markets their GTX 1080. I'll take a look and see if I can find anything.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> That's exactly what I've come to realize, which is why I run in-game benchmarks as well as the built-in benchmarks; I've done this for years now. I checked out Guru3D and a few other sites, and it appears Guru and some others are using older results from older drivers; they don't seem to have retested some games after patches and driver updates.
> 
> The GTX 1080's high stock clocks are impressive, but DX12 still paints a different picture in my eyes. Guru3D's Hitman results @ 1440p, compared to my results using the built-in benchmark, only put the GTX 1080 FE 12.5% ahead of the Fury X. Once you hit 4K, that GTX 1080 FE lead drops to 6.81%.
> 
> Then again, the numbers change from website to website, in some cases dramatically. Looks like 1920x1080 will continue to be a "thing" from now on.


I will be testing it tomorrow against the Radeon Pro Duo: same benchmark version, same driver. Interesting; but I think yet another lie from Nvidia about DX12 is out.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> I will be testing it tomorrow against RadeonProDuo. Same benchmark version, same driver. Interesting, *but I think yet another lie from Nvidia about DX12 is out*.


Fanboys won't hold them accountable and will throw money at them, so I don't think Nvidia really cares at this point, lol.


----------



## gupsterg

@Alastair

For the purpose of the registers dump, it makes no difference whether the stock ROM has the PowerLimit increase or not.

@Agent Smith1984

After reading a few reviews, I'd say nVidia has the performance crown yet again; hoping now that Vega beats them into the ground.

Now, on Async, I'm not yet sold that they've nailed it; I think time will tell.

For example, this article regarding Maxwell has always stuck in my mind: besides the lack of Async, what got me was what nVidia was asking devs to do.

Now, this article on Pascal explains some of what went on with Maxwell, but it doesn't explain whether Pascal is at the SM/CU level like AMD, nor does any other article I've read. In the TPU review I read:-
Quote:


> The "Pascal" architecture supports Asynchronous Compute as standardized by Microsoft. It adds to that with its own variation of the concept as "Dynamic Load Balancing".


As I've only recently started reading about "Async", I'm unaware whether AMD's implementation exceeds the MS standard.

I'm still happy with the Fury X, as I paid *less than half the price* of what a 980 Ti was costing. Considering what the FE costs in $ and the MSRP of the non-FE, which generally means in the UK we pay the same figure in £ (ie FE $699 = £699), I'm still happy with the Fury X.

In some of the benches where the 295X2 has great CF support, it was nice to see it matching/beating the GTX 1080 @ 1440p and above. When I made the decision to buy the Fury X, I had also noted that as the resolution increased, the gap between it and a Ti closed; IIRC, in some games it beat the Ti.


----------



## 00riddler

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah right. It all depends on which reviews you look. The numbers are all over the place depending on the website and Guru numbers rarely ever matched my Fury X @ stock.
> 
> http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1080-review
> Ashes of the Singularity, Extreme, 0x MSAA,
> 
> *GTX 1080:*
> 1080p DX11: - 76.5fps
> 1080p DX12: - 77.8fps +1.69%
> 
> 1440p DX11: - 68.6fps
> 1440p DX12: - 68.9fps +0.43%
> 
> 4K DX11: - 53.7fps
> 4K DX12: - 53.3fps -0.74%
> 
> *Fury X:*
> 1080p DX11: - 58.3fps
> 1080p DX12: - 69.9fps +20%
> 
> 1440p DX11: - 54.0fps
> 1440p DX12: - 63.4fps +17.41%
> 
> 4K DX11: - 43.6fps
> 4K DX12: - 49.0fps +12.38%
> 
> So Nvidia still can't do DX12 correctly as I expected a few pages back and When you compare the percentage increase over the Fury X:
> 
> GTX 1080 DX12 percentage increase over the Fury X according to Eurogamer:
> 1080p +11.30%
> 1440p + 8.67%
> 4K +8.77%
> 
> Obviously they didn't post the results for both test so there's no way to actually calculate the actual percentage based on the CPU average and the FPS average. We just have to go by their blanket statement. There's plenty of GTX 1080 results in the AotS database that you can compare and many of them are less than 5% once you factor in the CPU difference.
> 
> I actually suggested that the GTX 1080 could be 7%-11% faster than the Fury\Fury Nano\FuryX, but that's only if you don't take the other factors into account [which I do].
> 
> These results are hilarious because some websites are comparing GTX 1080 AotS DX11 to AMD GPUs running DX12, which isn't fair. Still I'm taking a lot of these website results with a grain of salt when it comes to the Fury X. I wish I knew where they were running their test or if they just cranked up the built in benchmark and got the results from that.


It's in German, but I think it is a good comparison of async compute in DX12 between the 1080, 980 Ti and Fury X:

http://www.computerbase.de/2016-05/geforce-gtx-1080-test/11/#abschnitt_pascal_profitiert_tatsaechlich_von_async_compute


----------



## Kana-Maru

Quote:


> Originally Posted by *00riddler*
> 
> It's in German, but I think it is a good comparison of async compute in DX12 between the 1080, 980 Ti and Fury X:
> 
> http://www.computerbase.de/2016-05/geforce-gtx-1080-test/11/#abschnitt_pascal_profitiert_tatsaechlich_von_async_compute


That is the same site that compared Nvidia GTX 1080 DX11 results vs AMD DX12 AotS results in their GTX 1080 benchmark; I know bias when I see it. I'd rather use data from actual Fury users in the Stardock database for comparisons. It still appears Nvidia has issues with Async & DX12.

Quote:


> Originally Posted by *toncij*
> 
> This is a pathetic result from 1080....
> 
> But, I can't find 1080 results in AotS database?


I looked, and you are correct. It appears Nvidia has pulled all of the GTX 1080 benchmarks from the database, including the GPU name itself. Wow, Nvidia: forcing people to get results from the review sites. Well, there's still a ton of GTX 1080 screenshots out there for comparison in the meantime.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> I looked and you are correct. It appears Nvidia has pulled all of the GTX 1080 benchmarks from the database, including the GPU name itself. Wow, Nvidia, forcing people to get results from the review sites.


I wonder what may be the reason for that...

Fishy.

And no, Nvidia so far does not support AC.

We'll see if that has changed with Pascal, but the missing Ashes benchmarks cast some shadow of doubt there.


----------



## Agent Smith1984

I have looked at about 6 different reviews covering about 12-15 games in total, and just pecking at the calculator a bit, it looks like the 1080 is about 25% faster than the Fury X on average. That includes DX12 Hitman, AotS, and Doom, none of which would benefit from driver differences, patches, etc., since they are all so new......

I'm excited about this 1080, regardless of my AMD fandom, because when NVIDIA goes and does something like putting out a new-gen GPU that is about 25-30% faster than their last flagship, they are setting a standard for all of us....

You gotta think that AMD is going to swing back pretty hard later this year with something that will both set a new standard in performance (not with Polaris, but with Vega) and also create price drops large enough to get some of this current high-end stuff down into the hands of the average mainstream user buying $200-300 GPUs..... I'm grown now and spend my own money as I please (when I can, anyway), but there is a whole generation of young people who are on a tight budget, or on mom and dad's gift budget, and will never see anything more than a $300 GPU until they are out on their own and/or working and buying their own hardware. This is the kind of drastic jump that puts very capable current- or last-gen hardware in those folks' hands at a great price, and keeps them away from the consoles


----------



## Agent Smith1984

Well, AMD will stay king in one area for sure:

http://www.pcworld.com/article/3071332/hardware/its-true-nvidias-geforce-gtx-1080-officially-supports-only-2-way-sli-setups.html

Guess they got tired of seeing the red team's almost 100% scaling regardless of the number of adapters


----------



## gupsterg

Reading the info under the heading *Wait Just A Minute... So No More 3 & 4-Way SLI?* on this link.

Seems like too much aggravation, not that I have ever had multi-gpu setup or see myself having multi-gpu.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> See heading *Wait Just A Minute... So No More 3 & 4-Way SLI?* on this link.
> 
> Seems like too much aggravation, not that I have ever had multi-gpu setup or see myself having multi-gpu.


Apply for an enthusiast key??? Are you kidding me???

They are like, "we want to log the people spending thousands on multiple 1080's by name and address" LMAO


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> Reading the info under the heading *Wait Just A Minute... So No More 3 & 4-Way SLI?* on this link.
> 
> Seems like too much aggravation, not that I have ever had multi-gpu setup or see myself having multi-gpu.


Yup, it's dead. Well, at least officially from Nvidia. You can still use a 3rd card solely as a PhysX GPU, but who in their right mind is going to do that? The GTX GPUs can still be used alongside AMD Radeon GPUs if the developers implement the tech [DX12\Vulkan] to allow explicit multi-adapter. This was the case for Radeon and GTX working together in Ashes of the Singularity.

However, if you have the money you can still run 3 or 4 GTX 1080 GPUs??? Then there's that "Enthusiast Key", what the heck. So if you buy one of those SLI bridges, you can potentially run 3- or 4-way SLI after getting the Enthusiast Key. Plus there's officially no support for 3- or 4-way SLI, but they are willing to sell you something that will allow it.

It seems like Nvidia just can't support more than two GPUs anymore, or doesn't want to. I think there's more to it, though. There's no guarantee for 3- or 4-way SLI.


----------



## Alastair

Enthusiast Key! In b4 a $39.99 subscription for 3- or 4-way SLI benefits.


----------



## Agent Smith1984

What's funny is..... NVIDIA owners are thoroughly unimpressed with the 1080.... Lots of folks are saying that because their overclocked 980 Tis (around 1500MHz or so) are coming within 5% of it at stock, it's a failure. Overclocking the 1080 pushes it back up to around 15%+ faster, though.

Some of them are mad they sold their cards for $400 a week ago, LMAO


----------



## alcal

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Apply for an enthusiast key??? Are you kidding me???
> 
> They are like, "we want to log the people spending thousands on multiple 1080's by name and address" LMAO


I hadn't thought about it this way, but what I presumed was you being sarcastic may actually be on point. "If you want to spend $2800 on GPUs that won't properly work together, we gotta get your number so we can hit you up when we have more pointless things to sell you"


----------



## Agent Smith1984

Quote:


> Originally Posted by *alcal*
> 
> I hadn't thought about it this way, but what I presumed was you being sarcastic may actually be on point. "If you want to spend $2800 on GPUs that won't properly work together, we gotta get your number so we can hit you up when we have more pointless things to sell you"


Exactly!


----------



## SuperZan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's funny is..... NVIDIA owners are thoroughly unimpressed with the 1080.... Lots of folks are saying that because their overclocked 980 Tis (around 1500MHz or so) are coming within 5% of it at stock, it's a failure. Overclocking the 1080 pushes it back up to around 15%+ faster, though.
> 
> Some of them are mad they sold their cards for $400 a week ago, LMAO


I think it's just about perspective. Looking at recent history with the 780, 980, and now the 1080 vs. their respective Ti counterparts, unless you were in SEVERE need of an upgrade (like still gaming on an X800 XT or something), the *80 has just been a bad buy. The 980 to 980 Ti in particular was in the same pricing ballpark, but you got an extra 2GB of VRAM and a nice performance increase over the 980. Now with the presumed 1080 Ti you're looking at HBM2, perhaps, and even more performance gains - I just can't see the 1080 for a 390X-or-better owner.


----------



## Agent Smith1984

Quote:


> Originally Posted by *SuperZan*
> 
> I think it's just about perspective. Looking at recent history with the 780, 980, and now the 1080 vs. their respective Ti counterparts, unless you were in SEVERE need of an upgrade (like still gaming on an X800 XT or something), the *80 has just been a bad buy. The 980 to 980 Ti in particular was in the same pricing ballpark, but you got an extra 2GB of VRAM and a nice performance increase over the 980. Now with the presumed 1080 Ti you're looking at HBM2, perhaps, and even more performance gains - I just can't see the 1080 for a 390X-or-better owner.


I dunno, I just sold off my 390X, and honestly, when the games I play most, like GTA V, show the 1080 being twice as fast at 4K, it makes sense to me.

I would have either spent another $300+ on a second 390X for CF and used 600W+ of power while the GPUs battle 90C temps, or sold the 390X for $300, added the $300 I was already planning to spend, and seen the same performance...... but again, it's like you said, perspective is key. If people can wait 8-12 months for the 1080 Ti, then they stand to get an amazing card for around $650, and the 1080 should go down to around $500 or so. I am most curious to see how the 1070 does, because it may very well be my next card if it offers overclocked 980 Ti performance at $379..... I won't really have time to wait for Vega (but may check it out when the time comes), and Polaris has been openly slated by AMD to be a mainstream-only offering...


----------



## SuperZan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I dunno, I just sold off my 390X, and honestly, when the games I play most, like GTA V, show the 1080 being twice as fast at 4K, it makes sense to me.
> 
> I would have either spent another $300+ on a second 390X for CF and used 600W+ of power while the GPUs battle 90C temps, or sold the 390X for $300, added the $300 I was already planning to spend, and seen the same performance...... but again, it's like you said, perspective is key. If people can wait 8-12 months for the 1080 Ti, then they stand to get an amazing card for around $650, and the 1080 should go down to around $500 or so. I am most curious to see how the 1070 does, because it may very well be my next card if it offers overclocked 980 Ti performance at $379..... I won't really have time to wait for Vega (but may check it out when the time comes), and Polaris has been openly slated by AMD to be a mainstream-only offering...


True, and IIRC you play at 4K as well. I'm happy there with two Furies but I could absolutely see wanting to upgrade a single-card solution at this resolution. That's probably the best argument anybody could make for a 1080, I agree.


----------



## bluezone

A little while back, AotS was re-reviewed, post-release, on one of the tech sites. In the discussion (argument) thread that followed, one ardent NVidia fan posited that the GTX 980 Ti could easily match the R9 Fury X's frame rate by raising GPU clocks to match the FP performance of the R9 Fury X, without async compute, and he proved this by overclocking his card and posting his results.

From what I've seen so far with benchmarking of the GTX 1080, the performance seems to track with the improved clock speed over the GTX 980 Ti, meaning the FP performance is quicker as it relates to AotS. Where is async showing any help?


----------



## toncij

Quote:


> Originally Posted by *bluezone*
> 
> A little while back, AotS was re-reviewed, post-release, on one of the tech sites. In the discussion (argument) thread that followed, one ardent NVidia fan posited that the GTX 980 Ti could easily match the R9 Fury X's frame rate by raising GPU clocks to match the FP performance of the R9 Fury X, without async compute, and he proved this by overclocking his card and posting his results.
> 
> From what I've seen so far with benchmarking of the GTX 1080, the performance seems to track with the improved clock speed over the GTX 980 Ti, meaning the FP performance is quicker as it relates to AotS. Where is async showing any help?


Not sure what he posted or how, but here is a single fact: if a 980 Ti and a Fury X have identical performance in DX11 and you then move to a DX12 benchmark, the Fury X will be faster, and the 980 Ti theoretically cannot match it. To match it, the 980 Ti needs to be overclocked to reach matching DX12 performance, which would give it more DX11 performance in return.

For the very same DX11 performance, DX12 favors the piece of hardware supporting async shaders. It's rather simple. Now, in the case where Ashes is benched showing the Fury X being slower in DX11 than the 980 Ti and then matching and exceeding the 980 Ti's performance in DX12, we have a clear architecture win.

That's a big if though, because in practice two cards are never the same. It's worth nothing to have better hardware because of AS support if the competing product is as fast or faster without it. For all intents and purposes, of course.
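To put that reasoning into numbers — a quick illustrative sketch of comparing each card's DX11-to-DX12 scaling (all FPS figures are made up, not from any review):

```python
# Illustrative sketch: compare each card's DX11 -> DX12 scaling.
# A positive gain suggests async compute is helping; flat or negative
# suggests it is not. All FPS numbers below are hypothetical.

def dx12_gain(dx11_fps: float, dx12_fps: float) -> float:
    """Percent change moving from DX11 to DX12 on the same card."""
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

fury_x_gain = dx12_gain(dx11_fps=40.0, dx12_fps=48.0)     # hypothetical: +20%
gtx_980ti_gain = dx12_gain(dx11_fps=45.0, dx12_fps=44.0)  # hypothetical: slightly negative

print(f"Fury X  DX11->DX12: {fury_x_gain:+.1f}%")
print(f"980 Ti  DX11->DX12: {gtx_980ti_gain:+.1f}%")
```

The point is that the scaling percentage, not the raw FPS, is what indicates whether the architecture benefits from the API change.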


----------



## bluezone

Quote:


> Originally Posted by *toncij*
> 
> Not sure what he posted or how, but here is a single fact: if a 980 Ti and a Fury X have identical performance in DX11 and you then move to a DX12 benchmark, the Fury X will be faster, and the 980 Ti theoretically cannot match it. To match it, the 980 Ti needs to be overclocked to reach matching DX12 performance, which would give it more DX11 performance in return.
> 
> For the very same DX11 performance, DX12 favors the piece of hardware supporting async shaders. It's rather simple. Now, in the case where Ashes is benched showing the Fury X being slower in DX11 than the 980 Ti and then matching and exceeding the 980 Ti's performance in DX12, we have a clear architecture win.
> 
> That's a big if though, because in practice two cards are never the same. It's worth nothing to have better hardware because of AS support if the competing product is as fast or faster without it. For all intents and purposes, of course.


Good points, and yes, we're talking DX12 performance. The GTX 1080 supports AS from what I've read and heard so far, but the relative performance improvements from AS don't seem to show. From what I see, it's a hair lower than I had expected even without AS. This, of course, with immature drivers.


----------



## toncij

Quote:


> Originally Posted by *bluezone*
> 
> Good points, and yes, we're talking DX12 performance. The GTX 1080 supports AS from what I've read and heard so far, but the relative performance improvements from AS don't seem to show. From what I see, it's a hair lower than I had expected even without AS. This, of course, with immature drivers.


From multiple results posted here: the 1080 moving from DX11 to DX12 shows no improvement. That clearly proves there is no async shader support. It's got nothing to do with drivers. One little fact you are not supposed to know: a specialty of DirectX 12, Mantle, Vulkan and Metal is that the driver is minimal and there are no more heavy optimizations in the driver. In DX11 Nvidia had heavy optimizations for just about everything, but DX12 moves the majority of the work to the game developers.


----------



## danjal

Then what about the benchmarks showing the GTX 1080 running the new Doom with the Vulkan API at over 200fps? To be honest, I think both companies are blowing smoke; who's blowing more has yet to be determined, but it will be determined...

My Sapphire Fury does not impress me one bit. It's an overpriced, underperforming GPU as far as I'm concerned.


----------



## flopper

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I dunno, I just sold off my 390X, and honestly, when the games I play most, like GTA V, show the 1080 being twice as fast at 4K, it makes sense to me.
> 
> I would have either spent another $300+ on a second 390X for CF and used 600W+ of power while the GPUs battle 90C temps, or sold the 390X for $300, added the $300 I was already planning to spend, and seen the same performance...... but again, it's like you said, perspective is key. If people can wait 8-12 months for the 1080 Ti, then they stand to get an amazing card for around $650, and the 1080 should go down to around $500 or so. I am most curious to see how the 1070 does, because it may very well be my next card if it offers overclocked 980 Ti performance at $379..... I won't really have time to wait for Vega (but may check it out when the time comes), and Polaris has been openly slated by AMD to be a mainstream-only offering...


Can't argue with that.
A single card is the better option.


----------



## danjal

GTX 1080 benchmarks with DX12 included. As I suspected, it runs DX12 pretty well. I think Nvidia just embarrassed AMD, actually.

1440p:

4K:


----------



## Flamingo

Heh, guess I'm sticking with my Nano for now then, with prices like this in Australia



Also, I seriously doubt the GTX 1070 would be a convincing upgrade over the Nano. When is the review for the 1070 due?

It's kinda annoying, to be honest, the whole vendor thing of hijacking game titles and benchmark software to make sure their hardware runs better, or rather that the other vendor's hardware runs slower.


----------



## Kana-Maru

I've written an article about the GTX 1080 and the issues surrounding the reveal and now the NDA lift. Some things were written before the reviews were released to the public. Nvidia has done a great job with the card overall, but there are still some issues that have to be addressed. I know the hype train is completely off the rails now, but I had to write an article on it.

*GTX 1080 - What's not being discussed*

http://www.overclock-and-game.com/news/pc-gaming/46-gtx-1080-what-s-not-being-discussed

I haven't posted this on the Nvidia side yet. I think it'll be better to let them calm down a little and get some sleep. When I post it tomorrow, I'm sure they will be pounding me with insults, but I won't care. This wasn't necessarily a pro-Nvidia article, and I know how some people can get.


----------



## bluezone

Quote:


> Originally Posted by *Kana-Maru*
> 
> I've written an article about the GTX 1080 and the issues surrounding the reveal and now the NDA lift. Some things were written before the reviews were released to the public. Nvidia has done a great job with the card overall, but there are still some issues that have to be addressed. I know the hype train is completely off the rails now, but I had to write an article on it.
> 
> *GTX 1080 - What's not being discussed*
> 
> http://www.overclock-and-game.com/news/pc-gaming/46-gtx-1080-what-s-not-being-discussed
> 
> I haven't posted this on the Nvidia side yet. I think it'll be better to let them calm down a little and get some sleep. When I post it tomorrow, I'm sure they will be pounding me with insults, but I won't care. This wasn't necessarily a pro-Nvidia article, and I know how some people can get.


Very good article. Nice insights.

Yes, the 1080 has been a bit fishy on the review sites.

REP+1


----------



## Kana-Maru

Quote:


> Originally Posted by *bluezone*
> 
> Very good article. Nice insights.
> 
> Yes, the 1080 has been a bit fishy on the review sites.
> 
> REP+1


Thanks, I just felt like speaking my mind. AMD isn't perfect by any means, but they rarely get a fair shake. I don't expect any journalist to call Nvidia out either. They are afraid to shake the big green monster that owns a large portion of the GPU market.

I've posted this around the net and some Nvidia fans [they were running GTX GPUs], or supposedly Nvidia fans, have been trying to chew me out, but they aren't reading the entire article before trying to slander me lol.


----------



## bluezone

Quote:


> Originally Posted by *Kana-Maru*
> 
> Thanks, I just felt like speaking my mind. AMD isn't perfect by any means, but they rarely get a fair shake. I don't expect any journalist to call Nvidia out either. They are afraid to shake the big green monster that owns a large portion of the GPU market.


I'm busy right now reading through the argument section at Hard[OCP] (sorry, discussion section), and the NVidia defence league is in full force there at the slightest negative suggestion toward the 1080 or its testing methods. LOL

So I know what you mean.

EDIT: they do bring up heat-soak problems and power-delivery throttling similar to the Nano


----------



## SuperZan

Quote:


> Originally Posted by *bluezone*
> 
> I'm busy right now reading through the argument section at Hard[OCP] (sorry, discussion section), and the NVidia defence league is in full force there at the slightest negative suggestion toward the 1080 or its testing methods. LOL
> 
> So I know what you mean.
> 
> EDIT: they do bring up heat-soak problems and power-delivery throttling similar to the Nano


I would expect nothing less in Kyle's stronghold of doom. Seriously though, the hampered OC performance on the FE is interesting for a card that relies so heavily on clockspeed for performance. I wonder what the AIB non-reference designs will do to counter that issue.


----------



## danjal

Quote:


> Originally Posted by *SuperZan*
> 
> I would expect nothing less in Kyle's stronghold of doom. Seriously though, the hampered OC performance on the FE is interesting for a card that relies so heavily on clockspeed for performance. I wonder what the AIB non-reference designs will do to counter that issue.


Another 8-pin connector, better cooling...

I don't know where you're getting hampered overclocking performance; it's going over 2GHz on the FE/reference card.

It sure is making my Sapphire Nitro Fury look like an overpriced POS in every aspect.


----------



## Agent Smith1984

I wouldn't say they embarrassed them; I would say they showed them how strong they need to counter with Vega. That's kinda how this works.....


----------



## dagget3450

Everyone has their own preferences on new hardware. I realize many people are amazed at the 1080's power consumption. While it is nice, I couldn't give a rat's arse about power consumption or heat. I would rather have raw performance over all else. While the 1080 is faster than current offerings, it's not by much. I will wait to see the next big GPUs, but the whole low-power thing is bothering me when it takes away from performance. AMD is pushing less power and efficiency as well.

I want raw performance; heat and power can be tamed with water cooling and a decent PSU. That's just me, I suppose.


----------



## JackCY

Not everyone has hundreds of dollars in water cooling, and high power consumption goes onto your electric bill constantly. Another disadvantage of the 10xx series is that it can only do low-power multi-monitor with up to 2 displays; with 3, the power goes nuts as always, at least that's what it seems from the reviews that managed to test 3-monitor power consumption and not just 2 monitors.
Overall the 10xx series seems very cut down to reduce cost; the 1070 is cut down to sell as many bad chips as possible, even omitting the G5X memory to cut costs further. That's production cost; of course they sell both overpriced as always, $50-$150 more than the previous-generation 970/980.

Can't wait for 14nm AMD chips with better prices.


----------



## Agent Smith1984

Quote:


> Originally Posted by *dagget3450*
> 
> Everyone has their own preferences on new hardware. I realize many people are amazed at the 1080's power consumption. While it is nice, I couldn't give a rat's arse about power consumption or heat. I would rather have raw performance over all else. While the 1080 is faster than current offerings, it's not by much. I will wait to see the next big GPUs, but the whole low-power thing is bothering me when it takes away from performance. AMD is pushing less power and efficiency as well.
> 
> I want raw performance; heat and power can be tamed with water cooling and a decent PSU. That's just me, I suppose.


I agree, unless you are considering the people who will not be water cooling and who need a card that runs fairly cool on air.... I fit into that group for now.

What I will say, though, is that if I were to go with a 1070 or 1080, I would be waiting for the partner boards, especially something that may have 2 power connectors..... every overclocking test I've seen shows the overclock stopping in the early 2000-2100MHz range and bouncing around due to TDP limitations. The thing runs on 1.1V...... if you could get one to take around 1.2V like Maxwell, it might clock in the 2300MHz range and really kick out some sick numbers....


----------



## SuperZan

Quote:


> Originally Posted by *danjal*
> 
> Another 8-pin connector, better cooling...
> 
> 
> I don't know where you're getting hampered overclocking performance; it's going over 2GHz on the FE/reference card.
> 
> It sure is making my Sapphire Nitro Fury look like an overpriced POS in every aspect.


It's not about higher clocks in a vacuum. It has higher base+boost clocks as a necessary condition for its performance increase over Maxwell. Just saying it's hitting a certain clock, without context, is meaningless. In actual demonstrated OC headroom, the FE is actually quite similar to Fiji.

You should separate your dislike of your Fury from your analysis of the 1080.


----------



## Kana-Maru

Quote:


> Originally Posted by *dagget3450*
> 
> Everyone has their own preferences on new hardware. I realize many people are amazed at the 1080's power consumption. While it is nice, I couldn't give a rat's arse about power consumption or heat. I would rather have raw performance over all else. While the 1080 is faster than current offerings, it's not by much. I will wait to see the next big GPUs, but the whole low-power thing is bothering me when it takes away from performance. AMD is pushing less power and efficiency as well.
> 
> I want raw performance; heat and power can be tamed with water cooling and a decent PSU. That's just me, I suppose.


I agree. Raw performance needs to come back without people complaining about power and heat.

Quote:


> Originally Posted by *JackCY*
> 
> Not everyone has hundreds of dollars in water cooling, and high power consumption goes onto your electric bill constantly. Another disadvantage of the 10xx series is that it can only do low-power multi-monitor with up to 2 displays; with 3, the power goes nuts as always, at least that's what it seems from the reviews that managed to test 3-monitor power consumption and not just 2 monitors.
> Overall the 10xx series seems very cut down to reduce cost; the 1070 is cut down to sell as many bad chips as possible, even omitting the G5X memory to cut costs further. That's production cost; of course they sell both overpriced as always, $50-$150 more than the previous-generation 970/980.
> 
> Can't wait for 14nm AMD chips with better prices.


Sounds like some people shouldn't be buying power-hungry cards then. That shouldn't stop companies from releasing raw performance. Those who can't afford high-end cards and the cost to control heat, or those who worry about their electricity or heat, should purchase something else. To me it's like a sports car. Don't complain about the sports car's gas mileage; go buy a Prius instead. Some people want speed instead of efficiency. Some people only want efficiency. Some people want both power & efficiency, and when you want both, that's when you start getting 4 cylinders in freaking muscle cars [I'm looking at you, Camaro and Mustang, with your funny-sounding engines], and all of those should be options for GPUs as well.

Can't wait for the GPU battle to heat up as well.


----------



## ITAngel

How are the FURY X cards doing with the latest drivers? I've been thinking of maybe trading my EVGA GTX 980 Ti Classified card for a FURY X card, but I'm not sure, as I feel the technology is still fairly new.


----------



## Medusa666

I just ordered a Radeon Pro Duo to future-proof the PC for the coming 2-4 years; I will play at 1440p. Is this a good idea?

The CPU is a 5960X.

Let me know what you think.


----------



## toncij

Quote:


> Originally Posted by *Medusa666*
> 
> I just ordered a Radeon Pro Duo to future-proof the PC for the coming 2-4 years; I will play at 1440p. Is this a good idea?
> 
> The CPU is a 5960X.
> 
> Let me know what you think.


The RPD is a dual-Nano card. Nothing more, nothing less, except for 50% more cash. Unless you need a single-card CrossFire solution, I see no point.


----------



## Medusa666

Quote:


> Originally Posted by *toncij*
> 
> The RPD is a dual-Nano card. Nothing more, nothing less, except for 50% more cash. Unless you need a single-card CrossFire solution, I see no point.


Oh, I thought it was equal to Fury X?


----------



## bluezone

Quote:


> Originally Posted by *ITAngel*
> 
> How are the FURY X cards doing with the latest drivers? I've been thinking of maybe trading my EVGA GTX 980 Ti Classified card for a FURY X card, but I'm not sure, as I feel the technology is still fairly new.


The drivers have been arriving quite frequently and as a result are slowly improving. They are still a work in progress but have come quite a way from launch.
If you're stuck between DX11 and DX12, you're likely more than good with what you have now. The GTX 980 Ti is a very nice card.


----------



## ITAngel

Quote:


> Originally Posted by *bluezone*
> 
> The drivers have been arriving quite frequently and as a result are slowly improving. They are still a work in progress but have come quite a way from launch.
> If you're stuck between DX11 and DX12, you're likely more than good with what you have now. The GTX 980 Ti is a very nice card.


Hey, thanks for the info, much appreciated. Yeah, I like it; it has run everything. I'm just curious about the FURY X since I was impressed with the 290X back then.


----------



## Alwrath

Just got my $495 Fury X open-box from Newegg, installed it into my system, powered on and... nothing. The fans started spinning for about 1 second, then the computer wouldn't turn on. Uh oh. Did some digging and researching, and found out the Corsair HX1000 I bought back in 2009 is actually two 500W power supplies on 2 separate 12V rails rated at 40 amps each. The CPU was using 12V rail 1, the GPU was on 12V rail 2, and it won't power on. Ordered a 1200W single-12V-rail Rosewill Quark. Can't wait to get it and test this Fury X out


----------



## RatPatrol01

Quote:


> Originally Posted by *Medusa666*
> 
> Oh, I thought it was equal to Fury X?


Nano and Fury X are the same chip, just different boards


----------



## danjal

Then Nvidia adds another power connector and HBM and really cranks up the GP104 in the 1080 Ti and X-Titan. And don't forget the dual-GPU card..

It should be interesting nonetheless; I still don't think AMD's architecture is all it's cracked up to be.. If it were, the Fury/Fury X would have been more powerful.


----------



## Alastair

Quote:


> Originally Posted by *toncij*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Medusa666*
> 
> I just ordered a Radeon Pro Duo to future-proof the PC for the coming 2-4 years; I will play at 1440p. Is this a good idea?
> 
> The CPU is a 5960X.
> 
> Let me know what you think.
> 
> 
> 
> The RPD is a dual-Nano card. Nothing more, nothing less, except for 50% more cash. Unless you need a single-card CrossFire solution, I see no point.

No, you are wrong. The PD has better power delivery per GPU than a Nano has: 5+1 phases on the PD vs. 4+1 on the Nano.


----------



## Alastair

Quote:


> Originally Posted by *Medusa666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *toncij*
> 
> The RPD is a dual-Nano card. Nothing more, nothing less, except for 50% more cash. Unless you need a single-card CrossFire solution, I see no point.
> 
> 
> 
> Oh, I thought it was equal to Fury X?

He is wrong. Ignore the comment. The PD has better power delivery than a Nano.


----------



## Alastair

Quote:


> Originally Posted by *danjal*
> 
> Then Nvidia adds another power connector and HBM and really cranks up the GP104 in the 1080 Ti and X-Titan. And don't forget the dual-GPU card..
> 
> It should be interesting nonetheless; I still don't think AMD's architecture is all it's cracked up to be.. If it were, the Fury/Fury X would have been more powerful.


I don't think you know what you think you know.

1. GP104 is the 1070 and 1080. The Ti and Titan are GP100. And historically speaking, the large-die parts are not clocked much higher, if at all, compared to the smaller dies.
2. Oh where, oh where, is dual-chip Maxwell?
3. GP100 is available in the here and now, if you can afford a Tesla with some of the SMXs disabled. The fact is, if NV can't get FULLY working dies into the supercomputer scene, it will still be a good 6 months+ until we see consumer GP100 parts.
4. It is already evident how powerful GCN is as an architecture. We have cards dating back to 2012 still taking the fight to Nvidia and still doing well at it too. Add to that the fact that a 1080 needs around 45% higher clockspeeds to start pulling ahead of a 1.1GHz Fury X, and all of a sudden you can see where the advantage lies: with AMD. It is actually so blindingly obvious. 1.1GHz on a Fury X = 1.6+GHz on a 1080. It is quite clear from the get-go that Nvidia needs to brute-force its way through the MHz barriers if they want to compete effectively with LAST-gen GCN. AMD has been refining GCN for years now; we won't just be seeing clock speed improvements with GCN4, that is for sure.
5. How could the Fury X be more powerful? It already competes quite handily against its intended competitor (the 980 Ti), beats it at 4K, and wrecks it when DX12 is thrown into the mix. Why should it be more powerful?
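Putting numbers on point 4 — a quick sketch of the clock-ratio arithmetic (the 1.6GHz parity point is the claim above, assumed purely for illustration):

```python
# Making the clock-ratio arithmetic explicit. The 1.6GHz parity point is the
# claim from the post above, taken here purely for illustration.

fury_x_clock_ghz = 1.1
gtx_1080_parity_clock_ghz = 1.6

ratio = gtx_1080_parity_clock_ghz / fury_x_clock_ghz
print(f"The 1080 needs ~{ratio:.2f}x the clock, i.e. about {(ratio - 1) * 100:.0f}% higher")
```

That works out to roughly 1.45x, i.e. about 45% higher clocks at the assumed parity point.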


----------



## Mega Man




----------



## bluezone

Quote:


> Originally Posted by *ITAngel*
> 
> Hey, thanks for the info, much appreciated. Yeah, I like it; it has run everything. I'm just curious about the FURY X since I was impressed with the 290X back then.


The 290X/390X are still great cards. The R9 295X2 is trading blows with the GTX 1080 @ 4K, although the power bill is a little harsh.


----------



## ITAngel

Quote:


> Originally Posted by *bluezone*
> 
> The 290X/390X are still great cards. The R9 295X2 is trading blows with the GTX 1080 @ 4K, although the power bill is a little harsh.


Oh I see, the R9 295X2 looks to be a pretty cool card. =) Thanks for the info.


----------



## toncij

Quote:


> Originally Posted by *Alastair*
> 
> He is wrong. Ignore the comment. The PD has better power delivery than a Nano.


Does having better power delivery really mandate a 60% higher price?


----------



## RatPatrol01

Quote:


> Originally Posted by *toncij*
> 
> Does having better power delivery really mandate a 60% higher price?


I don't think the point was to say whether or not it's worth the price, just that the RPD isn't really two nano's, it's closer to two fury x's


----------



## toncij

Quote:


> Originally Posted by *RatPatrol01*
> 
> I don't think the point was to say whether or not it's worth the price, just that the RPD isn't really two nano's, it's closer to two fury x's


And two Fury Xs are pretty much the same thing; it's just that the RPD is stock-clocked like Nanos.


----------



## ITAngel

From what I have seen so far, both cards' technology is fairly new. I think the drivers for the Fury are not yet optimized, but the same can be said about the GTX 1080. They both have their pros and cons, but so far the 1080, based on performance, is winning this race. AMD has been very quiet about their new GPU line, though; normally they advertise and so on, but this time they are very quiet. I wonder what they are doing right now....

On the other hand, I find the Fury card very interesting, and unlike NVIDIA, when AMD fixes their drivers the cards actually perform way better than anything I have seen on the NVIDIA side. I own a GTX 980 Ti Classified, and before that I had a 290X plus a Devil 13 290X2 card. I am just sitting here waiting and watching what happens in the GPU market.

If I could get a Fury/Fury X for a good price I wouldn't mind picking one up, since I play on a 1080p @ 60Hz LED LCD monitor.

However, benchmarks like those versus real-world performance, meaning actual gaming and application usage, may end up giving whole different results in the end. It may look good now but not so good later on, if you know what I mean.


----------



## Flamingo

Hmm, the GTX 1080 at its AU price is not worth upgrading to over the R9 Nano. The GTX 1070 might be, but I don't think the performance improvement would be big enough to go for an upgrade.

Gotta wait for 1070 benches and release dates.


----------



## danjal

So they drop support for titles going forward? So AMD's solution is to keep running old, outdated drivers?


----------



## Gdourado

How does a Nano behave with the stock air cooler and a 50% higher power limit?
Can it keep 1000MHz constant while gaming, or does it throttle?


----------



## Alastair

Quote:


> Originally Posted by *Gdourado*
> 
> How does a Nano behave with the stock air cooler and 50% more power limit?
> Can it keep 1000mhz constant while gaming? Or does it throttle?


Yes, it can maintain 1000. Maybe also set a custom fan curve in Afterburner to assist it.


----------



## Gdourado

Quote:


> Originally Posted by *Alastair*
> 
> yes it can maintain 1000. Maybe also just set a custom fan curve in Afterburner to assist it.


So what is the better buy for gaming?
An air-cooled Fury with the cut-down Fiji chip but two 8-pin power connectors and a beefy cooler?
Or a Nano with all the stream processors but less power available?
Can a Fury overclock much beyond 1000 to outperform the Nano with its 4096 processors at 1000?


----------



## JunkaDK

Quote:


> Originally Posted by *Gdourado*
> 
> So what is the better buy for gaming?
> An air-cooled Fury with the cut-down Fiji chip but two 8-pin power connectors and a beefy cooler?
> Or a Nano with all the stream processors but less power available?
> Can a fury overclock much beyond 1000 to outperform the Nano with the 4096 processors at 1000?


I am pretty sure it can, yes! An overclocked Fury can perform like a Fury X or 980 Ti in lots of newer titles.


----------



## RatPatrol01

Quote:


> Originally Posted by *Gdourado*
> 
> So what is the better buy for gaming?
> An air-cooled Fury with the cut-down Fiji chip but two 8-pin power connectors and a beefy cooler?
> Or a Nano with all the stream processors but less power available?
> Can a fury overclock much beyond 1000 to outperform the Nano with the 4096 processors at 1000?


I love my nano, and it can overclock like a champ with +50% power limit, and hold it there, but I was not a fan of the stock cooler. IMO if you plan to overclock on air, the fury is probably the more flexible option.


----------



## ITAngel

Quote:


> Originally Posted by *RatPatrol01*
> 
> I love my nano, and it can overclock like a champ with +50% power limit, and hold it there, but I was not a fan of the stock cooler. IMO if you plan to overclock on air, the fury is probably the more flexible option.


That is good to know. I have never owned a Fury, so I was wondering how flexible it is. If I ever got one I would need to put it on the EK loop in my system.


----------



## Gdourado

They cost about the same:
an XFX Fury Triple Dissipation or an ASUS R9 Nano.
Pretty much 4096 stream processors vs. better cooling and power delivery.
But if a Nano, even on stock cooling, doesn't throttle with an increased power limit, it might be the way to go.
From what I read, Fiji is not a great clocker.
Can get 1150 at most.
Might as well go for a stable 1050 with the Nano and the full Fiji chip.

Am I thinking right?


----------



## Alastair

Quote:


> Originally Posted by *Gdourado*
> 
> They cost about the same.
> XFX fury triple dissipation or an Asus R9 Nano.
> Pretty much 4096 stream processors or better cooling and power delivery.
> But if a Nano, even on stock cooling doesn't throttle with increased power limit, might be the way to go.
> From what I read, Fiji is not a great clocker.
> Can get 1150 at most.
> Might as well go for a stable 1050 with the Nano and the full Fiji chip.
> 
> Am I thinking right?


You know, I would try to get your hands on a Tri-X if you can find an older one and try unlocking it to 3840 SPs. It is a roll of the dice, but you had a good chance at it until AMD sniffed out what we were up to. So if you can find a Tri-X from within 4-5 months of the launch date, you can get both the power delivery AND the performance.

As for overclocking: I am at 1100MHz on stock volts. I am about to dump my BIOS registers for Gupsterg so we can get a BIOS mod going, and then I am shooting for 1200. (My one card could maintain 1160MHz without additional voltage, but it does only run at 36C because of the blocks.)


----------



## Medusa666

I'm starting to regret my Radeon Pro Duo buy, considering Polaris and Vega will come this year.

I can still send it back, should I?


----------



## RatPatrol01

Quote:


> Originally Posted by *ITAngel*
> 
> That is good to know, I have never own a Fury so I was wondering about that, how flexible it is. If I ever got one I need to put it on my EK loop on my system.


If you are going with watercooling, I have been really impressed with the Nano once you can cool it properly and quietly.
Quote:


> Originally Posted by *Gdourado*
> 
> They cost about the same.
> XFX fury triple dissipation or an Asus R9 Nano.
> Pretty much 4096 stream processors or better cooling and power delivery.
> But if a Nano, even on stock cooling doesn't throttle with increased power limit, might be the way to go.
> From what I read, Fiji is not a great clocker.
> Can get 1150 at most.
> Might as well go for a stable 1050 with the Nano and the full Fiji chip.
> 
> Am I thinking right?


Best I was able to do on air with my Nano was 1020 without the fan getting unusably loud and then eventually throttling a bit. With water it's holding strong at 1100 on stock voltage.


----------



## flopper

Quote:


> Originally Posted by *Medusa666*
> 
> I'm starting to regret my Radeon Pro Duo buy, considering Polaris and Vega will come this year.
> 
> I can still send it back, should I?


I would rather buy Vega, but that's 6+ months away.


----------



## xchrisposix

Just picked up a 390X a few days ago. It's been great, and I haven't had any issues so far.


----------



## SpeedyVT

Quote:


> Originally Posted by *Medusa666*
> 
> I'm starting to regret my Radeon Pro Duo buy, considering Polaris and Vega will come this year.
> 
> I can still send it back, should I?


I wouldn't have bought one unless you're doing 3D modeling; that's where the real benefit is. Gaming will always stagnate due to non-productive practices by all parties. The Pro Duo is faster than the 1080, stock for stock.


----------



## Medusa666

Quote:


> Originally Posted by *flopper*
> 
> I would rather buy Vega.
> but thats 6 months to go+


Yeah, I know. The Pro Duo is an expensive card; I could get a dual-1080 setup for the same money.

What do we know about Vega?


----------



## danjal

Quote:


> Originally Posted by *Medusa666*
> 
> I'm starting to regret my Radeon Pro Duo buy, considering Polaris and Vega will come this year.
> 
> I can still send it back, should I?


I'd send it back, get a 1080 for half the price, then put the extra money aside for Vega.


----------



## SuperZan

Quote:


> Originally Posted by *Medusa666*
> 
> Yeah I know, the Pro Duo is an expensive card, I could get a double 1080 setup for it.
> 
> What do we know about Vega?


We know it's aiming for the high end; spec-wise, the only definitive thing we know is that it'll have HBM2.


----------



## SpeedyVT

Quote:


> Originally Posted by *danjal*
> 
> I'd send it back and get a 1080 for half the price then put that extra money back for a vega.


I'd just buy Polaris for now; its low wattage and relative performance would make it an ultimate HTPC/living-room gaming machine.


----------



## RatPatrol01

If the 295X2 is any indicator, the RPD should sit at the top of the performance rankings for a decent chunk of time; of course, proper support on the software side is the real potential problem.


----------



## Alastair

Nice and stable 1100/550 on both my Furys at stock voltage. In terms of applied voltage, that is 1.181V for one card and 1.218V for the other; applied voltages under load are in the 1.2V range. If only we could get decent voltage control without the negative scaling issues, that would be great.
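For reference, the second number in clocks like 1100/550 is the HBM clock, and on Fiji's 4096-bit bus it maps to peak bandwidth in a straightforward way (a back-of-envelope sketch; the 512 GB/s stock result matches Fiji's published spec):

```python
def hbm1_bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int = 4096) -> float:
    """Peak HBM1 bandwidth in GB/s: bus width in bytes x 2 transfers
    per clock (double data rate) x memory clock in MHz."""
    return bus_width_bits / 8 * 2 * mem_clock_mhz / 1000

print(hbm1_bandwidth_gbps(500))  # stock Fiji HBM: 512.0 GB/s
print(hbm1_bandwidth_gbps(550))  # the 1100/550 overclock above
```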


----------



## toncij

The RPD is a beast. It will sit on the throne for quite some time. But the 1080 is a single card that gets close to it in performance when heavily overclocked... Vega might beat the RPD too (it probably will if done properly). But I don't think Vega or Volta (1080 Ti?) is coming any time soon...


----------



## SpeedyVT

Quote:


> Originally Posted by *toncij*
> 
> RPD is a beast. It will sit on the throne for quite some time. But, 1080 is a single card close to it by performance when heavily overclocked... Vega might blast RPD too (will probably if done properly). But I don't think Vega or Volta (1080Ti?) are coming any time soon...


The Radeon Pro Duo is for developers; it has features for rendering you can't buy in a normal GPU. Buying it for gaming is a terrible idea, considering a lot of games now require larger VRAM.


----------



## RatPatrol01

That kid's posts in this thread continue to baffle me


----------



## toncij

Quote:


> Originally Posted by *SpeedyVT*
> 
> Radeon Pro Duo is for developers it has features you can't buy in a normal GPU for rendering. Buying it for gaming is a terrible idea considering a lot of games are requiring larger VRAM.


Yes, it's not the best choice cost-wise, but I don't see any special features I wouldn't find in other Fury cards. We're using them as a convenient way to have dual Fiji in a small case, purely for explicit multi-adapter development. For a gamer, I'd rather go with dual Nanos and watercool them with EKWB.


----------



## Thoth420

Anyone playing Doom on a single Fury X? Wondering what the performance is like with everything cranked to the max (aside from AA; FXAA would be fine) at 2560x1440.


----------



## JunkaDK

Quote:


> Originally Posted by *Thoth420*
> 
> Anyone playing Doom on a single Fury X? Wondering what the perf is like with everything cranked to the max(aside AA, FXAA would be fine) at 2560 x 1440.


I'm playing on a single Fury getting around 80 fps, so I'm sure you will get at least that from a Fury X.

See this Doom benchmark video:


----------



## kokobash

I managed to get myself two Nano cards. Is it worth investing in waterblocks for these cards, or is running them stock OK? (No plans for voltage overclocking.)


----------



## toncij

Quote:


> Originally Posted by *kokobash*
> 
> I manage to get myself two nano cards. Is it worth investing for a waterblock for these cards. Or running it with stock is ok?(no plans on voltage overclocking)


By all means. Watercooled, these do not throttle, are super silent, and can be overclocked to at least 1050/525 (I had 3 under EKWB).


----------



## bluezone

On the subject of the interposer. I stumbled across this.

http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/


----------



## JunkaDK

Quote:


> Originally Posted by *bluezone*
> 
> On the subject of the interposer. I stumbled across this.
> 
> http://semiaccurate.com/2015/06/22/amd-talks-fiji-fiji-x-odd-bits-tech/


Whoops


----------



## Thoth420

Quote:


> Originally Posted by *JunkaDK*
> 
> I'm playing on a single Fury getting around 80 fps, so I'm sure you will get at least that from a Fury X.
> 
> See this Doom benchmark video:


Thanks


----------



## Gdourado

Quote:


> Originally Posted by *toncij*
> 
> By all means. Watercooled these do not throttle, are super-silent and can be overclocked at least to 1050/525 (I had 3 under EKWB).


From what I gather, with a 150% power limit and good case airflow, they don't throttle either, but they can be a bit louder.


----------



## toncij

Quote:


> Originally Posted by *Gdourado*
> 
> From what I gather, with a 150% power limit and good case airflow, they don't throttle either, but they can be a bit louder.


Yes, but the stock fan is really crappy.


----------



## Performer81

I got a new XFX FURY triple-fan after my 970 died, and that card is awesome. I think the cooler could even handle the Pro Duo. The temp barely gets over 60 degrees at just 900 (!) RPM, and I don't have any vsync or frame limiter. With a custom fan curve at ~2000 RPM the temp doesn't go much over 50. That cooler has 7 heatpipes; it is way better than the ones from Sapphire or Gigabyte. I can also undervolt it like crazy: -72mV and still OC to 1050 from 1000. I am very happy. Sadly the coil whine is much louder than the fans.


----------



## Gdourado

Quote:


> Originally Posted by *Performer81*
> 
> I got a new XFX FURY triple-fan after my 970 died, and that card is awesome. I think the cooler could even handle the Pro Duo. The temp barely gets over 60 degrees at just 900 (!) RPM, and I don't have any vsync or frame limiter. With a custom fan curve at ~2000 RPM the temp doesn't go much over 50. That cooler has 7 heatpipes; it is way better than the ones from Sapphire or Gigabyte. I can also undervolt it like crazy: -72mV and still OC to 1050 from 1000. I am very happy. Sadly the coil whine is much louder than the fans.


I am also eyeing one of those.
What is the max OC you can get on both the core and the memory?
And have you tried unlocking extra stream processors?

Cheers!


----------



## Performer81

Didn't test the max overclock yet; I've only had the card for 2 days. I played some rounds of Battlefield 4 and Project CARS at 1100MHz and -18mV and it was stable; -25mV wasn't stable for very long. Checked the shaders, but my card can't be unlocked. I also haven't touched the HBM yet. ASIC is 56.7, btw.


----------



## Kana-Maru

Quote:


> Originally Posted by *Thoth420*
> 
> Anyone playing Doom on a single Fury X? Wondering what the perf is like with everything cranked to the max(aside AA, FXAA would be fine) at 2560 x 1440.


I just uploaded my results:

*Doom - Fury X Benchmarks*
http://www.overclock-and-game.com/news/pc-gaming/47-doom-fury-x-benchmarks

Waiting on Vulkan. Gorgeous game. I'm really enjoying it so far.


----------



## spyshagg

Quote:


> Originally Posted by *JunkaDK*
> 
> I'm playing on a single Fury getting around 80 fps, so I'm sure you will get at least that from a Fury X.
> 
> See this Doom benchmark video:


Nice

I'm getting that with a modded 290X


----------



## toncij

On a single Fury X, around 25 FPS at 5K...


----------



## JunkaDK

Quote:


> Originally Posted by *spyshagg*
> 
> Nice
> 
> Im getting that with a modded 290x


At 1440p or 1080p?


----------



## dagget3450

Quote:


> Originally Posted by *toncij*
> 
> On a single FuryX around 25 FPS on 5K...


in what?


----------



## Medusa666

Thanks for all the opinions and replies regarding the Radeon Pro Duo. I have the weekend to reconsider, so I'll give it some thought : )


----------



## Alastair

Guys, how long does an RMA through Amazon take? For some reason card 2 seems to be buzzing a bit more than usual, and it's irritating me a bit.


----------



## toncij

Quote:


> Originally Posted by *dagget3450*
> 
> in what?


We're talking about Doom.


----------



## Performer81

Quote:


> Originally Posted by *Performer81*
> 
> Didnt test max. overclock yet. Only have the card for 2 days, i played some rounds of Battlefield 4 and project cars with 1100MHZ and -18mv and it was stable, -25mv werent stable for very long. Checked the shaders but my card cant be unlocked. I also didnt touch the hbm yet. Asic is 56,7 btw.


I tested some higher overclocks, and evidently 1150MHz needs +50mV, which results in ~1.25V under load. Is this still fine? Where is the limit? I have no temp problems at all with that cooler, but I don't want to degrade the GPU too much.


----------



## RatPatrol01

Quote:


> Originally Posted by *Alastair*
> 
> Guys. How long does an RMA on Amazon take? For some reason card 2 seems to be buzzing a bit more than usual and its irritating me a bit.


Never RMA'd a GPU through them, but when I got a DOA mobo from them once, I just called up, told 'em it was busted, and they put a new one in the mail immediately, then issued a return label for the DOA board.


----------



## Alastair

Quote:


> Originally Posted by *RatPatrol01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys. How long does an RMA on Amazon take? For some reason card 2 seems to be buzzing a bit more than usual and its irritating me a bit.
> 
> 
> 
> Never RMA'd a GPU through them but when I got a DOA mobo from them once I just called up, told em it was busted, and they put a new one in the mail immediately, then issued a return label for the DOA board

Seems my return window for my cards on Amazon has expired. So what do I need to do? Contact Sapphire?


----------



## RatPatrol01

Quote:


> Originally Posted by *Alastair*
> 
> seems my return window for my cards on Amazon has expired. So what do I need to do? Contact sapphire?


Probably, but how bad is the buzzing? Any idea what causes the noise? My Nano tends to chitter when opening or closing a graphically intense program, and occasionally during idle, but rarely in a way that bugs me. I am curious how your card is behaving


----------



## gupsterg

Quote:


> Originally Posted by *RatPatrol01*
> 
> My Nano tends to chitter when opening or closing a graphically intense program.


Mainly this is when my Fury X coil whines (the other 4 Fiji cards I've had exhibit coil whine too). This tends to be due to FPS going wildly high in menu / exit screens; I've read enabling FRTC will cap the FPS, so you shouldn't get excessive FPS in menus = less or no coil whine.
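FRTC simply caps how often frames are delivered; conceptually it is the same trick as a basic frame limiter. A minimal sketch of the idea in Python (a hypothetical render loop for illustration, nothing like AMD's actual driver code):

```python
import time

def run_frame_limited(render_frame, cap_fps=90, frames=10):
    """Call render_frame at most cap_fps times per second, sleeping away
    the slack. Uncapped menu screens can render thousands of FPS and make
    the VRM coils sing; capping keeps the card (and its coils) quieter."""
    frame_budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.monotonic()
        render_frame()
        spare = frame_budget - (time.monotonic() - start)
        if spare > 0:
            time.sleep(spare)  # idle instead of racing to the next frame

run_frame_limited(lambda: None, cap_fps=200, frames=5)
```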


----------



## Alastair

Quote:


> Originally Posted by *RatPatrol01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> seems my return window for my cards on Amazon has expired. So what do I need to do? Contact sapphire?
> 
> 
> 
> Probably, but how bad is the buzzing? Any idea what causes the noise? My Nano tends to chitter when opening or closing a graphically intense program, and occasionally during idle, but rarely in a way that bugs me. I am curious how your card is behaving

I took my EK block off one card to get to the BIOS switch, and maybe I didn't put it back on as tight, because it started buzzing after that. I figure the block might have been pressing against the coils slightly, helping to reduce the buzzing, and it isn't doing that now?


----------



## RatPatrol01

Quote:


> Originally Posted by *Alastair*
> 
> I took my EK block off my one card to get to the BIOS switch. And maybe I didn't put it back on as tight. Cause it started buzzing after that. I figure the block might of been pressing against the coils slightly helping to reduce the buzzing and it isn't doing that now?


So it's buzzing constantly? That seems like something is up, but I'd still start with a block remount, just to eliminate factors


----------



## Alastair

Ah yes, I think you are right. I will just redo the block. In other news: if you could trade your Furys for a pair of Fury Xs and only pay the price difference, would you?


----------



## Flamingo

I'm watching all these 1080 cards with custom power-delivery setups and extra connectors... why didn't AMD let its partners customize the Fiji boards?


----------



## Alastair

Quote:


> Originally Posted by *Flamingo*
> 
> Im watching all these 1080 cards with custom power delivery setups and extra connectors.... why didnt AMD let its partners customize the Fiji boards...


Gigabyte G1 Gaming Fury, Sapphire R9 Fury Nitro edition, ASUS Strix Fury. These are customised cards.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> I took my EK block off my one card to get to the BIOS switch. And maybe I didn't put it back on as tight. Cause it started buzzing after that. I figure the block might of been pressing against the coils slightly helping to reduce the buzzing and it isn't doing that now?


Put some cheap thermal tape or electrical tape on the caps then reinstall. That's helped on my 290x before.


----------



## Flamingo

Quote:


> Originally Posted by *Alastair*
> 
> Gigabyte G1 Gaming Fury, Sapphire R9-Fury Nitro edition, Asus Strixx Fury. These are customised cards.


Really? Didn't know.... still no Nano or Fury X though, especially Fury X, since that's their flagship.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Really? Didnt know.... still no Nano or Fury X though, esp Fury X since thats their flagship.


TBH the Fury X has a decent enough PCB / VRM. The AIO performs very well IMO, and VRM temps do not seem an issue like on, say, the Hawaii ref PCB/cooler. When I opened the box of my 4th Fury X I was greeted by ...



Spoiler: Warning: Spoiler!



I inspected the tubing and it wasn't cracked even though it had been folded over on itself. That card's been running Folding@home for 70hrs+ now, in a rig where I figured if it did spill coolant I'd not be fussed. The card is being sent for return this coming week.

Technically you could say the Fury X is a Nano with an AIO, a slight OC, a better VRM and an extra PCI-E power connector.

After having experienced the air cooler on the Fury Tri-X, I would have liked an option for the Fury X on air. I found the card with the AIO a real pain to handle on initial install. It would be nice to be able to buy, say, a Fury X or Nano with solid chokes = no coil whine, so I get where you're coming from. Hawaii had so many custom cards, whereas Fiji just does not.


----------



## xkm1948

Wish we had Vega with 8192 SPs and greatly improved efficiency per SP. Man, 14nm at 1.5GHz with HBM2; that would be insane.


----------



## Gdourado

Quote:


> Originally Posted by *gupsterg*
> 
> TBH the Fury X has a decent enough PCB / VRM. The AIO performs very well IMO, and VRM temps do not seem an issue like on, say, the Hawaii ref PCB/cooler. When I opened the box of my 4th Fury X I was greeted by ...
> 
> Spoiler: Warning: Spoiler!
> 
> I inspected the tubing and it wasn't cracked even though it had been folded over on itself. That card's been running Folding@home for 70hrs+ now, in a rig where I figured if it did spill coolant I'd not be fussed. The card is being sent for return this coming week.
> 
> Technically you could say the Fury X is a Nano with an AIO, a slight OC, a better VRM and an extra PCI-E power connector.
> 
> After having experienced the air cooler on the Fury Tri-X, I would have liked an option for the Fury X on air. I found the card with the AIO a real pain to handle on initial install. It would be nice to be able to buy, say, a Fury X or Nano with solid chokes = no coil whine, so I get where you're coming from. Hawaii had so many custom cards, whereas Fiji just does not.


Why do you say that the AIO of the Fury is a hassle to install?


----------



## gupsterg

@xkm1948

I reckon the most Vega will have is 4096 SPs, but we should see good performance with the new architecture / HBM2 / node, etc.

@Gdourado

My comment was in the sense of handling the card: I was handling 2 parts, whereas an air-cooled card is 1. That's what I think an owner would generally think.

In my particular case it was a pain, and as my situation is *not the norm* I did not state it, but I will now.

For example, my case is a very old SilverStone Temjin 06. I like the inverted-ATX setup, plus I've done a few mods to it, so I won't dump it. The rear exhaust mount is just 120mm and thus won't accommodate the rad, and I didn't want the rad where my front intake 140mm fans are. So to get the rad into the 5.25" bays I had to make the support bar removable (1st image), then I made a mounting for the rad and changed the blanking plates to mesh.



Spoiler: Images of my setup









Compared with the Fury Tri-X, keeping the Fury X was (initially) a real pain; the Tri-X air cooler also has better GPU/VRM temps from what I experienced. I also didn't find it noisier than the AIO; TBH I found it quieter / better.


----------



## Gdourado

What is the verdict on the long-term reliability of the AIO on the Fury X?
I read around the web that AIO coolers can lose fluid over time due to very slow evaporation, which can cause issues.
Could this be the case with the Fury?
Will a 2-year-old card with heavy usage still perform like new?


----------



## Kana-Maru

Quote:


> Originally Posted by *Gdourado*
> 
> What is the verdict on long term reliability of the AIO in the fury x?
> I read around the web that AIO coolers can lose fluid over time due to very slow evaporation, which can cause issues.
> Can this be the case with the fury?
> Will a 2 year old card with good usage still perform like new?


Well, I'm nearly a year into my Fury X with no problems. I haven't heard about any issues from people running the R9 295X2 either. I've been running the same AIO CPU cooler in my rig for nearly 5 years now, I believe, with no issues. So I'm sure the Fury X will be fine. It also has a reservoir, and it sounds like there was plenty of liquid in there.


----------



## Medusa666

Anyone here with enough know-how to judge the quality of the PCB / components on the Radeon Pro Duo card?

I'm only a layman and I would very much like to know.

Any informative reply is appreciated!


----------



## xkm1948

Question. Is it possible to mount the Fury X radiator in the front of the case as intake? I have a Noctua D15 and these are fighting for space there.


----------



## gupsterg

@Gdourado

I've fitted AIO coolers for friends/family and have yet to have one come back because it lost performance or went kaput. This is my first AIO unit in my own rig; it has always been air-cooled before.

Like Kana-Maru said, I checked out the 295X2 threads and it didn't seem like a much-discussed issue.

After seeing and using the card which had folded coolant pipes, I'm confident they're up to the job (I had reservations about them previously).

@Medusa666

The PCB has IR 6894 & 6811 MOSFETs going by the markings in the photo; you can find a) spec data on the web, b) @buildzoid has posts in this thread and the Fiji BIOS mod thread about them. He also states what the chokes are rated for on the Fury / X; I would assume the Radeon Pro Duo uses the same or better, TBH. He also ran ripple tests on the ref-PCB Fury X (IIRC the Fiji BIOS mod thread has the post) and said ripple was low = good. He did a cap mod on his Fury X; it did stabilise further OC clocks, but not a huge increase.

@xkm1948

No way, not as an intake, man; I would not want the heat coming off the rad at full load in my case. For example, if I compare screenies of a ~15hr Folding@home run on a Vapor-X 290X (1100/1525) vs the Fury X (1135/535), the mobo / CPU is 6C cooler with the Fury X, as it exhausts its heat out of the case.


----------



## Alastair

Quote:


> Originally Posted by *Flamingo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Gigabyte G1 Gaming Fury, Sapphire R9-Fury Nitro edition, Asus Strixx Fury. These are customised cards.
> 
> 
> 
> Really? Didnt know.... still no Nano or Fury X though, esp Fury X since thats their flagship.

The NVIDIA Titan X is the current gaming flagship, and there are no custom cards.


----------



## Flamingo

Quote:


> Originally Posted by *Alastair*
> 
> The NVIDIA Titan X is the current gaming flagship, and there are no custom cards.


I'd consider the 295X2 or Radeon Pro Duo _that_ sort of flagship, considering the price


----------



## Mega Man

Quote:


> Originally Posted by *Medusa666*
> 
> Anyone here who has enough Know-How to determine the quality of the PCB / components on the Radeon Pro Duo card?
> 
> I'm only a layman and I would very much like to know this.
> 
> Any informative reply is appreciated!


AMD always very much OVER-engineers the ref board, and has since the 7970 (and well before). Unlike Nvidia, AMD's ref PCBs are the best, with maybe the exception being for LN2.


----------



## diggiddi

Quote:


> Originally Posted by *xkm1948*
> 
> Question. Is it possible to mount the Fury X radiator in the front of the case as intake? I have a Noctua D15 and these are fighting for space there.


Flip the fan so it exhausts


----------



## Medusa666

Quote:


> Originally Posted by *gupsterg*
> 
> After seeing and using the card which had folded coolant pipes, I'm confident they're up to the job (I had reservations about them previously).
> 
> @Medusa666
> 
> The PCB has IR 6894 & 6811 MOSFETs going by the markings in the photo; you can find a) spec data on the web, b) @buildzoid has posts in this thread and the Fiji BIOS mod thread about them. He also states what the chokes are rated for on the Fury / X; I would assume the Radeon Pro Duo uses the same or better, TBH. He also ran ripple tests on the ref-PCB Fury X (IIRC the Fiji BIOS mod thread has the post) and said ripple was low = good. He did a cap mod on his Fury X; it did stabilise further OC clocks, but not a huge increase.


Quote:


> Originally Posted by *Mega Man*
> 
> AMD always very much OVER engineers the ref board, since the 7970 ( and well before ) unlike nvidia amds ref pcbs are the best, maybe the exception being for l2n


Thanks for the replies, guys!

I will be getting this card tomorrow or Tuesday : )


----------



## toncij

Quote:


> Originally Posted by *Medusa666*
> 
> Thanks for the replies guys!
> 
> I will be getting this card tomorrow or tuesday : )


Which one? The RPD? It's a nice card: http://www.3dmark.com/fs/8532735
But I wish it could clock higher.

Also, what annoys me is the noise. The card itself, the pumps inside it, produces an annoying buzz all the time, even at 35 deg idle or 45 under load... an annoying buzz right next to an otherwise silent fan on the radiator. I'm keeping my case open atm, but the noise is there.


----------



## gupsterg

Quote:


> Originally Posted by *diggiddi*
> 
> Flip the fan so it exhausts


From the factory, the rad fan is set to exhaust







, so if he mounts the rad in an intake fan position, the heat will exhaust out of the case.

My original reply was in the context of his question, "Is it possible to mount the Fury X radiator in the front of the case as intake?" I assumed he was gonna flip the fan, making it pull cold air from outside over the rad and dump warm air into the case.


----------



## Alastair

Quote:


> Originally Posted by *Alastair*
> 
> Since this is a BIOS editing thread, this question is kinda valid. If you flash a 4low or 4high unlocking BIOS onto a fully operational Fury X, will it shut down 4 of the CUs?


Anyone? I posted this in the BIOS editing thread, but I am curious. Has anyone tried this, or is willing to try?


----------



## Medusa666

Quote:


> Originally Posted by *toncij*
> 
> Which one?
> 
> 
> 
> 
> 
> 
> 
> RPD? It's a nice card. http://www.3dmark.com/fs/8532735
> But I wish it could clock higher.
> 
> Also, what annoys me is the noise. The pumps inside the card produce an annoying buzz all the time, even at 35°C idle or 45°C load... a constant buzz right next to an otherwise silent fan on the radiator. I'm keeping my case open ATM, but the noise is there.


Yeah, the RPD.

I know that Polaris is coming and that Pascal is out, but I got a Freesync monitor and I like AMD so I ordered this card.

I hope it will be good for 1440P the coming 2-3 years.


----------



## GruntXIII

Has anyone had problems with running out of VRAM so far (at higher resolutions like 3440x1440)?

Bought an R9 Nano (+ watercooler), but somehow regret it seeing that the 1080 is performing so well.


----------



## toncij

Quote:


> Originally Posted by *GruntXIII*
> 
> Anyone had problems with too low VRAM so far (in higher resolutions like 3440x1440)?
> 
> Bought a R9 Nano (+watercooler), but somehow regret it seeing that 1080 is performing so well.


I've experienced VRAM issues only at 5K (5120x2880), not before. At 4K none so far. Even Shadow of Mordor worked fine.


----------



## danjal

I saw a 4K benchmark someone on YouTube was running that hit the VRAM limit on a 4GB Fiji GPU. I can't remember which card it was; I'm thinking it was a Fury X.


----------



## toncij

Quote:


> Originally Posted by *danjal*
> 
> I saw a 4K benchmark someone on YouTube was running that hit the VRAM limit on a 4GB Fiji GPU. I can't remember which card it was; I'm thinking it was a Fury X.


Well, yes, you can force it, but... also, turn off AA; you mostly don't need it at 4K at all, and it saves some VRAM.


----------



## Gdourado

Does anyone have the Gigabyte Fury with the Windforce cooler?
How is the card? Noise, temps and OC?
Can't find much info online about this card.

Cheers!


----------



## Alastair

Quote:


> Originally Posted by *Gdourado*
> 
> Does anyone have the Gigabyte Fury with the Windforce cooler?
> How is the card? Noise, temps and OC?
> Can't find many info online about this card.
> 
> Cheers!


I would just stick with Sapphire, really. However, since pretty much all Fiji cards perform the same, you could always be a guinea pig for us and tell us how it is.


----------



## Radox-0

Quote:


> Originally Posted by *GruntXIII*
> 
> Anyone had problems with too low VRAM so far (in higher resolutions like 3440x1440)?
> 
> Bought a R9 Nano (+watercooler), but somehow regret it seeing that 1080 is performing so well.


It does okay. I'm using my Nano on my 3440x1440 panel while waiting on 1080s. There are some titles that utilise the full amount of VRAM, but it's not translating into issues. I expect you will run out of grunt, depending on what you're playing, before VRAM becomes an issue.


----------



## Tgrove

Quote:


> Originally Posted by *Radox-0*
> 
> It does okay. I'm using my Nano on my 3440x1440 panel while waiting on 1080s. There are some titles that utilise the full amount of VRAM, but it's not translating into issues. I expect you will run out of grunt, depending on what you're playing, before VRAM becomes an issue.


What games are those? I have yet to see the full 4GB of my Fury Xs used @ 4K.


----------



## GruntXIII

Running out of Grunt? As long as I'm using it it won't









You are going to buy more than 1 1080? Nice =D

There are already games out which utilize more than 4GB (without heavy AA or DS), for instance Doom or Gears of War (edit: or at least I read they were using more than that... maybe it's only at UHD resolution and not at UWQHD). Guess I have to live with it for now.

Maybe I'll be in for the 1080ti or Fury XX (or whatever they call it).


----------



## toncij

Quote:


> Originally Posted by *GruntXIII*
> 
> Running out of Grunt? As long as I'm using it it won't
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are going to buy more than 1 1080? Nice =D
> 
> There are already games out which utilize more than 4GB (without heavy AA or DS). For instance Doom or Gears of War (edit: Or at least I read they were using more than that...maybe it's only at UHD resolution and not at UWQHD). Guess I have to live with it for now.
> 
> Maybe I'll be in for the 1080ti or Fury XX (or whatever they call it).


I'm running Doom at 5K. It runs fine on both the Radeon Pro Duo (4GB) and the Titan X.


----------



## GruntXIII

With shadows set to Nightmare?


----------



## Radox-0

Quote:


> Originally Posted by *Tgrove*
> 
> What games are those? I have yet to see the full 4gb of my fury xs used @ 4k


Shadow of Mordor, Far Cry 4 and Far Cry Primal. Not really tested much else, as until recently the Nano was just used in an HTPC for the living room. Will give some other titles a shot later this evening if I get a chance.
Quote:


> Originally Posted by *GruntXIII*
> 
> Running out of Grunt? As long as I'm using it it won't
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are going to buy more than 1 1080? Nice =D
> 
> There are already games out which utilize more than 4GB (without heavy AA or DS). For instance Doom or Gears of War (edit: Or at least I read they were using more than that...maybe it's only at UHD resolution and not at UWQHD). Guess I have to live with it for now.
> 
> Maybe I'll be in for the 1080ti or Fury XX (or whatever they call it).


Yeah, as in I usually turn some settings down in some titles to try and get a smooth 60fps; in turn the VRAM usage goes down. Never got to a stage where the VRAM, even when mostly used, causes problems for me at 3440x1440. It will be great either way









To be honest, I kind of wish I'd waited for the Pascal Titan now. Main rig was Tri-Titan X, but I got a good price for them (the Titan X's), so I thought I may as well get the new and shiny (like a magpie in that regard







) But two 1080s should prove to be a decent replacement. I run 3440x1440 @ 100Hz, so that usually allows decent eye candy for now.

Alas, I have gone off topic, but yeah, I imagine you will not be getting any issues. If anything, my Nano surprised me with the results I was getting, as I did not give it a proper workout until recently.


----------



## spyshagg

Quote:


> Originally Posted by *JunkaDK*
> 
> At 1440p or 1080p?


1440P









On the hell levels, I do need to drop megatexturing from ultra to high:
ultra = 35 fps (***)
high = 70 fps


----------



## toncij

Quote:


> Originally Posted by *GruntXIII*
> 
> With shadows set to Nightmare?


Everything maxed out, except AA, which is all OFF.


----------



## GruntXIII

That's interesting... I guess you can set it to Nightmare due to the game seeing 8GB of VRAM. As far as I know, you shouldn't be able to set it to Nightmare if you have under 6GB of VRAM.


----------



## danjal

My first Fury was a Gigabyte Fury, and it had the inherent black screen at login... I wouldn't get one if I were you. I RMA'd it and got a Sapphire Nitro Fury.

I couldn't recommend getting any Fury right now; the drivers aren't doing too well. If I were spending $500+ on a GPU, I'd have to get the 1080, or wait for Computex and see what AMD is doing, or see what NVIDIA's board partners are doing.

Remember, the GTX 1080 drivers aren't very mature yet; it will get better as the drivers mature.

From what I'm seeing on AMD's forums, they have had unresolved issues since launch, and they can't seem to do anything about the screen flickering or black-screen-at-login issues on certain cards, and it's been almost a year now.


----------



## danjal

The game was Skyrim, I believe; might have been The Witcher 3.


----------



## shadowxaero

Quote:


> Originally Posted by *danjal*
> 
> My first Fury was a Gigabyte Fury, and it had the inherent black screen at login... I wouldn't get one if I were you. I RMA'd it and got a Sapphire Nitro Fury.
> 
> I couldn't recommend getting any Fury right now; the drivers aren't doing too well. If I were spending $500+ on a GPU, I'd have to get the 1080, or wait for Computex and see what AMD is doing, or see what NVIDIA's board partners are doing.
> 
> Remember, the GTX 1080 drivers aren't very mature yet; it will get better as the drivers mature.
> 
> From what I'm seeing on AMD's forums, they have had unresolved issues since launch, and they can't seem to do anything about the screen flickering or black-screen-at-login issues on certain cards, and it's been almost a year now.


What is wrong with the drivers?
My Fury is doing great lol


----------



## flopper

Quote:


> Originally Posted by *shadowxaero*
> 
> What is wrong with the drivers?
> My Fury is doing great lol


My 290 did great, as did my 390, and my Polaris will do great too.


----------



## danjal

Quote:


> Originally Posted by *shadowxaero*
> 
> What is wrong with the drivers?
> My Fury is doing great lol


Go here and read the 10 pages of complaints: https://community.amd.com/thread/197992 (disregard the *fixed* in the title). The admins on AMD's forum also deleted a 60+ page thread on the same topic to cover it up.

Not all of the current GPUs have this issue, but all models can have this problem, and AMD hasn't been able to fix it after almost a year. I feel sorry for those that have spent their hard-earned money only to be ignored by AMD.


----------



## shadowxaero

Quote:


> Originally Posted by *danjal*
> 
> go here and read the 10 pages of complaints. https://community.amd.com/thread/197992 disregard the *fixed* in the title. The admins on amds forum also deleted a 60+ page thread on the same topic to cover it up..
> 
> Not all of the current gpus have this issue, but all models can have this problem, and amd hasnt been able to fix it after almost a year.. I feel sorry for those that have spent their hard earned money to be ignored by amd.


Any piece of hardware can have issues, but that doesn't negate the fact that I, along with other users in this thread, have Fiji chips and no problems with the drivers. I honestly think a lot of issues come down to user error. I'm not saying AMD has perfect drivers; before Crimson I didn't have close to the same consistent performance that I do now. But I don't believe AMD has been ignoring users. People complained about not having day-one drivers, and now AMD releases one to two drivers a month and people still find a way to complain.

Some people will just never be happy lol.


----------



## xTesla1856

I've had bad luck with Sapphire; my Nitro Fury died on me after two weeks of use. That was over 2 months ago, and I still only have one card left. There really is no rule as to which manufacturers are better. I do find, however, that there are quite a few dead Fijis.


----------



## spinejam

Quote:


> Originally Posted by *danjal*
> 
> go here and read the 10 pages of complaints. https://community.amd.com/thread/197992 disregard the *fixed* in the title. The admins on amds forum also deleted a 60+ page thread on the same topic to cover it up..
> 
> Not all of the current gpus have this issue, but all models can have this problem, and amd hasnt been able to fix it after almost a year.. I feel sorry for those that have spent their hard earned money to be ignored by amd.


Can't comment on any issues b/c my Asus Fury X has been problem-free. Quiet, cool, and runs everything I need it to w/o issue.


----------



## danjal

Quote:


> Originally Posted by *xTesla1856*
> 
> I've had bad luck with Sapphire, my Nitro Fury died on me after two weeks of use. That was over 2 months ago and I still only have one card left. There really is no rule to which manufacturers are better. I find there are however quite a few dead Fijis.


Yep, I've noticed there seem to be a bunch of them that have the black-screen-at-login issue and screen flicker. My first Gigabyte Fury was DOA with the black screen at login, and I don't feel my current Sapphire Nitro performs as it should for $500+.

My Sapphire Fury still has a flicker issue with the YouTube Flash player, of all things; I can watch a video and will eventually get random flickering within the player window. It's been like this for a while now. AMD also knows of it, because I've filed a bug report and pointed it out to an admin on their forum several times.

Nonetheless, I'm not a fan of AMD or NVIDIA; I want performance for my money. But I can tell you I'll more than likely be switching to an NVIDIA GPU after my experience with this new AMD stuff. AMD just doesn't seem to be the company it once was.


----------



## xTesla1856

Mine took a dump out of the blue while playing TW3 in Crossfire. It bluescreened my machine with wild blue artefacting, and my Windows refused to load. The RMA is taking over 2 months now; this was the last time I order anything through that retailer.


----------



## danjal

I'd like to see the return rates on the 390/390X, Nano, and Fury/Fury X.


----------



## danjal

It was Rise of the Tomb Raider.

Saw it here.

The thing is, in future games (later this year), 4GB of HBM is not going to be enough VRAM once DX12 becomes the norm.


----------



## Medusa666

What exactly does the Power Limit in AMD Overdrive do?

Default is 0%, and the card is throttling (Radeon Pro Duo). Is it possible to prevent this, given that the card is running really cool, 50-60°C?

Thanks!


----------



## danjal

Quote:


> Originally Posted by *Medusa666*
> 
> What exactly does the Power Limit in AMD Overdrive do?
> 
> Default is 0%, and the card is throttling ( Radeon Pro Duo ), is it possible to prevent this, given that the card is running really cool, 50-60c.
> 
> Thanks!


Turn it all the way up; it raises the amount of power the GPU is allowed to draw, and it won't hurt anything maxed out. It won't stop the throttling, though.


----------



## Alwrath

Black screens are usually user error, typically caused by overclocking / unstable overclocks. I hit some black screens with my 290, but it was my own fault: not enough voltage for my OC. The video card recovered and I restarted the computer. Haven't had an issue since.


----------



## danjal

Quote:


> Originally Posted by *Alwrath*
> 
> Black screens are usually user error, usually caused by people overclocking / unstable overclocks. I hit some black screens with my 290, but it was my own fault. Not enough voltage for my oc. Video card recovered and I restarted computer. Havent had an issue since.


The problem I'm talking about is not due to overclocking. Stock cards are doing it at the initial install, like my Gigabyte Fury did on a clean install of Win10. I don't think the GPUs are dead; I think it's something with the drivers' voltage control, or the drivers not loading correctly from boot-up and the GPU crashing, causing the black screen at logon.


----------



## Alwrath

Quote:


> Originally Posted by *danjal*
> 
> The problem I'm talking about is not due to overclocking. Stock cards are doing it at the initial install like my Gigabyte Fury did on a clean install of win10... I dont think the gpu's are dead, I think its something with the drivers voltage control or the drivers not loading correctly from boot up and the gpu crashes causing the black screen at log on..


I had black screens on login if I left the card on OC settings. I always switch it to stock if I restart or power down my PC. Not sure if it's related.


----------



## danjal

Quote:


> Originally Posted by *Alwrath*
> 
> I had black screens on log in if I left the card on oc settings. I always switch it to stock if I restart or power down my pc. Not sure if its related.


It's NOT from an overclock.

How would a GPU have an overclock on it if it were brand new and had never been installed in a system before?

Yes, you can get a GPU crash from too high of an overclock, but that is not what is causing the issue I'm talking about.


----------



## bluezone

New driver.

Support page:
http://support.amd.com/en-us/kb-articles/pages/amd-radeon-software-crimson-edition-16.5.3-release-notes.aspx

Download (Win7 64-bit):
http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## SuperZan

Nice, support for new games and a few fixes.


----------



## Tgrove

Quote:


> Originally Posted by *Radox-0*
> 
> Shadow of Mordor, Far Cry 4 and Far cry Primal. Not really tested much else as until recently Nano is just used in a HTPC for living room. Will give some other titles a shot later this evening if I get a chance.
> Yeah as in I usually turn some settings down in some tiltes to try and make a smooth 60fps. In turn the VRAM goes down. Never got to a stage where the VRAM even when being used mostly causes problems for me at 3440 x 1440. It will be great either way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To be honest I kind of wish I waited for the Pascal Titan now. Main rig was Tri-Titan X but got a good price for them (Titan X's) so thought may as well get the new and shiny (like a magpie in that regards
> 
> 
> 
> 
> 
> 
> 
> ) But two 1080's should prove to be a decent replacement. Run 3440 x 1440 @ 100hz so usually allows decent eye candy for now.
> 
> Alas I have gone off topic, but yeah I imagine you will not be getting any issues. If anything my nano surprised me with the results I was getting as I did not give it a proper workout until recently.


I've played those games and they come nowhere near the 4GB limit. And I play at 4K; there's no way 3440x1440 uses all 4GB of HBM.


----------



## Tgrove

Quote:


> Originally Posted by *danjal*
> 
> It was Rise of the Tomb Raider.
> 
> saw it here here
> 
> 
> 
> 
> The thing is in future games (later this year), 4gig hbm is not going to be enough vram when dx12 becomes the norm..


Only saw some guy running his mouth; no actual gameplay numbers.


----------



## Flamingo

http://videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks

Is this even real?

I ran my R9 Nano at +50% and I got a score of 15898.

So if the benchmarks are right, the 480 and 480X will be faster than the Nano.


----------



## SuperZan

Quote:


> Originally Posted by *Flamingo*
> 
> 
> 
> http://videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks
> 
> Is this even real?
> 
> I ran my R9 Nano at +50% and I got a score of 15898.
> 
> So if the benchmarks are right the 480 and 480x will be faster than the Nano


Was your score the Overall total or just the Graphics score? That chart is for the Graphics score, and a Nano should slot in with the other Fijis there.


----------



## Flamingo

Quote:


> Originally Posted by *SuperZan*
> 
> Was your score the Overall total or just the Graphics Score? That chart is for Graphics Score and a Nano should slot in with the other Fiji's there.


The overall Performance score.

I also ran the Extreme preset, which scored 5996; that was more in line for a +50% Nano (compared to the Guru3D runs: http://www.guru3d.com/articles-pages/amd-radeon-r9-nano-review,25.html).


----------



## Medusa666

Guys, can I have some quick feedback on this video of my Radeon Pro Duo radiator fan at idle?

I think the sound is extremely annoying; however, if it is normal I won't RMA the card.

Is this noise acceptable / normal for this fan?


----------



## illies100

I don't think it's normal, but you can simply change the fan for $15; Noctua fans are very quiet.


----------



## toncij

Quote:


> Originally Posted by *spinejam*
> 
> Can't comment on any issues b/c my Asus Fury X has been problem-free. Quiet, cool, and runs everything I need it to w/o issue.


TBH: black screens, extremely long resolution switching, flickering, an insanely slow cursor in fullscreen mode, crashes in borderless windowed... all of that has been pretty much present in my experience with the latest AMD drivers (I'm using a Radeon Pro Duo). Also, all hell breaks loose sometimes with my desktop, since I'm using a 5K and a 3440x1440 display.
Quote:


> Originally Posted by *illies100*
> 
> I don't think it's normal , but you can simply change the fan for 15$ , noctua fan are very silent


Yeah, do it and you'll fix it. Just buy a new fan; the bundled one sucks.


----------



## danjal

I'm with illies100. I had a Corsair fan in my top exhaust, which is mounted horizontally; I replaced it with a Noctua NF-A14, and Noctua has a new customer from now on. But it wasn't $15, it was closer to $30.
Quote:


> Originally Posted by *Medusa666*
> 
> Guys, can I have some quick feedback on this video of my Radeon Pro Duo radiator fan in idle.
> 
> I think that the sound is extremely annoying, however is it is normal I won't RMA the card.
> 
> Is this noise acceptable / normal for this fan?


I think modern AMD quality has gone downhill, honestly. I think they are focusing on being inexpensive rather than on quality and functionality.

I thought the fans on the Fury X and Pro Duo were supposed to be high-end fans?


----------



## Alwrath

Well, looks like my open-box Fury X from Newegg turned out to be a dud. Sent it back for a refund; they wouldn't let me get a replacement. Sigh. Oh well, guess I'll have to make do with my 290 till Vega comes out.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> The performance score.
> 
> I also ran the Extreme score which was 5996 which was more in line for a +50% nano (compared to the Guru3D runs = http://www.guru3d.com/articles-pages/amd-radeon-r9-nano-review,25.html)


The P score takes into account CPU-based tests as well; here is my 3DMark 11 result, and you'll see I don't reach Guru3D's P score of 17997.

Now if you take the Videocardz Fury X Graphics score of 19315, I beat it slightly







(no tess tweak). With the tess tweak I'd be ~6000 higher for GS







.


----------



## pdasterly

with the 1080 almost out when should we expect a radeon price cut?


----------



## danjal

Quote:


> Originally Posted by *pdasterly*
> 
> with the 1080 almost out when should we expect a radeon price cut?


I'm just guessing, but I would think within 2 weeks. What's the start date of Computex?


----------



## Gdourado

I read somewhere that to stop the Nano from throttling, you need to raise both the power limit and the temp limit.
I know the power limit is raised via overclocking software or even AMD OverDrive.
But how does one raise the temp limit? A BIOS edit?


----------



## toncij

Quote:


> Originally Posted by *Gdourado*
> 
> I read somewhere that to stop the Nano from throttling, it is needed to raise both the power limit and the temp limit.
> I know the power limit is raised by overclocking software or even AMD overdrive.
> But how does one raise the temp limit? Bios Edit?


Slap water cooling on it?







My RPD does not throttle one tiny bit, since it's water-cooled.


----------



## blue1512

Quote:


> Originally Posted by *Flamingo*
> 
> The performance score.
> 
> I also ran the Extreme score which was 5996 which was more in line for a +50% nano (compared to the Guru3D runs = http://www.guru3d.com/articles-pages/amd-radeon-r9-nano-review,25.html)


Dude, I think your score is the *Overall score*, which is tied to the CPU. The graph on VCZ is the *Graphics score*.
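For what it's worth, the reason the two numbers diverge is that 3DMark's Overall score combines the Graphics, Physics (CPU) and Combined sub-scores as a weighted harmonic mean, so a strong CPU pulls the Overall up even when the Graphics score is identical. A rough Python sketch of the idea; the weights below are illustrative placeholders, not Futuremark's published values for any preset:

```python
# 3DMark-style Overall score: a weighted harmonic mean of the sub-scores.
# The weights here are illustrative placeholders, NOT Futuremark's
# published values for any particular preset.

def overall_score(subscores, weights):
    """Weighted harmonic mean; `weights` should sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return 1.0 / sum(w / s for w, s in zip(weights, subscores))

# Hypothetical Graphics / Physics / Combined sub-scores:
print(round(overall_score([19000, 14000, 9000], [0.75, 0.15, 0.10])))
# A weaker CPU (Physics) drags the Overall down at the same Graphics score:
print(round(overall_score([19000, 9000, 9000], [0.75, 0.15, 0.10])))
```

That's why two cards with identical Graphics scores can post quite different Overall/Performance totals on different CPUs.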


----------



## RatPatrol01

So I gotta admit, WC'ing the Nano really kills its compact form factor







Even with the Nano and a mini-ITX board, this 350D is getting pretty crowded.


----------



## gupsterg

@Gdourado

There is a "Target GPU Temperature" setting on the OverDrive page; when the GPU reaches the set temp it will be throttled (it's also in the ROM, under PowerPlay > PowerTune). So yes, you can raise it so the GPU doesn't throttle, but on the stock cooler/profile that means increased GPU temp.

In the ROM (ie PowerPlay) there is a "Target GPU Temperature" that is part of the fan table (aka the cooling profile). Editing this value changes the cooling solution's behaviour to maintain the set temp on the GPU. You can also set a custom fan profile via MSI AB, etc., *but* the ROM uses fuzzy logic, whereas a custom fan curve in OS software is a lookup table.
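To illustrate the difference: a custom fan curve set in MSI AB or similar OS software is just a lookup table mapping temperature to fan duty, with linear interpolation between points, roughly like the sketch below (the curve points are made up for illustration, not Fiji's actual defaults). The ROM's fuzzy-logic control instead varies fan speed dynamically to hold the target temperature.

```python
# A software fan curve is a simple lookup table: temp -> fan %,
# linearly interpolated between points. These points are made up
# for illustration; they are not Fiji's actual defaults.

CURVE = [(30, 20), (50, 30), (65, 45), (75, 70), (85, 100)]  # (degC, fan %)

def fan_percent(temp_c):
    """Fan duty for a given GPU temperature, from the lookup table."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, p0), (t1, p1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two neighbouring points
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))  # 25.0 (halfway between the 30C and 50C points)
print(fan_percent(90))  # 100 (clamped at the top of the table)
```

The ROM's fuzzy-logic approach has no such fixed table, which is why editing its "Target GPU Temperature" changes behaviour across the whole range rather than at one point.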

Just my opinion, but if you plan to OC, I would consider the Nano PCB not ideal. For example, view the data for amps / watts on the GPU VRM in HWiNFO.



Spoiler: Fury X no 2 1135MHz @ 1.243V VID









Spoiler: Fury X no 3 1130MHz @ 1.225V VID







Basically my Fury X no. 3 is using more A/W, as it's a higher-leakage ASIC than no. 2, even though no. 3 is set to a lower VID in the ROM. From the ASIC quality results shared by members, Nanos have low-leakage ASICs, but there is still a range (ie some more, others less). Note: high ASIC quality = higher leakage, low ASIC quality = lower leakage (GPU-Z has the info the wrong way around).

Just as added info, yesterday a Fury X no. 4 reached 350W on the GPU VRM







, at 1120MHz @ 1.243V VID. This card has the second-highest leakage I've had; it's using +50mV over stock (1.193V VID stock).

These A/W figures are the peak/max I'm stating; the average is lower, as you can see in the HWiNFO data.

Below is PL compare of stock/ref PCB ROMs.



Spoiler: Warning: Spoiler!







Now, the Fury / X, from info by Buildzoid, is deemed to support ~420A and has 6 phases to the GPU. The Nano has 4 phases, thus 280A is its max. The Nano also has only 1x 8-pin PCI-E connector = 150W, + 75W from the PCI-E slot = 225W max power delivery to the PCB. So even if the VRM can support 280A, the PCI-E power delivery limits the card.

How I see it, the Nano is full-fat Fiji shoehorned in to meet SFF users' requirements; if you have no SFF need, I reckon the Fury / X is the way to go. For me the Fury X equaled better "bang for $" after considering all aspects of the Nano.
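The arithmetic above can be summed up in a few lines; a sketch using the thread's figures (the ~70A-per-phase number is derived from Buildzoid's ~420A-over-6-phases estimate, not a datasheet value):

```python
# Rough power-budget arithmetic for the figures above. Connector wattages
# are the PCI-SIG ratings quoted in this thread; the per-phase amp figure
# is an assumption derived from Buildzoid's ~420A / 6-phase estimate.

PCIE_SLOT_W = 75        # PCI-E x16 slot
SIX_PIN_W = 75          # 6-pin PCI-E connector
EIGHT_PIN_W = 150       # 8-pin PCI-E connector
AMPS_PER_PHASE = 70     # assumed: ~420A over 6 phases

def board_power_limit(eight_pins=0, six_pins=0):
    """Max power delivery to the PCB per PCI-SIG connector ratings."""
    return PCIE_SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

def vrm_amp_limit(phases):
    """Rough VRM current capacity, assuming ~70A per phase."""
    return phases * AMPS_PER_PHASE

# Nano: 1x 8-pin, 4 phases
print(board_power_limit(eight_pins=1))   # 225 (W)
print(vrm_amp_limit(4))                  # 280 (A)

# Fury / Fury X: 2x 8-pin, 6 phases
print(board_power_limit(eight_pins=2))   # 375 (W)
print(vrm_amp_limit(6))                  # 420 (A)
```

So even with an identical Fiji chip, the Nano's single 8-pin caps the board well below what the Fury X's connectors and VRM allow.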


----------



## Luftdruck

Quote:


> Originally Posted by *Medusa666*
> 
> Guys, can I have some quick feedback on this video of my Radeon Pro Duo radiator fan in idle.
> 
> I think that the sound is extremely annoying, however is it is normal I won't RMA the card.
> 
> Is this noise acceptable / normal for this fan?


That's exactly how a Gentle Typhoon sounds at idle. You can't do much about it except try changing the RPM of the other case fans to change the overall pressure and vibration frequency.


----------



## GruntXIII

Quote:


> Originally Posted by *gupsterg*
> 
> @Gdourado
> 
> Now Fury / X from info by Buildzoid is deemed to support ~420A, has 6 phases to GPU. Nano has 4 phases, thus 280A is max. The Nano has only 1x 8 pin PCI-E connector = 150W + 75W from PCI-E slot = 225W max power delivery to PCB. Even if VRM can support 280A the PCI-E power delivery limits card.
> 
> How I see it is Nano is full fat Fiji shoe horned to meet SFF users requirements, if you have no SFF need I reckon Fury / X is the way to go. For me Fury X equaled better "bang for $" after considering aspects of Nano.


AFAIK 225W is the official maximum, but not the real one (take a look at the R9 295X2).

Nonetheless, if you've got a custom watercooling solution like me, it simply is way more expensive to buy an R9 Fury X + waterblock compared to a Nano + waterblock. I would have paid €150-200 more... for maybe 50-100MHz more clock speed (if you happen to get the right card).


----------



## Gdourado

So let's take this example:
A Fury Nano and a Fury X, both with full-cover waterblocks, both with raised power limits.
In this scenario, do the VRM design and dual 8-pin make a significant difference in overclocking?
How much core speed can a Fury X on a custom loop do on average?


----------



## Flamingo

So much hate in this article, especially in the way it's written. Kyle Bennett going at it again







Everyone knew why RTG was created, but this guy just tells the story with such negativity

http://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility

Bound to cause panic in AMD circles


----------



## Flamingo

Quote:


> Originally Posted by *Gdourado*
> 
> So let's take this example.
> A fury Nano and a fury x, both with full cover waterblocks. Both with raised power limit.
> In this scenario, does the VRM design and dual 8 pin make a significant difference in overclock?
> How much core speed can a fury x on a custom loop do in average?


Overall Average for Nano
http://hwbot.org/hardware/videocard/radeon_r9_nano/

Overall Average for Fury X
http://hwbot.org/hardware/videocard/radeon_r9_fury_x/

Highest for Nano
http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2519&cores=1#start=0#interval=20

Highest for Fury X
http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2477&cores=1#start=0#interval=20

But then again, no one has tried LN2 with the Nano lol, so comparing the highest AIO or watercooling clocks: 1162 (Nano) vs 1180 (Fury X).

On averages the Fury X wins, because it already comes with a water cooler and has better results from the get-go.

This really makes me wanna watercool the Nano lol, but blocks are so expensive, and then there is the 4GB of RAM, and I start thinking about how long before I jump on an upgrade lol.


----------



## gupsterg

@GruntXIII

The R9 295X2 has 2x 8-pin (300W) + 75W from the slot = 375W, yet I can see from the stock ref PCB ROMs that the MPDL = 202W per GPU = 404W. The card does not conform to PCI-SIG spec; PCI-SIG don't enforce the spec either, AFAIK, and AMD are building to hardware specs. From what I understand, AMD are basically assuming a user will use a good-quality PSU, and I would also assume anyone buying a 295X2 would.



Image from link.

So we can see the connector/terminals can handle more.

Next, wiring: AFAIK the PCI-E minimum should be 18AWG = 10 amps (IIRC at the 75°C spec). So we come up with 12V x 3 x 10A = 360W.

Now that the connector / wiring is out of the way, there is still the PCB to consider; this post I had found interesting. I interpreted it as meaning that if there are extra connectors, and/or 8-pin vs 6-pin, on the PCB, the card can use the power from the PSU because it can distribute it via more lands (the caveat being the VRM used).

This was research I did when looking at the PowerLimit for the Hawaii bios mod; I deemed it best to guide based on the PCI-SIG spec / PCB VRM and not go into hardware specs to this depth, just in case someone had a not-so-great PSU, etc., and blamed me for an issue. In the original post I had highlighted that the data was A/W peak/max, and the average is lower.

Personally, my opinion is that it's better to have 2x 290X than 1x 295X2, the same way I think it's better to have a Fury / X vs a Nano. I would only consider the 295X2 or Nano if the build limited me in some way such that I couldn't have 2x 290X or a Fury / X.

For me the Nano was £350, the Fury Tri-X £360 and the Fury X £400, and the Fury / X had better stock cooling / PCB, so it was a no-brainer to go for it. I wasn't going to WC any card, so my original post and this one are based on that.

@Gdourado

IMO you could get a pants-clocking Fury / X and a better-clocking Nano; basically good old "silicon lottery".

My opinion for going for the Fury / X has been outlined in this post and others. This opinion was in no way meant to convey that the Fury / X OCs better than the Nano, but only to inform that technically the Fury / X PCB is better built and that a full-fat Fiji chip can use X amount of A/W.


----------



## diggiddi

Quote:


> Originally Posted by *gupsterg*
> 
> @GruntXIII
> 
> The R9 295X2 has 2x 8pin (300W) + 75W from slot = 375W, I can see from the stock ref PCB ROMs MPDL = 202W per GPU = 404W. The card does not conform to PCI-SIG spec, PCI-SIG don't enforce spec either AFAIK, AMD are building to hardware specs. From what I understand basically AMD are assuming a user will then use good quality PSU and I would also assume anyone buying a 295X2 would.
> 
> 
> 
> Image from link.
> 
> So we can see the connector/terminals can handle more.
> 
> Next wiring, AFAIK PCI-E minimum should be 18AWG = 10 amps (IIRC 75C spec). So we come up with 12V x 3 x 10A = 360W .
> 
> So now the connector / wiring is out the way there is still the PCB to consider, now this post I had found interesting. I interpreted this post as meaning if there are extra connectors and/or 8pin vs 6pin on PCB the card can use the power from PSU as it can distribute it via more lands (caveat being VRM used).
> 
> This was research I did when looking at PowerLimit for Hawaii bios mod, I deemed it best to guide on PCI-SIG spec / PCB VRM and not go into hardware specs to this depth. Just in case someone had not so great PSU, etc and blamed me for an issue. In original post I had highlighted data was A/W peak/MAX and average is lower.
> 
> *Personally, my opinion is it's better to have 2x 290X* than 1x 295X2. Same way I think it's better to have Fury / X vs Nano. I would only consider the 295X2 or Nano if the build limited me in some way that I couldn't have 2x 290X or Fury / X.
> 
> For me Nano was £350, Fury Tri-X £360 and Fury X £400 and Fury / X had better stock cooling / PCB so no brainer to go for it. No card I was gonna WC, so my original post and this is based on that.
> 
> @Gdourado
> 
> IMO you could get a pants clocking Fury / X and a better clocking Nano, basically good old "Silicon lottery".
> 
> My opinion for going for Fury / X has been outlined in this post and other. This opinion was no way meant to convey Fury / X OC better than Nano but only to inform that technically the Fury / X PCB is better built and that a full fat Fiji chip can use x amount of A/W.


So what size PSU would you recommend for my dual Lightnings, and if I wanted to add, say, a 390X?


----------



## Medusa666

Quote:


> Originally Posted by *Luftdruck*
> 
> That's exactly how a Gentle Typhoon sounds at idle. You can't do much about it except try changing the RPM on the other case fans to change the overall pressure and vibration frequency


I just want to reply to what you wrote.

What you are saying is not true. I got the card replaced by the retailer because of this fan, and the new card is much quieter at idle. There is a huge difference, i.e. the first fan was faulty as I suspected.

I'm extremely happy with the quality of this card, it is the heaviest card I have ever held in my hand, it runs cool, and it is dead silent during full load for hours.


----------



## Luftdruck

Quote:


> Originally Posted by *Medusa666*
> 
> What you are saying is not true, I got the card replaced by the retailer because of this fan, and the new card is much more silent in idle, there is a huge difference, i.e the first fan was faulty as I suspected.


I was able to get my hands on 4 R9 Fury Xs from different AIBs and they all share the same sound profile. It varies in terms of noise level, but they all sound the same.




It's the 5400 RPM version; the RPD and Fury X use the 3000 RPM one, but you still get the same resonance when applying a low voltage to it.

But still, I'm happy to see you managed to get one which is quieter than the one you had before


----------



## Tgrove

Finally got rigbuilder set up and some pics, so it's time I join the club


----------



## Medusa666

Sweet!
Quote:


> Originally Posted by *Tgrove*
> 
> Finally got rigbuilder set up and some pics, so its time i join the club


Very nice setup you got there, those dual Fury Xs look strong!


----------



## GruntXIII

Well... overclocked my Nano a bit, and 1100 MHz core clock at +12 mV seems to be the sweet spot (didn't change the RAM frequency). If I go higher with the clock it gets unstable. If I then raise the voltage it starts clocking down. I guess I'm fine with it... so far everything runs great.

Btw, temperatures under water are really good with this card. After a few hours of Doom, the GPU temp was at 37 °C (water temperature somewhere around room temperature, at 28 °C)

@Tgrove Nice Rig


----------



## spyshagg

What graphics score do you get with that clock? Firestrike 1080p


----------



## GruntXIII

http://www.3dmark.com/3dm/12217694

Hope link works


----------



## Tgrove

Thanks guys, they really are amazing cards. I wish they got the respect they deserve


----------



## Sonikku13

How badly does an A10-7850K bottleneck a Nano? The only reference point I have is 90 FPS in FFXIV: HW at 1080p max.

I am considering buying a new setup... with a Core i7 6700K, 16 GB of DDR4 SDRAM, and an ASUS Z170 motherboard... for the sake of removing the bottleneck.


----------



## SuperZan

The 6700k is a nice CPU but you'll probably want to consider upping your resolution if you want to keep the bottleneck on the GPU side. Plus, Fiji as a design tends to choke at 1080p.


----------



## Sonikku13

Quote:


> Originally Posted by *SuperZan*
> 
> The 6700k is a nice CPU but you'll probably want to consider upping your resolution if you want to keep the bottleneck on the GPU side. Plus, Fiji as a design tends to choke at 1080p.


I'm probably gonna go with an i5 6400, Gigabyte GA-Z170N-WIFI motherboard, 16 GB of relatively cheap DDR4 SDRAM, a low profile heatsink, Corsair RM550x, Cooler Master Elite 110, and Windows 10 Home... all this fits in a $650 budget, allowing me to downsize. I would have gone Haswell, but turns out Skylake is only an extra $22-ish including mini-ITX motherboard, so decided why not?

I presume the i5 6400 is fast enough, well, compared to the A10-7850K.


----------



## Flamingo

Is there any explanation relating to the Fiji architecture as to why Fiji performance is not as good at lower resolutions? What's causing the bottleneck? Is it driver overhead? If so, has OpenGL, Vulkan or DX12 at 1080p shown superior performance?

With all those extra shader cores, the performance increase is not what was expected.

Under OpenCL the situation is different:



At less detailed images, performance is higher. At more complex images, it's not able to pull through.

I wish there were a decent tech review site that actually investigated these things.


----------



## D2015

Does any one have Gigabyte Fury?

or Can any one post a picture of it without a cooler?

thx


----------



## looncraz

Quote:


> Originally Posted by *Flamingo*
> 
> Is there any explanation relating to the Fiji architecture as to why Fiji performance is not as ideal at lower resolutions? What's causing the bottleneck?


It's not actually a bottleneck - it's just that nVidia hardware doesn't scale as well with resolution as AMD hardware. Been that way for years.


----------



## Flamingo

Does the polaris release change anything for those thinking of jumping from red (Fiji) to green?

Nano = 8 teraflops
GTX 1070 = 6.45 teraflops
RX 480 = >5 teraflops

Considering resale value of the Fijis
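For reference, those headline numbers follow from shader count × 2 FLOPs (one fused multiply-add per cycle) × clock. A quick sketch — the shader counts and clocks below are assumed from public spec listings, not from this thread:

```python
# FP32 throughput estimate: shaders * 2 ops (fused multiply-add) * clock.
def fp32_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(round(fp32_tflops(4096, 1000), 2))  # R9 Nano (peak 1000 MHz) -> 8.19
print(round(fp32_tflops(1920, 1683), 2))  # GTX 1070 (boost clock)  -> 6.46
print(round(fp32_tflops(2304, 1120), 2))  # RX 480 (assumed clock)  -> 5.16
```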


----------



## SuperZan

For myself, I wasn't going to consider an upgrade from Fiji until the big dies drop. That hasn't changed, though at sub-$300 I'll probably pick up a Polaris 10 8GB model just for fun.

But to upgrade from dual Fiji it's always been Vega/GP102-ish at the earliest.


----------



## dagget3450

I myself am interested in DX11 performance of Polaris along with DX12. They added some things that should make it better at DX11 than previous GPUs. I may pick up some to play with as well. Depends on reviews for me right now. I was considering a 1070 to play with, but I don't think I will, considering its gap from the GTX 1080.


----------



## Newbie2009

Quote:


> Originally Posted by *Flamingo*
> 
> Does the polaris release change anything for those thinking of jumping from red (Fiji) to green?
> 
> Nano = 8 teraflops
> GTX 1070 = 6.45 teraflops
> RX 480 = >5 teraflops
> 
> Considering resale value of the Fijis


Yeah, massively. The 480 looks to be between the 390X and Fury X. Probably about Fury speeds, a 150W part with 4GB for $199, 8GB for $250?

My cards are almost worthless now lol.


----------



## dagget3450

Quote:


> Originally Posted by *Newbie2009*
> 
> Yeah, massively. 480 looks to be between 390x and fury x. Probably about fury speeds, 150w part with 4GB for $199, 8GB for $250?
> 
> My cards are almost worthless now lol.


I would assume you have had the 290Xs for quite some time now? I don't think it makes them obsolete myself, unless you don't want the heat or power usage. Maybe wait for some reviews to see?


----------



## Newbie2009

Quote:


> Originally Posted by *dagget3450*
> 
> I would assume you have had the 290Xs for quite some time now? I don't think it makes them obsolete myself, unless you don't want the heat or power usage. Maybe wait for some reviews to see?


Yeah I bought on launch. Actually unlocked 290s. I'm pretty happy with the performance I'm getting from them after all this time, I just mean if I tried to sell, which I won't bother anyway, I would get very little for them.


----------



## Flamingo

Anyone with AoTS and R9 Nano? Need to see benchmark and compare against:

http://www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/8b748568-fc96-4e48-9fed-22666a7149f5


----------



## LtMatt

Quote:


> Originally Posted by *Flamingo*
> 
> Anyone with AoTS and R9 Nano? Need to see benchmark and compare against:
> 
> http://www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/8b748568-fc96-4e48-9fed-22666a7149f5


...


----------



## looncraz

Quote:


> Originally Posted by *Flamingo*
> 
> Anyone with AoTS and R9 Nano? Need to see benchmark and compare against:
> 
> http://www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/8b748568-fc96-4e48-9fed-22666a7149f5


Those results are not very impressive. About the same as an R9 290, actually.

My underclocked R9 290 (900mhz) results are nearly identical at the same settings:

http://files.looncraz.net/Ashes-Extreme-290_900mhz.png

...

Also, my power draw was only about 160~180W average above idle thanks to the underclock...


----------



## Blotto80

http://www.ashesofthesingularity.com/metaverse#/personas/b90be4b6-9278-44bb-a81f-35ac5889c667/match-details/8b49cda6-d479-4a6a-87ea-ecccad301c38

Here's mine on an overclocked Fury X.


----------



## Alastair

Guys, I have a serious question. Judging from what AMD is telling us about the 480's performance being in the 500 dollar performance range for 199 bucks, could it affect the resale value of the normal Fury (non X)? If so, should I sell my cards before the 480 launches proper?


----------



## Blotto80

I would already think the value has tanked on them. With the 1070 already "out" in the enthusiast mindset, I don't anticipate an easy sell at more than $300ish for a used Fury. Oh well, Fury X is plenty fast for me until Vega and then I'll sell it for pennies or give it to my brother to replace his 280x.


----------



## xTesla1856

FINALLY got tracking for my RMA'd Fury. Should be here tomorrow


----------



## Medusa666

Quote:


> Originally Posted by *Blotto80*
> 
> I would already think the value has tanked on them. With the 1070 already "out" in the enthusiast mindset, I don't anticipate an easy sell at more than $300ish for a used Fury. Oh well, Fury X is plenty fast for me until Vega and then I'll sell it for pennies or give it to my brother to replace his 280x.


I don't mind, I just think it is good that AMD has a serious chance at getting back into the game now with Polaris.

I bought the Radeon Pro Duo like a week ago, and I'm extremely satisfied. This card is silent as can be, runs cool, and has so much power it puts a grin on my face whenever I play games. The plan is that it will last me a good 3-4 years (the 4th year with low-medium settings).


----------



## Blotto80

Quote:


> Originally Posted by *Medusa666*
> 
> I don't mind, I just think it is good that AMD has a serious chance at getting back into the game now with Polaris.
> 
> I bought the Radeon Pro Duo like a week ago, and I'm extremely satisfied. This card is silent as can be, runs cool, and has so much power it puts a grin on my face whenever i play games. The plan is that it will last me a good 3-4 years ( the last 4th year with low-medium settings ).


That's exactly how I feel too, Fury X is quiet as can be, runs cool, plows through anything I throw at it at 1440p. I went from Crossfire 290x's which I hated to a Fury Tri-X and now the Fury X and I'm done for a while. No matter what else gets announced this year.


----------



## Kana-Maru

Quote:


> Originally Posted by *Blotto80*
> 
> That's exactly how I feel too, Fury X is quiet as can be, runs cool, plows through anything I throw at it at 1440p. I went from Crossfire 290x's which I hated to a Fury Tri-X and now the Fury X and I'm done for a while. No matter what else gets announced this year.


I agree with you and Medusa. I also own the Fury X and it has been great so far. It's very quiet and the GPU temps are always very low. I don't think I'm ever going back to air. AMD has clawed back some of the GPU market and I hope they continue to do well. We all need competition and the $200 Radeon RX480 is a step in the correct direction. I can't wait to see Vega and big Pascal go head to head whenever they release.

I feel that I'll still be using my Fury X until then and possibly afterwards, since AMD GPUs tend to age very well compared to my old GTX 400\500\600 series. Doom @ 1440p gives me 83fps average and 4K gives me 48fps average. That's running 100% Max-Ultra Settings + OpenGL 4.3, and we are still waiting on Vulkan. So once DX12\Vulkan takes off the Fury X might age better than we think, even with the 4GB HBM limitation [based on my Hitman 4K DX12 results].


----------



## Thoth420

Considering the Fury X (blocked... no idea what to expect from it in the stock AIO config) is the only piece of hardware I am satisfied with in my new build, I have to agree. Sadly I dislike Win10 yet need it for DX12... Chances are it will be the only thing to remain after a rebuild, aside from my PSU and Intel SSD... done with Samsung...

I will probably keep the Fury X for a 4K rig to play single player games where IQ matters more, and buy something budget to play shooters and online stuff at 1080p 144Hz. I would just dual display it, but frankly I prefer a single screen per system... I know... old man.


----------



## kokobash

Sorry, I can't seem to find a real answer to my question. But is a 750W Seasonic Gold rated enough to run dual Nanos with a non-K 4790?


----------



## GruntXIII

2x 175 W Nano
+ 1x 84 W Intel
+ a few watts for everything else. That's something around 500 watts max.

So yes, it will be enough. There is room for OC as well.

Btw, good choice. SeaSonic power supplies are really well built and their support is A+.


----------



## Kana-Maru

I'll just leave this here for those who like to read.

*Analyzing AMD Mainstream Strategy*
http://www.overclock-and-game.com/news/pc-gaming/48-analyzing-amd-mainstream-strategy

Just in case you missed my article a few weeks ago, you can read my GTX 1080 article here:

*GTX 1080 - What's Not Being Discussed*
http://www.overclock-and-game.com/news/pc-gaming/46-gtx-1080-what-s-not-being-discussed

Mission accomplished, by the way. It appears I actually started a discussion across the web. Unfortunately not everyone agreed with me, and some called me names, including accusing me of bias for AMD despite my having supported Nvidia for many, many years.


----------



## Elmy

EK Pro Duo waterblock.... A thing of beauty!


----------



## Kana-Maru

Quote:


> Originally Posted by *looncraz*
> 
> Sweetness. Read both articles - fully agree with both! Nice style and clarity!


Thanks man. Glad you enjoyed them. I'll try to keep it up in my free time.


----------



## GruntXIII

Nice.

Really weird to see a jetplate in a GPU watercooler ^^.


----------



## gupsterg

Quote:


> Originally Posted by *GruntXIII*
> 
> 2x 175 W Nano
> + 1x 84 W Intel
> + a few watt for everything else. That's sth around 500 watt max.


I think the 175W is thermal design power (TDP).
Quote:


> For this test, we measure the power consumption of only the graphics card via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, the values here only reflect the card's power consumption as measured at its DC inputs, not that of the whole system.


Quote:


> Peak: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Highest single reading during the test.


Peak 209W in TPU review for Nano (stock clocks).


----------



## GruntXIII

It's usually somewhere in that region.

But I've read around 200 watt peak as well, you're right.

750 watt fits fine, nonetheless.

The German pcgameshardware measured the system total of a 6700K OC'd at 4.5 GHz, 16 GB RAM and a Nano, and it hits 260 watts at 4K gaming. If you add a 2nd one at 200 watts, you'll be well below my mentioned 500 watts (if both really get 100% utilized).


----------



## gupsterg

Yes, I agree with you that the 750W would be fine.

I'll be honest, I've never looked at CPU power consumption when gaming, but I probably wouldn't spec a PSU based on gaming power usage for the total system. This is because at times I'll run Folding@home or, say, a stress test where CPU/GPU are both stressed to a high level (RealBench stress mode).

So in the Bit Tech review, the total system with an i7 6700K running P95 small FFT pulled 148W from the wall; OC'd to 4.8GHz @ 1.35V it was 197W (test setup). As these figures are W from the wall socket, I'd leave the extra W in the figures as a buffer.

So I'd come up with 197W + (209W x 2) = 615W, CPU OC'd but 2x Nano stock, so a 500W unit is not something I'd want to have.
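That back-of-envelope sizing can be written out as a sketch. All wattages are the figures quoted above; the wall-socket CPU number overstates the DC load, which is deliberately left in as a buffer:

```python
# PSU sizing sketch using the post's figures (hypothetical worst case:
# CPU stress test and both GPUs at peak draw simultaneously).
cpu_stress_w = 197   # i7-6700K @ 4.8 GHz, P95 small FFT, measured at the wall
nano_peak_w = 209    # TPU-measured peak for a stock R9 Nano
num_gpus = 2

worst_case_w = cpu_stress_w + nano_peak_w * num_gpus
print(worst_case_w)  # 615 -> a 750 W unit leaves headroom; a 500 W unit would not
```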


----------



## Medusa666

Quote:


> Originally Posted by *Elmy*
> 
> EK Pro Duo waterblock.... A thing of beauty!


Can you post some temperatures with that block when you got it installed?

I'm considering getting this together with an EKWB Predator 240 or 360.

Thanks!


----------



## GruntXIII

Quote:


> Originally Posted by *Medusa666*
> 
> I'm considering getting this together with an EKWB Predator 240 or 360.
> 
> Thanks!


If you're going to watercool your system, I'd really recommend buying the individual components rather than going for those kits. I've been watercooling my parts for a few years now, and I can say that you'll get more bang for the buck and you can fit it to your needs far better than if you take the kits.

Also, if you cool your whole system, I'd go for more radiator surface. Otherwise you'll end up with fans spinning at high RPMs or high core temperatures, just like with the pre-installed AIO watercooler (a 120mm radiator for 350 watts+?... I really can't understand why AMD didn't go for at least a 240mm, or even better a 360).

Per 100 watts of TDP you should usually count at least 120mm of radiator surface. In your case 600mm (350W TDP GPU + 140W TDP CPU), so at least a 360 and a 240 (or 480 + 120, etc. ...depends on your case). With OC, good temperatures and low noise in mind, I'd rather get more. You may also take a look at external radiators like the Watercool MO-RA 3 or Phobya G-Changer NOVA. If you get one of those you'll never have to think about adding radiator surface again, and you can switch your case more easily without having to worry about changing radiators or whether there's enough room... you only need holes going out to your external radiator, which in the worst case can easily be made yourself with the right tools.
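That rule of thumb can be sketched as a quick calculator (the 120mm-per-100W ratio is just the guideline from this post, not a hard spec):

```python
import math

# Radiator sizing per the rule of thumb above: ~120 mm of radiator
# length per 100 W of TDP to be dissipated (rounded up to a full step).
def radiator_mm(total_tdp_w, mm_per_100w=120):
    return math.ceil(total_tdp_w / 100) * mm_per_100w

print(radiator_mm(350 + 140))  # 490 W -> 600 mm, e.g. a 360 plus a 240
```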

But...you can do it anyway you like. Just wanted to point that out


----------



## Flamingo

So I had enough fun with the Nano, so I thought of overclocking it.

Started with +50% and 1050 MHz; it throttled down to 1022 MHz and 980 MHz in the first test of 3DMark (Firestrike 1.1).

Running 3DMark2001SE, the card throttled non-stop lol @ 800-900 MHz.

Also it heated up so fast that I had to run tests @ 100% fan speed (even though my sensitivity is up 150%).

So yea, pretty much decided not to go any further because of heating issues. From here I have three options:

1) Use Gelid Extreme to repaste > interposer risk
2) Buy Silent Wings for mod
3) Buy an EK block with a T12 Fractal cooler > is that investment worth it? (also a 120mm rad for CPU+GPU seems meh, considering the post before about 120mm per 100W).

or just sell it and get a GTX 1070/80 lol


----------



## Medusa666

Quote:


> Originally Posted by *GruntXIII*
> 
> If you're going to watercool your system, I'd rlly recommend buying the single components, rather than going for those kits. I'm watercooling my parts for a few years now and I can say that you'll get more bang for the buck and you can fit it way better to your needs than if you take the kits.
> 
> Also, if you cool your whole system I'd go for more radiator surface. Otherwise you'll end up with fans spinning at high rpms or high core temperatures just like with the pre-installed AIO watercooler (120mm radiator for 350 watt+?...I really can't understand why AMD didn't go for at least 240mm or even better 360).
> 
> Per 100 watt TDP you should usually count at least 120mm radiator surface. In your case 600mm (350W TDP GPU + 140W TDP CPU), so at least a 360 and a 240 (or 480 + 120, etc. ...depends on your case). With oc, good temperatures and low noise in mind, I'd rather get more. You may also take a look at external radiators like the Watercool MO-RA 3 or Phobya G-Changer NOVA. If you get one of those you'll never again have to think about adding radiator size again and you can switch your case easier, without having to worry about changing radiators or if there's enough room...you only need to have holes to go out to your external radiator, which in worst case, can be easily made by yourself with the right tools.
> 
> But...you can do it anyway you like. Just wanted to point that out


Thank You Sir, I'm grateful for the advice.

It sounds better to do that when I begin with custom watercooling.


----------



## Performer81

Quote:


> Originally Posted by *Flamingo*
> 
> So I had enough fun with the Nano, so I though of overclocking it.
> 
> Started with +50% 1050Mhz, it throttled down to 1022Mhz and 980Mhz in the first test of 3DMark (firestrike 1.1)
> 
> Running 3DMark2001SE, the card throttled non stop lol @ 800-900Mhz.
> 
> Also it heated up so fast, that I had to run tests @ 100% fan speed (even though my sensitivity is up 150%).
> 
> So yea, pretty much decided not to go any further because of heating issues. From here I have three options:
> 
> 1) Use Gelid Extreme to repaste > interposer risk
> 2) Buy Silent Wings for mod
> 3) Buy EK block with T12 fractal cooler > is that investment worth it (also 120mm rad for CPU+GPU seems meh considering post before about 120mm per 100W).
> 
> or just sell it and get a GTX 1070/80 lol


Take your voltage down. Lower voltage = higher stable clocks, because it doesn't run into the power limit so easily and stays cooler, especially with the Nano. Even 1050 could be done with lower voltage from what I read.
My XFX Fury also manages 1050 with a -50mv offset (stock at 1000 MHz).


----------



## Flamingo

Quote:


> Originally Posted by *Performer81*
> 
> Take your voltage down. Lower voltage = higher stable clocks, because it doesn't run into the power limit so easily and stays cooler, especially with the Nano. Even 1050 could be done with lower voltage from what I read.
> My XFX Fury also manages 1050 with a -50mv offset (stock at 1000 MHz).


Thank you, I will try that.

While searching for Nano undervolting, I came across another fan mod for the Nano lol:







Might not be suitable for SFF, because it splashes heat everywhere, but I might try to move one of my Gentle Typhoons to the Nano lol

Source: https://hardforum.com/threads/just-bought-a-fury-nano-and-i-need-help.1883424/#post-1042006692


----------



## bluezone

New driver release.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.1-Release-Notes.aspx

64 bit Win 7:

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

64 bit Win 10:

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

32 bit Win 7:

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+32

32 bit Win 10:

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+32

Enjoy.

Cheers.


----------



## dagget3450

Quote:


> Originally Posted by *bluezone*
> 
> New driver release.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.1-Release-Notes.aspx
> 
> 64 bit Win 7:
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> 64 bit Win 10:
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64
> 
> 32 bit Win 7:
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+32
> 
> 32 bit Win 10:
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+32
> 
> Enjoy.
> 
> Cheers.


That's it, I am going green team, too many driver updates with red team.... Not!!


----------



## battleaxe

Quote:


> Originally Posted by *dagget3450*
> 
> That's it, I am going green team, too many driver updates with red team.... Not!!


What no overpriced 1080 series for you? Not...

dual 480 beats the 1080 for under $500.00... nice...


----------



## Orthello

Quote:


> Originally Posted by *dagget3450*
> 
> That's it, I am going green team, too many driver updates with red team.... Not!!


LOL... I was a little bit green with envy at the CFX support for Warhammer a driver or two back from AMD... still waiting on a decent profile from NV.

I'm liking the direction Raja is steering the ship in.


----------



## SuperZan

Quote:


> Originally Posted by *Orthello*
> 
> LOL .. i was a little bit Green with envy at the CFX support for Warhammer already a driver or two back from AMD.. still waiting on a decent profile from NV .
> 
> I'm liking the direction Raja is steering the ship in.


It hasn't been a seamless transition from CCC but Raja and RTG have pretty much been on point. I'm much happier with the general user experience than I was a year ago.


----------



## Orthello

Quote:


> Originally Posted by *battleaxe*
> 
> What no overpriced 1080 series for you? Not...
> 
> dual 480 beats the 1080 for under $500.00... nice...


That was impressive with what looks like immature drivers too.

What is rumored to be impressive is that they might have done this at lower wattage also, e.g. the total power of the two 480s was (rumored to be) less than the 1080 whilst doing this; that's something I'm looking forward to seeing proven or not in the future.

Raja did point to the +Efficiency win in the benchmark, and some think that means power drawn was less than the 1080. It could just mean it had untapped reserves (mainly in the single batch, it seems, or light loading).

It's also one game, so there is a lot to be proven yet with RX 480 CFX. Certainly at the price, CFX could be a competitor to custom 1070 AIB models (they could be over $400 USD); I think as long as AAA titles continue to get CFX support in a timely manner, doubling up remains a good option. Especially if power and price are not also problematic.

Nobody knows the numbers yet, but say a 1070 gains 30-40% over an RX 480... well, CFX 480s will actually pull ahead of that in most CFX-supported titles, as anything less would be considered very poor scaling.

There are quite a few options ... Looking forward to the reviews.


----------



## pdasterly

What's going to happen with the Radeon Pro Duo? With the RX 480 coming soon, it's priced at twice as much as it's worth. Heck, I see Fury non-X for $300 used (US).


----------



## djsatane

Before we declare the 480 beating the 1080, let's wait for actual real-world non-affiliated reviews and both cards in the hands of customers.

Also, I am very skeptical of the CFX performance with the 480 being as good as hyped. It does sound interesting, but we shall see.


----------



## Jflisk

You see this stuff all the time. Fury X is the Titan killer. 1080 is the best of the best. Then you get to test the cards yourself and find out how far off base the hype really is. It's been happening since I have been buying cards.


----------



## Kana-Maru

Quote:


> Originally Posted by *Jflisk*
> 
> You see this stuff all the time. Fury X is the titan killer . 1080 is the best of the best . Then you get to test the cards yourself and find out how far off base the hype really is. Been happening since I have been buying cards.


To be fair, AMD did give you Titan\Titan X performance for less than $1000 both times, with the 290X and Fury X. The GTX 980 Ti and Titan\Titan X benefit was the overclocking overhead, until you got to 4K, where it didn't matter that much. As far as reference designs go, the Fury X was always competitive with the GTX x80 - GTX 980 Ti and the Titans. The 295X2 was also much cheaper than the Titan Z, even before the 295X2 price drop.

The GTX 1080 has its fair share of problems, but the card still gets a perfect "A" or "100%" from several review sites although there are plenty of flaws and issues surrounding the card. I've seen one site give the GTX 1080 an "A" in the *price* and temperature category. It's definitely a powerful card and overkill for 1080p, but people have to stop getting caught up in the "GHz glamour" [yes, that was coined by yours truly] and marketing. AIBs should fix most of the GTX 1080 issues, but the reviewers were all reviewing the "Founders Edition", which is not a 100% GPU by any means. From what I've seen and read, the card actually runs hotter than the 290X. Yet there are no volcano or magma GTX 1080 memes floating around.

When I was running my different Nvidia GTX GPUs from 2010 to 2015, I finally had to say enough is enough with the marketing. I need my GPUs to live a decently long life without resorting to SLI or regretting my $300-$500+ purchases. I dodged the GTX 970 3.5GB fiasco even though I initially wanted to buy two and SLI them. I'm also glad I passed on the GTX 980, since it didn't take long for that card to get beaten by cheaper alternatives. Luckily I have no such regrets since switching back to ATI\AMD with my Fury X.

Looking at DX12 & Vulkan, it's proving that architecture and long-term planning are better than burning a hole into the card with high stock clocks and overclocks. I enjoy a cool case and a cool ambient temp in my room when the AC is off. Plus the 4GB HBM1 is doing much better than I thought at 4K when it comes to frame rate averages, 97th percentile and frame times when 100% maxed. Vega with HBM2 should be a beast, and Nvidia is enjoying the high clocks... this should make for great high-end competition next year.


----------



## Elmy

Custom Coppered out Aquacomputer Fury X waterblocks. #Waterblockporn #SoShiny


----------



## Willius

Quote:


> Originally Posted by *Elmy*
> 
> Custom Coppered out Aquacomputer Fury X waterblocks. #Waterblockporn #SoShiny


Looks good!

On a side note, does anyone know of any R9 Nano BIOS mods to enhance overclocking? Thermals aren't an issue since I've got it blocked with an EK waterblock.

Since it has a dual BIOS, I thought I should make use of it


----------



## Gdourado

So I just saw a great promotion on a 390X.
It's probably a clearance sale or something...
But either way, a pair of 390Xs comes at a sweet price.
How is the performance of 390X crossfire?
How do they compare against a single aircooled Fury?
From some reviews I see online, at 1080p the Fury is anywhere from 3 to 15 fps ahead of a single 390X.
So even with bad scaling, can a crossfire 390X pull ahead of a Fury by 50-60 fps?
How is the current state of microstutter? Is it still an issue with the latest drivers?

Cheers!


----------



## gupsterg

Quote:


> Originally Posted by *Willius*
> 
> On a side note, anyone knows of any R9 Nano bios mods to enhance overclocking?


Fiji bios mod


----------



## Jflisk

Quote:


> Originally Posted by *Kana-Maru*
> 
> To be fair, AMD did give you Titan\Titan X performance for less than $1000 both times, with the 290X and Fury X. The GTX 980 Ti and Titan\Titan X benefit was the overclocking overhead, until you got to 4K, when it didn't matter that much. As far as reference designs go, the Fury X was always competitive with the GTX x80 - GTX 980 Ti and the Titans. The 295X2 was also much cheaper than the Titan Z, even before the 295X2 price drop.
> 
> The GTX 1080 has its fair share of problems, but the card still gets a perfect "A" or "100%" from several review sites, although there are plenty of flaws and issues surrounding the card. I've seen one site give the GTX 1080 an "A" in the *price* and temperature category. It's definitely a powerful card and overkill for 1080p, but people have to stop getting caught up in the "GHz glamour" [yes that was coined by yours truly] and marketing. AIBs should fix most of the GTX 1080 issues, but the reviewers were all reviewing the "Founders Edition", which is not a 100% GPU by any means. From what I've seen and read the card is actually hotter than the 290X. Yet there are no volcano or magma GTX 1080 memes floating around.
> 
> When I was running my different Nvidia GTX GPUs from 2010-2015 I finally had to say enough is enough with the marketing. I need my GPUs to live a decently long life without resorting to SLI or regretting my $300-$500+ purchases. I dodged the GTX 970 3.5GB fiasco even though I initially wanted to buy two and SLI them. I'm also glad I passed on the GTX 980, since it didn't take long for that card to get beaten by cheaper alternatives. Luckily I have no such feelings since switching back to ATI\AMD with my Fury X.
> 
> Looking at DX12 & Vulkan, it's proving that architecture and long-term planning beat burning a hole into the card with high stock clocks and overclocks. I enjoy a cool case and a cool ambient temp in my room when the AC is off. Plus the 4GB HBM1 is doing much better than I thought at 4K when it comes to frame rate averages, 97th percentile and frame times when 100% maxed. Vega with HBM2 should be a beast, and Nvidia is enjoying the high clocks... this should make for great high-end competition next year.


I mean, I try to keep up with the next gen when it comes to my system. The only regret I have with my Fury X's is that here we are one year later and ready to go to HBM2, so here comes another $600 out of pocket. The funny thing is I gave up 3x 290X to go to the Fury X's and the cards were a match as far as benchmarks go, but my power bill went down, considering the 290X's were power monsters. Had them under water with many radiators and saw 60C at times. Those were paired with an FX-9590 that probably didn't help matters much - that's another power beast. The Fury X's run cool (no reason to put them under water at this point) and are now paired with the i7 4790K. The ambient in my house without AC is pretty decent (in other words the system is not a space heater any more). I have no problem playing games at 2560x1440 (I think this is considered 2K, but I am happy with it). I am hoping they get Vulkan and DX12 injected into the games to see where the benefits of the API are.


----------



## GruntXIII

@Gdourado

How much do you pay for them?

Maybe it would be better to wait for RX 480(X) or GTX 1060(ti)?


----------



## Gdourado

Quote:


> Originally Posted by *GruntXIII*
> 
> @Gdourado
> 
> How much do you pay for them?
> 
> Maybe it would be better to wait for RX 480(X) or GTX 1060(ti)?


250 Euros each. 500 for both.


----------



## Jflisk

Quote:


> Originally Posted by *Gdourado*
> 
> So I just saw a great promotion on a 390X.
> It's probably a clereance sale or something...
> But either way, a pair of 390X comes at a sweet price.
> How is the performance of a 390X crossfire?
> How do they compare against a single Aircooled Fury?
> From some reviews I see online, at 1080p, the fury is anywhere from 3 to 15 fps ahead of a single 390X.
> So even with bad scalling, a crossfire 390X can pull ahead by 50-60 fps from a fury?
> How is the current state of MicroStutter? Is it still an issue with latest drivers?
> 
> Cheers!


I can tell you your best bet is one card, not 2+. Crossfire is still out there and being supported, but the rule of thumb is you are always better with one card and not 2+. I have not seen any micro stutter on any supported games I own that support crossfire. I added the + to designate tri or quad fire. Thanks.


----------



## Gdourado

Quote:


> Originally Posted by *Jflisk*
> 
> I can tell you your best bet is one card, not 2+. Crossfire is still out there and being supported, but the rule of thumb is you are always better with one card and not 2+. I have not seen any micro stutter on any supported games I own that support crossfire. I added the + to designate tri or quad fire. Thanks.


So you are saying I should buy a Fury instead a pair of 390X?


----------



## Jflisk

Quote:


> Originally Posted by *Gdourado*
> 
> So you are saying I should buy a Fury instead a pair of 390X?


The answer to that is you would need to see what kind of framerates a Fury/X gets as opposed to two 390Xs. If the performance is the same, then the one Fury would be the better choice.

Looks like agent smith did the math

http://www.overclock.net/t/1566264/2-radeon-r9-390x-or-1-fury-x


----------



## Willius

Quote:


> Originally Posted by *gupsterg*
> 
> Fiji bios mod


Oh derp, I couldn't find it. Searching is hard, right?


----------



## gupsterg

No worries, look forward to some results and shares of your exploits.


----------



## Willius

The Nano that I have is a notoriously bad overclocker, at least with the standard BIOS. My highest 3DMark Fire Strike is 13290 with the Radeon R9 Nano @ 1060/520MHz. I can get it to 1062/525 in Sky Diver, but 1MHz over that and it artifacts/crashes the driver. Keep in mind I run a 4670K, so that gimps my physics score quite a bit; overclocking it to 4.85GHz only increases my scores a tiny little bit.
But we'll see what I can do.


----------



## gupsterg

I have had 1x Fury Tri-X and 6x Fury X; I have not owned a Nano. From members' shares they seem to respond to a voltage decrease to combat throttling; raising aspects of the PowerLimit by ROM may help without having to decrease voltage, which in turn may help the card sustain and/or attain a better OC.

The PowerLimit section in the Fiji bios mod I kept quite sane (ie to PCI-SIG spec), but the hardware spec is higher, so if someone wished to be adventurous it could gain them more. One member has done so on a Nano; view Huntscraft's posts in the thread.

Generally speaking a modded bios won't make the silicon achieve something it can't do; some aspects of bios modding are not the same as, say, tweaking mobo bios settings to stabilise a CPU/RAM OC.

I'd advise not OC'ing HBM for now.

Reason 1

From 3DM FS benches, each 1% increase in HBM clock yields only ~0.3% performance gain, whereas GPU clock scaling is pretty much 1:1.

Reason 2

AMD Matt on several forums posted that HBM clocks in steps (ie 500.00/545.45/600.00/666.66MHz). He is an AMD support person; he has no reason to give misinformation. At first I was inclined to think HBM clocks in smaller steps; I'm still not sure, but I'm working towards knowing if it does. I'm correlating data from benches and stability testing of HBM clocks, then hoping to present it in the Fiji bios mod thread so others can do the same testing and provide data, so we can arrive at a conclusion.
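To illustrate the stepped-clock idea, here is a small sketch that snaps a requested HBM clock down to the nearest of the steps AMD Matt listed. The step list is taken from the post above, but the assumption that these are the only steps (and the helper name itself) is purely for demonstration:

```python
# Hypothetical illustration: snap a requested HBM clock to the discrete
# steps AMD Matt described. Whether intermediate clocks exist is exactly
# what the testing described above is trying to establish.
HBM_STEPS_MHZ = [500.00, 545.45, 600.00, 666.66]

def effective_hbm_clock(requested_mhz):
    """Return the highest step not exceeding the requested clock."""
    candidates = [s for s in HBM_STEPS_MHZ if s <= requested_mhz]
    return max(candidates) if candidates else HBM_STEPS_MHZ[0]

# Under this assumption a 545MHz request would actually run at the
# 500.00MHz step, while 546MHz would land on 545.45MHz.
print(effective_hbm_clock(545))  # 500.0
print(effective_hbm_clock(546))  # 545.45
```

If the stepping holds, this would explain why small HBM offsets sometimes show no benchmark change at all: the effective clock only moves when a request crosses a step boundary.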


----------



## Butthurt Beluga

So this is probably a (really) stupid question, but are the Fury series and the R9 300 series on the same driver release(s)? My machine with my Fury X is on driver release 16.20.xxx and my R7 370 machine is on 16.6.1.

Also, I know there was an Overwatch driver, I think in 16.5.1 or so, but I haven't seen any FPS improvement on my Fury X... where my little R7 370 is doing ~69 FPS on high settings @1080p.


----------



## gupsterg

We have a method of confirming AMD Matt's information on how the Fiji memory controller clocks HBM RAM in steps.

If members can view this post and test/confirm with their cards, it would be great.


----------



## ht_addict

For those that waterblocked their Fury X, did you notice much of a difference in temp? I was thinking of doing my Fury X's with a Predator 360 and dual prefilled blocks with quick disconnects. Only thing is the cost would be about $600US. Or do I just clean up the TIM and go with quieter fans?


----------



## gupsterg

I'm pretty impressed by the Fury X AIO (stock fan/TIM/pads); I wouldn't spend 600USD on a cooling upgrade.

I have set the GPU at 1135MHz with +31.25mV VID over stock via ROM, HBM is clocked at 545MHz with +18.75mV via ROM, and there is a custom fan profile via ROM (ie Fuzzy Logic). Ambient temps in the UK have been pretty high the past few days; I have seen 27C on my digital room temp monitor in the daytime and ~21C at night (~12am). My system has not been off for over ~65hrs.

Folding log of the past ~25hrs, hit a poor unit.

fh_log.txt 112k .txt file

0 bad states CPU / GPU.


----------



## hyp36rmax

Quote:


> Originally Posted by *ht_addict*
> 
> For those that waterblocked their Fury X, did you notice much of a difference in temp? I was thinking of doing my Fury X's with a Predator 360 and dual prefilled blocks with quick disconnects. Only thing is the cost would be about $600US. Or do I just clean up the TIM and go with quieter fans?


This is really subjective. As a hobby I would say why not re-block with a dedicated GPU block, but you really have to ask yourself if it's worth it. I personally re-blocked my FURY X with a couple of EK blocks for the sake of aesthetics and slightly cooler temps, using copper tubing. My setup doesn't break 35C with an ambient of about 20-21C with both cards at full load and an overclocked i7 5820K. New TIM is really all you have to do, though, without going all out.

I'll try to get some screenshots later


----------



## toncij

Quote:


> Originally Posted by *hyp36rmax*
> 
> This is really subjective. As a hobby I would say why not re-block with a dedicated GPU block. You really have to ask yourself if it's worth it. I personally re-blocked my FURY X with a couple EK blocks for the sake of aesthetics and a little cooler using copper tubing. My setup doesn't break 35C with an ambient of about 20-21C with both cards at full load and an overclock i7 5820K. New tim is really all you have to do though without going all out.
> 
> I'll try to get some screenshots later


Is the stock paste that bad?


----------



## Maximization

Quote:


> Originally Posted by *ht_addict*
> 
> For those that waterblocked their Fury X, did you notice much of a difference in temp? I was thinking of doing my Fury X's with a Predator 360 and dual prefilled blocks with quick disconnects. Only thing is the cost would be about $600US. Or do I just clean up the TIM and go with quieter fans?


The water blocks keep it in check and keep it more silent. I got tired of the two rads hanging out of my case, so it was for looks also. It is an expensive hobby, no doubt. I am hoping I can hold out till PCIe 4.0 with them.


----------



## mustrum

I got the EKWB block on my Fury X. It runs at 45 instead of 65 degrees but won't clock better because of that. The stock cooler was too loud though, and the system is in place. Running a Mora radiator just for the CPU would be a bit overkill. It is more elegant this way.


----------



## bluezone

I decided to run a couple of experiments with my Nano to see if I could gain some extra cooling headroom. First I temporarily replaced the plastic shroud with an aluminum shroud, which had contact with the fins of the Nano heat sink. Then I ran the Valley benchmark. It slightly lowered fan speeds and temps by 1-2 deg, and the shroud is cool to the touch. But if I shut down the PC, the shroud temperature immediately rises to match the heat sink temperature, 60-70 deg. In my opinion not safe, and possibly electrically dangerous (big ground).

The second option was to apply thermal foam seal to the inside of the stock plastic shroud, in order to force air flow out through the cooling fins only and to build static air pressure. Then I ran the Valley benchmark again. This resulted in slower fan speeds, slower attainment of max temperature, and IMO less temperature fluctuation: the same max temperature but quicker cooling.

I ended up sticking (no pun intended) with the thermal foam tape solution.


----------



## Unkzilla

Well, the good part about the 1080/1070s is that I've managed to pick up a Fury X on the cheap. Brand new on clearance I got this for 700AUD (for comparison, the 1080 is 1200AUD), so I don't think it's a bad deal. Not even sure if I'll use it or auction it yet... using an R9 390 at the moment and have mixed feelings on the driver side of things. But here she is:

http://s33.postimg.org/xm9juqb0f/20160610_113018.jpg


----------



## Kana-Maru

Quote:


> Originally Posted by *mustrum*
> 
> I got the EKWB block on my fury X. It runs at 45 instead of 65 degrees but wont clock better because of that. The stock cooler was too loud though and the sytem is in place. Running a Mora radiator just for the cpu would be a bit overkill. It is more elegant this way.


Am I the only person to never have noise coming from my Fury X? Also I don't think I've ever hit 65c even on the hottest days while running 3Dmark 11 or FireStrike.

Quote:


> Originally Posted by *Unkzilla*
> 
> Well the good part about the 1080/1070's is that i've managed to pickup a fury X on the cheap. Brand new on clearance I got this for 700AUD (for comparison the 1080 is 1200AUD) so I don't think its a bad deal. Not even sure if i'll use it or auction it yet.. using a r9 390 at the moment and have mixed feelings on the driver side of things. But here she is:
> 
> http://s33.postimg.org/xm9juqb0f/20160610_113018.jpg


Nice. That's the same one I have. That GameCaster deal is pretty nice if you stream a lot.


----------



## SuperZan

Quote:


> Originally Posted by *Kana-Maru*
> 
> Am I the only person to never have noise coming from my Fury X? Also I don't think I've ever hit 65c even on the hottest days while running 3Dmark 11 or FireStrike.


I don't know if I'm just not sensitive to it or what but two air-cooled Furies, a Fury X, and an AIO Fury have all been in use here and I've not had a noise issue with any of them past the first thirty minutes of fresh install.


----------



## Kana-Maru

Quote:


> Originally Posted by *SuperZan*
> 
> I don't know if I'm just not sensitive to it or what but two air-cooled Furies, a Fury X, and an AIO Fury have all been in use here and I've not had a noise issue with any of them past the first thirty minutes of fresh install.


I'm not sure if the noise issues were blown out of proportion to make AMD look bad or something, but there were some complaints across the web about the coil whine. As far as I know AMD pulled the cards off the market and fixed the issue. I retired my GTX dual SLI and purchased my Fury X after the issue was resolved, and I have no coil whine or pump noise coming from my GPU either.

A lot of people claim to this day that it's the actual card, when in fact it was the Corsair AIO cooler issue, from what I read from reviewers. The GTX 1080 has fan noise issues and there doesn't seem to be any outrage about that. The GTX 1080 gets very warm, hotter than a 290X reference in some cases, and there are no widespread complaints, not even a meme with something on fire. I guess it's front-page news for many months if it's AMD, but not worth mentioning in most cases if it's Nvidia.


----------



## GruntXIII

Some don't care... some (like me) do.

If you're using headphones, noise is usually no problem, though you can hear coil whine even with headphones on; that's a big issue with Nano cards (and I guess the Fury X as well). My 780s had it as well, but only a tiny bit at a few hundred fps. The Nano screams like hell even at 90 fps (it's a bit better with undervolting).

But if you use noise dampening mats (or whatever they're called in English), it's not an issue any more.


----------



## Kana-Maru

Quote:


> Originally Posted by *GruntXIII*
> 
> Some don't care...some (like me) do.
> 
> If you're using headphones, noise is usually no problem. Though...you can hear coil whine even with headphones on..that's a big issue with Nano cards (and I guess Fury X as well). My 780s had it as well...but only a tiny bit at a few hundred fps. Nano screams like hell even at 90 fps (it's a bit better with undervolting).
> 
> But, if you use noise dampening mats (or what's it called in english?) it's not an issue any more.


Yeah, they go by many names, but "noise\sound\vibration dampening mats" is basically what we call them here. I forgot that a ton of people use headsets during gaming. I think the biggest complaints came from reviewers running the card on open test benches; I don't remember many (or any) reviewers throwing the GPU in a case and testing the dBs, but perhaps some did. For me it would probably be no big deal, since I'm usually around hundreds of servers all the time with A\Cs blowing and computer fans going at 100%. I'm used to noise.


----------



## gupsterg

Air-cooled Fury Tri-X: I literally loved the cooler; even with the fan profile I thought it very quiet. TBH at times I was checking if the fans were running when the card was under load.

With the AIO Fury X my only gripe was installation; I found maneuvering two elements for installation a bit tricky. This is probably a combo of me/my case, so others may not think the same. Depending on the fan revision, each sample I've had seems to differ in perceived noise, but it's not an issue as the fan usually never runs at those speeds; these were checks I did by manually setting the fan at x RPM.

Out of all 7 Fiji cards, all ref AMD PCB, I had no issues with coil whine. Never heard them through headphones, which, taking into context what I do on the PC, is not a big proportion of the time. Even without headphones it has been difficult to hear coil whine on pretty much all of them when the room has some low ambient noise. Yes, at times I hear whine when the room is totally silent, but IMO in no way excessive or irritating.

I don't have a hearing issue IMO, and I would think I'm an OCD PC owner.

All in all well pleased with "out of box" experience of Fiji vs 4 differing Hawaii cards with aftermarket coolers I owned at one point.

The temps on the VRM are impressive IMO when compared to the Hawaii cards as well. What the Fury/X coolers maintain as GPU temp, with a slight fan profile mod in ROM, is phenomenal compared to Hawaii with an AIB HSF.


----------



## mustrum

When I said I replaced the cooler with an EKWB block because of noise, I did not mean the Fury X was loud. I could just hear it.

I've got a very powerful watercooling setup. My PC is so silent that you cannot hear if it is running or not unless you go very close to it. Compared to that, the fan of the Fury X is "loud". It reached 65 degrees because I have no need for very much airflow in my tower. I do have airflow for the components, but it is not enough to transport the heat of a Fury X under load out of the case quickly. Since the radiator could not be mounted outside (you cannot unplug the hoses), it basically baked itself inside.

I am fully aware that the Fury X in its stock setup is a silent card compared to the competition. Mine does not have any coil whine either.
It's a great product really.


----------



## Flamingo

Posted in the wrong thread so reposting.

The Polaris 10 leaked 3DMark result is interesting...

http://www.3dmark.com/3dm11/11263084

3DMark 11 Performance Graphics Score = 67DF:C7 = 18060

3DMark 11 Performance Graphics Score = R9 Nano stock = 17763

3DMark 11 Performance Graphics Score = R9 Nano with +50% = 18614

A card worth $200 reaching Nano performance already, welp. But I wonder if it holds its ground at 4K.
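For context, the quoted scores can be turned into percentage margins over the stock Nano with a quick calculation (scores taken from the post above; the "+50%" label is reproduced as quoted):

```python
# Compare the leaked Polaris 10 (67DF:C7) 3DMark 11 graphics score
# against the R9 Nano scores quoted above.
scores = {
    "Polaris 10 (67DF:C7)": 18060,
    "R9 Nano stock": 17763,
    "R9 Nano +50%": 18614,
}

baseline = scores["R9 Nano stock"]
for name, score in scores.items():
    delta = (score - baseline) / baseline * 100
    print(f"{name}: {score} ({delta:+.1f}% vs stock Nano)")
```

The leaked part lands roughly 1.7% above a stock Nano and about 3% below the +50% Nano result, which is why it reads as "Nano performance" at a fraction of the price.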


----------



## flopper

Quote:


> Originally Posted by *Flamingo*
> 
> Posted in the wrong thread so reposting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A card worth $200 dollars reaching Nano performance already, welp. But I wonder if it holds its grounds at 4k.


At 4K, no, as I suspect it will be bandwidth limited to some extent.
And besides, it's made for 1080p/1440p.


----------



## dagget3450

If the $200-ish GPU performs around a Nano, I sure hope they support up to 4-way CF also. I am curious to see these in action for sure.


----------



## Maximization

I can't seem to find any benchmarks of 1080 SLI vs Fury X crossfire @ 4K.


----------



## dagget3450

Seems like it's marketing. I see some reviews used a Pro Duo, but that's not very accurate against a Fury X. I try to remain open minded, but Nvidia really seems underhanded this launch. I still don't understand why so many review sites don't have a Fury X, or even two, for reviews. There is no shortage of 980 Tis or 1070/1080s for reviews. Heh. I like how a Fury or Nano is used for Fiji on many sites, and to hell with the Fury X.


----------



## Thoth420

Quote:


> Originally Posted by *dagget3450*
> 
> Seems like its marketing. I see some reviews used a pro2duo but thTs not very accurate against fury x. I try to remain open minded but Nvidia really seems underhanded this launch. I still dont understand why so many review sites dont have a fury x or even 2 for reviews. There is no shortage of 980tis or 1070/1080s for reviews. Heh. I like how a fury or nano is used for fiji on many sites and to hell with a furyx.


Expect a 1080 Ti whenever AMD makes a move that puts pressure on Nvidia (from a marketing and media standpoint, as that tends to matter more than reality). I would guess that, or some form of new Titan, will be the Big Pascal with HBM2. I plan on sitting on the Fury X, or, if I rebuild without a loop, perhaps one of the new Polaris cards as a placeholder until both camps have an HBM2 flagship GPU out to decide between. I am very biased toward AMD after all my experience... as long as the card stays cooled properly they last just as long as Nvidia's, and frankly they age better.

The 1080, just like the 980, is for fanbois who will just "upgrade" as soon as Nvidia releases a better card just months later... which they will...

OR

idiots who fall victim to marketing

OR

someone who really needs a new top end GPU right now and prefers Nvidia (only person in my mind with a brain in this group of variables)


----------



## GruntXIII

If you want the best card today, the 1080 is the way to go. It's as simple as that. You don't have to prefer any brand (I don't), unless you have to choose because of G-Sync or FreeSync.

But somehow I'm interested in the RX 480. The Nano is really good (mine operates at Fury X level, as mentioned earlier), but I guess I don't need that much performance.

Any European in here interested in a watercooled Nano?


----------



## Kana-Maru

Quote:


> Originally Posted by *GruntXIII*
> 
> If you want the best card today, 1080 is the way to go. It's simple as that. You don't have to prefer any brand (I don't).


That's so vague. You could have easily said the same about the 780\980\Titans as well. They were the best cards at the time with no competition, and we see how well that worked out once the "Ti" & next-gen Titan dropped, as well as AMD GPUs. So that's only going to cost you $649 & $549 to $1000 for the "best GPUs at that time". Simple as that, right? Pay a nice sum of money and swallow the performance gap 6 to 7 months later, some sooner than that. Don't worry, Nvidia will have more performance coming soon for anywhere from $649.99-$1049.99 or more for top-of-the-line graphics.

I'm all for getting the best card, but the prices are simply getting ridiculous and it looks like it's not going to stop. Nvidia is just going to cash in and cover their architecture flaws with minor to no complaints. So far there is "some" backlash from Nvidia customers, but they are still purchasing the GPUs, so that completely erases any criticism they have.

At some point brands do matter to a certain extent. When I see a company using their money to influence the results, that's simply cringeworthy. You know readers are going to take those reviews and run with them. This is one of the reasons I normally run my own benchmarks and take some reviews with a grain of salt. At this point the Nvidia marketing, long money flow & the sponsors aren't trying to hide it. It's completely blatant right now.

I used Nvidia GPUs for 5 years until my Fury X, but I'm called a fanboy. All of those years with Nvidia GPUs I saw that AMD wasn't getting a fair shake, even though their GPUs were very competitive and even better than Nvidia GPUs in some reviews. Nvidia has the biggest brand though, and they have the largest group of "keyboard soldiers" ready to call you names on every website.

Being unbiased is great, but being foolish and falling for a paper launch, while paying a premium price [$699 + whatever demand price pops up] for a REFERENCE card that is low in quantity, is "the" definition of a fanboy or someone who loves their brand. Forget that "best card today" crap. Some of these people can't wait and pre-ordered without thinking twice. I'm looking at AMD and their Nano paper launch as well, but this time AMD [RX 480] was careful with their words; Nvidia wasn't.

Sorry for the long post.


----------



## dagget3450

Quote:


> Originally Posted by *Kana-Maru*
> 
> That's so vague. You could have easily said the same things about the 780\980\Titans as well. They were the best cards at the time with no competition and we see how well that has worked out once the "Ti" & next gen Titan dropped as well as AMD GPUs. So that's only going to cost you $649 & $549 to $1000 for the "best GPUs at that time". Simple as that right? Pay a nice sum of money and swallow the performance gap 6 to 7 months later and some sooner than that. Don't worry Nvidia will have more performance coming soon for anywhere from $649.99-$1049.99 or more for the top of the line graphics.
> 
> I'm all for getting the best card, but the prices are simply getting ridiculous and it looks like it's not going to stop. Nvidia is just going to cash in and cover their architecture flaws with minor to no complaints. So far there is "some" backlash from Nvidia customers, but they are still purchasing the GPUs so that completely erases any criticism they have.
> 
> At some point brands do matter to a certain extent. When I see company using their money to influence the results, that's simply cringe worthy. You know readers are going to take those reviews and run with them. This is one of the reasons I normally run my own benchmarks and take some reviews with a grain of salt. At this point the Nvidia marketing, long money flow & the sponsors aren't trying to hide it. It's completely blatant right now.
> 
> I used Nvidia GPUs for 5 years until my Fury X, but I'm called a fanboy. All of those years with Nvidia GPU I saw that AMD wasn't getting a fair shake even though their GPUs were very competitive and even better than Nvidia GPUs in some reviews. Nvidia has the biggest brand though. and they have the largest group of "keyboard soldiers" ready to call you names on every website.
> 
> Being unbiased is great, but being foolish and falling for a paper launch, while paying a premium price [$699+whatever demand price pops up] for a REFERENCE card that is low in quantity is "the" definition of a fanboy or someone who loves their company brand. Forget that "best card today" crap. Some of these people can't wait and pre-ordered without thinking twice. I'm looking at AMD and their Nano paper launch as well, but this time AMD [RX 480] was careful with their words, Nvidia wasn't.
> 
> Sorry for the long post.


In short, I recall how many threads and attacks AMD got over the "paper launch" of Fiji. To be fair though, it appears more people got their hands on a GTX 1080 in the first initial weeks than a Fury X at launch.

I think people often mistake "best" for top performance. Someone who can only afford 200 or 300 dollars isn't going to call a GTX 1080 the "best" because it's not an option for them. As you say, "best", if using top performance as the marker, is very temporary. Who knows how long it will be before AMD or even Nvidia themselves drop a new top performing card; I would say we're looking at roughly a few months to 6 or so at the most? Truth is we don't know what all AMD has in store for Polaris just yet. I am hoping these new Polaris GPUs address the weaknesses of the previous GCN on things like DX11/tessellation.

Overall I have been underwhelmed myself on the Fury X, mostly on the OC headroom and VRAM size. The biggest incentive for AMD has been all the features and crazy things I've been able to do. I ran Nvidia for most of my gaming years; once I got into Surround, then swapped to AMD and jumped into Eyefinity, it's been absolutely fun. I still have yet to get FreeSync, but I am going to wait a bit longer and see how Polaris and/or Vega play out. That is mostly due to the changing gaming market (DX12/Vulkan/Win10/CF/SLI), and I want to see how much harder multi-GPU is hit by the end of 2016.


----------



## Kana-Maru

Quote:


> Originally Posted by *dagget3450*
> 
> In short, i recall how many threads and attacks AMD got over "paper launch" of Fiji. To be fair though, it appears more people have gotten their hands on 1080gtx than the FuryX launch in the first initial weeks. I think people often mistake "best" for top performance.


Yeah, but if you can remember, the Fury X was pulled from the market to fix the coil whine issues caused by Corsair's AIO pump, something that is usually left out. AMD was careful with their wording this time around, and Nvidia went straight to the hype train. Now, the Nano I will call out on the paper launch for sure; it was new technology so I can understand, but that's still no excuse.

Quote:


> I am hoping these new Polaris gpu's address the weakness of the previous GCN on things like dx11/tessellation.


That would be nice, but what's the point of tessellation fixes when it doesn't increase IQ past a certain setting? Excessive use of tessellation has never increased quality. Now high tessellation on strands of hair [or random objects] is a "thing". The only way I see it is that excessive use of tessellation along with Nvidia GameWorks is purely for marketing and Day 1 benchmarks.

http://i.imgur.com/2wyxv90.jpg
http://www.extremetech.com/wp-content/uploads/2015/05/Witcher3.png

I can see the concern around tessellation being abused. I guess if AMD can improve the performance Nvidia would stop using it excessively.

Quote:


> Overall i have been underwhelmed myself on the FuryX mostly on the OC headroom and Vram size.


I see. The 4GB HBM1 holds up well against 6GB GDDR5 according to the benchmarks. The most I was able to push my Fury X was 1180MHz, I believe, and that's not much compared to its Nvidia counterparts; I was able to hit 1125MHz changing only the core clock. However, I'm not really complaining about overclocking, since the performance is much better than I thought it would be at stock clocks.
Quote:


> That is mostly due to the changing gaming market(dx12/vulkan/win10/CF/SLI) and i want to see how much harder multi gpu is hit by end of year 2016


Don't expect much until developers drop DX11 support and start building solely around DX12. Vulkan multi-GPU support is coming soon I believe. Even Nvidia is banking on this, but I'm hoping devs won't support only two GPUs because Nvidia officially decided to stop supporting 3 and 4 way SLI. I'm hoping they will allow everything to scale according to hardware as most parallel programs do.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Posted in the wrong thread so reposting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Polaris 10 leaked 3DMark result is interesting...
> 
> http://www.3dmark.com/3dm11/11263084
> 
> 3DMark 11 Performance Graphics Score = 67DF:C7 = 18060
> 
> 3DMark 11 Performance Graphics Score = R9 Nano stock = 17763
> 
> 3DMark 11 Performance Graphics Score = R9 Nano with +50% = 18614
> 
> A card worth $200 reaching Nano performance already, welp. But I wonder if it holds its ground at 4K.
> 
> 
> Spoiler: Warning: Spoiler!
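For what it's worth, the deltas in the quoted 3DMark 11 graphics scores work out like this (a quick sketch using only the numbers quoted above):

```python
# Percent difference of each 3DMark 11 Performance graphics score
# vs a stock R9 Nano, using the figures quoted above.
nano_stock = 17763

scores = {
    "67DF:C7 leak": 18060,
    "Nano +50% power limit": 18614,
}

for name, score in scores.items():
    delta = (score - nano_stock) / nano_stock * 100
    print(f"{name}: {delta:+.1f}% vs stock Nano")
```

So the leaked part lands within a couple of percent of a stock Nano, which is what makes the rumored price point interesting.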


RX 480 is looking pretty good. I clocked my i5 4690K to 3.9GHz CPU/Cache and tested 3DM11 with a stock Fury X, then a 1135/545 OC Fury X. Next, with CPU/Cache at my daily 4.9/4.4, stock Fury X again, then 1135/545 OC Fury X.

At the moment eBay is running a £1 final fee promo; viewing recently ended listings, I could well make a small amount selling the Fury X. Very tempting to get an RX 480 and meddle with something new.

Only qualms I have really:-

a) like the quiet cooling on Fury X.
b) nice temps when OC'd for GPU/VRM.
c) no heat when GPU under load in case.
d) really like the look of the Fury X through side panel.

The other thing I'm considering is that the RX 480 8GB MSRP is $229, so I reckon it's gonna be close to £200 in the UK. Technically my Fury X cost me a bit more, but not vastly more; if the RX 480 blower is noisy when OC'd and I swap the cooler, I'd end up close to the Fury X purchase price. I'm not finding the 4GB of HBM limiting for my uses, and I'd be surprised if, when the RX 480 is actually using 8GB, it can hold a frame rate that would make the extra VRAM worthwhile. Which makes the RX 480 4GB @ $199 the card to have; all in all, the RX 480 is tempting me in a big way.

Really can't wait for more bench results on the RX 480; to me it seems like the RX 480 is gonna be a hit.


----------



## Flamingo

8GB HBM1



Core Clock 1150Mhz
Memory Clock 1250Mhz?

What the heck?

Also 67-*C8*


----------



## Kana-Maru

Quote:


> Originally Posted by *Flamingo*
> 
> 8GB HBM1
> 
> 
> 
> Core Clock 1150Mhz
> Memory Clock 1250Mhz?
> 
> What the heck?
> 
> Also 67-*C8*


You are joking right?

Clearly this man is full of jokes.


----------



## HyeVltg3

Where's a good place to get some updated benchmarks? I'm really thinking about upgrading to a Fury X instead of a 480/490, mainly because I'm currently on a 390; if I wait too long, the resale value of 390s will plummet when the 480 comes out... so gotta go fast!

All the benches I keep finding are from 2015, when the card was still "new" and the drivers were just not "perfect" for it.
Have people noticed any bottlenecks in games with the 4GB of HBM?

*Is there anything wrong with the ASUS Radeon R9 Fury X?*
It's priced at $569 CAD versus the other brands at $849 CAD; limited offer on Newegg until Wednesday.


----------



## Kana-Maru

Quote:


> Originally Posted by *HyeVltg3*
> 
> Wheres a good place to get some updated benchmarks, really thinking about upgrading to Fury X instead of 480/490, mainly because I'm currently on 390, if I wait too long the sell value of 390s will plummet when 480 comes out...so gotta go fast!
> 
> all the benches I keep finding are from 2015 when the card was still "new" and drivers were just not "perfect" for it.
> have people noticed any bottlenecks in games with the 4GB HBM?


Yeah, it is a problem when you need up-to-date benchmarks on a lot of games; the Fury X gets abandoned for some reason nowadays. I have updated benchmarks on my blog. I'm not sure what games you play, but I've benchmarked a lot of them.

All games at 100% max settings, even at 4K.

Latest Games
-Doom 2016 - 1440p - 4K
-Hitman DX12 - 1440p - 4K
-Rise of the Tomb Raider 1080p - 1440p - 4K [Controversy - Gameworks Disabled]

Other benchmarks 100% maxed at 4K:
-The Witcher 3
-Ryse: Son of Rome
-The Evil Within
-MGSV: The Phantom Pain
-Batman: Arkham Knight
-Unreal Tournament
*and a lot of other games.*

Unfortunately I can't post my blog on this site without getting flagged for advertising. It's OK if someone else posts it, but apparently I can't. So far I have been enjoying my Fury X, and I'm sure it will age well based on AMD's drivers and architecture.


----------



## HyeVltg3

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah it is a problem when you need up to date benchmarks on a lot of games. The Fury X gets abandoned for some reason nowadays. I have updated benchmarks on my blog. I'm not sure what games you play, but I've benchmarked a lot of games.
> 
> All games 100% max settings. The settings are 100% maxed even at 4K.
> 
> Latest Games
> -Doom 2016 - 1440p - 4K
> -Hitman DX12 - 1440p - 4K
> -Rise of the Tomb Raider 1080p - 1440p - 4K [Controversy - Gameworks Disabled]
> 
> Other benchmarks 100% maxed at 4K:
> -The Witcher 3
> -Ryse: Son of Rome
> -The Evil Within
> -MGSV: The Phantom Pain
> -Batman: Arkham Knight
> -Unreal Tournament
> *and a lot of other games.*
> 
> Unfortunately I can't post my blog on this site without getting flagged for advertising. It's ok if someone else post it, but apparently I can't. So far I have been enjoying my Fury X and i'm sure it will age well based on AMD drivers and architecture.


Looked at your sig... uh, is there any way to provide a search term instead of the link?
I play at 1440p.
Currently "most" demanding ones I play are
- Overwatch
- Witcher 3
- Rise of the Tomb Raider

Just reposting my question before it gets buried; it's quite urgent to know whether this is the old "coil whine" batch or just a retailer trying to clear overflowing stock.

*Is there anything wrong with the ASUS Radeon R9 Fury X?*
It's priced at $569 CAD versus the other brands at $849 CAD; limited offer on Newegg until Wednesday.


----------



## GruntXIII

Quote:


> Originally Posted by *HyeVltg3*
> 
> Wheres a good place to get some updated benchmarks, really thinking about upgrading to Fury X instead of 480/490, mainly because I'm currently on 390, if I wait too long the sell value of 390s will plummet when 480 comes out...so gotta go fast!
> 
> all the benches I keep finding are from 2015 when the card was still "new" and drivers were just not "perfect" for it.
> have people noticed any bottlenecks in games with the 4GB HBM?


Here you go:

http://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/Specials/Benchmark-Test-Video-1195464/2/

It's German, but a benchmark ain't hard to read ^^. You can choose the games at the top and the resolutions on the upper left side (it says "Benchmarks (1 von 3)").

They do bench a lot and compare against overclocked cards as well (you can find that one under "Spezialbenchmarks")


----------



## gupsterg

Quote:


> Originally Posted by *HyeVltg3*
> 
> *Is there anything wrong with the ASUS Radeon R9 Fury X ?*


All Fury X are the same ref PCB, cooler, etc.

The only differences are brand stickers, box, bundle, and the AIB's warranty T&Cs, but the card is made by the same OEM.


----------



## Kana-Maru

Quote:


> Originally Posted by *HyeVltg3*
> 
> looked at your sig...uh is there any way to provide a Search Term instead of the link?
> I play at 1440p.
> Currently "most" demanding ones I play are
> - Overwatch
> - Witcher 3
> - Rise of the Tomb Raider
> 
> Just reposting my Q before it gets buried quite urgent to know if its the old "coil whine" batch or its just a retailer trying to get rid of overflowing stock
> 
> *Is there anything wrong with the ASUS Radeon R9 Fury X ?*
> its priced at $569cad versus the other brands which are $849cad, Limited offer on Newegg till Wednesday.


Yeah, they're very picky about links in profiles here as well. You can Google my username and add "Fury X" to the end and I'm sure you'll see it. I don't have Overwatch yet. Fury X prices are getting lower since AMD is releasing their new GPUs and Nvidia Pascal is launching. I'm actually using the ASUS Radeon R9 Fury X and I have zero issues with it. It comes with a free 1-year GameCaster license, which is cool if you like streaming or recording your games. The GPU temps will spoil you, and you'll probably never go back to air blowers any time soon. My GPU temps are always in the low 40s. In my benchmarks I record the CPU and GPU temps as well; the GPU temps are always low, even on very warm days.

Quote:


> Originally Posted by *GruntXIII*
> 
> Here you go:
> 
> http://www.pcgameshardware.de/Nvidia-Geforce-Grafikkarte-255598/Specials/Benchmark-Test-Video-1195464/2/
> 
> It's german, but a benchmark ain't hard to read ^^. You can choose the games on the top and the resolutions on the upper left side (it says "Benchmarks (1 von 3)")
> 
> They do bench a lot and compare against overclocked cards as well (you can find that one under "Spezialbenchmarks")


Ugh..... PCGH. Their results are usually worse than mine, and they're running newer tech than I am. Plus, I perform all of my benchmarks at stock clocks. Based on what I've seen from them compared to several other sites, I wouldn't recommend that site for a fair Fury X comparison.


----------



## looncraz

Quote:


> Originally Posted by *Flamingo*
> 
> 8GB HBM1
> 
> Also 67-*C8*


The ID is legit, that's all I can say.

http://cateee.net/lkddb/web-lkddb/DRM_AMDGPU.html


----------



## Kana-Maru

Quote:


> Originally Posted by *looncraz*
> 
> The ID is legit, that's all I can say.
> 
> http://cateee.net/lkddb/web-lkddb/DRM_AMDGPU.html


Isn't that picture outdated and old as dirt though?


----------



## HyeVltg3

Quote:


> Originally Posted by *Kana-Maru*
> 
> Yeah they very picky about links in the profile as well here. You can google my username and add Fury X to the end and I'm sure you'll see it. I don't have Overwatch yet. The Fury X prices are getting lower since AMD is releasing their new GPUs and Nvidia Pascal is releasing. I'm actually using the ASUS Radeon R9 Fury X and I have 0 issues with it. It comes with a 1 year free GameCaster license which is cool if you like streaming or recording your games. The GPU temps will spoil you and you'll probably never go back to air blowers any time soon
> 
> 
> 
> 
> 
> 
> 
> . My GPU temps are always in the low 40s. In my benchmarks I record the CPU and GPU temps as well. The GPU temps are always low even on very warm days.


Oh geez, all the temps in the low 40s? Wow, I knew water cooling a GPU gave great temps; I'm at 77-82C on load in any game that pushes 100% usage. Seeing 40s is just shocking.
Great site; already ordered the Fury X, should arrive sometime Tues.-Wed.
Now I just need to figure out how to get DVI-D DL to my QNIX monitor from the HDMI/DP ports of the Fury X...
Quote:


> Originally Posted by *gupsterg*
> 
> All Fury X are the same ref PCB, cooler, etc.
> 
> Only difference is brand stickers, box, bundle, warranty t&c of AIB but card made by same OEM.


Thank you very much.
This is exactly what I wanted to hear. I was pacing about wondering if this is just pricing to compete with the upcoming price drops when the 480 launches, or if there's something wrong with the ASUS. Hopefully nothing, but the special discount is from "TECHOOL" via Newegg, so I was thinking maybe they found a bunch of the old batch (coil whining issue) and are selling it at $210 less than all the other "new batch" Fury Xs.
Just found it a bit fishy, but I really hope it's just a special discount, for discount's sake.

Still, thanks for that info; it just narrows down what to worry about. haha


----------



## nyk20z3

Quote:


> Originally Posted by *HyeVltg3*
> 
> Oh geez, all the temps are low 40s, wow I knew WC a GPU gave great temps, I'm at 77-82c on load in any game that pushes 100% usage. Seeing 40s is just shocking.
> Great site, already ordered the Fury X, should arrive sometime Tues. - Wed.
> Now I just need to figure out how to get DVI-D DL to my QNIX monitor from the HDMI/DP ports of the Fury X...
> Thank you very much.
> This is exactly what I wanted to hear, was pacing about wondering if this is just pricing to compete with upcoming price drops when 480 launches, or is there something wrong with the ASUS, well, hopefully nothing, because the special discount is from "TECHOOL" via Newegg, so was thinking maybe they found a bunch of the old batch (coil whining issue) and are selling it at $210 less than all other "new batch" Fury Xs.
> Just found it bit fishy, but I really hope its just a special discount, for discount sakes.
> 
> Still thank for that info, just narrows down what to worry about. haha


Yup, my Nano temps never go past 45C under water, and I'm only using a 240 rad with a 6700K in the loop at 4.5GHz.


----------



## Medusa666

Posting benchmarks for the Radeon Pro Duo in Firestrike and Valley Extreme HD, both are the free versions of the software.


----------



## xTesla1856

Just had that weird thing happen again where I get a black screen, audio stops, USB stops responding and both Furys go to 100% fan speed. Stays that way until I force kill my PC. I swear, if one of these cards is going bad again....


----------



## looncraz

Quote:


> Originally Posted by *Kana-Maru*
> 
> Isn't that picture outdated and old as dirt though?


Yes, it's a year old, but that really doesn't mean much - it takes years to design modern tech. We are just now hearing about fourth gen GCN and not a single product has been released using it, but fifth gen GCN (or GCN's replacement) is already likely years into its development cycle.

HBM took, IIRC, about six years from its initial design phase until a product came to market.


----------



## looncraz

Quote:


> Originally Posted by *xTesla1856*
> 
> Just had that weird thing happen again where I get a black screen, audio stops, USB stops responding and both Furys go to 100% fan speed. Stays that way until I force kill my PC. I swear, if one of these cards is going bad again....


I had a faulty power supply cause that for me. I replaced pretty much everything before I figured it out. Not sure what exactly was going on - if it was a spike, or one of the power rails went offline, but a new PSU and the problem never returned. It would happen at the weirdest of times, too, but only every few days, sometimes less, sometimes more often.


----------



## HyeVltg3

Quote:


> Originally Posted by *looncraz*
> 
> Yes, it's a year old, but that really doesn't mean much - it takes years to design modern tech. We are just now hearing about fourth gen GCN and not a single product has been released using it, but fifth gen GCN (or GCN's replacement) is already likely years into its development cycle.
> 
> HBM took, IIRC, about six years from its initial design phase until a product came to market.


Quote:


> Originally Posted by *looncraz*
> 
> I had a faulty power supply cause that for me. I replaced pretty much everything before I figured it out. Not sure what exactly was going on - if it was a spike, or one of the power rails went offline, but a new PSU and the problem never returned. It would happen at the weirdest of times, too, but only every few days, sometimes less, sometimes more often.


Before jumping on a new PSU, grab a PSU tester from Newegg or something; they're dirt cheap, far cheaper than a new PSU. Test the rails, easy-peasy. If it is the PSU, get a new one. If it isn't, you save the hassle of having to unplug/replug the PSU from all the connections (IMO a pain in the ass) just to test a "maybe it's a faulty/dying PSU" assumption.

Also try reinstalling drivers: use Display Driver Uninstaller (DDU), remove the drivers, and reinstall from AMD.

This sounds similar to what happened during my first escapades into 390 CF, already taxing a Corsair AX850 when the recommendation is 1050W. I was really grasping at solutions when it happened and really hoping it wasn't the PSU; a driver reinstall fixed everything.


----------



## Kana-Maru

Quote:


> Originally Posted by *HyeVltg3*
> 
> Oh geez, all the temps are low 40s, wow I knew WC a GPU gave great temps, I'm at 77-82c on load in any game that pushes 100% usage. Seeing 40s is just shocking.
> Great site, already ordered the Fury X, should arrive sometime Tues. - Wed.
> Now I just need to figure out how to get DVI-D DL to my QNIX monitor from the HDMI/DP ports of the Fury X...


I'm never going back to air for cooling. 77-82C? There's no way I want that heat dumping into my room. GPU temps are something that gets largely ignored for some reason, but the difference is nothing short of amazing: it keeps the ambient temps low as well as the case temps. It's awesome to finally see my GPU run at the same temperature as [sometimes below] an overclocked, water-cooled CPU.

I just checked my Fury X box and it comes with an HDMI to DVI-SL adapter. I personally went with DisplayPort since it offers great sound and image quality. What I like about DisplayPort over HDMI is that the connector actually "locks" into place - not that HDMI's output is lacking. It's much better than VGA/DVI screws or a plug with no lock; sometimes my HDMI cable falls out and I have a hard time plugging it back into the monitor.


----------



## looncraz

Quote:


> Originally Posted by *HyeVltg3*
> 
> Before jumping on a new PSU, grab a PSU tester from newegg or something, dirt cheap, cheaper than a new PSU. test the rails, ez-pz. IF it is the PSU, get a new one. IF it isnt. then you can save the hassle of having to unplug/plug the PSU to all the connections, imo I find this a pain in the ass, just to test if "maybe its a faulty/dying PSU"--assumption.
> 
> also try reinstalling drivers?
> use Display Driver Uninstaller (DDU), remove drivers, re install from AMD.
> 
> sounds similar to what happened during my first escapades into 390 CF. already taxing a Corsair AX850, when the recommended is 1050w. was really grasping at solutions when this happened, really was hoping it wasnt the PSU, Driver reinstall fixed everything.


I would second checking every possible thing out first, especially drivers, but a PSU tester will do little good. Most do little more than provide a light load and show an LED for the power rails you attach. That won't let you know if you are having a random moment of failure. An oscilloscope could certainly do so, but you will need to monitor every power rail during at least one such event and you would kind of need to know what you were looking for already in order to setup an event capture.

Of course, replacing a PSU in my system is rather painless, and my next case (a few months away) will make it even easier.


----------



## HyeVltg3

Quote:


> Originally Posted by *Kana-Maru*
> 
> I'm never going back to air for cooling. 77c-82c, there's no way I want that heat dumping into my room. The GPU temps are something that is largely ignored for some reason, but it definitely nothing short of amazing. It keeps the ambient temps low as well as the PC case temps low. It's awesome to finally see my GPU run at the same temperature [sometimes below] a overclocked water cooled CPU.
> 
> I just checked my Fury X box and it comes with an HDMI to DVI-SL connector. I personally went with Display Port since offers great sound and quality. What I like about display port over HDMI is that the DP offers great image output, not that HDMI doesn't, but the DP actually "locks" into place. This is much better than VGA\DVI screws and plug with no lock. Sometimes my HDMI cable falls out and I have a hard time plugging it into the monitor.


Ya, my 390 turns my room up 5-10C if I have my door closed, completely killing me in the summer; I don't mind the heating in the winter though.
How's the pump noise? I'm assuming you don't have, and/or didn't keep, a coil-whining first-batch Fury X.

Grabbing a DP to DVI-D DL VisionTek adapter off Amazon; I really hope it works. 33% of the reviews say it doesn't work at the rated 1600p, another 33% say it works and to disregard the ones saying it doesn't because they're trying with a DVI SL cable, and the last 33% say it still doesn't work with a DVI DL cable. Haha, no help at all. But it's cheap, and so far it's the only one I've found under $100 that claims to support resolutions higher than 1080p.

Need that DL (Dual Link); SL (Single Link) is only good for 1920x1200 and below.
The QNIX I have is a 1440p (2560x1440) monitor I bought to upgrade from 1080p; then I found the awesome deal on the Fury X, and I would really hate to have to go back to my 1080p BenQ all because of a port issue.
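For anyone wondering why single-link won't cut it: single-link DVI tops out at a 165MHz pixel clock (dual-link doubles that to 330MHz), and 2560x1440@60 needs more than 165MHz for the active pixels alone, before blanking intervals are even added. A rough sketch:

```python
# Why single-link DVI can't drive 2560x1440@60: the link tops out at a
# 165 MHz pixel clock, and 1440p60 exceeds that even counting only the
# active pixels. Real timings add roughly 10-20% blanking on top.
SINGLE_LINK_MHZ = 165
DUAL_LINK_MHZ = 330

def active_pixel_clock_mhz(width, height, refresh_hz):
    # active pixels only, no blanking - a lower bound on the real clock
    return width * height * refresh_hz / 1e6

for w, h in [(1920, 1200), (2560, 1440)]:
    clk = active_pixel_clock_mhz(w, h, 60)
    verdict = "fits" if clk <= SINGLE_LINK_MHZ else "exceeds"
    print(f"{w}x{h}@60: {clk:.0f} MHz active, {verdict} single-link")
```

This is why 1920x1200@60 is roughly the single-link ceiling while 1440p needs dual-link (still comfortably inside the 330MHz dual-link limit).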

I'm guessing you're not interested in VR, or haven't benched VR games (how you'd even do that, I have no idea)... unless?
Just wondering how the Fury X plays with VR; SteamVR rates it at Very High, same as the 980 Ti, albeit 9-10 vs 11 scores.
I tested my i7-4790K and 390 and got a 7 fidelity score.


----------



## Kana-Maru

Quote:


> Originally Posted by *HyeVltg3*
> 
> Ya my 390 turns my room up 5-10c if I have my door closed. completely killing me in the summer, I dont mind the heating in the winter though.
> How's the pump noise? I'm assuming you dont have and/or didnt keep the coil whinning first batch of Fury Xs.


What pump noise? I've turned everything else off and I haven't heard any noise, and my PC case is on top of my computer desk. The only noise comes from my Delta and Gentle Typhoon fans. The fan that comes with the Fury X is actually very quiet unless you run it at 100%, which will never happen.

Quote:


> Need that DL (Dual Link) SL, Single is only good for 1920x1200 and below.
> the QNIX I have is a 1440p (2560x1440) I bought to upgrade from 1080p, then found the awesome deal on the Fury X, I would really hate to have to go back to my 1080p benq all because of a port issue.


Yeah, I know. You'll definitely want the DVI-D DL for sure; 1920x1200 is simply not enough, and you'll be @ 1440p for nearly all of your games anyway. I simply went with DisplayPort for 4K @ 60Hz, and because the older tech [VGA/DVI, etc.] is being replaced by DisplayPort.
Quote:


> I'm guessing you're not interested in VR or havent benched (how you do it, I have no idea) VR Games....unless?
> just wondering how the Fury X plays with VR, SteamVR rates it at Very High, same as the 980TI albeit 9-10 vs 11 scores.
> tested my i7-4790k and 390 and got a 7 score in Fidelity.


I like VR, but I don't want to pay an arm and a leg for it. In SteamVR, here are my results from February 2016:

*Stock Fury X = GPU Core: 1050MHz & HBM: 500MHz*

*4GHz CPU, DDR3-1400* = 9.3 - 8471
*4.8GHz CPU, DDR3-1675* = 9.6 - 8955

Now the overclock settings and results.

*Overclocked Fury X = GPU Core: 1100MHz & HBM: 550MHz* [a minor +50MHz on core and HBM]
*4GHz CPU, DDR3-1400* = 9.8 - 9199

*Overclocked Fury X = GPU Core: 1125MHz & HBM: 550MHz* [+75MHz on core, +50MHz on HBM]
*4GHz CPU, DDR3-1400* = 9.9 - 9295
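A quick sketch of what those runs work out to as percentage gains over the stock 4GHz baseline (scores exactly as listed above):

```python
# Percent gain of each SteamVR run relative to the stock-clock,
# 4 GHz CPU baseline score of 8471.
baseline = 8471

runs = {
    "4.8GHz CPU, stock Fury X": 8955,
    "4GHz CPU, 1100/550 Fury X": 9199,
    "4GHz CPU, 1125/550 Fury X": 9295,
}

for name, score in runs.items():
    gain = (score - baseline) / baseline * 100
    print(f"{name}: +{gain:.1f}%")
```

Interesting that the CPU/RAM bump alone and the GPU overclock each move the score by mid-to-high single-digit percentages.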

I'm not using the most up-to-date platform, so obviously the scores can increase with Ivy/Haswell/Broadwell and newer tech. For what it's worth, I think the Fury X did very well in the VR test.


----------



## gupsterg

Quote:


> Originally Posted by *HyeVltg3*
> 
> Ya my 390 turns my room up 5-10c if I have my door closed. completely killing me in the summer, I dont mind the heating in the winter though.


The Fury X rad will blow hot air; holding my hand close to it, the air is no cooler than when my Vapor-X 290X was installed in the rig.

The only difference: the Vapor-X dumped air into the case, so the mobo, CPU, etc. ran warmer until the case fans vented it. Since the Fury X is not dumping hot air into the case, I see ~5C lower mobo, CPU, etc. temps vs the Vapor-X.

No worries on the Fury X brand info.

Another idea is perhaps to sell the 390 now and maybe get an RX 480, etc. I would think they are gonna be blowing out cooler air than the 2xx/3xx/Fury series. TBH, the RX 480 may just work out as a cooler, lower-power sideways upgrade.


----------



## HyeVltg3

I made a reddit thread yesterday to gain more insight on the success of the Fury X and was really surprised by the outcome.

https://www.reddit.com/r/4nskst/its_mid2016_now_should_i_get_fury_x_or_980ti_for/

Quote:


> Originally Posted by *gupsterg*
> 
> Another idea is perhaps to sell the 390 now and maybe get an RX 480, etc. I would think they are gonna be blowing out cooler air than the 2xx/3xx/Fury series. TBH, the RX 480 may just work out as a cooler, lower-power sideways upgrade.


The RX 480 looks like a great card, but it's not meant for someone like me haha; it's aimed at the mainstream, giving casual gamers a very cheap option to play mid-to-high-end games.
Sure, the AMD demonstration was great, but it needed to be CrossFired to gain that "lead" over the 1080.
Also, I've never bought a card that just launched; I'll give it a few months before that happens.
I only grabbed my 390 sometime last summer.


----------



## Flamingo

Quote:


> Why are people claiming the card can't OC when I can clearly see people with higher than 1050MHz clocks, unless it's unstable for 24/7? Are the clocks good enough for 24/7? Was it easy to "Custom ROM"? (going to look that up after I install the card this week)


Average overclocks of 1136MHz reported at HWBOT - that's an 8% overclock. Compared to the Nvidia 980 Ti series (~28% on air, 32% on water), it's definitely not impressive and certainly no overclocker's dream.
Quote:


> Is the 4GB really that limiting, I asked in the reddit thread but didnt really get a response, well got one saying the 4GB vram is bad, I thought the whole point of HBM was it wasnt like GDDR5, where if your game used more vram than your card had, you could clearly see how bad it can be with fps drops...but with the HBM you'd never hit the max because it was moving through the saved frames much faster because of the memory bandwidth, unless my understanding of HBM is way off.


Games have to be optimized by AMD's driver department to remove unnecessary stuff to keep the VRAM from limiting or affecting performance. As of today, Mirror's Edge Catalyst is the only game that refuses to run on the Fury series at max settings. Unless AMD can change that, it's a sign of things to come.
Quote:


> I did some more reading after the thread and found that thanks to some Windows 10 update the AMD cards got a huge boost in gaming and the fury X became 6-7% better than a reference 980ti. I really should have asked "Fury X vs 490" as those two cards are the only cards I feel like upgrading to. I really dont want to go back to overpriced Nvidia.


Got a source on that? I'd like to read it too.


----------



## HyeVltg3

Quote:


> Originally Posted by *Flamingo*
> 
> Average overclocks of 1136Mhz reported at hwbot - thats a 8% overclock. Compare that to the nvidia 980 Ti (~28% on air, 32% on water) series, its definitely not impressive and certainly no overclockers dream.
> Games have to be optimized by the driver dept of AMD to remove unnecessary stuff to keep the VRAM from limiting or affecting performance. As of today, Mirrors Edge is the only game that refuses to run on Fury series on the max settings. Unless AMD can change that, its a sign of things to come.
> got a source on that? I'd like to read it too.


http://wccftech.com/amd-r9-fury-x-performance-ahead-nvidia-980-ti-latest-drivers/
http://www.overclock.net/t/1578881/fury-x-is-now-just-as-fast-as-gtx-980ti-in-1080p-1440p-and-faster-in-4k (same as above)




http://www.gamespot.com/forums/system-wars-314159282/i-have-joined-the-dark-side-and-fury-x-now-dominat-32816885/

Not all reputable sources, but results.

So basically no need to worry about the 4GB VRAM "issue"? Or are you saying it's an issue, just not for today's games? (Other than GTA V, I don't know of one that goes over 4GB of VRAM usage.)

Is the 8% noticeable, or is it just for bragging rights?


----------



## Kana-Maru

Quote:


> Originally Posted by *HyeVltg3*
> 
> I made a reddit thread yesterday to gain more insight on the success of the Fury X and was really surprised by the outcome.
> 
> Its brought up some questions I have about the Fury X now:
> 
> 
> Why are people claiming the card cant OC when I can clearly see people with higher than 1050mhz clocks, unless its unstable for 24/7? are the clocks good enough for 24/7? was it easy to "Custom ROM" (going to look that up after I install the card this week).
> Is the 4GB really that limiting, I asked in the reddit thread but didnt really get a response, well got one saying the 4GB vram is bad, I thought the whole point of HBM was it wasnt like GDDR5, where if your game used more vram than your card had, you could clearly see how bad it can be with fps drops...but with the HBM you'd never hit the max because it was moving through the saved frames much faster because of the memory bandwidth, unless my understanding of HBM is way off.
> I did some more reading after the thread and found that thanks to some Windows 10 update the AMD cards got a huge boost in gaming and the fury X became 6-7% better than a reference 980ti. I really should have asked "Fury X vs 490" as those two cards are the only cards I feel like upgrading to. I really dont want to go back to overpriced Nvidia.
> The RX 480 looks like a great card, but its not meant for someone like me haha, its aimed at mainstream and giving casual gamers a very cheap option to play mid-high end games.
> sure the AMD demonstration was great but it needed to be crossfire'd to gain that "lead" over the 1080.
> also, I've never bought a card that just launched, I'll give it a few months before that happens.
> I only grabbed my 390 some time last summer.


1.) People are comparing the card to the GTX brand, but they always forget about the Radeon "architecture". Nvidia's selling point is overclocking, which debunks a lot of Nvidia fans' "complaints" over the years. The biggest complaints about AMD GPUs were the "heat" and the wattage usage. First of all, enthusiast gamers don't give a rat's butt about wattage or power usage, hence the extremely high PSU outputs [700W - 1400W, and some people have dual radiators]. The high wattage didn't really add anything worth noting to the electric bill anyway. Secondly, these complaints are nothing short of bias meant to lead someone from one brand to the other [basically, complain as much as possible about AMD].

AMD has taken care of the heat issues and the power usage is much lower. Overclock headroom is a fair complaint, but not when your two major complaints were power usage and HEAT. Now all of a sudden power usage and heat don't matter when it's Nvidia GPUs, smh. This made all of those who complained about AMD GPUs hypocrites. So now that those two complaints can't be used against AMD GPUs, people look at overclocking, while completely leaving out the "architecture" and the driver performance increases. DX11 is old tech, and GTX users don't want to grasp the new tech since Nvidia can't properly support some features that are widely used. So while you can overclock the Fury X, just not as high as a GTX, that's not the entire story. Just to show you why architecture matters in the long run, check this link [GTX 1080 vs Radeon Fury [non-X]]:

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-pro-duo-fiji-owners-club/8340#post_25146804
[the actual difference was 2.18% after re-calculating]

2.) I think AMD's platform will address the benefits of HBM memory better - well, I hope it does, since there are so many things they can do with the tech. You've seen my benchmarks; HBM hasn't been an issue since Day 1, but time after time people continue to make it one. See, I can't curse on OCN like on other sites, so I have to keep it clean, because this really gets me upset. If you look at the benchmarks, you'll see the Fury X actually beats the 6GB competitors in some games at high resolutions, or basically gives the same performance. DX12/Vulkan will allow the architecture to perform even better [check my Hitman 2016 Day 1 vs patch performance]. Obviously you will need a decent PC setup to get all the benefits of high-end gaming. I expected micro-stuttering and performance issues, and @ 4K I have not had any issues like that at all. People can complain about 4GB of HBM all day, but when you look at the benchmarks you'll always see the Fury X right there with the Titan X/980 Ti. There's some info I'd like to post, but I can't post it here due to its length. I'll send you a PM.

3.) Yeah, I was with Nvidia for 5 years and got so tired of upgrading regularly that I went with the SLI setup. SLI isn't what it used to be, and Nvidia has some serious issues that I had to consider. The 970 3.5GB issue was a bullet I dodged, and if I'd gone GTX 980 I'd probably be looking to upgrade again. So far I think I made the best decision instead of going GTX 980 Ti, which would've cost $100 more for the 980 Ti I wanted over the Fury X. The 980 Ti isn't a bad card, but the prices were simply ridiculous. $649.99 - $1049.99; simply disgusting that the prices surpassed the Titan X. Nvidia gets away with so much that I don't see them being reasonable anytime soon.

Quote:


> Is the 8% noticeable or just only meant for bragging rights.


You'll get a boost in performance for sure. You'll probably want to look through the thread for some tips about overclocking the Fury / Fury X though. Some settings can diminish performance while others can increase it. All of the benchmarks you see me running are 99% stock clocks.


----------



## Flamingo

Quote:


> Originally Posted by *HyeVltg3*
> 
> http://wccftech.com/amd-r9-fury-x-performance-ahead-nvidia-980-ti-latest-drivers/
> http://www.overclock.net/t/1578881/fury-x-is-now-just-as-fast-as-gtx-980ti-in-1080p-1440p-and-faster-in-4k (same as above)
> 
> 
> 
> 
> http://www.gamespot.com/forums/system-wars-314159282/i-have-joined-the-dark-side-and-fury-x-now-dominat-32816885/
> 
> not all reputable sources. but results.
> 
> So basically no need to worry about the 4GB vram "issue" or are you saying its an issue but not for today's games (other than GTAV I dont know of one that goes over 4GB of vram usage)
> 
> Is the 8% noticeable or just only meant for bragging rights.


Thanks for the links.

The 4GB VRAM issue remains to be seen. Whether AMD chooses to work on, or lets go of, the Fury X not being able to run Hyper mode in Mirror's Edge Catalyst, it could very well become a real problem in future games (especially those backed by Nvidia). Mirror's Edge Catalyst is a special case; I feel like the devs will not be in a position to help AMD because of the console-like "VRAM limit adjust" feature that has been included in the PC version. But we have to wait and see.

As for the overclock part, it's not only about bragging rights, but ultimately about being able to squeeze out more performance and getting the feeling of getting more out of your purchase. 8% might not be that noticeable and nothing to brag about, really. At 4K resolutions, I don't know how noticeable it might be, but a 28-30% OC will definitely be more noticeable. Not to mention aftermarket cards that come overclocked and become standard in benchmarks; that blurs actual performance comparisons as well.


----------



## gupsterg

@HyeVltg3

I have spent a lot of my ownership time OC'ing / bios modding, and I have also owned 7 Fiji cards (got 3 at present).



Spoiler: My cards



1st Sapphire Fury Tri-X STD edition, stock clock 1000MHz, 3584SP. This card had a VID of 1.243V; it unlocked to 3840SP, exactly between Fury / X, and benched the same as a Fury X. Adding up to +50mV VID resulted in no OC gain; max was 1090MHz / 525MHz @ stock VID/MVDDC (please view later links in this post regarding HBM clocking, etc.).

2nd Sapphire Fury X: VID of 1.250V; reached 1090/525, and adding up to +50mV resulted in no OC gain.

3rd Sapphire Fury X: VID of 1.212V; reached 1135/545 by adding ~+31mV VID. The GPU will OC (+scale) a bit more but is not thoroughly stable (stability testing/thoughts later in this post).

4th MSI Fury X: VID of 1.187V; reached 1130/540 by adding ~+38mV VID. Within a week it began to not hold the OC; the max then was ~1110MHz.

5th Sapphire Fury X: VID of 1.193V; reached 1100MHz by adding ~+38mV VID. I did not OC the HBM as I'd started a little investigation about it.

6th Sapphire Fury X: VID of 1.231V; failed to pass Folding@home at stock clocks, so it was pulled from the rig and may be checked again.

7th Sapphire Fury X: VID of 1.231V; passed testing at 1100MHz with stock VID, while 1115MHz with +25mV is failing; testing yet to complete.

All cards used the same PowerLimit in ROM, stock coolers, TIM and pads. The Fury X cards (Fiji cards 3 - 7) even used the same modded ROM, and I stuck to the same driver (WHQL 16.3.2).

Stability testing consists of 1hr each of 3DM FS / Heaven / Valley plus a minimum 12hr Folding@home run with no GPU "bad state" to count as a stable OC, plus some general gaming. At times these cards have been run up to 24hrs of continuous Folding@home and then 3D loads.

In 3DM FS, % clock gain = ~% performance gain; only when excess voltage is applied will Fiji not scale 1:1 from an OC.
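gupsterg's 1:1 rule of thumb can be sanity-checked with a few lines of Python. The score figures below are made up purely for illustration; only the clocks come from the thread:

```python
# Rough sanity check of the observation that, below the voltage wall,
# Fiji's 3DMark Fire Strike score scales roughly 1:1 with core clock.
# The 15000-point stock score is a hypothetical number, not a measurement.

STOCK_CLOCK = 1050  # Fury X reference core clock, MHz

def expected_score(stock_score, oc_clock, scaling=1.0):
    """Predict a benchmark score from a core overclock.

    scaling=1.0 models perfect 1:1 scaling; real cards fall below 1.0
    once excess voltage (and the resulting throttling) comes into play.
    """
    clock_gain = (oc_clock - STOCK_CLOCK) / STOCK_CLOCK
    return stock_score * (1 + clock_gain * scaling)

# e.g. a hypothetical 15000-point stock run, overclocked to 1135 MHz:
print(expected_score(15000, 1135))       # ~8% clock gain -> ~8% score gain
print(expected_score(15000, 1135, 0.8))  # less if scaling drops off
```

Plugging in the 8% OC discussed earlier shows why it is measurable but hardly transformative.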



Like Flamingo said, @ 4K you may as well forget about scaling from an OC; IMO it aids 1080P more than 1440P, but you still get some benefit in the latter res vs 4K.

Earlier I said I will later state about HBM OC'ing, please view these 2 posts:-

i) Fiji Memory controller / HBM clock stepping
ii) Performance scaling with HBM OC.

Since getting Fiji it has bugged me why 4096SP is not killing benches. I know little of GPU architecture, so I started reading "stuff" and came to think it was a lack of ROPs. Then a more learned member, The Stilt, gave the same view (he is a pro overclocker with good AMD knowledge). I then did searches relating to this and have noted posts by others about the ROP count on this site and others.

All in all, Fury/X has been a good "out of box" experience for me, but be aware I have not been paying UK online prices for these cards.

The OC ability has not knocked me out. Considering how many cards I've had, it's a bit disappointing not to find a stellar card. HBM performance scaling at 545MHz is nothing at all to post about, TBH.

IMO if I had a 390/X I'd be holding out till Vega; if power usage/heat is an issue with the 390/X, then RX 480. Yes, it's mid range, but it's coming across as 390-to-Fury level performance. Selling that when Vega arrives will probably be less of a loss of money than a 390/X or Fury/X. I'm in the lucky position that I paid so little for my Fury X that even if I sell when Vega hits I reckon I may lose no money, but I'm still thinking about selling now and getting an RX 480 to tide me over till Vega.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> I'm never going back to air for cooling. 77c-82c, there's no way I want that heat dumping into my room.


You do realize that the heat generated is identical regardless of the cooling mechanism you use, and that that heat will end up in your room one way or the other?


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> You do realize that the heat generated is identical regardless of the cooling mechanism you use, and that that heat will end up in your room one way or the other?


It's much better than air blowers dumping heat into the room. I've had my PC in the same place for many years. With my single GTX GPUs the room was very warm, and with my dual GTX GPUs no heating was needed in the room during the winter months.

I've noticed that my room doesn't get as hot as it used to when I was running fan blowers. Even my gf noticed the room doesn't get as hot as it used to. I can actually close the door and breathe now.


----------



## gupsterg

Quote:


> Originally Posted by *toncij*
> 
> You do realize that the heat generated is identical regardless of the cooling mechanism you use, and that that heat will end up in your room one way or the other?


I concur, when both rigs been folding with Fury X the room feels as warm as when Hawaii cards with HSF were used.


----------



## toncij

Quote:


> Originally Posted by *Kana-Maru*
> 
> It's much better than air blowers dumping heat into the room. I've had my PC in the same place for many years. With my single GTX GPUs the room was very warm, and with my dual GTX GPUs no heating was needed in the room during the winter months.
> 
> I've noticed that my room doesn't get as hot as it used to when I was running fan blowers. Even my gf noticed the room doesn't get as hot as it used to. I can actually close the door and breathe now.


It has nothing to do with the type of cooler. A card generates a certain amount of heat. Whether you dissipate it into the case, heating the surroundings slowly in all directions, or outside of it, faster and in one direction, doesn't change the amount of heat generated.


----------



## Kana-Maru

Quote:


> Originally Posted by *toncij*
> 
> It has nothing to do with the type of cooler. A card generates a certain amount of heat. Whether you dissipate it into the case, heating the surroundings slowly in all directions, or outside of it, faster and in one direction, doesn't change the amount of heat generated.


I've stated *my* experience. If you don't like what I have to say I completely understand. I noticed a difference in my room ambient temp, especially during early Spring, Fall and Winter months.


----------



## battleaxe

Quote:


> Originally Posted by *toncij*
> 
> It has nothing to do with the type of cooler. A card generates a certain amount of heat. Whether you dissipate it into the case, heating the surroundings slowly in all directions, or outside of it, faster and in one direction, doesn't change the amount of heat generated.


I agree. But when a certain clock speed needs less voltage as a result of running cooler, then it will dump less heat into the room, correct?

Less heat, less increase in resistance, less voltage, then less heat produced. So cooler is better, as it allows the same clock at a lower temp overall and then produces less heat, since the voltage needed to sustain those clocks is lower. I've seen this on my pair of 390X cards, and I assume it works the same on the Fury series and all other cards too. Even Nvidia cards will use less voltage at a given overclock when run cooler. So I think when you can drop temps the cards really are producing less heat overall.


----------



## gupsterg

Fury / X / Nano are not too far off Hawaii/Grenada on stock VID.

For example lowest LeakageID ASIC (= higher stock VID) on Hawaii XT is default VID DPM 7 1.287V, Grenada XT 1.281V and Fiji is 1.250V.

Bear in mind this is VID for all cases, so VDDC is lower due to LLC, etc.


----------



## battleaxe

Quote:


> Originally Posted by *gupsterg*
> 
> Fury / X / Nano are not too far off on stock VID from Hawaii/Grenada.
> 
> For example lowest LeakageID ASIC on Hawaii XT is default VID DPM 7 1.287V, Grenada XT 1.281V and Fiji is 1.250V.
> 
> Bear in mind this is VID for all cases, so VDDC is lower due to LLC, etc.


I think we are talking about two different things though.

I am speaking about any GPU. Take a GPU that needs, say, 1.3V to be stable: when the temps are reduced by, say, 25C, less voltage is often needed to hold the same clocks. So in this case you can use less voltage for the same MHz in an OC. The reason is that lower heat means lower resistance in all circuits and in the core itself.

Therefore less heat will be generated (overall) as a result. Or, theoretically, you can push the OC higher, which is what most of us around here do with the added headroom. But nonetheless: lower heat, less voltage used, less heat generated. I'm not a rocket scientist, but I know a bit about resistance and what it does. In essence, there is less insertion loss from the voltage circuit to the output.

= less heat
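battleaxe's chain (cooler silicon → lower voltage for the same clock → less power drawn) is consistent with the standard CMOS dynamic-power relation, P ≈ C·V²·f. A toy sketch with placeholder voltages, not measured Fiji values:

```python
# Illustrative only: CMOS dynamic power scales as P ~ C * V^2 * f, so
# holding the clock (f) constant while dropping voltage cuts power
# quadratically. Capacitance C here is an arbitrary placeholder.

def dynamic_power(voltage, clock_mhz, c=1.0):
    """Relative dynamic power of a switching load (arbitrary units)."""
    return c * voltage ** 2 * clock_mhz

hot = dynamic_power(1.300, 1100)   # voltage needed when running hot
cool = dynamic_power(1.250, 1100)  # same clock, cooler card, lower VID

print(f"power saved by the 50 mV drop: {(1 - cool / hot) * 100:.1f}%")
```

Even a modest 50 mV undervolt trims power by several percent at the same clock, which is the effect undervolting Fiji owners in this thread report.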


----------



## gupsterg

Kana-Maru uses his Fury X mainly at stock from how his posts came across to me (I could be wrong).
Quote:


> Originally Posted by *Kana-Maru*
> 
> The most I was able to push my Fury X was 1180Mhz I believe and that's not much if you compare to Nvidia counterparts. *I was able to hit 1125Mhz with only changing the core clock.* However, I'm not really complaining about overclocking since the performance is much better than I thought it would be at stock clocks.


Quote:


> Originally Posted by *Kana-Maru*
> 
> Plus I perform all of my benchmarks at stock clocks. Based on what I've seen from them compared to several other sites I wouldn't recommend that site for a fair Fury X comparison.


Quote:


> Originally Posted by *Kana-Maru*
> 
> All of the benchmarks you see me running are 99% stock clocks.


So I was conveying "out of box" / "stock" scenario, to show HSF vs AIO in a way with max stock VID.

View this Crysis 3 data for my Vapor-X 290X: the VID for that OC/ROM is 1.30V but MAX VDDC = 1.218V under Crysis 3 load; due to EVV, for a like DPM state the card will run at differing VID/VDDC.



Next Fury X, is at 1.243V in ROM, MAX VDDC is 1.212V, again EVV rules apply.



Now, Vapor-X GPU VRM watts = 260W, Fury X GPU VRM watts = 303W.

Yep, the Fury X runs cooler on the AIO, but that is because the cooler is better at removing the heat, so I'm not surprised the exhaust air off the 120mm rad feels as hot or hotter than the Vapor-X dumping air in my case, with warmth noticeable at the case fan exhaust.


----------



## diggiddi

Quote:


> Originally Posted by *Kana-Maru*
> 
> I'm never going back to air for cooling. 77c-82c, *there's no way I want that heat dumping into my room*. The GPU temps are something that is largely ignored for some reason, but it's definitely nothing short of amazing. It keeps the ambient temps low as well as the PC case temps. It's awesome to finally see my GPU run at the same temperature [sometimes below] as an overclocked water cooled CPU.
> 
> I just checked my Fury X box and it comes with an HDMI to DVI-SL connector. I personally went with DisplayPort since it offers great sound and quality. What I like about DisplayPort over HDMI is that, while HDMI's image output is also great, the DP connector actually "locks" into place. This is much better than VGA/DVI screws or a plug with no lock. Sometimes my HDMI cable falls out and I have a hard time plugging it into the monitor.


Umm, so where does the heat go???


----------



## bluezone

Quote:


> Originally Posted by *diggiddi*
> 
> Umm, so where does the heat go???


Lower temps from liquid cooling can lead to lower current leakage. Less leakage = less heat, to a degree. Less voltage to maintain clocks as well.
When you're not pushing heat into the PC case, other components are under less thermal stress, which leads to less current used as well.

The exact opposite of this is an old silicon component problem: thermal runaway. Old audio amps built in "the good old days" used discrete single output transistors, and this condition was catastrophic: high temps led to higher current flow, which led to greater heating, etc. End result, magic blue smoke. Very expensive when you're building your own equipment. TO3 3055s were not cheap.
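The feedback loop bluezone describes can be sketched numerically. Every constant below is made up for illustration (the "leakage doubles every 25 °C" growth rate is only a common rule of thumb, not a Fiji datasheet value):

```python
# Toy model of thermal-runaway feedback: leakage power grows roughly
# exponentially with die temperature (here, doubling every 25 C), and
# temperature in turn rises with total power. With a good cooler the
# loop settles; with a weak one it runs away. Constants are illustrative.

def settle(dynamic_w, ambient_c, c_per_watt, leak_w=10.0, steps=50):
    """Iterate the power <-> temperature feedback until it settles.

    c_per_watt is the cooler's thermal resistance (degrees of die rise
    per watt dissipated); leak_w is leakage power at 25 C.
    """
    temp = ambient_c
    for _ in range(steps):
        total = dynamic_w + leak_w * 2 ** ((temp - 25) / 25)
        temp = ambient_c + total * c_per_watt
        if temp > 150:            # far past any realistic throttle point
            return None           # thermal runaway
    return round(temp, 1)

print(settle(250, 25, 0.10))      # strong cooler: converges, low leakage
print(settle(250, 25, 0.40))      # weak cooler: None (runs away)
```

The same 250 W of dynamic load either settles at a mild temperature or spirals, depending only on how fast the cooler pulls heat out, which is exactly the old audio-amp failure mode in miniature.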


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> Kana-Maru uses his Fury X mainly at stock from how his posts came across to me (I could be wrong).
> 
> So I was conveying "out of box" / "stock" scenario, to show HSF vs AIO in a way with max stock VID.
> 
> View this Crysis 3 data for my Vapor-X 290X VID for that OC/ROM is 1.30V but MAX VDDC = 1.218V for Crysis 3 load, due to EVV for like DPM state card will run at differing VID/VDDC.
> 
> 
> Next Fury X, is at 1.243V in ROM, MAX VDDC is 1.212V, again EVV rules apply.
> 
> 
> Now Vapor-X GPU VRM watts = 260W , Fury X GPU VRM watts = 303W.
> 
> Yep Fury X is cooler running on AIO but the cooler is better at removing the heat, so I'm not surprised the exhaust air off the 120mm rad feels as hot or hotter than the Vapor-X dumping air in my case and noticing warmth at case fan exhaust.


My goodness, dude, calm down. First of all, it isn't THAT serious. Secondly, the GPU behaves differently when I'm playing games rather than benchmarking them. There is a "Power Efficiency" mode that allows you to enjoy games without always running at max clock, for starters. I always use this setting unless I want constant clocks. Not every game needs the core clock running at max either.

Then there is FRTC, which can also help lower the power usage and core clocks for those who use it. Then you have some people who undervolt the GPU, and that has been shown to actually increase performance with some Fiji cards. Now enjoy life.

Quote:


> Originally Posted by *diggiddi*
> 
> Umm, so where does the heat go???


C'mon man. Please don't be that guy. Don't take words out of context.


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> My goodness, dude, calm down. First of all, it isn't THAT serious. Secondly, the GPU behaves differently when I'm playing games rather than benchmarking them. There is a "Power Efficiency" mode that allows you to enjoy games without always running at max clock, for starters. I always use this setting unless I want constant clocks. Not every game needs the core clock running at max either.
> 
> Then there is FRTC, which can also help lower the power usage and core clocks for those who use it. Then you have some people who undervolt the GPU, and that has been shown to actually increase performance with some Fiji cards. Now enjoy life.


I'm calm.

Never said it was THAT serious.

"Power Effciency" does not work always that way







(generally speaking), when I my posted Crysis 3 Fury X HWiNFO screenie (an "off the cuff" test) I looked at the average GPU clock and thought wow my OC is throttling so last night I ran Crysis 3 with MSI AB graphing GPU clock. This was with "Power Efficency" ON







, if you read the "PowerTune" whitepaper PDF it aims to stick to max clock.

Crysis3_Fury_X_1135_545.zip 12k .zip file


For example PE OFF makes a difference to Heaven/Valley but not 3DM13 and again GPU clock will not be 75% less with PE On in Heaven/Valley. I was recently helping a member with a Nano to tweak ROM, that card does not have "Power Efficiency" toggle in driver page.



Spoiler: Nano / Fury X PE compare

So in the above compare, when the Nano is at loading screens there is clock bounce, while on my Fury X with PE ON it's more well behaved. Even when that clock bounce occurs on the Nano, GPU usage is very low, close to 0%, so I would think the bounce is just showing a clock change; since the GPU is not showing usage, it would not be as high a power load as when it is. So even if the Nano had PE, it is questionable how much power it would save when clock bounce occurs with no/low GPU usage.

FRTC I can understand would help, but it would also help on any other AMD card with an HSF.

The only reason I have stated this info is that a member wishing to purchase a Fury X pointed out he feels the 390 adds heat to the room; I did not want him to think the Fury X is so much cooler that it will not do the same. This is similar to how you would do an article on your site to help viewers be better informed.


----------



## Alastair

If I can add my two cents to the ongoing discussion.

A reference 290X at, let's say, 275 watts is doing 90C. That blower is pumping hot, near-90C air out the back of the chassis.

A reference Fury X doing over 300 watts has a much more efficient cooler design. It cools that 300 watts more effectively and the core only reaches, let's say, 50 degrees. Let's say the exhaust temp of the air coming out of the radiator is 40C.

So although one card is dissipating more heat in watts, the other card is seemingly dumping hotter air into the room due to its less efficient cooling.


----------



## gupsterg

Interesting point







, +rep .


----------



## Orthello

Quote:


> Originally Posted by *Alastair*
> 
> If I can add my two cents to the ongoing discussion.
> 
> A reference 290X at let's say 275 watts is doing 90C. That blower is pumping the hot near 90C air out the back of the chassis.
> 
> A reference FuryX doing over 300 watts has a much more efficient cooler design. It cools that 300 watts more effectively and the core is only reaching let's say 50 degrees. Let's say the exhaust temp of the air coming out of the radiator is 40C.
> 
> Although the one card is DISSIPATING more heat. The one card is still dumping hotter air into the room due to its less efficient cooling.
> 
> So while the one card is dissipating more in watts. The other card is seemingly dumping hotter air into the room.


The heat eventually has to go somewhere; the energy is not disappearing. However, liquid can hold a lot more heat energy than air for every degree it rises in temperature: roughly 4x more energy per degree, per unit mass.

The liquid itself is heat storage, much more effective than storing heat in air. E.g., raising a litre of water by 1C takes far more energy than raising a litre of air by 1C. Couple that with being able to cool the die surface better, and a far greater metal/air ratio in a radiator vs a heatsink and fan, and you get the more efficient cooler.

The Fury X without watercooling would have been very hot due to HBM and ~9 billion transistors in a small area, raising its TDP due to leakage. Attempting to air cool it could have produced a 350-watt card, and most likely AMD would have reduced clocks/voltage if they had had to stick to air cooling.
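Orthello's "roughly 4x" figure is the per-kilogram specific-heat ratio; per litre the gap is far larger because air is so much less dense. A quick check using textbook constants (the only assumptions here are the standard handbook values in the comments):

```python
# Energy needed to raise 1 litre of coolant vs 1 litre of air by 1 C.
# Textbook values: water cp ~4186 J/(kg*K), air cp ~1005 J/(kg*K),
# water density ~1.0 kg/L, air density ~0.0012 kg/L at room temperature.

def joules_per_litre_per_degree(cp_j_per_kg_k, density_kg_per_l):
    """Volumetric heat capacity: J to warm one litre by one degree."""
    return cp_j_per_kg_k * density_kg_per_l

water = joules_per_litre_per_degree(4186, 1.0)
air = joules_per_litre_per_degree(1005, 0.0012)

print(f"water: {water:.0f} J/L/C, air: {air:.2f} J/L/C "
      f"(~{water / air:.0f}x per litre)")
```

Per kilogram the ratio is 4186/1005 ≈ 4.2x, matching the "roughly 4x" figure; per litre, water buffers on the order of 3500x more heat than air, which is why an AIO loop smooths out load spikes so well.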


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> I'm calm.
> 
> Never said it was THAT serious.
> 
> "Power Efficiency" does not always work that way (generally speaking). When I posted my Crysis 3 Fury X HWiNFO screenie (an "off the cuff" test) I looked at the average GPU clock and thought, wow, my OC is throttling, so last night I ran Crysis 3 with MSI AB graphing the GPU clock. This was with "Power Efficiency" ON; if you read the "PowerTune" whitepaper PDF, it aims to stick to max clock.


Not every game I play stresses the GPU like Crysis 3 does; obviously Crysis is known to burn holes in GPUs. You never had to say it was that serious. Your post proves that with pointless details, since it won't change my experience at all coming from dual air coolers. At the end of the day, ALL of those tests aren't going to change my experience with the air coolers and the water cooler. Period. Just like my opinions on certain situations won't change many minds.

Quote:


> This was with "Power Efficiency" ON; if you read the "PowerTune" whitepaper PDF, it aims to stick to max clock.


From my testing, experience and monitoring, "Power Efficiency: DISABLED" causes the GPU to NOT down-clock during idle situations; the GPU will otherwise fluctuate at the smallest hint of video activity, such as YouTube, ads, Netflix, etc. Sometimes it doesn't even have to be a video, just basic PC usage. This adds unnecessary heat and clocks higher than 300MHz. It doesn't cause my games to down-clock either, depending on the title; some games dip harder than others, since not all games stress the GPU.

With "Power Efficiency: ENABLED" the GPU stays at 300MHz during idle periods, which keeps the heat low and the room cooler, and it holds 300MHz while surfing the web or watching YouTube/Netflix etc. So unless I'm benchmarking I normally leave "Power Efficiency: ENABLED" for the types of games I play. This lets the GPU fluctuate core clocks, so when the game isn't stressing the card there's no reason to max the core and the voltage, which helps keep heat down from what I've seen.

Otherwise I can set FRTC, which helps even more, and from there I can down-clock, since many of the games I love to play run fine below 1000MHz. Not every game is a "Crysis situation" < Did anyone cringe? If so, sorry, I've been playing a lot of Rainbow Six: Siege lately.

Quote:


> The only reason why I have stated info is due to member wishing to purchase a Fury X and pointing out he feels 390 adds heat to room, I did not want him to think Fury X is so much cooler that it will not do the same. This is similar to how you would do an article on your site to help viewers to be better informed.


Well, as far as I know he's already purchased it, so he'll find out sooner or later. Perhaps it will be just as warm for him; I have no issue if that turns out to be his experience with the Fury X. It won't change my experience either way. I'm pretty sure he understands that the radiator is going to heat up.


----------



## Flamingo

Quote:


> Originally Posted by *gupsterg*
> 
> 
> Spoiler: Nano / Fury X PE compare
> 
> So in above compare when card Nano is at loading screens there is clock bounce, on my Fury X with PE ON it's more well behaved. Even when that clock bounce occurs on Nano GPU usage is very low, close to 0%, so I would think the clock bounce is just showing clock change but as GPU is not showing usage it would not be as high a power load as when it is. So even if Nano had PE it could be questionable how much power the card would save when clock bounce occurs but no/low GPU usage.


Just wanted to add that when I OC'd my Nano to 1050MHz with +50% PL, it throttled down only in the first 3DMark test (while not reaching the threshold temperature).

When I ran 3DMark2001SE, it throttled very heavily, maybe because it was drawing more and more power. It's weird how it works, yet interesting at the same time...


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> Not every game I play stresses the GPU like Crysis 3 does; obviously Crysis is known to burn holes in GPUs. You never had to say it was that serious. Your post proves that with pointless details, since it won't change my experience at all coming from dual air coolers. At the end of the day, ALL of those tests aren't going to change my experience with the air coolers and the water cooler. Period. Just like my opinions on certain situations won't change many minds.


I picked Crysis 3 due to having the data on Hawaii; I gathered that data for a post on hexus.net where, months ago, a member was wondering about temps/power consumption prior to buying a card. To me and him, C3 was a max real-world scenario, rather than running Furmark / Kombustor / OCCT (which I never run on a GPU).

For several days recently I had also been pondering viewing that data and comparing it with the Fury X, as I just wanted to know; as you can tell from the screenie date, it was done prior to this subject coming up in the thread, AFAIK.

I'm not out to change your opinion. I'm just discussing a topic that has been presented in the thread. We can post ideas / thoughts / opinions / experience / data; this may help any of us or not, and it is our choice whether those posts change our view or enlighten us, and they may not.
Quote:


> Originally Posted by *Kana-Maru*
> 
> From my testing, experience and monitoring it "Power Efficiency: DISABLED" *causes the GPU to down-clock during Idle situations* otherwise the GPU will fluctuate from the smallest hint of video activity. Such as a Youtube activity, ads, Netflix video etc. Sometimes it doesn't have to be a video at all just basic PC usage. This adds unnecessary heat and clocks higher than 300Mhz. It doesn't cause my games to down clock either depending on the title. Some games dip harder than others since not all games stress the GPU.


The bold text I disagree with. The GPU down-clocking is not because of PE=Off, but just due to how PowerTune works. The rest of your text is why Power Efficiency is in the driver; it used to not be available to us in earlier drivers, and by default it was on. This clock bounce at a hint of GPU activity was one gripe I had with Hawaii; hence, after experiencing the Fury X, it was another reason to keep the card. The Power Efficiency toggle pretty much came about when ClockBlocker arrived on the web; ClockBlocker was created because of a thread on the AMD community forum regarding display corruption on Fury, and a member there was its author. So AMD released drivers with the PE toggle so the GPU will maintain highest clock in certain scenarios where PowerTune was hindering them.
Quote:


> Originally Posted by *Kana-Maru*
> 
> With "Power Efficiency: ENABLED" the GPU stays below 300Mhz during idle periods and which keeps the heat low and the room cooler.


Unless you have modded the ROM, the GPU can never go lower than 300MHz, as DPM 0 in all stock ROMs is 300MHz, even with PE=On.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Just wanted to add that when I oc'd my Nano to 1050Mhz with +50% PL, it throttled down only at the first 3DMark test (while not reaching threshold temperature).
> 
> When I ran 3DMark2001SE, it throttled very heavily. Maybe because it drawing more and more power, its weird how it works yet at the same time interesting...


I've been modding the PowerLimit for a member on OCUK; the first 2 test ROMs showed improved clock stability in 3DM FS, but still not flat like Fury/X. The member now has a ROM with the same PL as the Huntcraft Nano ROM for a WC setup (posted in the Fiji bios mod thread); waiting on test info. He's also testing reducing the voltage offset via MSI AB, as several members on here have had success with that to help maintain a better average clock.

I'm just surprised the Nano does not have PE on by default in the drivers, or the option to toggle it like Fury/X; it's the same GPU, plus it would benefit more from keeping tighter power usage, etc.


----------



## Alastair

Quote:


> Originally Posted by *Orthello*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> If I can add my two cents to the ongoing discussion.
> 
> A reference 290X at let's say 275 watts is doing 90C. That blower is pumping the hot near 90C air out the back of the chassis.
> 
> A reference FuryX doing over 300 watts has a much more efficient cooler design. It cools that 300 watts more effectively and the core is only reaching let's say 50 degrees. Let's say the exhaust temp of the air coming out of the radiator is 40C.
> 
> Although the one card is DISSIPATING more heat. The one card is still dumping hotter air into the room due to its less efficient cooling.
> 
> So while the one card is dissipating more in watts. The other card is seemingly dumping hotter air into the room.
> 
> 
> 
> Heat eventually has to go somewhere , the energy is not disappearing. However liquid can hold a lot of heat energy for every degree it raises in temperature vs air. Roughly 4x more energy per degree.
> 
> The liquid itself is heat storage .. much more effective than storing heat in air. EG to raise a litre of water by 1c takes more energy than raising 1 cubic litre of air by 1c. Couple that with been able to cool the surface better , having far greater metal / air ratio with the radiators vs heat sink and fan and you get the more efficient cooler.
> 
> Fury X without watercooling would have been very hot due to HBM and ~ 9 billion Transistors in a small area, raising its TDP due to leakage . Attempting to air cool it could have produced a 350 watt card etc and most likely AMD would have reduced clocks / voltage if they had to stick to air cooling.

Yes. Which is precisely the reason why a blower air cooler seems to pump out more heat vs. a competing liquid solution. The liquid is storing more energy than the air is, so the relative heat dump of an air cooler is much greater vs. that of a liquid or water cooler.


----------



## Kana-Maru

Quote:


> Originally Posted by *gupsterg*
> 
> The bold text I disagree with. The GPU is down clocking is not because of PE=Off but just due to how PowerTune work. Rest of your text is why Power Efficiency is in driver, it used not be available to us in earlier drivers and by default it was on. This clock bounce at hint of GPU activity was one gripe I had with Hawaii, hence after experiencing Fury X it was another reason to keep card. Power Efficiency toggle pretty much came about when clock blocker arrived on the web, the creation of clock blocker happened due to a thread on AMD community forum regarding display corruption on Fury and a member was author of it. So AMD released drivers with PE toggle so GPU will maintain highest clock for certain scenarios where PowerTune was hindering them.


That was my mistake. I left out one word: "not"

I meant to say
Quote:


> From my testing, experience and monitoring, "Power Efficiency: DISABLED" causes the GPU to *NOT* down-clock during idle situations; otherwise the GPU will fluctuate at the smallest hint of video activity.


Quote:


> Originally Posted by *gupsterg*
> 
> Unless you have modded the ROM, GPU can never be lower than 300MHz as DPM 0 in all stock ROMs is 300MHz, even with PE=On.


I also meant to say that the Fury X stays at 300MHz, not below 300MHz.

I've corrected those two mistakes in my first post.


----------



## spyshagg

Quote:


> Originally Posted by *Alastair*
> 
> If I can add my two cents to the ongoing discussion.
> 
> A reference 290X at let's say 275 watts is doing 90C. That blower is pumping the hot near 90C air out the back of the chassis.
> 
> A reference FuryX doing over 300 watts has a much more efficient cooler design. It cools that 300 watts more effectively and the core is only reaching let's say 50 degrees. Let's say the exhaust temp of the air coming out of the radiator is 40C.
> 
> Although the one card is DISSIPATING more heat. The one card is still dumping hotter air into the room due to its less efficient cooling.
> 
> So while the one card is dissipating more in watts. The other card is seemingly dumping hotter air into the room.


It doesn't work like that, I'm afraid.

Consumed watts are dissipated watts. It doesn't matter if the cooling solution manages to keep a 300W chip at 50C and the other at 90C, because those 300W will be dissipated as heat regardless of how "hot" the air coming out feels in your hand.

The difference in your two examples is that one cooling solution manages to dissipate the same amount of heat in a wider area at a greater pace, making the air feel less hot. The amount of watts being dissipated into the room however, is still the same.


----------



## Agent Smith1984

Posted elsewhere also, but:

A big RX480 leak with performance numbers, temps, and power is set to be out tomorrow.

When asked if it's good by users, the source said "I'll let you be the judge of that...."

When prodded again, the source said "all I can say is, if you have a 390 or 980, you better sell it now!"

> Another leaked graph has shown the RX 480 clocking to 1400MHz and beating a Fury X.

If we really do see the 8GB variant around $230, then this will be an insane level of performance at that price point.

> NVIDIA may have offered a pretty hard blow to the face with its 1000 series, but these little 480s are going to be a DIRECT KICK IN THE NUTS TO THEM~
> 
> Just thought I'd share the gossip


----------



## xTesla1856

Checked my PSU, it isn't faulty. What else could be causing the black screen+full fan speed with R9 Furys in CF? It happened today again, right after I logged in to Windows. This is really starting to get annoying....


----------



## gupsterg

@Orthello @spyshagg

+rep

@Kana-Maru

No worries.

@Agent Smith1984

Some new benches on Videocardz.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gupsterg*
> 
> @Orthello @spyshagg
> 
> +rep
> 
> @Kana-Maru
> 
> No worries.
> 
> @Agent Smith1984
> 
> Some new benches on Videocardz.


Yes, and tomorrow should have even more stuff.

Let's see: a $230 8GB GPU with Fury performance (mind you, it may take some OCing to truly reach Fury level) that only uses around a 140W TDP and will likely clock in the high-1300/low-1400 range...

Yeah, I'll take two of those over a single $450 1070, please.


----------



## gupsterg

Seems like a steal; can't wait till some game benches are out.

I was thoroughly tempted to put my Fury X on eBay the past few days as it was £1 max selling fees, but as I bought it so cheap I think I'm gonna keep it till Vega is out.

It's so tempting to go CF on RX 480 for "bang for $" with lower power usage than, say, 2xx/3xx/Fiji CF.


----------



## Flamingo

The 1400MHz overclock graph reaching Fury X and GTX 1070 levels is a confirmed fake.


----------



## Alastair

Quote:


> Originally Posted by *spyshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> If I can add my two cents to the ongoing discussion.
> 
> A reference 290X at let's say 275 watts is doing 90C. That blower is pumping the hot near 90C air out the back of the chassis.
> 
> A reference FuryX doing over 300 watts has a much more efficient cooler design. It cools that 300 watts more effectively and the core is only reaching let's say 50 degrees. Let's say the exhaust temp of the air coming out of the radiator is 40C.
> 
> Although the one card is DISSIPATING more heat. The one card is still dumping hotter air into the room due to its less efficient cooling.
> 
> So while the one card is dissipating more in watts. The other card is seemingly dumping hotter air into the room.
> 
> 
> 
> It doesn't work like that im afraid.
> 
> Consumed watts are dissipated watts. It doesn't matter if the cooling solution manages to keep a 300w chip at 50º, and the other at 90º, because those 300w will be dissipated as heat regardless of how "hot" the air coming out feels in your hand.
> 
> The difference in your two examples is that one cooling solution manages to dissipate the same amount of heat in a wider area at a greater pace, making the air feel less hot. The amount of watts being dissipated into the room however, is still the same.
Click to expand...

Yes, but 90C air coming out the back of a blower is still 90C air vs. 50C air out the back of a CLC. Regardless of watts generated, the relative air temp is still higher on the one cooling solution.


----------



## spyshagg

Quote:


> Originally Posted by *Alastair*
> 
> Yes, but 90C air coming out the back of a blower is still 90C air vs. 50C air out the back of a CLC. Regardless of watts generated, the relative air temp is still higher on the one cooling solution.


You don't get it.

Which feels hotter to the hand, a 300 watt hair dryer or a 300 watt living-room heater? Both heat the room at the same rhythm.

You can't judge what heats up the room more by the cooling capability of a device.

Watts are watts, no matter how quickly they get out of the computer case.

(300W Fury X = quickly = feels fresher | 300W 290X = slowly = feels hotter)

cheers
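The hair-dryer point can be put into numbers with the steady-state relation ΔT = P / (ρ · V̇ · cp): the same watts pushed through less airflow come out as hotter-feeling air. A sketch; the airflow figures below are made-up illustrations, not measurements of either cooler:

```python
# Steady-state temperature rise of exhaust air above intake air:
#   delta_T = P / (rho * V_dot * cp)
# Same watts, different airflow -> different exhaust temperature,
# yet the room receives the same energy per second either way.
RHO_AIR = 1.2    # kg/m^3
CP_AIR = 1005.0  # J/(kg*K)

def exhaust_delta_t(power_w: float, airflow_m3_s: float) -> float:
    """Exhaust-air temperature rise above ambient, in kelvin."""
    return power_w / (RHO_AIR * airflow_m3_s * CP_AIR)

# Illustrative airflow values (not measured):
blower = exhaust_delta_t(300.0, 0.010)  # restrictive blower, ~21 CFM
rad120 = exhaust_delta_t(300.0, 0.035)  # 120 mm rad fan, ~74 CFM

print(f"blower exhaust: ~+{blower:.0f} K above ambient")  # ~+25 K
print(f"120 mm rad:     ~+{rad120:.0f} K above ambient")  # ~+7 K
```

Lower airflow through a hot blower means a hotter stream; more airflow through a radiator means a milder one, but power times time is identical for both.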


----------



## gupsterg

@Alastair

Not too sure it would be 90C; hoping someone can give info on this.

I was viewing it this way: air has less thermal conductivity, so it carries less of the heat energy, and air also has less heat capacity. So theoretically I'm assuming the heat energy is stored more in the heatsink, hence higher temps on it / on what it's cooling?

Then I'm thinking the AIO unit's liquid has stored more heat energy, so when it reaches the rad it's more concentrated? Taking into account the Fury X has only a 120mm rad; if it were bigger I'd expect the air to be cooler / less concentrated.

Say on my Vapor-X: it had a large heatsink, so heat energy was dispersed across a large area. It never felt as hot to the touch as the GPU temps were, though the air at the case exhaust fan was noticeably warm with the GPU under load.

Now on the Fury X the rad is only 120mm, with not much more depth than the Vapor-X heatsink height, but that was far longer. To me the Fury X rad air feels warmer, more concentrated.

Perhaps I'm just rambling; I can't find any test data on exhaust air of HSF vs AIO vs WC, even for the CPU situation. As I'm interested in all this, I may just have to run some tests.


----------



## spyshagg

Theoretically a 120mm radiator heats up the room *quicker* than a 290X stock cooler when both cards are dumping 300 watts of waste heat. But at the end of a fixed period, say 2 hours, the rooms will be at exactly the same temperature.
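The "same room temperature after 2 hours" claim is just conservation of energy. A simplified sketch (sealed room, no heat loss through walls; the room size is an arbitrary example, and real rooms leak heat, so the point is the equal totals, not the final number):

```python
# Total heat dumped into the room by a 300 W card over 2 hours is
# identical for both coolers; only the delivery rate differs.
POWER_W = 300.0
HOURS = 2.0
energy_j = POWER_W * HOURS * 3600.0  # 2.16 MJ with either cooler

# Arbitrary example room: 4 m x 4 m x 2.5 m of air, sealed, no losses
room_m3 = 4.0 * 4.0 * 2.5
rho_air = 1.2    # kg/m^3
cp_air = 1005.0  # J/(kg*K)
air_mass = rho_air * room_m3  # ~48 kg of air

delta_t = energy_j / (air_mass * cp_air)
print(f"energy: {energy_j / 1e6:.2f} MJ, sealed-room air warms ~{delta_t:.0f} K")
```

The idealized rise is large only because nothing escapes; in practice walls and ventilation bleed heat off, but the blower card and the AIO card each add the same 2.16 MJ.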


----------



## gupsterg

Yep, I would agree from what I'm understanding.

On another note, tonight I had a little session of SWBF - MP Blast; it shows sweet clock stability (driver defaults). OC via ROM - GPU: 1135MHz (~+31mV, i.e. 1.243V VID), HBM: 545MHz (+25mV, i.e. 1.325V). HML file attached.

SWBF_Fury_X_1135_545.zip 19k .zip file

Gotta start doing some FRAPS tests to compare with Hawaii for games I have data on.


----------



## bluezone

Quote:


> Originally Posted by *spyshagg*
> 
> It doesn't work like that im afraid.
> 
> Consumed watts are dissipated watts. It doesn't matter if the cooling solution manages to keep a 300w chip at 50º, and the other at 90º, because those 300w will be dissipated as heat regardless of how "hot" the air coming out feels in your hand.
> 
> The difference in your two examples is that one cooling solution manages to dissipate the same amount of heat in a wider area at a greater pace, making the air feel less hot. The amount of watts being dissipated into the room however, is still the same.


Yes. BTUs, not temps. Energy, whether stored or active, can only be transformed, not pulled out of thin air: 90W in = 90W out.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Posted elsewhere also, but:
> 
> A big RX480 leak with performance numbers, temps, and power is set to be out tomorrow.
> 
> When asked if it's good by users, the source said "I'll let you be the judge of that...."
> 
> When prodded again, the source said "all I can say is, if you have a 390 or 980, you better sell it now!"
> 
> Another leaked graph has shown the RX480 to clock to 1400mhz and beat a Fury X.
> 
> If we really do see the 8GB variant around $230, then this will be an insane level of performance at that price point.
> 
> NVIDIA may have offered a pretty hard blow to the face with its 1000 series, but these little 480s are going to be a DIRECT KICK IN THE NUTS TO THEM~
> 
> Just thought I'd share the gossip


Thank you. Thank you. Thank you.

REP +1
Quote:


> Originally Posted by *xTesla1856*
> 
> Checked my PSU, it isn't faulty. What else could be causing the black screen+full fan speed with R9 Furys in CF? It happened today again, right after I logged in to Windows. This is really starting to get annoying....


Sounds like a driver crash. Have you DDU'ed and reinstalled drivers? Also, have you tried running a single card in the second PCIe slot?
Quote:


> Originally Posted by *gupsterg*
> 
> @Alastair
> 
> Not to sure it would be 90C and by this hoping someone gives info.
> 
> I was viewing this. So air has less thermal conductivity so it would carry less of the heat energy, air also has less heat capacity. So theoretically I'm assuming the heat energy is being stored more in the heatsink so temps higher on it / what it's cooling?
> 
> Then I'm thinking the AIO unit's liquid has stored more heat energy, when reaching rad it's more/concentrated? taking into account Fury X only 120mm rad, if it was bigger I'd think be cooler air/less concentrated.
> 
> Say on my Vapor-X it had large heatsink, so heat energy being dispersed across large area. It never felt to touch as hot as GPU temps was, yes the air on case exhaust fan was noticeably warm when GPU under load.
> 
> Now on Fury X the rad is only 120mm, not much more depth to it than Vapor-X HS height, but that had way bigger length. To me Fury X rad air feels warmer more concentrated air.
> 
> Perhaps just rambling, can't find any test data on exhaust air of HSF vs AIO vs WC even for CPU situation. As interested by all this may just have to run some tests.


Water moderates the rate at which BTUs are released to the air. Max temps are lower, but total BTUs are the same.

Cheers


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> The 1400mhz overclock graph reaching Fury X and GTX 1070 levels is a confirmed fake.


Do you have a link? I'd like to read this. The only thing I've read so far supporting it being fake assumes a 6-pin can only supply 75 watts, which isn't precisely true. More info would be appreciated.


----------



## NBrock

Finally got my Fury X back (after like 3 months). I sent it in, they had it for a while, then sent the same thing back and it didn't work. I got irritated and emailed support, and they essentially said I was full of crap. So I went to their Facebook page and complained. I got someone there to cover shipping and sent it back along with videos of it not working. Anyway, I got what looks and smells like a brand new card, just not in the OE boxing. The pump isn't as quiet as my original one was, but this one seems to run a bit cooler and at a bit lower voltage.


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> do you have a link? I'd like to read this. The only thing I've read so far supporting it being fake assumes 6 pin can only supply + 75 watts, which isn't precisely true. More info would be appreciated.




https://www.reddit.com/r/4nvlry/leaked_rx_480_oced_to_1400_shows_fury_x/d47ex8i

R9 nano score is a bit lower (~100 points less) than my system though:

Stock : 3441
+20 PL : 3626
+50 PL: 3774


----------



## dagget3450

Quote:


> Originally Posted by *Flamingo*
> 
> 
> https://www.reddit.com/r/4nvlry/leaked_rx_480_oced_to_1400_shows_fury_x/d47ex8i
> 
> R9 nano score is a bit lower (~100 points less) than my system though:
> 
> Stock : 3441
> +20 PL : 3626
> +50 PL: 3774


I know one thing: I am done with these 480 threads. There are like 3 or 4 now with "leaked benches". The NDA cannot lift fast enough, but at the same time I am just going to avoid the news section. I feel like now it's all guessing and faking.


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> 
> https://www.reddit.com/r/4nvlry/leaked_rx_480_oced_to_1400_shows_fury_x/d47ex8i
> 
> R9 nano score is a bit lower (~100 points less) than my system though:
> 
> Stock : 3441
> +20 PL : 3626
> +50 PL: 3774


Thanks I'll get a Korean friend to translate the link in the linked page. Bing translate doesn't do a very good job.


----------



## spyshagg

Quote:


> Originally Posted by *bluezone*
> 
> Yes. BTU's not temp. energy both stored or active can only be transformed not pulled out of thin air. 90w in = 90w out.


Hi,

That is exactly what I said. He was confusing the temp he felt on his hand with the energy being dissipated into the room. Separate things, like you said.

Cheers


----------



## illies100

I found this: https://i.imgur.com/6jCfQQH.png
This is the OC potential of the RX 480, taken from the AMD reddit.


----------



## nyk20z3

Sold my Lian Li PC-05S and picked up an In Win 805, but I think the Nano might be too tiny for this build lol -


----------



## MrKoala

That's got to be the funniest mobo/case combo I've seen.


----------



## bdub109

New here. I think my Nano is running warmer than it should. It is throttling a lot when playing The Division, down to 800MHz and sometimes lower. I had a pop-up from CAM stating that my GPU had reached over 82 degrees. I switched some of my fans around and it seems a little better. Is this normal? I do plan on putting it on water soon.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> New here. I think my nano is running warmer then it should. It is trottleing a lot when playing the division down to 800mhz and sometimes lower. I had a pop up from cam stating that my gpu had reached over 82 degrees. I switched some of my fans around and it seems alittle better. Is this normal? I do plan on putting it on water soon.


It does sound a little warm. Do you have it overclocked, or in a restricted-airflow case? If you're not using any overclock apps, have you opened the Crimson software settings and set the Overdrive auto fan slider to 100% max? Default is 50% max. This can help.


----------



## bdub109

I did a major overhaul and upgrade to my rig, which went from an Air 240 to a Corsair SPEC-ALPHA, which seems more restrictive. I'll be changing the case again in the near future lol. Anyway, I noticed my AMD software is not on my PC anymore. I went to the AMD website and clicked on auto-detect, and it says everything is up to date. Which software do I need exactly?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> I did a major overhaul and upgrade to my rig which went from an air240 to a corsair spec-alpha which seems more restrictive. I'll be changing the case again in the near future lol. Anyways I noticed my amd software is not on my pc anymore. I went to the amd website and clicked on auto detect and says everything is up to date. Which software do I need exactly?


I just looked up the Corsair SPEC-ALPHA; it looks like it has pretty good airflow. Sounds like you might have to reinstall the AMD software. Have you checked your installed programs to see if AMD Install Manager is still present? If it is, I'd uninstall and run DDU from Safe Mode before anything else. The DDU link is here if you need it: http://www.wagnardmobile.com/forums/viewtopic.php?f=5&t=273

I'd download the latest version of Crimson (16.6.1) to have it on hand before you get started on the uninstall.

Let me know how things go.


----------



## bdub109

I reinstalled the Crimson software. Are you talking about the graphics profile and optimized performance?


----------



## bdub109

Target fan speed was not at 100%; it is now. Should I change anything else there?


----------



## bdub109

Should my target temperature be maxed as well?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> Target fan speed was not to 100% it is now should I change anything else there?


Yes. Open the Crimson interface: Gaming > Global Graphics > Global Overdrive > Target Fan Speed.

Stock is about 65% fan speed; set it to 100%. I'd leave the Target Temperature at 85 for now.


----------



## bdub109

I'll give that a try. I also used QFan in the BIOS on my Asus Z170; maybe that will help. It seems kind of complicated, because I have CAM for my Hue+, which can set GPU and fan speeds, and I'm also using Corsair Link with my AIO. Too much tinkering, and I don't know which program would have priority, if that makes any sense. It doesn't help that my mobo has like twelve fan headers; does it matter which ones go where?


----------



## bdub109

In this picture I have The Division loaded but am not even playing, just sitting at the menu. Seems weird that it's throttling down to 900MHz; shouldn't it stay at 1000 for the most part?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> I'll give that a try. I also used the qfan in my bios on my asus z170 maybe that will help. Seems like it's kind of complicated because I have cam for my hue+ which can set gpu and fan, also using corsair link with my aio, too much tinkering and I don't know which program would have priority if that makes any sense. Doesn't help that my mobo has like twelve fan headers does it matter which ones go where?


Do you have the Asus Fan Expert software installed. It will not help with setting GPU fan speed, but it will help you identify individual fans.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> 
> 
> On this picture I have the division loaded but not even playing just sitting at menu. Seems weird that it's trottleing down to 900mhz shouldn't it stay at 1000 for the most part?


What do you have your power limit in Overdrive set to?


----------



## bdub109

After playing for a few minutes. I have had this card less than a month; bought it from Microcenter as an open-box. Worries me; maybe someone returned it for this reason?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> 
> 
> After playing for a few minutes. I have had this card less then a month. Bought it from microcenter as an openbox. Worries me maybe someone returned it for this reason?


Are these temperatures being reached at zero power limit? If so, it looks like you have the same problem I had: a bad application of TIM on the GPU. The problem is that fixing it, by carefully replacing the TIM, is likely to void the warranty. I take it you cannot return it to Microcenter? You could always try an RMA to the manufacturer. Otherwise I would suggest redoing the TIM; if you decide to try this, let me know, because you have to look out for certain things on Fury GPUs.

Give your posts 24 hrs. to see if anyone else can come up with any ideas.


----------



## bdub109

Quick update it has a lot to do with my case. I took all the side panels off and my temps are down to 70


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> Quick update it has a lot to do with my case. I took all the side panels off and my temps are down to 70


You must have your case fans set very low; 70 isn't too bad.

What do you have the power limit set to?


----------



## bdub109

I should be able to return it to microcenter if there is an actual problem with the gpu. I have not adjusted the power at all.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> I should be able to return it to microcenter if there is an actual problem with the gpu. I have not adjusted the power at all.


70 at stock power limit is good. You just need some more airflow in the PC case with the side panels on. After (and not before) you resolve that, you can start bumping up the power limit to 20-30%. That will help with the down-clocking. The Nano is power- and temperature-limited; down-clocking is normal at stock. When you raise the power limit it will stay at higher clocks.

Adding a second GPU cooling fan and direct airflow to the back of the PCB will help with temperatures as well.

I've already experimented with fans on the rear of the graphics card. Out of curiosity I reran my temperature tests using the demo portion of FS (the hottest part of FS).

Nano cooling fan only, on a custom fan curve: 77C max for 20-30% of the demo.

Nano cooling fan plus rear fan: rising temps to 77C max, for a 2 second peak.

Nano cooling fan plus stacked intake: slow rise to 74C, for a 3 second peak.

3 fans, normal running: slower rise to 71C, for a 1 second peak.

3 fans, all max speed: very slow rise to 68C, for a 10 second peak.

Best bang for the buck: stacked intake fans.

This was at 50% power limit for benchmarking ONLY, at 1100MHz and a -24mV offset voltage, with TIM replacement.


----------



## Flamingo

The 980 Ti was $370 on Newegg. RIP resale value of Furys.
Quote:


> Originally Posted by *bdub109*
> 
> I should be able to return it to microcenter if there is an actual problem with the gpu. I have not adjusted the power at all.


Ok, here is the deal (I own a Nano too).

If you run uncapped / with V-sync off, the GPU will run at 100% unless held back by the processor, which means it will heat up fast and reach its stock threshold of 85C; and with the fans capped at 60% in Crimson, you're gonna run into some standard throttling.

A couple of things you can do to alleviate the issue:

1. If noise is not an issue: set the upper fan speed limit to 100%.

OR

2. Use V-sync; the GPU will be used anywhere between 30%, 50%, 80% or 100% depending on graphics settings, age of the game, and monitor refresh rate. Since it limits GPU use, the fan should be able to keep temperatures in check.

I'm currently playing Thief at highest settings at 1440p, V-sync (60Hz), GPU usage around 50-60%, Overdrive set to a 60% upper fan speed limit. Temperatures hover around 68C-72C.

If you stick with the stock Overdrive settings and V-sync off, the GPU will eventually reach 85C and heat your case real fast.

This is actually the only graphics card where I've had to make sure V-sync was on to prevent 100% usage lol - just to keep temps low and the fans from revving up to 100%.

When I eventually play Rise of the Tomb Raider, I'll make sure to have special noise-cancelling headphones and crank it up to 100% fan speed to play.


----------



## ozyo

Quote:


> Originally Posted by *nyk20z3*
> 
> Sold my Lian Li PC-05S and picked up a In Win 805 but i think the Nano might be 2 tiny for this build lol -


perfect fit


----------



## gupsterg

Several pages back I was posting how Fiji cards were in thin supply at UK etailers (~20/04/16); well, OCuk have pretty much de-listed Fiji cards, with only 1 SKU of Fury listed with stock.

Links: Fury Nano Fury X

They seem the first at present on delisting Fiji.


----------



## gupsterg

Quote:


> Originally Posted by *Flamingo*
> 
> Also it heated up so fast, that I had to run tests @ 100% fan speed (even though my sensitivity is up 150%).


In the FanTable of PowerPlay is a value which limits max fan RPM, usFanRPMMax; a stock AMD Nano ROM I downloaded from TPU has a value of 2765RPM.

See heading *How to edit cooling profile in ROM* > *Extra cooling profile information for advanced manual modders* in Fiji bios mod thread OP.


----------



## bdub109

What do you mean by stacked intake fans? Also, someone was asking about the CPU: it's an i7-6700K. Also, I have a 24-inch 1080p Asus monitor, the one that has 3D on the base that's really popular and gets good reviews; it goes for about $260. Sorry, I forget the model number lol.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> What do you mean by stacked intake fans? Also someone was asking about the cpu, it's an i7-6700k. Also I have a 24inch 1080p asus monitor, the one that has 3D on the base that's really popular and gets good reviews, goes for about 260$, sorry forget model number lol.


Here is a picture.

How is your frame rate when you play games?


Quote:


> Originally Posted by *Flamingo*
> 
> The 980 ti was for 370$ on newegg, RIP resale value of Furys
> 
> Ok here is the deal (own a Nano too).
> 
> If you run uncapped / with vsync off. the GPU will run at 100% unless held back by the processor, which means it will heat up fast and reach its stock threshold of 85C, and with the fans capped at 60% in Crimson, your gonna run into some standard throttling/
> 
> Couple of things you can do to alleviate the issue:
> 
> 1. If noise is not an issue > set the upper fan speed limit to 100%.
> 
> OR
> 
> 2. Use V-sync, the GPU will be used anywhere between 30%, 50%, 80% or 100% depending on graphics settings, age of game, refresh rate on monitor. Since it will limit GPU use, the fan should be able to keep temperatures in check.
> 
> Currently playing Thief at highest settings at 1440p, V-Sync (60Hz), GPU usage around 50-60%. Overdrive settings at 60% upper fan speed limit. Temperatures hover around 68C-72C.
> 
> If you stick with the stock overdrive settings and v-sync off, the GPU will eventually reach 85C and heat your case real fast.
> 
> This is actually the only graphics card where Ive had to make sure V-sync was on and prevent 100% usage lol - just to keep temps low and the fans from revving upto 100C.
> 
> When I eventually play Rise of the Tomb Raider, Il lmake sure to have special noise cancelling headphones and crank it up to 100% fan speed to play


FTC (Framerate Target Control) can also help reduce temperature.


----------



## HyeVltg3

Quote:


> Originally Posted by *bluezone*
> 
> here is a picture.
> 
> How is your frame rate when you play games?


Does that actually work? do you have temps of before and after?


----------



## bluezone

Quote:


> Originally Posted by *HyeVltg3*
> 
> Does that actually work? do you have temps of before and after?


See post #8833 above.

Canadian Eh! I live in Ontario.


----------



## pdasterly

will the rx480 crossfire with fury?


----------



## bdub109

My frame rates are great. I totally changed the fans in my rig: added two in the front, and added a little fan I had lying around under my Nano, bringing in extra air from the four PCIe expansion slots. I also fixed my exhaust, as I had it going the wrong way, and changed my top from exhaust to intake. Lastly, I went into my BIOS and cranked all fans to max. Noisy, but it doesn't bother me.


----------



## pdasterly

top is exhaust


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> My frame rates are great. I also fixed my exhaust as I had it going wrong way. Also changed my top from exhaust to intake. Lastly I went into my bios and cranked all fans to max. Noisey but it doesn't bother me.


Great. The exhaust acting as an intake would explain the high temperatures. The top fans blowing on the back of the card might help cool the card as well.


----------



## bdub109

Now that I have these temps under control what settings do I need to use to keep my gpu at 1000mhz or higher while gaming?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> Now that I have these temps under control what settings do I need to use to keep my gpu at 1000mhz or higher while gaming?


1/ Are you using any overclocking software other than Radeon Settings Overdrive?

2/ Try raising the power limit to 20-30% and keep an eye on temperatures and GPU frequency while gaming.

This is a nice distraction from keeping an eye out for tornados in my area.


----------



## bdub109

I am using CAM for my NZXT Hue+; it has software to overclock the GPU, but my Radeon software runs in the background too. Would I set both of them to 20%? I used MSI Afterburner in the past with my MSI cards; I could download that if it's better.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> I am using cam for my nzxt hue + it has software to overclock gpu, but my Radeon software runs in the background too would I set both of them to 20%? I used msi afterburner in the past with my msi cards I could download that if it's better.


Use only one piece of software for settings; it can cause conflicts otherwise. Try CAM to see if it works. If not, try MSI Afterburner or Sapphire TriXX.


----------



## bdub109

At 20% I'm hovering around 950-980MHz, with temps at 75. Also, thanks for all the help; I greatly appreciate it!!!


----------



## bdub109

What's weird is that by going to +20%, my frame rates actually dropped about 20%.


----------



## Flamingo

Quote:


> Originally Posted by *bluezone*
> 
> FTC (Framerate Target Control) can also help reduce temperature.


I tested that with Batman: Arkham Origins, and it was kinda buggy in the sense that, with V-sync off, it gave a LOT of tearing. With V-sync on, it felt kinda pointless. Also, that setting keeps turning itself on and off, which can be annoying.

Quote:


> Originally Posted by *pdasterly*
> 
> will the rx480 crossfire with fury?


Nope, you need a similar core/architecture for CrossFire to work, apparently. Fiji is GCN 1.2 and Polaris is GCN 4 (or 1.3).
Quote:


> Originally Posted by *bdub109*
> 
> What's weird is by going to +20% my frame rates actually dropped about 20%


Then you might be hitting the limits set in Overdrive, which could be causing the GPU to downclock and give lower frame rates. What are your target temp and target fan speed settings in overdrive?

The downside of increasing power limit is that it heats up the GPU faster.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> What's weird is by going to +20% my frame rates actually dropped about 20%


Very weird. Try Afterburner and see if it still has the same effect.

EDIT: Don't forget to reset Hue+ to stock settings if you use Afterburner.

Flamingo's also good at OC'ing


----------



## bdub109

I tried +20 in AMD and +20 in CAM at the same time. I'm trying with just CAM at +20 now; I'll try Afterburner next. Also, my temp and fan settings are on max.


----------



## Flamingo

Quote:


> Originally Posted by *bdub109*
> 
> I tried plus 20 in amd and plus 20 in cam at the same time. I'm just trying with cam at plus 20 now. I'll try afterburner next. Also my temp and fan settings are on max.


+20% power limit? Try to use one piece of software. Usually Overdrive is good enough, unless:

1) you want to undervolt, or
2) you want to set fans at 100% from the beginning to factor out temp-related throttling;

then you use Afterburner.

What game are you running and resolution? Whats your refresh rate?


----------



## bdub109

I'm playing at 1080p on a 144 Hz Asus monitor. I could also use Asus GPU Tweak; I think it came with my new motherboard and my Asus Nano. I mainly play first-person shooters (really into The Division right now, but I play most newer AAA shooters), still fall back to CS:GO now and then, and also like racing games.


----------



## bdub109

Yes, 20% power limit. I also tried 30% power with the clock boosted to 1060 MHz, but I only got to try it for a minute before I was pushed into another forced Windows 10 update I didn't want and my system restarted itself. Temps were up to about 78, but I was staying anywhere from 980-1050 MHz for the most part.


----------



## bdub109

I have been contemplating adding a second GPU and then doing a custom loop. Is anyone running CrossFire, and would it be worth it?


----------



## SuperZan

I run two Furies in Crossfire and I love it. I'm at 4K, so the extra horsepower helps, and most of what I play either has a profile or can be made to work in Crimson. The only issue I had was random signal loss with FreeSync active, but with more recent versions of Crimson that seems to have gone away.


----------



## Flamingo

Quote:


> Originally Posted by *bdub109*
> 
> I'm playing at 1080 on a 144hz asus. I also could use asus gpu tweak as I think it's included with my new motherboard and my asus nano. I mainly play first person shooters, really into the division right now but play most aaa newer first person shooters. Still fall back and play some cs go and also like racing games.


The Nano can easily hit 144 fps in CS:GO, right? There shouldn't be any clock throttle there (maybe, but unlikely).

But for the latest titles you will surely get throttling, as GPU usage sits at 100% at 144 Hz, V-sync or not. So if you're okay with lower refresh rates, you can try V-sync if the clock throttle bothers you.

It's not a bad thing; remember, it's the full Fiji core at roughly half the power budget. You're not getting Fury X performance without special measures (it took a while for that to dawn on me whenever I felt bad about the Nano throttling).
Quote:


> Originally Posted by *bdub109*
> 
> Yes 20% power limit I also tried 30% power and boosted clock to 1060mhz but only got to try it for a minute as I was just pushed into another forced Windows 10 update that I didn't want and my system restarted itself. Temps were up to about 78 but I was staying anywhere from 980-1050 MHz for the most part.


Afaik the power options in Afterburner and OverDrive are the same, so stick to one piece of software for now, unless you want to undervolt and/or check whether cooling is affecting performance.
Quote:


> Originally Posted by *bdub109*
> 
> I have been contemplating adding a second gpu then doing a custom loop. Anyone running crossfire and would it be worth it?


At this point, personally, I wouldn't advise anyone to go for a Fiji unless prices drop further, simply because it's more power hungry and has less RAM compared to the latest releases.

So, if you want:

1) Capped fps (60 Hz?) and a stable 1000 MHz:
Set the power limit to 50%. Lower it if the card gets too hot too fast.

2) Uncapped fps and a stable 1000 MHz (or above):
You will need special cooling.

Best would be to cap frame rates. Getting 144 fps at 1080p won't be possible on a Nano, because 1080p is not Fiji's strong point (its relative advantage over the competition is at 1440p and 4K). In CS:GO, maybe. You'd probably need two Nanos to get 144 fps.

I ran the Dota 2 benchmark on my Nano at 1080p and got 108 fps average.

I would also advise using the Afterburner overlay for more information on what's going on (it won't work in DX12 for now, though).


----------



## Emarossa

Hello, I just started watercooling my R9 Nano, and I have now reached 1100 MHz with a +50% power limit; under load I reach 36°C. Is it worth hunting for more? What do others reach on water?


----------



## bdub109

Would it make sense to CrossFire with a Fury X perhaps, or a Fury? The reason I got the Nano was that they had a special on it for $380.00 at the time, which seemed like a great deal.


----------



## HyeVltg3

Another CF question.
What exactly are the downsides of Fury X + Fury CF?
I'm thinking about it, since I don't have another spot to put a second Fury X's rad + fan...


----------



## Flamingo

Quote:


> Originally Posted by *Emarossa*
> 
> Hello, I just began watercooling my R9 Nano and I have now reached 1100MHz with +50% power limit during load I reach 36c temp. Is it worth it hunting for more? What do others reach on water?


1162 MHz on water:

http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2519&cores=1#start=0#interval=20


----------



## AndreDVJ

Quote:


> Originally Posted by *HyeVltg3*
> 
> Another CF question.
> What exactly are the downsides of Fury X + Fury CF ?
> thinking about it since I dont have another spot to put the 2nd Fury X's rad+fan...


You would be limited by the air-cooled Fury card.

What you can do with a Fury is try your luck at unlocking it to 3840 SPs, so your Fury X wouldn't be slowed down as much.


----------



## gupsterg

When I had an air-cooled Fury Tri-X, I felt it was very much on a par with the Fury X AIO, if not better.

At the time both cards were OC'd the same (1090/500), with near-identical VID (Fury = 1.243V in ROM, Fury X = 1.250V in ROM). The Fury Tri-X unlocked to 3840 SP as well.

The air-cooled Fury's GPU VRM has a plate separate from the GPU (it does connect to the main heatsink fins), but its temps were better IMO than the Fury X AIO's VRM temps. The Fury X AIO's coolant goes to the GPU first, then to the GPU VRM, then flows back to the rad, so the GPU VRM gets pre-heated coolant.

I am planning on keeping the Fury X for longer now, and if I do find a cheap Tri-X / XFX air cooler I may swap it over.

After seeing this video, though, I may not meddle with the cooler; I've spent too much time finding a card that clocks well, with performance scaling from a low VID increase, IMO.






Considering the experience @buildzoid has, I'm totally flabbergasted at what happened.


----------



## bdub109

I think performance-wise I'd benefit more from a second card than from water cooling, but I still haven't made up my mind. I will do both; it's just a matter of which comes first. Also, my mobo is the Asus Z170S, so it looks like I might be limited by PCIe lanes and will have to run both cards at x8. Does that matter much?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> I think performance wise I'd benefit from a second card then water cooling but I still haven't made up my mind. I will do both just a matter of which first. Also my mobo is the asus z170s it looks like I might be limited by pcie lanes. Looks like I'll have to run both cards at x8 does that matter much?


No, it should not matter. PCIe 3.0 is plenty; even PCIe 2.0 on older boards would be enough.

If you're curious, read this article on PCIe 2.0 scaling:

http://www.tomshardware.com/reviews/pci-express-scaling-p67-chipset-gaming-performance,2887.html
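For the curious, the theoretical numbers are easy to ballpark. A rough sketch (one-direction raw bandwidth only; real-world throughput is lower due to protocol overhead, and the function name is just for illustration):

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Approximate one-direction PCIe bandwidth in GB/s.

    Per-lane transfer rate (GT/s) and line-code efficiency:
    gen 2 = 5.0 GT/s at 8b/10b, gen 3 = 8.0 GT/s at 128b/130b.
    """
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_per_s, efficiency = rates[gen]
    # one transfer moves one bit per lane; divide by 8 for bytes
    return gt_per_s * efficiency * lanes / 8
```

So PCIe 3.0 x8 (~7.9 GB/s) lands right next to PCIe 2.0 x16 (8.0 GB/s), which is why dropping to x8 on a Z170 board barely registers in games.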


----------



## Thoth420

I'm a novice and I know not to touch the exposed interposer on the Fury X....he should feel stupid.


----------



## gupsterg

He stated the pressure bracket on the rear is a pain to reinstall.

From what I understand, he thinks uneven pressure while reinstalling the bracket may have damaged the interposer or HBM.


----------



## bluezone

Every time you remove or replace a thermal solution (water or air) on a modern GPU, you are taking a chance. Unlike desktop CPUs, there is no heat spreader to protect the delicate silicon the processor is built on. If you want to sweat bullets, try de-lidding a 3000-series-or-later Intel processor to replace the TIM.
Sometimes you break an egg. It can be very upsetting when it happens; you can know the risks and accept them, but you don't have to be happy if it goes wrong.
With the Fiji series of GPUs this is exacerbated, owing to the package being a delicate composite of silicon layers. The interposer is a sandwich of three or more thin silicon layers, held together by adhesive and aligning BGAs between the layers. The GPU is affixed by adhesive on top of this, aligning another BGA, and the HBM modules (also BGA) are layered silicon that can be thin enough to flex like paper. So it's kind of a house of cards to play with.

As for anyone with the knowledge and the willingness to experiment to help other owners: I applaud them, and I feel for them when they experience setbacks like this.


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> The pressure bracket on the rear he stated is a pain to reinstall.
> 
> From what I'm understanding he thinks from irregular pressure application whilst installing the pressure bracket he may have caused damage to interposer or HBM.


Most likely. It is very easy to damage the Fury X; plenty of vets already have. I don't trust myself enough, so I had someone with powerful sorcery do mine.


----------



## bdub109

So, a little update. It seems my fps has suffered so far while trying to overclock: it turns out that any settings I was using in the AMD software were being overridden by the CAM software. I wish I didn't have to use it at all, but they have the best lighting system at the moment. What I really wish for is one piece of software that would control devices from every company: my Corsair AIO, my Corsair mouse, my Dominator Platinums (temps), my lighting, my fans, plus a decent GPU overclocker and some good monitoring and stress testing/benchmarks. I'd be a happy man.


----------



## bdub109

Just got done running Fire Strike Ultra at only the 20% power increase, now that I've fixed the CAM GPU tuning software, and I went from a score of 3427 to 3615! Here's hoping my fps goes up in real gaming too.


----------



## bdub109

Digging further into the results, it seems my CPU was off in the lower-scored run.


----------



## bdub109

What would cause this difference?


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> So a little update. It seems like my fps has in suffered thus far with trying to overclock. Turns out that any settings I was using in amd were being over ridden by the cam software. I wish I didn't have to use it at all but the have the best lighting system at the time. Actually what I wish was there was a software that would control all devices from all companies. A software to control my corsair aio, my corsair mouse, also working with my dominator platinums as far as temps, control my lighting, control fans, had a decent gpu over clocked in it, some good monitoring and stress testing/benchmarks. I'd be a happy man.


That was one of the conflicts I was worried about. Is there any way of uninstalling part of the CAM software? With Asus AI Suite you can install or uninstall the parts you do not want.


----------



## xkm1948

Do you guys think we Fury X owners will get the awesome new overclocking panel currently said to be coming for the RX 480?


----------



## bluezone

I hope so.


----------



## bdub109

There is a hotfix, and a way to change the setting by editing files, which I have done. I'm now trying just the Radeon software for a while to see if it's better. I also noticed I have GPU Tweak II available; it came with my GPU. I'll probably try Afterburner if I don't get much result from the Radeon software.


----------



## bluezone

If you end up needing to adjust the voltage offset, the only software I've found that works for that is Afterburner and TriXX. Otherwise, you need BIOS mods.


----------



## Flamingo

Quote:


> Originally Posted by *bdub109*
> 
> Just got done running fire strike ultra at 20% power increase only now that I have fixed the cam gpu tuning software and I went from a score of 3427 to 3615!! Wow now here's hoping my fps go up in real gaming.


Here are my scores

http://www.3dmark.com/compare/fs/7695848/fs/8168688/fs/8612546/fs/8656301

Stock > Power limit changed > Power limit + Fan speed changed + Temp limit changed > Overclocked to 1050Mhz


----------



## bdub109

Now I'm having a problem I've never had before: while playing The Division, everything looks jacked up. I can't even describe it, so here's a picture.


----------



## bdub109

^^^ Don't know why the picture did not load. Whatever the problem is, it seems limited to that game; Doom ran fine. Wow, Flamingo, that last run is impressive. Are you on water to achieve that? Would that even be possible for me, and safe to run all the time? Also, what settings and software are you using?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> As for anyone with the knowledge and the willingness to experiment to help other owners. I applaud them and feel for them when they experience setbacks like this.


+1 .
Quote:


> Originally Posted by *xkm1948*
> 
> Do you guys think us FuryX owners will get the awesome new overclocking panel currently said to be available for RX480?


I would think so.

I am intrigued to see what it has on offer, but IMO it will be similar to MSI AB/TriXX; I doubt we'll see HBM voltage / fSW adjustment. I can still see myself using MSI AB for when I create logs of monitored data / i2c dumps / etc. Personally, I've always preferred the BIOS mod route over software OC.


----------



## Flamingo

Quote:


> Originally Posted by *bdub109*
> 
> ^^^dont know why the picture did not load. Whatever the problem it seems like it's only that game. Doom ran fine. Wow flamingo that last run is impressive. Are you on water to achieve that? Would that even be possible for me and safe to run that way all the time? Also what settings and software are you using?


Thanks. I'm on the stock cooler; I just cranked the power limit up to +50 and the clock speed to 1050 MHz and ran the test. At first it throttled heavily (because of temps).

So I used Afterburner to set the fan speed to 100%, re-ran the test, and it stayed at 1050 MHz and got that result. Don't know why it isn't exactly a Fury X score, but it's very, very close. Also note my chassis:

http://www.modding.fr/wp-content/uploads/2015/03/SG-13B-Intro-01.jpg

It's almost like an open box, so 100% fan speed gets annoyingly loud.

I wouldn't go +50% power unless I had a full EK faceplate and backplate, because apparently the VRMs get really hot (104°C versus the standard 84°C under load). This page discusses the VRM temps with the Silent Wings 2 mod:

http://www.tomshardware.de/amd-radeon-r9-nano-modding-silent-umbau,testberichte-241932-4.html

It's in German; you can use Google Translate to get a rough idea of how things turn out.

It would be nice to know if anyone has been running +50% PL and 1050 MHz on air for a long period now... (I guess bluezone is running that on his 1100 MHz stacked-fan air Nano.)

I don't need the extra frames atm; I'm doing fine in most games at 60 fps @ 1080p at the highest settings. I'll try to improve cooling / upgrade, perhaps, if I change monitors.


----------



## bdub109

Would it make sense to cap my fps at 144, since my monitor is 144 Hz? Also, I notice a lot of games won't let you set the refresh rate past 60.


----------



## Dr. Vodka

Quote:


> Originally Posted by *bdub109*
> 
> Would it make sense to cap my fps at 144 since my monitor is 144? Also I notice in a lot of games you can't set the refresh rate past 60.


Yes, cap the framerate 1 or 2 fps below your screen's refresh limit (so as not to cause tearing). Games that would go higher than that then won't stress and heat up your GPU unnecessarily (although this part of fps capping is more useful at lower fps/refresh rates, like 60 or 75).
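The arithmetic behind that rule of thumb, as a quick sketch (the 1-2 frame margin is the convention described above, not an official figure, and the helper name is made up):

```python
def frame_cap(refresh_hz, margin=2):
    """Suggested fps cap and the per-frame time budget it implies (ms)."""
    cap = refresh_hz - margin
    # each frame must be delivered within 1000/cap milliseconds
    frame_time_ms = round(1000.0 / cap, 2)
    return cap, frame_time_ms

# frame_cap(144) -> (142, 7.04): the GPU stops rendering frames the
# monitor can't show, so it runs cooler and avoids tearing.
```

At 60 Hz the budget per frame is ~17 ms versus ~7 ms at 144 Hz, which is why the cap saves proportionally more heat on low-refresh monitors.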


----------



## Elmy

Pro Duo with a fresh set of Ek Clothes... LoL


----------



## bdub109

That Duo is damn sexy. Do you have to run it in CrossFire, or does it read as a single GPU?

Anyone having problems with the division?


----------



## bdub109

Is this a decent Nano score in Fire Strike Ultra?

http://www.3dmark.com/3dm/12633192

I went to some extreme measures to keep temperatures down; it never went above 67.


----------



## Flamingo

Quote:


> Originally Posted by *bdub109*
> 
> Is this a decent nano score in fire strike ultra?
> 
> http://www.3dmark.com/3dm/12633192
> 
> Went to some extreme measures to keep tempatures good never went above 67


Can also check against submissions here:
http://hwbot.org/benchmark/3dmark_-_fire_strike_ultra/rankings?hardwareTypeId=videocard_2519&cores=1#start=0#interval=20

Note, though, that both "keithplayspc" and "[email protected]" have fishy results, achieving 4K graphics scores on much lower clocks (lol @ 1025 MHz).

I'll see if I can reach 1080 MHz and try a run too.

What extreme measures did you take, lol?


----------



## Flamingo

Here is my score at 1080 MHz:

http://www.3dmark.com/compare/fs/8910204/fs/8915673

Settings were:

100% fan speed, +50% power limit, 8% overclock from Crimson, 80°C threshold, no HBM OC.

It's about 42 points higher; idk how significant that is in 3DMark terms. Try not overclocking the HBM and check again.

For some odd reason, Test 1 of Fire Strike always causes power-related throttling; I can't seem to figure out why, but it seems like a power-hungry test. My clocks were jumping from 985 to 1012 to 1025 to 1080 MHz in this test only. The rest of the tests kept 1080 MHz stable.
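For anyone comparing runs, a dip like that is easy to quantify from a list of sampled core clocks. A small sketch (the plain-list input and the 2% tolerance are my assumptions, not any Afterburner log format):

```python
def throttle_events(clocks_mhz, target_mhz, tolerance=0.02):
    """Count samples where the core clock fell below target by more than tolerance.

    Returns (dip_count, worst_clock); worst_clock is None with no dips.
    """
    floor = target_mhz * (1 - tolerance)
    dips = [c for c in clocks_mhz if c < floor]
    return len(dips), (min(dips) if dips else None)

# e.g. at a 1080 MHz target, samples of 985, 1012 and 1025 MHz all count as dips
```

Running it over each sub-test's samples separately would show whether only Test 1 is tripping the power limit.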

@bluezone and @bdub109, can you both check whether it throttles during Test 1 at your OC settings too (1100 MHz and 1080 MHz respectively)? If not, are you adding mV elsewhere (BIOS or Afterburner)?

I know bluezone's settings are +50 PL and -24mV at 1100 MHz, with a TIM replacement.

Thanks


----------



## bdub109

I'll try when I get home from work. As far as cooling goes, besides my normal fans I add three more while running benchmarks, all the same size, 90mm or so. I put one at the back of my case right below the Nano, and one right below the Nano at the other end, facing the same direction. At the front of my case I have an airflow fan pulling fresh air in; it creates a great line of air from front to back. I also use the third fan, slanted, sitting on top of the Nano, to bring cool air in from my top fan. It really does a great job of keeping everything cool. I also turn the portable air conditioner in the room as low as it will go, lol, down to 67°F. I'll take a picture next time I run a test.


----------



## bdub109

Newbie question: does TIM replacement mean thermal paste replacement? How difficult is that?


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> Here is my score at 1080Mhz.
> 
> http://www.3dmark.com/compare/fs/8910204/fs/8915673
> 
> settings were:
> 
> 100% fan speed, +50% power limit, 8% overclock from crimson. 80C threshold. No HBM oc.
> 
> About 42 points higher, idk how significant is this 3DMark terms. Try not to overclock the HBM and check again.
> 
> For some odd reason, Test 1 of Firestrike always causes power related throttling - cant seem to figure out why, it seems like a power hungry test. My clocks were jumping from 985 to 1012 to 1025 to 1080Mhz only in this test. Rest of the tests kept 1080Mhz stable.
> 
> @bluezone and @bdub109 can you both check if it throttles during test 1 too in your OC settings (1100Mhz and 1080Mhz respectively)? If not, are you adding mV elsewhere (BIOS or Afterburner)?
> 
> I know bluezones settings are +50PL and -24mV at 1100Mhz and TIM replacement.
> 
> Thanks


I have a slightly modded BIOS and very slight throttling @ 1100 MHz, usually 1085-1100 MHz.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> Newbie question TIM replacement meaning thermal paste replacement? How difficult is that?


Being a newbie, you should pay attention to this video before considering thermal paste replacement.






After watching that, decide if you want to replace the GPU TIM. There are certain things you have to watch out for on Fiji GPUs (the Fury series). Go back and read the last 25 or so pages of the Fiji owners club to get an idea of what you need to do. Then PLEASE ask questions. Remember, this will likely void your warranty and is something you do at your own peril, meaning we're not responsible if you break something.

I think your temperatures are fine right now. I, on the other hand, had very bad thermals due to the TIM.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I have slightly modded Bios and have very slight throttling @ 1100Mhz. Usually 1085-1100Mhz.


Any chance of a log from MSI AB (zip the HML file and attach it)? And a share of the ROM? The member I'm trying to help on OCuk isn't even cracking 1000 MHz rock solid in 3DM FS.


----------



## bdub109

When I say newbie, I mean: I've been deeply into computers since the age of 14, and now I'm 28. I've always been a hardcore electronics, gaming, and PC nerd, and I've been around electronics in retail and as a management/training professional. I'm pretty knowledgeable; I just hadn't dug this deep into modding and overclocking until recently. I don't plan on doing anything else to the card until I put it on water, and in that case I'd have to redo the TIM anyway.


----------



## bdub109

On a side note, a Nano with an EK water block just sold on eBay for $405. Pretty nice deal; I bid $400 but didn't really have the extra money, so I stopped bidding.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Any chance of a log from MSI AB (zip the HML file and attach it)? And a share of the ROM? The member I'm trying to help on OCuk isn't even cracking 1000 MHz rock solid in 3DM FS.


I am already writing a PM about this.

I will post on the board later.








Quote:


> Originally Posted by *bdub109*
> 
> When I say newbie I mean I've been deeply into computers since the age of 14, now I'm 28. I have always been a hardcore electronics, gaming, and pc nerd. I have been around electronics in retail sense and management/training professional. I'm pretty knowledgable just haven't dug this deep into modding and over clocking until recently. I do not plan on doing anything else to the card until I out it on water. In that case I would have to redo the Tim anyways.


Sorry, no insult intended. Just trying to look out for you.

Most of the info you're looking for is in the last 25 pages, along with links to check out. Pay particular care and attention to the interposer.


----------



## bdub109

None taken; I totally appreciate the info! This has by far been the most helpful group I've come across, forum-wise. I have been trying to get caught up by reading all the pages.


----------



## bluezone

Quote:


> Originally Posted by *bdub109*
> 
> None taken; I totally appreciate the info! This has by far been the most helpful group I've come across, forum-wise. I have been trying to get caught up by reading all the pages.


I know what you mean. I don't think I've ever seen anyone in this group remind someone what "the search function is for," which is rather refreshing.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I am already writing a PM about this.
> 
> I will post on the board later.


Cheers.


----------



## bdub109

Are there two versions of the Asus white Nano?


----------



## bdub109




----------



## Flamingo

^ Can't see the image.
Quote:


> Originally Posted by *bdub109*
> 
> On a side note a nano with ek water block just sold on eBay for 405$. Pretty nice deal I bid 400 but didn't really have the extra money so stopped bidding.


Isn't $400 a lot for the front and back plates? Stock prices are around $150 and $80-ish, I think.

As for the interposer thing, at the moment I wouldn't do a repaste unless I had a water-cooling setup at hand.

I am thinking of getting Noctua's A9 fan to replace the stock fan, though (since I can't stack it). Don't know if it will fit.


----------



## bdub109

$400 would be a lot for just the block and backplate, lol, but this came with a GPU as well.


----------



## Flamingo

There is only one version of the Nano everywhere. Asus just changed the shroud cover.


----------



## gupsterg

@xkm1948

The new overclocking util is looking way better than what OverDrive currently offers.







(Source Videocardz).



It's giving access to a lot we're doing via bios mod.


----------



## Flamingo

Quote:


> Originally Posted by *gupsterg*
> 
> @xkm1948
> 
> New overclocking util is looking way better than what OverDrive is currently
> 
> 
> 
> 
> 
> 
> 
> (Source Videocardz).
> 
> 
> 
> It's giving access to a lot we're doing via bios mod.


Nice! Too bad there's no fan granularity, though.


----------



## gupsterg

Yeah, that is missing, but it's still a vast improvement over OD, etc.

I really like how VID per DPM is available, plus clocks. I saw that in Hawaii factory-OC ROMs they increased each DPM clock as x% of the highest state; I plan on doing that with the Fury X now that I've got a good OC I'm happy with for scaling/stability.


----------



## Semel

So, guys, are you keeping your Furys or selling them?

I was thinking about selling mine and buying an RX 480. I'll decide when reviews/benchmarks become available; it depends on how well it runs at stock and how well it overclocks, if at all.
I guess I would have some money left after buying the RX 480, and then in a year I could get Vega.

If an overclocked RX 480 doesn't beat my unlocked Fury Tri-X, then I'll keep it.


----------



## SuperZan

I'm keeping my Crossfire Furies but I sold my Fury X to play with a pair of 480's until Vega.


----------



## pdasterly

Too early to tell, but the Fury outperforms the RX 480, and with the GTX 980 price drop the Fury should soon be around $300. Still waiting for a Pro Duo price drop; then I'm buying.


----------



## Kana-Maru

Quote:


> Originally Posted by *Semel*
> 
> So, guys, are you keeping your furys or selling them?
> 
> 
> 
> 
> 
> 
> 
> I was thinking about selling mine and buying rx480. I'll decide when reviews\benchmarks get available and it depends on how well it runs on stock and how well it overclocks if at all.
> I guess I would have some money left after buying rx480 and then in a year I could get Vega.:
> 
> 
> 
> 
> 
> 
> 
> 
> If overclocked rx480 doesn't beat my unlocked fury tri-x then I'll keep it.


Fury X user here.

I thought about buying an RX 480 for benchmarking purposes, then selling it on eBay or something. I'm definitely keeping my Fury X in my rig until at least next summer, possibly longer.


----------



## LeadbyFaith21

Quote:


> Originally Posted by *gupsterg*
> 
> @xkm1948
> 
> New overclocking util is looking way better than what OverDrive is currently
> 
> 
> 
> 
> 
> 
> 
> (Source Videocardz).
> 
> 
> 
> It's giving access to a lot we're doing via bios mod.


That looks fun to overclock with! Can't wait to get my hands on it!


----------



## Semel

I thought it was a fake... making voltage adjustment officially available to end users? That's a bit risky...


----------



## bluezone

"The uprising has begun".


----------



## SuperZan

The new era had better include some definitive information on 480's ROP counts and performance.


----------



## bluezone

PCPer is doing a live cast and interview for the RX 480.

Two-card giveaway as well!









http://www.pcper.com/news/General-Tech/PCPer-Live-Radeon-RX-480-Live-Stream-Raja-Koduri

Set your calendar for this coming Wednesday at 1:30pm ET / 10:30am PT

EDIT: Added Reddit.

Reddit is hosting an RX 480 AMA, with a giveaway of 14 RX 480s.


https://www.reddit.com/r/4phz39/amd_will_be_making_reddit_history_next_week/


----------



## Medusa666

Quote:


> Originally Posted by *Semel*
> 
> So, guys, are you keeping your furys or selling them?


I'm going to use my Radeon Pro Duo for the coming 2-4 years. I wanted something somewhat future-proof, and this card is amazing: silent and cool no matter what I throw at it, and the performance is beastly.


----------



## SpeedyVT

Quote:


> Originally Posted by *Medusa666*
> 
> I'm going to use my Radeon Pro Duo for the coming 2-4 years, I wanted something somewhat futureproof and this card is amazing, silent and cool no matter what I throw at it, performance is beast.


It's mostly for development purposes; that's why it has a hefty price tag. It'll only be obsolete upon the release of the next development card. The Radeon Pro Duo is for graphics editing and 3D modeling. In games it's average, because not many of them utilize it properly, given all the crooked AMD- or NVIDIA-exclusive features floating around.


----------



## Flamingo

Quote:


> Originally Posted by *Semel*
> 
> So, guys, are you keeping your furys or selling them?
> 
> 
> 
> 
> 
> 
> 
> I was thinking about selling mine and buying rx480. I'll decide when reviews\benchmarks get available and it depends on how well it runs on stock and how well it overclocks if at all.
> I guess I would have some money left after buying rx480 and then in a year I could get Vega.:
> 
> 
> 
> 
> 
> 
> 
> 
> If overclocked rx480 doesn't beat my unlocked fury tri-x then I'll keep it.


Going to wait and see the price and length of the new dual-480 card from PowerColor. I might switch to that.

A single RX 480 doesn't seem to be as strong as the Fury at 1440p and above.


----------



## HyeVltg3

Quote:


> Originally Posted by *Semel*
> 
> So, guys, are you keeping your furys or selling them?
> 
> 
> 
> 
> 
> 
> 
> I was thinking about selling mine and buying rx480. I'll decide when reviews\benchmarks get available and it depends on how well it runs on stock and how well it overclocks if at all.
> I guess I would have some money left after buying rx480 and then in a year I could get Vega.:
> 
> 
> 
> 
> 
> 
> 
> 
> If overclocked rx480 doesn't beat my unlocked fury tri-x then I'll keep it.


I don't see why anyone would sell their Fury for an RX 480, but I guess if you were CrossFiring the 480s, sure.
Overclocking? So far the reports say the 480 is an amazing overclocker, going from a stock 1266 upwards to 1500, with even one 1600 reported.
But will that OC make it better than a Fury? No idea; we'll have to wait for benches like the rest of us.

I actually already sold/returned my Fury X and am planning to just coast on 480 CF till Vega, so... =D
I returned it for a completely different reason: I was getting the infamous display-corruption problem with every low-end game I booted, and ClockBlocker wasn't helping 100% of the time. I'd rather not deal with a workaround while waiting on a solution that may never come. Google-fu showed me this has been an issue since the Fury X launch; the only driver to address it so far has been 16.6.1, which only came out a week ago, and it didn't help. I'd rather opt out of the trouble now instead of later.
Quote:


> Originally Posted by *Flamingo*
> 
> A single RX 480 doesn't seem to as strong as the Fury at higher 1440p+


A 480 is rated between R9 390 and R9 390X performance, but at a MUCH lower TDP; I'm really excited to try them out in CrossFire.
I was a bit scared of trying to CF my previous 390 on my Corsair AX850. I had it going for a bit, but I got nervous because everywhere recommends at least 1050W to CF 390s and 1100W+ for 390Xs.
With the low power consumption of the 480s I should be golden; probably overkill with my 850W, haha.

I'm just hoping it doesn't end up like the Pascal fiasco: stock shortages and Founders Edition nonsense pushing AIB card prices way above MSRP.
Also hoping there's no "limit 1 per customer" at checkout.


----------



## Medusa666

Quote:


> Originally Posted by *SpeedyVT*
> 
> It's mostly for development purposes that's why it has a hefty tag on it. It'll only be obsolete by the release of the next development card. Radeon Pro Duo is for graphics editing and 3D Modeling. Games it's average because not a lot utilize it properly with all the crooked AMD or NVidia that fan exclusive features.


I do some of that stuff too (3D modeling and art) but I don't agree with you. The card is a beast overclocked and runs games perfectly smooth with crazy FPS where CrossFire is supported, staying silent and cool. I'm not that picky when it comes to graphical fidelity, so I can easily see myself holding on to this for the coming years. All in all it is a great card.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> So, guys, are you keeping your furys or selling them?
> 
> 
> 
> 
> 
> 
> 
> I was thinking about selling mine and buying rx480. I'll decide when reviews\benchmarks get available and it depends on how well it runs on stock and how well it overclocks if at all.
> I guess I would have some money left after buying rx480 and then in a year I could get Vega.:
> 
> 
> 
> 
> 
> 
> 
> 
> If overclocked rx480 doesn't beat my unlocked fury tri-x then I'll keep it.


There are already some reports that the RX 480 reaches ~1400MHz on the ref PCB/cooler; these are leaks by certain sites which already have the cards. That's ~10% OC headroom. I reckon the cards are not limited by the 6-pin connector, as technically more is available than PCI-SIG spec (~190W through the 6-pin + 75W from the slot). The VRM to me is looking like 6+1+1 with IR3567B voltage control, which seems pretty powerful for a low-TDP GPU and wouldn't limit OC headroom IMO. Temps seem to be what a ref blower would get, IIRC ~70°C+.
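For anyone checking the math, here's a quick sketch of the headroom and power-budget figures above (the numbers are taken from this post and the PCI-SIG ratings, not from measurements):

```python
# Sanity-check the OC headroom and power-budget figures quoted above.
stock_mhz = 1266          # RX 480 reference boost clock
leaked_oc_mhz = 1400      # reported OC on the ref PCB/cooler

headroom_pct = (leaked_oc_mhz - stock_mhz) / stock_mhz * 100
print(f"OC headroom: {headroom_pct:.1f}%")  # ~10.6%, i.e. the ~10% above

# PCI-SIG rates a 6-pin plug at 75W and the slot at 75W, though a 6-pin
# can physically deliver far more (the ~190W figure mentioned above).
spec_budget_w = 75 + 75
print(f"Spec power budget: {spec_budget_w}W")
```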

Videocardz RX 480 rumors part 3 , part 6.

This is not to say I don't think RX 480 is gonna be a great "bang for $" card. The Overwatch video by Hardware Unboxed made me go WOW. If I get a RX 480 it will be just to meddle with it and later sell on.

I think I'll still be keeping my Fury X; I just really like the size, quietness, build quality and performance. Recently a member in the 390 club posted his 3DM FS score, so: i7 6700K @ 4.6GHz + 390X @ 1250/1750 vs my i5 4690K @ 4.9GHz + Fury X @ 1145/545, compare link.


----------



## Flamingo

The current news going around is that none of the reviewers have managed to hit 1400MHz on their cards. I wonder if it's a power or cooling restriction. Probably a bit of both, most likely the latter.


----------



## Orthello

Quote:


> Originally Posted by *Flamingo*
> 
> the current news is going around that none of the reviewers have managed to hit 1400Mhz on their cards. i wonder if its the power or cooling restriction. a bit of both including most likely the latter.


I think it's more a power restriction, although the cooling is not great. From the other threads it seems fans at 81% (~4000 RPM) keep the card at 74°C at a 1380MHz clock. There are also reports of short momentary spikes up to 1680MHz before the power limiters kick in.

The AIB models will have beefier cooling and plenty more available power.

I think we will see really good OCs on those models (1500s I would suspect), although perf/watt will, I suspect, only be comparable to NV cards in DX12. Not that I care about perf/watt, to be honest; at least the AIB models will provide extra performance, which is a nice change from the Pascal AIBs that basically just reduced the noise.


----------



## bluezone

OK, here are the results from my custom voltage-per-DPM ROM.

Tess on, "0" PL.


Spoiler: Warning: Spoiler!







Tess off, "0" PL.


Spoiler: Warning: Spoiler!








Tess on, 50% PL.


Spoiler: Warning: Spoiler!







What do you guys think?


----------



## Kedas

Hi guys,

If anyone could answer this question, I'd appreciate it.









Would it be worth getting a Fury X second-hand for around 400€?

Thanks


----------



## Flamingo

Quote:


> Originally Posted by *Kedas*
> 
> Hi guys,
> 
> If anyone could answer this question, I'd appreciate it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would it be worth getting a Fury X second-hand for around 400€?
> 
> Thanks


If you can get a GTX 1070 for that price, go for it. If not, the Fury X or 980 Ti is the better option.

@bluezone, I'm gonna wait for the new driver release before diving into DPM states; it's still a new/unknown concept to me.


----------



## Kedas

Quote:


> Originally Posted by *Flamingo*
> 
> If you can get a gtx 1070 for that price, go for it. if not, the fury x or 980 ti is the better option.
> 
> @bluezone, im gonna wait for the new driver release before diving into dpm states - still a new / unknown concept to me.


For 400€ it's impossible to get a 1070, because the cheapest one costs 500€. If I can get the guy to drop it to 350€ I think the Fury X will be worth it, don't you think? Or should I wait a bit for a price drop on the 1070?


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> @bluezone, im gonna wait for the new driver release before diving into dpm states - still a new / unknown concept to me.


It was a long process to set the voltage per DPM correctly, though not as long once I figured out a quicker method. One of the original BIOS DPM voltage settings was lower than optimal.
It may just be me, but after correcting this, frame presentation was a bit smoother.


----------



## shadowxaero

So I sprang a leak... one of the fittings on my lower radiator went bad, I suppose. Luckily nothing was damaged. I did, however, take the time to switch over to rigid tubing. After a couple of failed bent tubes (specifically from the GPU to the CPU) I managed to finish and get everything up and running again.

How does it look?




I still think a Fury with an EK water block is one of the best looking cards on the market.


----------



## SuperZan

That's a beautiful setup you've got. Cheers on getting everything sorted without any real incident.


----------



## Flamingo

Quote:


> Originally Posted by *Kedas*
> 
> For 400€ it's impossible to get a 1070, because the cheapest one costs 500€. If I can get the guy to drop it to 350€ I think the Fury X will be worth it, don't you think? Or should I wait a bit for a price drop on the 1070?


350 for the Fury X would be good. Tell him about the 4GB limitation and all. But 400 is good too, then (the 1070 being 500).


----------



## Flamingo

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.2-Release-Notes.aspx

New drivers are out. WattMan seems limited to the RX 400 series. Haven't tried it myself yet.


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.2-Release-Notes.aspx
> 
> New drivers out. Wattman seems limited to the RX400 series. Havent tried it myself yet.


Thanks

REP: +1


----------



## RedGoose

So I was thinking of buying the Fury X given Newegg's great deal on the XFX version (only 460 bucks), but I wanted to check with other owners of the card first and see how it's doing. I know at launch there were a lot of concerns about coil whine and micro stutter. How are the cards you guys own faring, and would you say that these issues have been largely resolved? Any long-term issues that have popped up that we weren't previously aware of?

Thanks for the info!


----------



## xkm1948

Has anyone tested whether the new WattMan overclock panel works for us Fury X owners? I sure hope it does.


----------



## bluezone

Unfortunately, NO.


----------



## xkm1948

Quote:


> Originally Posted by *bluezone*
> 
> unfortunately NO.


Did you try? Damn it! I really wanted that!


----------



## bluezone

Quote:


> Originally Posted by *xkm1948*
> 
> Did you try? Damn it! I really wanted that!


For me, just the old interface showed up. I'm disappointed too.

I'm watching the live stream on PCPer right now and they're discussing it in relation to the RX 480.


----------



## xkm1948

I am sure we can get it working for the Fury X. Hell, there are BIOS mods out there which are even more hardcore than this.


----------



## bluezone

Quote:


> Originally Posted by *RedGoose*
> 
> So I was thinking of buying the Fury X given newegg's great deal on the XFX version (only 460 bucks), but I wanted to check with other owners of the card first and see how its doing. I know at launch there were a lot of concerns about coil whine and micro stutter. How are the cards you guys own faring and would you say that these issues have been largely resolved? Any long term issues that have popped up that we weren't previously aware of?
> 
> Thanks for the info!


I have a Nano myself. It's quiet except for the fan. I haven't heard too much about coil or pump noise on newer units, so you are probably safe unless they are old SKUs.

Newer drivers have helped also.


----------



## RedGoose

Quote:


> Originally Posted by *bluezone*
> 
> I have a Nano myself. it's quite except for the fan. I haven't heard too much about coil or pump noise on newer units. so you are probably safe unless they are old sku's
> 
> Newer drivers have helped also.


Thanks for the info.

Anybody have micro stutter issues?


----------



## BIGTom

Quote:


> Originally Posted by *RedGoose*
> 
> So I was thinking of buying the Fury X given newegg's great deal on the XFX version (only 460 bucks), but I wanted to check with other owners of the card first and see how its doing. I know at launch there were a lot of concerns about coil whine and micro stutter. How are the cards you guys own faring and would you say that these issues have been largely resolved? Any long term issues that have popped up that we weren't previously aware of?
> 
> Thanks for the info!


I've had the XFX Fury-X since launch day. My card came with the original pump design exhibiting the pump whine, but they acknowledged the issue with a revised pump. I didn't pursue the option to replace the pump and my card has since lost the noise.

Performance has been great for me. I usually run games in native 3440x1440 resolution if the game supports it. It runs most games on highest settings at 60 FPS, but I typically change some options like AA or disabling Gameworks.
Some may say that the 4GB of HBM might not be a wise choice, but I haven't run into any issues being limited by the 4GB at my native resolution so far. The card runs cool and quiet, and I don't think that I will ever go back to air cooled GPUs.

If you are looking for an overclocking beast, it might let you down. It's not a huge priority for me, but I do it for fun as a hobby and with the performance it gives at stock, it's not necessary.

I have been extremely satisfied with it for this past year. Performance seems to improve with every driver revision. I think the deal you are mentioning is not bad if you have the right expectations. If you value a cool, quiet and stable card and understand that it is going to underperform compared to a slightly less expensive air cooled 1070 with higher VRAM, you will probably be happy also.


----------



## RedGoose

Quote:


> Originally Posted by *BIGTom*
> 
> I've had the XFX Fury-X since launch day. My card came with the original pump design exhibiting the pump whine, but they acknowledged the issue with a revised pump. I didn't pursue the option to replace the pump and my card has since lost the noise.
> 
> Performance has been great for me. I usually run games in native 3440x1440 resolution if the game supports it. It runs most games on highest settings at 60 FPS, but I typically change some options like AA or disabling Gameworks.
> Some may say that the 4GB of HBM might not be a wise choice, but I haven't run into any issues being limited by the 4GB at my native resolution so far. The card runs cool and quiet, and I don't think that I will ever go back to air cooled GPUs.
> 
> If you are looking for an overclocking beast, it might let you down. It's not a huge priority for me, but I do it for fun as a hobby and with the performance it gives at stock, it's not necessary.
> 
> I have been extremely satisfied with it for this past year. Performance seems to improve with every driver revision. I think the deal you are mentioning is not bad if you have the right expectations. If you value a cool, quiet and stable card and understand that it is going to underperform compared to a slightly less expensive air cooled 1070 with higher VRAM, you will probably be happy also.


Thanks Tom. Sounds perfect.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> I am sure we can get it working for FuryX. Hell there are BIOS mods out there which is even more hardcore than this.


Last night tested Crimson v16.6.2 and like bluezone saw OD interface rather than WattMan.

Also mucked around editing driver install inf, but still no go.

From few reviews WattMan limits VID increase to 1150mV on RX 480, this may mean owners still use 3rd party apps/bios mod/volt mod to gain more if card has potential/better cooling solution.

2-3 people have posted on AMD Community asking for WattMan to support other cards as well; perhaps join the rebellion









----------



## bluezone

Anyone want to beta test Radeon Software?

http://radeon.com/calling-on-radeon-software-beta-testers/

There is a signup link at the bottom of the page.

Yes this is for real.


----------



## Alastair

Can someone willing do a test for me, please? Can anyone test a 4_low or 4_high BIOS on a fully enabled Fury X? I want to see if it locks out 4 CUs. I'm dangling +1 rep for ya!


----------



## AndreDVJ

@Alastair, a Fury X BIOS does "work" on a normal Fury; however, it'll keep seeing 8 CUs disabled. No idea how these disabled CUs are read. I personally flashed a Fury X BIOS on my card. You'll have to toggle the force option in ATIFlash.

So I see no reason why a Tri-X BIOS would lock out CUs on a Fury X.


----------



## Alastair

Quote:


> Originally Posted by *AndreDVJ*
> 
> @Alaistair, a Fury X BIOS does "work" on normal Fury, however it'll keep seeing 8 CU's disabled. No idea how are read these disabled CU's. I personally flashed a Fury X BIOS myself on my card. You'll have to toggle the force option in ATIflash.
> 
> So I see no reason why a Tri-X BIOS would lock out CU's on a Fury X.


I want to know if a 4_low or 4_high BIOS would end up locking out 4 of the CUs on a fully enabled Fury X.


----------



## gupsterg

Quote:


> Originally Posted by *AndreDVJ*
> 
> @Alaistair, a Fury X BIOS does "work" on normal Fury, however it'll keep seeing 8 CU's disabled. No idea how are read these disabled CU's. I personally flashed a Fury X BIOS myself on my card. You'll have to toggle the force option in ATIflash.
> 
> So I see no reason why a Tri-X BIOS would lock out CU's on a Fury X.


The Fury X ROM does not have the table which locks/configures SP; I posted about it in the Fiji BIOS mod thread when Alastair brought the discussion there. Because this table is not in the Fury X ROM, Atomtool cannot be run on it to set SP. I modded AMD's updated Fury X ROM to have the table, plus changed PowerPlay to have correct CAC records so under EVV you'd have the correct VID per DPM.

This same method of ROM locking was used on Hawaii, with a H/W lock coming later.

If I run the CUInfo tool on a genuine Fury X it would let me know if it is still in a writable state for SP; if so, a Fury ROM, or a Fury X ROM with the table which configures SP, could be used to set 3584, 3776 & 3840 SP IMO.
Quote:


> Originally Posted by *Alastair*
> 
> I want to know if a 4 low or 4 High would end up locking out 4 of CU's on a fully enabled Fury X.


Only if the GPU is in a writable state.


----------



## AndreDVJ

Yes, I misunderstood the question. I did not realize you were flashing the Fury X with already-modded ROMs. Thanks, guys, for clarifying.


----------



## gupsterg

@andredvj

No worries









@Alastair

A genuine Fury X in CUInfo looks like this.



I will mod the SP configuration table (Gfx_Harvesting) into the Fury X ROM when I have time and report back. Do you want a 3DM FS result or something else?


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> @andredvj
> 
> No worries
> 
> 
> 
> 
> 
> 
> 
> 
> @Alastair
> 
> Genuine Fury X in CUinfo look like this.
> 
> 
> 
> Will mod the SP configuration table (Gfx_Harvesting) into Fury X ROM when have time and report back, do you want 3DM FS result or something else?


I just want to see if a 4_low/high BIOS locks cores. I might have a shot at getting an X, and I think it will just work better if I can lock it out to 3840 to match my 3840 Fury. And also because SCIENCE!


----------



## broadbandaddict

Quote:


> Originally Posted by *RedGoose*
> 
> So I was thinking of buying the Fury X given newegg's great deal on the XFX version (only 460 bucks), but I wanted to check with other owners of the card first and see how its doing. I know at launch there were a lot of concerns about coil whine and micro stutter. How are the cards you guys own faring and would you say that these issues have been largely resolved? Any long term issues that have popped up that we weren't previously aware of?
> 
> Thanks for the info!


I ordered one of these on the 30th with the $20 rebate. Seemed like a good match for my new 3440x1440 monitor.









Got an EK block on the way as well. Does anyone use one of the EK backplates on their Fury X? They've got a gold one that is ~$50 shipped that would look great in my build.


----------



## Thoth420

Quote:


> Originally Posted by *broadbandaddict*
> 
> I ordered one of these on the 30th with the $20 rebate. Seemed like a good match for my new 3440x1440 monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got an EK block on the way as well. Does anyone use one of the EK backplates on their Fury X? They've got a gold one that is ~$50 shipped that would look great in my build.


My Fury X has an EK backplate painted white.


----------



## Alastair

Guys, a friend of mine is in the market for new cards. He has his heart set on a pair of 480s, but I am a believer in the age-old saying: one big card over two little ones. He is torn between the Fury X and two 480s, especially since there is a Fury X he can get for 449 dollars. I said he should go Fury X. Thoughts?


----------



## Blotto80

I'm with you on that. I had a pair of 290Xs (roughly similar to the 480) and the second card was used very little due to poor CrossFire support in new games.

Of all the games I played over the time I had the 290Xs, only Far Cry Primal and Dirt Rally had CF support out of the box. Most other games either got it added via a driver too late, when I had already finished with the game (Fallout 4, The Division), never got CrossFire support at all (Rainbow Six: Siege, Quantum Break, Just Cause 3), or had CF support on paper that was broken and completely unplayable (Witcher 3).

I switched to a Fury X as a single 290X wasn't cutting it at 1440p, and I'm very pleased with the results. Everything is playable maxed out at 1440p; Witcher 3 gets 50-60 FPS, Doom 70-100. This card is fast, cool, and very quiet. When gaming, the loudest part of my system is now the Noctua 140mm fans on my D15.

I would never rely on the promise of CF support for a card purchase; I had CF 5870s before and the story was the same. CF is great when you have a TOTL card and want to get beyond what any single card can offer and are willing to deal with the drawbacks, or if you want to breathe some life into an older system and grab the second card while it's cheap. That's usually how I do it: buy one top-end card, skip the next gen, and buy a match for it on the used market when games start to fall behind a bit.


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> I said he should go Fury X. Thoughts?


Fury X IMO.


Spoiler: RX 480 PCI-E slot power usage



This is how I see it: the RX 480 is hammering the PCI-E slot even on the average figure.


Spoiler: Collated THG power data for 390X/Nano/Fury X/RX 480







Do bear in mind the data from THG is recorded at very high speed; the spikes are not as relevant as average power usage. PC Perspective's article uses a better method IMO and also has paragraphs explaining why their test method is the way it is.

I don't believe they can curb the RX 480 to be like those past cards on PCI-E slot power usage. The IR3567B, AFAIK, does not differentiate between powering the VRM from the PCI-E slot or the plugs. The ROM PowerPlay PowerLimit does not contain separate values for the PCI-E slot and plugs. I reckon that via the driver they will either lower voltages on the card or exert a tighter PowerTune algorithm to reduce power usage, but it will still be higher than past cards. I reckon it is a PCB design flaw.
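To make the slot-power concern concrete, here's a rough sketch. The 80W average draw is an assumed illustrative figure (in the ballpark of published measurements), not data from this thread; the pin-current limits come from the PCI-SIG spec:

```python
# Compare an assumed average 12V slot draw against the PCI-SIG budget.
SLOT_12V_LIMIT_W = 5.5 * 12.0   # 66W: 5.5A allowed on the slot's 12V pins
SLOT_3V3_LIMIT_W = 3.0 * 3.3    # ~9.9W on the 3.3V pins (~75.9W total)

assumed_avg_12v_draw_w = 80.0   # illustrative figure, not measured here

excess = assumed_avg_12v_draw_w / SLOT_12V_LIMIT_W - 1
print(f"12V slot draw exceeds the 66W budget by {excess:.0%}")
```

The point being that even a modest average over the 12V pin budget means continuous out-of-spec current through the motherboard's slot traces, which a driver-side power cap can reduce but not restructure.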





Spoiler: RX 480 cooler



The RX 480 cooler is pants, and if upgraded = money. In the UK one etailer is taking preorders on the Nitro RX 480 at £250. I've seen Amazon Warehouse deals on the Fury X for £300 recently; I'd rather pay the £50 extra and get an open-box Fury X.





Spoiler: Thoughts on RX 480 vs Fury X on perf.per watt



I usually use a UK magazine's reviews for purchases, and they are sometimes on Bit-Tech; in their test setup they used later drivers for the 390X & Fury X. Comparing, say, 1080p results, only in Hitman did the RX 480 gain over the Fury X, possibly a driver thing with the Fury X I think. In the rest of the tests the Fury X is faster than the RX 480 by (using min FPS):

AOTS 18%, Fallout 4 36%, Division 34%, Warhammer 11%, Witcher 3 26%

So the average for that set of games = ~25% better performance. For the power usage test they take total system draw and run Valley at 1440p; the Fury X is 50% faster whilst the system uses 50% more power. TPU's performance-per-watt chart may not be that accurate; right at the top it states:
Quote:


> We used the relative performance scores and the typical gaming power consumption result.


The typical gaming power consumption result would be 163W for the RX 480 and 246W for the Fury X on page 22 of the review, and I think they take that wattage and apply it to the charts on page 24 to come up with performance per watt. As they use Metro: Last Light @ 1080p for the typical gaming power result, I thought I'd use the FPS from that, but there is no data to calculate perf. per watt.

The way I see it, there could be games which don't stick to the highest DPM state on the Fury X (= less power usage), whereas the RX 480, having fewer SPs etc., may be sticking to a higher state. So better performance-per-watt data would come from measuring each game's power usage and performance and deriving figures from that. I reckon Fiji is pretty good on performance per watt.
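Redoing that arithmetic from the per-game gains and the TPU wattages quoted above:

```python
# Average the per-game min-FPS gains listed above and derive a rough
# perf/watt ratio from TPU's typical-gaming-power figures (163W / 246W).
gains_pct = {"AOTS": 18, "Fallout 4": 36, "Division": 34,
             "Warhammer": 11, "Witcher 3": 26}   # Fury X lead over RX 480

avg_gain = sum(gains_pct.values()) / len(gains_pct)
print(f"Average Fury X lead: {avg_gain:.0f}%")   # 25%

# If the Fury X is ~25% faster while drawing 246W vs the RX 480's 163W:
relative_ppw = (1 + avg_gain / 100) / 246 * 163
print(f"Fury X perf/watt relative to RX 480: {relative_ppw:.2f}x")
```

By this crude single-wattage method the Fury X lands around 0.83x the RX 480's perf/watt, which is exactly why measuring power per game, rather than applying one "typical" figure across all titles, would be the fairer comparison.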


----------



## Flamingo

A pity that AMD's been having a rough time with releases.

290X - blower issue
390 series - rebrand issue
Fury - overhyped? Not an overclocker's dream? Unable to overtake the 980 Ti at time of release.
480 - power issue


----------



## Orthello

Quote:


> Originally Posted by *Flamingo*
> 
> Pity that AMD"s been having a rough time with releases.
> 
> 290x - blower issue
> 390 series - rebrand issue
> Fury - overhyped? not overclockers dream? unable to overtake 980 Ti at time of release.
> 480 - power issue


Yeah, it is unfortunate this occurs in the initial reviews of the ref cards.

Luckily either drivers or AIB cards tend to fix most of the issues later on vs the competition.
The 390 was a rebranded 290 but with improved memory speed and clocks, so I guess the same as the 770 vs the 680; NV could/should have had the rebrand issue there too.
The 480 power issue is a bit disappointing; I think the only cards that should be considered are the AIBs to come, just due to how close to the power limits the ref design is.

Still, most of the people that buy AMD have the sense to look at all of this objectively and expect performance to improve with drivers etc., which does happen. In DX12 the Fury(X) cards are now faster than the 1070, so there has been progress.


----------



## 12Cores

Quote:


> Originally Posted by *GruntXIII*
> 
> Well...overclocked my Nano a bit and 1100 MHz core clock at +12 mV seem to be the sweet spot (didn't change the RAM frequency). If I go higher with clock it gets unstable. If I then raise the voltage it starts clocking down. I guess I'm fine with it...so far everything runs great
> 
> 
> 
> 
> 
> 
> 
> 
> Btw., temperatures under water are really good with this card. After a few hours of Doom, the GPU temp was at 37 °C (water temperature somewhere around room temperature at 28°C)
> 
> @Tgrove Nice Rig


Quote:


> Originally Posted by *Flamingo*
> 
> Pity that AMD"s been having a rough time with releases.
> 
> 290x - blower issue
> 390 series - rebrand issue
> Fury - overhyped? not overclockers dream? unable to overtake 980 Ti at time of release.
> 480 - power issue


Well said. I hope the AIB 480s clock north of 1.4GHz on average for less than $260 US; they need some good news, and fast.


----------



## flopper

Quote:


> Originally Posted by *Alastair*
> 
> -Guys a friend of mine is in the market for new cards. He has his heart set on a pair of 480's. But I am a believer of the age old saying, 1 Big card over two little ones. And he is tossing and turning between the Fury X and 2 480's, especially since there is a Fury X he can get for 449 dollars. I said he should go Fury X. Thoughts?


A single card is always a better gaming experience.
CrossFire/SLI is hit and miss.


----------



## Kamikaze127

Well, as an old timer I was waiting for the GTX 1080, and then when those sold out, for CrossFire RX 480s. I can say the best thing that happened was the price drop when all the new tech came out and sold out. Picked up my XFX R9 Fury X today at Fry's, and had them price match Newegg at $459.99. By far the best (and quietest) card I'll be getting in the next 12 months for less than $500, due to tech economics and the price hikes on Polaris/Pascal. Not going to wait for Vega; my old 7870 GHz Edition was starting to choke on newer titles.





https://www.techpowerup.com/gpuz/details/gce87


----------



## Waitng4realGPU

Quote:


> Originally Posted by *Kamikaze127*
> 
> Well, as an old timer I was waiting for the GTX 1080, and then when those sold out; Crossfire RX480's. I can say the best thing that happened was the price drop when all the new tech came out and sold out. Picked up my XFX R9 Fury X today at Fry's, and had them price match Newegg for $459.99. By far the best (and quietest) card I'll be getting in the next 12 months for less than $500 due to tech economics and the price hikes on Polaris/Pascal. Not going to wait for Vega, my old 7870 Ghz Edition was starting to choke with newer titles.


It's not a terrible purchase, but it will suck if an AIB 480 performs at 90% of a Fury X for up to $200 less. It won't include the AIO cooling at that price, though, of course.


----------



## Kamikaze127

Quote:


> Originally Posted by *Waitng4realGPU*
> 
> It's not a terrible purchase but it will suck if an AIB 480 performs at 90% of a fury X for up to $200 cheaper. Won't include the AIO cooling for that price though of course.


Yeah, I agree with you. I mean, that is if the prices are reasonable. The fact that all of these new cards are retailing $100-150 over MSRP is just killing me though. Not to mention, this route I do get the AIO cooler and mature drivers. Mine doesn't even display any pump whine that I would consider a dealbreaker, like many of the first Fury Xs supposedly did.


----------



## gupsterg

Was just browsing HWBot RX 480 single card subs and found one at GPU: 1460MHz RAM: 2250MHz.

These are tess.tweak results as HWBot allows them.

My 24/7 OC vs RX 480 1460/2250

My HWBot sub of 1175/545 vs RX 480 1460/2250


----------



## Orthello

Quote:


> Originally Posted by *gupsterg*
> 
> Was just browsing HWBot RX 480 single card subs and found one at GPU: 1460MHz RAM: 2250MHz.
> 
> These are tess.tweak results as HWBot allows them.
> 
> My 24/7 OC vs RX 480 1460/2250
> 
> My HWBot sub of 1175/545 vs RX 480 1460/2250


It'd be interesting to see like for like with the 480, e.g. the same tess tweaks. Your CPU is a lot weaker than that 480 system's; the physics score really highlights that, so it's really a matter of whether the tess tweaks mean more than the CPU etc.

I think that is the issue with comparing 3DMark results across different CPU/memory systems. To a degree it will alter the graphics scores also.

1460 is an interesting OC... I wonder what card it was on.


----------



## gupsterg

The RX 480 bench is a tess.tweak result (i.e. tess=off), and so are mine. Every HWBot'er who knows you can have tess. off will do that.

Here is the RX 480 result on its own showing the tess.tweak done, link. The HWBot sub page for that RX 480 bench.

My HWBot Fury X sub page and HWBot sub 3DM score link, my 24/7 OC 3DM link.

So far Gorod's result is highest on HWBot 3DM FS, subs page.

The combined test loads the CPU as well; that's why most will say if you're just gaming, get an i5 vs an i7, regardless of whether it's Skylake, Devil's Canyon, etc.
Quote:


> Combined Test
> 
> Now the torture really starts: both CPU and GPU are pushed hard at the same time. The GPU load is a mix of Graphics test 1 and 2, using tessellation, volumetric illumination, fluid simulation, particle simulation, FFT based bloom and Depth of Field. The CPU is pushed by creating the rigid body physics of the breaking statues (background). The test runs 32 world simulations running in separate threads, each containing 1 statue crumbling into 113 parts. On top of that: 16 invisible rigid bodies (in all but one world). The simulations run on one thread per available CPU core.


Link:- [GUIDE] 3DMark Score Calculation - how to calculate 3DMark Scores

The linked ROG guide was put together by a member working from the official 3DMark technical guide.

Author : Henkenator68NL
Date : 1-july-2013
Source : Official Futuremark 3DMark Technical Guide
http://www.futuremark.com/downloads/3DMark_Technical_Guide.pdf
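As a sketch of the weighted-harmonic-mean scoring the guide describes (the weights below are commonly cited for Fire Strike but should be treated as assumptions; verify them against the technical guide linked above):

```python
# Overall 3DMark Fire Strike score as a weighted harmonic mean of the
# three sub-scores. The weights are assumed values; check the official
# technical guide for the exact figures.
def fire_strike_overall(graphics, physics, combined,
                        weights=(0.75, 0.15, 0.10)):
    subs = (graphics, physics, combined)
    return sum(weights) / sum(w / s for w, s in zip(weights, subs))

# A weak CPU drags the physics (and combined) sub-scores down, which is
# why the same GPU can post different totals on different CPUs.
print(round(fire_strike_overall(15000, 9000, 7000)))
```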
Quote:


> Originally Posted by *Orthello*
> 
> I think that is the issue with comparing to 3d mark with different cpus / mem system. To a level it will alter the graphics scores also.


In 3DM13, as long as the CPU is not bottlenecking the GPU, the Graphics score / Graphics tests 1 & 2 are comparable.









----------



## Flamingo

So I was browsing Imgur and opened a video gif, or whatever it's called there.

The screen flickered, and a few seconds later the card actually went off.

It usually happens on Reddit, but the card never "shuts down" during that flicker.

Using Firefox 47


----------



## gupsterg

Seems very similar to the "display corruption" issue that several owners have on AMD Community.


----------



## Spartoi

So I've been playing Mirror's Edge: Catalyst, and using CAM I've noticed that my Fury's clock speed is never stable and always fluctuates (as does my frame rate). Is there a setting in Crimson or other software (not Afterburner) that can lock my GPU's clock speed?


----------



## gupsterg

Try switching "power efficiency" off in "gaming" > "global settings" to improve clock stability. For FPS you could try "frame rate target control" to achieve less frame fluctuation. FRTC can be set per game if there is a profile or you make one, IIRC; PE can only be set as a global value.


----------



## Spartoi

Quote:


> Originally Posted by *gupsterg*
> 
> Try switching "power efficiency" off in "gaming" > "global settings" to improve clock stability. For FPS you could try "frame rate target control" to achieve less frame fluctuation. FRTC can be set per game if there is a profile or you make one, IIRC; PE can only be set as a global value.


Thanks. Turning Power Efficiency off in Crimson did the trick.


----------



## gupsterg

No worries








Do be aware that with PE off you will have clock bounce in desktop use (i.e. browsing, office, etc.); it will not stick to 300MHz. I wish they'd allow this setting in profiles.


----------



## Flamingo

Pfft, it happened again on Imgur; it flickered 3 times before the card crashed itself.

Going to turn off HW acceleration in Firefox and update drivers from WHQL to 16.6.2.

When my 7970 used to crash, somehow Catalyst was able to reset the card... has that feature been removed?


----------



## gupsterg

I don't think the display corruption/flickering is a driver crash. There are several owners in the AMD Community thread who have had this issue for months. Some have RMA'd cards and it still occurs. They have tried all sorts of things like BIOS updates/drivers, different cables/ports, monitors, etc. Some are plagued with it happening intermittently/repeatedly with no real pattern as to when and how it occurs.

AMD responded in the thread early on; they have basically washed their hands of it now, as the fault was not reproducible on the RMA'd cards they tested in their labs. None of the members have found a solution that resolves the issue.

I only had it once, on 1 card out of the 7 Fiji ones I've had. I was browsing the web with a video running in Windows Media Player. Stock ROM, driver defaults, v16.3.2 WHQL. The display corruption only went away with a reboot, and the card was disposed of shortly afterwards. I have kept using v16.3.2 WHQL with at least 3 cards across the 2 different rigs and have not encountered the issue again.


----------



## xTesla1856

Quick question: How do I make Afterburner apply my HBM overclock to both cards, not just the primary card? RTSS also shows just the first card at the higher mem clock; the second card is stuck at 500MHz and I can't move the slider for it. Core clocks are synced nicely. Using the rig in my sig. Thanks!

EDIT: TriXX syncs HBM clock speeds, but now one of my cores is at 1160 and the other one is at 1150. Strange


----------



## Jflisk

Quote:


> Originally Posted by *Flamingo*
> 
> So I was browsing Imgur and opened a video gif or whatever its called there.
> 
> The screen flickered and few secs later, the card actually went off.
> 
> it usually happens on reddit, but the card never "shuts down" during that flicker
> 
> Using FireFox 47


Quote:


> Originally Posted by *gupsterg*
> 
> Seems very similar to the "display corruption" issue that several owners have on AMD Community.


I have had the corruption in question in the past, and also posted about it in the AMD forum above. I have not seen it with the latest beta driver. The only problem with the corruption is that it is very sporadic, to say the least. It creates a screen tear; the way to fix it quickly is to change the resolution, as you can still see the screen enough to do that.


----------



## pdasterly

Still waiting for the Pro Duo price cut, thanks AMD


----------



## gupsterg

Quote:


> Originally Posted by *xTesla1856*
> 
> Quick question: How do I make Afterburner apply my HBM overclock to both cards, not just the primary card? RTSS also shows just the first card at higher mem clock, the second card is stuck at 500mhZ and I can't move the slider for it. Core clocks are synced nicely. Using the rig in my sig. Thanks!
> 
> EDIT: Trixx syncs HBM clock speeds, but now one of my cores is at 1160 and the other one is at 1150. Strange


No idea, I never tried CF. I know it sounds nutty since I had the cards to do it, but due to the PSU/CPU in my rig I didn't try it.

As no one has replied: Unwinder on Guru3D (author of MSI AB) may know for AB; for TriXX, W1zzard on TPU.


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> Quick question: How do I make Afterburner apply my HBM overclock to both cards, not just the primary card? RTSS also shows just the first card at higher mem clock, the second card is stuck at 500mhZ and I can't move the slider for it. Core clocks are synced nicely. Using the rig in my sig. Thanks!
> 
> EDIT: Trixx syncs HBM clock speeds, but now one of my cores is at 1160 and the other one is at 1150. Strange


In TriXX, under the settings menu, are you selecting the synchronize checkbox? And if not, are you selecting each card individually from the drop-down menu at the top of the TriXX settings menu? Just curious.


----------



## josephimports

Quote:


> Originally Posted by *xTesla1856*
> 
> Quick question: How do I make Afterburner apply my HBM overclock to both cards, not just the primary card? RTSS also shows just the first card at higher mem clock, the second card is stuck at 500mhZ and I can't move the slider for it. Core clocks are synced nicely. Using the rig in my sig. Thanks!
> 
> EDIT: Trixx syncs HBM clock speeds, but now one of my cores is at 1160 and the other one is at 1150. Strange


For AB, you'll need to remove/disable GPU1, follow the same procedure to unlock the memory slider, verify, and reinstall/enable GPU1.
Quote:


> Originally Posted by *bluezone*
> 
> In Trixx under settings menu are you selecting the synchronize check box and if you are not, are you selecting each card individually from the drop down menu at the top of Trixx setting menu? Just curious.


----------



## xTesla1856

Quote:


> Originally Posted by *josephimports*
> 
> For AB, you'll need to remove/disable GPU1, follow the same procedure to unlock the memory slider, verify, and reinstall/enable GPU1.


Thanks, will do that when I get home today! So far, both cards will do 1150MHz, but require +96mV. The coolers keep up though, 72°C max on the top card.


----------



## Alastair

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *josephimports*
> 
> For AB, you'll need to remove/disable GPU1, follow the same procedure to unlock the memory slider, verify, and reinstall/enable GPU1.
> 
> 
> 
> Thanks, will do that when I get home today! So far, I both cards will do 1150mhz, but require +96mV. Coolers keep up though, 72°C max on the top card

You sure you aren't negative scaling? As far as I am aware, negative scaling starts from around +49mV.


----------



## xTesla1856

Quote:


> Originally Posted by *Alastair*
> 
> you sure you aren't negative scaling? As far as I am aware negative scaling starts from around 49mv.


I was benching the other day and the only way my cards pass stability testing at 1150/550 is with +96mV in Afterburner. Any less and the driver would crash midway through Firestrike. I'll test again later to verify though.


----------



## xTesla1856

Just tested again, +48mV and I get artifacting and eventually a driver crash in Firestrike. At +96mV, I get a pass without a hitch.


----------



## Alastair

Yes, but what I am saying is: are you sure you aren't getting negative scaling at that voltage? What I mean by that is, FPS and scores tend to start DROPPING at voltages above the +50mV range.


----------



## Thoth420

Sad to say, guys, but I am going to ditch Kung Fury X and go with an X99 platform with a CLC for the CPU and an air-cooled EVGA 1080 ACX (most likely). I abhor Nvidia, but as my rig was shop-built it is either a Fury X (non-water-cooled), a Fury, or a 1080 to stay inside the budget for a rebuild. I don't want to take a risk with the sound issues of the Fury X, and I never planned on a Fury. I was also told I could wait for the Titan P and get that, and drop my 1.2TB Intel NVMe drive down to the 400GB model to make the cost fit. I am probably going to lose my white theme, have to find a new G-Sync panel, and sell this great BenQ, but I don't have the skillset to pull things out of this custom loop. This way, if something craps out I can just replace it myself and save the headache.

I wish I could stay with the red team, but I don't see a single air-cooled flagship on the horizon anywhere to push 4K (if I am going to use a 1080, why not?), and again I am at the mercy of what my builder can get before Deus Ex comes out in August, as that is the one game I refuse to play on my Xbone while this water-cooled paperweight just sits here.

The rebuild will be everything including the chassis, less peripherals.

P.S. I would love to drop the full specs here when they are finalized to get some feedback. If I can stick with AMD and not lose out in my specific scenario I would love to, but I am under time constraints... been without a main system for too long. I also do not use multi-GPU configs, so anything with Xfire would take a lot of convincing.


----------



## broadbandaddict

Newegg has the XFX Fury X for $400 now. I ordered mine last month with the $20 rebate but they gave me $60 Newegg credit after chatting with them. Might be worth a shot if you purchased recently.


----------



## Thoth420

Quote:


> Originally Posted by *broadbandaddict*
> 
> Newegg has the XFX Fury X for $400 now. I ordered mine last month with the $20 rebate but they gave me $60 Newegg credit after chatting with them. Might be worth a shot if you purchased recently.


Awesome deal, and XFX has the best AMD GPU warranty support; they even allow modding.


----------



## Kana-Maru

Quote:


> Originally Posted by *Thoth420*
> 
> Sad to say guys but I am going to ditch Kung Fury X and go with an X99 platform with a CLC for the CPU and an air cooled 1080 EVGA ACX(most likely). I abhor Nvidia but as my rig was shop built it is either a Fury X(non water cooled), a Fury or a 1080 to stay inside the budget for a rebuild. I don't want to take a risk with issues of sound from the Fury X and I never planned on a Fury.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I was also told I could wait for the Titan P and get that and drop my 1.2 TB Intel NVME drive down to the 400GB model to make the cost fit. I am obviously probably going to lose my white theme and have to find a new G Sync panel and sell this great BenQ but I don't have the skillset to pull things out of this custom loop so this way if something craps out I can just replace it myself and save the headache.
> 
> I wish I could stay with the red team but I don't see a single air cooled flagship on the horizon anywhere to push 4K(if I am going to use a 1080 why not...?) and again I am at the mercy of what my builder can get before Deus Ex comes out in AUGust
> 
> 
> 
> 
> 
> 
> 
> as that is the one game I refuse to play on my Xbone while this water cooled paperweight just sits here.
> 
> Rebuild will be everything including chassis less peripherals.
> 
> P.S. I would love to drop the full specs here when they are finalized to get some feedback because if I can stick with AMD and not lose out in my specific scenario I would love to but I am under time constraints...been out a main system for too long. I also do not use multi GPU configs so anything with Xfire would take a lot of convincing.


^ Well, I suppose that you are looking for 60fps @ 4K, right? The Fury X is water cooled, but I'm sure you mean custom loop vs. closed loop. I'm guessing you probably won't hear much news about AMD's next flagship until later this year [Q4?]. They are on the affordable mainstream path to regain some market share, and so far it has been working.

I'm still enjoying 4K for the games I play, but I understand some people "require" 60fps. I don't require 60fps, but I do require a smooth experience with little to no screen tearing and no micro-stutter. I'm fine with 35+. The GTX 1080 is the latest and greatest from Nvidia, but so far it's coming at the price of an arm and a leg. The price gouging and rare availability are real.

In the meantime I'm waiting on Vega, and I might pick up a non-reference RX 480 OC for benchmarking purposes. I might also pick up the GTX 1060 for benchmarking as well if it isn't too expensive, but I did hear it's going to have a Founders Edition too, so who knows.

At the moment I'm running 4K benchmarks on several games, and hopefully I can get those results updated soon. Some of the games are re-tests with new drivers and game patches; some are new games I've never tested.

These are all of the games I'm testing with 100% MAX SETTINGS @ 4K. Some games will be tested at max settings with AA disabled, since in my experience with the image quality I really don't need AA at 4K. I might also run some benchmarks with tessellation set to x8 or lower, and completely disabled, to see if it makes a difference at 4K. At the moment I'm running all settings maxed.

Fury X Stock @ 4K one year later [games I'm testing]:

-Doom 2016
-MGSV: The Phantom Pain
-Ryse: Son of Rome
-The Evil Within
-Hitman DX12
-Rise of the Tomb Raider
-Shadows of Mordor + 6GB HD Texture Pack
-Crysis 3
-Metro: 2033 Redux
-Metro: Last Light Redux
-Rainbow Six Siege
-Batman: Arkham Knight
-The Witcher 3

I've already gone through the data and got some results for the games above. Here's a little data for you guys to read before the article is posted.
My FPS Min Caliber™, long story short, is basically the 97th percentile, by the way.

_[There's a new patch for Doom 2016, but I haven't benchmarked the game using the latest Patch yet]_
*Doom 2016* @ 4K - Max Graphical Settings + AA [TSAAA (8TX)] *Enabled*:
Level: Know Your Enemy
*-FPS Avg: 45.22*
-FPS Max: 68
-FPS Min Caliber™: 34.5

*Doom 2016* @ 4K - Max Graphical Settings + *AA Disabled*:
Level: Know Your Enemy
*-FPS Avg: 48*
-FPS Max: 79.4
-FPS Min Caliber™: 36.1

*Metal Gear Solid 5: The Phantom Pain* @ 4K - Max Graphical Settings:
Level: Mission 30: Sahelanthropus Intro Cinematic
*-FPS Avg: 36.13*
-FPS Max: 55.1
-FPS Min Caliber™: 25.6

*Ryse: Son of Rome* @ 4K - Max Graphical Settings:
Level: Chapter 4
*-FPS Avg: 40fps*
-FPS Max: 56.4
-FPS Min Caliber™: 33

*The Evil Within* @ 4K - Max Graphical Settings:
Level: Chapter 2
*-FPS Avg: 39fps*
-FPS Max: 49
-FPS Min Caliber™: 31.9

*Hitman 2016* @ 4K - Max Graphical Settings [DX12 + Patch 1.1.2]:
Level: Episode 2: Sapienza
*-FPS Avg: 44fps*
-FPS Max: 68.1
-FPS Min Caliber™: 36.5

Remember that all of these games are 100% maxed out, so that includes max AA unless otherwise stated. I can easily drop some graphical settings and gain more FPS and lower frametimes. I spent a decent amount on my Fury X [$649+] and I always want to know my true performance with all settings maxed. So far no complaints from me at 4K resolution gaming. The 4GB of HBM is holding up much better than I thought. It's only going to get better from here since we are waiting on another flagship.

I still have a lot of games to benchmark. 1440p is great, but I am enjoying 4K gaming, believe it or not. I'm going to perform some off-screen recordings that show my FPS on-screen; the performance hit is just too much when trying to record 1440p and 4K. I'm thinking about picking up another Fury X as well since they are getting cheaper. Then again, I think a single Fury X could hold me over until next summer.
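Since the "FPS Min Caliber" figure above is described as basically the 97th percentile, here is a rough sketch of how such a minimum-FPS number could be derived from a frametime log. This is only an illustration under assumptions (nearest-rank percentile of frametimes, made-up sample data), not Kana-Maru's actual tooling:

```python
# Hedged sketch: a "97th percentile" minimum-FPS metric from frametimes.
# Convention assumed here: take the 97th-percentile frametime (i.e. the
# slowest-3% boundary) and convert it to FPS. Sample data is made up.
def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list (p in 0..100)."""
    k = max(0, min(len(sorted_vals) - 1, round(p / 100 * (len(sorted_vals) - 1))))
    return sorted_vals[k]

frametimes_ms = sorted([20.1, 22.5, 21.0, 25.3, 29.0, 19.8, 23.4, 31.2, 22.0, 24.7])
slow_boundary = percentile(frametimes_ms, 97)  # 97th-percentile frametime (ms)
min_fps = 1000.0 / slow_boundary               # convert ms/frame to FPS
print(round(min_fps, 1))
```

On this made-up sample the slow-frame boundary works out to 31.2 ms, i.e. roughly 32 FPS as the "min" figure.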


----------



## Thoth420

Quote:


> Originally Posted by *Kana-Maru*
> 
> ^ Well I suppose that you are looking for 60fps @ 4K right? The Fury X is water cooled, but I'm sure you mean custom loop and closed loop. I'm guessing that you probably won't hear much news about AMD next flagship until later this year [Q4?]. They are on the affordable mainstream path to regain some market share and so far it has been working.
> 
> I'm still enjoying 4K for the games I play, but I understand some people "require" 60fps. I don't require 60fps, but I do require a smooth experience with little to no screen tearing and no micro-stutter. I'm fine with 35+. The GTX 1080 is the latest and greatest from Nvidia, but so far it's coming with the price of a arm and a leg. The price gouging and rare availability is real.
> 
> In the meantime I'm waiting on Vega and I might pick up a non reference RX 480 OC for benchmarking purposes. I might also pick up the GTX 1060 if it isn't too expensive for benchmarking purposes as well, but I did hear that it's going to have a Founders Edition as well so who knows.
> 
> At the moment I'm running 4K benchmarks on several games and hopefully I can get those results updated soon. Some of the games are re-test with new drivers and game patches. Some are new games I've never tested.
> 
> These are all of the games I've testing with 100% MAX SETTINGS @ 4K. Some games will be tested with Max Settings with AA disabled since I really don't need to use AA at 4K from my experience with the image quality. I might also run some benchmarks with the Tessellation set to X8 and lower and completely disabled to see if it makes a difference at 4K. At the moment I'm running all settings maxed.
> 
> Fury X Stock @ 4K one year later [games I'm testing]:
> 
> -Doom 2016
> -MGSV: The Phantom Pain
> -Ryse: Son of Rome
> -The Evil Within
> -Hitman DX12
> -Rise of the Tomb Raider
> -Shadows of Mordor + 6GB HD Texture Pack
> -Crysis 3
> -Metro: 2033 Redux
> -Metro: Last Light Redux
> -Rainbow Six Siege
> -Batman: Arkham Knight
> -The Witcher 3
> 
> I've already went through the data and got some results for the games above. Here's a little data for you guys to read before the article is posted.
> My FPS Min Caliber™ long story short is basically = 97th Percentile by the way
> 
> _[There's a new patch for Doom 2016, but I haven't benchmarked the game using the latest Patch yet]_
> *Doom 2016* @ 4K - Max Graphical Settings + AA [TSAAA (8TX)] *Enabled*:
> Level: Know Your Enemy
> *-FPS Avg: 45.22*
> -FPS Max: 68
> -FPS Min Caliber™: 34.5
> 
> *Doom 2016* @ 4K - Max Graphical Settings + *AA Disabled*:
> Level: Know Your Enemy
> *-FPS Avg: 48*
> -FPS Max: 79.4
> -FPS Min Caliber™: 36.1
> 
> *Metal Gear Solid 5: The Phantom Pain* @ 4K - Max Graphical Settings:
> Level: Mission 30: Sahelanthropus Intro Cinematic
> *-FPS Avg: 36.13*
> -FPS Max: 55.1
> -FPS Min Caliber™: 25.6
> 
> *Ryse: Son of Rome* @ 4K - Max Graphical Settings:
> Level: Chapter 4
> *-FPS Avg: 40fps*
> -FPS Max: 56.4
> -FPS Min Caliber™: 33
> 
> *The Evil Within* @ 4K - Max Graphical Settings:
> Level: Chapter 2
> *-FPS Avg: 39fps*
> -FPS Max: 49
> -FPS Min Caliber™: 31.9
> 
> *Hitman 2016* @ 4K - Max Graphical Settings [DX12 + Patch 1.1.2]:
> Level: Episode 2: Sapienza
> *-FPS Avg: 44fps*
> -FPS Max: 68.1
> -FPS Min Caliber™: 36.5
> 
> Remember than all of these games are 100% maxed out so that includes AA max unless otherwise stated. I can easily drop some necessary graphical settings and gain more fps and lower frametimes. I spent a decent amount on my Fury X [$649+] and I always want to know my true performance with all settings maxed. So far no complaints from me at 4K resolution gaming. The 4GBs HBM is holding up much better than I thought. It's only going to get better from here since we are waiting on another flagship.
> 
> I still have a lot of games to benchmark. 1440p is great, but I am enjoying 4K gaming believe it or not. I'm going to perform some off-screen recordings that shows my FPS on-screen. The performance hit is just to much when trying to record 1440 and 4K. I'm thinking about picking up another Fury X as well since they are getting cheaper. Then again I think a single Fury X could hold me over until next summer.


It isn't about 4K, it is about the shop's returns on custom water loop parts. I would take a 1080 over a Fury X given it won't make a difference in price. I am just going 4K because almost everything I plan on playing is single-player eye candy anyway, so I figure why not, since I have someone interested in my BenQ, and since it is cherry they are offering what I paid for it, which was 600 USD.


----------



## xTesla1856

Quote:


> Originally Posted by *Alastair*
> 
> Yes but what I am saying is are you sure you aren't getting negative scaling at that voltage, what I mean by that is, FPS and scores tend to start DROPPING at voltages above the +50mv range.


You were right. I benched GTA V 3 times and the results were quite sobering: there was virtually no difference between running 1150/550/+96mV and running 1075/500/±0mV. We're talking one, maybe two FPS difference between 3 runs each. Might as well start undervolting to lower temps and power draw.
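For a rough sense of what undervolting buys, dynamic GPU power scales roughly with frequency times voltage squared (P ∝ f·V²). A back-of-the-envelope sketch, using a hypothetical ~1.2V stock VID at 1050MHz (actual VIDs vary per card, and leakage/fan power are ignored):

```python
# Rough dynamic-power scaling estimate: P ~ f * V^2.
# Baseline numbers are hypothetical, for illustration only.
def relative_power(f_mhz: float, v: float, f0_mhz: float, v0: float) -> float:
    """Dynamic power at (f, v) relative to a baseline (f0, v0)."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

baseline = (1050, 1.200)  # assumed stock clock (MHz) and DPM7 VID (V)
print(relative_power(1050, 1.200 - 0.048, *baseline))  # -48 mV undervolt
print(relative_power(1150, 1.200 + 0.096, *baseline))  # OC with +96 mV
```

Under these assumptions a -48mV undervolt at stock clocks cuts dynamic power by roughly 8%, while 1150MHz at +96mV adds well over 25%, which lines up with the temperature and wall-power differences being discussed.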


----------



## gupsterg

Yeah, this aspect sucks on Fiji.

I have tried a few different methods of applying voltage, with combinations of PowerPlay VID and VRM controller offset, and it does not help.

It does not matter what PL or driver options we set: at x voltage increase a card will scale negatively on performance, regardless of whether it's at stock clocks or OC clocks.

At what voltage a card starts negatively scaling on performance with a voltage increase varies; it was different for each of the 7 cards I tested.


----------



## Kamikaze127

Quote:


> Originally Posted by *broadbandaddict*
> 
> Newegg has the XFX Fury X for $400 now. I ordered mine last month with the $20 rebate but they gave me $60 Newegg credit after chatting with them. Might be worth a shot if you purchased recently.


Dang, I just bought my Fury X on the 3rd, but I went to my local Fry's and had them price match Newegg at $459.99. At least I get a $30 rebate out of Fry's, not to mention I don't really order from Newegg anymore. But $399.99 is an outstanding deal; they must be trying to empty out their stock. I assume AMD isn't going to build any more Fury cards until Vega.


----------



## xTesla1856

Quote:


> Originally Posted by *gupsterg*
> 
> Yeah this aspect sucks on Fiji.
> 
> I have tried a few different methods of applying voltage with combination of PowerPlay VID and VRM controller offset and does not help.
> 
> It does not matter what PL or driver options we set, at x voltage increase a card will scale negatively on performance, regardless if at stock clocks or OC clocks.
> 
> At what voltage a card starts negatively scaling on performance with voltage increase depends on each card from 7 I tested.


I got 1075/500 running at -48mV, reducing my max temps by about 3°C. Gonna test tonight with my Kill-a-watt to compare system power draw.


----------



## gupsterg

That is a good result in the context of reducing voltage and still getting a small OC. That will probably be the most optimal OC all things considered; mine is 1115MHz.

My card @ stock settings has a DPM 7 VID of 1.212V (see the Fiji bios mod OP to get your card's stat), and I can OC to 1115MHz without any voltage increase. With 1.243V in the ROM I get 1135MHz; 1145MHz needs 1.268V. Anything past 1145MHz I start running into the scaling issue due to the VID requirement. My OCs have been tested for nutty amounts of time, for example ~3hrs each of 3DM FS loop, Heaven, Valley and at times up to 48hrs [email protected]

For the HBM to be stable @ 545MHz I need to add +25mV via ROM to MVDDC. I've been running 1145/545 solid for a few weeks now and have not had an issue in games/normal use.


----------



## xTesla1856

How much benefit would I get out of a custom BIOS as opposed to Sapphire's OC BIOS on the Nitro Fury?


----------



## Flamingo

Rise of the Tomb Raider finally supports async compute with the latest patch:


----------



## xTesla1856

Quote:


> Originally Posted by *Flamingo*
> 
> Rise of Tomb Raider finally supports async compute with the latest patch:


Might finally buy the game then. Does it support Crossfire?


----------



## Flamingo

Quote:


> Originally Posted by *xTesla1856*
> 
> MIght finally buy the game then. Does it support Crossfire?


Supposedly according to the patch notes.
Quote:


> Adds DirectX12 Multi-GPU support. Many of you have requested this, so DirectX 12 now supports NVIDIA SLI and AMD CrossFireX configurations.
> 
> The benefits of DirectX 12 are still the same, but now you are less likely to be GPU bottlenecked and can reach higher framerates due to the improved CPU utilization DirectX 12 offers.
> 
> Adds utilization of DirectX 12 Asynchronous Compute, on AMD GCN 1.1 GPUs and NVIDIA Pascal-based GPUs, for improved GPU performance.
> 
> On the latest Windows 10 version, V-sync can now be disabled (Windows Store version), and behavior of disabled V-sync has been improved (Steam version).
> 
> Improvements to stereoscopic 3D rendering, including fixes for NVIDIA Surround and AMD Eyefinity in combination with stereoscopic 3D.
> 
> Improved default settings for certain integrated GPUs.
> 
> Removes the Voidhammer Shotgun for users that do not have Cold Darkness Awakens DLC or have not yet unlocked it by rescuing prisoners in Cold Darkness Awakens.
> 
> A fix for a save issue where some game state could get lost in very rare circumstances.
> 
> This patch force-disables the Steam Overlay when using DirectX 12. This is due to stability issues with the Steam Overlay in combination with DirectX 12.
> 
> A fix for crashes with VXAO enabled on NVIDIA GTX1080 cards.
> 
> A variety of other smaller optimizations, bug-fixes, and tweaks.
> Quote:


----------



## xTesla1856

So only in DX12? I'm on Windows 7.


----------



## bluezone

Quote:


> Originally Posted by *Flamingo*
> 
> Rise of Tomb Raider finally supports async compute with the latest patch:


Excellent, I'll have to check that out. Damn, no Mr. Burns emoji.

REP +1.

EDIT: FPS counters (Steam, FRAPS) apparently do not work in DX12. The game is much, much smoother, and there is no load stutter @ the beginning of levels now.


----------



## gupsterg

@xTesla1856

As you're reducing voltage, a custom ROM will not help you much. If you run into an issue with lowered voltage at, say, desktop use, then a custom ROM may be the better option. If you were increasing voltage then yes, I'd say go custom ROM. When we apply an offset via MSI AB it affects all DPM states, whereas with a ROM we can edit each state as required. The best example of the benefit is when we increase voltage: let's say you need +50mV for 1100MHz. The offset adds to all states (even idle), but you have only OC'd the highest state in MSI AB; with a ROM you can add voltage to just the state that requires it.

If you read the headings in the OP of the Fiji bios mod thread you can make up your own mind whether going custom ROM is beneficial for you.
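To illustrate the global-offset vs. per-state difference, a small sketch (the DPM VID table below is hypothetical, not taken from any real ROM):

```python
# Illustrative only: a hypothetical table of Fiji DPM-state VIDs in volts.
# A software offset (MSI AB style) shifts every state; a ROM edit can
# raise just the one state that needs it.
dpm_vids = [0.900, 0.950, 1.000, 1.050, 1.100, 1.150, 1.180, 1.212]

def apply_offset(vids, offset_v):
    """Software-style offset: every DPM state moves, even idle."""
    return [round(v + offset_v, 3) for v in vids]

def rom_edit(vids, state, new_vid):
    """ROM-style edit: only the chosen state changes."""
    out = list(vids)
    out[state] = new_vid
    return out

print(apply_offset(dpm_vids, 0.050))   # idle (DPM 0) rises to 0.950 too
print(rom_edit(dpm_vids, 7, 1.262))    # only DPM 7 changes
```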


@bluezone

Was reading the ExtremeTech RX 480 review and noted this:


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> @bluezone
> 
> Was reading the extremetech RX 480 review and noted this:-


Very interesting.


----------



## gupsterg

We had "future" tech prior to release.


----------



## utnorris

So I got my Fury X under water finally. I can go to 1140MHz on the GPU without adding voltage. Temps sit around 36°C with ambient around 27°C, and that is under full-load benching. I have not tried upping my memory yet, but I wanted to see where folks are getting with and without voltage, and whether the memory gives a big benefit or only a minor one compared to the GPU.

My specs are in my signature.


----------



## utnorris

As I was looking at my results from previous runs I noticed I was able to bench at 549MHz on the memory, so I went and tried it again. I believe the earlier run had 549MHz listed in error; I was able to run the benchmark again, and my score was higher:

http://www.3dmark.com/3dm/13051184?

This is the comparison, so not too bad for a 49MHz increase on the memory.

http://www.3dmark.com/compare/fs/9214440/fs/9197717

No voltage added either, so I'm quite happy with that. I only moved the Power Limit to +50.


----------



## bluezone

Quote:


> Originally Posted by *utnorris*
> 
> As I was looking at my results from previous runs I noticed I was able to bench at 549Mhz on the memory. So I went and tried it again. I believe the earlier run had 549Mhz in error as I was able to run the benchmark, but my score was higher:
> 
> http://www.3dmark.com/3dm/13051184?
> 
> This is the comparison, so not too bad for 49Mhz increase on the memory.
> 
> http://www.3dmark.com/compare/fs/9214440/fs/9197717
> 
> No voltage added either, so quite happy with that. I only moved the Power Limit to +50.


Not too bad at all. Just watch out for lost textures in FS; they will appear as pop-in in the demo section of FS. If it doesn't happen then you're golden.


----------



## gupsterg

@utnorris

Sweet result of 1140/549 without adding voltage.

Using your result as a comparative to my 1145/545, it seems I'm OK on scaling (I've not done many FS Extreme benches; I usually do FS).

When I first got a Fiji card I had read that HBM clocks in steps. At first I was a sceptic about this clocking aspect, but it seems to be correct, see this post. Basically 549MHz is clocking to 545MHz.
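A quick sketch of what that stepping behaviour would look like; the step table below is hypothetical (only the 549MHz-runs-at-545MHz observation comes from the linked post):

```python
# Hypothetical illustration of HBM clock stepping: the slider value
# snaps down to the highest supported step at or below the request.
# The step list is made up for illustration; per the post, 549 lands on 545.
import bisect

steps = [500, 515, 530, 545, 560]  # hypothetical supported steps (MHz)

def effective_clock(requested_mhz: int) -> int:
    """Snap a requested HBM clock down to the nearest supported step."""
    i = bisect.bisect_right(steps, requested_mhz) - 1
    return steps[max(i, 0)]

print(effective_clock(549))  # snaps down to 545
print(effective_clock(545))  # already on a step, stays 545
```

So under this model a 549MHz slider setting buys nothing over 545MHz, which would explain identical scores at either value.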


----------



## utnorris

Ok, so I was getting a "Time measurement inconsistencies detected during benchmark run." error and had to up my voltage by +18 to make it stable. This is the result:

http://www.3dmark.com/fs/9217068

So overall a 7669 score and GPU score of 8086.

Not too bad as the added voltage doesn't seem to affect temps. I may try to get my GPU up higher, not sure yet as I am pretty happy with these results so far.

Here is my FS bench:

http://www.3dmark.com/fs/9217186

Overall score is 14863

GPU score is 17124


----------



## ukic

Any old 7990 users out here? Leaning towards a Fury X, wondering if it's a huge jump.


----------



## bluezone

Quote:


> Originally Posted by *ukic*
> 
> Any old 7990 users out here? Leaning towards a Fury X, wondering if it's a huge jump.


How about someone who was using HD 7950 CrossFire? It's an improvement with newer drivers above 1080p. I'm running a Nano; the Fury X is better.


----------



## bluedevil

Just gonna leave this here. Tks!

http://www.overclock.net/t/1598316/sponsored-classified-demon-watercooled-cm-mastercase-5-pro-evga-z170-classified-amd-fury-x


----------



## mechwarrior

Hi guys, can you install a universal water block on the Gigabyte Fury OC?


----------



## bluezone

I'm running a lot of benches in FS today. FS is having a hard time registering my GPU clock frequency: this run was @ 1100MHz, but it says I hit a 1350MHz core clock. I wish. LOL.









http://www.3dmark.com/3dm/13082678


----------



## broadbandaddict

Got my waterblock installed on my Fury X today. Gotta love the look of the EK blocks and that sweet GPU die with HBM.











Spoiler: Pics













Idle temp is ~25C (room temp ~24C), max load temp so far is 31C. I'm looking into a modded vBIOS and hoping to get it OCd soon.


----------



## Cannon19932006

Just joined the club and had a few questions about overclocking the Fury (3584)
GPU-Z link

1. What is the average OC for these cards, both core and mem?
2. Do these cards have a similar bit of weirdness to the 390 series, where if you go too far past +50-75mV your max clock actually ends up lower? Or, if my temps permit, can I just max out the voltage slider?
3. What is the preferred Overclocking utility for this card?

Thanks!


----------



## ozyo

Quote:


> Originally Posted by *Cannon19932006*
> 
> Just joined the club and had a few questions about overclocking the Fury (3584)
> GPU-Z link
> 
> 1. What is the average OC for these cards, both core and mem?
> 2. Do these cards have a similar bit of weirdness like the 390 series where if you go too high past 50-75mv your max clock actually is less, or if my temps permit can i just max out the voltage slider?
> 3. What is the preferred Overclocking utility for this card?
> 
> Thanks!


1. 1150/550 for the new bios
2. I don't know
3. Sapphire TriXX


----------



## Cannon19932006

http://www.3dmark.com/fs/9243706

Pushed it all I could to hit that 15k overall mark.


----------



## ozyo

Quote:


> Originally Posted by *Cannon19932006*
> 
> http://www.3dmark.com/fs/9243706
> 
> Pushed it all I could to hit that 15k overall mark.


Did you update the bios?


----------



## ukic

Quote:


> Originally Posted by *broadbandaddict*
> 
> Got my waterblock installed on my Fury X today. Gotta love the look of the EK blocks and that sweet GPU die with HBM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Idle temp is ~25C (room temp ~24C), max load temp so far is 31C. I'm looking into a modded vBIOS and hoping to get it OCd soon.


Pretty!


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I running a lot of bench's in FS today. FS is having a hard time registering my GPU clock frequency. This run was @ 1100 Mhz. but it says I hit 1350 Mhz core clock. I wish. LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/13082678


I've never had this issue and 3DM13 is what I mainly use. I'd report it on the Futuremark forum, there have been other minor issues after the "improved" UI has been implemented that they have fixed after users reported them.


----------



## Performer81

Quote:


> Originally Posted by *ozyo*
> 
> 1-1150/550 for new bios
> 2-i don't know
> 3-sapphire trixx


What do you mean by "new bios"?


----------



## josephimports

Quote:


> Originally Posted by *Performer81*
> 
> What do you mean with new bios?


https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware


----------



## xTesla1856

What's the general consensus on Fiji ASIC-quality? My Furys are at 63% and 57%. Do the same "rules" apply as on Nvidia chips regarding voltage/overclocking?


----------



## Cannon19932006

Quote:


> Originally Posted by *ozyo*
> 
> Did you update the bios?


No, I looked at the link you posted but only see the Nano and Fury X.

edit: The Sapphire Nitro already has an EFI BIOS though.


----------



## ozyo

Quote:


> Originally Posted by *Cannon19932006*
> 
> No, I looked at the link you posted but only see nano and fury x.
> 
> edit: The Sapphire nitro already has a EFI bios though.


Oh sorry, I thought you had a Fury X.


----------



## Cannon19932006

Nope, just a Sapphire Fury Nitro. Is there any way to increase core voltage by more than +75mV in Trixx?

My temps don't go over 55C under Fire Strike load and I'd like to try some more voltage to get over 1180MHz.


----------



## ozyo

As far as I know you can't.


----------



## gupsterg

Quote:


> Originally Posted by *xTesla1856*
> 
> What's the general consensus on Fiji ASIC-quality? My Furys are at 63% and 57%. Do the same "rules" apply as on Nvidia chips regarding voltage/overclocking?


See posts 7 & 9 in the linked thread.


----------



## xTesla1856

Quote:


> Originally Posted by *Cannon19932006*
> 
> Nope, just a Sapphire Fury Nitro, is there any way to increase core more than 75mv in TRIXX?
> 
> My temps don't go over 55c under Firestrike load and I'd like to try some more voltage to get over 1180MHz.


In Afterburner you can go +96mV


----------



## xTesla1856

Quote:


> Originally Posted by *gupsterg*
> 
> See posts 7 & 9 in the linked thread.


So the higher the better, but core voltage will be different?


----------



## Alastair

Quote:


> Originally Posted by *Cannon19932006*
> 
> Nope, just a Sapphire Fury Nitro, is there any way to increase core more than 75mv in TRIXX?
> 
> My temps don't go over 55c under Firestrike load and I'd like to try some more voltage to get over 1180MHz.


Don't bother going higher than +75mV unless you get yourself a custom BIOS. Negative scaling will definitely start kicking in beyond +75mV.


----------



## xkm1948

Anyone else having problem with 16.7.2 driver on FuryX? For the same OC I used to have I am having a lot of BSOD during VR.


----------



## Alastair

Quote:


> Originally Posted by *xkm1948*
> 
> Anyone else having problem with 16.7.2 driver on FuryX? For the same OC I used to have I am having a lot of BSOD during VR.


Nope, nothing here so far. CrossFire Furys at 1100/550.


----------



## Cannon19932006

Quote:


> Originally Posted by *Alastair*
> 
> don't bother going higher than +75mv unless you get yourself a custom bios. Negative scaling will definitely start kicking in beyond +75mv.


Where would be a good place to look for some custom bios for this card?


----------



## xkm1948

Quote:


> Originally Posted by *Alastair*
> 
> Nope nothing here so far. Crossfire Fury's at 1100/550.


Running 16.7.2? That is so weird. Can you try 3DMark Stress Test with FireStrike Ultra?


----------



## Semel

Quote:


> Originally Posted by *ozyo*
> 
> as far I know you can't


You can:

https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/

Mod Trixx to allow any voltage increase. It's pretty easy... but it won't matter because of negative performance scaling when voltage is raised beyond +30mV via Trixx/Afterburner.


----------



## Cannon19932006

Quote:


> Originally Posted by *Semel*
> 
> You can:
> 
> https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/
> 
> Mod Trixx to allow any voltage increase. It's pretty easy... but it won't matter because of negative performance scaling when voltage is raised beyond +30mV via Trixx/Afterburner.


Is the negative scaling true for non-reference cards like my Sapphire as well? I did not notice any negative scaling all the way up to Trixx's max of +72mV.


----------



## bluezone

Quote:


> Originally Posted by *Cannon19932006*
> 
> Is the negative scaling true for the non reference cards like my Sapphire as well? Because I did not notice any negative scaling all the way up to Trixx's max of +72mv.


Try a high clock rate with the power limit at +50% and tessellation off in Fire Strike. This is an extreme load; it will generally show negative scaling if you have it. Except under certain specific conditions, you will only reach a peak of approx. 275 amps of power draw.

Curious to see how you do.

EDITED: Meant amps, not watts.


----------



## Cannon19932006

Quote:


> Originally Posted by *bluezone*
> 
> Try a high clock rate with PL of +50% with Tessellation OFF in Fire Strike. This is an extreme load. It generally will show neg. scaling if you have it. Except under certain specific conditions you will achieve only a Max of approx. 275 Watts power use peak.
> 
> Curious to see how you do.


How should I test, stock clocks and just move voltage around and re-run the test?


----------



## bluezone

Okay, I think I now understand where you are coming from.
What is being talked about is negative scaling with voltage. What this means is that overclocking performance on the Fiji series GPUs is hampered by additional voltage. You might be able to OC to, say, 1175MHz, but if you need to add voltage to achieve a stable overclock, benchmarking scores will decrease.

Depending on your GPU, adding voltage @ stock clocks might not hurt it performance-wise. BIOS ROMs with very specific mods can slightly help this, but of course it depends on your GPU.

I had initially thought that you had already overclocked the card, but it now sounds like you have not. Please correct me if I'm wrong.

Bring me up to speed on what you have done so far.

Here's my current FS score.

http://www.3dmark.com/fs/9243706

Ignore the 1180MHz; FS is having trouble reading my correct GPU frequency. This is @ 1115MHz.


----------



## Cannon19932006

Quote:


> Originally Posted by *bluezone*
> 
> Okay, I think I now understand where you are coming from.
> What is being talked about is negative scaling with voltage. What this means is that overclocking performance on the Fiji series GPUs is hampered by additional voltage. You might be able to OC to, say, 1175MHz, but if you need to add voltage to achieve a stable overclock, benchmarking scores will decrease.
> 
> Depending on your GPU, adding voltage @ stock clocks might not hurt it performance-wise. BIOS ROMs with very specific mods can slightly help this, but of course it depends on your GPU.
> 
> I had initially thought that you had already overclocked the card, but it now sounds like you have not. Please correct me if I'm wrong.
> 
> Bring me up to speed on what you have done so far.
> 
> Here's my current FS score.
> 
> http://www.3dmark.com/fs/9243706
> 
> Ignore the 1180MHz; FS is having trouble reading my correct GPU frequency. This is @ 1115MHz.


This is my FS score!

User: CaNnoN


----------



## bluezone

Make sure Crimson Power Saving is toggled off. Then run your highest stable clock at stock voltage with a PL setting of +50% and tessellation off in Fire Strike. Then run again at max stable clocks with a +75mV offset. If the max-clock GPU score is lower, then you have negative scaling.
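That stock-vs-overvolted comparison is easy to formalize; here's a hypothetical helper in Python (the score numbers below are placeholders, not real runs):

```python
def scaling(stock_score: float, overvolted_score: float) -> str:
    """Classify voltage scaling from two Fire Strike GPU scores:
    one at the highest stable clock on stock voltage, one at max
    stable clocks with a voltage offset applied."""
    delta = (overvolted_score - stock_score) / stock_score * 100
    if delta < -0.5:
        return f"negative scaling ({delta:+.1f}%)"
    if delta > 0.5:
        return f"positive scaling ({delta:+.1f}%)"
    return f"roughly neutral ({delta:+.1f}%)"

# Placeholder scores: the +75mV run coming in lower means negative scaling
print(scaling(15000, 14800))
```

If the offset run's GPU score drops below the stock-voltage run, the extra voltage is costing performance rather than buying headroom.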
Quote:


> Originally Posted by *Cannon19932006*
> 
> This is my FS score!
> 
> User: CaNnoN


Sorry about that, I copied and pasted the wrong link.
I'll run a fresh one in 10 min.


----------



## bluezone

Hot off the press.

http://www.3dmark.com/3dm/13154088

I've had better with older drivers; 16.3.2 is a great driver for benches.


Spoiler: Warning: Spoiler!


----------



## Cannon19932006

Quote:


> Originally Posted by *bluezone*
> 
> Hot off the press.
> 
> http://www.3dmark.com/3dm/13154088
> 
> I've had better with older drivers. 16.3.2 are great driver for benches.
> 
> 
> Spoiler: Warning: Spoiler!


Here's +102mV (actually this is the highest valid score with a 6700K and a Fury recorded by 3DMark):

1200MHz Core 560MHz Mem
http://www.3dmark.com/fs/9291910

This ended up scoring higher than 1180MHz/560MHz at +72mV:
http://www.3dmark.com/fs/9243706


----------



## bluezone

Quote:


> Originally Posted by *Cannon19932006*
> 
> Heres +102mv
> 
> 1200MHz Core 560MHz Mem
> http://www.3dmark.com/fs/9291910
> 
> Ended up higher than 1180MHz and 560MHz at +72mv
> http://www.3dmark.com/fs/9243706


Excellent.

AFAIK you're the first to manage no negative scaling on a stock BIOS. I'm the only one who has managed it on a modded BIOS.

Have you checked your ASIC quality in GPU-Z?
Do you mind if I direct Gupsterg your way? I think he would want a copy of your BIOS and an I2C dump to look at your DPM settings.


----------



## Cannon19932006

Quote:


> Originally Posted by *bluezone*
> 
> Excellent.
> 
> AFAIK you're the first to manage no negative scaling on a stock BIOS. I'm the only one who has managed it on a modded BIOS.
> 
> Have you checked your ASIC quality in GPU-Z?
> Do you mind if I direct Gupsterg your way? I think he would want a copy of your BIOS and an I2C dump to look at your DPM settings.


I have checked it, and it is 64.8%.

And sure I'd be happy to upload a bios dump.


----------



## bluezone

Quote:


> Originally Posted by *Cannon19932006*
> 
> I have checked it, and it is 64.8%.
> 
> And sure I'd be happy to upload a bios dump.


I'll PM him to look out for it.

Thanks

That's about an average ASIC; mine is 62.4%.


----------



## Cannon19932006

Quote:


> Originally Posted by *bluezone*
> 
> I'll PM him to look out for it.
> 
> Thanks


 FijiCaNnoN.zip 103k .zip file


----------



## bluezone

Here's some quick screen shots of your bios in the editor.


Spoiler: Warning: Spoiler!











+1 REP for the share


----------



## xTesla1856

I saw almost the same FS score at 1150/550 with +96mV as at 1075/500 with -48mV, so maybe the scaling is not negative, but neutral.

Anyway, I was toying with the thought of getting a 1080, but I kinda ditched that idea. I feel a love for these Fiji cards I never felt for an Nvidia card.


----------



## gupsterg

@Cannon19932006

Good results on 3DM FS. Cheers for the ROM; the last Nitro ROM I looked at had a small voltage offset already programmed in. On mobile at present but will check your ROM later.

To see if you have an offset now, whilst on the stock ROM with no OC, check your idle voltage: are you at 0.900V?

In the Fiji BIOS mod thread OP there's a section showing how to use AIDA64 to get the VID per DPM; can you run this and attach the results txt to a post? I suspect you have a low DPM 7 VID, which is allowing you to add more voltage before you hit negative performance scaling.

Also use MSI AB to get an i2cdump if you don't mind; again, info is in the BIOS mod thread OP.

Cheers.


----------



## LionS7

Hello everyone. I have a bit of a problem here with my R9 Fury X. Everything is OK until I increase voltage in Afterburner, even with only +12mV. The problem is a quick freeze, and then a black screen with a "no video input" message. When I don't add any voltage, there is no problem. I'm using DisplayPort to DVI; with a single HDMI cable it's the same.

Here are my specs:
R9 Fury X stock, 16.7.2 Crimson
Sabertooth X79
i7-3930K @ 4500MHz 1.32V
16GB Team Vulcan @ 2400MHz 1.65V
Monitor: Philips 227ELH
PSU: Cougar SX 700W

Before this I was on an R9 290 @ 1100/6100 +100mV with no problems. Some help please?


----------



## bluezone

Try undervolting instead and see if it helps.


----------



## Cannon19932006

Quote:


> Originally Posted by *gupsterg*
> 
> @Cannon19932006
> 
> Good results on 3DM FS
> 
> 
> 
> 
> 
> 
> 
> . Cheers for ROM, the last Nitro ROM I looked at they have a small voltage offset already in ROM. On mobile at present but will check your ROM later.
> 
> To see if you have an offset now, whilst on stock ROM/no OC check your idle voltage, are you at 0.900V?
> 
> In Fiji bios mod OP is section showing how to use AiDA64 to gain VID per DPM, can you run this and attach results txt to post? I suspect you have low DPM 7 VID which is allowing you to add more before you hit negative performance scaling with voltage increase.
> 
> Also use MSI AB to gain i2cdump if you don't mind, again info in bios mod thread OP.
> 
> Cheers.


I'll do all that as soon as I get off work. From what I saw at +100mV my card is at 1.35V under light load and has some vdroop under heavier load. This is just what I saw logging voltage while running FS last night.


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> try under volting instead and see if it helps.


Thanks, but it was a failed 8-pin connector - a BitFenix Alchemy. I'll replace it.


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> Thx, but it was failed 8-pins connector - Bitfenix Alchemy. I'll replace it.


I was thinking the extra power draw was causing the problem, but I wasn't expecting that.

Glad to hear you found the problem.


----------



## bluezone

I knew the Vulkan update in DOOM improved FPS for AMD cards, BUT.....


Spoiler: Warning: Spoiler!












...... 52% improvement for Fury X and 51% improvement for Nano. Cool.
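For anyone wanting to reproduce that kind of figure from their own before/after runs, the percentage is just relative FPS gain; a trivial sketch (the FPS numbers below are made-up placeholders, not DOOM results):

```python
def improvement(before_fps: float, after_fps: float) -> float:
    """Percent FPS gain, e.g. OpenGL -> Vulkan in the same scene."""
    return (after_fps - before_fps) / before_fps * 100

# Placeholder numbers: going from 60 to 91 FPS works out to ~52%
print(f"{improvement(60, 91):.0f}%")
```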


----------



## Cannon19932006

Quote:


> Originally Posted by *gupsterg*
> 
> @Cannon19932006
> 
> Good results on 3DM FS
> 
> 
> 
> 
> 
> 
> 
> . Cheers for ROM, the last Nitro ROM I looked at they have a small voltage offset already in ROM. On mobile at present but will check your ROM later.
> 
> To see if you have an offset now, whilst on stock ROM/no OC check your idle voltage, are you at 0.900V?
> 
> In Fiji bios mod OP is section showing how to use AiDA64 to gain VID per DPM, can you run this and attach results txt to post? I suspect you have low DPM 7 VID which is allowing you to add more before you hit negative performance scaling with voltage increase.
> 
> Also use MSI AB to gain i2cdump if you don't mind, again info in bios mod thread OP.
> 
> Cheers.


Idle voltage is 0.92V on the dot

atigpuregCaNnoN.txt 44k .txt file


i2cdump.txt 25k .txt file


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> I was thinking the extra power draw was causing the problem, but I wasn't expecting that though.
> 
> Glad to hear you found the problem.


Well, it wasn't the cable... pff. The problem persists. I'll lower the voltage, OK, but I want to OC the card, and I don't think my PSU is weak. Any other suggestions?


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> Well, it wasn't tha cable... pff. The problem persist. I'll lower the voltage ok, but I want to oc the card, and dont thing that my psu is weak. Some other suggestions ?


What are your best safe GPU settings now, and what are the settings that crash?


----------



## bluezone

If your GPU was stable after undervolting, we can assume that it isn't lacking voltage.
I've finished a little bit of reading. Try switching to Sapphire Trixx. A few people have had problems with crashes using Afterburner after applying voltage. Sounds weird but it might be worth a shot.


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> If your GPU was stable after undervolting, we can assume that it isn't lacking voltage.
> I've finished a little bit of reading. Try switching to Sapphire Trixx. A few people have had problems with crashes using Afterburner when applying voltage. Sounds weird but it might be worth a shot.


That would be weird; in my experience, Trixx was the more unstable program compared to Afterburner. My display starts flickering when adjusting settings, and settings wouldn't always sync across both cards when overclocking the HBM. Also, you get no monitoring in Trixx, which is its biggest flaw. In AB, once you extend the official OC limits and disable power saving mode in Radeon Settings, everything is smooth sailing. Voltage as well as HBM settings work perfectly across both cards. One weird thing is that it seems to pool memory in the monitoring section.


----------



## bluezone

Quote:


> That would be weird, in my experience, Trixx was the more unstable program compared to Afterburner.


I did say it was weird. Except for a much earlier version of Trixx than is being used now, I've never had a problem with it.


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*


I mean in-game overlay monitoring, à la RivaTuner.


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> I mean in-game overlay monitoring, à la RivaTuner.


Yes that's hard to give up.


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> I mean in-game overlay monitoring, à la RivaTuner.


Tesla, do you happen to have RotTR? If you do, could you check if the monitoring works after the last DX12 update? Steam overlay and Fraps no longer function (blocked). It would be nice if Afterburner worked.


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> Tesla, do you happen to have RotTR? If you do, could you check if the monitoring works after the last DX12 update? Steam overlay and Fraps no longer function (blocked). It would be nice if Afterburner worked.


Nope, I don't own the game, and I'm still on Windows 7. So no DX12 goodness for me yet. I plan on getting another SSD for a Windows 10 install.


----------



## bluezone

Ok. I'll have to download afterburner and check it out myself.

Thanks.

+1 REP

P.S. DX12 on RotTR is very smooth after update.


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> Ok. I'll have to download afterburner and check it out myself.
> 
> Thanks.
> 
> +1 REP
> 
> P.S. DX12 on RotTR is very smooth after update.


Thanks


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> If your GPU was stable after undervolting, we can assume that it isn't lacking voltage.
> I've finished a little bit of reading. Try switching to Sapphire Trixx. A few people have had problems with crashes using Afterburner after applying voltage. Sounds weird but it might be worth a shot.


I'll test and report. Is there any chance that my PSU is weak? It has four 12V rails: 20A, 20A, 24A, 24A. Right now the card is connected to the 20A and 24A rails... hmm.
But the problem starts even when I just up the core clock to 1100MHz - only 50MHz more - without even touching the mV. I'm testing in Star Wars Battlefront.


----------



## Thoth420

In my experience all of those OC software packages are about the same as far as stability. The optional RTSS install, however, which a lot of users prefer to use, is not always stable. The only time I have ever had Afterburner or Trixx cause issues is if I do not uninstall it before a driver swap, and even that only occasionally causes issues, but I still nuke the software and its settings before any driver swap if I have it installed with any kind of profile. I'm sure I'll catch some hate for talking badly about RTSS, since the guy who works on it is just one lone dude and takes criticism badly... I couldn't do his job, but that won't stop me from saying that never using it has solved every problem I've had with either of the aforementioned software packages.


----------



## LionS7

It's not RivaTuner. I swapped cables on my PSU so that the card connects to 2x 24A 12V rails - nothing. Sometimes "no video input" after 1 hour, sometimes after 5 min. I tried two BIOS files. When my monitor says "no video input", the card drops load. I have no idea and am hoping someone can help me. Right now I don't have another monitor with a digital input like DVI or HDMI, and I don't have another strong PSU other than my Cougar SX 700W... Returning to Crimson 16.6.2 didn't help... I welcome ideas...


----------



## xkm1948

Fury owners show your 3DMark Time Spy benchmark results!


----------



## Orthello

Quote:


> Originally Posted by *xkm1948*
> 
> Fury owners show your 3DMark Time Spy benchmark results!


Didn't realise it was out... need to see Fury X vs 1070!! And 480 vs 980.


----------



## xkm1948

Quote:


> Originally Posted by *Orthello*
> 
> Didn't relise it was out ... need to see Fury X vs 1070 !! an 480 vs 980.




480 already destroyed 980


----------



## Orthello

Quote:


> Originally Posted by *xkm1948*
> 
> 
> 
> 480 already destroyed 980


Cheers for that +Rep.

Hmm, AMD have some work to do in this bench: Fury X below the 1070 by 21%... not what actual games are showing in DX12, apart from maybe ROTTR. Reading is important - that is a Fury!!

Still, look at the 480... above the 980 and within 14% of the Fury... dang!!

An AIB 480 should be right near the Fury's score, I would pick.


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> Its not the RivaTuner. I swap cables on my psu, so that the card can connect to 2x 24A 12V and nothing. Sometimes "no video input" after 1 hour, sometimes after 5 min. I try two bios files. When my monitor say "no video input", the card drops load. I have no idea and hoping someone to help me. I dont have right now other monitor with Digital input like DVI or HDMI, and dont have other strong PSU, other then my Cougar SX 700W... Return to Crimson 16.6.2 didn't help... Im welcome ideas...


Well, it might be the power supply or a bad plug on the motherboard. Do you still have the 390 that you were using before, to test if that still works correctly? If it works OK, then it might be time to consider a return on the Fury X. The silicon lottery may have dealt a bad card.
The only other thing I can think to try is a custom BIOS, to see if an incorrect DPM value is affecting the card.
Did you use DDU when uninstalling the drivers?


----------



## dagget3450

I'm trying to run it in quadfire, but it's crashed 3 times now. I suspect I'm not stable, or the benchmark is buggy, or CrossFire may only be loading 2 GPUs...

EDIT:

On my old SR-2 (ancient) with 4 Fury Xs I got the below score just now. Dunno if it's any good; will have to try 1x/2x/3x and see how they stack up. I did run a single card earlier and it was 5k I think. I'll rerun.

http://www.3dmark.com/3dm/13194121?


1gpu
http://www.3dmark.com/spy/8199


----------



## dagget3450

Looks like results aren't available online yet to compare?

(oops double post)


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> Well, it might be the power supply or a bad plug on the motherboard. Do you still have the 390 that you were using before, to test if that still works correctly? If it works OK, then it might be time to consider a return on the Fury X. The silicon lottery may have dealt a bad card.
> The only other thing I can think to try is a custom BIOS, to see if an incorrect DPM value is affecting the card.
> Did you use DDU when uninstalling the drivers?


Yes, I'm always using DDU. Where can I find that kind of BIOS?


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> Yes, I'm always using DDU. Where can I find that kind of BIOS?


Good to hear that you follow best practices using DDU.

Information on custom BIOSes can be found here:

http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo

I could try to give you a hand in making a custom BIOS, but you would have to perform some testing and supply some information for me to do it. It might take a couple of hours to run the tests though.

Does your card have a BIOS switch?


----------



## LionS7

Quote:


> Originally Posted by *LionS7*
> 
> Yes, I'm always using DDU.


Quote:


> Originally Posted by *bluezone*
> 
> Good to hear that you follow best practices using DDU.
> 
> Information on custom BIOSes can be found here:
> 
> http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo
> 
> I could try to give you a hand in making a custom BIOS, but you would have to perform some testing and supply some information for me to do it. It might take a couple of hours to run the tests though.
> 
> Does your card have a BIOS switch?


Yes, the card is a reference standard R9 Fury X with water cooling. I'll check the link now. Is there any chance that the Battlefront crash happens only in fullscreen? I'll need to test this, because I just ran a full Time Spy and it didn't crash. This was at 1100/1000.

We don't know what needs to be edited in the BIOS. I've worked with BIOS editors on the GTX 680 and HD 7950 before.


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> Yes, the card is ref. standard R9 Fury X with water cooling. I'll check the link now. Is there any chance that Battlefront crash is only on fullscreen. I'll need to test this. Cos I just run full Time Spy and didn't crash. This was on 1100/1000.


Likely a heavy load.

Quote:


> Originally Posted by *LionS7*
> 
> We dont know what need to be edited in the bios. I was working with bios editors with GTX680, HD7950 before.


Cool, you're an advanced user then. Can you set the frequency to each individual DPM frequency in Afterburner and just run the demo section of FS? Monitor the active average voltage (VDDC) on a second screen hooked up to the integrated graphics output using HWiNFO64, and note the max applied VDDC during the run as well.
A copy of your BIOS would be appreciated.

I think we should continue this via PM rather than clutter the board.


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> Likely a heavy load.
> Cool, you're an advanced user then. Can you set the frequency to each individual DPM frequency in Afterburner and just run the demo section of FS? Monitor the active average voltage (VDDC) on a second screen hooked up to the integrated graphics output using HWiNFO64, and note the max applied VDDC during the run as well.
> A copy of your BIOS would be appreciated.
> 
> I think we should continue this via PM rather than clutter the board.


I'll check these things, but is there a chance that my GPU is just that bad, so that it needs +30mV even to run 1100MHz? Or is it the heavy load from the new Crimson-generation drivers... hm?


----------



## bluezone

Quote:


> Originally Posted by *LionS7*
> 
> I'll check these things, but is there a chance that my GPU is just that bad, so that it needs +30mV even to run 1100MHz? Or is it the heavy load from the new Crimson-generation drivers... hm?


IIRC we had one member who had to downclock his card, but he was getting extremely good scores out of it.
Try Crimson 16.2.1; it's very good for overclocking. If that helps then it may be load related. How are your temps on the GPU and VRMs?

I think this is as far as my help can go on this.

Good Luck.


----------



## xkm1948




----------



## Cannon19932006

http://www.3dmark.com/spy/22308

my time spy run at 1200/560 +145mv


----------



## dagget3450

Warming up the 5960X for some quadfire Fury on Time Spy. Want to see how different it is from my SR-2 benches. I bought the DLC so I don't have to look at the fracking demo all day.

Update for the crickets at night:








http://www.3dmark.com/3dm/13223214?


----------



## ozyo

http://www.3dmark.com/3dm/13227068?

warming up


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> Is there any chance that Battlefront crash is only on fullscreen.


I've had SWBF since release and not had any issue similar to yours. Is it only SWBF giving you an issue?

I'm suspicious of your PSU; I couldn't find a detailed review of it, like what JonnyGuru/TPU would do. The OEM is HEC; THG have it in a 2010 PSU roundup review.

Quote:


> Originally Posted by *bluezone*
> 
> Try Crimson 16.2.1. its very good at overclocking.


I'd go v16.3.2 WHQL, been the best for me when compared with older & newer versions.

Quote:


> Originally Posted by *Cannon19932006*
> 
> http://www.3dmark.com/spy/22308
> 
> my time spy run at 1200/560 +145mv


I've not got Win10 so I haven't run this bench, but comparing it with this it seems you're not getting the benefit of the OC.

You have an 18.75mV offset programmed in via the ROM, so you see 0.92V instead of 0.900V at idle.

I have some other bits to share but on mobile and will aim to post it when at a PC.
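The offset arithmetic gupsterg describes is easy to verify; a trivial sketch (0.900V being the idle VID and 18.75mV the ROM offset he refers to):

```python
# gupsterg's point: a +18.75mV offset baked into the ROM shifts the
# 0.900V idle VID to ~0.92V as reported by monitoring tools.
BASE_IDLE_VID = 0.900    # volts
ROM_OFFSET_MV = 18.75    # millivolts, factory-programmed offset

reported = BASE_IDLE_VID + ROM_OFFSET_MV / 1000
print(f"expected idle reading: {reported:.5f} V (shown as ~{reported:.2f} V)")
```

Which lines up with the 0.92V idle reading Cannon19932006 reported earlier.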


----------



## mechwarrior

hi guys I'm looking at getting the gigabyte nano.
have a few questions?
1- can i install a universal water block on it?
2- can i control voltages?
3-is the fan quiet?

thanks for the replies in advance.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> I'd go v16.3.2 WHQL, been the best for me when compared with older & newer versions.


You are correct sir, typo on my part.

Quote:


> Originally Posted by *gupsterg*
> 
> I'm suspicious of your PSU, I couldn't find detailed review of it, like say what JonnyGuru/TPU would do. OEM is HEC, THG have it in a 2010 PSU roundup review.


Ya, I mentioned that to him, along with a possible bad power connection to the motherboard. Four separate 12V rails makes it seem odd. He hasn't said whether he still has his 390 to test the power supply with either.

Thank you for pitching in.

+1 REP


----------



## Flamingo

My 3DMark Timespy scores:

Graphics Score / Settings

4573 - Stock Nano settings (http://www.3dmark.com/spy/28243)
4856 - +50% PL and 100% fan speed (http://www.3dmark.com/spy/28402)
5064 - +50% PL and 100% fan speed and 1050Mhz (http://www.3dmark.com/spy/28548)
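A quick sketch of the uplift those three runs represent (scores copied from the list above; the code is just illustrative arithmetic):

```python
# Graphics scores from the three Time Spy runs listed above
runs = {
    "stock Nano": 4573,
    "+50% PL, 100% fan": 4856,
    "+50% PL, 100% fan, 1050MHz": 5064,
}
stock = runs["stock Nano"]
for label, score in runs.items():
    gain = (score - stock) / stock * 100
    print(f"{label}: {score} ({gain:+.1f}% vs stock)")
```

So the raised power limit alone is worth about 6%, and the 1050MHz clock bump brings the total to roughly 11% over stock.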

Quote:


> Originally Posted by *mechwarrior*
> 
> hi guys I'm looking at getting the gigabyte nano.
> have a few questions?
> 1- can i install a universal water block on it?
> 2- can i control voltages?
> 3-is the fan quiet?
> 
> thanks for the replies in advance.


Which universal water block? EKWB doesn't show any universal ones compatible with the Nano, and this thread says no too.

Voltages can be controlled via BIOS mods and Afterburner (not the states via Afterburner though - i.e. WattMan-like stuff through BIOS mods only).

The fan is quiet if you keep it at 60% max; at 100% it's loud.


----------



## Flamingo

Well, I was happy until I saw the 980 Ti (@1053MHz) giving a graphics score of 5319 (http://www.3dmark.com/spy/17879). I think Time Spy is proof that DX12 still does not guarantee a bright future for AMD (unless the game is developed with AMD's help or it doesn't have any GameWorks code).

What was that tessellation tweak allowed by HWBot again? What should I set the factor to?


----------



## ozyo

For some reason my voltage drops when I start the benchmark - anyone know why?


----------



## Flamingo

Setting tessellation to OFF or 16x doesn't seem to affect TimeSpy scores...


----------



## flopper

Quote:


> Originally Posted by *Flamingo*
> 
> Rise of Tomb Raider finally supports async compute with the latest patch:


Double the min FPS for the Fury X - good.
Still, the 1080 runs circles around it.
Quote:


> Originally Posted by *Flamingo*
> 
> Well I was happy until I saw the 980 Ti (@1053Mhz) giving a graphics score of 5319 (http://www.3dmark.com/spy/17879). I think Time-Spy is proof that DX12 still does not guarantee a bright future for AMD (unless the game is developed with AMD's help or it doesnt have any GameWorks code).


AMD won't be saved by DX12, but it will at least allow them to get their drivers fixed for performance.
There'll be less of a difference with DX12, but that is still 3 to 5 years away. Nvidia has Pascal now, and with Maxwell dead AMD won't be able to capture the cash markets with graphics; what they can do is increase market share at the lower end (the 80%) and make an expensive all-out Vega.
If Zen delivers, that's the cash cow AMD needs.
Graphics like Polaris are for the mobile/laptop market, with APUs along the way.


----------



## ozyo

My Time Spy results:

| GPU clock | Volt | Mem clock | PL | Overall score | GPU score |
|---|---|---|---|---|---|
| 1150 | 108 | 550 | 25 | 5466 | 5430 |
| 1150 | 108 | 550 | 30 | 5476 | 5439 |
| 1150 | 108 | 550 | 20 | 5496 | 5442 |
| 1150 | 78 | 550 | 20 | 5509 | 5456 |

*CPU OC'd to 4300MHz:*

| GPU clock | Volt | Mem clock | PL | Overall score | GPU score |
|---|---|---|---|---|---|
| 1150 | 78 | 550 | 20 | 5569 | 5410 |
| 1160 | 78 | 550 | 20 | fail | — |
| 1160 | 84 | 555 | 15 | 5597 | 5435 |
| 1160 | 84 | 555 | 10 | fail | — |
| 1160 | 84 | 555 | 20 | 5597 | 5433 |
| 1170 | 84 | 555 | 20 | 5642 | 5483 |
| 1170 | 89 | 555 | 20 | fail | — |
| 1170 | 80 | 555 | 20 | 5638 | 5488 |
| 1170 | 80 | 555 | 15 | 5645 | 5485 |
| 1180 | 80 | 555 | 15 | 5668 | 5519 |
| 1190 | 78 | 555 | 15 | fail | — |
| 1190 | 80 | 555 | 15 | fail | — |
| 1190 | 84 | 555 | 15 | fail | — |
| 1190 | 89 | 555 | 15 | 5706 | 5561 |
| 1200 | 80 | 555 | 15 | fail | — |
| 1200 | 84 | 555 | 15 | 5746 | 5600 |
| 1210 | 84 | 555 | 15 | fail | — |
| 1210 | 89 | 555 | 15 | fail | — |
| 1210 | 90 | 555 | 20 | fail | — |

http://www.3dmark.com/spy/29143
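A quick way to read a table like this is score gain per clock gain. A small sketch comparing two of the passing runs above (1150MHz at GPU score 5456 vs 1200MHz at 5600); the script itself is mine and deliberately ignores the small HBM, power-limit and CPU differences between the two runs:

```python
# GPU-score scaling between two of the runs above:
# 1150 MHz -> GPU score 5456, 1200 MHz -> GPU score 5600.
# (Ignores the small HBM/power-limit/CPU differences between the runs.)
base_clock, base_score = 1150, 5456
oc_clock, oc_score = 1200, 5600

clock_gain_pct = (oc_clock - base_clock) / base_clock * 100
score_gain_pct = (oc_score - base_score) / base_score * 100

print(f"clock +{clock_gain_pct:.1f}% -> score +{score_gain_pct:.1f}%")
```

A reminder that score rarely scales 1:1 with core clock, which the fail-heavy upper rows of the table also hint at.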


----------



## LionS7

Quote:


> Originally Posted by *bluezone*
> 
> IIRC we had one member who had to down clock his card, but he was getting extremely good scores out of the card.
> Try Crimson 16.2.1. its very good at overclocking. If that helps then it may be load related. how are your temps on GPU and VRM's.
> 
> I think this is as far as my help can go on this.
> 
> Good Luck.


Thank you for everything for now. I'll hold off on the BIOS part for now, and try to test the best I can with 30mV for 1100MHz. I have friends with R9 290s, and since the Crimson-generation drivers the cards generally need 20-30mV more for the same MHz on the core. So I think the Fury is the same, or maybe worse. I'm on the final official BIOS from AMD.


----------



## RatPatrol01

Wooo number 1 score in Time Spy with an E3-1230v2 and an R9 Nano!

http://www.3dmark.com/spy/15903

(even though the only other score with that configuration is invalid...and also me)


----------



## dagget3450

Quote:


> Originally Posted by *RatPatrol01*
> 
> Wooo number 1 score in Time Spy with an E3-1230v2 and an R9 Nano!
> 
> http://www.3dmark.com/spy/15903
> 
> (even though the only other score with that configuration is invalid...and also me)


What driver version are you using? I want to post a valid result but I have no idea what driver version is WHQL...


----------



## gupsterg

@Cannon19932006

Some slides on Nitro that may interest you.




Quote:


> Originally Posted by *Cannon19932006*
> 
> I have checked it, and it is 64.8%.


My Fury X is 64.4% and is 1.212V for DPM 7 (i.e. the highest state); your card at 64.8% I would assume to be lower, but due to the factory OC in your DPM 7 I reckon the 1.237V is not a true representation of your stock VID. Without going into too much detail about why I think this: if we flash your stock ROM with DPM 7 set to 1000MHz, we will see a lower VID for DPM 7 IMO, which would correlate with how well your card is scaling with higher voltage increases. You see, I've had other Fiji cards; one such Fury X was 1.187V at DPM 7 and allowed more voltage increase prior to negative performance scaling.



The info we see in GPU-Z for ASIC quality, and its relevance, is explained in the Hawaii BIOS mod OP under the heading *What is "ASIC Quality"?* (gotta do a similar section in the Fiji BIOS mod thread).
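For anyone new to the DPM terminology above: Fiji exposes a set of power states, DPM 0 through DPM 7, each pairing a clock with a VID. A toy sketch of that idea; only the DPM 7 voltage (1.212V) and the 1050MHz stock Fury X clock are real figures, the DPM 0 entry is an illustrative placeholder:

```python
# Toy model of a Fiji DPM (dynamic power management) table.
# DPM 7 is the highest state; its 1.212 V VID is the figure quoted above,
# and 1050 MHz is the stock Fury X core clock. DPM 0 is a placeholder.
dpm_table = {
    0: (300, 0.900),   # idle state (illustrative values)
    7: (1050, 1.212),  # highest state: stock clock @ quoted VID
}

top_state = max(dpm_table)          # highest DPM index present
clock_mhz, vid = dpm_table[top_state]
print(f"DPM {top_state}: {clock_mhz} MHz @ {vid:.3f} V")
```

BIOS mods like the ones discussed here amount to editing the clock/VID pairs of exactly such a table.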


----------



## aDyerSituation

Anyone with a Fury X or overclocked Nano want to chime in on their FPS in Overwatch? I am looking for just a little more oomph to push the game at 1440p/144Hz, as I want a new monitor with FreeSync. I don't see Vega coming anytime soon.


----------



## Semel

http://www.3dmark.com/3dm/13250193

5251 Graphics Score on my old PC ([email protected] / 16GB 1600MHz / AMD Fury 3840 @ 1120/550)


----------



## BIGTom

Quote:


> Originally Posted by *aDyerSituation*
> 
> Anyone with a Fury X or Overclocked Nano want to chime in on their FPS in Overwatch? I am looking for just a little more oomph to push the game at 1440p/144hz as I want a new monitor with Freesync. I don't see vega coming anytime soon


Using 100% render scale, FuryX can run Overwatch 1440p at 150fps with ease.


----------



## aDyerSituation

Quote:


> Originally Posted by *BIGTom*
> 
> Using 100% render scale, FuryX can run Overwatch 1440p at 150fps with ease.


Alright cool. I still lower some settings anyway because I don't notice them, and recording takes a few frames from me so that works out.


----------



## Bender82

http://www.3dmark.com/spy/13447


----------



## aDyerSituation

Anyone have a regular Firestrike run on a Fury X with 16.7.1?


----------



## ozyo

Anyone know the highest Fury X (single-GPU) score in Time Spy?
3DMark search isn't working for me.


----------



## gupsterg

Quote:


> Originally Posted by *aDyerSituation*
> 
> Anyone have a regular Firestrike run on a Fury X with 16.7.1?


The past few days I've been doing about 18 runs of 3DM FS for Catalyst v15.7.1 WHQL vs Crimson v16.3.2 WHQL vs Crimson v16.7.2 (non-WHQL); here are the best results for each on an OC of 1145/545.

I also have about 9 runs of 3DM FS E and 18 runs of 3DM11 P & X from the past few days of testing.

After all this testing I've decided I'm gonna keep 1140/545 as a daily OC, as it only requires ~+44mV over stock VID in ROM and scales best for me; 3DM FS result.

Note:- All runs are driver defaults (ie no tessellation tweak done).


----------



## aDyerSituation

Thank you!


----------



## dagget3450

Is 16.6.2 the latest WHQL?


----------



## gupsterg

Quote:


> Originally Posted by *aDyerSituation*
> 
> Thank you!


No worries, those are also non-tessellation-tweak results (i.e. driver defaults, only a custom ROM to set the card up as I want).

Got about 6 runs of Heaven/Valley as well; gotta start FRAPS-testing some games too, to compare with my Hawaii runs, so if there is anything else I have, happy to share.
Quote:


> Originally Posted by *dagget3450*
> 
> Is 16.6.2 the latest WHQL?


If you mean 16.7.2? No, those are not WHQL; IIRC the last WHQL was 16.3.2.


----------



## aDyerSituation

I'm just debating whether selling my 290X and paying the difference for a Fury/X is worth it.


----------



## Semel

I would wait for Vega TBH... or get an Nvidia GPU. It depends...


----------



## aDyerSituation

Quote:


> Originally Posted by *Semel*
> 
> I would wait for vega tbh.. or get nvidia's gpu.It depends..


Tired of waiting. And I'm not paying $200 more for a monitor


----------



## gupsterg

@aDyerSituation

Depends TBH.

In my case I'm grinning ear to ear with the Fury X. I sold all my Hawaii cards at no loss and got Fiji cards at good promo prices. As I was contemplating WC'ing Hawaii, the extra amount I put in to buy Fiji was lower than the cost of WC'ing Hawaii.

I had 4 Hawaiis in total; the last Tri-X OC'd to 1140/1495 with just +6.25mV extra over stock (did not test more as I got Fiji), but it didn't beat the Vapor-X 290X @ 1100/1525. Here is VX290X vs Fury X. The VX290X ROM, besides the clocks/VID/PL as I wanted them, had tightened RAM timings plus 390/X memory controller timings; the Fury X has no RAM timings mod (yet).

Fury X for me benches better in Heaven and Valley vs Hawaii; for gaming it depends on the title. As a UK resident, Custom PC mag (which I subscribe to) showed a nice increase with the Fury X, so I rolled with it. The Bit-tech site sometimes carries reviews from that mag: the launch review, then in the RX 480 review they use newer drivers.

One thing to note in the RX 480 review is the Valley 1440p score: the Fury X gets 50% more than the RX 480. Now the big thing about the RX 480 is perf-per-watt; when I asked on the Bit-tech forum, the staff told me the power consumption page of the review is Valley at 1440p. So the total system with the Fury X drew 50% more power from the wall and gave 50% more performance.

Next I asked at TPU about the perf-per-watt chart in the RX 480 review: the question, the answer. Now, with the way PowerTune can work at times, and how many SPs etc. the Fury X has vs the RX 480, you could have a scenario where a game does not keep the Fury X at DPM 7 constantly (= less power usage) while the RX 480 sits in its highest state (= its max draw).
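The arithmetic behind that "same perf-per-watt" point is worth spelling out. A minimal sketch; the absolute FPS and wattage numbers are invented, only the two +50% ratios come from the review discussion above:

```python
# If the Fury X system draws 50% more wall power and delivers 50% more
# performance than the RX 480 system, perf-per-watt at the wall is
# identical: the 1.5 factors cancel. Baselines below are illustrative.
rx480_fps, rx480_watts = 40.0, 200.0
fury_fps = rx480_fps * 1.5      # +50% performance (from the discussion)
fury_watts = rx480_watts * 1.5  # +50% wall draw

rx480_ppw = rx480_fps / rx480_watts
fury_ppw = fury_fps / fury_watts
print(rx480_ppw, fury_ppw)  # identical FPS/W for both systems
```

Which is why a perf-per-watt chart can show two very differently sized GPUs as roughly equal even when one draws far more power in absolute terms.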

Final image to view: http://imgur.com/UGRDMGa, from the ExtremeTech RX 480 review.

Even though the stock air coolers were good on the Hawaii cards, the Fury X AIO is leagues ahead in quietness and temps. My CPU temps when running, say, RealBench stress mode/[email protected] dropped by 5C as no hot air is being dumped into the case; even the mobo temp sensor showed cooler temps.

I also had a Fury Tri-X at one point. I wouldn't have minded keeping that vs the Fury X AIO (VRM temps are better, and no issue keeping the GPU @55C quietly either). I can't recall how it affected CPU temps though (may have data).

You could wait for Vega, as you have Hawaii, which is still decent; for me back in Mar '16 it was a no-brainer. Some points to consider:

- Even if Vega comes out late 2016 or early 2017, only you can decide if you are willing to wait that long. If you prefer AIB cards to the reference PCB, you could be waiting a bit longer.

- If, like me, you can sell Hawaii for little to no loss and get Fiji at a reasonably good price, then TBH you have to judge if it's worth it for you. Even with AIB RX 480s coming soon in the UK (a Nitro is £250 pre-order, a Devil is £260), I paid ~£260 for my Fury X and am glad I didn't sell it for a profit prior to the RX 480 launch.


----------



## Crisium

So I've been out of the loop for a while. Is it still thought to be true that Fiji can only change voltage in 6.25mV increments, and that the HBM can only run at 500, 545, 600, or 666MHz and everything else is rounding?
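If that step theory holds, any HBM clock you dial in would effectively snap to a supported step. A toy illustration of the claim; the step list is from the question above, while nearest-step snapping is my assumption of how the rounding would behave:

```python
# Toy model of the claimed Fiji HBM clock steps: whatever clock is
# requested, the memory (per the claim) only runs at one of these steps,
# so we snap the request to the nearest entry in the list.
HBM_STEPS_MHZ = [500, 545, 600, 666]

def effective_hbm_clock(requested_mhz: int) -> int:
    """Nearest claimed step to the requested HBM clock."""
    return min(HBM_STEPS_MHZ, key=lambda step: abs(step - requested_mhz))

for req in (520, 560, 620, 700):
    print(req, "->", effective_hbm_clock(req))
```

So, under this theory, a "560MHz" HBM overclock would actually be running at 545MHz.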


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> No worries
> 
> 
> 
> 
> 
> 
> 
> , those are also non tessellation tweak results (ie driver defaults, only custom ROM to set card as I want).
> 
> Got about 6 runs of Heaven/Valley as well
> 
> 
> 
> 
> 
> 
> 
> , gotta start FRAPS testing some games as well to compare with my Hawaii runs, so if there is anything else I have happy to share
> 
> 
> 
> 
> 
> 
> 
> .
> If you mean 16.7.2? no those are not WHQL, IIRC last are 16.3.2.


No sir, I mean 16.6.2:

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.2-Release-Notes.aspx

This appears to be an RX 480 launch driver, but is it WHQL? I downloaded it but am not sure if it will work with the Fury X. I just want to post valid scores on Time Spy... I don't know why AMD makes this so hard with all these betas; it seems like the WHQL driver is hidden.


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> No sir, i mean 16.6.2
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.2-Release-Notes.aspx
> 
> this appears to be an RX 480 launch driver, but is WHQL? I downloaded it but not sure if it will work with FuryX? I just want to post valid scores on timespy... I don't know why AMD makes this so hard with all these betas and seems like the HQL driver is hidden


http://www.futuremark.com/support/benchmark-rules#approveddrivers


----------



## dagget3450

Funny thing is I keep getting a time validation error when I OC the GPU... if I go stock it's normal... wth
http://www.3dmark.com/3dm/13273593?



Gonna give 16.6.2 a whirl now


----------



## littlestereo

Time Spy results of Fury X CF @ *1145 + 570*
Graphics: *10552*
Overall: *9021*
http://www.3dmark.com/spy/48168


----------



## Semel

Quote:


> Originally Posted by *aDyerSituation*
> 
> Tired of waiting. And I'm not paying $200 more for a monitor


Why would you need another monitor? FreeSync/G-Sync? We've lived ages without them and fared just fine.


----------



## Butthurt Beluga

This is probably something of a petty question, but would the Radeon R9 Fury X, at its time of release, be considered the flagship product of its generation?

I'm thinking I might keep it since it was "my first flagship GPU", but I was thinking that might be the Radeon Pro Duo?
In any case I'll probably keep it long after I've upgraded just because I like it so much.


----------



## Cannon19932006

Quote:


> Originally Posted by *gupsterg*
> 
> @Cannon19932006
> 
> Some slides on Nitro that may interest you
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> My Fury X is 64.4% and is 1.212V for DPM 7 (ie highest state), your card with 64.8% I would assume to be lower, but due to the OC in your DPM 7 from factory I reckon the 1.237V is not true representation of your stock VID. Without going into too much detail why I think this, if we flash your stock ROM with DPM 7 set as 1000MHz we will see lower VID for DPM 7 IMO. Which would correlate with how your card is scaling well with higher voltage increase. You see I've had other Fiji cards, one such Fury X DPM 7 was 1.187V and allowed more voltage increase prior to negative performance scaling.
> 
> 
> 
> The info we see in GPU-Z for ASIC quality is explained/it's relevance in Hawaii bios mod OP heading *What is "ASIC Quality"?* (gotta do a similar section in Fiji bios mod
> 
> 
> 
> 
> 
> 
> 
> ).


I re-ran Time Spy with lower voltage and a lower clock, and got a lower score.

http://www.3dmark.com/compare/spy/49330/spy/22308

The other guy's score at 1150 may be higher because of the driver difference, I can't tell which driver he is using though.

Edit: got the same driver as the top score and ran it a few more times at 1200/560 +150mV and +100mV, and at 1150/560 +36mV. I see minimal difference at 1200 between +100mV and +150mV, and a loss at 1150 +36mV. Even at 1200, though, I still lose to him in GPU score by 6%.


----------



## dagget3450

Time Spy hates me. I yanked out my 5960X and put in a Xeon with different DDR4 modules that don't overclock because they are locked. Behold! I still get time measurement errors... lol (everything is at stock).

http://www.3dmark.com/3dm/13289109?

So yeah, I did manage to get one validation and it was totally screwed up. Crossfire was broken after a BIOS update and it was stutter heaven. However, guess what... it was a low enough score that it validated... lol.

broken crossfire result:
http://www.3dmark.com/spy/52719

So what Time Spy is saying is that 4 Fury Xs are slower than 2 GTX 1080s...

The invalid runs have too high a score... http://www.3dmark.com/spy/52611

If it's hardware, then I cannot explain why Fire Strike validates just fine...

I have beaten my head against the wall on this, so I am burnt out now. Ah well... I do wonder if anyone running Fury X CF is able to get valid scores though... I don't have a second X99 mobo to try.


----------



## diggiddi

Quote:


> Originally Posted by *dagget3450*
> 
> Time spy hates me, i yanked out my 5960x and put in a xeon with different ddr4 modules that doesn't overclock because they are locked. Behold! i still get time measurement errors... lol (everything is at stock)
> 
> http://www.3dmark.com/3dm/13289109?
> 
> So yeah i did manage to get one validation and it was totally screwed up. Crossfire was broken after a bios update and it was stutter heaven. However guess what.... it was low enough score it validated it.. lol.
> 
> broken crossfire result:
> http://www.3dmark.com/spy/52719
> 
> So what Timespy is saying is 4 FuryX are slower than 2 1080gtx...
> 
> The invalid runs are too high of a score... http://www.3dmark.com/spy/52611
> 
> I suspect if it's hardware then i cannot explain why Firestrike validates just fine...
> 
> I have beat my head against the wall on this so i am burnt out now. Ah well... I do wonder if anyone running FuryX CF is able to get valid scores though.... i dont have a second x99 mobo to try


Sync your clock with server


----------



## dagget3450

Quote:


> Originally Posted by *diggiddi*
> 
> Sync your clock with server


I honestly thought you were kidding when you posted that... but I set my clock and I'll be damned if it didn't take it, lol... muhahahaha, wow, just crazy. I'll keep toying around now and see if this sticks. +rep

It still doesn't like my Xeon ID but that's okay... 2x Fury X CF test below.
http://www.3dmark.com/3dm/13290646?

The Xeon is 2.4GHz, obviously underpowered, but I'll put the 5960X back in.


----------



## ozyo

Quote:


> Originally Posted by *Cannon19932006*
> 
> I re-ran Time spy with lower voltage and lower clock, got a lower score.
> 
> http://www.3dmark.com/compare/spy/49330/spy/22308
> 
> The other guy's score at 1150 may be higher because of the driver difference, I can't tell which driver he is using though.
> 
> Edit: got the same driver as the top score, ran it a few more times at 1200/560 +150mv and +100mv, 1150/560 +36mv I see minimal difference at 1200 +100mv/150mv and a loss at 1150 +35mv. Even at 1200 though I still lose to him in gpu score by 6%.


Try 16.2:
http://www.3dmark.com/compare/spy/55340/spy/29143#
and look at my post; keep an eye on voltage and power limit.
Edit: go for 16.7.2:

http://www.3dmark.com/spy/55803


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> No sir, i mean 16.6.2
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-Software-Crimson-Edition-16.6.2-Release-Notes.aspx
> 
> this appears to be an RX 480 launch driver, but is WHQL? I downloaded it but not sure if it will work with FuryX? I just want to post valid scores on timespy... I don't know why AMD makes this so hard with all these betas and seems like the HQL driver is hidden


Yep, 16.6.2 was the RX 480 launch driver. The one floating around the web at the time lacked support for other cards in the inf, but this one does cover other cards; you will see "AMD Radeon™ R9 Fury Series Graphics" stated in the "Radeon Desktop Product Family Compatibility" table.

I've found that drivers with "Hotfix" in the title are usually not WHQL. I'm not on Win 10 but on Win 7; when installing a non-WHQL driver it will state something like "this driver is not signed, do you trust it" and you have the option to install or not. IIRC you can check in Device Manager > properties of device > Driver whether it is signed. I will try this driver.

As for why it's "hidden" in the previous drivers section: 16.7.2 is the newest "Hotfix", as it has the RX 480 PCI-E slot power usage improvement.
Quote:


> Originally Posted by *bluezone*
> 
> http://www.futuremark.com/support/benchmark-rules#approveddrivers


When you read the first section of the page it comes across that they must be WHQL (the "Guidelines" heading has "The driver must be WHQL certified for all supported graphics chipsets."), but further on we see:
Quote:


> Special cases
> 
> In special cases, such as a major product launch, we will consider approving a new driver with the following exceptions to our standard policy:
> 
> The driver may be pre-release and is not required to be WHQL certified.


So... I will try it and report back.


----------



## gupsterg

Quote:


> Originally Posted by *Cannon19932006*
> 
> I re-ran Time spy with lower voltage and lower clock, got a lower score.
> 
> http://www.3dmark.com/compare/spy/49330/spy/22308
> 
> The other guy's score at 1150 may be higher because of the driver difference, I can't tell which driver he is using though.
> 
> Edit: got the same driver as the top score, ran it a few more times at 1200/560 +150mv and +100mv, 1150/560 +36mv I see minimal difference at 1200 +100mv/150mv and a loss at 1150 +35mv. Even at 1200 though I still lose to him in gpu score by 6%.


Yeah, your driver differs from AndreDVJ's. The "Graphics card" section will state the driver version, but I have found no quick method of matching this to an AMD package.

Anyhow:-

a) I've no idea how much run to run variance there is on this bench as I don't have Win 10.
b) seems ozyo scaled well at 1200MHz, 3-way result compare. I would think the GS scores are not affected by the CPU, as all have CPUs which would not be bottlenecking the card.
c) even though the HBM clocks for the tests in the compares differ, you're all at 545MHz, see this post (I was a skeptic about the clock steps but no longer).
d) I've noted from comparing the 4 differing Hawaii cards I've owned that, even when all are clocked the same, some may bench slightly better than others. It could well be regarded as run-to-run variance, but I have noted others who have had more than 1 card to compare think the same. Even though I've had 7 Fiji cards, I've been too preoccupied with other things to correlate data to see if the same occurs, but I would not be surprised if it does.
e) perhaps the negative scaling "thing", when viewing your result vs ozyo's?

For example, take this result compare: your 4.8% GPU clock increase over mine results in a 0.1-1.5% gain in GT1/GT2/Combined FPS. The -30mV does not mean I was reducing from stock VID, but from an increased VID over stock in ROM (see below).

For example, 1145 is on the cusp of negative scaling for my current card; so the ROM has DPM 7 [email protected], then I reduce to ~1.238V: 3-loop compare, best of each compared, worst of each compared.

So IMO negative performance scaling with voltage increase is 0.5% to 1.0% in this OC profile compare, and the effect grows as your OC gets higher, which is where a member would experience this "phenomenon".
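Sub-1% deltas like these are easy to misjudge by eye, so a per-test percentage comparison helps. A small sketch in the spirit of the best-of/worst-of compares above; the FPS values are invented stand-ins, not the actual run data:

```python
# Per-test percentage delta between two OC/voltage profiles.
# FPS values are illustrative placeholders, not actual 3DMark run data.
def pct_delta(new: float, old: float) -> float:
    """Percentage change of `new` relative to `old`."""
    return (new - old) / old * 100.0

low_volt = {"GT1": 45.0, "GT2": 38.0, "Combined": 20.0}
high_volt = {"GT1": 45.3, "GT2": 37.8, "Combined": 20.1}

for test, old_fps in low_volt.items():
    print(f"{test}: {pct_delta(high_volt[test], old_fps):+.2f}%")
```

With mixed sub-1% gains and losses per sub-test like this, only repeated loops (best-of and worst-of) can separate real negative scaling from run-to-run variance.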


----------



## dagget3450

Quote:


> Originally Posted by *diggiddi*
> 
> Sync your clock with server


I got 2 runs on the Xeon with just an unknown CPU error after trying your fix; now I've put the 5960X back in and get time errors again. I'm on Pacific time but now it's not working... lol... this is insane.


----------



## ozyo

my final run
fury [email protected]/555 i7 [email protected] "can't go any higher at this temp" +87 volt [email protected]%

http://www.3dmark.com/spy/56491
Quote:


> Originally Posted by *gupsterg*
> 
> Yeah your driver differs to AndreDVJ, in "Graphics card" section it will state driver version, but matching this to an AMD package I have found no quick method
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Anyhow:-
> 
> a) I've no idea how much run to run variance there is on this bench as I don't have Win 10.
> b) seems ozyo scaled well with 1200MHz, 3 way result compare. I would think the GS score are not effected by CPU as all have CPU which would not be bottlenecking card.
> c) even though HBM clock for tests in compares differ you're all at 545MHz, see this post (I was a skeptic about the clock steps but no longer).
> d) I've noted from say comparing 4 differing Hawaii cards I've owned, even when all clocked the same, some may bench slightly better than another. It could well be regarded as run to run variance, but I have noted others think this who have had more than 1 card to compare with. Even though had 7 Fiji cards I've been too preoccupied with other things to correlate data to see same occurrence, but would not be surprised if same occurs.
> e) perhaps negative scaling "thing"? when viewing your result vs ozyo?
> 
> For example take this result compare, your 4.8% GPU increase over mine results in 0.1-1.5% gain in GT1/GT2/Combined FPS. The -30mV does not mean I was reducing it from stock VID but from an increased VID over stock in ROM (see below).
> 
> For example 1145 is on the cusp of negative scaling for my current card, so ROM has DPM 7 [email protected], then I reduce to ~1.238V, 3 loop compare, best of each compared, worst out of each compared.
> 
> So IMO negative performance scaling with voltage increase is 0.5% to 1.0% in this OC profile compare, this effect increases if your OC is higher/where a member is experiencing this "phenomenon".


I have a Fury X, but Cannon19932006 has a Fury non-X.


----------



## gupsterg

Cool, I just remembered AndreDVJ has a Fury unlocked to 4096 SP = Fury X.


----------



## Medusa666

*Radeon Pro Duo and HW-E 5960X*

3DMark Score: 9699
Graphics Score: 9637
CPU Score: 10 067

http://www.3dmark.com/3dm/13298316

I'm happy and surprised with the scores; looks like the Pro Duo will be a good card for the coming years.


----------



## dagget3450

Quote:


> Originally Posted by *Medusa666*
> 
> *Radeon Pro Duo and HW-E 5960X*
> 
> 3DMark Score: 9699
> Graphics Score: 9637
> CPU Score: 10 067
> 
> http://www.3dmark.com/3dm/13298316
> 
> I'm happy and suprised with the scores, looks like the Pro Duo will be a good card for the coming years.


I noticed your score is invalid like mine; we have very similar hardware. Can you tell me if you're able to get a valid run without the time measurement error?


----------



## fat4l

Hey guys... For those of you considering buying a Fury X, OCUK sells it for £396, NEW!

That's a no-brainer to me...

https://www.overclockers.co.uk/gigabyte-radeon-fury-x-4096mb-hbm-pci-express-graphics-card-ax-r9-fury-x-4gbd5-3dh-gx-168-gi.html

Hey @gupsterg, still tuning BIOSes?

Out of curiosity, how far can a Fury X go with a modded BIOS at reasonable temps (40-50C load), on average? 1200MHz?

If it's useful for anyone, Der8auer just released a modded BIOS for the RX 480:
http://overclocking.guide/download/amd-radeon-rx-480-unlocked-air-bios/
The package includes the stock BIOS, the unlocked BIOS and ATIFlash:
- TDP up to 225 W (+50%)
- Voltage unlocked to 1.40 V
Run the appropriate .bat file as administrator to flash your card.
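As a sanity check on those package numbers: 225 W described as "+50%" implies a 150 W stock board power. The tiny calculation, with the stock value inferred rather than taken from the package notes:

```python
# Back out the stock TDP implied by the unlocked-BIOS figures quoted above:
# 225 W labelled as a +50% increase implies 225 / 1.5 = 150 W stock.
unlocked_tdp_w = 225.0
increase_fraction = 0.50

stock_tdp_w = unlocked_tdp_w / (1 + increase_fraction)
print(stock_tdp_w)  # 150.0
```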


----------



## gupsterg

They were £350-ish a few days back on Ebuyer, and £300 at Amazon Warehouse deals a few months back as well.

Yep, still on the BIOS mod "train".

Getting 1200MHz isn't the issue as such (depending on the card); it's the blinking negative performance scaling as voltage increase is applied. I've tried a lot of things via ROM to combat it, but unsuccessfully, and behind the scenes I've been pestering "you know who" with data/tests as well.

Again, temps even on stock TIM/pads/AIO on the Fury X are not an issue IMO. Even the Tri-X cooler can easily maintain <55C, with VRMs cooler than on the Fury X AIO: on the AIO, coolant from the GPU flows to a copper pipe in contact with the VRM, whereas the Tri-X has a VRM plate independent from the GPU.

Even when we had the few days of high ambient temps in the UK (you gotta luv the weather here), cooling was not an issue IMO. Without the fan at 100% I can cruise along at ~50C under load, with a way quieter fan than the aftermarket air-cooled Hawaiis I had.

TBH if I already had a custom loop in the rig I'd consider WC'ing the Fury/X, but if not (as in my case) I don't see the point.


----------



## dagget3450

Jesus, finally I am able to validate. It would have been ~6th yesterday, but you take what you can.
Quote:


> Well i posted on futuremark my issue, not sure if anything was done but today i changed nothing on my system and ran a gpu stock run again...
> 
> Finally... made it to top 10 hof, only 10th though... yesterday it would have been closer to 6th. Anyways i am just happy to have a valid result...
> 
> http://www.3dmark.com/3dm/13318515?


----------



## bluedevil

Anyone getting a pulsing load when the Fury X is under load? I think it might be driver-related... anyone else?


----------



## dagget3450

Quote:


> Originally Posted by *bluedevil*
> 
> Anyone getting a pulsing load when the Fury X is under load? I think that might be driver related...anyone else?


in a specific app?

I do in timespy, but only in CF


----------



## diggiddi

Quote:


> Originally Posted by *dagget3450*
> 
> i got 2 runs on xeon with just unknown cpu error after trying your fix, now i put back 5960x and time errors again. I'm on pacific time but now its not working... lol... this is insane.


Did you resync after switching the cpu?


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> They were £350-ish a few days back on Ebuyer, and £300 at Amazon Warehouse deals a few months back as well.
> 
> Yep, still on the BIOS mod "train".
> 
> Getting 1200MHz isn't the issue as such (depending on the card); it's the blinking negative performance scaling as voltage increase is applied. I've tried a lot of things via ROM to combat it, but unsuccessfully, and behind the scenes I've been pestering "you know who" with data/tests as well.
> 
> Again, temps even on stock TIM/pads/AIO on the Fury X are not an issue IMO. Even the Tri-X cooler can easily maintain <55C, with VRMs cooler than on the Fury X AIO: on the AIO, coolant from the GPU flows to a copper pipe in contact with the VRM, whereas the Tri-X has a VRM plate independent from the GPU.
> 
> Even when we had the few days of high ambient temps in the UK (you gotta luv the weather here), cooling was not an issue IMO. Without the fan at 100% I can cruise along at ~50C under load, with a way quieter fan than the aftermarket air-cooled Hawaiis I had.
> 
> TBH if I already had a custom loop in the rig I'd consider WC'ing the Fury/X, but if not (as in my case) I don't see the point.


+ rep for the info! Keep the red team up bro


----------



## bluedevil

Quote:


> Originally Posted by *dagget3450*
> 
> in a specific app?
> 
> I do in timespy, but only in CF


Only really tested in DOOM; OpenGL and Vulkan do the same pulsing.


----------



## Krzych04650

Can anyone who owns the Sapphire Fury Nitro tell me what the fan speeds are in RPM, and at what % speed the fans start to spin? I tried the 390 Nitro recently (and returned it because of super crazy coil whine) and the fan speeds were a bit stupid: when they start spinning at, I believe, 25% speed, they are already at ~1200 RPM. All the cards I owned before had much more balanced fan control; 1200 RPM was around 50% fan speed, not the minimum. Is this improved with the Fury Nitro, or are they exactly the same fans with exactly the same speed-range control as the 390 Nitro's?


----------



## mypickaxe

Quote:


> Originally Posted by *RatPatrol01*
> 
> Wooo number 1 score in Time Spy with an E3-1230v2 and an R9 Nano!
> 
> http://www.3dmark.com/spy/15903
> 
> (even though the only other score with that configuration is invalid...and also me)


I need to figure out what is going on with the drivers. Funny, my 1000 MHz core / 525 MHz HBM score is higher than my 1049 MHz core / 500 MHz HBM score.

http://www.3dmark.com/spy/23336


----------



## dagget3450

Quote:


> Originally Posted by *mypickaxe*
> 
> I need to figure out what is going on with the drivers. Funny, my 1000 MHz core / 525 MHz HBM score is higher than my 1049 MHz core / 500 MHz HBM score.
> 
> http://www.3dmark.com/spy/23336


Yeah, I wonder how prevalent the negative scaling with upping voltage still is.

On a side note, my best run yet, with +100mV even:
Quote:


> My best single gpu score so far
> 
> dagget3450--- [email protected] -- 1x furyx 1200/560 -- score 6125
> 
> http://www.3dmark.com/3dm/13335532?


----------



## mypickaxe

Quote:


> Originally Posted by *dagget3450*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> I need to figure out what is going on with the drivers. Funny, my 1000 MHz core / 525 MHz HBM score is higher than my 1049 MHz core / 500 MHz HBM score.
> 
> http://www.3dmark.com/spy/23336
> 
> 
> 
> Yeah i wonder how prevalent the negative scaling is with upping voltage still.
> 
> On a side note my best run yet with 100+mv even
> Quote:
> 
> 
> 
> My best single gpu score so far
> 
> dagget3450--- [email protected] -- 1x furyx 1200/560 -- score 6125
> 
> http://www.3dmark.com/3dm/13335532?
> 
> 

Cool score. Much higher than I have achieved on my 6600K. But my SLI 1070s... I have the top score so far for any 1070. It won't last long, though; the 6950X in second place isn't overclocked from what I can tell. http://www.3dmark.com/spy/73718

Without further testing, I can't speak to whether or not negative scaling has anything to do with it. It is just as likely, if not more so, that the memory bandwidth bump is more beneficial than the clock speed. But that's just a guess; you could be right.

I haven't touched the voltage on my R9 Nano. It also seems to be a really good sample: I haven't had any throttling on the card in my Define Nano S. I'm thinking it may be a good candidate for water cooling. Its ASIC quality is 59%.


----------



## shadowxaero

So from the bit of testing I did, negative scaling doesn't seem to impact my Fury in this DX12 bench the way it does in Fire Strike.

Nonetheless my best score so far http://www.3dmark.com/spy/73642

Graphics score of 5,441


----------



## dagget3450

Quote:


> Originally Posted by *mypickaxe*
> 
> Cool score. Much higher than I have achieved on my 6600K. But my SLI 1070s...I have the top score so far for any 1070. It won't last for long, though. The 6950X in second place isn't overclocked from what I can tell. http://www.3dmark.com/spy/73718
> 
> Without further testing, I can't speak to whether or not negative scaling has anything to do with it. Rather, it is just as, if not more likely, the memory bandwidth bump is more beneficial than clock speed. But that's just a guess. You could be right.
> 
> I haven't touched voltage on my R9 Nano. Also, it seems to be a really good sample...I haven't had any throttling on the card in my Define Nano S. I'm thinking it may be a good candidate for water cooling. Its ASIC score is 59%.


I don't know about this benchmark... it seems to be underusing my Fijis. At any rate I don't think there is much negative scaling with voltage in this either. However it feels like the GPUs are not really at their full potential. Even overclocking seems flat to me.

My best 2-way so far:
Quote:


> dagget3450 --- 2 way FuryX 1200/540 -- [email protected] --- score 10851
> 
> http://www.3dmark.com/3dm/13341053?


However, compared to a stock run it seems lackluster.

Maybe new drivers or tweaks can be done by AMD, I dunno... I think I am going to compare wattage usage with something like Firestrike Extreme.

I think I just barely passed the 1070 SLI score you posted. It mostly looks like CPU though, hence why I think the GPUs are underperforming in this bench.

Quote:


> Originally Posted by *shadowxaero*
> 
> So from the bit of testing I did, negative scaling doesn't seem to impact my Fury in this Dx12 bench versus firestrike.
> 
> Nonetheless my best score so far http://www.3dmark.com/spy/73642
> 
> Graphics score of 5,441


I think I can concur with that as well; I see gains even at +96 mV, even if they are very small.


----------



## looncraz

Quote:


> Originally Posted by *dagget3450*
> 
> I don't know about this benchmark.. it seems to be under using my fiji's.. At any rate i dont think there is much negative scaling on voltage in this either. However it feels like the gpus are not really at the full potential either. Even overclocking seems flat to me.


My system usually pulls near 400 W while gaming, with any game and unlocked framerates. This benchmark only uses around 300 W. My i7 2600K isn't going to use 100 W by itself in the likes of BF4, even at 4.5 GHz (with no added voltage).

It definitely feels like the benchmark is not stressing the hardware at all.

EDIT:

GPU-Z shows full clocks nice and steady and 100% GPU usage, but lower-than-usual temps (only 65°C after the full benchmark at 900 MHz, rather than 72°C in BF4 for the same-ish amount of time).

CPU usage is 45~50%, same as with BF4, but overall power usage is 80 W lower. The card itself is only pulling 170 W during Time Spy (or whatever it's called), but closer to 250 W in every other gaming or benchmarking scenario.


----------



## AndreDVJ

Quote:


> Originally Posted by *shadowxaero*
> 
> So from the bit of testing I did, negative scaling doesn't seem to impact my Fury in this Dx12 bench versus firestrike.
> 
> Nonetheless my best score so far http://www.3dmark.com/spy/73642
> 
> Graphics score of 5,441


Sorry dude, negative scaling all the way. Below is mine: 1150/550 MHz did 5493 points.

http://www.3dmark.com/spy/13079


----------



## Bender82

I cannot push any more out of this card, but it's only 50°C max: core 1140 MHz (+36 mV), memory 570 MHz, power limit +25%. http://www.3dmark.com/3dm/13356251?


----------



## dagget3450

Quote:


> Originally Posted by *looncraz*
> 
> My system usually pulls near 400W while gaming with any game and unlocked framerates. This benchmark only uses around 300W. My i7 2600k isn't going to use 100W by itself in the likes of BF4, even it at 4.5Ghz (with no added voltage).
> 
> It definitely feels like the benchmark is not stressing the hardware at all.
> 
> EDIT:
> 
> GPU-Z shows full clocks nice and steady, 100% GPU usage, but lower than usual temps (only 65C after the full benchmark at 900Mhz, rather than 72C for BF4 for the same-ish amount of time).
> 
> CPU usage is 45~50%, same as with with BF4, but overall power usage is 80W lower. The card itself is only pulling 170W during Time Spy (or whatever it's called), but closer to 250W in every other gaming or bench-marking scenario.


I am noticing this when I'm running multiple Furys, especially in quadfire; it is not pushing the GPUs very hard. I think it needs some work, either from AMD or 3DMark... but something tells me this is how it will be from now on. I guess this bench is as-is for me for now...

I did a quick-and-dirty watts-at-the-wall compare.

Max watt peaks I saw (FSE vs. Time Spy):
- GPU test 1: 720 W vs. 644 W
- GPU test 2: 650 W vs. 690 W

The averages were probably a good 30-50 W lower; for example, I saw ~600 W average in Time Spy test 1 versus ~680 W in FSE GPU test 1.

Not really scientific, but it needs more testing I think. I should probably look at GPU temps as well; I know they are also indicative of a lighter load. The Time Spy test 2 peak might be when the GPUs are actually used to their max, but it doesn't last long before it drops again. I realize FSE is DX11 and thus overhead issues arise, so I'd say FSE is also not pushing as hard as it could. Perhaps I should try something else, but it's hard to compare different engines and APIs.
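For anyone repeating this at-the-wall compare, a trivial sketch of the bookkeeping (the numbers are just the peaks from the runs above; a Kill-A-Watt only gives coarse readings, so treat deltas as rough):

```python
# Compare wall-watt peaks between Fire Strike Extreme and Time Spy.
peaks = {
    "GPU test 1": {"FSE": 720, "Time Spy": 644},
    "GPU test 2": {"FSE": 650, "Time Spy": 690},
}

for test, w in peaks.items():
    delta = w["Time Spy"] - w["FSE"]  # positive means Time Spy drew more
    print(f"{test}: FSE {w['FSE']} W vs Time Spy {w['Time Spy']} W ({delta:+d} W)")
```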


----------



## dagget3450

So my best single-Fury Time Spy run is now a whopping few points higher.....

The furyx is at 1220/540 even....
http://www.3dmark.com/spy/73510



Results are fluctuating, I guess within margin of error; is the overclock gain within margin of error too?
http://www.3dmark.com/compare/spy/73572/spy/73510


----------



## gupsterg

Quote:


> Originally Posted by *AndreDVJ*
> 
> Sorry dude, negative scaling all the way. Below is mine. 1150/550Mhz did 5493 points.
> 
> http://www.3dmark.com/spy/13079


I had been using your bench for compares with others and then realised you have 4096 SP unlocked; shadowxaero is 3840 SP IIRC.

So in this compare it's CaNnoN (Fury 3584SP) vs. shadowxaero (Fury 3840SP) vs. ozyo (genuine Fury X) vs. AndreDVJ (Fury 4096SP).


----------



## BIGTom

Quote:


> Originally Posted by *aDyerSituation*
> 
> Alright cool. I still lower some settings anyway because I don't notice them, and recording takes a few frames from me so that works out.


You are right, most of the settings in Overwatch don't provide great visual improvements. I played a few hours of Competitive last night while monitoring FPS, on custom settings with a mix of low-to-high at 1440p, and I never saw minimum FPS under 180; it ran all the way up to 250.

Hope this helps


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> I had been using your bench for compares with others and then realised you have 4096SP unlocked, shadowxaero is 3840SP IIRC.
> 
> So in this compare is CaNnoN Fury (Fury 3584SP) vs Shadowxaero (Fury 3840SP) vs ozyo (genuine Fury X) vs AndreDVJ (Fury 4096SP)


How did you check the SP count?

Also, I have a question for you, if you don't mind. I have a Fury X with an ASIC score of 64%, which appears to be the highest of my 4 cards. However it won't OC at all: I can't go past 1100 core at stock or even with added voltage. I thought I saw somewhere that ASIC is meaningless for Fiji. My question is: do you think I can do anything to the card to help it? I was debating taking it apart to see if it's not making good contact with the cooler.

any thoughts?


----------



## hyp36rmax

Here are my Time Spy results with two Fury Xs and an Intel 5820K @ 4.3 GHz.



*Results:* Link


----------



## shadowxaero

Quote:


> Originally Posted by *gupsterg*
> 
> I had been using your bench for compares with others and then realised you have 4096SP unlocked, shadowxaero is 3840SP IIRC.
> 
> So in this compare is CaNnoN Fury (Fury 3584SP) vs Shadowxaero (Fury 3840SP) vs ozyo (genuine Fury X) vs AndreDVJ (Fury 4096SP)


Do you have any idea why negative scaling doesn't seem to be a factor in this particular bench, or in DX12 in general? I am going to run some Ashes tests with varying voltages. I did two 1150 runs, one with my normal 1.256 V and another with a +54 mV offset, and got pretty much the same scores (within about 40 points of each other in the graphics score).

And yes, I am 3840 SP unlocked.


----------



## gupsterg

No idea.

I have not run Time Spy, as I have yet to decide whether to take up the Win 10 free offer by converting my retail Win 7 Pro license; I may get an el cheapo grey-market Win 10 Pro OEM key.

The thing coming across from others' posts looking at power usage is that it is not reaching the same level as our usual benches (i.e. 3DM11, 3DM13 FS/E/U, Unigine, etc.). Perhaps this has something to do with it. Do you get negative scaling in AOTS?

The past few days I have been doing extensive testing using the older Catalyst 15.7.1 driver, plus Crimson v16.3.2 and v16.7.2, with non-DX12 loads, and I note:

i) If I take an OC of say 1145 MHz with +56 mV and keep reducing voltage to +24 mV in steps of 6 mV, I consistently see better scaling, by up to 1%, at +24 mV vs +56 mV. Bear in mind that 1145 MHz +56 mV is at the cusp of negative scaling for me. If I go over 1145 MHz the negative scaling is greater, and reducing voltage yields a greater % of positive scaling vs the last-mentioned test method.

ii) HBM OC'ing has no impact on negative scaling; you always gain the same x% from it, whether at higher GPU clock/voltage or lower.

iii) Catalyst v15.7.1 performs lower than Crimson v16.3.2, and has less of a positive scaling % as voltage is reduced for an OC. This older driver also does not have the "Power Efficiency" feature, so we see clock bounce at idle on the desktop.

iv) Crimson v16.7.2 performs better than Catalyst v15.7.1, but slightly lower than Crimson v16.3.2, and has less of a positive scaling % as voltage is reduced for an OC. This driver does have PE, and switching it on/off has no effect on negative scaling when experiencing it (same with Crimson v16.3.2).
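To make the percentage comparison in (i) concrete, this is the kind of bookkeeping involved (a sketch only; the scores below are made-up placeholders for illustration, not actual results):

```python
# Tabulate scaling as the voltage offset is stepped down in 6 mV steps at a
# fixed GPU clock. Scores are hypothetical placeholders, not measured data.
runs = [
    (56, 5400),  # (+mV offset, graphics score)
    (48, 5412),
    (40, 5425),
    (32, 5439),
    (24, 5454),
]

base_offset, base_score = runs[0]
for offset, score in runs[1:]:
    gain_pct = (score / base_score - 1) * 100  # % vs the +56 mV baseline
    print(f"+{offset} mV vs +{base_offset} mV: {gain_pct:+.2f}%")
```

With these placeholder numbers the +24 mV run comes out about 1% ahead of the +56 mV baseline, matching the shape of the effect described above.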


----------



## shadowxaero

Ashes is installing now
Quote:


> Originally Posted by *gupsterg*
> 
> No idea.
> 
> I have not run Timespy as have yet to decide to take up Win 10 free offer by converting my retail Win 7 Pro license, may get an el cheapo grey market Win 10 Pro OEM key.
> 
> The thing that is coming across from others posts which are looking at power usage they are not reaching a level same as our usual benches (ie 3DM11, 3DM13 FS,E,U, Unigine, etc). Perhaps this has something to do with it, do you get negative scaling in AOTS?
> 
> Past few days I have been doing extensive testing using older driver Catalyst 15.7.1, Crimson v16.3.2 and v16.7.2 with non DX12 loads and I note:-
> 
> i) If I take an OC of say 1145MHz with +56mV and keep reducing voltage to +24mV in steps of 6mV I see better scaling by upto 1% consistently at +24mV vs +56mV. Bare in mind that 1145 +56mV is at the cusp of negative scaling for me. If I go over 1145MHz the negative scaling is greater and reducing voltage yields a greater % of positive scaling vs last mentioned test method.
> 
> ii) HBM oc'ing has no impact on negative scaling, you always gain same x % of gain from it at higher clock/gpu voltage and lower.
> 
> iii) Catalyst v15.71 performs lower than Crimson v16.3.2, plus has less of a positive scaling % as voltage is reduced for an OC. This older driver also does not have "Power Efficiency" feature so we see clock bounce at idle at desktop.
> 
> iv) Crimson v16.7.2 performs better than Catalyst v15.7.1, but slightly lower than Crimson v16.3.2 plus has less of a positive scaling % as voltage is reduced for an OC. This driver does have PE and switching on/off has no effect on negative scaling when experiencing it (same with Crimson v16.3.2).


I am updating Ashes now. As for power usage, HWiNFO is reporting 359 W for me.

I ran Firestrike at the same voltage I ran Time Spy at, and sure enough, negative scaling; I lost 461 points on the graphics score, to be exact.
http://www.3dmark.com/compare/fs/9369169/fs/9369392

However, in Time Spy I am getting pretty much linear gains regardless of voltage, with my card hitting a maximum of 1.356 V on the core.
http://www.3dmark.com/compare/spy/73642/spy/73892/spy/82113

As far as drivers go, 16.3.2 performs better in Firestrike, and maybe DX11 in general, than 16.6.2 and 16.7.2 do.
However, DX12 performance was worse on 16.3.2, with 16.6.2 performing the best.

Edit: I do get negative scaling in Ashes once I go past 1.256v


----------



## Semel

Has anyone else got a Fury that can't hold a stable OC past 1100? I thought my 1120 OC @ 1.3 V set via BIOS for DPM7 (it's more like 1.25+ V under load) was enough for all games, but as it turns out some games are pretty sensitive to OC; I got "driver stopped responding"/BSOD errors and had to lower my core clock. (I don't feel comfortable going past 1.3 V, although I think I could, because the cooling is great on the Tri-X.)

Am I the only one that unlucky?

I managed to unlock my Fury to 3840 SP, but OC'ing-wise this card is atrocious...


----------



## dagget3450

Quote:


> Originally Posted by *Semel*
> 
> Anyone of you got a fury that can't have a stable OC past 1100+? I thought my 1120 OC @ 1.3 V set via bios for DMP7(it's more like 1.25+V when under load) was enough for all games but as it turns out some game are pretty sensitive to OC and I got driver stopped responding\BSODs errors and had to lower my core clock.(I don't feel comfortable going pat 1.3 V although i think I could coz cooling is great on trixx)
> 
> Am I the only one that unlucky ?
> 
> 
> 
> 
> 
> 
> 
> I managed to unlock my fury to 3840 but OCing -wise this card is atrocious..


I have 4 Fury Xs, and 2 of them appear to not OC well, roughly around 1100/540, while the other two do 1150/540 on stock voltages. I am also finding the ones that don't OC well seem not to respond to voltage. I am going to switch to Trixx for the OC'ing again to verify this is not specific to MSI AB but in fact may just be the cards.


----------



## zuru1

Hello everyone,

I have a question for the Sapphire R9 Fury Tri-X owners.
Those of you who own this card:
1. How much coil whine do you have?
2. Is there any way to solve this issue without returning the card?
3. Any chance that a full-cover water block will remove or reduce the coil whine?

PS:
For me it's more noticeable in benchmarking, and also when the frame rate is high or very high.
In game it's a lower buzzing noise which can be heard over the fan noise.


----------



## Krzych04650

Coil whine is inevitable. You can buy a top-tier PSU, you can change voltage, you can try different clock speeds, but you will have it anyway. Some people don't hear it; for example my father is completely deaf to this sound, even though his hearing is very good and he hears every little buzz from the fans. But if you can hear it then you are more or less screwed. It is the one thing in the industry that prevents silent gaming. You can get good fans and set them at slow RPM, you can get a quiet GPU, you can get a completely silent CPU cooler, but there will always be this stupid whining noise that ruins every silent-build attempt. Coolers got quiet and efficient, noisy HDDs were replaced by silent SSDs, but coil whine is still there and it is not going anywhere. I had only one GPU whose whine was faint and not annoying, and even that one started to whine like crazy after a few months; all the other cards I have had whined out of the box.

This is actually the only reason I have ever returned a GPU: 3 of them now, out of the 5 I have ever purchased. The Asus 270X DCU II and MSI 980 Ti Gaming were okay at the beginning (and both started to whine after some time), while the Asus 980 Strix, EVGA 980 Ti Hybrid and Sapphire 390 Nitro whined like crazy.

I don't like headphones and I prefer playing with 5.1 speakers, but it looks like I will have to find some super-comfortable headset and isolate myself from the ambient noise, because that is the only way to avoid coil whine.


----------



## dagget3450

#6 in the HOF for 4x GPU:
http://www.3dmark.com/hall-of-fame-2/timespy+3dmark+score+performance+preset/version+1.0/4+gpu
Quote:


> Dagget340 --- [email protected] -- 4x [email protected] 1150/540-1150/540-1100/540-1100/540 --- score 16100
> 
> http://www.3dmark.com/3dm/13369730?


I am happy with it for what it is, I guess. (It won't last long.)


----------



## zuru1

@Krzych04650
Well then, since I'm screwed, I will try the GPU block (it's arriving tomorrow) and see what can be achieved by that.
Otherwise it's either move the PC somewhere farther than ~30 cm away from me, or return the GPU.

Thank you for your thoughts on the matter.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> how did you check the SP?


GPU-Z.



Quote:


> Originally Posted by *dagget3450*
> 
> Also, i have a question for you if you don't mind. I have a furyx that says ASIC score of 64% it appears to be highest of my 4 cards. However it wont Oc for nothing. I cant go past 1100 core stock or even with voltage. I thought i saw somewhere ASIC is meaningless for fiji. My question is though do you think i can do anything to the card to help it? I was debating taking it apart to see if its not making good contact on cooler.
> 
> any thoughts?


ASIC quality does mean something, but not in the way it is generally thought of.

I have learned what it means for us (as in AMD owners) by reading The Stilt's posts. It can be hard to make people believe it is as The Stilt states; a few times I've had members think I'm nuts.

Besides the OP of the Hawaii bios mod thread ("What is ASIC Quality?"), I posted that same info in the Fiji ASIC quality thread: short post, long post, then you also wanna read this. The cards in the last linked post were complete crap for OC'ing even though they had 65/68% ASIC ratings (bad ASICs IMO). I will dig up the data from my other rig; suffice to say they needed like +50 mV to reach 1100 MHz or 1110 MHz. My stability testing is harsh compared to others', from reading their posts. I will at times do 2-3 hrs each of Heaven/Valley/3DM13 looped and 12-24 hrs of solid [email protected] I work from home and have an open-plan living space, so I can keep an eye on them whilst getting on with my daily routine. The 7 Fiji cards I had were at times running without powering down for a break whilst being tested to determine which one I'd keep.

So basically I use ASIC quality to know how leaky the ASIC is. Coupled with that, the VID per DPM is also what I use to deem whether it is a bad or good ASIC within a leakage level.

But as always, how a card reacts to OC'ing is still the plain old "silicon lottery".

One of the 7 cards I had was a stonkingly high-leakage ASIC (i.e. high ASIC quality); it had a DPM 7 VID of 1.187 V. It scaled the best and allowed more voltage before negative scaling kicked in, *but* within <14 days it stopped holding an OC which it had been fine with in the first week. The one in my sig I've had the longest and it has been through hell with me, but it is the best of them all IMO.

I don't know if removing the cooler will aid you. How I see it, we can aim to improve OC ability with cooling, BIOS, which driver we use, etc., but if the silicon can't do it, it just won't.


----------



## Krzych04650

Quote:


> Originally Posted by *zuru1*
> 
> @ Krzych04650
> Well then since I'm screwed I will try with the GPU block since its arriving tomorrow and see what can be achieved by that.
> Otherwise its either move the PC from the table to a more far place than ~30 cm away from me or return the GPU.
> 
> Thank u for ur thoughts on the matter.


Having your PC as close as 30 cm from where you sit is always a horrible idea; you can hear every little buzz and whine from there. I was talking about having the PC below the desk. Having the PC on the desk is just asking for acoustic issues. But coil whine is a bit different: it will whine all around the room, and no sound-damped case or desk can stop it anyway.


----------



## dagget3450

Quote:


> Originally Posted by *Krzych04650*
> 
> Having your PC as close as 30 cm from your sitting place is always horrible idea, you can hear every little buzz and whine from this place. I was talking about having PC below the desk. Having PC on the desk is just asking for acoustic issues. But with coil whine it is a bit different, it will whine all around the room and no sound dumped case or desk can stop it anyway.


I thought all the cool kids had VR goggles glued to their heads with massive around-the-ear headphones. They could have one of these and wouldn't know it.


----------



## zuru1

Quote:


> Originally Posted by *Krzych04650*
> 
> Having your PC as close as 30 cm from your sitting place is always horrible idea, you can hear every little buzz and whine from this place. I was talking about having PC below the desk. Having PC on the desk is just asking for acoustic issues. But with coil whine it is a bit different, it will whine all around the room and no sound dumped case or desk can stop it anyway.


I currently have no sort of organization in my office space, but that will change soon when I'm done with my build and have reorganized the office.
I don't like having my PC on the floor, since dust gets sucked into the case, and honestly the PC isn't that loud; the fans run at about 600-900 RPM and I don't mind having it near me. It's the coil whine that is just killing the setup.
My last pair of XFX 390Xs in CF had really bad coil whine, where they squeaked at anything demanding, and I had to return them.


----------



## josephimports

http://www.3dmark.com/3dm/13368347
The Kill-A-Watt peaked at ~1250 W.


----------



## xkm1948

Time Spy is useless. Apparently this is just another buyout by Nvidia. It did not tap into async compute at all, to favor Nvidia hardware. You really shouldn't waste 5 dollars on this.


----------



## dagget3450

Quote:


> Originally Posted by *josephimports*
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/13368347
> Killawatt peaked at ~1250w.


good stuff!
Quote:


> Originally Posted by *xkm1948*
> 
> Time spy is useless. Apparently this is just another bought out from nVidia. It did not tap on Async compute at all to favor nvidia hardware. Really shouldn't waste 5 dollars on this.


While it may have its issues, it's a benchmark that's going to see a lot of use, probably even in reviews. 5 dollars isn't much to get upset over, and maybe things will get better.

*People using more than 2 GPUs need to be aware of a current issue with the benchmark:*





From this thread:
Quote:


> Originally Posted by *FMJarnis*
> 
> It is LDA Explicit, or Linked-Node Explicit Multi-adapter. Identical GPUs only, but with tasks related to mGPU dispatching handled by the engine, not the driver.
> 
> Problem with MDA is that if you do it like Ashes of Singularity, you get at best 2x the slowest GPU and everyone would be complaining why 3DMark isn't using all the resources of both GPUs. And to do it so that it would actually use all resources of two wildly different GPUs would require hideously complex code that would somehow "feel out" what the GPUs can do and then somehow load balance them, while responding to variances in performance and... uuuh, it would be very very complex piece of code and potentially a very fragile piece of code.
> 
> dGPU + iGPU the gains would be marginal and again the complexity would go through the roof.
> 
> I'm not saying a future 3DMark test can't ever support MDA, but Time Spy was designed for Linked-node. Partially because we wanted to ship it in 2016 and not 2017, and partially because mixed GPU (NV+AMD) setups basically do not exist outside press test labs.
> 
> Now if game developers suddenly decide that MDA is the best way to go and multiple games support NV+AMD mixed cards and everyone starts buying them so NV+AMD mixed setups are relevant for actual gaming, we'll definitely take notice.


Quote:


> Originally Posted by *FMJarnis*
> 
> Yes.
> 
> Also note that DX12 has some kinks in this regard - more than 2 cards performance will differ with full screen vs. borderless window.


Quote:


> Originally Posted by *FMJarnis*
> 
> Yes it does.
> 
> Here is an example of a custom run from 3x R9 290X setup using Borderless Window and 4K resolution


I plan to test this; however, if there are gains from using borderless window, it is only a custom-run test. It's not applicable to HOF/valid/top 30, which is quite interesting. I want to see what the differences are first.

So I followed FMJarnis's suggestion and ran quadfire Fury X at stock, only setting borderless window in custom settings and leaving the rest default, and it's saddening. I gained over 1k on GPU score alone with STOCK GPUs against my max overclock run....

Test run (borderless window, stock GPUs):

Max overclock official run stats to compare:


That would be huge if it were a valid score, given where I'm at...

Edit: added settings shot.
Edit: added post from FMJarnis:
Quote:


> Originally Posted by *FMJarnis*
> 
> I meant that when using more than 2 GPUs, DX12 full screen path has some... quirks. Borderless window does not have these, so it is definitely possible to get a higher graphics score with 3 and 4 GPUs using Borderless Window.
> 
> It is possible Microsoft will improve DX12 exclusive full screen at some point, or vendors do something in drivers, but that's how it works now.


Does this apply to Nvidia as well?


----------



## xkm1948

Lots of discussion happening right now regarding "NvidiaMark" Time Spy cheating. And the timing of the Time Spy release makes it even more fishy: right when AMD cards are gaining significant performance from DOOM's Vulkan patch.


----------



## dagget3450

Quote:


> Originally Posted by *xkm1948*
> 
> Lots of discussion happening right now regarding nvidiamark time spy cheating. And the timing of time spy release makes it even more fishy--->right when AMD cards are gaining significant performance from DOOM Vuklan patch.


While it may or may not be true, when has there not been bias against AMD? Many benchmarks use insane amounts of tessellation or what have you. I guess if it's some sort of legit scandal then great, but these things usually get swept under the rug. I will treat it like all the other benches so far and just bench anyway, because I like doing it. If anything, the questions that come to my mind are simple ones: where are the AA options? Where are the Fire Strike equivalents of official scoring runs, i.e. 1080p/1440p/4K like FS/FSE/FSU?

I am not going to get caught up in drama that may be anything from tinfoil-hat theories to brand loyalties. I am just going to try to have some fun.


----------



## gupsterg

Quote:


> Originally Posted by *shadowxaero*
> 
> Edit: I do get negative scaling in Ashes once I go past 1.256v


+rep for this info share.


----------



## AndreDVJ

Quote:


> Originally Posted by *gupsterg*
> 
> I had been using your bench for compares with others and then realised you have 4096SP unlocked, shadowxaero is 3840SP IIRC.
> 
> So in this compare is CaNnoN Fury (Fury 3584SP) vs Shadowxaero (Fury 3840SP) vs ozyo (genuine Fury X) vs AndreDVJ (Fury 4096SP)


Still, I ran at 1200/550 MHz: absolutely no increase in scores with my hardware.
http://www.3dmark.com/compare/spy/90607/spy/13079

I am probably running into the power limit. 1150/550 MHz is stable, and I have no interest in going further. I'd sooner get a new graphics card or add another card.


----------



## gupsterg

Cheers for the share.

Perhaps you're hitting the negative scaling effect: going from 1150 to 1200 is a ~4% GPU clock increase, so I'd expect to see some improvement.

For PL you could check whether the card is dropping clocks, etc.

On another note, the crazy thing I've noted is that when I do get negative scaling, it's not as if the GPU clock is being dropped the way it is when we run into the PL limit.

I've also been chatting to The Stilt and Mumak concerning A/W readings; I hope to post about it soon once more stuff has been discussed / explained to me. It's doubtful at the moment that we'll solve the negative scaling issue.

Like you, I'm happy with 1145/545 for daily use, but when you have a card which shows potential for higher clocks, yet the performance is lost as voltage increases, it sorta sucks.
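A quick way to sanity-check whether a run is actually scaling (plain arithmetic, nothing card-specific): compare the percentage clock increase with the percentage score increase; if the latter is near zero, the extra clock and voltage bought nothing.

```python
# Expected vs. actual scaling: 1150 -> 1200 MHz is ~ +4.3% on the core, but in
# the compare above the graphics score did not move at all (5493 both times).
def scaling_pct(before, after):
    return (after / before - 1) * 100

clock_gain = scaling_pct(1150, 1200)  # ~ +4.3% core clock
score_gain = scaling_pct(5493, 5493)  # no score change at the higher clock

print(f"clock {clock_gain:+.1f}%, score {score_gain:+.1f}%")
```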


----------



## LionS7

Hello everyone. Is it normal that my GPU wants +36 mV for 1100 MHz stable, which is around 1.23 V, and that the memory can't go past 520? I think that since the Crimson generation of drivers these cards want more voltage, in some cases even at stock clocks, like it was with Hawaii/Grenada.


----------



## Performer81

+36 is not that unusual if your stock voltage is maybe a little low or your GPU runs hot. For HBM, I think most reach 545 MHz (only 500, 545, 600, 666 are possible; it clocks in steps).
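If the step behaviour described above is right, the effective HBM clock for a requested value can be sketched like this (the snap-down rule is my assumption; I haven't confirmed which way the driver rounds):

```python
# Fiji HBM strap frequencies per the post above; a requested clock is assumed
# to fall back to the highest strap at or below it (snap-down assumption).
HBM_STEPS_MHZ = [500, 545, 600, 666]

def effective_hbm_clock(requested_mhz):
    eligible = [s for s in HBM_STEPS_MHZ if s <= requested_mhz]
    return eligible[-1] if eligible else HBM_STEPS_MHZ[0]

print(effective_hbm_clock(560))  # 545 under the snap-down assumption
```

That would explain why dialing in, say, 560 MHz shows no gain over 545 MHz.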


----------



## Elmy

Here is a video I just took at PDXLAN 28 today of my new build with the Primochill Wetbench. It has the Pro Duo with EK Waterblock and Backplate.

Stock-settings temps are 34/38°C on the Pro Duo when gaming.


----------



## Semel

Quote:


> Originally Posted by *LionS7*
> 
> Hello everyone. Is it normal that my gpu want +36mv for 1100Mhz stable.


I see you got fury x..

For a Fury (non-X) I'd say it's pretty normal. My Fury requires almost 1.3 V to be stable at 1100-1120 (it depends on the game). Temps never exceed 60°C. Most of these cards are just bad OC'ing-wise.

As for the Fury X: generally you won't get past 1150.


----------



## Greenland

Quote:


> Originally Posted by *LionS7*
> 
> Hello everyone. Is it normal that my gpu want +36mv for 1100Mhz stable, witch is around 1.23V, and the memory can't go past 520 ? I thing that after Crimson generations drivers, these cards want more voltage, even in some cases with on stock cards ? Like it was with Hawaii/Grenada.


Strange, I have tried 4 Furies already (ASUS Strix, Sapphire Tri-X and 2 Nitros); none can clock past 1.1 GHz and all require +100 mV to be stable. HELP?!?!


----------



## LionS7

As I said, I think the Crimson generation is heavy on these GPUs, but I see there are people with even worse GPUs than mine. I'm OK with 1.23 V for 1100 core. My VID is around 1.20 V, and at +96 mV, which is the max, the voltage is ~1.30 V. At 1.16 V the core crashes at stock clocks.

@Semel, Greenland, Performer81 - thanks for the info.


----------



## dagget3450

Quote:


> Originally Posted by *Elmy*
> 
> Here is a video I just took at PDXLAN 28 today of my new build with the Primochill Wetbench. It has the Pro Duo with EK Waterblock and Backplate.
> 
> Stock setting temps are at 34/38c on the Pro Duo when gaming.


Cool man, are you going to stick with one Pro Duo or add another sometime?


----------



## Krzych04650

I have a question regarding dual BIOS on Sapphire Fury Nitro. Reviews say that first BIOS has 260W limit and 75C temp target, while second BIOS has 300W and 80C temp target. Does it mean that the card will start to downclock and throttle when it reaches 80C and I cannot set different temp target?


----------



## LionS7

No, the fans will ramp up. That BIOS is for extreme use.


----------



## Shau76434

I can buy a Sapphire Radeon R9 Fury Nitro for 370 euros (including shipping) or an AIB RX 480, which I assume will be priced in Europe at ~300-320 euros (including shipping as well). Which card would you buy or recommend?


----------



## xTesla1856

Quote:


> Originally Posted by *Barca130*
> 
> I can buy a Sapphire Radeon R9 Fury Nitro for 370 euros (including shipping) or an AIB RX 480, which I assume will be priced in Europe at ~300-320 euros (including shipping as well). Which card would you buy or recommend?


Nitro Fury hands down.


----------



## Krzych04650

Quote:


> Originally Posted by *Barca130*
> 
> I can buy a Sapphire Radeon R9 Fury Nitro for 370 euros (including shipping) or an AIB RX 480, which I assume will be priced in Europe at ~300-320 euros (including shipping as well). Which card would you buy or recommend?


Personally I ordered a Sapphire Fury Nitro from caseking.de for €360 including shipping to Poland; the standard price is €349. In Polish currency I paid 1580 ZL. No 480 or 1060 can touch this price/performance ratio, especially if we only consider the ones with a cooler and build quality similar to the Sapphire Fury Nitro, i.e. the top models.

According to TechPowerUp, the Fury is 23% more powerful than the RX 480 (at 1440p).

One downside is that they are going to ship the card to me on 29 July, so quite a bit of waiting (but still less than waiting for 480 custom card reviews and reasonable availability), and it has 4 GB of memory, but realistically that is enough; I can only name a few games that ate more than 4 GB of VRAM. Those that did were either so poorly optimized that they weren't playable anyway, or had some Hyper texture setting that consumed twice the VRAM of the lower tier setting and gave no real picture-quality improvement.

And meanwhile Polish retailers claim the Fury is on sale for 2600 ZL... while you can get one for 1550 from Germany or the UK.
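For what it's worth, the price/performance claim can be sanity-checked with trivial arithmetic; the 23% figure is the TechPowerUp number quoted above, and the ~310 EUR RX 480 price is an assumption based on prices mentioned earlier in the thread:

```python
# Relative performance per euro; the 23% uplift and both prices are figures
# from this thread, used here purely as an illustration.
def perf_per_euro(relative_perf, price_eur):
    return relative_perf / price_eur

fury_value = perf_per_euro(1.23, 360)    # Fury Nitro, 360 EUR shipped
rx480_value = perf_per_euro(1.00, 310)   # RX 480 baseline, assumed ~310 EUR
print(fury_value > rx480_value)  # True: the Fury wins on this arithmetic
```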


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> No, the fans will pump up. This is bios for extreme use.


Not really, if the Fury Nitro ROMs have been done the same way as the Fury Tri-X STD/OC ones.

Quote:


> Originally Posted by *Krzych04650*
> 
> I have a question regarding dual BIOS on Sapphire Fury Nitro. Reviews say that first BIOS has 260W limit and 75C temp target, while second BIOS has 300W and 80C temp target. Does it mean that the card will start to downclock and throttle when it reaches 80C and I cannot set different temp target?


Attach dumps of your ROMs.

Now, in the increased-PL Fury Tri-X STD/OC ROMs, this is what they did:

i) Power limit: IMO TDP / MPDL should be equal when raised; this is what we saw in the Hawaii OC-edition factory ROMs, and you can reason that if MPDL is raised, TDP should match it.

ii) The GPU throttle temperature remains the same between ROMs, but the cooling profile has been modified. The "increased PL" ROM actually aims to keep the GPU at 80C via "fuzzy logic" fan control (there are min/max PWM values as well), whereas the stock-PL ROM aims for 75C, so technically the stock-PL ROM is better for OC'ing from a temperature standpoint.



More information in the Fiji BIOS mod OP/thread.









----------



## Shau76434

Quote:


> Originally Posted by *xTesla1856*
> 
> Nitro Fury hands down.


Quote:


> Originally Posted by *Krzych04650*
> 
> Personally I ordered Sapphire Fury Nitro from caseking.de for €360 including shipping to Poland, standard price is €349. In Polish currency I paid 1580 ZL. No 480 or 1060 can touch this price/performance ratio, especially if we consider getting ones with similar quality cooler and build qualtiy level to Sapphire Fury Nitro, so the top ones only.
> 
> Accoring to TechPowerUp, Fury is 23% more powerful than RX 480 (in 1440p)
> 
> One downside is that they are going to ship this card to me on 29 July, so quite a bit of waiting (but still less than waiting for 480 custom cards reviews and reasonable availability), and it has 4 GB memory, but realistically it is enough, I can barely mention few games that ate more than 4 gigs of VRAM. And those which had was either poorly optimized so still no playable anyway or had some Hyper textures than consumed 2 times more VRAM than lower tier setting and gave no real picture quality improvement.
> 
> And meanwhile Polish retailer claims that Fury is on sale for 2600 ZL... While you can get one for 1550 from Germany or UK


Thx for the replies, will order the fury one of these days.


----------



## Agent Smith1984

Damn. Fury is only $299.99 now!

Nevermind, sold out in like 5 minutes, hahaha


----------



## Maximization

tried the new time spy bench..

If I overclock my CPU maybe I could pass 10,000; I don't know, my system is looking kinda old.

http://www.3dmark.com/spy/105264


----------



## dagget3450

Quote:


> Originally Posted by *Maximization*
> 
> tried the new time spy bench..
> 
> if i overclock my cpu maybe i could pass 10,000, i dont know, my system looking kinda old
> 
> http://www.3dmark.com/spy/105264


Hey, if you don't mind, I was hoping I could ask you to run a test in Time Spy. If you're up to it, can you run a Time Spy custom run and only change two settings? Run your Fury CF at stock in a custom run with "borderless window" enabled and the CPU test turned off, then compare your graphics score with the run you just posted. I am curious what the difference in the graphics tests will be, if any.


----------



## Maximization

Quote:


> Originally Posted by *dagget3450*
> 
> Hey if you don't mind i was hoping i could ask you to run a test in timespy. If your up to it, can you run timespy custom and only change 2 settings? Run your fury CF @ stock in custom run with "borderless window" enabled and turn off CPU test. Then compare your graphics scores with your run you just posted. I am curious what your difference will be in graphics tests if any at all.


I don't see those options.

Capture.PNG 154k .PNG file


----------



## dagget3450

Quote:


> Originally Posted by *Maximization*
> 
> i dont see those options.
> 
> Capture.PNG 154k .PNG file


I think this is only available in the purchased version:

When you open 3DMark, select "More tests", then up top click on the Time Spy box. You should see a Details and a Custom run tab. Click the Custom tab and the options should be there.


----------



## Elmy


Quote:


> Originally Posted by *dagget3450*
> 
> Cool man, you going to stick with a duo pro or add another sometime?


Just going to stick to 1 for now for this machine. I built this for VR.

Going Quad Vega as soon as they drop though....


----------



## Maximization

found it



Graphics score was pretty much the same:
http://www.3dmark.com/spy/107672

10046 vs 10059.


----------



## New green

Just something I've been noticing, but why are so many AMD cards being held hostage, from the 480 to the Fury X? Here in the States a Fury can be marked up as high as $800. The Fury X is still floating around $600, with only one seller posting as low as $460. I understand the 480 is going to be restocked this week to bring the absurd $330-$500 prices back down towards $270, but is this from people actually thinking they can sell these cards at those prices, or is it some tinfoil-hat conspiracy about Nvidia trying to get people to buy their 10-series cards by buying up all the AMD cards and reselling them overpriced?


----------



## Krzych04650

Quote:


> Originally Posted by *New green*
> 
> Just something I've been noticing, but why are so many AMD cards being held hostage, from the 480 to the Fury X? Here in the States a Fury can be marked up as high as $800. The Fury X is still floating around $600, with only one seller posting as low as $460. I understand the 480 is going to be restocked this week to bring the absurd $330-$500 prices back down towards $270, but is this from people actually thinking they can sell these cards at those prices, or is it some tinfoil-hat conspiracy about Nvidia trying to get people to buy their 10-series cards by buying up all the AMD cards and reselling them overpriced?


Pricing of last-gen AMD cards is indeed interesting. There are literally two or three retailers in all of Europe selling the 390/390X/Fury/Fury X at a proper price that takes the current market into account, namely that they cannot sell at release price anymore since there are newer, more powerful cards at lower prices with Polaris and Pascal. There are very few offers like this, and only for selected models. For most retailers, AMD card prices are still as if nothing had happened and the cards were released a week ago. And at the same time, the entire GeForce 900 series got a serious price cut...

Also, I don't know how it is in other countries, but in Poland the GTX 1060 came in at a proper price, and you can get the cheapest blower-cooler AIB card for around 1249, which is actually a bit below the bottom MSRP, while all RX 480 references came in at 1349, way over the MSRP that was set at 1170. I am not a fanboy of either side and I have had both green and red cards, but what is happening now in Poland looks like clear price fixing to break AMD sales.

So this is either poor pricing policy and poor influence on prices by AMD, or there is some serious price fixing going on.


----------



## nyk20z3

How much would you guys value a few-month-old Asus Nano with an installed EK waterblock? I am looking to move on to something else since I moved back to an ATX form factor.


----------



## mypickaxe

Quote:


> Originally Posted by *nyk20z3*
> 
> How much would you guys value a few month old Asus Nano with an installed EK waterblock ? I am looking to move on to something else since i moved back to an atx form factor.


If I were in the market for a used Nano with a pre-installed waterblock, I'd be looking to spend no more than $375 to $400 at the highest end.
Quote:


> Originally Posted by *New green*
> 
> Just something I've been noticing, but why are so many AMD cards being held hostage, from the 480 to the Fury X? Here in the States a Fury can be marked up as high as $800. The Fury X is still floating around $600, with only one seller posting as low as $460. I understand the 480 is going to be restocked this week to bring the absurd $330-$500 prices back down towards $270, but is this from people actually thinking they can sell these cards at those prices, or is it some tinfoil-hat conspiracy about Nvidia trying to get people to buy their 10-series cards by buying up all the AMD cards and reselling them overpriced?


That would be cornering, illegal price manipulation in certain countries and, let's be honest, there would be no way for that to *not* show up somewhere in their books. It's not happening.


----------



## snurds

I think a modest premium over performance for the Fury X makes sense because it can't compete on specs or performance with the 980ti or 1070 among brand-neutral buyers, so the sellers might as well extract more money out of buyers that are locked into AMD for one reason or another.

Still that doesn't sound like what you are talking about.


----------



## Sonikku13

AMD cards are generally more expensive due to ETH mining, but ETH mining isn't that profitable atm... Must be lack of supply.


----------



## Flamingo

Latest drivers (16.7.2) got WHQL version release 2 days ago, so no more "driver not verified by WHQL" message on 3DMark.


----------



## dagget3450

Thank you for posting that; I plan to test these in Time Spy now and get valid results.


----------



## Kana-Maru

Quote:


> Originally Posted by *dagget3450*
> 
> thank you for posting that, i plan to test these in Timespy now and get valid results


It's the same driver, just WHQL.

-*These drivers are now Microsoft WHQL certified

It doesn't look like it'll change anything except validation.


----------



## dagget3450

Quote:


> Originally Posted by *Kana-Maru*
> 
> It's the same driver, just WHQL.
> 
> -*These drivers are now Microsoft WHQL certified
> 
> It doesn't look like it'll change anything except validation.


Exactly. Something to give a whirl for that reason.
Quote:


> dagget3450 -- [email protected] -- 4x furyx - 1175/560- 1175/560 - 1105/560- 1105/560 - score 16710
> http://www.3dmark.com/3dm/13468567?
> 
> 
> 
> Newest drivers are now WHQL thank you AMD. On the other hand they have altered my gpu Oc's in a possibly negative way. Top 5 keeps running away


I may go back to 16.6.2 since these seem to have nerfed my GPU OC.


----------



## Krzych04650

All right, my Sapphire Fury Nitro is already on the way. Ordered it from caseking.de for 360 EUR including shipping to Poland; amazing deal. They said estimated delivery and stock availability was 29 July, but it is already on the way. So far I am very satisfied with this shop: great support, cheap shipping, and they even said they will pay for return shipping if I don't like the product. Prices are far better than in Poland for many products, especially GPUs (CPUs are actually more expensive there, but GPUs are much better priced). AMD GPUs in particular are around 20% less; this Fury is actually 77% more expensive in Poland, but there is some serious price fixing going on there in favor of Nvidia, so this is to be expected.

Too bad UPS is too slow to deliver tomorrow, so delivery is scheduled for Monday. But standard delivery was 11 EUR and express was 77 EUR, so...


----------



## comagnum

I'm getting a Nano next week. Any advice for getting the max performance out of the card? I.e. throttle prevention, max OC, etc.


----------



## dagget3450

Quote:


> Originally Posted by *comagnum*
> 
> I'm getting a nano next week. And advice for getting the max performance out of the card? Ie. Throttle prevention, max oc, etc


Finished with your rx 480 already?


----------



## comagnum

Quote:


> Originally Posted by *dagget3450*
> 
> Finished with your rx 480 already?


Of course not! I got it in a trade along with some upgrades to my current set up.


----------



## dagget3450

Quote:


> Originally Posted by *comagnum*
> 
> Of course not! I got it in a trade along with some upgrades to my current set up.


hehe nothing wrong with more than 1 gpu









So negative voltage scaling in Time Spy is confirmed:
http://www.3dmark.com/compare/spy/120380/spy/120753#

Both runs were done at 1200 clocks; the one with added voltage scored lower. This bench emulates DX11 performance to a T for me... in multi-GPU testing... I wonder....


----------



## mechwarrior

I just bought a Gigabyte Nano and was having flickering issues with the Crimson software; I installed Catalyst and the flickering has gone. Any ideas on what it could be? My monitor is an LG MU67 4K. Thanks for the help.


----------



## flopper

Quote:


> Originally Posted by *mechwarrior*
> 
> i just bought a gigabyte nano. having flickering issues with crimson software installed catalyst and the flickering has gone???
> any ideas on what it could be? my monitor is LG mu67 4k.
> thanks for the help?


2D clock-rate changes on the card have been one such issue. One way to solve it is to set the 2D clocks a bit higher; memory clock changes due to power-saving features can cause that kind of flickering.


----------



## gupsterg

@dagget3450

+rep for info that you get negative performance scaling in timespy.

@mechwarrior

There is a thread on AMD community forum, where numerous Fiji owners have posted similar issue, I would post there so perhaps AMD will solve issue.

@flopper

HBM clock is always 500MHz on Fiji stock ROMs; I have made 2-state HBM clock ROMs. GPU clock changes may be the culprit. I was testing the older Catalyst v15.7.1 driver on a Fury X and got display corruption, but never on Crimson v16.3.2 on the Fury X I've had the longest. When another Fury X I owned was in another rig using v16.3.2 I got display corruption only once; that card was disposed of afterwards. On Fury/Fury X, later drivers like Crimson v16.3.2 have a "Power Efficiency" feature which makes the GPU clock stick to 300MHz at desktop/low loads; Catalyst v15.7.1 does not, and there I saw the GPU clock yo-yo about wildly.

AFAIK the Nano does not have the "Power Efficiency" feature even in later drivers.


----------



## rv8000

Just a heads up to anyone looking to buy a Fury or grab a second for CFX: Newegg has been offering some serious discounts on the Sapphire R9 Fury Nitro when in stock. Brand new, the last price alert I got was $299 (the one before that was $349); currently OOS though.


----------



## broadbandaddict

XFX Fury X is $400 again on Newegg. They were running a deal where if you paid with PayPal you got $25 more off; not sure if they're still doing that, but it would be $375 if they are.









http://www.newegg.com/Product/Product.aspx?Item=N82E16814150742


----------



## Sonikku13

Getting 26 MH/sec with a stock Nano in ETH mining... Hmm, should I do it?


----------



## costilletas

Quote:


> Originally Posted by *Sonikku13*
> 
> Getting 26 MH/sec with a stock Nano in ETH mining... Hmm, should I do it?


https://badmofo.github.io/ethereum-mining-calculator/
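Calculators like that one boil down to one line of arithmetic: your share of the network hashrate times the coins issued per day. Here is a minimal sketch; the network hashrate, block reward and block time below are illustrative assumptions, not live data:

```python
# Expected ETH per day = (your hashrate / network hashrate) * blocks per day * reward.
# All network figures here are illustrative assumptions, not live chain data.
def daily_eth(my_hashrate_mhs, network_hashrate_ghs,
              block_reward_eth=5.0, block_time_s=14.0):
    share = (my_hashrate_mhs * 1e6) / (network_hashrate_ghs * 1e9)
    blocks_per_day = 86400 / block_time_s
    return share * blocks_per_day * block_reward_eth

# e.g. a 26 MH/s Nano against an assumed 2000 GH/s network:
print(round(daily_eth(26, 2000), 4))  # -> 0.4011 ETH/day
```

Plug in current network numbers (and subtract electricity cost) and you get the same answer the calculator gives.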


----------



## Flamingo

Quote:


> Originally Posted by *Sonikku13*
> 
> Getting 26 MH/sec with a stock Nano in ETH mining... Hmm, should I do it?


I'm only getting 19 MH/s









What drivers are you using? OS? OpenCL version? BIOS? have you done anything special?

Edit: Got 25.83 MH/s with Genoil and 24 with Claymore; latest drivers, Win 10.


----------



## Sonikku13

Quote:


> Originally Posted by *Flamingo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> Getting 26 MH/sec with a stock Nano in ETH mining... Hmm, should I do it?
> 
> 
> 
> Im getting 19MH/s only
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What drivers are you using? OS? OpenCL version? BIOS? have you done anything special?
> 
> Edit: Got 25.83MH/s with genoil and 24 with claymore. latest drivers, win 10

Claymore miner, latest driver, Windows 10, latest OpenCL version, idk what BIOS, no overclocking; just cranked the fan to 100%. It sits at 61C.


----------



## AliNT77

definitely try undervolting

Lower power consumption - lower temps - higher clocks


----------



## Flamingo

Geth also has sustained 30 MB/s disk usage when mining, so I don't know how much of an impact that has on an SSD's life; might want to consider that too, unless there is a way to keep it in memory only?

Nevermind, that's just the initial blocks being downloaded from Mist.


----------



## Kossou

Hi all!

Just want your opinion: should I get an Asus R9 Nano for 349€, or wait for the Sapphire Nitro RX 480 4GB? The RX 480 is somewhere around 270€. I've heard that most Nanos have a coil-whine issue and are a little noisy, but for that price I'm quite tempted to buy one...


----------



## bluezone

Quote:


> Originally Posted by *Kossou*
> 
> Hi all!
> 
> Just want your opinion, should i get asus r9 nano for 349€ or wait for sapphire nitro rx 480 4gb? Rx 480 is something around 270€. I've heard that most nanos have coil whine issue and they are little bit noisy but for that price, i'm quite tempted to buy one...


I've never had the issue of coil whine from my Nano (from a power supply, yes). I have been very happy with the Nano, and the Nano at stock clocks has better performance than the RX 480. If cash isn't an issue I would say go for the Nano, especially if you're using 1440p.


----------



## Flamingo

If you're going to game at 1080p go with the RX 480; the Nano for 1440p or above.

The coil whine I noticed was only at loading screens (where you get 5k fps).


----------



## snurds

Quote:


> Originally Posted by *AliNT77*
> 
> definitely try undervolting
> 
> Lower power consumption - lower temps - higher clocks


Is the relation you're referring to here lower voltage -> lower temps -> more thermal headroom -> higher clock rates?

Just curious about how to get higher clocks by undervolting.


----------



## mrcrusty

Quote:


> Originally Posted by *snurds*
> 
> Is the relation you're referring to here lower voltage -> lower temps -> more thermal headroom -> higher clock rates?
> 
> Just curious about how to get higher clocks by undervolting.


It's to do with power throttling more so than thermals. Increasing the power limit or undervolting prevents any downclocking, allowing the card to run at its advertised clock speeds 100% of the time.

Recently got a Nano for myself in preparation for an M-ITX build. Testing it in my main rig has been great; might get another one for the main rig permanently. Performance went up by about 8% in Valley after giving a 5% OC to core and memory, undervolting it and setting the power limit to max. Pretty chuffed.
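The headroom freed by undervolting can be roughed out from the fact that dynamic power scales with roughly the square of voltage at a fixed clock. A back-of-the-envelope sketch; the 1.212 V baseline and wattage are illustrative assumptions, and real cards deviate from the pure V-squared model:

```python
# Rough V^2 scaling of dynamic power at a fixed clock; baseline figures are
# illustrative assumptions, not measurements of any particular card.
def scaled_power(base_power_w, base_voltage_v, offset_mv):
    v_new = base_voltage_v + offset_mv / 1000.0
    return base_power_w * (v_new / base_voltage_v) ** 2

# e.g. a card drawing 275 W at 1.212 V, undervolted by 72 mV:
print(round(scaled_power(275, 1.212, -72), 1))  # -> 243.3 W, about 11.5% saved
```

That saved wattage is exactly the margin that keeps the card under its power limit, which is why an undervolted card can hold its boost clock instead of throttling.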


----------



## lestatdk

Back on team Red after about a year in the green camp









Bought a Sapphire Fury Nitro from overclockers UK. Got the card the next day. Total was around 390 EUR with DHL shipping









Got 1150 on stock voltage; so far looking good. I have no idea what frequencies I should be aiming for here, so going ahead slowly.


----------



## Sonikku13

Decided to sell my Nano, hoping to get at least $350 so I can justify spending an extra $200 on 2x Nitro+ 480s.


----------



## bluezone

For anyone else who has lost the ability to use VSR since Crimson 16.3.1 while using HDMI to older TVs: asder00 over on Guru3D has made up a driver package from a different driver branch. I have tried it and it fixes what AMD has not fixed so far.

Read about it here:

http://forums.guru3d.com/showthread.php?t=408903

Find the driver in this thread:

http://forums.guru3d.com/showthread.php?t=408911

Hopefully this is a sign of things to come, or at worst a happy accident.

Enjoy. Cheers.


----------



## comagnum

Quote:


> Originally Posted by *Sonikku13*
> 
> Decided to sell my Nano, hoping to get at least $350 so I can justify spending an extra $200 on 2x Nitro+ 480s.


I have my customized 480 up for trade; I'll add cash to compensate. It runs better than these AIB cards.


----------



## Krzych04650

Just got Sapphire Fury Nitro. Overall the card is good.

One thing I am amazed by is the cooler. It is a huge improvement over the 390 Nitro. The fans start from ~700 RPM, not 1300 RPM like on the 390. At similar noise levels I got 85C on the 390 (94C after OC) while the Fury Nitro keeps 74C at stock and 81C with OC, and this with an ambient temperature of 23.7C for the 390 versus 27.7C for the Fury. So the delta temps are even better for the Fury: it's 15C cooler at stock and 17C cooler after OC.
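Subtracting ambient as done above is the right call when room temperatures differ; the delta-T arithmetic, using the stock-clock numbers from this post (which work out to a 15C advantage):

```python
# Ambient-corrected (delta-T) comparison using the stock-clock figures above.
def delta_t(gpu_temp_c, ambient_c):
    return gpu_temp_c - ambient_c

r390_stock = delta_t(85, 23.7)  # 390 Nitro: 61.3 C over ambient
fury_stock = delta_t(74, 27.7)  # Fury Nitro: 46.3 C over ambient
print(round(r390_stock - fury_stock, 1))  # -> 15.0 C cooler, delta-wise
```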

I was able to get 1120 MHz on the Fury but only 1090 MHz on the 390. Also, to stabilize the clock the Fury needs only +18mV (any higher overclock ends in a crash even with +100mV), while the 390 needed the full +100mV to be stable at 1090 MHz and was throwing a lot of artifacts at 1100 MHz. In terms of memory overclocking, I was able to get the 390 to 1650MHz, while the HBM on the Fury didn't want to go any higher than 500 MHz; but I doubt it would give any results anyway, if even the 390 gained nothing from a memory overclock. I guess with this kind of huge bandwidth and bus there is nothing to improve.

Unfortunately the coil whine is still there. Less than on the 390, but still there. Not so much in less demanding games, but under full load in games like Witcher 3 it is really hard to play without headphones. Then again, out of the 6 GPUs I have had in my short adventure with PC hardware, only one wasn't whining like crazy, and it started to do so after a few months anyway, and I have also had 3 or even 4 power supplies in the last two years, so... yeah. Headphones are inevitable and silent gaming doesn't exist for anyone who is not half deaf. That is what I can say from experience; I have tried enough cards from Nvidia and AMD and different AIBs to say that coil whine is just inevitable.

I will post some more measurements later, maybe even some small 390 Nitro vs Fury Nitro amateur review.


----------



## lestatdk

Added 66 mV and got 1200 MHz core and 540 MHz memory.

This is benchmark stable, but not quite game stable. I can run Fire Strike and Time Spy without problems, but when testing with Doom it crashes sometimes.









1160-1170 ish seems to be the max stable for me when gaming. I might have to try clocking down the memory and keeping the core high, just to see if it helps.


----------



## Shau76434

Quote:


> Originally Posted by *lestatdk*
> 
> added 66 mV and got 1200 MHz core and 540 MHz memory .
> 
> This is benchmark stable ,but not quite game stable . I can run Firestrike and Time Spy without problems, but when testing with Doom it crashes sometimes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1160-1170 ish seems to be the max stable for me when gaming. Might have to try and clock down the memory and keep the core high just to see if it helps


What's the average fps increase you get in games with the 1170 overclock ?


----------



## ColdDeckEd

Just picked up a Nano; pretty happy with it. Running 1040 on stock volts. It seemed stable when I upped PowerTune (+50) and undervolted (-6mV), but threw up a small artifact when running Heaven, so I just set it back to stock voltage.

The highest overclock I've gotten is 1060 at +18mV. With the fan at 100%, temps never went above 74C and there was little to no throttling. Is it OK to go over +18mV on a Nano?


----------



## Krzych04650

Just tried a bit of undervolting on the Fury Nitro. I never did anything like that before, so I don't have any reference, but the results seem amazing. I was able to set the voltage all the way down to -96mV in Afterburner and saw only a 0-3% performance decrease, so basically within the margin of error. I got through all the benchmarks a few times (Valley, Fire Strike, Time Spy) and some of my own testing (for example Witcher 3 Swamp, Novigrad etc.) and there is nothing more than margin-of-error differences in those results compared to stock voltage. The clock rate registered in Afterburner is super flat without a single drop. Temperatures are down to 62C, compared to 71C at stock and 77C after OC (1120 MHz, +12mV, +50% power target). Power consumption is down to 308W, compared to 360W at stock and 395W after OC. At this kind of voltage this card is way more efficient than an RX 480.







Coil whine is also reduced quite a bit, but still annoying. VRM temps are also reduced by a good 10C. I need to try overclocking with lower voltage now.


----------



## Gdourado

I am looking for help from Fury owners.
At 1080p, is the 4GB of VRAM limiting in any way in current AAA games?
Is there any game that uses more than 4GB at 1080p, or that stutters on a Fury due to VRAM?
I ask because with current rebates a Fury is 350, just 30 more than an RX 480 Nitro, and I believe the Fury is a stronger card and as such a better buy.

Cheers


----------



## dagget3450

Quote:


> Originally Posted by *Gdourado*
> 
> I am looking for help from Fury owners.
> At 1080p, is the 4GB of VRAM limiting in any way in current AAA games?
> Is there any game that uses more than 4GB at 1080p, or that stutters on a Fury due to VRAM?
> I ask because with current rebates a Fury is 350, just 30 more than an RX 480 Nitro, and I believe the Fury is a stronger card and as such a better buy.
> 
> Cheers


Not for me. At 4K, yes, I've seen a few games VRAM limited.


----------



## Krzych04650

Quote:


> Originally Posted by *Gdourado*
> 
> I am looking for help from Fury owners.
> At 1080p, is the 4GB of VRAM limiting in any way in current AAA games?
> Is there any game that uses more than 4GB at 1080p, or that stutters on a Fury due to VRAM?
> I ask because with current rebates a Fury is 350, just 30 more than an RX 480 Nitro, and I believe the Fury is a stronger card and as such a better buy.
> 
> Cheers


The Fury is like 20% more powerful than the 480 and Fury custom versions are actually available, so I'd say go for the Fury at 350; I did the same. I don't think 4GB is a limiting factor. Even if there are games that can eat more than 4GB, it is because they have some Hyper textures, 4K textures or whatever, that take 3 times more VRAM and give no improvement. I remember only a few games that ate more than 4 GB: Shadow of Mordor (from my experience it won't stutter because of VRAM, it just always takes all you have), the latest Mirror's Edge (Hyper textures, no gain in visual quality, useless), Rise of the Tomb Raider (same, Very High 4K textures, no gain in visuals), Assassin's Creed Syndicate (the VRAM usage shows how bad this game is). Most games don't use that much; for example Witcher 3 tops out at 3011 MB at 3440x1440, normally staying around 2 GB. There is no need for 4GB+ VRAM consumption; a well-optimized game won't use that much. Look at Witcher 3 and its 1-2 GB VRAM usage at 1080p and tell me what the justification is for 3 times higher usage at the same resolution.


----------



## Performer81

Quote:


> Originally Posted by *Krzych04650*
> 
> Just tried a bit of downvolting on Fury Nitro. I never did anything like that before, so I don't have any reference, but results seem to be amazing. I was able to set voltage all the way down to -96mV in Afterburner and saw only 0-3% performance decrease, so basically withing margin of air. I was able to get through all benchmarks few times (Valley, Fire Strike, Time Spy) and some of my own testing (for example Witcher 3 Swamp, Novigrad and etc) and there is just nothing more than margin of air in those results compared to stock voltage. Clock rate registered in Afterburner is super flat without a single drop. Temperatures are down to 62C compared to 71C on stock and 77C after OC (1120 MHz +12mV +50% power target). Power consumption is down to 308W compared to 360W on stock and 395W after OC. With this kind of voltage this card is way more efficient than RX 480
> 
> 
> 
> 
> 
> 
> 
> Coil whine is also reduced quite a bit, but sill annoying. VRM temps are also reduced by good 10C. I need to try overclocking with lower voltage now.


My Tri-X OC aka XFX Fury TD is also at 1060MHz with -48mV; 1090 and -18 are also an option. With my custom fan curve, temps never go much over 60 at about 40% fan, which is very quiet. Very nice card. AMD just puts way too much voltage into these cards at stock.


----------



## lestatdk

Quote:


> Originally Posted by *Barca130*
> 
> What's the average fps increase you get in games with the 1170 overclock ?


Around 10-12% or so. I have done most of my testing in 3DMark with Time Spy and Fire Strike, and then Doom to test for stability.


----------



## lestatdk

Quote:


> Originally Posted by *Krzych04650*
> 
> Just tried a bit of downvolting on Fury Nitro. I never did anything like that before, so I don't have any reference, but results seem to be amazing. I was able to set voltage all the way down to -96mV in Afterburner and saw only 0-3% performance decrease, so basically withing margin of air. I was able to get through all benchmarks few times (Valley, Fire Strike, Time Spy) and some of my own testing (for example Witcher 3 Swamp, Novigrad and etc) and there is just nothing more than margin of air in those results compared to stock voltage. Clock rate registered in Afterburner is super flat without a single drop. Temperatures are down to 62C compared to 71C on stock and 77C after OC (1120 MHz +12mV +50% power target). Power consumption is down to 308W compared to 360W on stock and 395W after OC. With this kind of voltage this card is way more efficient than RX 480
> 
> 
> 
> 
> 
> 
> 
> Coil whine is also reduced quite a bit, but still annoying. VRM temps are also down by a good 10C. I need to try overclocking with lower voltage now.


OK, I need to try this out.


----------



## Krzych04650

Does anyone know of a free endless-load benchmark like Valley? Valley works fine for benchmarking, but if I want to leave it running for an hour or so to check stability, it stops working after 1x minutes even at stock or downclocked, so this is not a GPU issue. I can run 10 benchmarks in a row and it won't stop working, but if I leave it in the standard endless mode then it stops working.


----------



## xTesla1856

My Nitro Furies do 1070MHz at -72mV. Power draw dropped from about 700 watts to about 480.


----------



## xTesla1856

Has anyone tried repasting these cards? Is it as straightforward as the Titans I had before? Or should I pay attention to the HBM?


----------



## bluedevil

Hey guys... any way to get the 75C thermal throttling unlocked?


----------



## Krzych04650

OK, I found another benchmark to replace Valley: Furmark. This is one crazy stress test; it makes the GPU draw 150W more and run over 10C hotter than Valley or demanding games.


----------



## xTesla1856

Quote:


> Originally Posted by *Krzych04650*
> 
> OK, I found another benchmark to replace Valley: Furmark. This is one crazy stress test; it makes the GPU draw 150W more and run over 10C hotter than Valley or demanding games.


Please don't use Furmark, it is very outdated and can damage cards from pulling too much power.


----------



## xTesla1856

Quote:


> Originally Posted by *bluedevil*
> 
> Hey guys... any way to get the 75C thermal throttling unlocked?


You could flash a different BIOS onto your card. Sapphire's OC BIOS allows an 80-degree temp target.


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> My Nitro Furies do 1070mhz at -72Volts. Power draw dropped from about 700 watts to about 480.


WOW!!








Quote:


> Originally Posted by *xTesla1856*
> 
> Has anyone tried repasting these cards? Is it as straightforward as the Titans I had before? Or should I pay attention to the HBM?


The interposer is what you have to be careful about. The traces are exposed, so don't use sharp or hard objects. I didn't even clean the interposer all that thoroughly myself.
Also be careful tightening the tension plate on the back that holds the cooler on. Tighten it evenly or there is a possibility of cracking the interposer. It has happened.


----------



## Shau76434

Quote:


> Originally Posted by *lestatdk*
> 
> Around 10-12% or so. I've done most of the testing in 3DMark with Time Spy and Firestrike, and then Doom to test for stability.


----------



## Krzych04650

Quote:


> Originally Posted by *xTesla1856*
> 
> Please don't use Furmark, it is very outdated and can damage cards from pulling too much power.


Hm, I didn't know about that. But indeed it hasn't been updated in 5 years. It reads my card as a Fury X instead of just a Fury. What other benchmark would you recommend for a 1-hour stress test besides Valley and Heaven? OCCT?

Anyway, my Fury passed a 1-hour Furmark GPU stress test at -90mV on stock 1050/500 clocks.
*(full size screenshot here)*



Crazy benchmark; I had to set fan speed to 50% to hold 71C, while in Valley I needed 35% for 62C. Power draw was 300W in Valley vs 400W in Furmark.


----------



## Performer81

Furmark is useless. It's a maximum-heat test, not a good stability test.


----------



## costilletas

I've found Overwatch to be a very good stability test; my Fury stays at 100% and ~3k memory while in game. So far, if it doesn't crash while playing OW, it won't crash anywhere else.


----------



## Thoth420

Quote:


> Originally Posted by *costilletas*
> 
> I've found Overwatch to be a very good stability test; my Fury stays at 100% and ~3k memory while in game. So far, if it doesn't crash while playing OW, it won't crash anywhere else.


It's also great for comparing monitor color quality and for ghetto-calibrating if you don't have a Spyder etc., and it's a damn fun game too.


----------



## Krzych04650

I basically spent half the day playing with voltage today and ran through multiple benchmarks and manual testing in games, so I think I can post final results.

Power target is +50% for everything, just in case.
Fan speed is fixed at 35% (1280 RPM) for all temperature tests.

The maximum OC I was able to get is 1120/500 at +18mV. It won't go any higher even with much more voltage added.
However this OC doesn't make much sense, because I was able to get 1100/500 stable at -48mV. There was some fairly rare random crashing at -60mV, but it works well at -48mV.
Power draw for the entire setup dropped from 385W to 331W, core temperature from 77C to 66C and VRM temperature from 85C to 69C.

At stock 1050/500 I was able to decrease the voltage by -90mV. I got through many games and benchmarks and never crashed.
Power draw decreased from 360W to 309W, core temperature from 71C to 61C and VRM temperature from 79C to 64C.

As for performance, the differences at stock clocks between standard and undervolted settings are within margin of error, and the overclocked results show some decrease because of 1120 vs 1100 core clock, so around a 1-2% decrease in performance. Nowhere near worth the additional 55W of power draw and 11C temperature increase, obviously.

So compared to the 390 Nitro I had and returned because of crazy coil whine, comparing OC vs OC, the Fury is 24% more powerful while consuming 32% less power. But this will vary from unit to unit; my 390 needed a +90mV overvolt to overclock at all by 50MHz, while the Fury is stable at -48mV with the same +50MHz overclock, so this may be completely random.

Overall I am satisfied with the purchase. If not for this stupid coil whine this card would be almost perfect; almost, because it still could overclock better. But the cooler is amazing and the card takes very significant undervolting very well, so except for this freaking whining that always breaks my silent PC attempts, I am pleased. Now I only need to get a refund for my monitor and get a slightly more reasonable one and I will be ready to play.
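For anyone skimming, the percentage claims above check out; here's a quick sketch in Python using only the wall-power figures quoted in this post (whole-system readings, not GPU-only draw):

```python
# Quick sanity check of the numbers in the post above. All power
# figures are whole-system wall readings quoted by the poster.
def pct_change(before, after):
    """Signed percentage change from `before` to `after`."""
    return (after - before) / before * 100.0

# Stock 1050 MHz, -90 mV undervolt: 360 W -> 309 W at the wall
print(f"stock undervolt power: {pct_change(360, 309):+.1f}%")
# 1100 MHz at -48 mV vs 1120 MHz at +18 mV: 331 W vs 385 W
print(f"OC undervolt power:    {pct_change(385, 331):+.1f}%")

# Fury OC vs 390 Nitro OC: +24% performance at -32% power
perf, power = 1.24, 1.00 - 0.32
print(f"perf-per-watt ratio: {perf / power:.2f}x")
```

The last line is why the undervolted Fury compares so well with the 390 Nitro: roughly 1.8x the performance per watt, taking the poster's 24%/32% figures at face value.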


----------



## lestatdk

I only have coil whine at crazy high frame rates. I also had this on my previous card (GTX 970).

I normally game with vsync on and never have any coil whine, even at 6400x1080.


----------



## mustrum

Quote:


> Originally Posted by *xTesla1856*
> 
> Has anyone tried repasting these cards? Is it as straightforward as the Titans I had before? Or should I pay attention to the HBM?


I can only speak for the Fury X, but the HBM sits basically level with the die. I applied new paste when I switched to an EK full-cover block. It was not a big deal.


----------



## Gdourado

How does the Sapphire Fury Nitro stack up against the Fury Nano?
Are the 4096 stream processors worth it?
Will they give a bigger advantage in the future with Vulkan or DX12?
From what I see, the Nitro can go to 1100-1150 core.
The Nano usually goes to 1050 with an increased power limit and doesn't throttle.
Which gives better game performance:
4096 cores at 1050MHz or 3584 cores at 1150MHz?

Cheers!


----------



## Krzych04650

Quote:


> Originally Posted by *Gdourado*
> 
> How does the Sapphire Fury Nitro stack up against the Fury Nano?
> Are the 4096 stream processors worth it?
> Will they give a bigger advantage in the future with Vulkan or DX12?
> From what I see, the Nitro can go to 1100-1150 core.
> The Nano usually goes to 1050 with an increased power limit and doesn't throttle.
> Which gives better game performance:
> 4096 cores at 1050MHz or 3584 cores at 1150MHz?
> 
> Cheers!


After unlocking the power target etc., the Nano will be a bit faster than the Fury, but the difference isn't worth it in my opinion considering the much higher temps and horrible noise levels.

Here are some measurements for Nano with +50% power limit:

Noise: http://www.purepc.pl/karty_graficzne/amd_radeon_r9_nano_nizsza_cena_za_wydajnosc_radeon_r9_fury_x?page=0,17
Temps: http://www.purepc.pl/karty_graficzne/amd_radeon_r9_nano_nizsza_cena_za_wydajnosc_radeon_r9_fury_x?page=0,16
Performance: http://www.purepc.pl/karty_graficzne/amd_radeon_r9_nano_nizsza_cena_za_wydajnosc_radeon_r9_fury_x?page=0,12
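A back-of-the-envelope way to frame the 4096 @ 1050MHz vs 3584 @ 1150MHz question is raw shader throughput. This is a theoretical peak only (it ignores memory bandwidth, front-end limits and throttling), but it points the same direction as the answer above:

```python
# Theoretical FP32 peak: 2 ops per stream processor per clock (FMA).
def peak_tflops(stream_processors, clock_mhz):
    return 2 * stream_processors * clock_mhz * 1e6 / 1e12

nano = peak_tflops(4096, 1050)  # full Fiji at the Nano's sustained clock
fury = peak_tflops(3584, 1150)  # cut-down Fiji at a high Nitro OC
print(f"Nano: {nano:.2f} TFLOPS, Fury: {fury:.2f} TFLOPS, "
      f"delta {(nano / fury - 1) * 100:+.1f}%")
```

On paper the full chip at the lower clock comes out about 4% ahead, so actual per-game results will hinge on everything this simple model leaves out.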


----------



## Krzych04650

I made a little review of the Sapphire Fury Nitro, maybe it will be useful for someone: https://linustechtips.com/main/topic/633658-sapphire-fury-nitro-mini-review-vs-390-nitro/

At 50% fan speed the card tops out at 52C. That is far too loud for playing on speakers, but if you play on headphones then it looks like you could pair two of them and never reach 70C.


----------



## LionS7

@Krzych04650

My R9 Fury X is a sad story. I have been testing for around two weeks, because it doesn't crash very often. After a crash at 1.24V, which is +42mV, I went to 1.26V... and I just don't know anymore. This is for 1100MHz core. You need to test these cards with games like Star Wars Battlefront; Frostbite 3 is killing them. Try it. Because I don't think I need 1.26V at 1100MHz for other games...


----------



## Spartoi

Quote:


> Originally Posted by *Krzych04650*
> 
> I made a little review of the Sapphire Fury Nitro, maybe it will be useful for someone: https://linustechtips.com/main/topic/633658-sapphire-fury-nitro-mini-review-vs-390-nitro/
> 
> At 50% fan speed the card tops out at 52C. That is far too loud for playing on speakers, but if you play on headphones then it looks like you could pair two of them and never reach 70C.


What did you use to measure GPU wattage/voltage? I use HWiNFO and my Fury Tri-X runs at 1080MHz/545MHz with a -12mV undervolt. HWiNFO reports that after 10 minutes of Valley Extreme HD, my GPU uses about 271W. Considering your stock undervolt reading, I would think you'd use less wattage than my card. Does the Nitro use more power than the Tri-X?


----------



## AngryLobster

Quote:


> Originally Posted by *Krzych04650*
> 
> I made a little review of the Sapphire Fury Nitro, maybe it will be useful for someone: https://linustechtips.com/main/topic/633658-sapphire-fury-nitro-mini-review-vs-390-nitro/
> 
> At 50% fan speed the card tops out at 52C. That is far too loud for playing on speakers, but if you play on headphones then it looks like you could pair two of them and never reach 70C.


Man, I have no idea how you manage 61-62C.

Ambient here is 22C, and copying your settings (1440p, same undervolt, 35% fixed fan speed, etc.), mine hits 75C and sits there or bounces up to 76. Same game too (Witcher 3). This is with an open side panel, too.

I don't know what is causing an almost 15C difference in the temps we are seeing, but either something is wrong with my card or you live at the North Pole.


----------



## costilletas

Air flow in your case?


----------



## AngryLobster

Quote:


> Originally Posted by *costilletas*
> 
> Air flow in your case?


As I said above, the side panel is open. I'm going to try applying new thermal paste right now in case the factory job is really bad.

EDIT: Yeah, just changed the paste; it made zero difference in temperatures.

Tom's needed around 1450 RPM to maintain 75C in their review. My experience matches theirs when I close my side panel. I've gotta stop believing everything on the internet.


----------



## costilletas

Oops, sorry







I didn't read that.


----------



## Krzych04650

Quote:


> Originally Posted by *LionS7*
> 
> @Krzych04650
> 
> My R9 Fury X is a sad story. I have been testing for around two weeks, because it doesn't crash very often. After a crash at 1.24V, which is +42mV, I went to 1.26V... and I just don't know anymore. This is for 1100MHz core. You need to test these cards with games like Star Wars Battlefront; Frostbite 3 is killing them. Try it. Because I don't think I need 1.26V at 1100MHz for other games...


Overclocking is not really a thing with these cards, at least from what I see on my Fury. I tested that 1100MHz -48mV setting more extensively and it crashes and needs stock voltage, and any overclock above that will crash even with full overvoltage. Yet for only 50MHz less, at 1050MHz, I can set the voltage to -90mV and be stable. The gains from overclocking compared to the power draw and temperature increase are just... pathetic.

As for Frostbite 3, I think Witcher 3 is also very merciless in terms of overclocking and stability; I've always had to set my cards lower than for other games to be stable. I was testing FreeSync for an hour or so running around Novigrad and some woods and swamps and it never crashed at -90mV stock or at 1100MHz on stock voltage.
Quote:


> Originally Posted by *Spartoi*
> 
> What did you use to measure GPU wattage/voltage? I use HWiNFO and my Fury Tri-X runs at 1080MHz/545MHz with a -12mV undervolt. HWiNFO reports that after 10 minutes of Valley Extreme HD, my GPU uses about 271W and 1.181V. Considering your stock undervolt reading, I would think you'd use less wattage and voltage than my card. Does the Nitro use more power than the Tri-X?


I don't know about the Tri-X, but the Tri-X uses the reference PCB and the Nitro is custom, so there may be differences. That would explain the temperature and noise difference between the two in tests.

I am using HWiNFO and MSI Afterburner for voltage monitoring. As for wattage, I am just measuring total power draw at the wall with a power meter. And this includes the monitor and speakers, because I was too lazy to reconnect them, so you can subtract ~25W from those measurements.
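One nice property of wall-meter deltas is that constant loads on the same meter (the ~25W of monitor and speakers here) cancel out when you compare two readings; only PSU conversion loss skews the difference. A small sketch, where the 90% PSU efficiency is my assumption, not a figure from the post:

```python
# Constant loads on the same meter (monitor, speakers, ~25 W here)
# cancel when you take a delta of two readings; only the PSU's
# efficiency matters for converting a wall delta to a DC-side delta.
def dc_delta(wall_before, wall_after, psu_efficiency=0.90):
    """Approximate DC-side power change implied by a wall-meter change."""
    return (wall_before - wall_after) * psu_efficiency

# Post's stock numbers: 360 W -> 309 W at the wall after -90 mV
print(f"~{dc_delta(360, 309):.0f} W saved on the DC side")
```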
Quote:


> Originally Posted by *AngryLobster*
> 
> Man I have no idea how you manage 61-62C.
> 
> Ambient here is 22C and copying your settings (1440P, same undervolt, 35% fixed fan speed, etc.), mine hits 75C and sits there or bouncing up to 76. Same game too (Witcher 3). This is with a open side panel too.
> 
> I don't know what is causing a almost 15C different in temps we are seeing but either something is wrong with my card or you live at the North Pole.


Quote:


> Originally Posted by *AngryLobster*
> 
> As I said above, side panel is open. I'm going to try applying new thermal paste right now in case the factory job is really bad.
> 
> EDIT: Yeah just changed paste, made zero different in temperatures.
> 
> Toms required around 1450RPM to maintain 75C in their review. My experience matches theirs when I close my side panel. I gotta stop believing everything on the internet.


The tests were made at an ambient temperature of 23.7-24.1C. I measure at night, because during the day I have an ambient temperature of around 28C. Basically a very hot room with two big windows and no ventilation, only 2-3 degrees cooler than outside, so if it is 36C outside then you can imagine that I am dying in there.









My case is a Fractal Design Define R5. I am using low-RPM SilentiumPC Sigma Pro 140mm fans set to 5V, which gives 400 RPM. I was afraid they would be useless, since you can barely feel any air being pushed if you place your hand behind one, but they actually work well. There are two for front intake, one for bottom intake (a 1-2 degree improvement for the GPU), one for side intake (lowers the temperature by another 4-5 degrees) and a rear exhaust. Those per-fan measurements were made with the R9 390; I haven't tested them yet with the Fury, but the temp goes up quickly if I turn the case fans off, so there is some airflow after all, even from fans that look weak on paper.

Keep in mind that the -90mV undervolt lowered temps from 71C to 61-62C. On stock voltage I am getting 71.

This is how the temp scales with fan speed after undervolting (10 minutes of Heaven Benchmark):



I can make a video of a 10-minute Valley run; what recording software doesn't take 500 GB for 10 minutes and won't split the video into 50 parts like FRAPS?


----------



## Krzych04650

I don't know how much sense it makes, probably not much, but I also downclocked the Fury to 390 performance, which I also tested recently, and measured power draw and temperatures.

This is probably quite pointless, but it shows that you can scale the Fury to whatever performance level you want and the power draw will scale very well with it. Downclocking to 390 performance (around a 25% performance decrease) and setting a -50% power target surely draws much less power than a stock card at ~75% load.


----------



## AngryLobster

Quote:


> Originally Posted by *Krzych04650*
> 
> Overclocking is not really a thing with these cards, at least from what I see on my Fury. I tested that 1100MHz -48mV setting more extensively and it crashes and needs stock voltage, and any overclock above that will crash even with full overvoltage. Yet for only 50MHz less, at 1050MHz, I can set the voltage to -90mV and be stable. The gains from overclocking compared to the power draw and temperature increase are just... pathetic.
> 
> As for Frostbite 3, I think Witcher 3 is also very merciless in terms of overclocking and stability; I've always had to set my cards lower than for other games to be stable. I was testing FreeSync for an hour or so running around Novigrad and some woods and swamps and it never crashed at -90mV stock or at 1100MHz on stock voltage.
> I don't know about the Tri-X, but the Tri-X uses the reference PCB and the Nitro is custom, so there may be differences. That would explain the temperature and noise difference between the two in tests.
> 
> I am using HWiNFO and MSI Afterburner for voltage monitoring. As for wattage, I am just measuring total power draw at the wall with a power meter. And this includes the monitor and speakers, because I was too lazy to reconnect them, so you can subtract ~25W from those measurements.
> 
> The tests were made at an ambient temperature of 23.7-24.1C. I measure at night, because during the day I have an ambient temperature of around 28C. Basically a very hot room with two big windows and no ventilation, only 2-3 degrees cooler than outside, so if it is 36C outside then you can imagine that I am dying in there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a Fractal Design Define R5. I am using low-RPM SilentiumPC Sigma Pro 140mm fans set to 5V, which gives 400 RPM. I was afraid they would be useless, since you can barely feel any air being pushed if you place your hand behind one, but they actually work well. There are two for front intake, one for bottom intake (a 1-2 degree improvement for the GPU), one for side intake (lowers the temperature by another 4-5 degrees) and a rear exhaust. Those per-fan measurements were made with the R9 390; I haven't tested them yet with the Fury, but the temp goes up quickly if I turn the case fans off, so there is some airflow after all, even from fans that look weak on paper.
> 
> Keep in mind that the -90mV undervolt lowered temps from 71C to 61-62C. On stock voltage I am getting 71.
> 
> This is how the temp scales with fan speed after undervolting (10 minutes of Heaven Benchmark):
> 
> 
> 
> I can make a video of a 10-minute Valley run; what recording software doesn't take 500 GB for 10 minutes and won't split the video into 50 parts like FRAPS?


Thanks for the information. My friend and I are both seeing results 10-12C higher at -90mV. I don't really care for Valley or Heaven; Witcher 3 at 4K VSR brings the card close to 80C. At 1440p with your settings, my friend and I are both just under 75C.

I'm not sure what the discrepancy is, given you are using fans at 400 RPM, which are essentially doing nothing in a case that large, but 62C is literally impossible for me unless ambient was 7C.


----------



## Thoth420

Hey all, so I finally got my hands on a working Z170 motherboard and decided that I would like to try 4K on my single Fury X. I have been tunnel-vision focused on 2560x1440 144Hz panels for the past few years, so I am looking for a good 4K monitor to game on, and if it has FreeSync with a range that is manageable on a single Fury X, all the better. Looking for suggestions from people with 4K panel experience. I don't mind a TN if the color quality is decent. As far as panel size, I am willing to consider anything from 28 inches up to the maximum, as long as it is a monitor and not a TV. Cheers!


----------



## Krzych04650

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all, so I finally got my hands on a working Z170 motherboard and decided that I would like to try 4K on my single Fury X. I have been tunnel-vision focused on 2560x1440 144Hz panels for the past few years, so I am looking for a good 4K monitor to game on, and if it has FreeSync with a range that is manageable on a single Fury X, all the better. Looking for suggestions from people with 4K panel experience. I don't mind a TN if the color quality is decent. As far as panel size, I am willing to consider anything from 28 inches up to the maximum, as long as it is a monitor and not a TV. Cheers!


From what I've found, all 4K FreeSync panels have a 40-60Hz FreeSync range except one, the iiyama G-Master GB2888UHSU-B1 Gold Phoenix, which has 35-60. It is a TN.
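Worth noting when comparing these ranges: AMD's Low Framerate Compensation only engages when the panel's maximum refresh is at least roughly twice the minimum (treat the exact 2x ratio as a rule of thumb rather than a driver guarantee). A quick check:

```python
# LFC lets the driver multiply frames below the FreeSync floor;
# AMD's stated requirement is roughly max >= 2 x min refresh.
def supports_lfc(min_hz, max_hz, ratio=2.0):
    return max_hz / min_hz >= ratio

for name, lo, hi in [("typical 4K panel", 40, 60),
                     ("iiyama GB2888UHSU", 35, 60)]:
    print(f"{name} ({lo}-{hi} Hz): LFC {'yes' if supports_lfc(lo, hi) else 'no'}")
```

Neither range qualifies, so below the FreeSync floor these panels fall back to plain VSync-on or VSync-off behavior.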
Quote:


> Originally Posted by *AngryLobster*
> 
> Thanks for the information. My friend and I are both seeing results 10-12C higher at -90mV. I don't really care for Valley or Heaven; Witcher 3 at 4K VSR brings the card close to 80C. At 1440p with your settings, my friend and I are both just under 75C.
> 
> I'm not sure what the discrepancy is, given you are using fans at 400 RPM, which are essentially doing nothing in a case that large, but 62C is literally impossible for me unless ambient was 7C.


I don't know why there are such huge temperature differences, especially since, as you said, my setup is definitely not built for amazing cooling performance; it is set up for maximum silence. I have never hit 80C on this card; even in a crazy test like Furmark, which makes the GPU draw so much more than games, I was getting 77C at 35% fan speed and 71C at 50% fan speed. In games there is no way to reach such high temps with the undervolt. After overvolting and a 1125 OC I was indeed getting 78C, but that was 120mV above my undervolted setting.


----------



## dagget3450

If no one has mentioned it yet, there is a way to hack the monitor down to a 32Hz FreeSync floor.


----------



## Krzych04650

Quote:


> Originally Posted by *dagget3450*
> 
> If no one has mentioned it yet, there is a way to hack the monitor down to a 32Hz FreeSync floor.


Yeah, there is, but it doesn't always work. I wasn't able to get it working properly on my 34UC98, but FreeSync on my unit isn't working properly even at the stock 55-75 range; it creates stutters, flickering and some kind of "micro breaks" in GPU operation, so I am getting a refund. Anyway, there are a lot of people who haven't managed to hack the FreeSync range.


----------



## bluezone

New official driver, kids: 16.7.3.

Release notes:

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.7.2-Release-Notes.aspx

Cheers

Edited.


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> New official driver kids. 16.7.3
> 
> release notes.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.7.2-Release-Notes.aspx
> 
> Cheers
> 
> Edited.


You posted the notes for 16.7.2







Here's the actual notes: http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-7-3-Release-Notes.aspx


----------



## Orthello

Very nice gains if this 10% pans out .. can anyone do some ROTR RX 480 benches to check? That is about the only DX12 title where the RX 480 lagged the 1060 .. maybe not for much longer.

Just saw the notes:

"Testing conducted by AMD Performance Labs as of July 22, 2016 on the Radeon™ RX 480 graphics card, on a test system comprising Intel i7 5960X CPU, 16GB DDR4-2666 Mhz system memory, Radeon Software Crimson Edition 16.7.2 and Radeon Software Crimson Edition 16.7.3 and Windows 10 x64 using the game Rise of the Tomb Raider™. PC manufacturers may vary configurations, yielding different results. *At 1920x1080, Radeon Software Crimson Edition 16.7.2 scored 78.73 and Radeon Software Crimson Edition 16.7.3 scored 86.53 using the Radeon RX 480 graphics card, which is 10% faster performance*. Tests are not average and may vary."

Seems it's 1080p gains; they don't mention settings.

Interesting also that the increase is specific to the RX 480 .. possibly getting the best out of its improved tessellation hardware vs prior GCN cards.
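The footnote's own scores back the headline up; the arithmetic:

```python
# RX 480 Rise of the Tomb Raider scores from AMD's footnote above.
old_driver, new_driver = 78.73, 86.53   # 16.7.2 vs 16.7.3
gain_pct = (new_driver / old_driver - 1) * 100
print(f"{gain_pct:.1f}% faster")        # ~9.9%, i.e. the "10%" claim
```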


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> You posted the notes for 16.7.2
> 
> 
> 
> 
> 
> 
> 
> Here's the actual notes: http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-7-3-Release-Notes.aspx


Thanks. Copy and paste mistake.


----------



## bluezone

Quote:


> Originally Posted by *Orthello*
> 
> Very nice gains if this 10% pans out .. can anyone do some ROTR RX 480 benches to check? That is about the only DX12 title where the RX 480 lagged the 1060 .. maybe not for much longer.
> 
> Just saw the notes :
> 
> "Testing conducted by AMD Performance Labs as of July 22, 2016 on the Radeon™ RX 480 graphics card, on a test system comprising Intel i7 5960X CPU, 16GB DDR4-2666 Mhz system memory, Radeon Software Crimson Edition 16.7.2 and Radeon Software Crimson Edition 16.7.3 and Windows 10 x64 using the game Rise of the Tomb Raider™. PC manufacturers may vary configurations, yielding different results. *At 1920x1080, Radeon Software Crimson Edition 16.7.2 scored 78.73 and Radeon Software Crimson Edition 16.7.3 scored 86.53 using the Radeon RX 480 graphics card, which is 10% faster performance*. Tests are not average and may vary."
> 
> Seems it's 1080p gains; they don't mention settings.
> 
> Interesting also that the increase is specific to the RX 480 .. possibly getting the best out of its improved tessellation hardware vs prior GCN cards.


10% would be nice to see. But I'm more interested in what they are going to do with this company now that they've bought it: using dynamically variable resolution to stabilize frame rate, if I understand correctly what this tech does.

http://www.hialgo.com/


----------



## Orthello

Quote:


> Originally Posted by *bluezone*
> 
> 10% would be nice to see. But I'm more interested on what they are going to do with this company now that they bought it. Using dynamically variable resolution to stabilize frame rate. If I correctly understand what this tech does.
> 
> http://www.hialgo.com/


Yeah, HiAlgo is an interesting acquisition. If they can implement it at the driver level, it would be a great way to push for the next resolution up and only drop fidelity at the stress points.

I'm not sure they can get something like this going at the driver level, however, particularly in DX12, as it's much lower level than DX11. I think it's more likely we see it in Gaming Evolved titles etc.

Another problem is that at present, if I read it right, it only supports DX9 .. and a lot of those titles run quite well these days on relatively cheap hardware.

AMD's software direction of late has been quite encouraging; I would like to see a real focus on multi-GPU from them too.


----------



## ManofGod1000

Well, I have a Sapphire Nitro Fury OC on its way. ($349 / I'll receive it on August 2nd.) I decided to buy that and sell off my EVGA 980 Ti, since I have always been happier with how AMD graphics look and perform. Also, I am using the XFX R9 380 DD 4GB card from my work computer at home until I receive the Fury. I personally think the desktop looks sharper, brighter and more colorful than when I was using the 980 Ti, as good a card as that is.

I noticed the difference the other way when I went from a 290X to the 980 Ti as well. Others may or may not notice the difference, but I most certainly did.







I did some case swapping today so I could fit the new card in my home computer (from a Define R3 to a Corsair r300). It was a much better idea than buying a new case and spending money I did not need to spend.


----------



## Orthello

Quote:


> Originally Posted by *ManofGod1000*
> 
> Well, I have a Sapphire Nitro Fury OC on its way. ($349 / I'll receive it on August 2nd.) I decided to buy that and sell off my EVGA 980 Ti, since I have always been happier with how AMD graphics look and perform. Also, I am using the XFX R9 380 DD 4GB card from my work computer at home until I receive the Fury. I personally think the desktop looks sharper, brighter and more colorful than when I was using the 980 Ti, as good a card as that is.
> 
> I noticed the difference the other way when I went from a 290X to the 980 Ti as well. Others may or may not notice the difference, but I most certainly did.
> 
> 
> 
> 
> 
> 
> 
> I did some case swapping today so I could fit the new card in my home computer (from a Define R3 to a Corsair r300). It was a much better idea than buying a new case and spending money I did not need to spend.


I've noticed a distinct quality advantage in the visuals of The Witcher 3 on AMD vs NV ... it might just be my eyes, but it looks quite a bit nicer IMHO than NV's rendering.

I was rather jealous after seeing how the game looked on the Fury X vs my TX .. I had way more FPS in SLI .. still, I like my eye candy.

Might have been monitor settings, who knows. I need to test both on the same monitor to be sure; it was the same model of monitor. It's probably a subjective thing.


----------



## mypickaxe

So, this came in today. Installed it with a dedicated 240 rad for now. Wondering if I should bother with custom BIOS or just push clocks up and leave voltage alone?


----------



## Orthello

Quote:


> Originally Posted by *mypickaxe*
> 
> So, this came in today. Installed it with a dedicated 240 rad for now. Wondering if I should bother with custom BIOS or just push clocks up and leave voltage alone?


I think it's worth pushing without a custom BIOS to see how far the stock BIOS goes; definitely push voltage though - the cooling is there.

If you don't hit 1400+ then I'd go custom BIOS to push further .. if you got to 1450 that would be quite something.


----------



## mypickaxe

Quote:


> Originally Posted by *Orthello*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mypickaxe*
> 
> So, this came in today. Installed it with a dedicated 240 rad for now. Wondering if I should bother with custom BIOS or just push clocks up and leave voltage alone?
> 
> 
> 
> 
> 
> I think it's worth pushing without a custom BIOS to see how far the stock BIOS goes; definitely push voltage though - the cooling is there.
> 
> If you don't hit 1400+ then I'd go custom BIOS to push further .. if you got to 1450 that would be quite something.

1400 is a thing with the Nano?


----------



## Orthello

Quote:


> Originally Posted by *mypickaxe*
> 
> 1400 is a thing with the Nano?


Oops, my bad, I didn't look hard enough .. thought it was an RX 480. The power connector should have given it away ..


----------



## Spartoi

EDIT:

Nvm.


----------



## LionS7

An 8% increase with the 16.7.3 WHQL driver on the R9 Fury X in Rise of the Tomb Raider, maxed out, SMAA. From a 71 fps average to 77. This is at 1100/1000MHz.

Any ideas how to reduce the coil whine on my Fury X, but not with software, because I'm playing with VSync OFF?


----------



## looncraz

Quote:


> Originally Posted by *LionS7*
> 
> Any ideas how to reduce the coil whine on my Fury X, but not with software, because I'm playing with VSync OFF?


Find a way to make it as bad as you can and run it like that overnight; the coils will wear in at the peaks of their vibration and the noise should diminish or go away completely (though it may come back at other frequencies, so you'll need to play with it a great deal).

This works most of the time, but some coils are wound tighter than others.


----------



## LionS7

So, something like +96mV and Valley?


----------



## looncraz

If that makes the coil whine loud and unbearable, then yes, as long as the temps are under control.


----------



## Krzych04650

Quote:


> Originally Posted by *looncraz*
> 
> Find a way to make it as bad as you can and run it like that overnight; the coils will wear in at the peaks of their vibration and the noise should diminish or go away completely (though it may come back at other frequencies, so you'll need to play with it a great deal).
> 
> This works most of the time, but some coils are wound tighter than others.


Interesting. My Fury whines a lot, a bit more than other cards I've had, and this creates a serious problem: I play on speakers and the coil whine basically ruins the experience. I am also very uncomfortable with headsets, so I most likely won't be able to find a comfortable pair; I got the Audio-Technica M30X yesterday and they are horribly uncomfortable. But I digress... So I just need to make the coil whine as bad as possible with some overvoltage and very high FPS, and run it for multiple hours without a break? Or should I just run it at my daily clock/voltage/whine?


----------



## looncraz

Quote:


> Originally Posted by *Krzych04650*
> 
> Interesting. My Fury whines a lot, a bit more than other card I've had, and this creates some serious problem, because I am playing on speakers and coil whine basically ruins the experience, but I am also very uncomfortable with headsets so I most likely won't be able to find comfortable ones. Got Audio-Technica M30X and they are horribly uncomfortable. But I am digressing.... So I just need to make the coil whine as bad as possible with some overvoltage and very high FPS and run it for multiple hours without break? Or should I just run it on my daily clock/voltage/whine?


The over-voltage isn't required, but keeping the card where it whines constantly, as much as possible, will wear the coil windings down against each other, which usually reduces the noise. My 7870XT was a horribly whiny card until I left it running overnight in a game menu pushing out something like 500 FPS. I maxed the fans to 100% as well (you should probably do the same).

I still have that card and use it all the time - it's in my silent HTPC machine.









EDIT:
Quick explanation:

What you are hearing is the coils actually moving. High current through them in a PWM setup means the windings react to the induced magnetic fields, which causes movement (like an electric motor). The sound comes from the windings rubbing against each other and against the coil's housing - so if you do it enough, the peaks where these components touch will wear into each other, and the windings will either move more freely (quietly) or actually fuse their insulating material together, ceasing the movement - and the noise.
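To make the frequency side of this concrete, here is a hedged sketch (illustrative numbers only, not measurements from any particular card) of which components of a load-modulated VRM current ripple land in the audible 20 Hz-20 kHz band:

```python
# The VRM's switching frequency (hundreds of kHz) is itself ultrasonic;
# what you hear is the load-dependent envelope, e.g. current pulses
# repeating at the frame rate, plus its harmonics. All figures here are
# illustrative assumptions.
AUDIBLE_LOW_HZ, AUDIBLE_HIGH_HZ = 20.0, 20_000.0

def audible_harmonics(fundamental_hz, count=50):
    """Harmonics of `fundamental_hz` that fall inside the audible band."""
    return [n * fundamental_hz for n in range(1, count + 1)
            if AUDIBLE_LOW_HZ <= n * fundamental_hz <= AUDIBLE_HIGH_HZ]

# A menu rendering at 3500 FPS modulates the load at ~3.5 kHz, right in
# the shrillest part of the hearing range:
print(audible_harmonics(3500.0, 10))  # [3500.0, 7000.0, 10500.0, 14000.0, 17500.0]

# Capped at 60 FPS, the fundamental drops to a low 60 Hz hum instead:
print(audible_harmonics(60.0, 3))  # [60.0, 120.0, 180.0]
```

This is only meant to show why FPS caps (VSync, FRTC) tame the whine: they pull the modulation fundamental down out of the piercing range.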


----------



## Krzych04650

Quote:


> Originally Posted by *looncraz*
> 
> The over-voltage isn't required, but keeping it where it is whining constantly as much as possible will wear down the coil windings against each other which usually reduces the noise. My 7870XT was a horribly whiny card until I left it running overnight in some game menu pushing out like 500fps. I maxed the fans to 100% as well (you should probably do the same).
> 
> I still have that card and use it all the time - it's in my silent HTPC machine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> Quick explanation:
> 
> What you are hearing is the coils actually moving. High current through them in a PWM setup causes the coils to react deferentially to induced magnetic fields, which causes movement (like an electric motor). This sound is from the windings rubbing against each other and the housing for the coil - so if you do it enough, the peaks where these components touch each other will wear into each other and will either move more freely (quietly) or actually fuse the insulating material on the windings, ceasing the movement - and the noise.


Thank you for the detailed explanation. I will certainly try this out, because coil whine is the only thing that bothers me right now; the rest of my setup, including my Fury's cooling, is very quiet and cool, barely audible in silence at night, let alone while gaming. It would be amazing if I managed to reduce the whine significantly, because my head and ears just won't accept a headset, especially for long hours, and I like the sound of speakers much more. They work beautifully, especially at this price.

I will try it on Sunday during the day rather than at night; that way I can monitor the situation, and I wouldn't be able to sleep next to a whining PC anyway.









Thanks again


----------



## LionS7

OK, thank you. The cooling is very effective, so maybe I will try 99 runs of Metro: Last Light.


----------



## Thoth420

Quote:


> Originally Posted by *looncraz*
> 
> Find a way to make it as bad as you can and run it like that overnight, the coils will wear at the peaks of the frequency and the noise should diminish or go away completely (though it may come back at other frequencies, so you'll need to play with it a great deal).
> 
> Works most of the time, but some coils are tighter than others.


Try the Hitman: Absolution menu overnight with VSync off; it should sit pegged between 300 and 1000 FPS in that menu, so the coil whine should be at its loudest under those conditions. If that doesn't kill it off after a few tries, it probably won't go away. In that case, capping your FPS using FRTC may help.


----------



## Krzych04650

The Witcher 3 menu runs at 3000-4000 FPS. Watch your CPU temp though.


----------



## LionS7

Quote:


> Originally Posted by *Thoth420*
> 
> If that doesnt kill it off after a few tries max


What do you mean by that? A few tries? In hours, I'm thinking maybe 12, or is that too much?


----------



## ManofGod1000

*Sigh* I spent 3 weeks on and off checking the Newegg site to see if the Sapphire R9 Fury would be $299.99 again. Then, 2 days ago, I saw the +SR version for $349 on the Newegg eBay store, so I purchased it. Of course, today I received a Newegg email showing the regular SR at $299.99, so I figured I would purchase it and have them take back the other one before it arrived.

And then, while I am on the phone with Newegg, the card sells out.







Sheez, I cannot seem to win.







Oh well, I am still looking forward to the card I bought, but I cannot believe my bad timing.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> What do you mean by that ? Few tries ? In hours i think maybe 12, or is it too much ?


A few 8- to 12-hour periods (bedtime is good, unless you sleep in the same room as the system). Watch temps, but wherever you are after about an hour is pretty much the thermal plateau; after that you can leave it for days if need be. Coil whine (actually resonance) is not damaging, just annoying.


----------



## Krzych04650

You know what? Screw coil whine, I will move my PC to the attic and pass all the needed cables through the ceiling to my room. I will just need to watch out for dust and monitor the ambient temperature: if it drops below 0C, warming back up above zero can cause some condensation, right? My cooling is good enough to sustain the 35C+ ambient temps that can occur up there after a few very hot days. Anything else to watch for, besides getting a good quality DP cable (I need 3m)?


----------



## Thoth420

Quote:


> Originally Posted by *Krzych04650*
> 
> You know what? Screw coil whine, I will move my PC to the attic and pass all needed cables through ceiling to my room.. I will just need to watch out for dust and monitor ambient temperature, if it doesn't go below 0C, because going back to + can cause some dampness, right? My cooling is good enough to sustain 35C+ ambient temps that can be sometimes apparent there after few very hot days. Anything else to watch for except getting good quality DP cable (I need 3m)?


Is it nonstop or something? I tend to only get it in-game with VSync and FreeSync off, and only in super-high-FPS, low-load scenarios (menus and splash screens, mostly). I use a 3m cable for my setup in the same room, so I'm not sure 3m would reach through to a room above yours. If it is nonstop, however, I would RMA the card, as that isn't normal for a Fury X, or at least not most of them (or any GPU, for that matter).


----------



## Krzych04650

Quote:


> Originally Posted by *Thoth420*
> 
> Is it nonstop or something? I tend to only get it in a game with v and free sync off and only in super high fps low load scenarios(menus and splash screens mostly). I use a 3m cable for my setup so I don't know how that would reach through to another room above yours. If it is nonstop however I would RMA the card as that isn't normal for a Fury X or at least not most. (or any GPU for that matter)


Not nonstop, but under any meaningful gaming load, no matter the FPS. The intensity of course varies with FPS, voltage and load (for example, there is no coil whine in FurMark, but quite a lot in games like The Witcher 3 that really stress the card). It isn't loud to the average person; I tried with my brother, sister and father, and all of them hear it only if they listen for it, and only when very close to the PC (which is under my desk). But I am not average; my entire body seems specifically built to annoy me, depending on what you believe in, and I hear the whine much more. It drives me crazy and breaks the gaming experience, especially on speakers, since I cannot wear headphones comfortably (I tried quite a few that many consider comfortable).

There is nothing unusual about the card itself; out of the 6 GPUs I have tried, only one had an amount of coil whine acceptable to me, and it started to whine like crazy after a few months anyway. It will just never end. I have built the quietest air-cooled PC possible within a sensible budget, and it cools very well too, but there will always be something to break the whole concept: now it's coil whine, and if not coil whine then some other unmanageable, unfixable issue will appear.

I will just move my PC away from me to another room and have eternal peace







It will cost a bit for all the USB and fan-controller extensions, plus a good certified 3m DP cable costs quite a lot, but still less than the headphones I would have to buy now, which would only create new problems with comfort and sound quality. I love my current speaker setup, and comfort isn't an issue since nothing is pressing on my head









Also, what is the problem with a 3m DP cable? There is about 1.75 meters from the monitor connection to the ceiling, then worst case 50cm through the ceiling (most likely much less; it's from the first floor to the attic, so it's a wooden ceiling, not concrete like between the ground and first floors), and then the connection to the GPU is about 10cm from the attic floor. I will have 50cm+ of cable left, maybe to put the case on something so it doesn't sit on the floor


----------



## costilletas

Why don't you RMA your card? It sucks when you get a crybaby of a GPU lol


----------



## Thoth420

That is why it is technically called resonance; "coil whine" is a buzzword (pun intended). Some people are sensitive to it and others may not hear a thing. I am probably somewhere between you and the average person.


----------



## Krzych04650

Quote:


> Originally Posted by *costilletas*
> 
> Why don't you rma your vga? It sucks when you get a crybaby pedal as a gpu lol


I am not going to RMA over an issue that is present on every card I have ever seen, and give up my GPU for a 30+ day RMA process just to get a replacement with exactly the same issue, and probably a few others, since I would most likely get some refurbished faulty unit returned by someone before me. The card is working properly, and coil whine will always be there if your hearing is sensitive to such sounds.
Quote:


> Originally Posted by *Thoth420*
> 
> That is why it is called resonance technically and coil whine is a buzzword(pun intended). Some people are sensitive to it and others may not hear a thing. I am probably somewhere between you and the average person.


This matches my experience too: it turns out most people don't hear it, or only when really trying. Another thing that makes reviews less useful. But I can understand differences in hearing; that is a normal thing.

Anyway, coil whine and noise levels in general are the biggest pain I have ever experienced with PC gaming, and I am going to solve this problem once and for all. The question is: why now? Why not a few years ago? My attic has been there waiting forever.


----------



## LionS7

Well, I'm now about 8 hours into pushing it in the menu of The Witcher 3. I will do 12 hours and report whether there is a difference after that. The FPS is just above 3500.


----------



## mypickaxe

OK, so I've been water cooling my Nano for a couple of days now. I'm setting the power limit to +50%, voltage to +12%, core to +11% and memory to +50 MHz, which results in a sustained overclock of:

1111 MHz GPU clock
550 MHz HBM

Seems stable thus far. Question though: is this the best I should expect out of the card? Are there any modded BIOSes with TDP increases forced in for water cooling?
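As a quick sanity check of those offsets (a sketch assuming the Nano's nominal reference clocks of 1000 MHz core / 500 MHz HBM; the values are assumptions, not read from the card):

```python
# Assumed Nano reference clocks (nominal, not measured).
CORE_REF_MHZ = 1000
HBM_REF_MHZ = 500

core_mhz = CORE_REF_MHZ + CORE_REF_MHZ * 11 // 100  # +11% core offset
hbm_mhz = HBM_REF_MHZ + 50                          # +50 MHz memory offset

print(core_mhz, hbm_mhz)  # 1110 550, in line with the reported ~1111/550 MHz
```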


----------



## Sonikku13

Still a solid 26 MH/s on my Nano; wondering if I should undervolt a bit or overclock tomorrow.


----------



## mypickaxe

Quote:


> Originally Posted by *mypickaxe*
> 
> OK, so I've been water cooling my Nano for a couple of days now. I'm setting the power limit to +50%, voltage to +12%, core at +11% and memory at +50 Mhz. So this is resulting in a sustained overclock of:
> 
> 1111 Mhz GPU Clock
> 550 MHz HBM
> 
> Seems stable thus far. Question though, is this the best I should expect out of the card? Are there any modded BIOS with TDP increases forced in for water cooling?


So, never mind. I found a link to a modded BIOS (for either air or water) on reddit.


----------



## gupsterg

Quote:


> Originally Posted by *bluedevil*
> 
> Hey guys...any way to get the 75c thermal throttling unlocked?


Personally I wouldn't edit the GPU's throttling temp, though it can be done via BIOS mod. I would instead edit the cooling profile behavior via software or firmware so you're not reaching 75C. With room temps of ~22-24C, the Fury X AIO can maintain 50C pretty quietly IMO. I modified the cooling target GPU temp from 65C to 50C and upped the sensitivity by +100%.
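The idea of tightening the cooling target instead of raising the throttle point can be sketched as a simple proportional fan curve (a hypothetical controller; the 50C target mirrors the post, while the gain and duty limits are made-up illustration values, not the actual Fiji firmware logic):

```python
def fan_duty(gpu_temp_c, target_c=50, gain_pct_per_c=4.0,
             min_duty=20.0, max_duty=100.0):
    """Fan duty (%) rises proportionally once temp exceeds the target."""
    duty = min_duty + max(0.0, gpu_temp_c - target_c) * gain_pct_per_c
    return min(max_duty, duty)

print(fan_duty(45))  # 20.0: below target, fan stays at its floor
print(fan_duty(55))  # 40.0: 5C over target ramps the fan
print(fan_duty(75))  # 100.0: clamped, well before throttle territory
```

A higher "sensitivity" in this picture is simply a larger gain, so the fan reacts harder per degree over target and the GPU never approaches the 75C throttle point.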



Quote:


> Originally Posted by *xTesla1856*
> 
> You could flash a different BIOS on your card. Sapphire's OC Bios allows for a 80 degree temp target.


Setting aside that bluedevil has a Fury X, and that flashing it with a Fury Tri-X / Nitro BIOS is not advisable, the 80C temp they set in the Tri-X / Nitro ROM is a poor implementation IMO. They haven't changed the GPU throttling temp; they've changed the cooling target GPU temp (i.e. how the fan behaves), which actually means you hit higher GPU temps on the increased-PL ROM. That isn't really handy when overclocking or running an increased power limit.



When I saw the slide about how the Nitro had "improved" mosfets, I thought "nice". Then I noted three things:

i) I preferred the IR DirectFETs, with their metal casing and separate low- and high-side FETs, over the "all in one" PowerStage parts on the Nitro.
ii) Even though the Nitro slide presents 60A FETs vs the 50A ones on the reference PCB, they gimped the PL compared with the Tri-X.
iii) I don't think the mosfet specs on the slide are correct for either card.

*Nitro PL (left stock PL / right increased PL)*



*Fury Tri-X PL (left stock PL / right increased PL)*



If you do repaste the GPU, any chance of the markings from the mosfets on the Nitro, or hi-res images of the VRM? Cheers









----------



## Ne01 OnnA

*Ne01* Presents:

Here you can see the scaling of Fiji in Shadow of Mordor (in-game benchmark).

Crimson 16.7.3 WHQL + CCC from Catalyst 15.11.1 Beta (Nov 14)
Tested on: 1856x1392 77Hz digital panel, 16/48-bit HDMI

SoM test, all Ultra/V.High, no blur, FXAA (no FPS cap in Crimson)

850/500 1.218v -36mV -18%POW 45%Fan -> 68.27 130.05 33.02 (144tW) <- Old games/RPGs etc.
935/550 1.218v -30mV -12%POW 55%Fan -> 74.84 171.24 31.06 (186tW) <- Sweet spot for long sessions
1050/550 1.218v -24mV -8%POW 60%Fan -> 82.40 197.32 50.55 (224tW) <- That's GOOD
1095/550 1.218v +26mV +8%POW 65%Fan -> 85.30 229.24 54.73 (301tW) <- Insane :bang:

Not tested any further yet...

but I will.

850/500 1.218v -36mV -18%POW 45%Fan = in the Seraph beta (arcade shooter/platformer) with a 60 FPS cap in Crimson I get 84tW!

My opinion on KRAKEN? KRAKEN :banana: Also cool & quiet + very efficient, and you can manage how much power you actually need :nerd:

The scaling of these GPUs is phenomenal. Great job, ATI.
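A quick perf-per-watt readout of those four profiles (average FPS divided by the reported total system watts, "tW"; the numbers are copied straight from the table above):

```python
# (avg FPS, reported total system watts) per clock/memory profile.
profiles = {
    "850/500":  (68.27, 144),
    "935/550":  (74.84, 186),
    "1050/550": (82.40, 224),
    "1095/550": (85.30, 301),
}

for name, (avg_fps, watts) in profiles.items():
    print(f"{name}: {avg_fps / watts:.3f} FPS/W")

# The undervolted 850 MHz profile gives ~0.474 FPS/W versus ~0.283 at
# 1095 MHz: roughly +25% average FPS for about 2x the power.
```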











----------



## LionS7

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> *Ne01* Presents:
> 
> We can see here the Scaling of Fiji in Shadow Of Mordor (in-Game Benchie)
> 
> Crimson 16.7.3 WHQL + CCC from Catalyst-15.11.1Beta-Nov14
> Tests on: 1856:1392 77Hz Digital Panel 16/48Bit HDMI
> 
> SoM Test All | Ultra V.High noBlur FXAA | (no FPS CAP in Crimson)
> 
> 850/500 1.218v -36mV -18%POW 45%Fan -> 68.27 130.05 33.02 (144tW) <- Old Games/RPG etc.
> 935/550 1.218v -30mV -12%POW 55%Fan -> 74.84 171.24 31.06 (186tW) <- Sweet Spot for Long sessions
> 1050/550 1.218v -24mV -8%POW 60%Fan -> 82.40 197.32 50.55 (224tW) <- Thats GOOD
> 1095/550 1.218v +26mV +8%POW 65%Fan -> 85.30 229.24 54.73 (301tW) <- Insane :bang:
> 
> Not Tested for more Yet
> 
> 
> 
> 
> 
> 
> 
> but i will....
> 
> 850/500 1.218v -36mV -18%POW 45%Fan = In Game Seraph Beta (arcade Shooter/Platformer) 60FPS CAP in Crimson i have 84tW !
> 
> My Opinion on KRAKEN? KRAKEN :banana: also Cool&Quiet + Very efficient & You can manage how much Powah do you actually need :nerd:
> 
> Scaling of these GPU are Fenomenal Great Job ATI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here ->
> 
> 
> Spoiler: Warning: Spoiler!


Yes, the performance scaling of these GPUs is insane, but the OC potential is so poor. I think, after 1-2 hours in The Witcher 3, mine will go no further than 1100/1040 at 1.27V, which is +72mV. Well, maybe 1110, but I want to leave room for failures.


----------



## Krzych04650

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> *Ne01* Presents:
> 
> We can see here the Scaling of Fiji in Shadow Of Mordor (in-Game Benchie)
> 
> Crimson 16.7.3 WHQL + CCC from Catalyst-15.11.1Beta-Nov14
> Tests on: 1856:1392 77Hz Digital Panel 16/48Bit HDMI
> 
> SoM Test All | Ultra V.High noBlur FXAA | (no FPS CAP in Crimson)
> 
> 850/500 1.218v -36mV -18%POW 45%Fan -> 68.27 130.05 33.02 (144tW) <- Old Games/RPG etc.
> 935/550 1.218v -30mV -12%POW 55%Fan -> 74.84 171.24 31.06 (186tW) <- Sweet Spot for Long sessions
> 1050/550 1.218v -24mV -8%POW 60%Fan -> 82.40 197.32 50.55 (224tW) <- Thats GOOD
> 1095/550 1.218v +26mV +8%POW 65%Fan -> 85.30 229.24 54.73 (301tW) <- Insane :bang:
> 
> Not Tested for more Yet
> 
> 
> 
> 
> 
> 
> 
> but i will....
> 
> 850/500 1.218v -36mV -18%POW 45%Fan = In Game Seraph Beta (arcade Shooter/Platformer) 60FPS CAP in Crimson i have 84tW !
> 
> My Opinion on KRAKEN? KRAKEN :banana: also Cool&Quiet + Very efficient & You can manage how much Powah do you actually need :nerd:
> 
> Scaling of these GPU are Fenomenal Great Job ATI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here ->
> 
> 
> Spoiler: Warning: Spoiler!


I also tested this a bit earlier with my Fury Nitro. The 850 MHz clock is easily stable at -96mV and a -50% power target (I am using the second, 300W BIOS though; I was too lazy to switch back after my overclocking attempts), and compared to stock settings, power draw from the wall for the entire system dropped from 360W to 195W under heavy load like Valley or The Witcher 3. At these settings the Fury was about 1-2% more powerful than the R9 390: 1592 in Valley vs the 390's 1562. Temps were ridiculously low, around 47C at 35% fan speed (1280 RPM)
Quote:


> Originally Posted by *LionS7*
> 
> Well, Im pushing for now about 8 hours in the menu of The Witcher 3. I will do 12 hours and tell if there is a difference after that. The fps is just above 3500.


Any news?


----------



## LionS7

Quote:


> Originally Posted by *Krzych04650*
> 
> Any news?


No change... I have done 10 hours. Maybe I will do 6 hours of Valley, I don't know; I don't think it will help anyway. How long do I need to keep the card under that kind of load? 24 hours?


----------



## snurds

Quote:


> Originally Posted by *LionS7*
> 
> No change... I have done 10 hours. Maybe I will do 6 hours Valley, dont know. I dont think that it will help anyway. How long I need to put the card under that kind of load ? 24 hours ?


If you're not using headphones already, would they be sufficient to make the noise tolerable?

There are probably headphones specifically designed to block out ambient noise.


----------



## Thoth420

Quote:


> Originally Posted by *Krzych04650*
> 
> I am not going to RMA because of the issue that is present on every card I saw in my life and get rid of my GPU for 30+ days for RMA process just to get another one for exchange with exactly the same issue and probably few others as I would most likely get some refurbished faulty crap returned by someone before. Card is working properly and coil whine will always be there if your hearing is sensitive to such sounds.
> This is what I see from experience also, and it turns out that most of people doesn't hear it or only when really trying. Another thing that makes reviews less useful. But I can understand differences in hearing, this is normal thing.
> 
> Anyways, coil whine and noise levels in general are the biggest pain in the ass I ever experienced with PC gaming and I am going to solve this problem once and for all. Question is why now? Why not few years ago? My attic is there waiting since forever.


Same here, man; my system is absolutely inaudible aside from the resonance. I got lucky, and my Fury X only manages a light whine in high-FPS, low-load scenarios so far. I have had cards that were far worse, and it tends to be something with newer generations of GPUs. My reference 6970 never had it. My 480 never had it, but that card was a piece of junk in other regards. My 8800 GTX never had it (still my favorite GPU, maybe because it was my first build). I have, however, had a 7970 Lightning, a 780 and a 780 Ti that all exhibited coil whine, and I tried multiple PSUs, UPSs, outlets and even different physical locations to rule out dirty power.

I am guessing something about the architecture or a common design method used on newer GPUs is causing this (and this is mere conjecture); there have always been complaints of coil whine, but it seems more of a problem these days than in the past. My P67 system was whisper quiet while running the 6970 (sometimes north of 90C) with the voltage maxed to its limit and an aggressive OC, and it never once whined... the thing never even crashed, amazingly enough, and that card definitely required some power too. Maybe all this low-power-draw design leads to more audible resonance than on previous cards. The resonance is always there; it is just a matter of whether your ears pick it up. I wonder what this sound does to pets, since it is abrasive even to our weak human ears.


----------



## CMac019

For the XFX Fury X on Jet: I posted this somewhere else and just pasted it here. "I got really lucky with this... With a new Jet account, make your order with shop10, cancel it and it won't let you use the code anymore, then call Jet and they will credit you the $39.xx that it takes off. Then you can use ELECTRONICSBASH and add a $7 USB cable to get an extra 20% off. Total price is $319.59!"


----------



## Thoth420

Quote:


> Originally Posted by *CMac019*
> 
> For the XFX Fury X on Jet. I posted this somewhere else and just pasted it here "I got really lucky with this... With a new Jet account make your order with shop10, cancel it and it won't let you use it anymore, call Jet and they will credit you the $39.xx that it takes off. Then you can use ELECTRONICSBASH and add a $7 USB cable and get an extra 20% off. Total price is $319.59!"


That is a steal, especially since XFX offers the best warranty and allows modifications.


----------



## Ne01 OnnA

Yeah, but it's not stable with so little voltage in games (I'm playing games 4-10+ hours daily), so -96mV is a no-go, I'm sure (test it in some Ubisoft game, e.g. AC:U or AC:S ;-) ).

The most stable for me is -48mV and -17%POW (1.218v is the max voltage set in my custom BIOS).

I'm also using the 300W BIOS (but edited down to 260tW).
For normal gaming, 1090/550 at 1.218v and -5%POW is possible, at around ~250-270tW.

For heavy games I'm using my 1050/550 profile and it's OK, ~230-240tW, and also cool & quiet.


----------



## michael82

Hi. Does the custom BIOS fix the negative voltage scaling on the Fury X, where raising the core voltage in MSI Afterburner causes a loss in performance?


----------



## Xgatt

Woohoo, thanks guys. The posts in this thread finally helped me achieve a solid clock speed. I had posted about this earlier (http://www.overclock.net/t/1607504/sapphire-fury-nitro-downclocks-after-crossing-60-degrees-c) but couldn't find a solution anywhere. I finally tried UNDERvolting the card to -24mV and -8% power, as recommended in an earlier post, and now the clock stays rock solid at 1050/550 throughout gameplay on my Fury Nitro. I did try bumping the clock to 1075 with -30mV earlier, but after about 15 minutes the card crashed and forced a reboot.

Is it recommended to just leave the card undervolted and use it as a good 1050/550 card?


----------



## Krzych04650

Quote:


> Originally Posted by *Xgatt*
> 
> Woohoo, thanks guys. The posts in this thread helped me achieve a solid clockspeed, finally. I had posted this earlier: http://www.overclock.net/t/1607504/sapphire-fury-nitro-downclocks-after-crossing-60-degrees-c, but couldn't find a solution anywhere. Finally tried UNDERvolting the card to -24mV and -8% power, as recommended in an earlier post. Now the clock stays rock solid at 1050 / 550 throughout gameplay on my Fury Nitro. I did try bumping the clock to 1075 with -30mV earlier, but after about 15 mins the card crashed and forced a reboot.
> 
> Is it recommended to just leave the power undervolted and use the card as a good 1050/550 card?


Interesting. Downclocking is the last thing I would expect on Fury; actually, I have never seen it. There were a lot of frequency drops on the 390 Nitro, but on Fury the clock is perfectly constant, at least after disabling the Power Efficiency option in Radeon Settings.

I don't quite understand your question, but just keep the card where it is stable and works well for you.


----------



## Xgatt

Quote:


> Originally Posted by *Krzych04650*
> 
> Interesting. Downclocking is the last thing I saw on Fury, actually I never saw it. There were a lot of frequency drops on 390 Nitro, but for Fury its perfectly constant, at least after disabling Power Efficiency option is Radeon Settings.
> 
> I don't quite understand your question, but just keep the card where it is stable and working well for you.


Yeah, that's the part I didn't quite understand either, as I never saw others complain about it much. Basically, what happens is this: at stock voltages and above, as soon as the card crosses about 62 degrees C, it starts to downclock itself (sometimes by up to 300MHz). I definitely have Power Efficiency and ULPS turned off, so I have no idea what was causing it. There's a chance that this card's VRMs somehow can't handle the higher voltages and trigger the downclock on their own.


----------



## Krzych04650

The MSI RX 480 Gaming X review is out on TechPowerUp.

There are almost no power-efficiency improvements over the Fury lineup. Hopefully they improve things for Vega; just imagine AMD cards competing with the 1080 and 1080 Ti at the current perf/watt of Polaris, when the RX 480 consumes the same amount of power as a 1080 while being almost 2 times slower. I am not really concerned about power draw in general as long as it is reasonable, but about the heat that comes with it.


----------



## xTesla1856

Quote:


> Originally Posted by *Krzych04650*
> 
> MSI RX 480 Gaming X review is out on TechPowerUp.
> 
> There are really almost no power efficiency improvements over Fury lineup. Hopefully they improve for Vega, just imagine AMD cards competing with 1080 and 1080Ti with current perf/watt of Polaris, while RX 480 consumes the same amount of power as 1080 and is almost 2 times slower. I am not really concerned about power draw in general until it is reasonable, but about the heat that comes with it.


Yeah, as an owner of 2 Fiji cards, I am more than underwhelmed by Polaris. Sure, it's a great price point, and many people upgrading from ancient hardware should be very happy, but look at the vacuum it created in the market: Intel and Nvidia are on a rampage right now with price hikes (Founders Edition, Titan X, HB Bridge, 6950X). I really, really hope AMD can make a triumphant return with Vega and Zen. I would be down for an all-AMD rig in the future.


----------



## LionS7

Can somebody help me with a stable BIOS with 1.31V on the HBM? My VID is 1.30V. I need to get to 525; right now I can do 520.







I have a reference R9 Fury X.


----------



## elmonen

Hi! Anyone else here have an Asus Fury Strix? I'd just like to hear some OC reports. I seem to only get 1040MHz at default voltage; even at 1050 I need about +25mV to get it stable, and after that I need to add voltage like crazy to keep it stable, and I feel like it's not worth it for the extra heat/noise.


----------



## Krzych04650

Quote:


> Originally Posted by *elmonen*
> 
> Hi! Anyone else here have asus fury strix? Just would like to hear some oc reports.. I seem to only get 1040mhz on default voltage.. Even at 1050 I need like +25mv to get it stable and after that I need to add voltage like crazy to keep it stable :\ and i feel like its not worth it for the extra heat/noise..


Well, you bought an Asus card, so overclocking will be the least of your problems. Better to return it if you still can.


----------



## elmonen

Quote:


> Originally Posted by *Krzych04650*
> 
> Well, you bought Asus card, so overclocking will be the least of your problems. Better return it if you still can.


Hmm, why is that? I've had the card for 5 months working fine, except for the bad OC.


----------



## Shatun-Bear

As I'm thinking of snagging one of these for about £280 via eBay: can the memory on the Fury (non-X, not that it makes a difference) be overclocked now? If so, does it make a difference? Sorry, I haven't been keeping up with this card.

Also, out of interest, I was reading TPU's review of the Sapphire Fury from last year and looking at the performance summary figures. At the time of the Strix and Sapphire Fury launch reviews, the Fury was only *6% faster* than a GTX 980 at 1440p:

https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/30.html

Fast forward to the present: reading TPU's review of the MSI 480 yesterday, the performance summary now places the Fury a whopping *11% faster* at the same resolution:

https://www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/23.html

Goes to show that the fabled post-launch AMD driver improvements, measured against the relative consistency of Nvidia's performance, are real and not just fanboy talk! This makes me quite eager to snap up a Fury to play with (or certainly a 480 over a 1060).
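For what it's worth, the implied driver gain can be worked out directly from those two summary figures (assuming the GTX 980's absolute performance stayed flat between the two reviews):

```python
# Fury led the GTX 980 by 6% in the launch-era review and leads by 11%
# in the later RX 480 review; its relative gain over that period is the
# ratio of the two leads.
launch_ratio = 1.06
current_ratio = 1.11

driver_gain = current_ratio / launch_ratio - 1
print(f"{driver_gain:.1%}")  # 4.7%
```

So the Fury picked up roughly 4.7% relative performance between the two reviews, under that flat-980 assumption.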


----------



## Krzych04650

Quote:


> Originally Posted by *Shatun-Bear*
> 
> As I'm thinking of snagging one of these for about £280 via eBay, can the memory on the Fury (non-X, not that it makes a difference) be overclocked now? If so, does it make a difference? Sorry I haven't been keeping up with this card.
> 
> Also, out of interest I was reading a review of Sapphire Fury from last year in TPU and looking at the Performance Summary figures. At the time of the Strix and Sapphire Fury's launch reviews, the Fury was only *6% faster* in 1440p than a GTX 980:
> 
> https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/30.html
> 
> Fast forward to the present and I was reading TPU's review on the MSI 480 yesterday and looked at the Perf Summary again and noticed that the Fury is now placed as a whopping *11% faster* at the same resolution:
> 
> https://www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/23.html
> 
> Goes to show that the fabled AMD driver improvements post launch and the relative consistency of Nvidia's performance is real and not just fanboy talk! This makes me quite eager to snap up a Fury to play with (or certainly a 480 over a 1060).


With the current market situation, new releases, and discounts, the last-generation Fury seems to be the best deal out there while maintaining high performance. At current prices, in Germany for example, the Fury matches the price/perf ratio of much cheaper cards while being far more powerful; normally you pay a premium and price/perf doesn't scale well. You may be able to find a similar deal on a 980 somewhere, but it is less powerful and you have no future with displays, as G-Sync monitors are overly expensive, limited in variety, and coming from joke manufacturers, while there are really a lot of options for FreeSync monitors. You can pair your Fury with one for little cost, much less than with G-Sync, and that will always give a much better result than a 980 + standard monitor. Plus, like I said, the variety of types and price points of FreeSync displays is very wide.


----------



## xkm1948

I may grab a second FuryX for crossfire down the road.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Krzych04650*
> 
> *With the current market situation, new releases, and discounts, the last-generation Fury seems to be the best deal out there* while maintaining high performance. At current prices, in Germany for example, the Fury matches the price/perf ratio of much cheaper cards while being far more powerful; normally you pay a premium and price/perf doesn't scale well. You may be able to find a similar deal on a 980 somewhere, but it is less powerful and you have no future with displays, as G-Sync monitors are overly expensive, limited in variety, and coming from joke manufacturers, while there are really a lot of options for FreeSync monitors. You can pair your Fury with one for little cost, much less than with G-Sync, and that will always give a much better result than a 980 + standard monitor. Plus, like I said, the variety of types and price points of FreeSync displays is very wide.


Yep. I've owned a lot of GPUs and sold my GTX 980 for quite a bit of money before any of the new generation of cards was released. And I have come to the same conclusion as you: a Fury, at the current prices you can get them for, makes a lot of sense right now. Performance at 1440p, which is my monitor's resolution, leaves even a good custom 980 in the dust. That AMD cards mature with age and driver updates is a huge bonus.

What about the memory overclocking though on a Fury? Is it possible?


----------



## Spartoi

Quote:


> Originally Posted by *Shatun-Bear*
> 
> What about the memory overclocking though on a Fury? Is it possible?


Yes, use MSI Afterburner and tick "Extend official overclocking limits" in the settings.
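If your Afterburner build doesn't show that checkbox, the same switch can be flipped in its config file. The fragment below is a sketch from memory of how 4.x-era builds store that toggle in `MSIAfterburner.cfg` (in the Afterburner install folder); the section name and exact EULA wording may differ in your version, so treat this as an assumption and prefer the GUI toggle when it's present:

```ini
; MSIAfterburner.cfg -- sketch of the keys the "extend limits" checkbox toggles.
; Section name and exact acknowledgement wording may vary between versions.
[ATIADLHAL]
; 0 = official limits only, 1 = extended limits, 2 = extended limits + PowerPlay
UnofficialOverclockingMode = 1
; Afterburner requires this acknowledgement string before honoring the mode
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Restart Afterburner after editing; the memory clock slider should then move past the stock HBM limit.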


----------



## comagnum

Can someone with a Nano do me a huge favor? Can you measure the distance between the outermost screw holes, like on the EK backplate seen here?

I'm at work and I can't run my machines until 9pm est (peak hour restrictions) and found the perfect piece of material to create my own backplate. Just need the screw hole dimensions.

Edit: Never mind. I blew up a PDF to accurate dimensions and got the measurements from it.


----------



## Shatun-Bear

Quote:


> Originally Posted by *Spartoi*
> 
> Yes, use MSI Afterburner and tick "Extend official overclocking limits" in the settings.


Cool thanks.


----------



## lestatdk

Quote:


> Originally Posted by *lestatdk*
> 
> added 66 mV and got 1200 MHz core and 540 MHz memory .
> 
> This is benchmark stable, but not quite game stable. I can run Firestrike and Time Spy without problems, but when testing with Doom it crashes sometimes.
> 
> 1160-1170 ish seems to be the max stable for me when gaming. Might have to try and clock down the memory and keep the core high just to see if it helps


Decided to stick with 1155 and 550. It can do this with +40 mV, and even in Doom @ 2560x1080 on Ultra settings without V-sync it keeps the temp at 64C and is stable.

Also, it seems my card can't go much above the stock 1050 if I undervolt it.


----------



## Krzych04650

Why is Sapphire selling the Fury Nitro in two versions, 1020 and 1050 MHz? Looking at the codes, the 1020 is newer. Samples that failed 1050, or what?


----------



## ManofGod1000

I am sure this has been mentioned before, but I was actually quite surprised that the graphics score of my R9 Fury Nitro was so low in 3DMark 11; it was about 17100 or so. Does anyone else here have the same results? Gaming is faster, as is 3DMark (2013), though.


Quote:


> Originally Posted by *Krzych04650*
> 
> Why is Sapphire selling the Fury Nitro in two versions, 1020 and 1050 MHz? Looking at the codes, the 1020 is newer. Samples that failed 1050, or what?


I do not know, but I managed to miss the $299 sale on the 1020 one by ordering the other one just two days earlier at $349.


Good card either way.


----------



## comagnum

Quote:


> Originally Posted by *ManofGod1000*
> 
> I am sure this has been mentioned before, but I was actually quite surprised that the graphics score of my R9 Fury Nitro was so low in 3DMark 11; it was about 17100 or so. Does anyone else here have the same results? Gaming is faster, as is 3DMark (2013), though.
> 
> I do not know, but I managed to miss the $299 sale on the 1020 one by ordering the other one just two days earlier at $349.
> 
> Good card either way.


My nano scores lower than my 480 by a good margin.


----------



## ManofGod1000

Quote:


> Originally Posted by *comagnum*
> 
> My nano scores lower than my 480 by a good margin.


Yeah, it must just be one of those things, thanks. After all, the Nano is definitely faster than the RX 480.


----------



## iRUSH

Is the Nano at $299 new a good price? I picked one up on clearance a few hours ago.

I wanted to get an AIB 480, lol. This was in the back at MC. My friend brought it out and said $299 and it's mine.


----------



## bluezone

Quote:


> Originally Posted by *ManofGod1000*
> 
> Yeah, it must just be one of those things, thanks. After all, the Nano is definitely faster than the RX 480.


Yes, something is going on with your score there.

1100/500 "0" PL:

http://www.3dmark.com/3dm11/11466325

Benching settings 1150/500 50% PL:

http://www.3dmark.com/3dm11/11466305


----------



## bluezone

Quote:


> Originally Posted by *iRUSH*
> 
> Is the Nano at $299 new a good price? I picked one up on clearance a few hours ago.
> 
> I wanted to get an AIB 480 lol..This was in the back at MC. My friend brought it out and said $299 and it's mine.


Yes, I'd say that's pretty good for a Nano.


----------



## xTesla1856

Quote:


> Originally Posted by *Krzych04650*
> 
> Why Sapphire is selling Fury Nitro in two versions - 1020 and 1050 MHz? Looking at codes, 1020 is newer. Samples that failed 1050 or what?


They could have run out of samples that do 1050 reliably lol


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> They could have run out of samples that do 1050 reliably lol


Actually, 1000 and 1050 MHz from what I see?

http://www.sapphiretech.com/catapage_pd.asp?cataid=284&lang=eng



Where was the listing for 1020 and 1050?

Edit:

Never mind, found it.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202187


----------



## JDags

I recently bought a "1020" MHz Fury that was on sale for $300 on Newegg. I will be receiving the card today and am interested to see if the card is indeed clocked at 1020 MHz. I also plan on undervolting, as well as checking whether the additional CUs on the card can be unlocked, so I'll provide some results when I get the chance (even though I understand that the newer Furys are more than likely hardware locked, it doesn't hurt to try).

Edit: On Sapphire's website, the specs tab shows two different versions of the Fury. One clocked at 1050 and the other clocked at 1020:

Sapphire Fury Website


----------



## ManofGod1000

Quote:


> Originally Posted by *JDags*
> 
> I recently bought a "1020" MHz Fury that was on sale for $300 on Newegg. I will be receiving the card today and am interested to see if the card is indeed clocked at 1020 MHz. I also plan on undervolting, as well as checking whether the additional CUs on the card can be unlocked, so I'll provide some results when I get the chance (even though I understand that the newer Furys are more than likely hardware locked, it doesn't hurt to try).
> 
> Edit: On Sapphire's website, the specs tab shows two different versions of the Fury. One clocked at 1050 and the other clocked at 1020:
> 
> Sapphire Fury Website


You will definitely be happy with the card; I know I am with my 1050 MHz version.

Lucky you though, you timed it just right, while I bought mine for $349 just two days before.

I know that my card will not unlock; I already checked. I have done no overclocking or undervolting yet, since I have not had any time to play around with it since I installed it on Tuesday.

The card does not look heavy in the picture, but it definitely feels heavy once you get it out of the box. Excellent card, though, and it looks great on my Samsung 4K 28-inch monitor.


----------



## JDags

Quote:


> Originally Posted by *ManofGod1000*
> 
> You will definitely be happy with the card; I know I am with my 1050 MHz version.
> 
> Lucky you though, you timed it just right, while I bought mine for $349 just two days before.
> 
> I know that my card will not unlock; I already checked. I have done no overclocking or undervolting yet, since I have not had any time to play around with it since I installed it on Tuesday.
> 
> The card does not look heavy in the picture, but it definitely feels heavy once you get it out of the box. Excellent card, though, and it looks great on my Samsung 4K 28-inch monitor.


Awesome! Ya, I was running two monitors (one at 1080 and the other at 1050), but I know the Fury is more meant for 1440p and 4K, so I also pulled the trigger on the Acer 27 in. 1440p 144Hz FreeSync monitor. Huge difference already, as I had used the monitor with my old 660 (it's bigger than the TV I have in my bedroom!). Can't wait to start tinkering with it.


----------



## LionS7

Quote:


> Originally Posted by *xTesla1856*
> 
> They could have run out of samples that do 1050 reliably lol


I'm not so sure.


----------



## JDags

When I get some time, I'll try to OC my Fury 1020 to 1050 without a power increase and see if it's stable. My assumption is that it won't be.


----------



## costilletas

JDags, what's the ASIC of your Nitro? I guess all the decent Fiji chips are used for the FirePro models now.


----------



## JDags

Quote:


> Originally Posted by *costilletas*
> 
> Jdags what's the ASIC of your nitro? I guess all decent Fiji chips are used for the firepros models now


I am unfamiliar with the term ASIC. Could you explain what it is and how I can figure out the ASIC of my Nitro? Thanks!


----------



## costilletas

You can find this option in the top-right corner menu of GPU-Z. Basically, it's a measure of the quality of the chip.


----------



## bluezone

ASIC = application-specific integrated circuit.

You want slight leakage for a better overclock. A higher ASIC quality score can in fact be worse for overclocking on air.


----------



## lestatdk

My Fury is like 64.6% ASIC and it can do 1200, so a low ASIC doesn't necessarily mean it's bad at overclocking.


----------



## Thoth420

Quote:


> Originally Posted by *JDags*
> 
> Awesome! Ya, I was running two monitors (one at 1080 and the other at 1050) but I know the Fury is more meant for 1440p and 4k so I also pulled the trigger on the Acer 27 in. 1440p 144hz Freesync monitor. Huge difference already as I used the monitor with my old 660 (it's bigger than the TV I have in my bedroom!). Can't wait to start tinkering with it.


The XG with the orangish base? Any good? I skipped it because that bezel doesn't match my theme, but the Acer G-Sync IPS was the best 144Hz 1440p G-Sync panel I tried of the whole lot.


----------



## bluezone

Quote:


> Originally Posted by *lestatdk*
> 
> My Fury is like 64.6% ASIC and it can do 1200, so a low ASIC doesn't necessarily mean it's bad at overclocking.


Very good.

My ASIC is 62.4. It will clock to 1165, but runs too hot and is best at 1100.


----------



## JDags

Quote:


> Originally Posted by *Thoth420*
> 
> The XG with the orangish base? Any good? I skipped it because that bezel doesn't match my theme but the ACER G Sync IPS was the best 144hz 1440 G Sync panel I tried of the whole lot.


Yes, the Acer XG270HU 27" 1ms 144Hz FreeSync monitor. Watched some Dota on it with the Fury and FreeSync, and it was quite the pleasant viewing experience.


----------



## JDags

Quote:


> Originally Posted by *bluezone*
> 
> Very good.
> 
> My ASIC is 62.4. It will clock to 1165, but runs too hot and is best at 1100.


I checked my ASIC and it was only 55.7%. It says my card's ASIC quality is higher than just 1.7% of similar GPUs in their validation database. I am also a little disappointed with the coil whine when the card is operating at 100% load, but my case has dampening material and I use headphones, so it won't be much of an issue for me.

I will try to undervolt the card tomorrow when I get some time. I had to do a lot to get this card going with FreeSync, so I am going to enjoy it the rest of the night and will tinker with it tomorrow.


----------



## bluedevil

That Acer looks like a nice panel, but I really like Acer's XR Predator panels... just wish they had a 3440x1440 144Hz IPS FreeSync variant; instead, the best they have is the 2560x1080 144Hz VA FreeSync XR...


----------



## pdasterly

Finally got the PC up and running again. Gonna use PrimoChill LRT from here on out; so much easier than bending rigid tubing. Doesn't look as good, but who cares.


----------



## costilletas

Quote:


> Originally Posted by *JDags*
> 
> I checked my ASIC and it was only 55.7%. It says my card's ASIC quality is higher than just 1.7% of similar GPUs in their validation database. I am also a little disappointed with the coil whine when the card is operating at 100% load, but my case has dampening material and I use headphones, so it won't be much of an issue for me.
> 
> I will try to undervolt the card tomorrow when I get some time. I had to do a lot to get this card going with FreeSync, so I am going to enjoy it the rest of the night and will tinker with it tomorrow.


Why don't you RMA it? I sent mine back even though the coil whine was almost nonexistent, because what I really wanted was another GPU to try my luck and see if I could unlock it, but no luck. The point is, you should be able to RMA it.

BTW, has anyone tested their Furys in Heroes of the Storm? My GPU load goes crazy and I can't get 100+ FPS. In CS:GO it does the same thing, but it doesn't drop below 295 FPS, so it's fine. I've disabled the power saving mode, so IDK what else to do.


----------



## looncraz

Quote:


> Originally Posted by *JDags*
> 
> I checked my ASIC and it was only 55.7%. It says my card's ASIC quality is higher than just 1.7% of similar GPUs in their validation database. I am also a little disappointed with the coil whine when the card is operating at 100% load, but my case has dampening material and I use headphones, so it won't be much of an issue for me.
> 
> I will try to undervolt the card tomorrow when I get some time. I had to do a lot to get this card going with FreeSync, so I am going to enjoy it the rest of the night and will tinker with it tomorrow.


Low ASIC cards can sometimes be extremely good overclockers, but they will use more power and generate more heat, so you will need more voltage and more cooling (as in water cooling).

If that doesn't appeal to you, then I would RMA it for the coil whine issue, since you're nearly guaranteed to get a better sample next time around.


----------



## MissHaswellE

Hey guys, I picked up an XFX Fury X from the XFX sale for $400 total.

Currently running it at 1100 MHz GPU / 510 MHz memory.
Overclocked with TRIXX: +36 mV, 50% power limit.
ASIC quality: 61.6%.
58-60% fan speed keeps it at 52C under 100% load.
What kind of overclocking can I expect from this card?


----------



## Bryst

Bit the bullet on the Sapphire Fury on Amazon for $349. Was hoping to see it on sale again on Newegg for $299, but that's probably too soon. Really wanted to get my hands on an RX 480, but the reference card didn't perform temperature-wise as I'd hoped. I pretty much just got tired of waiting for a decent AIB board; I was waiting for over two months.

This 7950 is just getting too old. And with the Fury beating the 480 in 480 reviews by 10-16 FPS in games, I figured it was still worth the $50 more. Though getting it for the price of an AIB 480 would have been awesome.

On the plus side, living 10 miles from an Amazon warehouse is nice: free same-day delivery!


----------



## bluezone

Quote:


> Originally Posted by *Bryst*
> 
> Bit the bullet on the Sapphire Fury on Amazon for $349. Was hoping to see it on sale again on Newegg for $299, but that's probably too soon. Really wanted to get my hands on an RX 480, but the reference card didn't perform temperature-wise as I'd hoped. I pretty much just got tired of waiting for a decent AIB board; I was waiting for over two months.
> 
> This 7950 is just getting too old. And with the Fury beating the 480 in 480 reviews by 10-16 FPS in games, I figured it was still worth the $50 more. Though getting it for the price of an AIB 480 would have been awesome.
> 
> On the plus side, living 10 miles from an Amazon warehouse is nice: free same-day delivery!


Delivery Drones?


----------



## Bryst

Quote:


> Originally Posted by *bluezone*
> 
> Delivery Drones?


I don't think that's going on in my area yet. But it would be cool.

OT: Gonna run some benches now to compare my 7950 to the Fury. Should be comical.


----------



## Bryst

Anyone know if the $349 Sapphire Fury Nitro on Amazon comes with a DP cable? Some unboxing videos show an HDMI cable and some seem to show DP.


----------



## Krzych04650

Quote:


> Originally Posted by *Bryst*
> 
> Anyone know if the $349 sapphire fury nitro on amazon comes with a DP cable? Some unboxing videos have a HDMI and some seem to have DP.


There are no DP cables bundled with the Fury Nitro, only HDMI.


----------



## Bryst

Quote:


> Originally Posted by *Krzych04650*
> 
> There are no DP cables bundled with the Fury Nitro, only HDMI.


Sigh. For an originally $500+ card with three DP ports, I feel like they should have included one. I only have a miniDP-to-DP cable.


----------



## Ceadderman

Quote:


> Originally Posted by *Bryst*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluezone*
> 
> Delivery Drones?
> 
> I don't think that's going on in my area yet. But it would be cool.

Til the thieves get smart and shoot down the drone.

~Ceadder


----------



## MissHaswellE

Quote:


> Originally Posted by *Bryst*
> 
> Sigh. For an originally $500+ card with three DP ports, I feel like they should have included one. I only have a miniDP-to-DP cable.


Well, most monitors that use DisplayPort come with DisplayPort cables.

Every monitor that I've purchased that has DisplayPort has come with the cable.

I've got two DP cables floating around here unused, because four of the monitors I've purchased came with them. The only thing I had to get was an HDMI-to-DVI cable for a fifth.


----------



## Bryst

Quote:


> Originally Posted by *MissHaswellE*
> 
> Well, most monitors that use DisplayPort come with DisplayPort cables.
> 
> Every monitor that I've purchased that has DisplayPort has come with the cable.
> 
> I've got two DP cables floating around here unused, because four of the monitors I've purchased came with them. The only thing I had to get was an HDMI-to-DVI cable for a fifth.


My last two monitors, PS4, receiver, and TV all came with HDMI cables.

I guess my thought is that manufacturers should include, with a graphics card, the newest cable, the one a consumer is least likely to already own. I have about four HDMI cables lying around, a box of DVI and VGA cables, and MiniDP-to-HDMI and MiniDP-to-DVI cables. Those I had to buy because my 7950 only has mini DisplayPorts.

No matter really; the cable should be here tomorrow. But now I have to go A FULL DAY without FreeSync. OUTRAGEOUS.


----------



## costilletas

They come with HDMI because you can use it on your PC and your PlayStation. As simple as that.


----------



## Krzych04650

Quote:


> Originally Posted by *Bryst*
> 
> Sigh, for an originally 500+ dollar card with 3 DP I feel like they should have included one. I only have a miniDP to DP cable.


DP would be more sensible, but Sapphire still looks very good compared to some manufacturers. You know, there are FreeSync monitors that come without a DP cable included, even though they require one to utilize FreeSync, so...


----------



## Bryst

Quote:


> Originally Posted by *Krzych04650*
> 
> DP would be more sensible, but Sapphire still looks very good compared to some manufacturers. You know, there are FreeSync monitors that come without a DP cable included, so


I know, I bought one...


----------



## Thoth420

My BenQ XL2730Z did not come with any DP cable.
I have been using Accell-brand cables without problems so far, though. Avoid the Cable Matters ones, IMO.


----------



## Bryst

Well, got my Sapphire Fury last night; man, is this thing a brick. The first thing I thought when I picked it up was "I could drop this and it would probably be okay." I didn't, don't worry. GPU build quality has definitely improved since the 7000 era.

Some quick benches before I swapped: I didn't really have any games installed with built-in benchmarks and didn't want to do the Fraps method, so I just ran Heaven and Firestrike.




Like a GLOVE:


----------



## Rossky

Excuse me if someone has asked this before, but I've been running into a problem for several days now and I can't seem to find anything on it.

The Fury X I've owned since last year is the first AMD card I actually got to take a closer look at, as the card I had before it was a GTX 680 from a prebuilt PC that introduced me to PC gaming.
The Fury X is the first card I ever attempted overclocking, and by using a custom BIOS for a couple of months now I got it up to some very nice clocks with considerable FPS gains across all of my games.
The entire system runs on a 580W be quiet! Straight Power E9, and there are two screens hooked up to the card, both of which are 1080p.

When reaching clocks above 1170 MHz, the card sometimes seems to completely shut off: both of my monitors turn off and show "no signal", and the fan spins down as well. I've only seen the card crash to the desktop from too high an overclock a couple of times. Is this "shutdown" of the card normal, or is my PSU just too small for the overclocked card?


----------



## iRUSH

Quote:


> Originally Posted by *Rossky*
> 
> Excuse me if someone has asked this before, but I've been running into a problem for several days now and I can't seem to find anything on it.
> 
> The Fury X I've owned since last year is the first AMD card I actually got to take a closer look at, as the card I had before it was a GTX 680 from a prebuilt PC that introduced me to PC gaming.
> The Fury X is the first card I ever attempted overclocking, and by using a custom BIOS for a couple of months now I got it up to some very nice clocks with considerable FPS gains across all of my games.
> The entire system runs on a 580W be quiet! Straight Power E9, and there are two screens hooked up to the card, both of which are 1080p.
> 
> When reaching clocks above 1170 MHz, the card sometimes seems to completely shut off: both of my monitors turn off and show "no signal", and the fan spins down as well. I've only seen the card crash to the desktop from too high an overclock a couple of times. Is this "shutdown" of the card normal, or is my PSU just too small for the overclocked card?


You could be at the end of good power delivery, but 1170 is pretty strong for a Fury X too. I'd settle for 1100-1150 and, if it doesn't error, call it good. But that's just me.


----------



## MissHaswellE

Quote:


> Originally Posted by *Rossky*
> 
> Excuse me if someone has asked this before, but I've been running into a problem for several days now and I can't seem to find anything on it.
> 
> The Fury X I've owned since last year is the first AMD card I actually got to take a closer look at, as the card I had before it was a GTX 680 from a prebuilt PC that introduced me to PC gaming.
> The Fury X is the first card I ever attempted overclocking, and by using a custom BIOS for a couple of months now I got it up to some very nice clocks with considerable FPS gains across all of my games.
> The entire system runs on a 580W be quiet! Straight Power E9, and there are two screens hooked up to the card, both of which are 1080p.
> 
> When reaching clocks above 1170 MHz, the card sometimes seems to completely shut off: both of my monitors turn off and show "no signal", and the fan spins down as well. I've only seen the card crash to the desktop from too high an overclock a couple of times. Is this "shutdown" of the card normal, or is my PSU just too small for the overclocked card?


The Fury X wasn't really designed to be overclocked much; if you're not on the standard VBIOS it came with, I'd suggest reverting to it.
HBM memory isn't very overclockable as it stands right now, one of the drawbacks of new technology.

Also, what utility are you using to OC your Fury X? I use Sapphire TRIXX.


----------



## Rossky

Quote:


> Originally Posted by *MissHaswellE*
> 
> The Fury X wasn't really designed to be overclocked much; if you're not on the standard VBIOS it came with, I'd suggest reverting to it.
> HBM memory isn't very overclockable as it stands right now, one of the drawbacks of new technology.
> 
> Also, what utility are you using to OC your Fury X? I use Sapphire TRIXX.


I'm using Afterburner.

The problem has occurred without the custom BIOS, too. The new BIOS just allowed me to push it a bit further before it happened.


----------



## iRUSH

Quote:


> Originally Posted by *Rossky*
> 
> I'm using Afterburner.
> 
> The problem has occurred without the custom BIOS, too. The new BIOS just allowed me to push it a bit further before it happened.


What if it's set to stock with the power slider maxed? Does it still do it?


----------



## Rossky

Quote:


> Originally Posted by *iRUSH*
> 
> What if it's set to stock with the power slider maxed? Does it still do it?


No. Only at rather high clocks. It has crashed to the desktop a handful of times but that's happened extremely rarely.


----------



## LionS7

Quote:


> Originally Posted by *Rossky*
> 
> When reaching clocks above 1170 MHz, the card sometimes seems to completely shut off: both of my monitors turn off and show "no signal", and the fan spins down as well. I've only seen the card crash to the desktop from too high an overclock a couple of times. Is this "shutdown" of the card normal, or is my PSU just too small for the overclocked card?


The card is just not stable. When the card needs a little more voltage, the drivers crash to the desktop; the monitors shutting off with no video input is the worst kind of crash on Fiji. I was at the same point as you, thinking that my PSU was the weak link or that my cables, like HDMI, were no good, but when I started to pump up the voltage the card became stable, at around 1.268 V for 1100 MHz. The problem is that since the Crimson driver generation, Fiji chips want more voltage for the same clocks, and I saw that with Hawaii too, like the R9 290. With Crimson, 1100 MHz core needed a +106 mV offset, where it was +75 mV before Crimson.


----------



## bluezone

Quote:


> Originally Posted by *Rossky*
> 
> Excuse me if someone has asked this before, but I've been running into a problem for several days now and I can't seem to find anything on it.
> 
> The Fury X I've owned since last year is the first AMD card I actually got to take a closer look at, as the card I had before it was a GTX 680 from a prebuilt PC that introduced me to PC gaming.
> The Fury X is the first card I ever attempted overclocking, and by using a custom BIOS for a couple of months now I got it up to some very nice clocks with considerable FPS gains across all of my games.
> The entire system runs on a 580W be quiet! Straight Power E9, and there are two screens hooked up to the card, both of which are 1080p.
> 
> When reaching clocks above 1170 MHz, the card sometimes seems to completely shut off: both of my monitors turn off and show "no signal", and the fan spins down as well. I've only seen the card crash to the desktop from too high an overclock a couple of times. Is this "shutdown" of the card normal, or is my PSU just too small for the overclocked card?


Keep an eye on your VRM temps. I get that kind of shutdown either due to unstable clocks (not enough voltage) or high VRM temps (too much continuous current draw).

Cheers


----------



## MissHaswellE

Quote:


> Originally Posted by *Rossky*
> 
> I'm using Afterburner.
> 
> The problem has occured without the custom BIOS, too. The new BIOS just allowed me to push it a bit further before it happened.


I'd swap to TRIXX, because it's a program dedicated to overclocking AMD GPUs. It has a nicer set of controls and profile saving, plus fan controls.
It probably won't make any difference to your card's performance, though.

But 1150 core is a really nice overclock anyway.


----------



## gupsterg

Quote:


> Originally Posted by *looncraz*
> 
> Low ASIC cards can sometimes be extremely good overclockers, but they will use more power and generate more heat, so you will need more voltage and more cooling (as in water cooling).


From what The Stilt has stated about AMD GPUs, ASIC quality = LeakageID.
Quote:


> High ASIC "Quality" (Leakage) = Lower operating voltage, larger current draw, hotter, less energy efficient (due higher losses)
> 
> Low ASIC "Quality" = Higher operating voltage, lower current draw, cooler, more energy efficient


Quote link.

I know this goes against what is shown on GPU-Z's ASIC quality tab, but The Stilt has good information and experience with AMD GPUs. What he has stated held true for me on Hawaii and Fiji; he recently also posted similar info in the Polaris 10 discussion thread.


----------



## Ne01 OnnA

-> gupsterg

Please gimme 333/500 HBM with 1.3133 or 1.32133 V.
I will post my new BIOS here.
Big thanks; it's the only one I need, and without PL (I've set my own PL in the BIOS editor).

NewED_all.rom
https://mega.nz/#!MFER1bjZ!OvumrtFowopaC0BMWtt6T82VB-M6eZm4S8h53Bj1i2g


----------



## gupsterg

I have replied in bios mod thread







.

i) there is no 333MHz timings which I can place in ROM, do you have a 333MHz timings strap in ROM?

ii) the information on how to set HBM voltage has been in OP for several months.

Why I don't do custom ROMs for members is:-

i) I would get more requests than I already do. Which means I wouldn't have no spare time for my own purposes.

ii) What also happens is when I do 1 ROM for a member it "snow balls" it an avalanche of requests for more custom ROMs from same member. Then it also gets difficult for me to refuse to do ROMs for other members as well.

I prefer to share any information/experience I may have to help a member; then they are in a position to make modifications as they require, as many times as they want, whenever they want. My other reason for sharing what I have picked up is so the member learns bios mod and may see something I or another modder misses or does not know. Technically we then have more modders viewing ROMs = more minds working on ROMs = more or better mods for all of us.

If you are stuck with modifying your ROM, post your specific questions in the bios mod thread and I will answer as best as I can. Those answers will then be there for others to use as well.
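For anyone hand-editing a ROM as discussed here, one mechanical detail worth knowing: the legacy image of a PCI option ROM must checksum to zero (all bytes of the image sum to 0 mod 256), so after any hex edit a balance byte has to be re-adjusted or flash tools may reject the file. A minimal Python sketch of that generic convention follows; which byte to use as the balance byte (`fix_offset`) is ROM-specific and is an assumption here, not a Fiji-specific fact:

```python
# Legacy PCI option-ROM checksum rule: all bytes of the image sum to 0 mod 256.
# Byte 2 of the header gives the image length in 512-byte blocks.

def checksum_ok(rom: bytes) -> bool:
    assert rom[0:2] == b"\x55\xaa", "missing option-ROM signature"
    length = rom[2] * 512                  # image length in 512-byte units
    return sum(rom[:length]) % 256 == 0

def fix_checksum(rom: bytearray, fix_offset: int) -> None:
    """Re-balance one byte (ROM-specific choice) so the image sums to zero."""
    length = rom[2] * 512
    rom[fix_offset] = 0                    # clear the old balance value first
    rom[fix_offset] = (-sum(rom[:length])) % 256
```

BIOS editors typically re-balance this automatically when saving; the sketch just shows what is happening under the hood.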


----------



## Ne01 OnnA

OK, fair enough.
Can you point me in the right direction?
I can hex edit everything, but I don't have experience with BIOS modding for Fiji.


----------



## gupsterg

Of course I am willing to point you in the right direction. Post the ROM you wish to modify in the Fiji bios mod thread, along with your questions, and we will go from there.


----------



## Bryst

Anyone who recently bought the Sapphire R9 Fury Nitro on Newegg or Amazon care to comment on their coil whine?


----------



## JDags

Quote:


> Originally Posted by *Bryst*
> 
> Anyone who recently bought the Sapphire R9 Fury Nitro on Newegg or Amazon care to comment on their coil whine?


Bought the Sapphire Fury on sale for $300 through Newegg. My coil whine is pretty bad. Others in this thread have suggested that I RMA my card, since my ASIC is piss-poor as well. The card doesn't take to undervolting well, and I dare not OC it on air: if it could go any higher, it would have been sold as the 1050 MHz variant instead of the 1020 MHz variant I purchased.


----------



## Krzych04650

Quote:


> Originally Posted by *Bryst*
> 
> Anyone who recently bought the Sapphire R9 Fury Nitro on Newegg or Amazon care to comment on their coil whine?


I got mine from caseking.de, but it doesn't matter where you buy it, as long as it is new and factory sealed. Coil whine is about as bad as on any other GPU. This one is maybe a bit worse; my previous card, an MSI 980 Ti Gaming 6G, was slightly better under max load, but it also whined as soon as it got anything to render, even in menus with V-sync on, so basically no load at all. Coil whine is quite a subjective thing; some people are not sensitive to it and some are. For example, I hear whine that nobody else in my house can hear, and even my camera is unable to record the sound.

If you are sensitive to this sound, then 3m cables and a hole in the wall or ceiling to another room, with the PC placed there, is the only way to go.


----------



## looncraz

Quote:


> Originally Posted by *gupsterg*
> 
> From what The Stilt has stated about AMD GPUs ASIC quality = LeakageID.
> Quote link.
> 
> I know this goes against what is shown in GPU-Z info on ASIC quality tab but The Stilt has good information and experience of AMD GPUs. What he has stated held true for me on Hawaii and Fiji. He recently also posted similar info in the Polaris 10 discussion thread.


The truth isn't so simple. Higher ASIC quality means lower operating voltage and less heat output at a certain frequency target. Behavior outside of that target can seem almost completely independent of ASIC quality (though it's not).

Technically, higher ASIC quality means lower leakage, therefore lower power consumption and more overclocking headroom at moderate temperatures (this is in keeping with GPU-z, IIRC). However, temperature and voltage can change leakage dramatically (to the fourth and second powers, respectively), so these properties change as soon as you start tweaking the GPU configuration.

I have a 78.1% ASIC quality R9 290 - a rather good sample. It runs at lower voltage, generates less heat, and can undervolt very well. Give it the slightest extra juice, though, and all that goes away completely. And when it's hot, it pulls more power than the average R9 290 (I can easily get my card to pull 420W, at which point I run out of sensible power limits).

A lower ASIC quality means higher leakage, which requires more voltage to run at its default clocks, which means more baseline consumption, but usually means less extra power is consumed when you overclock. The extra leakage means the transistors, on average, are more willing to switch states - provided you can prevent overheating... which can be a challenge.

So the observation can be that, at normal temperatures, overclocking on a lower ASIC quality GPU can be easier and draw less extra power, but that's because the voltage and power usage floor are already elevated.
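The exponents above (leakage roughly proportional to V² and to T⁴) can be turned into a quick back-of-the-envelope calculator. A minimal Python sketch; the baseline voltage/temperature and the resulting numbers are illustrative assumptions, not measured Fiji or Hawaii data:

```python
# Toy model of the scaling described above: leakage rising roughly with the
# square of voltage and the fourth power of absolute temperature. The baseline
# (1.212 V, 50 C) is an assumed reference point for illustration only.

def leakage_scale(v, t_c, v0=1.212, t0_c=50.0):
    """Relative leakage vs. a baseline voltage (V) and core temp (deg C)."""
    t_k, t0_k = t_c + 273.15, t0_c + 273.15
    return (v / v0) ** 2 * (t_k / t0_k) ** 4

# A +50 mV bump combined with a 75 C core vs. the 50 C baseline:
print(round(leakage_scale(1.262, 75.0), 2))  # ~1.46x baseline leakage
```

Even this toy model shows why a hot, overvolted card "runs away": the two factors multiply, so extra voltage raises temperature, which raises leakage again.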


----------



## gupsterg

I agree it is not as simple a subject, and by no means am I "qualified" or have "full experience" on the matter, but what The Stilt has stated held true for me.

On the higher ASIC quality Fiji GPUs I had, they drew more current than the lower ASIC quality GPUs, even when the higher quality ASIC had a lower voltage. Temps were approximately similar, on stock cooling with no mods other than the fan profile. As I OC'd the higher ASIC quality GPUs, they drew more power than the lower ASIC quality ones. One such comparison was between two GPUs whose clocks and VID were very close.

I have also had 2 GPUs which were higher ASIC quality, which should mean lower voltage, *but* as they were deemed "bad ASIC" they ran at a higher voltage than they should have IMO (link to post). Of these cards which seemed like "bad ASIC" to me, one would not even run Folding@home stable at stock, whereas for some reason it had no issue with 3D loads, even with a moderate OC. I made no changes to the rig except swapping in another card, and the same work unit that BSOD'd on one card worked fine on the other.

So far I have played with 7 different Fiji cards.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *gupsterg*
> 
> Of course I am willing to point you in the right direction. Post the ROM you wish to modify in the Fiji bios mod thread, along with your questions, and we will go from there.


Great, thanks Bratan'!

Maybe we can add HBM voltage control into the Fiji BIOS?
And maybe I will consider adding the BIOS editing tools into my RadeonMOD (of course, only if their makers agree to that), so we can have an all-in-one tool.


----------



## Aretak

Quote:


> Originally Posted by *Bryst*
> 
> Anyone who recently bought the Sapphire R9 Fury Nitro on Newegg or Amazon care to comment on their coil whine?


Not from one of those two retailers, but I bought one in the past couple of weeks and have zero coil whine. I do have a good quality PSU in an EVGA SuperNOVA G2 750W, although that didn't stop me from returning five different 970s that all had obnoxious coil whine in the same system (I eventually bought a 980 instead, which didn't have any).


----------



## Bryst

Quote:


> Originally Posted by *JDags*
> 
> Bought the Sapphire Fury on sale for $300 through Newegg. My coil whine is pretty bad. Others in this thread have suggested that I RMA my card since my ASIC is piss poor as well. Card doesn't take to undervolting well and I dare not OC the card on air because if it could go any higher, it would have been sold as the 1050 mhz variant instead of the 1020 mhz variant I purchased.


Quote:


> Originally Posted by *Krzych04650*
> 
> I got my from caseking.de, but it doesn't matter where you buy it, as long as it is new and factory sealed. Coil whine is about as bad as on other GPUs in the world. This one is maybe a bit worse, my previous card, MSI 980 Ti Gaming 6G, was slightly better under max load, but it was also whining as soon as it got anything to render, even in menus with V-sync on, so basically no load at all. Coil whine is quite subjective thing, some people are not sensitive to it and some are, for example I hear whine that nobody else in my house can hear and even my camera is also unable to record this sound.
> 
> If you are sensitive to this sound then 3m cables and a hole in the wall or ceiling to the other room and placing PC there is the only way to go.


Thanks for the reply. I bought one off Amazon and I have some minor coil whine. It's weird: it's not high-pitched at 60-75 FPS like it is at 3000 FPS. At 60-75 FPS it almost sounds like a bad fan, like a low whirl. It's not the fans though, because I ran the card at full load without the fans for a bit and it was still there. Going to try running something at high FPS overnight to see if that can reduce it, or I'll just have to get used to it, I guess. I went through the whole return thing with Amazon on my LG 29UM67, and the replacement I received was worse than the one I was exchanging. I ended up returning the exchange instead lol.


----------



## Kana-Maru

I finally got around to undervolting my Fury X. I've had it for more than a year now, so it was about time I tried.

Stock voltage for my Fury X would peak at 1.225 V and the average would hover around 1.19 V - 1.21 V.

I undervolted the card using these settings:
*Core Voltage:* -36 mV
*Power Limit %:* -25
*Core Clock:* 1050 MHz

The voltage now peaks at 1.16 V and the average is roughly 1.14 V - 1.15 V. I saw it dip as low as 1.10 V during less stressful parts of the benchmark.

I ran Heaven Benchmark 4 [Tessellation = Extreme, 1440p + 4K], Valley Benchmark 1 [Ultra: 1440p + 4K] and Fire Strike Extreme & Ultra. No artifacts and no crashing.

The temps were around 35°C-40°C! Wow, that's around 5°C - 8°C lower than what I normally see!
I might try to go lower, but at some point I'm sure I'll need to lower my core clock a little bit.
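A rough sanity check on what an undervolt like this buys: switching power scales roughly with f·V², so dropping the average from ~1.20 V to ~1.15 V at the same 1050 MHz should shave on the order of 8% off dynamic power, with leakage savings on top. A hedged first-order sketch, not a measurement:

```python
# First-order dynamic-power estimate: P_dyn ~ f * V^2 (the standard CMOS
# switching-power rule). It ignores leakage, so the real total saving is
# usually somewhat larger. Voltages are the averages reported above.

def dynamic_power_ratio(v_new, v_old, f_new=1050.0, f_old=1050.0):
    return (f_new / f_old) * (v_new / v_old) ** 2

saving = 1.0 - dynamic_power_ratio(1.15, 1.20)
print(f"~{saving:.1%} less dynamic power at the same clock")
```

This also explains why temps drop a few degrees while the core clock and FPS stay untouched.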


----------



## iRUSH

^^ Looks like the AIO on the Fury X is more than enough to keep it cool ?


----------



## Bryst

Quote:


> Originally Posted by *Kana-Maru*
> 
> I finally got around to undervolting my Fury X. I've had it for more than a year now, so it was about time I tried.
> 
> Stock voltage for my Fury X would peak at 1.225 V and the average would hover around 1.19 V - 1.21 V.
> 
> I undervolted the card using these settings:
> *Core Voltage:* -36 mV
> *Power Limit %:* -25
> *Core Clock:* 1050 MHz
> 
> The voltage now peaks at 1.16 V and the average is roughly 1.14 V - 1.15 V. I saw it dip as low as 1.10 V during less stressful parts of the benchmark.
> 
> I ran Heaven Benchmark 4 [Tessellation = Extreme, 1440p + 4K], Valley Benchmark 1 [Ultra: 1440p + 4K] and Fire Strike Extreme & Ultra. No artifacts and no crashing.
> 
> The temps were around 35°C-40°C! Wow, that's around 5°C - 8°C lower than what I normally see!
> I might try to go lower, but at some point I'm sure I'll need to lower my core clock a little bit.


Nice! I have my Fury Nitro at -60 mV, which puts it at about 50-55°C in games. I left the power limit alone because lowering it cost me FPS in games; not too much, only like 5-6, but I still didn't like that. Those temps are awesome. Makes me wish I'd spent the extra $50 on the Fury X, but I didn't think my PSU could handle it; it's only 550 W.


----------



## MissHaswellE

Quote:


> Originally Posted by *Kana-Maru*
> 
> I finally got around to undervolting my Fury X. I've had it for more than a year now, so it was about time I tried.
> 
> Stock voltage for my Fury X would peak at 1.225 V and the average would hover around 1.19 V - 1.21 V.
> 
> I undervolted the card using these settings:
> *Core Voltage:* -36 mV
> *Power Limit %:* -25
> *Core Clock:* 1050 MHz
> 
> The voltage now peaks at 1.16 V and the average is roughly 1.14 V - 1.15 V. I saw it dip as low as 1.10 V during less stressful parts of the benchmark.
> 
> I ran Heaven Benchmark 4 [Tessellation = Extreme, 1440p + 4K], Valley Benchmark 1 [Ultra: 1440p + 4K] and Fire Strike Extreme & Ultra. No artifacts and no crashing.
> 
> The temps were around 35°C-40°C! Wow, that's around 5°C - 8°C lower than what I normally see!
> I might try to go lower, but at some point I'm sure I'll need to lower my core clock a little bit.


These cards are designed to run at 80°C~95°C without any real worry. Why are you so focused on undervolting when you can overclock and still keep it below 60°C under 100% load?


----------



## Bryst

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *MissHaswellE*
> 
> These cards are designed to run at 80C~95C without any real worry, why are you so focused on undervolting when you can overclock and still keep it bellow 60C under 100% load?






To each their own. I've been more focused on reducing temps as much as possible while keeping framerates high and noise low. I prefer to keep my GPU around the temps of my CPU. Plus it keeps the temp of the SSD in my M.2 slot cooler.


----------



## MissHaswellE

Quote:


> Originally Posted by *Bryst*
> 
> 
> To each their own. I've been more focused on reducing temps as much as possible while keeping framerates high and noise low. I prefer to keep my GPU around the temps of my CPU. Plus keeps the temp of my SSD in the m.2 slot cooler.


Mine's running right now at 99% load at 50°C in game at 1440p, overclocked to 1100 MHz core, 510 MHz memory.


----------



## Krzych04650

Quote:


> Originally Posted by *MissHaswellE*
> 
> These cards are designed to run at 80C~95C without any real worry, why are you so focused on undervolting when you can overclock and still keep it bellow 60C under 100% load?


If you cannot overclock, or overclocking is poor (for example it requires a big overvoltage for a small performance increase and is just not worth the temp/noise/power-draw increase), then it is nice to check whether the GPU can at least take some undervolting. You can significantly decrease temperatures and power draw this way, and therefore noise levels, especially on air.

I know there are paranoid people around who will say they don't like things running "as hot as 70°C" and that components "cannot keep up" at that temp, when in fact they were designed to run at higher temps. But if you can reduce noise levels and power consumption without any performance cost, then why not? It's a free improvement.


----------



## Kana-Maru

Quote:


> Originally Posted by *iRUSH*
> 
> ^^ Looks like the AIO on the Fury X is more than enough to keep it cool ?


Yes it is. The card is silent, since the fans run at low RPMs during benchmarking and gaming sessions.

Quote:


> Originally Posted by *Bryst*
> 
> Nice! I have my Fury nitro at -60mv which puts it about 50-55c in games. I left the power limit because I saw a reduction of FPS in games. Not to much only like 5-6 but still I didnt like that. Those temps are awesome, makes me wish id spent the extra 50 on the Fury X but I didnt think my PSU could handle it, its only a 550watt.


Thanks. I had the power limit and the core voltage lower, but then I noticed my core clock would fall. Remember, these are the warmest months of the year; I can only imagine how cool the Fury X runs during the fall and winter months. The temps were already good, but undervolting makes them even better.

Quote:


> Originally Posted by *MissHaswellE*
> 
> These cards are designed to run at 80C~95C without any real worry, why are you so focused on undervolting when you can overclock and still keep it bellow 60C under 100% load?


I'm not "so focused" on undervolting. I had the GPU for a year and just decided to undervolt for the heck of it. The results were nice. I run my Fury X at stock settings and I rarely overclock it since my performance is normally great at stock. The highest I've pushed it was 1175Mhz on the Core. I was hitting 1125Mhz-1150Mhz with only core clock changes. All of my benchmarks are using stock settings anyways except some synthetic benchmarks from time to time.

Undervolting does has it's purposes. Less voltage being used, less wattage and less heat output, fans running at lower RPMs while maintaining the same core clock. I guess a better question would be "why not" undervolt it if you can achieve the same performance with no fps loss.

Quote:


> Originally Posted by *Bryst*
> 
> 
> To each their own. I've been more focused on reducing temps as much as possible while keeping framerates high and noise low. I prefer to keep my GPU around the temps of my CPU. Plus keeps the temp of my SSD in the m.2 slot cooler.


I have an SSD in the M.2 slot, and those can get pretty warm. My GPU is normally on par with my CPU, and now my GPU actually runs cooler than the CPU, depending on the game and API being used.

I'm considering picking up another Fury X for CFX, since prices have dropped to around $400.00. The scaling looks nice, but who knows what might be released later this year or early next year. Plus, the single Fury X is giving me more than I need and is aging fairly well one year later.


----------



## Ne01 OnnA

Here is what I have now for my NITRO OC+:

Under-Volting scaling:

Crimson + CCC 16.7.3 WHQL
Tests on: 1856:1392 80Hz Digital Panel 16Bit HDMI

SoM Test All Ultra/V.High/noBlur/FXAA:

850/500 1.206v -42mV -18%POW 45%Fan -> 68 85 52 (145tW / HBM 8tW)
935/550 1.206v -48mV -14%POW 55%Fan -> 73 88 57 (204tW / HBM 10tW)
1020/570 1.206v -48mV -14%POW 65%Fan -> 76 90 54 (238tW / HBM 10tW)
1050/570 1.206v -36mV -8%POW 65%Fan -> 76 91 58 (249tW / HBM 10tW)

Here is my VDD:


Next up will be the ROM gupsterg gave me, with a small voltage bump for the HBM.


----------



## LionS7

No! A green bar on the window... with an R9 Fury... no! I'm leaving.


----------



## gupsterg

Quote:


> Originally Posted by *iRUSH*
> 
> ^^ Looks like the AIO on the Fury X is more than enough to keep it cool ?


Totally. I have a slight cooling profile mod via the ROM, and a 50°C GPU temp is maintained when gaming/folding, etc.


----------



## bluezone

Crimson 16.8.1 release notes

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-8-1-Release-Notes.aspx

Windows 10 64 download.

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

windows 7 64 download.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

cheers

EDITED: Win 7 added and the Win 10 link corrected.


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Crimson 16.8.1 release notes
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-8-1-Release-Notes.aspx
> 
> Windows 10 64 download.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-8-1-Release-Notes.aspx
> 
> cheers


Some nice fixes in there. I may, for the first time, consider multi-GPU. I will however wait for the HBM2 AMD flagships, as my loop is already done. AMD is doing it big with each driver release!


----------



## Krzych04650

Did they finally fix the "VSR not supported" bug in the drivers or not?


----------



## MissHaswellE

Quote:


> Originally Posted by *Thoth420*
> 
> Some nice fixes in there. I may for the first time consider multi GPU. I will however wait for HBM2 AMD Flaghships as my loop is already done. AMD is doing it big with each driver release!


Dual graphics support relies far more on the developers of the games than on the drivers.
Drivers can't do much if a game doesn't support and optimize multi-GPU.

Most developers just don't care, because the great majority of gamers are running single GPUs, and the vast majority are running cards at $250 or less.


----------



## bluezone

Quote:


> Originally Posted by *Krzych04650*
> 
> Did the finally fix not supported VSR bug in drivers or not?


I'm not sure. I finally have VSR back, but I have not done my normal nuke-and-pave with DDU; I used a custom driver I found on Guru3D that allowed VSR to operate (same driver kernel as 16.7.3).
I've used the upgrade driver path with the past two drivers for fear of losing the ability to use VSR.

Please try 16.8.1 and let me know if VSR is enabled. If it doesn't work, I will point you to the custom driver. I posted about this driver 2-3 weeks ago.


----------



## Krzych04650

Quote:


> Originally Posted by *bluezone*
> 
> I'm not sure. I finally have VSR back, but I have not done my normal nuke-and-pave with DDU; I used a custom driver I found on Guru3D that allowed VSR to operate (same driver kernel as 16.7.3).
> I've used the upgrade driver path with the past two drivers for fear of losing the ability to use VSR.
> 
> Please try 16.8.1 and let me know if VSR is enabled. If it doesn't work, I will point you to the custom driver. I posted about this driver 2-3 weeks ago.


I heard something about FreeSync issues with the new driver, and I am about to get a FreeSync monitor for testing tomorrow, so I will pass on this driver for now.


----------



## Thoth420

Quote:


> Originally Posted by *MissHaswellE*
> 
> Dual Graphics support relies vastly more on the Developer of the games rather than the Drivers.
> Drivers can't do much if a game doesn't support and optimize their multi GPU support.
> 
> Most developers just don't care because the grand majority of gamers are running single GPUs, and the vast majority are running cards at 250$ or less.


True, but it seems some devs are getting better at CrossFire and SLI support. Worst case, with two HBM2 flagships one gets disabled for a game, and it will still run just fine on those monsters-to-be.


----------



## iRUSH

Quote:


> Originally Posted by *Krzych04650*
> 
> I heard something about FreeSync issues with new driver and I am about to get FreeSync monitor for testing tomorrow so I will pass on this driver for now.


Free-sync user here reporting in that the 16.8 drivers are bogus. 16.7.3 is a winner for now.


----------



## Krzych04650

Quote:


> Originally Posted by *iRUSH*
> 
> Free-sync user here reporting in that the 16.8 drivers are bogus. 16.7.3 is a winner for now.


Thanks for info


----------



## Thoth420

Quote:


> Originally Posted by *iRUSH*
> 
> Free-sync user here reporting in that the 16.8 drivers are bogus. 16.7.3 is a winner for now.


What is the issue with Freesync on the latest driver?


----------



## iRUSH

Quote:


> Originally Posted by *Thoth420*
> 
> What is the issue with Freesync on the latest driver?


You'll have random moments where your frame rate will go from its smooth 144 down to 60 and get stuck there for 5-10 seconds before going back to 144 again. This happens constantly, even just on the desktop. In the course of 5 minutes it'll happen 5-10 times.


----------



## Thoth420

Quote:


> Originally Posted by *iRUSH*
> 
> You'll have random moments where your frame-rate will go from it's smooth 144, to 60 and get stuck there for 5-10 seconds before going back to 144 again. This happens constantly just on the desktop alone. In the course of 5 minutes it'll happen 5-10 times.


Damn...hope they suss that out quick! :0


----------



## bluezone

I don't use Freesync so I cannot check this out. But over on Guru3D:
Quote:


> As an update the Redstone issue is being fixed at the moment. If anyone has any issues with Freesync the workaround would be to enable GPU scaling that is also being looked at too along with some other issues.


http://forums.guru3d.com/showthread.php?t=409217


----------



## Bryst

Anyone have issues with their Fury and glitches in World of Warcraft? I get what look like horizontal white lines, where a small strip of the screen glitches white for a millisecond or so.

Enabling triple buffering seemed to remove this, but then I got terrible horizontal tearing, like the whole bottom half of my screen was misaligned by a few millimeters. So I enabled V-sync, but that would tank my FPS to 37 in a lot of areas with a high volume of people.

Limiting FPS to under 60 seems to reduce the frequency.

I have done a driver wipe and a fresh install of 16.7.3. It seems to only happen in WoW; I didn't notice it in Doom or Fallout 4, though I do get buggy clouds in War Thunder, but that seems to be an issue that started when they updated the cloud engine.


----------



## Thoth420

Quote:


> Originally Posted by *Bryst*
> 
> Anyone have issues with their Fury and glitches in World of Warcraft? I would have like horizontal white lines of a small group of the screen glitch white for a millisecond or so.
> 
> Enabling Triple buffering seemed to remove this but then I got terrible horiszontal tearing, like the whole bottom have of my screen would be misaligned by like a few millimeters. So I enabled V-sync but that would tank my FPS to 37fps in alot of areas with high volume of people.
> 
> Limiting FPS to under 60 fps seems to reduce the frequency.
> 
> I have done a driver wipe and fresh install of 16.7.3. It seems to only happen in wow, I didnt notice this in Doom, or Fallout4, though I do get buggy clouds in War thunder, but that seems to be an issue that started when they updated the cloud engine.


I am coming back for Legion, but my account is still either inactive or frozen (perhaps deleted, if that happens after a long enough inactive period). I suspect a driver release will be out prior to the 16th for the expansion release. I am sure the game has pre-patched already, and I know they add some of the framework for the expansion early, as certain changes affect non-expansion buyers etc. Essentially the game is in flux for a bit, so I wouldn't worry about it too much at the moment.
Do you play in fullscreen, windowed (borderless), or plain old windowed?


----------



## Bryst

Quote:


> Originally Posted by *Thoth420*
> 
> I am coming back for Legion but my account is still either inactive or frozen(perhaps deleted if that happens over a long enough inactive time) but I suspect a driver release will be out prior to the 16th for the xpac release. I am sure the game pre-patched already and I know they add some of the framework for the xpac early as certain changes affect non xpac buyers etc. Essentially the game is in flux for a bit so I wouldn't worry about it too much at the moment.
> Do you play in Fullscreen, Windowed(borderless) or plain old Windowed?


I am playing in fullscreen. Maybe it is just a driver thing, but it does it in Draenor zones as well. I haven't owned the card long, though, and I think the Legion pre-patch was installed already.


----------



## Thoth420

Quote:


> Originally Posted by *Bryst*
> 
> I am playing in Fullscreen. maybe it is just a driver thing, but it does it in draenor zones as well. But I havent owned the card long and I think the legion prepatch was installed already.


Try playing in borderless windowed and clamp the FPS accordingly to avoid tearing, and see if that solves it in the meantime. Another thing to try is turning off all forms of AA; I find most glitches in WoW are AA-related. I always preferred to play WoW in borderless windowed, and I pulled Gladiator a couple of seasons in a row before my hiatus, as well as clearing everything on Heroic from Vanilla (less Vanilla Naxx and C'Thun obviously... we got to Four Horsemen before they dropped the first expansion) through Lich King. I even got lucky and won the Invincible mount on our first kill.
I always found the game to be more responsive in borderless windowed mode, be it on Nvidia or AMD, but it has been a while since I played and I am sure much has changed on the back end as well as the front. Just my experience from being a hardcore WoW addict for years.


----------



## Bryst

Quote:


> Originally Posted by *Thoth420*
> 
> Try playing in borderless window and clamp fps accordingly to not get tearing and see if that solves it for the median. Another thing to try is turn off all forms of AA. I find most glitches in WoW are AA related. I always preferred to play in borderless window for WoW and I pulled Gladiator a couple seasons in a row before my hiatus as well as clearing everything on Heroic from Vanilla(less Vanilla Naxx and C'Thun obviously..., we got to 4 horsemen before they dropped the first xpac) through Lich King. I even got lucky and won the Invincible mount on our first kill.
> I always found the game to be more responsive in borderless windowed mode be on Nvidia or AMD but it has been a while since I played and I am sure much has changed on the back end as well as the front. Just my experience being a hardcore WoW addict for years.


I tried borderless and it did the same thing. It seems like if the FPS never goes over 60 (limiting it in game to 55), it doesn't happen at all, so I'm beginning to think it's a FreeSync issue. Going to try playing for a bit with the game set to 60 Hz and see if that resolves it.


----------



## Thoth420

Quote:


> Originally Posted by *Bryst*
> 
> I tried borderless and it did the same thing, it seems like if the FPS never goes over 60(limiting it in game to 55) it doesnt happen at all. So im beginning to think its a Freesync issue. Going to try playing for a bit with the game set to 60hz and see if it resolves it.


Quite possible; I have yet to try FreeSync on WoW. Also, the latest driver really broke FreeSync, so if you are on it you may want to roll back to the last one you were using.


----------



## Krzych04650

Moving my PC to the attic is complete. Quite easy to do: it requires only a 3m DP cable, two 3m USB cables (one is enough), and some USB hubs. And of course you need quite long power and reset cables. I got 90cm Lian Li ones; they barely reached my ceiling, and I am turning the computer on/off with a 1m stick because I cannot reach them.

Too much effort to extend them on my own, and prices for extensions in shops are ridiculous: a stupid 1-pin 30cm extension costs 18 zł, so to extend power and reset by 1.5m I would need 20 of them, which in total is much more than a quality 256GB SSD... Just idiots.

No issues with the 3m DP; I got a StarTech (aka Amphenol) one, certified by VESA, and have no issues running 3440x1440 at 75Hz.
No issues with the 3m USB 3.0 cable (Gembird) or the USB 3.0 hub either (Unitek).
Some serious issues with the 3m USB 2.0 cable (Natec): it will disconnect things over and over with a hub (Hama). The hub by itself works when connected to the USB 3.0 hub (but its quality is quite poor anyway, so I will probably return it), so I have a total of 7 ports for now, which should be enough.
No issues with the external USB sound card either.

As for noise... well, AMAZING. Coil whine is not audible anymore, and 100% fan speed on the Fury Nitro, which sounds like a jet engine, is quieter than my breath. Normal-usage fan speeds like 50% are completely inaudible. NO MORE PC SOUNDS! I can have eternal peace now.

I only need to paint this 'however this is called in English' brown to match the walls, and everything will be complete.


----------



## iRUSH

Quote:


> Originally Posted by *Krzych04650*
> 
> Moving my PC to the attic is complete. Quite easy to do: it requires only a 3m DP cable, two 3m USB cables (one is enough), and some USB hubs. And of course you need quite long power and reset cables. I got 90cm Lian Li ones; they barely reached my ceiling, and I am turning the computer on/off with a 1m stick because I cannot reach them.
> 
> Too much effort to extend them on my own, and prices for extensions in shops are ridiculous: a stupid 1-pin 30cm extension costs 18 zł, so to extend power and reset by 1.5m I would need 20 of them, which in total is much more than a quality 256GB SSD... Just idiots.
> 
> No issues with the 3m DP; I got a StarTech (aka Amphenol) one, certified by VESA, and have no issues running 3440x1440 at 75Hz.
> No issues with the 3m USB 3.0 cable (Gembird) or the USB 3.0 hub either (Unitek).
> Some serious issues with the 3m USB 2.0 cable (Natec): it will disconnect things over and over with a hub (Hama). The hub by itself works when connected to the USB 3.0 hub (but its quality is quite poor anyway, so I will probably return it), so I have a total of 7 ports for now, which should be enough.
> No issues with the external USB sound card either.
> 
> As for noise... well, AMAZING. Coil whine is not audible anymore, and 100% fan speed on the Fury Nitro, which sounds like a jet engine, is quieter than my breath. Normal-usage fan speeds like 50% are completely inaudible. NO MORE PC SOUNDS! I can have eternal peace now.
> 
> I only need to paint this 'however this is called in English' brown to match the walls, and everything will be complete.


How hot is your attic?


----------



## MrKoala

And how hot is the attic after running the PC for a while?


----------



## Krzych04650

Quote:


> Originally Posted by *iRUSH*
> 
> How hot is your attic?


Temperature depends on the outside temperature, because the attic is not thermally insulated from outside; there is no glass wool there. At the same time it is thermally insulated from the two floors beneath it, with a crazy amount of wool between my 1st-floor ceiling and the attic floor, so indoor temperatures cannot affect how hot or cold the attic gets by much. With a lot of hot days in a row in the summer, temps can surely get over 35°C. On the other hand, when there is a very cold winter and outside temps stay around -20°C for a few days, the attic approaches 0°C and the relatively warm attic floor starts to produce condensation, which can be very dangerous for a PC. I will have to monitor temperatures during winter very carefully. I have some wool to insulate at least half of the attic, but I hate this stuff, because it irritates the skin and airways so much. I'd rather build a 1m x 1m box with small intakes and put the PC there, maybe with some ~50W lamps inside to ensure heat is produced while the PC is not in use and the temperature is kept over 0°C, but first I need to see if the attic really gets close to 0°C. My father said he saw a below-0°C temp there once, so I need to be careful.
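The condensation worry can be put in numbers: moisture forms on any surface (a cold case, for instance) that sits below the dew point of the surrounding air. A small sketch using the Magnus approximation; the 5°C / 90% RH inputs are example winter-attic conditions, not measurements from this post:

```python
import math

# Dew-point estimate via the Magnus approximation. Condensation forms on a
# surface whose temperature is below the dew point of the air around it.
# The example inputs (5 C air, 90% relative humidity) are assumptions.

def dew_point_c(temp_c: float, rel_humidity: float) -> float:
    a, b = 17.62, 243.12  # Magnus coefficients for water over a liquid surface
    gamma = math.log(rel_humidity / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

print(round(dew_point_c(5.0, 90.0), 1))  # ~3.5 C: a near-0C case would sweat
```

So in humid near-freezing air, a PC case that has cold-soaked overnight really can end up below the dew point, which is why a heated enclosure (or just leaving the PC idling) helps.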

I am waiting for winter to see the temps with a 5-8C ambient.







With undervolting, 70% fan speed, and 12V case fans I can already get the card to 46C with ~20C ambient, so I guess below 40C is possible on air.








Quote:


> Originally Posted by *MrKoala*
> 
> And how hot is the attic after running the PC for a while?


This is a very big room. You can hit your head there over and over because some roof reinforcements hang at around average human height, so there is no way to really live there unless you are a hobbit, but affecting the ambient temperature of such a big room in any meaningful way with a 300-350W heat source is generally not possible. I will post a few pictures of the attic tomorrow, because right now it is pitch black up there.









Temps are not really a concern, unless they drop below 0C. I have very good cooling in my PC and now I can crank the fans up 3 times higher without hearing anything, so... yeah, I won't even see 70C on this PC now, even with 35C+ ambient temps.


----------



## LionS7

Well, after a lot of testing in The Witcher 3 (I think it is the best game for this purpose), my Fury X tops out at 1100 MHz core on 1.28V. I'm not happy with it, but with the fan at 45% on the water cooler, the temps are really good. Anyone else with that kind of core, or close to it?


----------



## Kana-Maru

Quote:


> Originally Posted by *LionS7*
> 
> Well, after a lot of testing in The Witcher 3 (I think it is the best game for this purpose), my Fury X tops out at 1100 MHz core on 1.28V. I'm not happy with it, but with the fan at 45% on the water cooler, the temps are really good. Anyone else with that kind of core, or close to it?


I have been undervolting my Fury X lately. I'll probably get around to testing some overclock settings soon and let you know what I can get stable.


----------



## Krzych04650

I think I am having some issues with this 3m DP cable. It is working, but some strange things happen occasionally, like slight flickering. DP degrades over long cable runs, right? And HDMI does not? I could drive 3440x1440 at 60 Hz over HDMI 2.0.


----------



## Thoth420

Quote:


> Originally Posted by *Krzych04650*
> 
> I think I am having some issues with this 3m DP cable. It is working, but some strange things happen occasionally, like slight flickering. DP degrades over long cable runs, right? And HDMI does not? I could drive 3440x1440 at 60 Hz over HDMI 2.0.


3m is not too long; AFAIK it is what AMD recommends for FreeSync panels, and if it works fine on those it should work fine on that 60Hz ultrawide as well. Which brand of cable are you using? Some are not VESA certified; a few are marketed as such but are not.


----------



## Krzych04650

Quote:


> Originally Posted by *Thoth420*
> 
> 3m is not too long; AFAIK it is what AMD recommends for FreeSync panels, and if it works fine on those it should work fine on that 60Hz ultrawide as well. Which brand of cable are you using? Some are not VESA certified; a few are marketed as such but are not.


This one: https://www.amazon.com/StarTech-com-Certified-DisplayPort-Cable-Latches/dp/B0011ZQLYM

It is the only certified one I was able to find in Poland.

I am not going to use FreeSync because I don't find it useful; it performs far worse than what you read on the web, especially in the 55-75 Hz range of my LG (not expandable; setting for example 45-75 causes flickering). I also tried it with a 40-75 range on another monitor and I am not impressed at all. I also don't want to get used to 75 Hz, because after some time at 75 FPS I won't be able to go back to 60 FPS; I already felt 60 FPS was a bit clunky after one day at 75 FPS, so I reverted to 60 Hz. 60 FPS is already hard enough to drive at 3440x1440, not to mention many games being CPU limited and barely running in the 40-50 range.

So I could use HDMI instead and return the DP cable. Does HDMI also have certifications, or not? It is also over twice as cheap.


----------



## costilletas

Quote:


> Originally Posted by *Krzych04650*
> 
> I think I am having some issues with this 3m DP cable. It is working, but some strange things happens occasionally, like slight flickers and etc. DP is loosing performance if it is long right? And HDMI is not? I could drive 3440x1440 60 Hz through HDMI 2.0.


By weird flickering do you mean something like this? https://drive.google.com/open?id=0B7sVtMlN65eMeWY3N2lqb3NPcmM


----------



## iRUSH

Ah! I have been experiencing flickering too. Maybe I'll roll back a few drivers and see if it disappears.


----------



## Krzych04650

Quote:


> Originally Posted by *costilletas*
> 
> By weird flickering do you mean something like this? https://drive.google.com/open?id=0B7sVtMlN65eMeWY3N2lqb3NPcmM


I think I saw something like this a few times after switching to the 3m cable. It is quite rare and very fast, so I am usually not sure if I really saw it or not, but something is not right here, and only the cable was changed recently; everything else is unchanged.

Plus I get some bugged flickering shadows that I didn't see on my R9 270X back when I played this game (Titan Quest).

I will just order an HDMI cable and see if it helps.


----------



## Ne01 OnnA

Ne01 Presents: *Serious BIOS Modding* has started.

My new BIOS with the HBM timing MOD from the 300/500 to the 400/500 strap + a little voltage adjustment for stability!
Voltage goes to 1.319V for the HBM, and the HBM strap timing is 400MHz, up from the stock 300MHz!
THX to Gupsterg for baking the BIOS for me (then I needed to hex edit it myself).
CUs are unlocked.

The test run was conducted in Shadow of Mordor with Ultra textures + ReShade/SweetFX/HighPass all maxed + AA, and Ryse: Son of Rome maxed with AA (that one is really sensitive, so it is good for error testing).
No artifacts or glitches!
So for my HBM, 1.319V was enough for 400MHz/570MHz (those timings are really fast, and I mean REALLY; no GDDR5 or GDDR5X can match this HBM).





Here is my BIOS (Fury NITRO OC+ 1050/500):

https://mega.nz/#F!QAkghRJT!W4n7fr-1CPaw7-QzTi04Xg

*Fury_Tmod_all* -> Unlocked all CUs + HBM timing MOD to 400MHz + HBM V MOD 1.319V + edited for low tW (good for long gaming runs, 6-8h) [non-UEFI]

*OC_all.HBM* -> Unlocked all CUs + HBM V MOD 1.319V [non-UEFI]

*NewED_all_New.V* -> Normal Nitro OC BIOS with PL MOD and unlocked CUs [UEFI]

Please just edit this in the Fiji BIOS Editor if you need more PL (all of these are edited for long gaming sessions).
I'm using TRIXX for settings (I strongly recommend using TRIXX).


----------



## costilletas

@Ne01 OnnA Yours is a Nitro OC too, right? Can you upload your BIOS?









@Krzych04650 @iRUSH

Can you tell me if your flickering is like that in the video I uploaded? It's the second Fury I own and both came with this problem, and it's making me want to RMA it again. I can't get the FPS I want in most games I play because the GPU refuses to go to full load for some reason no matter what I do, and then there's this awful flickering, which seems to happen only when I hit 144+ FPS.


----------



## iRUSH

Quote:


> Originally Posted by *costilletas*
> 
> Ne01 OnnA Yours is a nitro oc too right? Can you upload your bios?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Krzych04650@iRUSH
> 
> Can you tell me if your flickering is like that of the video i uploaded? Cause it's the second fury i own and both came with this problem and it's making me want to rma it again, I can't get the fps i want in most games i play cause the gpu is too lazy to go full load for some reason no matter what i do and then this awful flickering which seems to happen only when i hit 144+ fps.


So far, today's driver update seems to have eliminated my flickering issue. It looked like the same thing you have going on. I even went as far as exchanging my monitor earlier this morning.


----------



## costilletas

@Ne01 OnnA

570 with that timing is pretty impressive, at least compared to mine, which starts artifacting at 520 lol.


----------



## Ne01 OnnA

Thank You









My advice?
Go for a custom BIOS + give some voltage bump on the HBM, then test and tell me if it works for you








IMO the max good for HBM is 1.319 / 1.325 / 1.331.
First try 1.319V (for me it is enough; I don't need the 600MHz with the 400 strap -> of course I can do it, but the gain is small when you compare 570 to 600).

Don't give up -> Tweak MOAR


----------



## bluezone

Quote:


> Originally Posted by *costilletas*
> 
> Ne01 OnnA Yours is a nitro oc too right? Can you upload your bios?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Krzych04650@iRUSH
> 
> Can you tell me if your flickering is like that of the video i uploaded? Cause it's the second fury i own and both came with this problem and it's making me want to rma it again, I can't get the fps i want in most games i play cause the gpu is too lazy to go full load for some reason no matter what i do and then this awful flickering which seems to happen only when i hit 144+ fps.


It's a driver issue, apparently.
Quote:


> ◾A small number of 144hz non-Freesync enabled displays may exhibit flickering during gaming or on desktop.


From the Crimson 16.8.2 release notes:

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-8-2-Release-Notes.aspx


----------



## Bryst

Quote:


> Originally Posted by *costilletas*
> 
> By weird flickering do you mean something like this? https://drive.google.com/open?id=0B7sVtMlN65eMeWY3N2lqb3NPcmM


That's what I've been experiencing with my Fury, mainly in WoW. It doesn't seem to happen when I play Doom, or I just can't notice it.


----------



## costilletas

@bluezone I've been experiencing this with previous drivers too, so I guess at least now they are aware of the issue.


----------



## bluezone

The new Crimson 16.8.2 has really cleaned up the output from the Metro 2033 benchmark. The output report used to contain all sorts of micro stutters; they're gone now. Same BIOS.

Before.


Spoiler: Warning: Spoiler!







Now.


Spoiler: Warning: Spoiler!







I've made a new optimized BIOS using the GPU throttle temperature feature, with good results. Note the temperature and fan speeds. Much quieter than before.

FireStrike Graphics Score: 17,357.

http://www.3dmark.com/3dm/14085620


----------



## xTesla1856

Well guys, the time has come: the Furys are sold, packed up, and awaiting a new home. As for the future, I am moving on to bigger and better things









Thanks again for the support and info I got in this thread !


----------



## bluezone

Quote:


> Originally Posted by *xTesla1856*
> 
> Well guys, the time has come: The Furys are sold and packed up and awaiting a new home. As for the future, I am moving on to bigger and better things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks again for the support and info I got in this thread !


Sorry to see you go. I always enjoyed your posts. Green or Red? I'd be interested to hear how whatever cards you use perform.


----------



## Krzych04650

Does the Sapphire Fury Nitro have HDMI 2.0, or 1.4a like the Fury X?


----------



## Alastair

Quote:


> Originally Posted by *Krzych04650*
> 
> Does Sapphire Fury Nitro have HDMI 2.0 or 1.4a like FuryX?


1.4a


----------



## bluedevil

Anyone know of a good dual link DVI to DP converter? I wanna hook up my QNIX 1440p PLS @ 96Hz or more... the Fury X has no DVI.


----------



## JackCY

Quote:


> Originally Posted by *bluedevil*
> 
> Anyone know of a good dual link DVI to DP converter? I wanna hook up my QNIX 1440p PLS @ 96Hz or more... the Fury X has no DVI.


Isn't it DP to dual link DVI rather?


----------



## bluedevil

Quote:


> Originally Posted by *JackCY*
> 
> Isn't it DP to dual link DVI rather?


Lol yeah that would be it


----------



## Alastair

People be selling Fury's and I be buying more!


----------



## Thoth420

I forgot those QNIX panels don't come with a DisplayPort input (lol)... I kinda wish they had put a DVI-D on the back of the Fury and Fury X as well. Pain in the arse for users of older panels...


----------



## xTesla1856

Quote:


> Originally Posted by *bluezone*
> 
> Sorry to see you go. I always enjoyed your posts. Green or Red? I'd be interested to hear how what ever cards you use perform.


I'm getting a stopgap 1080 until Vega and Volta launch. I hope to join team red again at that time.


----------



## bluezone

Anyone try No Man's Sky yet, or as I call it, Minecraft with planets?


----------



## Krzych04650

Quote:


> Originally Posted by *bluezone*
> 
> Anyone try No Man's Sky yet, or as I call it, Minecraft with planets?


I tried. It runs like crap on my Fury and i5-5675C at 4.2 GHz. I get 40-55 FPS no matter the settings or resolution. There are heavy freezes from time to time and stuttering basically all the time. Tearing is crazy even with V-sync. This is easily the worst-running game I have seen since playing some single-core-limited MMOs like Lord of the Rings Online, where you can drop to 35 FPS on a high-end system overclocked to its limits.

The new driver helped a bit, but performance is a complete mess, as is the game itself. So much hype around this game, and it failed so badly; I hope nobody actually paid for it and that people got their refunds immediately.


----------



## bluezone

Quote:


> Originally Posted by *Krzych04650*
> 
> I tried. It runs like crap on my Fury and i5-5675C at 4.2 GHz. I get 40-55 FPS no matter the settings or resolution. There are heavy freezes from time to time and stuttering basically all the time. Tearing is crazy even with V-sync. This is easily the worst-running game I have seen since playing some single-core-limited MMOs like Lord of the Rings Online, where you can drop to 35 FPS on a high-end system overclocked to its limits.


That's not confidence inspiring. Obviously they need to patch it. But it sounds like it's so irritating that you didn't even mention the gameplay. I guess I will be skipping it.

Thanks.

REP+1


----------



## Krzych04650

Quote:


> Originally Posted by *bluezone*
> 
> That's not confidence inspiring. Obviously they need to patch it. But it sounds like it's so irritating that you didn't even mention the gameplay. I guess I will be skipping it.
> 
> Thanks.
> 
> REP+1


The gameplay may not be the worst, but it is very repetitive and the planets are very similar; I don't see anyone playing this game for a longer period of time. It is an interesting game, but basically only because the vision of exploring space is so fascinating, and this game does not do it too well.

If you want to explore space, then buy Elite Dangerous; it is far better looking and runs buttery smooth at 3440x1440 maxed out.


----------



## bluezone

Quote:


> Originally Posted by *Krzych04650*
> 
> The gameplay may not be the worst, but it is very repetitive and the planets are very similar; I don't see anyone playing this game for a longer period of time. It is an interesting game, but basically only because the vision of exploring space is so fascinating, and this game does not do it too well.
> 
> If you want to explore space, then buy Elite Dangerous; it is far better looking and runs buttery smooth at 3440x1440 maxed out.


Cool I'll check that out.


----------



## bluezone

I've never looked at Elite Dangerous before, but I instantly recognized video of it. A couple of years back I was on a gaming discussion board where one of the members was posting video and pictures of assets he was working on, for a game he could not name.


----------



## Sonikku13

Gonna sell my Nano for an AIB 480, partly to turn a profit while sacrificing little performance.


----------



## Krzych04650

Quote:


> Originally Posted by *Sonikku13*
> 
> Gonna sell my Nano for an AIB 480. Partly to get a profit while sacrificing little performance.


An AIB 480 overclocked out of the box by ~50 MHz is already almost 20% slower than a Nano or Fury at reference clocks, and if the Fiji card also clocks well, the difference exceeds 20%. This is far from a little sacrifice.

I did a similar thing and sold a 980 Ti for a Fury, but I did it to finally be able to use FreeSync (which unfortunately turned out to be more of a gimmick than a game-changing feature) and to avoid the price drop after the Pascal release. I sold my MSI 980 Ti for 2050 ZL when a new one cost 2900; 10 days later the price cut came and a new one was available for 2200, so I avoided a huge value loss there. I then bought the Fury for 1500, so I lost 25% performance but paid 25% less; perf/price remained the same and I got 500 ZL in my pocket, and power draw also dropped proportionally since my Fury undervolts like crazy.

But the Nano has already been through various price cuts/rebates, just like the Fury and Fury X, while the RX 480 is a new, overpriced, and hardly available card; you may have a lot of trouble getting money proportional to the performance loss.
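A quick sketch of the perf/price arithmetic above (the ~25% performance gap is the poster's own estimate, with the 980 Ti normalized to 1.0):

```python
# Perf/price check for the swap described above: 980 Ti sold for 2050 ZL,
# Fury bought for 1500 ZL at roughly 25% lower performance.
old_perf, old_price = 1.00, 2050.0   # 980 Ti, normalized performance
new_perf, new_price = 0.75, 1500.0   # Fury, ~25% slower, 25% cheaper

old_ratio = old_perf / old_price     # performance per ZL before
new_ratio = new_perf / new_price     # performance per ZL after

print(new_ratio / old_ratio)         # ~1.025: perf/price essentially unchanged
print(old_price - new_price)         # 550.0: raw price difference from the swap
```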


----------



## Sonikku13

Quote:


> Originally Posted by *Krzych04650*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> Gonna sell my Nano for an AIB 480. Partly to get a profit while sacrificing little performance.
> 
> 
> 
> AIB 480 overclocked out of the box by ~50 MHz is already almost 20% slower than Nano or Fury on reference clocks, and if Fiji also gets good clocks then difference exceeds 20%. This far from little sacrifice.
> 
> I did similar thing and sold 980 Ti for Fury, but I did it to finally be able to utilize FreeSync (unfortunately turned out to be more of a gimmick then game changing feature) and to avoid price decreasement after Pascal release (I sold my MSI 980 Ti for 2050 ZL when new one was for 2900, and 10 days later price cut came and new one was available for 2200, so I avoided huge value loss here), and then I bought Fury for 1500, so I lost 25% performance but paid 25% less, so perf/price remained the same and I got 500 ZL in my pocket, and also power draw was proportionally decreased since my Fury undervolts like crazy.
> 
> But Nano is already after different price cuts/rebates and etc, just like Fury and FuryX, and RX 480 is new, overpriced and hardly available card, you may have a lot of trouble with getting the money proportional to performance loss.
Click to expand...

I paid $370 for my Nano.

I'm downgrading for four reasons. Debt, A10-7850K, VRAM and HDMI 2.0.


----------



## comagnum

Quote:


> Originally Posted by *Sonikku13*
> 
> I paid $370 for my Nano.
> 
> I'm downgrading for four reasons. Debt, A10-7850K, VRAM and HDMI 2.0.


I'll trade you my 480 for your nano







I've been contemplating trading/selling my Nano and CrossFiring my 480... but I'm not sure it's worth it.


----------



## Sonikku13

Quote:


> Originally Posted by *comagnum*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I paid $370 for my Nano.
> 
> I'm downgrading for four reasons. Debt, A10-7850K, VRAM and HDMI 2.0.
> 
> 
> 
> I'll trade you my 480 for your nano
> 
> 
> 
> 
> 
> 
> 
> I've been contemplating trading/selling my nano and crossfiring my 480.. But I'm not sure it's worth it.
Click to expand...

I am doing it to reduce my debt load... It's already on eBay at a starting bid of $299.99, and I pounced on the MSI 480 Gaming for $289.99. I'll accept a little markup... but not over $30 for every $250 of MSRP.


----------



## looncraz

Quote:


> Originally Posted by *Sonikku13*
> 
> I am doing it to reduce my debt load... It's already on eBay at a starting bid of $299.99 and I pounced on the MSI 480 Gaming for $289.99. I'll accept a lil markup... but not over $30 for every $250 MSRP....


You do realize you probably just lost money buying a slower card, right?

eBay fees will be about 10~15% of the final sale price, plus shipping costs (if applicable).

You will need to get >$350 for the Nano just to break even, assuming you paid no shipping or taxes for the RX 480.

While that's not too far-fetched, I've been watching several of them listed as low as $400 simply not sell at all. The Nano isn't a hotly sought-after GPU.
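The break-even math above can be sketched like this (the exact fee rate and shipping cost are assumptions within the 10-15% range mentioned, not eBay's actual fee schedule):

```python
def min_sale_price(target_net: float, fee_rate: float, shipping: float = 0.0) -> float:
    """Lowest sale price whose proceeds, after a percentage fee and
    flat shipping cost, still cover the target amount."""
    return (target_net + shipping) / (1.0 - fee_rate)

# Recouping the $289.99 RX 480 at an assumed 13% fee needs ~$333; at 15%
# fees plus an assumed $20 shipping it lands in the ~$365 range, which is
# roughly where the ">$350 to break even" figure comes from.
print(round(min_sale_price(289.99, 0.13), 2))
print(round(min_sale_price(289.99, 0.15, shipping=20.0), 2))
```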

Found your listing:

http://www.ebay.com/itm/ASUS-Radeon-R9-Nano-White-Edition-/222223899713


----------



## Sonikku13

Quote:


> Originally Posted by *looncraz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I am doing it to reduce my debt load... It's already on eBay at a starting bid of $299.99 and I pounced on the MSI 480 Gaming for $289.99. I'll accept a lil markup... but not over $30 for every $250 MSRP....
> 
> 
> 
> You do realize you probably just lost money buying a slower card, right?
> 
> eBay fees will be about 10~15% of the final sale price, plus shipping costs (if applicable).
> 
> You will need to get >$350 for the Nano just to break even, assuming you paid no shipping or taxes for the RX 480.
> 
> While that's not too far fetched, I've been watching several of them listed as low as $400 simply not sale at all. The Nano isn't a hotly sought-after GPU.
> 
> Found your listing:
> 
> http://www.ebay.com/itm/ASUS-Radeon-R9-Nano-White-Edition-/222223899713
Click to expand...

Eh... the 480 hashes better in ETH anyway. I get 26 MH/s with the Nano; the OCed 480 should surpass it.


----------



## looncraz

Okay, so I've now done some memory scaling testing.

In order to do this, I had to use driver 16.7.2, as 16.8.2 does not seem to actually set the memory frequency - meaning my RX 470 bug was, indeed, a driver bug persistent in 16.8.1 and 16.8.2 (though that card, sadly, had other problems).

The results are actually quite surprising - and are VERY telling. Someone told me last week that they got 10% more performance from a 10% memory overclock - I didn't believe them. I have to apologize, because that's exactly what I found:

Framerates:


Relative Score:


Relative Score with frequency charted to make it easier to see just how interesting of an observation this really is:


Yeah, that's right: between 2100MHz and 2225MHz, we are gaining more performance relative to the frequency increase of the memory. That's a dead giveaway of memory bottlenecking.

This is all the more important when you realize that we are only seeing 2.5 to 5% of the 15% IPC increase Polaris is supposed to deliver. All of these benchmarks were done with a core clock of 1250MHz and a +50% power limit (no throttling).

Memory errors began at 2150MHz at a rate of 0.1 errors/second. 2175MHz tripled that rate to 0.3 errors/second, but more performance was still seen - so I continued.

2200MHz had an error rate of 1/second, and 2225MHz shot right up to 72/second - but performance still increased.

2250MHz, the max overclock allowed, hit 300 errors per second, and the performance gains were almost nullified as a result.

Using automatic memory voltage was crazy dumb - Wattman simply maxes the voltage at any non-default setting. All of the memory speeds except 2225 and 2250 behaved the same regardless of this voltage setting. Maxing the setting (1175 on my card) gave me the above error rates.

---

*Takeaway:*

Polaris is being terribly bottlenecked by memory bandwidth; it's at least 15% slower than it would be with 15% more memory bandwidth. This bodes extremely well for Vega, which should not be starving for bandwidth.
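One way to read numbers like these is to divide each relative performance gain by the relative memory-clock gain; a ratio near 1.0 means the card is effectively bandwidth-bound in that interval. A minimal sketch with illustrative (not measured) numbers:

```python
def scaling_efficiency(freqs: list, scores: list) -> list:
    """Performance gain divided by clock gain between consecutive memory
    clocks. A value near 1.0 means performance scales ~linearly with
    memory frequency, i.e. the GPU is memory-bandwidth bound there."""
    eff = []
    for i in range(1, len(freqs)):
        clock_gain = freqs[i] / freqs[i - 1] - 1.0
        perf_gain = scores[i] / scores[i - 1] - 1.0
        eff.append(perf_gain / clock_gain)
    return eff

# Illustrative numbers only (not the actual bench results): the top
# interval shows ~10% more score for ~10% more MHz, i.e. a ratio near 1.0.
freqs = [2000, 2100, 2225]
scores = [10000, 10400, 11020]
print(scaling_efficiency(freqs, scores))
```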


----------



## Spartoi

Does a R9 Nano clock for clock match a Fury X?


----------



## looncraz

Quote:


> Originally Posted by *Spartoi*
> 
> Does a R9 Nano clock for clock match a Fury X?


It absolutely should. Same exact GPU and memory bandwidth. Frequency is really the only differentiator, other than the cooler and VRM.


----------



## MrKoala

Quote:


> Originally Posted by *looncraz*
> 
> Polaris is being terribly bottlenecked due to memory bandwidth, it's at least 15% slower than it would be with 15% more memory bandwidth. This bodes extremely well for Vega, which should not be starving for bandwidth.


Bandwidth or latency?

Fury (X) gets a big boost from VRAM OC as well, so bandwidth is probably not the problem there. AMD GPUs tend to have significantly worse caching than NV, and poor caching paired with high latency often leads to latency bottlenecks regardless of throughput.

If it's bandwidth Vega will be in the clear. If it's latency HBM with its longer clock cycles is even worse.


----------



## looncraz

Quote:


> Originally Posted by *MrKoala*
> 
> Bandwidth or latency?
> 
> Fury (X) gets a big boost from VRAM OC as well. Bandwidth is probably not a problem there. AMD GPUs tend to have significantly worse caching compared to NV, and bad caching paired with high latency often lead to latency bottlenecks regardless of throughput.
> 
> If it's bandwidth Vega will be in the clear. If it's latency HBM with its longer clock cycles is even worse.


Considering the similarity with Hawaii and the testing done on it, I suspect bandwidth is the greater factor. Latency will have its impact, of course, but GPUs are generally not sensitive to memory latency, and Polaris's larger caches should make it even less sensitive.


----------



## bluezone

Quote:


> Originally Posted by *looncraz*
> 
> Considering the similarity with Hawaii and testing done with that, I suspect bandwidth is the greater factor. Latency will have its impact, of course, but GPUs are generally not sensitive to memory latency - Polaris's larger caches should make it even less sensitive to latency.


This talk about bandwidth on GPUs made me decide to post this.
For some time now I've been setting my BIOS voltage at each DPM step to gain performance (throughput) on my Nano. I recently changed how I determine these voltage values: now I'm setting my VIDs based on the VRM (memory) current/power usage during benchmarking. I ended up gaining a little performance this way.

Does anyone else set their voltages this way?

EDIT: My reason for looking at the HBM amp/watt peak is that I believe this is where the highest throughput from the GPU happens.


----------



## Ne01 OnnA

Ne01 OnnA Presents:

*Best Driver Compilation MOD!*

Please install the 16.7.3 WHQL driver, and before you restart,
copy and overwrite this set of driver DLLs from 16.300.2201.0 July 7 WHQL!

Here, download the best-to-date DX9/10/12 & Mantle driver.

Copy here -> C:\Windows\System32 & C:\Windows\SysWOW64
+ You can also download and copy the Vulkan driver from 16.8.2 for the best performance!








Here are the 16.8.2 OGL/Vulkan DLLs ->

*All games working great!*
*Mantle* works for every title! BF4/H, DA:I, etc.

THX goes to Guru3D member PrMinisterGR for the AMD/ATI DLL driver repository.









This set is not only for Fiji! It also works with other GCN GPUs.









UPD: Sometimes one needs to run Windows in Safe Mode to do the overwrite.


----------



## BIGTom

Quote:


> Originally Posted by *bluezone*
> 
> This talk about bandwidth on GPU's made me decide post this.
> For some time now I've been setting my Bios voltage in each DMP step to gain performance (through put) on my Nano. I just recently changed how I was determining what these voltage values were. Now I'm setting my VIDs based on VRM (memory) current A/W usage during benchmarking. I ended up gaining a little performance this way.
> 
> Does anyone else set their voltages this way?
> 
> EDIT: MY reason for looking for looking at HBM Amp/Watt Peak is that I believe this is where the highest throughput from the GPU is happening.


BZ, can you explain further? Are you working toward a target A/W at each DPM, or something else? What kind of results are you seeing?

Thanks


----------



## StenioMoreira

Quote:


> Originally Posted by *bluedevil*
> 
> Anyone know of a good dual link DVI to DP converter? I wanna hook up my Qnix 1440P PLS @ 96hz or more...the Fury Xs no DVI..


I bought an 8 dollar DVI to DP adapter and it didn't work, because it's not a powered one. Look for one that has a USB cable on it to power it, or else it probably won't work. So when you search for one, go along the lines of "dvi to dp / usb powered".


----------



## sergiodb

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> Ne01 OnnA Presents:
> 
> *Best Driver Compilation MOD !*
> 
> Please Install 16.7.3 WHQL Driver and before you Restart,
> copy and Overwrite This set of Driver DLL's from 16.300.2201.0 July 7 WHQL !
> 
> Here download the Best to date DX9/10/12 & Mantle Driver
> 
> Copy here -> C:\Windows\System32 & C:\Windows\SysWOW64
> + You can Download and Copy Vulcan Driver from 16.8.2 ! for Best Performance
> 
> 
> 
> 
> 
> 
> 
> 
> Here 16.8.2 OGL/Vulcan dlls ->
> 
> *All Games working Great !*
> *Mantle* Work for every Title ! BF4/H, DA:I etc.
> 
> THX goes to -> Guru3D member PrMinisterGR for AMD/ATI DLL driver repository
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This set is not only for Fiji ! but working also with other GCN GPUs


I'll install it, thx


----------



## sydefekt

Hi all, is it worth selling an unlocked 3840 Fury to buy a Fury X? I can probably sell the 3840 Fury for the same price as an X, but I will lose some money on eBay fees. I'm also worried that initial reviews of the X show poor overclocks, whereas the Fury with fewer shaders has more reviews with better overclocks. So far my 3840 Fury can reach 1110 MHz core and 570 memory on a custom BIOS. Time Spy graphics score is 5110-ish.


----------



## looncraz

Quote:


> Originally Posted by *sydefekt*
> 
> Hi all, is it worth selling an unlocked 3840 Fury to buy a Fury X? I can probably sell the 3840 Fury for same price as X, but will lose some money on eBay fees. I'm also worried that initial reviews of the X show poor overclocks, whereas the Fury with less shaders have more reviews with better overclocks. So far my 3840 Fury can reach 1110 mhz and 570 memory on custom bios. Timespy graphics score is 5110ish.


I would say no.

http://www.anandtech.com/bench/product/1522?vs=1513

They are extremely close in performance - you would never notice a difference except in the actual scores you write into a spreadsheet.

The RX 480 versus my R9 290, however, is another story. My framerate is so high in BF4 that I am overshooting my 144Hz monitor at 1080p Ultra, so I need to underclock my card... my system is using ~340W while running 100+ FPS average with dips into the 80s in BF4, LOL! Compare that to upwards of 420W before, running ~90 FPS with dips into the 60s. I plan on doing some VSR testing later; BF4 looks amazing with good scaling. Maybe 4K VSR with no AA will be the way.


----------



## gupsterg

Quote:


> Originally Posted by *sydefekt*
> 
> Hi all, is it worth selling an unlocked 3840 Fury to buy a Fury X?


I had a Fury Tri-X unlocked to 3840 SP, and versus a genuine Fury X with 4096 SP they benched pretty much equal, other than run-to-run variance IMO. So not worth the swap IMO; I only swapped cards as I was losing no money doing so, plus the Fury X was not costing any more. The only bonus the AIO scores over the air cooler is that all heat from the GPU was exhausted out of the case in my setup.
Quote:


> Originally Posted by *MrKoala*
> 
> Fury (X) gets a big boost from VRAM OC as well.


Not noticed this in 3DM11 P or X, or 3DM FS or FS E; gaming benches again show very low gains from HBM OC, similar to the gains shown in this article. The only thing where I see a largish rise is the AIDA GPGPU memory copy result; read/write is pretty much the same between 500MHz and 545MHz.

May I ask where you are seeing a big boost from HBM OC?


----------



## bluedevil

Quote:


> Originally Posted by *StenioMoreira*
> 
> I bought an 8 dollar dvi to dp and it didn't work, and that's because it's not a powered one, look one one that has a usb cable on it, to power it or else it probably won't work. So when you search for one, go along the lines of " dvi to dp/ usb powered"


Yeah, I found this one, and by my calculations I think I can only get it up to about 82Hz.

https://www.amazon.com/gp/aw/d/B00856WJH8/ref=pd_aw_sim_sbs_23_1?ie=UTF8&psc=1&refRID=RD124W7W19DA80X1WBHT
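For reference, an ~82Hz figure falls out of simple pixel-clock arithmetic; the 330 MHz dual-link DVI limit and the CVT-RB blanking totals below are assumptions about this adapter, not numbers from its listing:

```python
# Max refresh rate ~= pixel clock / total pixels per frame (incl. blanking).
# 2560x1440 with CVT reduced blanking is roughly 2720x1481 total.
PIXEL_CLOCK_HZ = 330e6   # common dual-link DVI ceiling (assumption here)
H_TOTAL = 2720           # 2560 active + CVT-RB horizontal blanking
V_TOTAL = 1481           # 1440 active + vertical blanking (approx., varies with refresh)

max_refresh = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)
print(round(max_refresh, 1))   # lands right around the 82Hz estimate
```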


----------



## RatPatrol01

Quote:


> Originally Posted by *sydefekt*
> 
> Hi all, is it worth selling an unlocked 3840 Fury to buy a Fury X? I can probably sell the 3840 Fury for same price as X, but will lose some money on eBay fees. I'm also worried that initial reviews of the X show poor overclocks, whereas the Fury with less shaders have more reviews with better overclocks. So far my 3840 Fury can reach 1110 mhz and 570 memory on custom bios. Timespy graphics score is 5110ish.


Got my Nano at 1100 MHz under water doing ~5200 in Time Spy; considering such a small difference, I doubt you'd see much gain moving to a Fury X.


----------



## sydefekt

I guess I'll be keeping my 3840 Fury. Thanks for all the replies


----------



## RatPatrol01

Depending on your mobo, you could always snag a Fury X to crossfire with that Fury


----------



## bluezone

Quote:


> Originally Posted by *BIGTom*
> 
> BZ, can you explain further? Are you working around a target for A/W at each DPM or? What kind of results are you seeing?
> 
> Thanks


Sorry BT, I have to retract what I said earlier. When I first received my Nano there was about a 350-400 point advantage in FS scores from tuning per DPM. That was with old drivers. With current drivers there is not a significant difference in scores. Temps are 1-2°C lower, though, than when using an offset voltage.

I was running comparison benchmarks with current drivers and saw very little difference. I could not figure out why until I reinstalled older drivers. This is also what took me so long to reply. This is a dead end.

It does work for determining the correct VID voltage for overclock frequencies, but you could accomplish the same by simple benchmarking.
In my experience, finding the correct VID per DPM at lower frequencies is not so easily done.

My mistake.

EDIT: If you want to set an undervolt value in the BIOS so that a voltage offset is not necessary in overclocking software, this works OK.


----------



## Ne01 OnnA

OK, today I managed to do some OC tests on my Nitro-X.








Using my customised driver set: 16.7.3 WHQL + 16.300.2201.0 July 7 WHQL DX9/10/11/12/Mantle + Vulkan from 16.8.2

Here are the results:
The Phenom II is a no-go for it, but the 3D score is somewhat OK.

My BIOS is set to 1.212 V and has unlocked CUs + the HBM timing mod, with the HBM voltage modded to 1.319 V.
No power-limit increase was applied ->



3Dmark13:
FireStrike P 3.91GHz
10.700 - GPU: 15.399 PhY: 8145 \ 1050/570 -42mV 0% POW 247tW

FireStrike P 4.00GHz
11.117 - GPU: 15.544 PhY: 8476 \ 1050/570 0mV +8% POW 258tW
11.478 - GPU: 16.735 PhY: 8481 \ 1120/570 0mV +10% POW 268tW

FireStrike X 4.00GHz
7048 - GPU: 8077 PhY: 8491 \ 1125/570 0mV +12% POW 280tW
6943 - GPU: 8035 PhY: 8479 \ 1120/570 0mV +10% POW 275tW

Best FireStrike P 4.00GHz GPU score 1122MHz = 16.816

S.o.Mordor:
935/550 1.212v -48mV -14%POW 55%Fan -> 61 154 45 (170tW / HBM 10tW) - FPS CAP 61



Spoiler: Warning: Spoiler!



FS FHD P score:


FS Extreme Score:


Overhead API test:

OC 1120/570 0mV GPU on 1.212v +10% POW 270tW:



My Default 1050/570 -42mV 0% POW 245tW:





Yeah that's How we ROLL on Radeon Fiji


----------



## looncraz

Quote:


> Originally Posted by *bluezone*
> 
> Sorry BT I have to retract what I said earlier. When I first received My Nano there was about a 350-400 point advantage in FS scores with tuning Per DPM. That was with old drivers. With current drivers there is not a significant difference in scores. Temps are 1-2C lower though than using offset voltage.


I believe current drivers are actually completely broken in regards to memory overclocks. I tried my R9 290 with 16.8.2 and the memory overclocks don't work there, either.

I seem to be the only person that has actually noticed this, so I am going to file a report with AMD.

EDIT: In case you aren't up to date with my saga: memory clocks don't work with the RX 470 or RX 480 with 16.8.1 or 16.8.2, from my experience.

With the RX 470 I was unable to use a driver older than 16.8.1, so I came to suspect a faulty card, but then the card started messing up pretty bad anyway, so I RMA'd it and bought an RX 480 - with 16.7.2 I can overclock the memory fine, but there are other bugs. 16.8.2 has other problems (cursor corruption, for starters), but doesn't crash my games.

It looks like whoever fixed the memory overclocking range in the driver actually broke it.

To check, all you have to do is see if the screen flickers when you set the memory clocks - it should. If not, then the memory clock is not actually being engaged, despite all utilities saying otherwise. You can set any memory clock you want and GPU-Z will say you did it, but you will get no memory errors nor any improvement in speed.

UPDATE:

Created a thread: http://www.overclock.net/t/1609366/amd-16-8-1-16-8-2-memory-clocks-not-working


----------



## rdr09

Quote:


> Originally Posted by *looncraz*
> 
> I believe current drivers are actually completely broken in regards to memory overclocks. I tried my R9 290 with 16.8.2 and the memory overclocks don't work there, either.


My 290 oc's just fine and i'm using the latest driver. I did not accept Overdrive, though.


----------



## looncraz

Quote:


> Originally Posted by *rdr09*
> 
> My 290 oc's just fine and i'm using the latest driver. I did not accept Overdrive, though.


Are you sure you are actually seeing gains?

Set your memory clocks and see if the screen flickers, that seems to be the dead giveaway for me.

Also, I started a thread just for this problem:

http://www.overclock.net/t/1609366/amd-16-8-1-16-8-2-memory-clocks-not-working


----------



## bluedevil

Anyone got any vmod'd BIOSes for the Fury X yet? Just posting here while I look. Thanks.


----------



## gupsterg

The Fiji BIOS mod OP has info on how to modify GPU voltage via the ROM. The OP also has a section with ROMs that have an editable HBM voltage register in them.









----------



## jaggafeen

This is what I get with my Fury Nitro:

Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E331
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 20010001 / 00000000 [..x............x]
SE2 hw/sw: 00030001 / 00000000 [..............xx]
SE3 hw/sw: 00030001 / 00000000 [..............xx]
SE4 hw/sw: 00030001 / 00000000 [..............xx]
56 of 64 CUs are active. HW locks: 8 (R/O) / SW locks: 0 (R/W).
Sorry, all 8 disabled CUs can't be unlocked by BIOS replacement.


----------



## bluezone

HBM news.

http://arstechnica.com/gadgets/2016/08/hbm3-details-price-bandwidth/


----------



## Flamingo

New TriXX release with HBM OC support:

http://www.sapphiretech.com/catapage_tech.asp?cataid=291&lang=eng


----------



## pozzallo

Would like to join the club, running three XFX Fury X's.


----------



## bluedevil

Just wanted to let you guys know that Project: CLASSIFIED DEMON, featuring 2 Fury X's in CrossFire, is now complete; link in sig.


----------



## Krzych04650

Has anyone tried playing Obduction? Is your core clock also fluctuating this much? Mine is all over the place, from 750 to 1050 MHz, usually around 900-1000 with occasional peaks to the proper frequency.


----------



## pozzallo

I am running my system on an EVGA X99 Classified; great board and great support from EVGA. I made some changes to my system: I put the top GPU radiator in the rear case fan spot and replaced the H80i GT CPU cooler with an H100i mounted in the top of the case, and CPU temps are cooler now. Three Fury X's are a good upgrade from three R9 290X's, and the case is a lot cooler now; the three Fury X's run my games at 4K. I have my Fury X's overclocked to 1100 MHz core and 525 MHz memory with no change in voltage. TriXX 6.0.0 beta is out, but there is no voltage control for the Fury X for me. Also, XFX support is good: I asked a rep if they have a UEFI BIOS for the Fury X, and they said yes and emailed it to me. My Fury X's had the old legacy BIOS; now all three cards have the UEFI BIOS in the second BIOS position, and the first BIOS is still legacy.


----------



## LionS7

Quote:


> Originally Posted by *Krzych04650*
> 
> Anyone tried to play Obduction? Are your core clock also fluctuating so much? It is all around from 750 to 1050 MHz, usually around 900-1000 with some peaks to proper frequency.


Just turn off the power efficiency feature in the Crimson control panel. The core will stay at its max. It is the same in Dark Souls III, and this "fixes" it, but I think that if you don't need the extra performance, you should leave it on.


----------



## Krzych04650

Quote:


> Originally Posted by *LionS7*
> 
> Just turn off the power efficiency from Crimson control panel. The core will stay at its max. It is the same in Dark Souls III, and this "fix it", but I think that if you don't need the extra power, leave it on.


Ehh, I thought it was turned off already because I turned it off immediately after I bought the card, but it looks like it comes back on sometimes. I need to check it frequently, it seems. A 10 FPS gain, which is huge if you go from 45 to 55 FPS. Whoever created this option -


----------



## LionS7

You need to turn it off every time you install new drivers.

So, people - I think the negative scaling with high voltage is only in Fire Strike. I think the benchmark has a problem of some sort. Has anyone seen negative scaling anywhere else?


----------



## IsaacM

Hello gents. I recently purchased a Nano and was wondering what's the best method to OC it. I currently undervolt it -24 mV in Afterburner; it stays pegged at 1000/500 with temps ~75°C. Do you think it would be able to OC? Do you think it's worth repasting? I have some Thermal Grizzly Kryonaut paste I could use.


----------



## LionS7

Well, I think you can find out yourself. Start increasing the core to the point where performance starts dropping. Put the power limit to +50%. The memory will start to show red/blue dots if it is unstable and needs more voltage. If it runs steady at 1000/500 that's a good start, but test in games like The Witcher 3 if you want a good stability test.


----------



## jaggafeen

Hi guys, I need a BIOS for a Fury Nitro with a more aggressive fan curve that doesn't throttle the card's clocks. Please.


----------



## Krzych04650

Quote:


> Originally Posted by *jaggafeen*
> 
> Hi guys i need a bios for a fury nitro which will have a more aggressive fan curve and not throttle the card clocks. Please.


You can set a fan curve yourself in MSI Afterburner or Sapphire TriXX. Turn off Power Efficiency in the GPU settings panel (Games >> Global Settings).

The Fury Nitro shouldn't throttle its clocks; it has set temperature limits in its BIOSes (75°C on the default one and 80°C on the second one). The fans spin at minimal speed until the card reaches the temp limit and then start speeding up to hold that temp. It shouldn't throttle clocks unless you set a custom fan curve that is too slow to keep the card below the temp target.

Just turn off your PC and switch the BIOS to the second one with the button on the card, located on the left next to the ports. It has an 80°C temp limit; with a fairly good case and reasonable ambient temps the card should never reach that temp even at minimal fan speed. The fans spin at 30% until they reach the temp limit, if I remember correctly; with 35% speed I got 71°C at stock, so 30% may bump it up to 75-77°C and it should stay there.
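The behaviour described above (fans idle at a floor speed, then ramp to hold the temp target) can be sketched as a simple interpolated curve; the breakpoints below are illustrative assumptions, not the Nitro's actual BIOS fan table:

```python
def fan_percent(temp_c, curve=((50, 30), (75, 30), (80, 60), (90, 100))):
    """Linearly interpolate fan duty from a (temp_C, percent) curve.

    Below the first point the fan sits at its floor speed; between points
    it ramps linearly; above the last point it is pinned at max. The
    breakpoint values here are made up for illustration.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear ramp between the two surrounding breakpoints.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]
```

With these assumed points the fan holds 30% up to 75°C, then ramps toward 100% by 90°C, which mirrors the "floor speed until the temp target" behaviour described above.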


----------



## Alastair

So I sold my Fury that had two faulty CUs, which prevented it from unlocking to 3840, and purchased one that was guaranteed to unlock, at no loss to me (just shipping).


Spoiler: Pretty Pictures













Heaven at 1440p Ultimate with 4x AA in CrossFire goes from 90 to 93 FPS. Since I didn't lose anything, I am happy.


----------



## Minotaurtoo

Finally, I've ordered my Fury X... the price has dropped a bunch since I tried to get one last year : ) Anyway, I haven't had the time to read through everything... any good guidelines for overclocking these things yet? I'm not looking for monster overclocks, just a "what to expect" idea... last I read up on it, people barely got to 1150 on the core, if they could even get that far...

...Also, I have a 4K monitor (49"







) but I understand that the HDMI output is only capable of 30 Hz... are there any dependable DP to HDMI converters that can do full HDMI 2.0?
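That 30 Hz ceiling is a pixel-clock limit: HDMI 1.4 tops out at a 340 MHz TMDS clock, while 4K60 with the standard CEA-861 total timings of 4400x2250 needs 594 MHz, which is why an active DP 1.2 to HDMI 2.0 adapter is required. A quick sanity check of the arithmetic:

```python
def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock a video mode needs: total pixels per frame times refresh."""
    return h_total * v_total * refresh_hz

HDMI_1_4_MAX_HZ = 340_000_000  # HDMI 1.4 maximum TMDS clock

# CEA-861 total timings for 3840x2160 are 4400x2250 (including blanking).
assert pixel_clock_hz(4400, 2250, 30) == 297_000_000  # fits under HDMI 1.4
assert pixel_clock_hz(4400, 2250, 60) == 594_000_000  # needs HDMI 2.0
```

So 4K30 squeaks under the HDMI 1.4 limit at 297 MHz, while 4K60 needs nearly double the clock HDMI 1.4 can carry.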


----------



## josephimports

Quote:


> Originally Posted by *pozzallo*
> 
> would like to join club running three XFX fury x's


Looks very familiar.










Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *Alastair*
> 
> so I sold my Fury that had two faulty CUs which prevented it from going to 3840. and I purchased one that was guaranteed to unlock. For no loss to me. (just shipping)
> 
> Heaven at 1440P ultimate 4x AA crossfire goes from 90 to 93FPS. Since I didn't loose anything I am happy.


Very nice. Here are my results for the first of two used Tri-X that I recently acquired on CL.

Adapters detected: 1
Card #1 PCI ID: 1002:7300 - 174B:E329
DevID [7300] Rev [CB] (0), memory config: 0x00000000 (unused)
Fiji-class chip with 16 compute units per Shader Engine
SE1 hw/sw: 00030000 / 00000000 [..............xx]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 00030000 / 00000000 [..............xx]
SE4 hw/sw: 01010000 / 00000000 [.......x.......x]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.

stock rom = Firestrike G score 15236
unlocked stock rom= Firestrike G score 15610
unlocked Tri-X OC mod rom= Firestrike G score 16151


----------



## pozzallo

josephimports: Great system. Are those Furys or Fury X's?


----------



## josephimports

Quote:


> Originally Posted by *pozzallo*
> 
> josephimports Great system are those Fury's or Fury X's.


Thanks, they're X's.


----------



## Tgrove

Quote:


> Originally Posted by *Minotaurtoo*
> 
> finally, I've ordered my fury x... price has dropped a bunch since I tried to get one last year : ) anyway, I haven't the time to read through everything.... any good guidelines to overclocking these things yet... I'm not looking for monster overclocks, just a "what to expect" idea... last I read up on it people barely got to 1150 on core if they could even get that far...
> 
> ...Also, I have a 4 k monitor (49"
> 
> 
> 
> 
> 
> 
> 
> ) but I understand that the hdmi output is only capable of 30hz... is there any dependable DP to HDMI converters that can do the full HDMI 2.0?


https://www.amazon.com/Club3D-Displayport-1-2-HDMI-CAC-1070/dp/B017BQ8I54


----------



## Minotaurtoo

Quote:


> Originally Posted by *Tgrove*
> 
> https://www.amazon.com/Club3D-Displayport-1-2-HDMI-CAC-1070/dp/B017BQ8I54


Thanks, I just ordered it... maybe it will get here the same time as my card... the trouble with this monitor is that the only UHD 60 Hz ports on it are 3 HDMI 2.0 ports; everything else is DVI and VGA... no DisplayPort... it was a cheap monitor, after all.


----------



## madmanmarz

Anyone know if the bolt pattern spacing is the same on these cards as previous AMD cards? Been using my swiftech mcw80 for many generations and curious if it would work on fury.


----------



## Alastair

Quote:


> Originally Posted by *madmanmarz*
> 
> Anyone know if the bolt pattern spacing is the same on these cards as previous AMD cards? Been using my swiftech mcw80 for many generations and curious if it would work on fury.


I don't think so. These chips are massive, so I imagine the spacing is a bit wider.


----------



## madmanmarz

I see that the alphacool nexxxos GPX universal is compatible, but I noticed those blocks have slots that go in and out for different bolt patterns.
Quote:


> Originally Posted by *Alastair*
> 
> I dont think so. These chips are massive so I imagine it to be a bit wider.


After a lot of digging around: the spacing is wider than the usual 53.2mm. I read a post on reddit by EK saying it was 54mm, which is the same as the 7900 series, and Swiftech does sell a 54mm adapter plate for that. The base of the MCW80 is also flat, so it would cover all the parts, assuming they're all at the same height and the block is wide enough.

The Alphacool NexXxoS GPX base is also flat, and they show compatibility with the Fury (although their block uses slots that allow it to work with different bolt spacings). This makes me think the MCW80 will work with the 54mm adapter.

Oh well I'm gonna dive in then! Anyone have a pic of the PCB showing where the areas of concern are (vrm/mosfets etc)?


----------



## Ne01 OnnA

In the Fiji BIOS edit thread I've posted a modded Fury X BIOS with the timing mod etc., plus HBM to 1.325 V.
Check and see.


----------



## pozzallo

josephimports: Thanks, the Fury X's are great cards. I went with the X's instead of the air-cooled Fury because I had non-reference 290X's before, and the heat they produced was great in winter but too hot in the summer months.


----------



## madmanmarz

Quote:


> Originally Posted by *pozzallo*
> 
> josephimports. Thanks, the fury X's are great cards. I went with the X's instead of air cooled fury because had non reference 290X's before and the heat they produced was great in winter but to hot in the summer months.


Air cooled or water cooled doesn't make a difference to the heat in the room; what matters is how much power the card is using. You're still dissipating the same wattage, and unless your radiator is in another room it's not gonna make a difference. In fact, you're probably outputting more heat with water cooling due to the extra headroom, which usually results in higher clocks, voltage, power usage, etc.


----------



## Kana-Maru

Quote:


> Originally Posted by *madmanmarz*
> 
> Air cooled or water cooled doesn't make a difference in heat in the room, it's how much power the card is using. You're still dissipating the same wattage and unless your radiator etc is in another room it's not gonna make a difference. In fact you're probably outputting more heat with water cooling due to the extra headroom which usually results in higher clocks,voltage power usage etc.


There is a difference. The air cooler's fan reaches higher RPMs and simply dumps a ton of heat into the room, while the Fury X doesn't have to push the heat out at high RPMs to keep the GPU at a decent temperature. Yes, the radiator will get hot, no one is questioning that, but you can run your Fury X fan at a low RPM, which doesn't push out as much heat as the air cooler at high RPM.

I'm sure people are smart enough to turn their AC on and keep their house/room ambient at a decent temp as well. Compared to my previous air blowers, my room feels much cooler with my Fury X than it used to. Back when I ran SLI, you could really feel the heat blowing heavily from the GPUs.

I also undervolt my Fury X sometimes. In the winter months the Fury X runs so cool that heat output is almost non-existent, thanks to the constant cool ambient temps.


----------



## Minotaurtoo

There is that little principle that people forget too... the hotter a chip is, the more it leaks current... thus more heat... the cooler you can keep the chip, the less it leaks... I'm not really qualified to explain it properly, but in the push to get 5 GHz I learned that keeping the chip cooler actually reduced the power/voltage needed and thus reduces the heat... blah blah, maybe someone else can explain this better, but yes, it's possible that liquid cooling could result in overall lower heat output.


----------



## Ne01 OnnA

A system that uses WC has a more stable OC, and yes, the chip will produce the same amount of heat, but the heat dissipation from WC is greater than regular fans can achieve.


----------



## Drag0g0

I need help here. I got a Nano with -30 mV and +50% power limit and the fan at 100%, and still temps reach 77°C and it starts to throttle. Anything I can do? I did install a Silent Wings 2 fan on the Nano.


----------



## Drag0g0

The odd thing is that it starts throttling at 77°C, not 85°C, even with the power limit at +50%. Isn't that odd?


----------



## jaggafeen

Couldn't unlock any CUs, so I had a play at overclocking the Fury Nitro and managed 1161 core / 570 memory at +75 mV.

Scored 7598 in 3DMark Fire Strike Extreme







with a max temp of 63°C, using a 6700K CPU and Z170X Gaming 7 board.

An overclocked Fury X at 1134/500 MHz scores 7591 here: http://www.legitreviews.com/sapphire-radeon-r9-fury-tri-x-oc-video-card-review_169018/8

I just whacked the voltage up 75 mV and had a go; tomorrow I will try to keep the current clocks and lower the voltage.

ASIC quality is 66.4% in GPU-Z.


----------



## bluezone

Quote:


> Originally Posted by *Minotaurtoo*
> 
> There is that little principle that people forget too... the hotter a chip is the more resistance there is in it... thus more heat... the cooler you can keep the chip the lower the resistance in it... I"m not really qualified to explain it properly, but in the push to get 5ghz I learned that keeping the chip cooler actually reduced the power/voltage needed and thus reduces the heat... blah blah, maybe someone else can explain this better, but yes it's possible that liquid cooling could result in overall lower heat output.


There are two different things in effect.
For the copper trace circuitry on the board:

http://resources.schoolscience.co.uk/CDA/16plus/copelech2pg4.html

Copper circuit traces resist current flow more as they heat, creating more heat (BTUs). This is a smaller problem on a properly designed, implemented and used circuit (within its current-flow capacity). I am intimately experienced (the blackout of 2003) with this effect, as it causes power-line sag from heating.
The induced internet withdrawal that event would have caused today would be horrifying.









For silicon die(s) [GPU, VRM, control silicon, etc.]:

http://electronics.stackexchange.com/questions/13873/why-exactly-do-chips-start-malfunctioning-once-they-overheat

The simple version is that silicon component leakage increases as temperatures rise. The voltage control circuit's purpose is to supply the GPU with a consistent operating voltage at frequency. As a result of maintaining the circuit's forward voltage, the circuit's forward current (amperage) increases as a direct result of the reduced effective circuit resistance. The important part is W = V x A. To put this another way: at a constant voltage, current flow will increase as resistance decreases. The math is A = V/R, or properly expressed, I = E/R.
CPUs and GPUs are excellent space heaters, as energy in (W) maps very well to BTUs produced.
One other thing: maintaining voltage levels at higher current flows greatly heats the VRM as well.

The short version is "Keep it cool to reduce temperature-induced current flow".
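The W = V x A and I = E/R relationships above collapse into one line: at a fixed voltage, P = V²/R, so any temperature-driven drop in effective resistance shows up directly as extra watts. A toy example; the resistance values are made-up round numbers purely for illustration:

```python
def power_w(volts, ohms):
    """Power dissipated at a fixed voltage: P = V^2 / R (since I = E/R)."""
    return volts ** 2 / ohms

V_CORE = 1.2  # held constant by the VRM

# Illustrative effective load resistances only: as the die heats up and
# leakage rises, the chip looks like a lower resistance to the VRM.
cool_chip = power_w(V_CORE, 0.012)  # ~120 W
hot_chip = power_w(V_CORE, 0.010)   # ~144 W
```

Same voltage, lower effective resistance, 20% more power turned into heat, which is exactly why keeping the chip cool reduces the power it draws.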


----------



## pozzallo

I would like to thank everyone for their answers.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *bluezone*
> 
> The short version is "Keep it cool to reduce temperature induced current flow".







-> YES !!! YES !!!!


----------



## Minotaurtoo

Quote:


> Originally Posted by *pozzallo*
> 
> I would like to thank everyone for there answers


Well put... I couldn't for the life of me remember why it was that keeping it cool actually helped keep it cool, lol... but that explains it quite well...









edit: I just realized I quoted the wrong post, lol... oops. I meant to quote the one above the one I quoted.


----------



## LionS7

You should consider turning the fan of the Fury X to pull air from an intake case fan. Routing it may not be too easy, but the card stays very cool. I'm playing Rise of the Tomb Raider in a 25°C room. My card is using +30 mV (1.231 V typical), with HBM at 550 MHz, 1.36 V. The core can't go above 46-47°C at 45% fan, which is still quiet for me. In The Witcher 3 the core may reach around 49-50°C.

I will post images soon to show what I mean.


----------



## Gamedaz

* Just updated to the latest Crimson drivers for my R9 Fury GPU.

* I've been waiting all year for AMD to allow the GPU to be overclocked without having to use third-party OC software.

With previous versions I was unable to OC the GPU per game, as when I did, the system would crash.

* Now, with the power efficiency feature, I've overclocked Ryse: Son of Rome by 5% and the screen stuttering/tearing is all gone; it plays smoothly without any issues.

* I can see why AMD does not want us to OC their GPUs: inconsistent PSU tolerances (such as my SFX-600), which I understand can deviate from correct voltages by up to 5% (even though it's Gold rated); 5% voltage spikes mean more heat and stress on the GPU itself. The R9 Fury card has a huge heatsink and 3 large fans to cool it. I overclocked it at +3.5% core clock and +6.5% power limit. This has resulted in better image stability in Ryse, and temps never exceed 69°C in my Silverstone mini-ITX case.

* It's a really useful feature if you want to squeeze out 4-5 frames to hit 60 FPS on a 1080p display.


----------



## Drag0g0

Hi! I'm going to change the thermal paste on my R9 Nano and just wanted to ask if anyone has some tips for that. Is there anything special I should know?

And if someone has pictures or videos of this operation, feel free to link them here!

I'm having high temps on my Nano and this is the last thing I haven't yet tried; I did an undervolt, custom BIOS, Silent Wings 2, everything, no luck.


----------



## sydefekt

Quote:


> Originally Posted by *Drag0g0*
> 
> Hi! Im going change my thermal paste in my R9 Nano and i just wanted to ask if anyone got some tips for that? Is there anything special what i would wanna know?
> 
> And if someone got pictures this operation or videos feel free link here!
> 
> Im having high temps in my nano and this is last thing what i didint yet do, i did undervolt, custom bios, silent wing2, everything no luck.


When I changed my Sapphire Fury Tri-X's thermal paste there was a plastic film around the GPU and HBM. I removed the film and then cleaned the old thermal paste off. I'm not sure if the Nano will have this same film or not.

I then used the paste that came with my EK waterblock (Gelid). I put 1 small dot on each HBM stack and then a thin X-shaped line on the GPU itself.


----------



## Jflisk

Quote:


> Originally Posted by *sydefekt*
> 
> When I changed my Sapphire Fury TriX thermal paste there was a plastic film around the gpu and hbm. I removed the film and then cleaned the old thermal paste off. I'm not sure if Nano will have this same film or not.
> 
> I then used the paste that came with my EK waterblock (Gelid). I put 1 small dot on each HBM and then a thin X-shaped line on the GPU itself.


Quote:


> Originally Posted by *Drag0g0*
> 
> Hi! Im going change my thermal paste in my R9 Nano and i just wanted to ask if anyone got some tips for that? Is there anything special what i would wanna know?
> 
> And if someone got pictures this operation or videos feel free link here!
> 
> Im having high temps in my nano and this is last thing what i didint yet do, i did undervolt, custom bios, silent wing2, everything no luck.


In sydefekt's picture above there are gold membranes running between the HBM chips. These are the interposer traces; do not touch or break them. Just a heads up. Good luck.


----------



## Drag0g0

So if there is old thermal paste on the interposer, do I just leave it there?


----------



## Jflisk

Quote:


> Originally Posted by *Drag0g0*
> 
> So if there are old thermal paste interposter do i just leave it there?


I would suggest leaving it alone; if you damage the interposer you will have a brick.

Have a look at this YouTube video: the guy explains the removal of the cooler and installation of a water block, covering all the dos and don'ts, and he seems pretty knowledgeable.


----------



## Alastair

I must say, having worked with three Fiji cards now: I've removed the plastic film covering the interposer, and having looked carefully at it, I don't think it's as easy to break as everyone seems to think.

After looking at it for a while, it seems the interposer is covered by a glue or something, which reduces the risk of damage.

I'm not saying go ahead and run a screwdriver across it. But I don't think it's as fragile as people make it out to be. I cleaned all the old TIM off my interposers with a toothpick and a Q-tip, and all my cards are in shipshape.


----------



## bluezone

Well, I do not think the interposer traces can be harmed just by breathing on them, but there are at least two thread members who have had interposer damage related to TIM replacement. One member damaged his interposer whilst cleaning TIM after removing the protective film that Sapphire applies to Fury series cards; IIRC they were not gentle with it. The second Fury card had a cracked interposer, which happened while tightening the cooler back in place. That happened to a very experienced enthusiast exercising proper care, likely due to the very high tension of the stock retaining spring plate mounted on the back of the card.
I have also read elsewhere of at least 2 other cards being damaged by careless old-TIM removal methods.
The cracked interposer is the one I find most worrying.
I did not have any problems myself, but I do not think it is a cakewalk either.


----------



## Alastair

Well, I do think that caution should be applied, but if proper care is taken the risk is no greater than it would be with any other big-die card. I mean, you can crack a big Hawaii die or Maxwell die with too much pressure. So I think that if you are just careful, take your time and add tension to the corner screws evenly, you will be alright.


----------



## bluezone

Yes I definitely agree with your advice Alastair.

@Jflisk, the video you linked is good. The same YouTuber did a slightly (IMO) better video on water-blocking a pair of Nanos as well.






PLUS 1 REP for both of you for the shares.


----------



## bluezone

End of the month Driver.









http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.8.3-Release-Notes.aspx

Win 10 64

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Minotaurtoo

Yay, got my Fury X in... it actually surprised me at 4K... didn't think it would be that much better, if any, than my old Tahiti CrossFire setup... but it did quite well... mostly I wanted to get away from CrossFire again, since some of the games I play never got CrossFire support or it just completely dorked up the game...

Odd bit... I lost points on Fire Strike (regular) but gained on the 4K Fire Strike... weird.

Overclocking = not happening... bad clocker I guess... can't even get an extra 50 MHz out of it and pass the stability test... still tweaking, but from what I've read all Furys are pretty rotten overclockers compared to Tahiti... I had my 7950's clocked to 1200 MHz at 1.25 V. So far I haven't tried voltage increases or memory clocking...

I've read and read and read... so far I haven't found any decent guide on these... i.e. what to do, what not to do... but from what I've read, it looks like most people are stuck at stock or very near it.

Links / recommendations welcomed


----------



## Flamingo

Has anyone tried Battlefield 1? 16.8.3 vs older drivers?


----------



## Kana-Maru

Quote:


> Originally Posted by *Flamingo*
> 
> Has anyone tried Battlefield 1? 16.8.3 vs older drivers?


I'm downloading the BF1 Beta now. I'm still running Crimson 16.7.2.

I can try to benchmark both 16.7.2 and 16.8.3 and tell you the difference. I'll be using the DX12 API for sure.









Oh and I'm running a Fury X.


----------



## Alastair

Quote:


> Originally Posted by *Minotaurtoo*
> 
> yay got my fury x in ... actually surprised me at 4k... didn't think it would be that much better if any than my old tahiti cross fire setup... but it did quite well... mostly was wanting to get away from crossfire again since some of the games I play never got crossfire support or it just completely dorked up the game...
> 
> odd bit... I lost points on fire strike (regular) gained on the 4k fire strike though... wierd.
> 
> overclocking = not happening ... bad clocker I guess... can't even get an extra 50 mhz out of it and pass stability test... so far ... still tweaking, but from what I've read all fury's are pretty rotten overclockers compared to tahiti... I had my 7950's clocked to 1200mhz at 1.25v so far I haven't tried volt increases or mem clocking...
> 
> I've read and read and read... so far I haven't read any decent guide on these... ie what to do, what not to do... but from what I've read... it looks like most people are stuck at stock or very near.
> 
> links / recommendations welcomed


Not even 50MHz? Are you sure you aren't doing something wrong? I agree these chips aren't exactly the easiest to OC, but three of the cards I have owned could all manage well north of 1100MHz without any additional voltage. And I currently run 1100 at stock voltage in CrossFire.

To be honest, I don't think there is a guide out there yet, simply because a lot of us are still trying to figure out what makes these cards tick!


----------



## Minotaurtoo

Quote:


> Originally Posted by *Alastair*
> 
> Not even 50MHz? Are you sure you aren't doing something wrong? I agree these chips aren't exactly the easiest to OC, but three of the cards I have owned could all manage well north of 1100MHz without any additional voltage. And I currently run 1100 at stock voltage in CrossFire.
> 
> To be honest I don't think there is a guide out there yet simply because I think a lot of us are trying to figure out what makes these cards tick!


I'm using MSI Afterburner to attempt the OC... I was able to finally get 1100 at stock voltage to "work", but it wasn't fully stable... it only scored 92% on the Fire Strike stress test, and I think 97% is considered passing... either way it got a "didn't pass" lol. I'm OK with it if I got a terrible clocker; I've had more than my fair share of good-clocking GPUs over the years.

here's what I have tried so far:

- increasing fan speeds to keep it cooler

- increasing voltage (+96mV) attempting to get 1150 stable; I didn't notice the "negative" scaling with voltage increase some people complained about

- increasing the clock to 1100 with no voltage increase but with fan speeds greatly increased (this has actually been the best result)

None of these resulted in abject failure, but none passed the stability test... stock passes with flying colors with a score of 99.7%.
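For context on that pass/fail line: as I understand UL's documentation (treat this as a sketch, not their actual code), 3DMark's stress test reports Frame Rate Stability as the worst loop's average FPS divided by the best loop's, and the test passes at 97% or above:

```python
# Sketch of 3DMark's stress-test metric (my reading of UL's docs):
# Frame Rate Stability = lowest loop average FPS / highest loop
# average FPS, as a percentage. The test passes at >= 97%.
def frame_rate_stability(loop_fps):
    return min(loop_fps) / max(loop_fps) * 100.0

stable   = [44.1, 44.0, 43.9, 43.8]   # hypothetical per-loop averages
throttly = [44.1, 43.8, 42.0, 40.6]   # a card that throttles over time

for loops in (stable, throttly):
    s = frame_rate_stability(loops)
    print(f"{s:.1f}% -> {'PASS' if s >= 97.0 else 'FAIL'}")
```

So a 92% result means the slowest loop ran about 8% behind the fastest one, i.e. the card is clocking down (or erroring out) partway through the run.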


----------



## Alastair

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Not even 50MHz?you sure you aren't doing something wrong. I agree these chips aren't exactly the easiest to OC but three of the cards I have owned could all manage well North of 1100 without any additional voltage. And I currently run 1100 at stock in crossfire.
> 
> To be honest I don't think there is a guide out there yet simply because I think a lot of us are trying to figure out what makes these cards tick!
> 
> 
> 
> I'm using MSI Afterburner to attempt the OC... I was able to finally get 1100 at stock voltage to "work", but it wasn't fully stable... it only scored 92% on the Fire Strike stress test, and I think 97% is considered passing... either way it got a "didn't pass" lol. I'm OK with it if I got a terrible clocker; I've had more than my fair share of good-clocking GPUs over the years.
> 
> here's what I have tried so far:
> 
> - increasing fan speeds to keep it cooler
> 
> - increasing voltage (+96mV) attempting to get 1150 stable; I didn't notice the "negative" scaling with voltage increase some people complained about
> 
> - increasing the clock to 1100 with no voltage increase but with fan speeds greatly increased (this has actually been the best result)
> 
> None of these resulted in abject failure, but none passed the stability test... stock passes with flying colors with a score of 99.7%.

Maybe try Sapphire TriXX instead? TriXX works well for me. Or you could even try just using OverDrive in the driver?


----------



## JunkaDK

Help guys,

My PC just went NUTS! I'm getting this kind of flickering pixel pattern on the screen... it's there all the time, even in the BIOS. Is it my monitor / GPU? Something else? I tried switching cables, switching the PCIe slot, etc., and removing all OC...

Anyone seen this before?


----------



## JunkaDK

When I try to install AMD drivers it instantly crashes and I have to reboot. Also I can see that the screen is set at 64Hz and I can't change it!


----------






## Jflisk

It's either the monitor or the GPU by the looks of things. Do you have an LCD TV you can connect your computer to? See if you get the same result using the HDMI output, or try another monitor and see what you get. Good luck.


----------



## JunkaDK

Quote:


> Originally Posted by *Jflisk*
> 
> It's either the monitor or the GPU by the looks of things. Do you have an LCD TV you can connect your computer to? See if you get the same result using the HDMI output, or try another monitor and see what you get. Good luck.


Hooked it up to my TV with the same result... guess it's the GPU... craaaap







But at least I know it's the GPU then, I guess.


----------



## Jflisk

Quote:


> Originally Posted by *JunkaDK*
> 
> Hooked it up to my TV with the same result.. guess its the GPU .. craaaap
> 
> 
> 
> 
> 
> 
> 
> But at least i know its the GPU then i guess.


The only odd thing is that the bottom of the screen is still normal; the driver install completely failing is the dead giveaway. Process of elimination is pointing toward the GPU. You can try DDU to uninstall the drivers and reinstall them. Is the whole upper half of the screen like that while the bottom is normal? Do you see icons at all in the upper half, or just the orange?


----------



## Thoth420

JunkaDK, I definitely advise trying DDU and/or the guide on here by BradleyW (which covers manually doing essentially what DDU does) before giving up on the GPU. Also, how long have you had the card, out of curiosity?


----------



## bluezone

This may sound weird, but try totally powering down your PC, disconnecting and removing the card, and letting it sit for 15 minutes to discharge the capacitors on the GPU. Then, with the card still out, start the PC on an alternate graphics output and use DDU to uninstall the drivers. Then shut it down and reinstall the card and drivers.

I had problems with drivers before and this was the only thing that would fix it. For some reason the card (Nano) was stuck with bad settings and would not reset without doing this.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Alastair*
> 
> Maybe try Sapphire TriXX instead? TriXX works well for me. Or you could even try just using OverDrive in the driver?


OK, so I tried that and got the same results... I'm posting a couple of pics of the info in the BIOS... wondering if there is something there that might help.

What's up with those voltage entries... I know it isn't running 65 volts lol


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> This may sound weird, but try totally powering down your PC and disconnect and remove the card and let it sit for 15 minutes to discharge the capacitors on the GPU. Also with the card out start the PC with alternate graphics output use DDU to uninstall drivers. Then shut it down and reinstall the card and drivers.
> 
> I had problems with drivers before and this was the only thing that would fix it. For some reason the card (Nano) was stuck bad settings and would not reset without doing this.


Thanks for sharing the experience and the fix. I've bookmarked this page for future reference, which I hope I never need.


----------



## bluezone

Quote:


> Originally Posted by *Thoth420*
> 
> Thanks for info about experience and solution to it. I bookmarked this page for future reference I hope I never need.


You're welcome.

I've had to use this procedure a few times, such as when overclock or voltage settings became frozen, or when a test BIOS set the HBM voltage and would not reset with a new BIOS.

I don't know if it made any difference but I also removed and reinstalled all monitoring and overclocking software as well.

Cheers


----------



## Jflisk

Oops, thought you got it fixed.


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> You're welcome.
> 
> I've had to use this procedure a few times, such as when overclock or voltage settings became frozen, or when a test BIOS set the HBM voltage and would not reset with a new BIOS.
> 
> I don't know if it made any difference but I also removed and reinstalled all monitoring and overclocking software as well.
> 
> Cheers


I always remove all OC software completely, including profiles etc., before any GPU driver swap... always a smart thing to do, especially if it is set to run on startup.


----------



## JunkaDK

Quote:


> Originally Posted by *Thoth420*
> 
> JunkaDK I def advise either trying DDU and/or using the guide on here by BradleyW in regard to manually doing essentially the same as DDU does before giving up on the GPU. Also how long have you had the card out of curiosity?


I always remove drivers with DDU. The card is 3 days old. Since the problem also occurs in the BIOS, I can't see it being a driver issue. Going to borrow another card today just to rule out other hardware.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Minotaurtoo*
> 
> ok so I tried that and same results... I'm posting a couple picks of the info in the bios... wondering if there is something there that might help.
> 
> 
> what's up with those voltage entries.... I know it isn't running 65 volts lol


Nothing to worry about; that's the default voltage placeholder,
like here in this table ->


----------



## Drag0g0

I wanted to try a custom BIOS for my Nano to lower GPU temps. What should I download and where, or do I have to make it myself?


----------



## gupsterg

Quote:


> Originally Posted by *pozzallo*
> 
> Also XFX support is good asked rep. if they have a UEFI bios for Fury x and said yes and emailed to me.


Any chance of attaching the ROMs as a zip to a post? Just curious to view them.







.
Quote:


> Originally Posted by *jaggafeen*
> 
> Hi guys i need a bios for a fury nitro which will have a more aggressive fan curve and not throttle the card clocks. Please.


The OP of the Fiji BIOS mod thread has a tool which can mod your factory ROM. The Nitro comes with two differing ROMs, one per switch position; I wouldn't want to use the 80°C one, as it's not the throttling temperature that has increased but the temperature the cooling solution will try to maintain.



So you either use OS software like MSI AB to set a custom curve, or mod that value in the VBIOS.
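For illustration (my own sketch, not anything from the BIOS tool): a custom fan curve like the one MSI AB lets you draw is just piecewise-linear interpolation from temperature to fan duty. The points below are made-up examples, not Nitro defaults:

```python
# Illustrative fan curve: (temperature in C, fan duty in percent).
# These points are hypothetical, not factory values.
CURVE = [(40, 20), (60, 35), (75, 60), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty between curve points;
    clamp below the first point and above the last."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(70))  # between the 60C (35%) and 75C (60%) points
```

The VBIOS target-temperature value gupsterg mentions works differently: it sets the temperature the firmware fan controller steers toward, which is why raising it to 80°C makes the cooler *tolerate* 80°C rather than raising the throttle point.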
Quote:


> Originally Posted by *Drag0g0*
> 
> I wanted to try custom bios for my Nano to lower gpu temps, what and where i can download bios for that, or do i have to make it my self?


The Fiji BIOS mod thread, linked above, may help you.







.


----------



## octiny

Here's my newest ITX build.

NCASE M1
Maximus Impact VIII
6700K 4.75GHz/4.65GHz @ 1.376V w/ H75 AIO
32GB DDR4 G.Skill Ripjaws 3780MHz 16-17-17-36
Radeon Pro Duo 1060/520 @ -60mV
SX700-LPT SFX-L








At 4K resolution, the highest PEAK wattage "from the wall" is 650-720W during Crysis 3, which leaves more than enough room, and 575-650W in online 64-player BF4.

Max GPU temps are 61C/58C in Crysis 3; max stress-testing temp on the CPU is 71C.

Happy with how everything turned out!









http://www.3dmark.com/fs/9978632

The Radeon Pro Duo is a beast @ 4K.


----------



## Thoth420

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *octiny*
> 
> Here's my newest ITX build.
> 
> NCASE M1
> Maximus Impact VIII
> 6700K 4.75ghz/4.65ghz @ 1.376 w/H75 AIO
> 32GB DDR4 G. Skill Ripjaws 3780mhz 16-17-17-36
> Radeon Duo Pro 1060/520 @ -60mv
> SX700-LPT SFX-L
> 
> 
> 
> 
> 
> 
> 
> 
> At 4K resolution. Highest PEAK wattage "from the wall" is 650-720W during Crysis 3 which leaves more than enough room, and 575-650 in online 64P BF4.
> 
> Max GPU temps 61c/58c in Crysis 3, max stress testing temps on CPU 71C.
> 
> Happy with out everything turned out!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/9978632
> 
> Radeon Duo Pro is a beast @ 4K.






That is a lot of horsepower crammed in a very small package with very impressive temps!


----------



## Drag0g0

What is the default position of the BIOS switch on the Nano? Does it matter if you use the default or the other one?


----------



## chris89

Do you guys know how to disable the whole PowerPlay feature altogether? I'm looking through AtomBiosReader in the Master Data Table and see PowerPlayInfo... if I set it to 0, would it bypass all the limits so I can just focus on stabilizing clocks and voltages?

In AtomBiosReader, DPM 7 high is 1306, low is 1068... and the VDD base is 65288...

Thanks
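For what it's worth, that VDD base of 65288 isn't a voltage at all. As I understand it from the Fiji BIOS mod thread (treat this as a sketch, not gospel), PowerPlay voltage entries at or above 0xFF00 (65280) are EVV placeholders: the driver substitutes a per-chip fused voltage for the given index rather than reading the number as millivolts:

```python
# Sketch: decode a Fiji PowerPlay voltage entry. Values below
# 0xFF00 are plain millivolts; 0xFF00 + n means "EVV placeholder,
# leakage index n" (per-chip fused voltage, not a literal value).
# This interpretation is my reading of the BIOS mod thread.
EVV_FLAG = 0xFF00  # 65280

def decode_vddc(raw: int) -> str:
    if raw >= EVV_FLAG:
        return f"EVV leakage index {raw - EVV_FLAG}"
    return f"{raw} mV ({raw / 1000:.3f} V)"

print(decode_vddc(65288))  # the "VDD base" from the dump above
print(decode_vddc(1243))   # an ordinary millivolt entry
```

The same placeholder scheme is presumably why BIOS info screens show nonsense like "65 volts": the tool is printing the raw flagged value divided by 1000.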


----------



## Jflisk

Usually BIOS position 1 is the factory default and position 2 is where you put it to try a new BIOS. So 2 is for a BIOS change and 1 is the default BIOS. Always save the old BIOS before an update. If you're going to play with the BIOS, read here; this is the latest AMD factory BIOS for the Nano and Fury X:

https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware

This should explain everything.

Look up the ATIFlash directions; you can run it from an administrative command line. There are two tools in the one package, atiwinflash and atiflash; use the atiflash command.

Instructions: https://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

Download: https://www.techpowerup.com/downloads/2531/atiflash-2-71/

From an elevated command prompt (0 = first card, 1 = second card, and so on):
atiflash -p 0 biosname.bin

-s saves the BIOS
-f forces the flash - you should not need this

Any questions, feel free to ask.


----------



## Alastair

Quote:


> Originally Posted by *Drag0g0*
> 
> What is default position bios switch in Nano? Do it matter if you use default or other?


One BIOS normally has a higher power limit / temperature limit. The switch position closest to the display outputs is the default BIOS; the position away from the I/O panel is usually the higher-limit BIOS.


----------



## Kuivamaa

Assuming similar prices, would you go with a Fury non-X (Nitro or Tri-X) or a Nano? I would expect a 1000-1020MHz non-throttling Nano to match or slightly surpass a 1100MHz Fury. Is that the case in your experience? Are there nuances that prevent a Nano from keeping a steady 1000+ strictly on air?


----------



## NightAntilli

If you manage to keep a Nano from throttling, it should outperform a Fury. But to keep it from throttling, you have to put a huge cooler on it or go to water cooling.
Some Fury cards have CUs that can be unlocked, making them perform closer to a Fury X. Aside from that, they come with good cooling already...


----------



## Alastair

Quote:


> Originally Posted by *Kuivamaa*
> 
> Assuming similar price , would you go with Fury non-X (nitro or tri-x) or a nano? I would expect a 1000-1020MHz non-throttling nano to match or slightly surpass a 1100 fury. Is that the case from your experience ? Are there nuances that prevent nano from keeping a steady 1000+ strictly on air?


It would be more like a 1050MHz Nano vs. a 1100MHz Fury. The gap is that small. Go Tri-X and you might even be lucky enough to get one that unlocks.


----------



## Krzych04650

Quote:


> Originally Posted by *Kuivamaa*
> 
> Assuming similar price , would you go with Fury non-X (nitro or tri-x) or a nano? I would expect a 1000-1020MHz non-throttling nano to match or slightly surpass a 1100 fury. Is that the case from your experience ? Are there nuances that prevent nano from keeping a steady 1000+ strictly on air?


For me: if you can keep the Nano from power or thermal throttling and you are cooling it with water, go for the Nano over the Fury non-X if prices are similar, because it is a bit faster. For air cooling, only the Fury, because the Nano simply lacks proper cooling and its noise/temperature ratio is absolutely horrible; it is hard to imagine worse among modern GPUs, especially if you want to get everything out of the GPU and remove the power target limitations. The Fury, on the other hand, has some amazing coolers available, like the Fury Nitro: amazing temps with barely any noise, assuming you don't hear / aren't bothered by coil whine. The difference is huge: Sapphire Fury Nitro vs. Nano is like one of the best air coolers on the market vs. the worst, while the heat output is the same.


----------



## NightAntilli

Remember that the Fury Nitro can be found for $310 right now. Haven't seen any Nano matching that price.


----------



## bluezone

Quote:


> Originally Posted by *Krzych04650*
> 
> For me, if you can keep Nano from power or thermal throttling and you are cooling it with water, go for Nano over Fury non-X, if prices are similar, because it is a bit faster. For air cooling only Fury, because Nano simply lacks proper cooling and noise/temperature ratio is absolutely horrible, it is hard to imagine worse if we are talking about modern GPUs, especially if you want to take everything out of your GPU and remove power target limitations. Fury on the other hand has some amazing coolers available, like Fury Nitro, amazing temps with barely any noise, assuming that you don't hear/you are not bothered with coil whine. The difference is so huge that Sapphire Fury Nitro vs Nano is like the best air cooler on the market, or one of the best, vs the worst, while heat output is the same.


I think I would choose the non-X Fury over the Nano if you have the room. That said, I am running a Nano. Between good airflow and BIOS settings I maintain pretty good temperatures and noise levels on air, and more than decent performance.


----------



## Kuivamaa

Quote:


> Originally Posted by *NightAntilli*
> 
> Remember that the Fury Nitro can be found for $310 right now. Haven't seen any Nano matching that price.


In Finland they can be found within 10-20 euros of each other (the Nano being the cheaper one, as a matter of fact). Since I'm not putting a Nano under water, I'd get an X with the AIO in that case. Acoustics are a non-issue because I use headphones and my Nepton 280L with two JetFlos is so loud that it covers everything else. Throttling, however, is an issue. I use a custom Asus 290X (thermals are perfect) that clocks to 1100 with minimal additional voltage, and a throttling Nano might not be more than, say, 15% faster.


----------



## NightAntilli

In that case I'd definitely pick the Nano.


----------



## bluezone

I tripped across this video on YouTube. Casually Explained: Computers






LOL


----------



## mypickaxe

Quote:


> Originally Posted by *NightAntilli*
> 
> In that case I'd definitely pick the Nano.


Agreed.


----------



## utking

Hi guys!

New Fury owner here. Got a nice price, and I was sick of the temperatures and noise of my CF 7950s









But now to the point: did I get a golden sample or something? I'm unable to unlock the extra cores, but it overclocks really well!

It goes up to 1110/516 without touching the voltage.

Currently I'm running it at 1198/516 at +68mV.

Should I try to take it higher? I haven't tested long-term stability though, just ran it through 3DMark.


----------



## lestatdk

Quote:


> Originally Posted by *utking*
> 
> Hi guys!
> 
> New Fury owner here, got a nice price, and was sick of the temperatures and noise of my cf 7950
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But now to the point, did i get a golden sample or something? I'm unable to unlock the extra cores, but it overclocks really good!
> 
> It goes up to 1110/516 without touching voltage.
> 
> Currently i'm running it at 1198/516 at +68mv
> 
> Should i try to take it higher? Haven't tested long term stability though, just ran it through 3Dmark


Those are good numbers. Mine can do 1170/560 or so stable, but for daily use I run at 1155/530 just to be on the safe side.


----------



## LionS7

Quote:


> Originally Posted by *utking*
> 
> Hi guys!
> 
> New Fury owner here, got a nice price, and was sick of the temperatures and noise of my cf 7950
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But now to the point, did i get a golden sample or something? I'm unable to unlock the extra cores, but it overclocks really good!
> 
> It goes up to 1110/516 without touching voltage.
> 
> Currently i'm running it at 1198/516 at +68mv
> 
> Should i try to take it higher? Haven't tested long term stability though, just ran it through 3Dmark


Test in The Witcher 3 for 2-3 hours. 3DMark is no good.


----------



## NightAntilli

Unigine Heaven/Valley also works well for testing stability.


----------



## Minotaurtoo

Well... I finally got around to overclocking the memory... quickly ended up at 550 and passed the stability tests I threw at it... and I did manage a very small overclock on the core... a meager 25MHz... with no voltage added... but for some reason I can't get to 1100 without raising volts, and even with +96mV I can't get 1150 to stabilize... but the memory seems to be a clocking beast... haven't tried past 560 yet... a little scared of what it will do when it finally does go unstable.


----------



## utking

The Witcher 3 ran almost flawlessly for a couple of hours. I had to disable HBAO+ though; it crashed to desktop without any errors with it on.


----------



## costilletas

Can someone here with an R9 Fury play Overwatch and tell me what their GPU load and average FPS is? Also, do you experience any brutal frame dips? >(


----------



## NightAntilli

Does this video help? Not mine, simply found it on YouTube...


----------



## costilletas

I don't get why my GPU load can't stay at 100% like in the video. It's awful; I set everything to low, and when I'm alone in a custom match it stays at 100% load and 300 FPS, but when I play with other people the GPU load goes crazy and so do the FPS. It goes from 2xx to stuttering for a sec, then back to a normal 2xx...


----------



## Thoth420

Quote:


> Originally Posted by *NightAntilli*
> 
> Does this video help? Not mine, simply found it on YouTube...


Playing an online FPS on PC with a controller... anathema (and I own the $150 Xbox Elite for my PC, for third-person stealth and other things)...


----------



## NightAntilli

Quote:


> Originally Posted by *costilletas*
> 
> I don't get why my gpu load can't stay at 100% like the video's. IT's awful, i set everything to low and when i'm alone in a custom match it stays at 100% load and 300 fps but when i play with other people the gpu load goes crazy and so do the fps. it goes from 2xx to stuttering from a sec then back to normal 2xx...


What CPU do you have?


----------



## costilletas

2500K @ 4.6GHz, so nope, it's not the CPU. If I set the render scale to 150% it kinda fixes it, but it hovers around 180-220 FPS with some dips too, though less often than when setting it to 100%. So weird.


----------



## Thoth420

Quote:


> Originally Posted by *costilletas*
> 
> 2500k @ 4.6 so nope, it's not the cpu. If i set render scale to 150% it kinda fixes it, but hovers arounr 180-220 fps with some dips too, but less often than when setting it to 100%, so weird.


Have you tried changing the power setting for the game profile in Crimson? I don't have Overwatch (planning on it eventually, looks fun), so I'm shooting in the dark.


----------



## costilletas

It's not just OW; it happens with Heroes of the Storm too, and it used to happen in CS:GO but that's fixed now. It's very weird >(. Yes, high performance mode is on, and power saving and the FPS limit are off in Crimson. I've had these kinds of problems since I bought the GPU. No problems at all with games like The Witcher, but low-spec games struggle to maintain the frame rate.


----------



## Alastair

Quote:


> Originally Posted by *costilletas*
> 
> It's not just ow, it happens with heroes of the storm too, it used to happen in cs go to, now it fixed. It's very weird >(. Yes, high performance mode and power saving and fps limit off in Crimson. I've had these kind of problems since i bought the gpu. No problems at all with games like the witcher, but low spec games struggle to maintain the frame rate.


Have you tried using ClockBlocker to see if maybe that fixes your problem? Just an idea.


----------



## NightAntilli

Did you notice if the GPU is downclocking?


----------



## sergiodb

Hi, my Fury at 1100/570 gets 80 FPS in the Valley bench... post your results please.


----------



## Alastair

Quote:


> Originally Posted by *sergiodb*
> 
> hi,, my fury 1100, 570 get 80fps in valley bench,,, post your results please


What settings?


----------



## costilletas

Quote:


> Originally Posted by *Alastair*
> 
> have you tried using clock blocker to see if maybe that fixes your problem? is just an idea.


It's a GPU usage issue. It looks like there are a lot of people with the same problem in Overwatch, so it might be Blizzard's fault.


----------



## prom

I recently put together a new ITX rig (6600K, Nano, 16GB DDR4, Win10, SF600), and for the most part it's been pretty smooth.
The card is a Sapphire Nano running the latest 16.9.1 driver, with AMD's latest BIOS.
I WAS running it with MSI AB at -24mV, +50% power, a custom fan curve, and 1000MHz.
I have since reverted to stock voltage until I've got a good solution/answer.

However, I've noticed that in some games I've been experiencing some very specific model corruption.

Of all the games I have on this machine, it only seems to happen (as far as I can tell) in CSGO (Source), and Mechwarrior Online (CryEngine).
Both get the same style of bug, where I will get ONE 3d 'spike' artifact out of a model. It can be a static or moving model, or part of the landscape.

*EXAMPLE:* http://i.imgur.com/qospxIS.jpg

These 'spikes' stay with the model until the map/match is over, or I reload the map.

I have no voltage spikes, no power throttling, and temperatures are generally a cool 70 degrees.
It doesn't happen all the time, and it doesn't happen in any other games as far as I can tell.
I can play something GPU intensive like Hitman or GTA5 for hours with no issues.

I've run 3d Mark benches AND stress tests with no trouble.
I don't get any issues until I go well over 4GB of VRAM with Fire Strike Ultra and a custom max tessellation/AA profile.
Power throttling also kicks in during the stress tests, but still with no corruption.

I've run both Heaven & Valley benches extensively with no issues whatsoever @ both 1080 and 1440p maxed.
I've also tried older drivers with no noticeable changes.

Here is my Aida64 info:

Code:


------[ PowerPlay7 BIOS Info ]------

Max GPU Clock      = 2000 MHz
Max Memory Clock   = 500 MHz
PowerControl Limit = 50%
SCLK DPM0 =  300 MHz
SCLK DPM1 =  508 MHz
SCLK DPM2 =  717 MHz
SCLK DPM3 =  874 MHz
SCLK DPM4 =  911 MHz
SCLK DPM5 =  944 MHz
SCLK DPM6 =  974 MHz
SCLK DPM7 = 1000 MHz
MCLK DPM0 =  500 MHz  (VDDCI: 1.00000 V)

------[ ADL GPU Info ]------

Part Number  = 113-C8820200-107
BIOS Version = 015.049.000.012
BIOS Date    = 2016/02/12 09:17
Memory Type  = HBM
GPU Clock    = 974 MHz
Memory Clock = 500 MHz
VDDC         = 0 mV
DPM State    = 0
GPU Usage    = 6 %

------[ ADL PStates List ]------

State #0: GPUClock =  300 MHz, MemClock =  500 MHz, VID = 0.000 V
State #1: GPUClock = 1000 MHz, MemClock =  500 MHz, VID = 0.000 V

------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  508 MHz, VID = 0.95000 V
DPM2: GPUClock =  717 MHz, VID = 0.95600 V
DPM3: GPUClock =  874 MHz, VID = 1.08700 V
DPM4: GPUClock =  911 MHz, VID = 1.12500 V
DPM5: GPUClock =  944 MHz, VID = 1.16800 V
DPM6: GPUClock =  974 MHz, VID = 1.20600 V
DPM7: GPUClock = 1000 MHz, VID = 1.24300 V

------[ ATIDriver Calls ]------

ATIDriver Performance Switching: Not Supported

ATIDriver MultiVPU: Not Supported
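(Side note, illustrative only: the GPU PStates list above is just a clock-to-voltage ladder, and a flat undervolt like the -24mV I was running roughly shifts every VID down by the same offset. The numbers below are copied from my dump; how Afterburner applies the offset per state is my assumption.)

```python
# GPU PStates ladder from the Aida64 dump above: (state, MHz, VID in V).
DPM_TABLE = [
    (0, 300, 0.900), (1, 508, 0.950), (2, 717, 0.956), (3, 874, 1.087),
    (4, 911, 1.125), (5, 944, 1.168), (6, 974, 1.206), (7, 1000, 1.243),
]

def apply_offset(table, offset_mv):
    """Return the ladder with a uniform voltage offset (millivolts),
    as a flat undervolt in an OC tool is assumed to apply it."""
    return [(s, mhz, round(vid + offset_mv / 1000, 3))
            for s, mhz, vid in table]

for state, mhz, vid in apply_offset(DPM_TABLE, -24):
    print(f"DPM{state}: {mhz} MHz @ {vid:.3f} V")
```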

I've lurked this community for a long while, and the fact that you guys also appear to be the most active Fiji (non-mining) community is just an added bonus









I know this is a pretty hefty first post, but any thoughts/input would be greatly appreciated. The card IS still within its RMA window (I only got it a couple of months ago), so I _could_ do that if it's really necessary.
I got the card new for a really great deal, so it'd be a shame if it ends up being bunk









*EDIT*:
As of posting this, I've started getting artifacts in Heaven at stock clocks








I've since reduced the max power to 30% instead of 50. Heaven is not bugging out anymore.
This machine only has a 600W PSU, so I'm wondering if the power draw had been spiking or something...

Signs are starting to point to RMA


----------



## pozzallo

gupsterg here is the Zip file for UEFI Bios for my XFX Fury X . File is attached

UEFIBIOSFORFURYX.zip 104k .zip file
 sorry it took so long have been busy


----------



## mypickaxe

Quote:


> Originally Posted by *prom*
> 
> I recently put together a new ITX rig (6600k, Nano, 16gb ddr4, win10, SF600), and for the most part it's been pretty smooth.
> The card is a Sapphire Nano running the latest 16.9.1 driver, with AMDs latest bios.
> I WAS running it using MSI AB with -24mv, +50% power, custom fan curve, & 1000mhz.
> I have since reverted to stock voltage until I've got a good solution/answer.
> 
> However, I've noticed that in some games I've been experiencing some very specific model corruption.
> 
> Of all the games I have on this machine, it only seems to happen (as far as I can tell) in CSGO (Source), and Mechwarrior Online (CryEngine).
> Both get the same style of bug, where I will get ONE 3d 'spike' artifact out of a model. It can be a static or moving model, or part of the landscape.
> 
> *EXAMPLE:* http://i.imgur.com/qospxIS.jpg
> 
> These 'spikes' stay with the model until the map/match is over, or I reload the map.
> 
> I have no voltage spikes, no power throttling, and temperatures are generally a cool 70 degrees.
> It doesn't happen all the time, and it doesn't happen in any other games as far as I can tell.
> I can play something GPU intensive like Hitman or GTA5 for hours with no issues.
> 
> I've run 3d Mark benches AND stress tests with no trouble.
> I don't get any issues until I go well over 4gb VRAM with FireStrike Ultra, with a custom max tessellation/AA profile.
> Power throttling also kicks in during the stress tests, but still with no corruption.
> 
> I've run both Heaven & Valley benches extensively with no issues whatsoever @ both 1080 and 1440p maxed.
> I've also tried older drivers with no noticeable changes.
> 
> Here is my Aida64 info:
> 
> Code:
> 
> 
> ------[ PowerPlay7 BIOS Info ]------
> 
> Max GPU Clock      = 2000 MHz
> Max Memory Clock   = 500 MHz
> PowerControl Limit = 50%
> SCLK DPM0 =  300 MHz
> SCLK DPM1 =  508 MHz
> SCLK DPM2 =  717 MHz
> SCLK DPM3 =  874 MHz
> SCLK DPM4 =  911 MHz
> SCLK DPM5 =  944 MHz
> SCLK DPM6 =  974 MHz
> SCLK DPM7 = 1000 MHz
> MCLK DPM0 =  500 MHz  (VDDCI: 1.00000 V)
> 
> ------[ ADL GPU Info ]------
> 
> Part Number  = 113-C8820200-107
> BIOS Version = 015.049.000.012
> BIOS Date    = 2016/02/12 09:17
> Memory Type  = HBM
> GPU Clock    = 974 MHz
> Memory Clock = 500 MHz
> VDDC         = 0 mV
> DPM State    = 0
> GPU Usage    = 6 %
> 
> ------[ ADL PStates List ]------
> 
> State #0: GPUClock =  300 MHz, MemClock =  500 MHz, VID = 0.000 V
> State #1: GPUClock = 1000 MHz, MemClock =  500 MHz, VID = 0.000 V
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.90000 V
> DPM1: GPUClock =  508 MHz, VID = 0.95000 V
> DPM2: GPUClock =  717 MHz, VID = 0.95600 V
> DPM3: GPUClock =  874 MHz, VID = 1.08700 V
> DPM4: GPUClock =  911 MHz, VID = 1.12500 V
> DPM5: GPUClock =  944 MHz, VID = 1.16800 V
> DPM6: GPUClock =  974 MHz, VID = 1.20600 V
> DPM7: GPUClock = 1000 MHz, VID = 1.24300 V
> 
> ------[ ATIDriver Calls ]------
> 
> ATIDriver Performance Switching: Not Supported
> 
> ATIDriver MultiVPU: Not Supported
> 
> I've lurked this community for a long while, and that you guys also appear to be the most active Fiji (non-mining) community is just an added bonus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know this is a pretty hefty first post, but any thoughts/input would be greatly appreciated. The card IS still under warranty (I only got it a couple of months ago), so I _could_ RMA it if it's really necessary.
> I got the card new for a really great deal, so it'd be a shame if it ends up being a dud
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *EDIT*:
> As of posting this, I've started getting artifacts in Heaven at stock clocks
> 
> 
> 
> 
> 
> 
> 
> 
> I've since reduced the max power to 30% instead of 50. Heaven is not bugging out anymore.
> This machine only has a 600w PSU, so I'm wondering if power had been spiking or something...
> 
> Signs are starting to point to RMA


What is the make/model of your PSU? They do recommend a 750W PSU for the Nano, but if it's a high-quality unit with the required amperage on the 12V rail and low ripple, it should be fine at 600W. That said, pushing the power limit to 50% is, well, pushing it, especially if you're overclocking both the CPU and GPU.
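As a rough illustration of why those DPM VIDs in the AIDA64 dump above matter for power draw: CMOS dynamic power scales roughly with frequency times voltage squared, so the jump from DPM0 (300 MHz @ 0.900 V) to DPM7 (1000 MHz @ 1.243 V) is much bigger than the clock ratio alone suggests. This is a back-of-the-envelope sketch using the dump's values, not a measurement:

```python
# Back-of-the-envelope: CMOS dynamic power scales ~ f * V^2, so the DPM
# states from the AIDA64 dump above imply large power differences.
# Illustration only, not a measurement of the actual card.

DPM_STATES = {  # DPM level: (GPU clock in MHz, VID in volts)
    0: (300, 0.900),
    1: (508, 0.950),
    2: (717, 0.956),
    3: (874, 1.087),
    4: (911, 1.125),
    5: (944, 1.168),
    6: (974, 1.206),
    7: (1000, 1.243),
}

def relative_dynamic_power(level: int) -> float:
    """Dynamic power of a DPM level relative to DPM7 (P ~ f * V^2)."""
    f, v = DPM_STATES[level]
    f7, v7 = DPM_STATES[7]
    return (f * v * v) / (f7 * v7 * v7)

for lvl in sorted(DPM_STATES):
    print(f"DPM{lvl}: {relative_dynamic_power(lvl):.2f}x of DPM7")
```

By this estimate DPM0 draws on the order of 15% of DPM7's dynamic power, which is why idle clocks matter so much for power and heat.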


----------



## prom

Quote:


> Originally Posted by *mypickaxe*
> 
> What is the make/model of your PSU? They do recommend a 750W PSU for the Nano, but if it's a high-quality unit with the required amperage on the 12V rail and low ripple, it should be fine at 600W. That said, pushing the power limit to 50% is, well, pushing it, especially if you're overclocking both the CPU and GPU.


It's a Corsair SF600. My CPU hasn't been overclocked as I'm clearly working out all the kinks









I'm going to uninstall AfterBurner, run DDU again and reinstall the latest drivers.
I want to check for any stability changes using Trixx instead, as some folks have reported instability with MSI.


----------



## StenioMoreira

Can I apply Coollaboratory Liquid Ultra or something similar on Fiji GPUs? I want to replace the thermal paste on my Fury X and was wondering if Coollaboratory Liquid Ultra would cause issues over the HBM


----------



## prom

Removed MSI Afterburner / Rivatuner, and did a DDU driver cycle.
Fresh driver install paired with Sapphire's Trixx.

I've already noticed a difference, though more testing will determine if it's permanent.
With MSI, benches would crash out if I tried to undervolt anything more than 30mv (+50% power).
With Trixx I've been alternating loops of Valley & Heaven for the last hour with -42mv (+30% power).

I am cautiously optimistic


----------



## octiny

Quote:


> Originally Posted by *prom*
> 
> Removed MSI Afterburner / Rivatuner, and did a DDU driver cycle.
> Fresh driver install paired with Sapphire's Trixx.
> 
> I've already noticed a difference, though more testing will determine if it's permanent.
> With MSI, benches would crash out if I tried to undervolt anything more than 30mv (+50% power).
> With Trixx I've been alternating loops of Valley & Heaven for the last hour with -42mv (+30% power).
> 
> I am cautiously optimistic


You are completely fine with that PSU.

Without any overclocking I pull 450-540W from the WALL with my Radeon Pro Duo (specs in sig). With an undervolt of -60, +50 PL, and 1050MHz clocks I pull 575-720W @ 4K from the wall in the most demanding games at max settings, which is more than fine once you take PSU efficiency into account.

Good to hear you got things fixed
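To put numbers on that efficiency point: a wall-power meter reads what the PSU pulls from the outlet, while the components only see the DC side, which is the wall reading multiplied by the PSU's conversion efficiency. A quick sketch (the 90% figure is an assumed efficiency for illustration, not a measured one):

```python
def dc_load(wall_watts: float, efficiency: float) -> float:
    """DC load the PSU actually delivers, given a wall-power reading.

    efficiency is the PSU's conversion efficiency at that load
    (roughly 0.90 for an 80 Plus Gold unit at mid load; an assumed
    value here, not a measurement).
    """
    return wall_watts * efficiency

# A 720 W wall reading at ~90% efficiency is only ~648 W of DC load,
# comfortably inside a quality 750 W unit's rating.
print(dc_load(720, 0.90))
```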


----------



## prom

Nope nope nope, nothing's fixed








Shortly after posting that, Trixx started to BSOD with a THREAD_STUCK_IN_DEVICE_DRIVER error, and I still got an artifact in-game.

I've resorted to nixing any power limit increase in hopes of seeing SOME stability


----------



## bluezone

Quote:


> Originally Posted by *prom*
> 
> Nope nope nope, nothing's fixed
> 
> 
> 
> 
> 
> 
> 
> 
> Shortly after posting that, Trixx started to BSOD with a THREAD_STUCK_IN_DEVICE_DRIVER error, and I still got an artifact in-game.
> 
> I've resorted to nixing any power limit increase in hopes of seeing SOME stability


What sort of clocks are you running during CSGO (Source) and MechWarrior Online (CryEngine)? Meaning, are your clocks rock solid? Monitor them with HWiNFO64 with "Polling Frequency" set to "100", the fastest polling rate available.
Secondly, do the artifacts appear with less of an undervolt?


----------



## prom

Quote:


> Originally Posted by *bluezone*
> 
> What sort of clocks are you running during CSGO (Source) and MechWarrior Online (CryEngine)? Meaning, are your clocks rock solid? Monitor them with HWiNFO64 with "Polling Frequency" set to "100", the fastest polling rate available.
> Secondly, do the artifacts appear with less of an undervolt?


Clocks are always solid @ 1000MHz.
Artifacts appear at any given voltage (stock & below). I'm going to overvolt a little and see if anything changes.

The odd thing is that whenever I feel like I've got something stable (say a 30min loop of whatever bench), I go try CSGO or MWO and get an artifact within an hour








Conversely, I could be playing something like GTA5 or Hitman, where GPU _and_ memory usage is pushed really hard with no ill effects.


----------



## bluezone

Quote:


> Originally Posted by *prom*
> 
> Clocks are always solid @ 1000MHz.
> Artifacts appear at any given voltage (stock & below). I'm going to overvolt a little and see if anything changes.
> 
> The odd thing is that whenever I feel like I've got something stable (say a 30min loop of whatever bench), I go try CSGO or MWO and get an artifact within an hour
> 
> 
> 
> 
> 
> 
> 
> 
> Conversely, I could be playing something like GTA5 or Hitman, where GPU _and_ memory usage is pushed really hard with no ill effects.


I've run into artifacting like yours. If one of the per-DPM voltages is lower than needed, problems will arise. Since you're solid @ 1000 MHz, the likely culprit is the DPM7 VID being a little low. The only problem is you are already at 1.243V stock voltage, so heat might become an issue. You have very nice temperatures so far, but it doesn't take much voltage to start ramping those temperatures up.

How difficult is an RMA?


----------



## prom

Unless something has changed in the last few years, sending Sapphire RMAs out of Canada has always been a pain.
I do have a question though...

How come my Aida64 dumps aren't reporting the full GPU clock like every other one I see?
Is that a normal thing? I have no idea.


----------



## Alastair

Quote:


> Originally Posted by *StenioMoreira*
> 
> Can I apply Coollaboratory Liquid Ultra or something similar on Fiji GPUs? I want to replace the thermal paste on my Fury X and was wondering if Coollaboratory Liquid Ultra would cause issues over the HBM


I would not recommend it. Maybe if the whole die were uniform, but due to the HBM stacks and the gaps in the interposer I would personally feel uncomfortable doing it, especially when trying to take the heatsink off come cleaning time.


----------



## waltercaorle

Hi, is there a waterblock for the Fury Nitro?


----------



## Thoth420

Kung Fury X with its new mobo (also swapped the RAM for white low-profile sticks and changed the coolant dye color). Better pics incoming once she is set up.


----------



## LionS7

Quote:


> Originally Posted by *prom*
> 
> Unless something has changed in the last few years, sending Sapphire RMAs out of Canada has always been a pain.
> I do have a question though...
> 
> How come my Aida64 dumps aren't reporting the full GPU clock like every other one I see?
> Is that a normal thing? I have no idea.


Did you try the latest official firmware from AMD for the R9 Nano?
https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware

I was having black screens with "no video input" crashes all the time on my old BIOS, even though the BIOS number was the same, but the latest one is very stable. ...and lol, I'm seeing that high a VID (1.243 V) for the first time.


----------



## prom

Quote:


> Originally Posted by *LionS7*
> 
> Did you try the latest official firmware from AMD for the R9 Nano?
> https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware
> 
> I was having black screens with "no video input" crashes all the time on my old BIOS, even though the BIOS number was the same, but the latest one is very stable. ...and lol, I'm seeing that high a VID (1.243 V) for the first time.


Yep, I did the bios change before I did anything else.

Currently running +50% to lock it at 1000MHz, but with +30(!)mV.
So far so good, but my current track record for posting something positive usually means it'll fail once I post


----------



## Minotaurtoo

Well, I found an updated BIOS for my card and modded it slightly, of course, and now I can get 1100MHz at stock volts on my Fury X. I kinda had an idea it might be BIOS- or driver-related, because I wasn't getting artifacts, just plain crashes to desktop from stress tests with driver failure error messages. I clean-installed the driver and it did no good; then I read somewhere that the BIOS had caused trouble sometimes, and now it seems to have been my problem. I'm still working to get higher clocks, but at least this is a positive move. Oh, and with this BIOS I actually got a slightly better graphics score in 3DMark even at stock; a very small increase, but consistently there. Now to see if the negative scaling with voltage increase is still present like before. Soon I'll try using Afterburner to increase voltage and try for higher clocks.


----------



## LionS7

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Well, I found an updated BIOS for my card and modded it slightly, of course, and now I can get 1100MHz at stock volts on my Fury X. I kinda had an idea it might be BIOS- or driver-related, because I wasn't getting artifacts, just plain crashes to desktop from stress tests with driver failure error messages. I clean-installed the driver and it did no good; then I read somewhere that the BIOS had caused trouble sometimes, and now it seems to have been my problem. I'm still working to get higher clocks, but at least this is a positive move. Oh, and with this BIOS I actually got a slightly better graphics score in 3DMark even at stock; a very small increase, but consistently there. Now to see if the negative scaling with voltage increase is still present like before. Soon I'll try using Afterburner to increase voltage and try for higher clocks.


Can you put the BIOS somewhere, so we can try it?







I'm using the final official UEFI from AMD and it's OK for now, even though my card is not a good clocker.


----------



## prom

I'm starting to wonder if my dual monitors are causing nonsense.
One is a 144hz displayport panel, the other is a 60hz hdmi panel.

MWO & CSGO are the only games I have issues with, and they are both played in windowed mode so I can go between my screens whilst waiting.

I only noticed this because I had been playing CSGO for a couple hours without going to my second screen, and I didn't notice a 3d artifact until I tabbed back...









Maybe a hardware acceleration conflict?
It just doesn't make sense to me, it really doesn't seem like a HARDWARE fault.
Hours at 1000MHz, no problems. Then suddenly a _single_ 3D artifact that stays with whatever the model is.

FWIW, I flashed my BIOS back to the original one, rather than AMD's updated UEFI BIOS.
I've also reverted settings back to stock + 50% power, and gone back to Afterburner, since Trixx was BSODing on startup (THREAD_STUCK).


----------



## Minotaurtoo

Quote:


> Originally Posted by *LionS7*
> 
> Can you put the BIOS somewhere, so we can try it?
> 
> 
> 
> 
> 
> 
> 
> I'm using the final official UEFI from AMD and it's OK for now, even though my card is not a good clocker.


I actually got it from AMD; it was the April 2016 release. The only mods I made were power limit increases. I did have one game crash since my last post, so I have some more testing to do, but it is doing better than the original BIOS. I couldn't even bench at stock volts at 100 before without driver crashes. It may not have been related to the clocks, though; that game is a little crash-happy anyway.


----------



## Alastair

Quote:


> Originally Posted by *waltercaorle*
> 
> hi, there's a waterblock for the fury nitro??


Nope


----------



## LionS7

Quote:


> Originally Posted by *LionS7*
> 
> Can you put the BIOS somewhere, so we can try it?


Quote:


> Originally Posted by *Minotaurtoo*
> 
> I actually got it from AMD; it was the April 2016 release. The only mods I made were power limit increases. I did have one game crash since my last post, so I have some more testing to do, but it is doing better than the original BIOS. I couldn't even bench at stock volts at 100 before without driver crashes. It may not have been related to the clocks, though; that game is a little crash-happy anyway.


Ok, thank you for the info. I'm on the same BIOS, and yes, it is very stable.


----------



## NightAntilli

I just got my R9 Fury Nitro. Too bad the CUs can't be unlocked...

Btw, is it normal that HBM doesn't downclock from its 500 MHz frequency?


----------



## LionS7

Quote:


> Originally Posted by *NightAntilli*
> 
> I just got my R9 Fury Nitro. Too bad the CUs can't be unlocked...
> 
> Btw, is it normal that HBM doesn't downclock from its 500 MHz frequency?


Yes, there are no 2D profiles for HBM. It will stay at 500MHz all the time. If you overclock it, it will stay at the OC speed.


----------



## NightAntilli

Ok. Its power consumption is very low anyway. Since I have an FX-8320, which will bottleneck it somewhat, I won't be overclocking the card for now. I'm surprised how quiet it is...

I'll try posting some benchmark results later to see how its performance stacks with the rest of you, just to confirm it's working correctly, and determine how much of a bottleneck my FX-8320 really is.


----------



## prom

Still haven't figured this out








I was playing MWO for a few hours before I finally got an artifact



Note the shoulder, with the grey vertex flayed out there.

It doesn't distort, or gyrate like an 'undervolt' artifact.
It moves smoothly, as though attached to the model.

In this particular case, loading a map (and thus reloading this mech model) will fix it.
Similarly, if the artifact appears during a match, it will be gone when the map ends or if I reconnect.

Here are other examples from the same program, but from previous dates. This time on static models:



They don't move around, and are fixed in that location. I could run around the map, come back, and they'd be exactly the same.
This also happens the same way in CSGO if it appears on a static model.

It just doesn't seem very much like a hardware fault.
I'll try to record something next time one appears









No voltage spikes.
No out of the ordinary GPU spikes.
Clock speed is locked @ 1000mhz.
In these two games, fan speed doesn't really go above 60% and GPU temps are 70 or below.


----------



## NightAntilli

Just a pic of my old & new card xD


----------



## Minotaurtoo

The driver crashes when I OC my Fury X too far. No artifacts, just a driver crash. The new BIOS got me further, at least: I can now get 1150 stable with +72mV, but if I go to 1160 I get driver crashes during stressing. Benches run fine with no artifacting.

ideas?


----------



## bluezone

FreeSync TVs
VERY. VERY. COOOL









http://www.tomshardware.com/news/freesync-tv-amd-radeon-rtg,32685.html

http://www.144hzmonitors.com/technology/amd-freesync-coming-tvs-soon/

In other news
Quote:


> Radeon Crimson Edition may be incorporating features from recently acquired startup HiAlgo who developed software to dynamically monitor gameplay and adjust the resolution to maintain maximum frame rates and prevent overheating during long game sessions. One of their techniques called HiAlgo Switch would allow gamers to switch from full to half resolution (and back again) at the press of a hot-key button so as to keep FPS high if a gamer anticipates they are about to enter a demanding area that would normally result in low frame rates. While these techniques are not very important for desktop gaming (especially the CPU/GPU limiter to prevent overheating), all three would come in handy for mobile gamers using laptops with discrete cards or especially APUs.


----------



## dagget3450

New Driver differences for Quadfire FuryX (all i quickly tested for now)
Quote:


> Mmmm, looks like after revisiting timespy on newest drivers Furyx Quad made some nice gains.... i am matching my max overclocked scores with my stock gpu now. When i get some time this week i will load up my old oc profiles and see what happens. Curious if overclocking is nerfed on these drivers.
> 
> Quick test on stock gpu/ game oc on cpu
> 
> http://www.3dmark.com/spy/445563
> 
> Technically on GPU score i passed my max OC score by a small margin. This should yield even better results once i setup again for bench runs.
> 
> GPu score stock 18637 vs 18412 @1150/560
> http://www.3dmark.com/compare/spy/445563/spy/120380
> 
> Apples to apples gpu score stock vs stock 17266 vs 18637
> http://www.3dmark.com/compare/spy/87593/spy/445563
> 
> 6% and 9.4% gains on gpu tests.


----------



## prom

Safe to say this card is boned and will need to be warrantied








Even underclocked, I'm still getting artifacts


----------



## NightAntilli

So... Is a 7.5 SteamVR score normal for a stock R9 Fury Nitro...?


----------



## Alastair

Quote:


> Originally Posted by *NightAntilli*
> 
> So... Is a 7.5 SteamVR score normal for a stock R9 Fury Nitro...?


I don't *think* so. It sounds low?


----------



## Thoth420

Finally got my system back, setup and running with the new MSI mobo, some white low profile RAM and decided to change the color of the coolant to represent AMD a bit. I really love the Fury X and Crimson suite (very easy to use)








Been gaming on her all day on the 16.9.1 hotfix driver with FreeSync and not a single issue. Yes, my cable combs look like crap at the moment... I will never use another ASUS motherboard ever again.


----------



## bluezone

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Thoth420*
> 
> Finally got my system back, setup and running with the new MSI mobo, some white low profile RAM and decided to change the color of the coolant to represent AMD a bit. I really love the Fury X and Crimson suite (very easy to use)
> 
> 
> 
> 
> 
> 
> 
> 
> Been gaming on her all day on the 16.9.1 hotfix driver with FreeSync and not a single issue. Yes, my cable combs look like crap at the moment... I will never use another ASUS motherboard ever again.

Looks sweet.


----------



## lestatdk

Quote:


> Originally Posted by *Alastair*
> 
> I don't *think* so. It sounds low?


Sounds very low. I get a score of 10 with an overclocked Nitro


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Looks sweet.


Thanks! It took almost a year to stabilize this system because of the ASUS Deluxe, which was far from deluxe...










The MSI board looks better with my theme and has all the features I need and none of the extra junk I don't. I also really like that the LED debug shows the CPU temp instead of A0(fast boot disabled) or 40(fast boot enabled) when there are no issues. I haven't gotten to messing with the OC on the CPU yet because I am new to the MSI terminology and everything is running fine as is anyway at the moment. I have plenty of cooling headroom with a 240 and 360 rad running just the CPU and a single Fury X.

I am finally able to enjoy this legendary monitor and can stop playing stuff on the Xbox One in 900p!


----------



## eperelez

Quote:


> Originally Posted by *NightAntilli*
> 
> So... Is a 7.5 SteamVR score normal for a stock R9 Fury Nitro...?


Here's mine. What are your specs?


----------



## bluezone

Quote:


> Originally Posted by *NightAntilli*
> 
> So... Is a 7.5 SteamVR score normal for a stock R9 Fury Nitro...?


Seems a little low. My Nano with the stock BIOS and untouched settings hits an 8.4 score.


----------



## NightAntilli

I don't know what happened, but the score jumped up to 9 now. I didn't change anything; I already had the latest drivers. The only thing that happened in between is that I installed a few Windows updates and rebooted...

Before;


After;


Score seems good now...


----------



## dagget3450

Well, looks like I'll have to wait to run Time Spy at max overclocks; I may have a dead CPU or something. Luckily I had a backup Xeon CPU. I'm ordering another mainboard to try the 5960X on. The Gigabyte board I got isn't the best for overclocking.

I did manage a 19k+ GPU score before crashing hard. I have a feeling I will be at a 17k+ total score when (or if) my 5960X turns out to be good.


----------



## Orgios

I feel like I am not getting the best possible GPU usage out of my card...

I have an XFX R9 FURY, air-cooled and unlocked to 3840 cores, 500/1050

I almost never see GPU usage reach 85-90%; most commonly I see 56-60%, which is kind of frustrating. I don't think it's the CPU, as it almost never reaches over 70% per core (FX-8350 overclocked to 4.7GHz).

this video is mine for reference
witcher 3





and also thief





Though msi readings in 4k are almost unreadable they are quite clear in 1440p


----------



## LeadbyFaith21

I've got an i5-6600K that I have set at 4.5 GHz at the moment (I can safely push it to 4.8 but don't have the need right now), as well as an unlocked Fury. I'm planning on getting a Fury X to put in crossfire with my current Fury, and was curious whether my CPU would be a limiting factor or if it should be good. I've never messed with crossfire, so I don't know how CPU load changes from one GPU to two; do you guys think I'll need a CPU upgrade as well to take full advantage of a dual Fury X setup? Thanks!


----------



## diggiddi

Quote:


> Originally Posted by *Orgios*
> 
> I feel like I am not getting the best possible GPU usage out of my card...
> 
> I have an XFX R9 FURY, air-cooled and unlocked to 3840 cores, 500/1050
> 
> I almost never see GPU usage reach 85-90%; most commonly I see 56-60%, which is kind of frustrating. I don't think it's the CPU, as it almost never reaches over 70% per core (FX-8350 overclocked to 4.7GHz).
> 
> this video is mine for reference
> witcher 3
> 
> 
> 
> 
> 
> and also thief
> 
> 
> 
> 
> 
> Though msi readings in 4k are almost unreadable they are quite clear in 1440p


Increase your resolution and/or try Crysis 3


----------



## Orgios

Resolution is already at max (2160p); unfortunately I do not own Crysis 3.

Tried BF4, which I can't get MSI Afterburner to work with, and the GPU load there is 99%, so at least one game is maxing it out!


----------



## huzzug

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> I've got an i5-6600K that I have set at 4.5 GHz at the moment (I can safely push it to 4.8 but don't have the need right now), as well as an unlocked Fury. I'm planning on getting a Fury X to put in crossfire with my current Fury, and was curious whether my CPU would be a limiting factor or if it should be good. I've never messed with crossfire, so I don't know how CPU load changes from one GPU to two; do you guys think I'll need a CPU upgrade as well to take full advantage of a dual Fury X setup? Thanks!


Your processor should be good in games that are graphics-intensive. The 6600K is a great processor even at stock and should be better when overclocked.


----------



## bluedevil

Lol I ran the Steam VR test for the heck of it on CLASSIFIED DEMON.







lol


----------



## LeadbyFaith21

Quote:


> Originally Posted by *huzzug*
> 
> Your processor should be good in games that are graphic intensive. The 6600k is a great processor even at stock and should be better when overclocked.


Awesome, thanks! Didn't want to get too excited only to realize my CPU couldn't hold up.


----------



## Thoth420

I am curious as to how my system will score on the VR test... I will run it tonight with everything stock to get a baseline, so you guys can compare your OCs to a system running fully at stock clocks. The loop keeps everything ice cold, however, so it might help performance a tad even with no OC. I am still learning the MSI BIOS, so I have left everything at stock for now to test overall stability of my mobo and RAM (set manually to 2133 speed and timings, though the sticks are rated for 2666), as both are brand new.

Will post this evening after class.


----------



## Performer81

Does anybody know if the XFX Fury TD is a 100% reference PCB like the Sapphire Tri-X? Because there is XFX instead of AMD printed over the PCIe pins.


----------



## gupsterg

Seems that way per the EKWB configurator (AMD ref PCB, checked by EK).


----------



## bluezone

Quote:


> Originally Posted by *bluedevil*
> 
> Lol I ran the Steam VR test for the heck of it on CLASSIFIED DEMON.
> 
> 
> 
> 
> 
> 
> 
> lol


WOW, it goes up to "11". Very Spinal Tap.









10.1 is the highest I've hit with MOD Bios.


Spoiler: Warning: Spoiler!


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> WOW, it goes up to "11". Very Spinal Tap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 10.1 is the highest I've hit with MOD Bios.
> 
> 
> Spoiler: Warning: Spoiler!


My mobo also has an auto OC knob that goes to 11! Others only go to 10....mine goes to 11!


----------



## bluezone

Crimson 16.9.2

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.9.2-Release-Notes.aspx

Win 10 64 bit.

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64 bit.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Crimson 16.9.2
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.9.2-Release-Notes.aspx
> 
> Win 10 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64
> 
> Win 7 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


Cheers! I only just got the Anniversary Update last night, but after reinstalling my 16.9.1 hotfix drivers everything has been fine so far. The only fix that affects me is related to that, so if I run into trouble I will def give these a go.


----------



## dagget3450

Good god more drivers Jim!


----------



## Thoth420

Ever since the Anniversary Update to W10 (1607) and a reinstall of the 16.9.1 hotfix driver, my Fury X actually goes into ZeroCore (single green light) on monitor sleep. I thought the GPU didn't do that when a DP-to-DP cable was attached. Is this a bug? It has always stayed at a single red light in the past.
Power Efficiency is still off globally, FS enabled and FRTC set to 144.

The mouse trail setting also disabled itself after a reboot, but that could be LCore.exe causing it.

Nothing is problematic, just a curiosity. I would prefer it be able to go in and out of ZeroCore on monitor sleep without a hitch anyway.

UPDATE: After over 12 hours of system uptime, it instead now goes into ZeroCore for a minute or two, then the tach spools up to 3/4 for a second, goes back to a solid red light for about 3 minutes, then back to green (the cycle repeats as long as the monitor is asleep).


----------



## dagget3450

Anyone here with crossfire Fiji able to run Ashes of the Singularity in DX12 with MGPU enabled? I have tried across multiple platforms with different drivers over the last year or so. All it does is crash on startup. I really regret buying the game if I cannot benchmark multiple GPUs. I have tried disabling CF and all that jazz.


----------



## lanofsong

Hey Fury/Fury X/Nano/Pro Duo Fiji owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for speed bonus) - you need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## bluezone

Quote:


> Originally Posted by *Thoth420*
> 
> Ever since the Anniversary Update to W10 (1607) and a reinstall of the 16.9.1 hotfix driver, my Fury X actually goes into ZeroCore (single green light) on monitor sleep. I thought the GPU didn't do that when a DP-to-DP cable was attached. Is this a bug? It has always stayed at a single red light in the past.
> Power Efficiency is still off globally, FS enabled and FRTC set to 144.
> 
> The mouse trail setting also disabled itself after a reboot, but that could be LCore.exe causing it.
> 
> Nothing is problematic, just a curiosity. I would prefer it be able to go in and out of ZeroCore on monitor sleep without a hitch anyway.
> 
> UPDATE: After over 12 hours of system uptime, it instead now goes into ZeroCore for a minute or two, then the tach spools up to 3/4 for a second, goes back to a solid red light for about 3 minutes, then back to green (the cycle repeats as long as the monitor is asleep).


The Guru3D 16.9.2 thread is reporting an idle bug.

http://forums.guru3d.com/showthread.php?t=410006


----------



## chris89

Quote:


> Originally Posted by *mypickaxe*
> 
> [ ADL PStates List ]
> 
> *State #0: GPUClock = 300 MHz, MemClock = 500 MHz, VID = 0.000 V*
> State #1: GPUClock = 1000 MHz, MemClock = 500 MHz, VID = 0.000 V
> 
> [ GPU PStates List ]
> 
> *
> DPM0: GPUClock = 300 MHz, VID = 0.90000 V*
> DPM1: GPUClock = 508 MHz, VID = 0.95000 V
> DPM2: GPUClock = 717 MHz, VID = 0.95600 V
> DPM3: GPUClock = 874 MHz, VID = 1.08700 V
> DPM4: GPUClock = 911 MHz, VID = 1.12500 V
> DPM5: GPUClock = 944 MHz, VID = 1.16800 V
> DPM6: GPUClock = 974 MHz, VID = 1.20600 V
> DPM7: GPUClock = 1000 MHz, VID = 1.24300 V


POWERPLAY_STATE_OBJECT_2
xxxx - EngineClockIndexHigh - 00
xxxx - EngineClockIndexLow - 00
xxxx - MemoryClockIndexHigh - 00
xxxx - MemoryClockIndexLow -00

POWERPLAY_STATE_OBJECT_1
xxxx - EngineClockIndexHigh - 07
xxxx - EngineClockIndexLow - 00
xxxx - MemoryClockIndexHigh - 03
xxxx - MemoryClockIndexLow - 00

Is there any way to force the card to use DPM0 / State #0 while on the desktop? It jumps all over with the clocks, and power consumption is high at idle for no reason.

I disabled all the other states and forced only DPM0 / State #0, and my 4K decode went from over 25 watts to just 3 watts... at 99% utilization, 300MHz is more than enough.

I'd like to force it to DPM0 at idle and during video decode etc., and have it go to DPM7 like normal under Direct3D gaming load...

Any Ideas?

Thanks
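For anyone trying to follow the state-object fields quoted above: each PowerPlay state simply names the lowest and highest DPM index it may use. That is why State #0 (EngineClockIndexLow=00, EngineClockIndexHigh=00) pins the card at DPM0/300MHz, while State #1 (low=00, high=07) lets it roam up to DPM7. A minimal sketch of that mapping (field names are taken from the dump; the helper function itself is hypothetical, for illustration only):

```python
# Each PowerPlay state object names a low and high DPM index; the driver
# may only pick SCLK DPM levels inside that window. Sketch for illustration,
# using the index values quoted from the dump above.

def allowed_dpm_levels(engine_index_low: int, engine_index_high: int) -> list:
    """DPM levels a PowerPlay state permits (inclusive index window)."""
    return list(range(engine_index_low, engine_index_high + 1))

# State #0: EngineClockIndexLow=00, EngineClockIndexHigh=00 -> locked to DPM0 (300 MHz)
print(allowed_dpm_levels(0, 0))

# State #1: EngineClockIndexLow=00, EngineClockIndexHigh=07 -> free to roam DPM0..DPM7
print(allowed_dpm_levels(0, 7))
```

So forcing State #0, as described above, works by shrinking the window to a single level rather than by changing the DPM table itself.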


----------



## JunkaDK

Just a quick heads up!

Recently Sapphire has had a special offer on their Nitro models. If you have an Asus MG279Q FreeSync monitor, you might run into a weird glitch down the middle of the screen at high refresh rates.

ASUS ROG are looking into it: https://rog.asus.com/forum/showthread.php?87817-Strange-line-down-the-middle-of-my-MG279Q-alot-of-R9-Fury-users-have-this-problem&p=609355#post609355


----------



## Thoth420

Anyone getting slight coil whine on mouse cursor movement on the desktop @ 2D clocks? Fury X (overclocked or not). I have to have my head pretty close to the GPU I/O to hear it, but it is certainly coming from there. That doesn't rule out the PSU, my house power, etc.; I'm just saying that is where the sound's origin point is. I can't hear it from where I sit, but I'm curious as to the root cause.

I found a band-aid in another thread here: the sound goes away when turning mouse cursor trails on in the OS mouse settings. Win10 or LCore.exe disables this on reboot, however. I don't mind if it is never fixed; as I said, it is inaudible from where I sit, but curiosity... killed the cat.


----------



## gupsterg

So, originally I had an Eizo FG2421 and enjoyed the 120Hz. I wanted a 1440p screen, so went with a Dell U2515H. Super screen for PPI/colors etc.; motion clarity wasn't great in fast-paced FPS games but not too bad; at times 60Hz seemed OK and at others not. Then I saw a promo on an Asus MG279Q and OMG! FreeSync is so fricking cool!







.

Recently picked up Dead Space 1 on promo at Steam. Little did I know enabling V-Sync = 30FPS lock in DS1







, so to get decent FPS you gotta go without V-Sync. On the Dell U2515H 60Hz with 55 FRTC = tearing, Eizo FG2421 120Hz with 110 FRTC distinct line appearing every so often, but on the Asus MG279Q setting FRTC to 85 with FreeSync on I had perfectly rendered image!

I really should have gone FreeSync sooner







.


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> 
> 
> So originally I had an Eizo FG2421 and enjoyed the 120Hz. Wanted a 1440P screen, went with a Dell U2515H. Super screen for PPI/colors etc.; motion clarity wasn't great in fast-paced FPS games, but not too bad. At times 60Hz seemed OK and at others not. Saw a promo on an Asus MG279Q and OMG! FreeSync is so fricky cool!
>
> Recently picked up Dead Space 1 on promo at Steam. Little did I know enabling V-Sync = 30FPS lock in DS1, so to get decent FPS you gotta go without V-Sync. On the Dell U2515H, 60Hz with 55 FRTC = tearing; on the Eizo FG2421, 120Hz with 110 FRTC gave a distinct line every so often; but on the Asus MG279Q, setting FRTC to 85 with FreeSync on, I had a perfectly rendered image!
>
> I really should have gone FreeSync sooner.


What I wonder is why 85 is the magic number in FRTC... odd. You gave me a few ideas, though, to fix my FreeSync issue in BF4 (it doesn't work in FS mode and I suspect a line in my custom config is causing it).


----------



## bluezone

Quote:


> Originally Posted by *Thoth420*
> 
> What I wonder is why 85 is the magic number in FRTC...odd. You gave me a few ideas though to fix my FS issue in BF4(doesn't work in FS mode and I suspect a line in my custom config is causing it).


This might be part of it.
Quote:


> FreeSync in this implementation is also limited to a maximum refresh rate of 90Hz.


http://www.pcgamer.com/asus-mg279q-review/


----------



## gupsterg

Quote:


> Originally Posted by *Thoth420*
> 
> What I wonder is why 85 is the magic number in FRTC...odd. You gave me a few ideas though to fix my FS issue in BF4(doesn't work in FS mode and I suspect a line in my custom config is causing it).


Only a few months back I started meddling with FRTC. When googling for info, many stated that with FRTC set, FPS may occasionally rise/dip from the set value (+/-). So to stay within the monitor's Hz/FreeSync range you want to go slightly under the top of the Hz/FS range.

The MG279Q has an FS range of 35-90Hz, so I cap FPS @ 85 and run games at 90Hz. You can mod the range on the MG279Q:

https://www.reddit.com/r/3vo4zs/psa_mg279q_users_can_expand_their_freesync_range/

When googling for mod info, IIRC others have modded other monitors the same way.

I may be able to go a little higher on FRTC but haven't tested yet. I was planning on doing the FS range mod, but I'm finding FRTC 85 + 90Hz with FS on sweet enough not to miss [email protected]

The other aspect of FreeSync is the LFC that AMD introduced in the driver. If a monitor's FS range has a max Hz at least 2.5x the min Hz, then when FPS drops below the min Hz, LFC sorts it out to a limited extent.

Some, IIRC, are modding their FS range to comply with LFC where their monitor's factory range is not compliant. No monitor firmware upgrade/extra hardware is needed for LFC; basically just the FS range (factory or DIY mod) and recent AMD drivers.
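
Putting numbers on the LFC condition (assuming the commonly cited requirement that the max refresh be at least 2.5x the min; the function names below are just for illustration):

```python
# Sketch: check whether a FreeSync range qualifies for LFC, assuming the
# commonly cited requirement that max refresh >= 2.5 x min refresh.
def lfc_capable(min_hz, max_hz):
    return max_hz >= 2.5 * min_hz

def frtc_cap(max_hz, margin=5):
    """Cap a few FPS under the top of the range so FRTC jitter stays inside it."""
    return max_hz - margin

# MG279Q stock range 35-90Hz: 90 >= 2.5 * 35 (87.5), so it just qualifies,
# while something like a 48-75Hz panel (75 < 120) would not.
print(lfc_capable(35, 90), lfc_capable(48, 75), frtc_cap(90))  # True False 85
```
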


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Only a few months back I started meddling with FRTC. When googling for info, many stated that with FRTC set, FPS may occasionally rise/dip from the set value (+/-). So to stay within the monitor's Hz/FreeSync range you want to go slightly under the top of the Hz/FS range.
>
> The MG279Q has an FS range of 35-90Hz, so I cap FPS @ 85 and run games at 90Hz. You can mod the range on the MG279Q:
>
> https://www.reddit.com/r/3vo4zs/psa_mg279q_users_can_expand_their_freesync_range/
>
> When googling for mod info, IIRC others have modded other monitors the same way.
>
> I may be able to go a little higher on FRTC but haven't tested yet. I was planning on doing the FS range mod, but I'm finding FRTC 85 + 90Hz with FS on sweet enough not to miss [email protected]
>
> The other aspect of FreeSync is the LFC that AMD introduced in the driver. If a monitor's FS range has a max Hz at least 2.5x the min Hz, then when FPS drops below the min Hz, LFC sorts it out to a limited extent.
>
> Some, IIRC, are modding their FS range to comply with LFC where their monitor's factory range is not compliant. No monitor firmware upgrade/extra hardware is needed for LFC; basically just the FS range (factory or DIY mod) and recent AMD drivers.


Thanks for the input. My upper FS limit is 144, so I guess I will try 139ish. Regarding dropping out of range at the lower limit... I don't ever really see sub-40 fps.


----------



## Alastair

I'm still sitting with my 1080P monitor patiently waiting for HDR to drop. Or should I just say "shove it" and upgrade my screen, I am so torn.


----------



## Krzych04650

Quote:


> Originally Posted by *Alastair*
> 
> I'm still sitting with my 1080P monitor patiently waiting for HDR to drop. Or should I just say "shove it" and upgrade my screen, I am so torn.


No point in waiting for HDR; it is far too early to be concerned about it. You will wait a very long time for those monitors and even longer for any meaningful amount of HDR content. It's the same as with DX12/Vulkan: until there is a meaningful number of games properly developed for the new APIs from the ground up and benefiting from them, there are years and years of waiting.

Just get yourself a good monitor and don't waste your life waiting.


----------



## gupsterg

Quote:


> Originally Posted by *Thoth420*
> 
> Thanks for the input. My upper FS limit is 144 so I guess I will try 139ish. In regard to dropping out of range on the lower limit...I don't ever really see sub 40 fps.


No worries. Yeah, I don't reach the lower limit in the games I have either. IIRC from "stuff" I read, originally FreeSync did not have LFC, but as nVidia had syncing lower down the range, AMD then had to come up with something.

That BenQ XL2730Z is quite feature-packed.

Quote:


> Originally Posted by *Alastair*
> 
> I'm still sitting with my 1080P monitor patiently waiting for HDR to drop. Or should I just say "shove it" and upgrade my screen, I am so torn.


One of the reasons I bought a Fiji card was to go 1440P; you have 2 of them/reasons. Even in games where CF support may be lacking, a single Fiji is pretty ample. For Crysis 3, on the Dell U2515H [email protected] was jaw-dropping, but then with the Asus MG279Q [email protected] with FreeSync I had to pick myself up off the floor. On the Dell I used FRTC close to the Hz (can't recall if it was with or without V-Sync), so the upper limit was pretty much 60 FPS. On the Asus MG279Q, thanks to FreeSync, I'm seeing up to 90 FPS at times (Gods & Monsters level), and when it does dip, FreeSync makes it unnoticeable.

After having experienced variable refresh rate tech, I'd say if you're not using it you're missing out on a great feature. I'd recommend it as a "must have", just like an SSD.


----------



## Krzych04650

Quote:


> Originally Posted by *gupsterg*
> 
> After having experienced variable refresh rate tech I'd say if you're not using it, your missing out on a great feature. To me I'd recommend it as a "must have", just like say a SSD.


I have the exact opposite experience with FreeSync. It does almost nothing with V-Sync enabled, and with V-Sync disabled you get tearing anyway, even if you are within your refresh rate. It seemed to do something in the 60FPS+ range, and I didn't see drops from 75 to 65, for example, but that's because I am just used to 60Hz and won't see drops at higher FPS as precisely as at lower ones. If you use 60Hz all the time and then have to tell the difference between 65 and 75 FPS, it is not that easy; you need to get used to 75 FPS and then you will start to see the FPS drops. But at the FPS I am playing at, locked to 60, I didn't see FreeSync helping at all. Tested on two monitors, one with a 55-75 range (obviously nothing to see there) and one with 40-75 (theoretically a perfect range), and 2 GPUs, first a 390 and then a Fury, on both Win7 and Win10. Drops below 60 are as visible and annoying as they were without FreeSync. You can see FreeSync working and the animation looks a bit more "in sync"; fluctuating around 45-50 FPS with FreeSync surely looked smoother than without it, but this is still terrible and the loss in smoothness is still very obvious. This is far from any major difference or a "must have" feature.

Not to mention many issues, like FreeSync not working at all in some games and flickering in DX9 games.

What is a must-have for me is a PC properly paired with the monitor, so you are able to run games at stable FPS at your desired refresh rate (which I don't have right now; the Fury is far too slow for 3440x1440). Everything else is just secondary workarounds and compromising.


----------



## flopper

Quote:


> Originally Posted by *Alastair*
> 
> I'm still sitting with my 1080P monitor patiently waiting for HDR to drop. Or should I just say "shove it" and upgrade my screen, I am so torn.


I guess we'll know more about HDR in Feb with Vega.
Not much can be said otherwise in terms of when we will have HDR 4K monitors.
One would think the manufacturers would be really interested in hyping them a bit by now.
Potentially millions of customers going HDR.


----------



## bluezone

Crimson 16.10.1 is now here.

release notes.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-10-1-Release-Notes.aspx

Win 10 64

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

That was fast.


----------



## Tgrove

Quote:


> Originally Posted by *Krzych04650*
> 
> I have the exact opposite experience with FreeSync. It does almost nothing with V-Sync enabled, and with V-Sync disabled you get tearing anyway, even if you are within your refresh rate. It seemed to do something in the 60FPS+ range, and I didn't see drops from 75 to 65, for example, but that's because I am just used to 60Hz and won't see drops at higher FPS as precisely as at lower ones. If you use 60Hz all the time and then have to tell the difference between 65 and 75 FPS, it is not that easy; you need to get used to 75 FPS and then you will start to see the FPS drops. But at the FPS I am playing at, locked to 60, I didn't see FreeSync helping at all. Tested on two monitors, one with a 55-75 range (obviously nothing to see there) and one with 40-75 (theoretically a perfect range), and 2 GPUs, first a 390 and then a Fury, on both Win7 and Win10. Drops below 60 are as visible and annoying as they were without FreeSync. You can see FreeSync working and the animation looks a bit more "in sync"; fluctuating around 45-50 FPS with FreeSync surely looked smoother than without it, but this is still terrible and the loss in smoothness is still very obvious. This is far from any major difference or a "must have" feature.
>
> Not to mention many issues, like FreeSync not working at all in some games and flickering in DX9 games.
>
> What is a must-have for me is a PC properly paired with the monitor, so you are able to run games at stable FPS at your desired refresh rate (which I don't have right now; the Fury is far too slow for 3440x1440). Everything else is just secondary workarounds and compromising.


My FreeSync works flawlessly and has no issues like that; it sounds like something might be wrong with your PC or monitor.

I agree it is a must-have feature for me from now on.


----------



## Thoth420

Quote:


> Originally Posted by *Tgrove*
> 
> My freesync works flawlessly and has no sort of issues like that, sounds like something might be wrong with your pc or monitor.
> 
> I agree it is a must have feature for me from now on


Mine seems to work fine in everything other than Battlefield 4 (I don't have any other BF games installed). I am about to install the driver for Mafia released today... hopefully there is a ninja fix, or it was a profile bug that might get overwritten and fixed.


----------



## jearly410

Quote:


> Originally Posted by *Krzych04650*
> 
> I have the exact opposite experience with FreeSync. It does almost nothing with V-Sync enabled, and with V-Sync disabled you get tearing anyway, even if you are within your refresh rate. It seemed to do something in the 60FPS+ range, and I didn't see drops from 75 to 65, for example, but that's because I am just used to 60Hz and won't see drops at higher FPS as precisely as at lower ones. If you use 60Hz all the time and then have to tell the difference between 65 and 75 FPS, it is not that easy; you need to get used to 75 FPS and then you will start to see the FPS drops. But at the FPS I am playing at, locked to 60, I didn't see FreeSync helping at all. Tested on two monitors, one with a 55-75 range (obviously nothing to see there) and one with 40-75 (theoretically a perfect range), and 2 GPUs, first a 390 and then a Fury, on both Win7 and Win10. Drops below 60 are as visible and annoying as they were without FreeSync. You can see FreeSync working and the animation looks a bit more "in sync"; fluctuating around 45-50 FPS with FreeSync surely looked smoother than without it, but this is still terrible and the loss in smoothness is still very obvious. This is far from any major difference or a "must have" feature.
>
> Not to mention many issues, like FreeSync not working at all in some games and flickering in DX9 games.
>
> What is a must-have for me is a PC properly paired with the monitor, so you are able to run games at stable FPS at your desired refresh rate (which I don't have right now; the Fury is far too slow for 3440x1440). Everything else is just secondary workarounds and compromising.


How is the Fury "far too slow" for 3440x1440? I'm using a Fury X and it is amazing. I get around 75 fps in the games I play, and with FreeSync and V-Sync, no tearing or stuttering. I came from 144Hz playing shooters, and while it's not the same, 75Hz is good enough, and with widescreen it is a compromise I'm happy to make.

FreeSync and V-Sync are supposed to operate together, as stated by AMD. Whatever input lag there is from V-Sync, I don't notice. There must be something wrong somewhere in the setup if FreeSync + V-Sync has tearing, etc. I know some people are more sensitive to frame rate changes than to tearing, so perhaps you fall into that category.


----------



## bluezone

There might be something new/fixed in the latest Crimson drivers: frame pacing for multi-GPU in DX12.

Anyone running X-Fire Furys with Rise of the Tomb Raider?


----------



## Tgrove

I literally just uninstalled that game a few days ago


----------



## gupsterg

@Krzych04650

So far my experience has been "flawless" with FreeSync. It has been a "buttery" smooth experience, just as I'd read in others' posts in the past.

I don't usually buy the latest games, as I like to get them on promo, so I will try some older titles. I do have Assassin's Creed 1; I tend to use the DX10 exe but will try the DX9 one. I've yet to play it on the MG279Q with FreeSync in either mode. I know on the Eizo [email protected] there was this odd hitching; it occurred with my Hawaii & Fiji cards, but I can't recall it on my HD5850 when used on a 1080P plasma and an older Samsung 1680x1050 monitor. AC1 at [email protected] on the Dell U2515H with Fiji was sweet, no hitching; I'm hoping for the same experience on the MG279Q but with FreeSync or 120Hz+ available.

I also have Lost Planet 1; with the cards/screens I tried it on in the past I had no issues, regardless of whether the DX9 or DX10 exe was used. Yet to play it on the MG279Q; if there's an issue with FreeSync I will post.

I know some monitors have had firmware issues, dunno if yours is one of them. I know early MG279Qs did; mine has an April '16 manufacture date and is working great, even the OverDrive option in the OSD (called Trace Free).

Mine was an open-box purchase from Amazon Warehouse. It was £355, cosmetically like new, and still had the spec stickers on the base/screen as if no one had used it. Whilst testing it I found minimal backlight bleed, less than on the Dell U2515H; so much less that I can use a background with an all-black screen (apart from a graphic in the center) on the MG279Q. Screen uniformity again seems great to the eye. I did find an odd bright patch on screen whilst testing with single-color full-screen fills. The pixels change with color, so they're not stuck or dead; the patch is 1mm sq, more apparent on bright, deep single-colour fills and virtually non-existent on light-color or black fills, located about 2cm above the S of the Asus logo on the bottom bezel. It a) doesn't come into the FOV at all in normal use, and b) as it changes with color you don't notice it in games/OS. I gained a further 20% discount, paying ~£280. The bright patch seems to be diminishing as I use the screen, so it may disappear, but if it doesn't, truly no biggie considering the screen's normal price is £460-£500. This 1mm sq bright patch is my only issue with the screen so far.


----------



## Krzych04650

Quote:


> Originally Posted by *jearly410*
> 
> How is the fury "far too slow" for 3440x1440? I'm using a fury x and it is amazing. Get around 75 fps in the games I play and with freesync and vsync, no tears or stuttering. I came from 144hz playing shooters and while it's not the same, 75hz is good enough and with widescreen it is a compromise I'm happy to make.
> 
> Freesync and vsync are supposed to operate together as stated by amd. Whatever input lag there is from vsync I don't notice. There must be something wrong somewhere in the setup if freesync + vsync has tearing etc. I know some people are more sensitive to frame rate changes than to tearing so perhaps you fall in that category.


Maybe. As I said, I see the difference and FreeSync makes things smoother, but it's still not impressive, and it doesn't change the fact that I have to keep a stable 60 FPS all the time. It's true that I have overly sensitive eyes and especially ears, so what I say about smoothness or especially noise levels won't apply to most users.

The Fury really is far, far too slow. Try playing Witcher 3, Assassin's Creed from Black Flag and up, Rise of the Tomb Raider and other games like that. You won't reach 40 FPS on average. The Fury is too slow for 1440p, let alone 3440x1440; for this resolution a GTX 1080 is the bare minimum for comfortable gameplay without heavy compromises in settings. The Fury is okay for mid-demanding games like Mad Max, averaging ~65 FPS at 3440x1440, but it cannot touch games like Witcher 3 at this resolution. It barely manages at 2560x1080, dropping to ~45 FPS in dense forests, and at 3440x1440 it drops to the low 30s.


----------



## Thoth420

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *gupsterg*
> 
> @Krzych04650
> 
> So far my experience has been "flawless" with FreeSync. It has been a "buttery" smooth experience, just as I'd read in others' posts in the past.
>
> I don't usually buy the latest games, as I like to get them on promo, so I will try some older titles. I do have Assassin's Creed 1; I tend to use the DX10 exe but will try the DX9 one. I've yet to play it on the MG279Q with FreeSync in either mode. I know on the Eizo [email protected] there was this odd hitching; it occurred with my Hawaii & Fiji cards, but I can't recall it on my HD5850 when used on a 1080P plasma and an older Samsung 1680x1050 monitor. AC1 at [email protected] on the Dell U2515H with Fiji was sweet, no hitching; I'm hoping for the same experience on the MG279Q but with FreeSync or 120Hz+ available.
>
> I also have Lost Planet 1; with the cards/screens I tried it on in the past I had no issues, regardless of whether the DX9 or DX10 exe was used. Yet to play it on the MG279Q; if there's an issue with FreeSync I will post.
>
> I know some monitors have had firmware issues, dunno if yours is one of them. I know early MG279Qs did; mine has an April '16 manufacture date and is working great, even the OverDrive option in the OSD (called Trace Free).
>
> Mine was an open-box purchase from Amazon Warehouse. It was £355, cosmetically like new, and still had the spec stickers on the base/screen as if no one had used it. Whilst testing it I found minimal backlight bleed, less than on the Dell U2515H; so much less that I can use a background with an all-black screen (apart from a graphic in the center) on the MG279Q. Screen uniformity again seems great to the eye. I did find an odd bright patch on screen whilst testing with single-color full-screen fills. The pixels change with color, so they're not stuck or dead; the patch is 1mm sq, more apparent on bright, deep single-colour fills and virtually non-existent on light-color or black fills, located about 2cm above the S of the Asus logo on the bottom bezel. It a) doesn't come into the FOV at all in normal use, and b) as it changes with color you don't notice it in games/OS. I gained a further 20% discount, paying ~£280. The bright patch seems to be diminishing as I use the screen, so it may disappear, but if it doesn't, truly no biggie considering the screen's normal price is £460-£500. This 1mm sq bright patch is my only issue with the screen so far.






So to clarify: one must leave V-Sync on in game for FreeSync to work? I feel like such a derp if so... but in my defense, coming from G-Sync it was the other way around: if you had issues with G-Sync in a game, the advice was to disable in-game V-Sync.

I enabled in-game V-Sync for BF4 and of course the tearing is now gone, and I haven't seen any stutter. No perceivable input lag either.








Quote:


> Originally Posted by *Krzych04650*
> 
> Maybe. As I said, I see the difference and FreeSync makes things smoother, but it's still not impressive, and it doesn't change the fact that I have to keep a stable 60 FPS all the time. It's true that I have overly sensitive eyes and especially ears, so what I say about smoothness or especially noise levels won't apply to most users.
>
> The Fury really is far, far too slow. Try playing Witcher 3, Assassin's Creed from Black Flag and up, Rise of the Tomb Raider and other games like that. You won't reach 40 FPS on average. The Fury is too slow for 1440p, let alone 3440x1440; for this resolution a GTX 1080 is the bare minimum for comfortable gameplay without heavy compromises in settings. The Fury is okay for mid-demanding games like Mad Max, averaging ~65 FPS at 3440x1440, but it cannot touch games like Witcher 3 at this resolution. It barely manages at 2560x1080, dropping to ~45 FPS in dense forests, and at 3440x1440 it drops to the low 30s.


What settings are you running? My single Fury X has no problem running any of those games. I assume you have all the settings cranked to be seeing lows like that... you've gotta find a middle ground if you want higher resolutions and decent fps. I mean, I can even get Hitman and DX:MD to stay above 40 fps at all times by only dropping a few settings down one notch from max and only using FXAA or an injector. AA is the performance killer, as always. For example, I play TW3 with everything maxed except no AA, no HairWorks, and Foliage and Grass on High instead of Ultra. Motion Blur and Blur disabled for preference. The game never drops below 45 fps anywhere.


----------



## gupsterg

Quote:


> Originally Posted by *Thoth420*
> 
> I enabled in game V Sync for BF4 and of course the tearing is now gone and I haven't seen any stutter. No perceivable input lag either.


Depends on the game. FreeSync works with V-Sync on or off (i.e., you will get a variable refresh rate either way).

I confirmed the refresh rate varies even with V-Sync = On + FreeSync = On by running FRAPS and keeping the monitor OSD on-screen (the MG279Q can be set to show the OSD for 2 min); I see the refresh rate in the OSD change to match the FPS.

Below are examples of games in the context of the stock FreeSync range on the MG279Q.

Example 1

Crysis 3 has no FPS cap built into the game, so with V-Sync off I need to set a profile in the Crimson driver with FRTC of, say, 85 to keep the game always within the FreeSync range. With V-Sync On the max FPS becomes 90, as 90Hz is the refresh rate set for the resolution, and with FreeSync on it will dive down and come back up to 90.

Example 2

Dead Space 1: if V-Sync is *ON*, regardless of standard/FreeSync monitor and the monitor's Hz, FPS is limited to 30 by the game engine. So to use FreeSync I set a profile in the Crimson driver with FRTC of 85 to keep the game always within the FreeSync range, and use V-Sync *OFF* in game to get 85 FPS.

Example 3

The Crew has an in-engine FPS limit selection of 30 or 60, regardless of the V-Sync setting. So as long as FreeSync is *on*, the monitor will match the FPS in Hz; no need for a Crimson profile with FRTC, as the game engine is already limiting FPS = the game stays within the FS range.
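
The three cases above boil down to one rule: if the engine already caps FPS inside the FS range, leave FRTC alone; otherwise cap a little under the top of the range. A sketch of that decision (names are illustrative; stock MG279Q numbers assumed):

```python
# Sketch of the decision behind the three examples: given a game's effective
# FPS cap (None if uncapped, e.g. Crysis 3, or Dead Space 1 with V-Sync off)
# and the monitor's FreeSync range, decide whether a driver FRTC cap is needed.
def needed_frtc(engine_cap, fs_min, fs_max, margin=5):
    """Return an FRTC value to set in Crimson, or None if the engine cap suffices."""
    if engine_cap is not None and engine_cap <= fs_max:
        return None           # e.g. The Crew's 60 FPS limit already sits in range
    return fs_max - margin    # uncapped: stay a little under the top of the range

print(needed_frtc(None, 35, 90))  # Crysis 3 / DS1 with V-Sync off -> 85
print(needed_frtc(60, 35, 90))    # The Crew -> None, no FRTC profile needed
```
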


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Depends on the game. FreeSync works with V-Sync on or off (i.e., you will get a variable refresh rate either way).
>
> I confirmed the refresh rate varies even with V-Sync = On + FreeSync = On by running FRAPS and keeping the monitor OSD on-screen (the MG279Q can be set to show the OSD for 2 min); I see the refresh rate in the OSD change to match the FPS.
>
> Below are examples of games in the context of the stock FreeSync range on the MG279Q.
>
> Example 1
>
> Crysis 3 has no FPS cap built into the game, so with V-Sync off I need to set a profile in the Crimson driver with FRTC of, say, 85 to keep the game always within the FreeSync range. With V-Sync On the max FPS becomes 90, as 90Hz is the refresh rate set for the resolution, and with FreeSync on it will dive down and come back up to 90.
>
> Example 2
>
> Dead Space 1: if V-Sync is *ON*, regardless of standard/FreeSync monitor and the monitor's Hz, FPS is limited to 30 by the game engine. So to use FreeSync I set a profile in the Crimson driver with FRTC of 85 to keep the game always within the FreeSync range, and use V-Sync *OFF* in game to get 85 FPS.
>
> Example 3
>
> The Crew has an in-engine FPS limit selection of 30 or 60, regardless of the V-Sync setting. So as long as FreeSync is *on*, the monitor will match the FPS in Hz; no need for a Crimson profile with FRTC, as the game engine is already limiting FPS = the game stays within the FS range.


Thanks a lot, as all three of those are games I don't play (well, Dead Space is on my Xbone... great game). I think I get the gist of how this works now. I guess ASUS having a max of 90 makes it easier to figure out. Mine being matched to the panel's max refresh rate had me kind of clueless, since there isn't much noticeable tearing with V-Sync off at high FPS on high-refresh panels anyway. I really had to hunt just to find tears in BF4.


----------



## gupsterg

Quote:


> Originally Posted by *Thoth420*
> 
> Thanks a lot, as all three of those are games I don't play (well, Dead Space is on my Xbone... great game). I think I get the gist of how this works now.


Cool. The games were just examples of how different engines/games are, and of how you need to assess what is best for the given situation.
Quote:


> Originally Posted by *Thoth420*
> 
> I guess that ASUS having a max of 90 makes it easier to figure this out.


Sort of, but take Dead Space 1 with V-Sync off: at 1080P it would give higher FPS than the screen's Hz (120Hz), so still an issue (i.e., a line in the image). If I enabled FRTC it still exhibited a line in the rendered image on the Eizo, even with Turbo 240 mode; the only way the game became what it should be was with a FreeSync panel.

Yep, DS1 is great. If I hadn't got the FreeSync panel I would in no way enjoy it, as the line in the rendered image was really doing my nut in!


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Cool. The games were just examples of how different engines/games are, and of how you need to assess what is best for the given situation.
>
> Sort of, but take Dead Space 1 with V-Sync off: at 1080P it would give higher FPS than the screen's Hz (120Hz), so still an issue. If I enabled FRTC it still exhibited a line in the rendered image on the Eizo, even with Turbo 240 mode; the only way the game became what it should be was with a FreeSync panel.
>
> Yep, DS1 is great. If I hadn't got the FreeSync panel I would in no way enjoy it, as the line in the rendered image was really doing my nut in!


I am just glad you didn't pick the games I'd already figured out how to get working fine. It was good to get some examples from titles I haven't messed with FS on so far.

I am enjoying it on the console, even at crap reso with crap graphics, because I never got around to it before. I can see why this game was so well received.


----------



## gupsterg

I also never got round to playing it when it released; bought DS1 & 2 for £5 three weeks ago on Steam.


----------



## mynm

Quote:


> Originally Posted by *gupsterg*
> 
> DeadSpace 1 if V-Sync *ON* regardless of STD / FreeSync monitor and Hz of monitor, FPS will be limited to 30 by game engine. So if I wish to use FreeSync I set a profile in Crimson driver with FRTC of 85 to keep game always within FreeSync range and then use V-Sync *OFF* in game to game 85 FPS.


You can use D3DOverrider to unlock the 30 fps limit. You also have to enable OpenGL triple buffering in Radeon Settings.


----------



## neurotix

Hi guys.

Just ordered two Sapphire R9 Fury Nitros from Newegg for $309 each.

I had a lot of money saved up for Vega, but I'm not very optimistic about it, and might skip Vega entirely and wait for Navi (at this rate I'll be waiting till 2019). I wanted something more powerful in the meantime. Up till this year I had two R9 290 Tri-X and then two R9 290 Vapor-X (best cards I ever owned). The Furys should be even better than my 290s. I don't plan on upgrading again for a long time. I'm pretty excited. The price on these things is amazing and they're only about a year old. Much better than the $600 each I paid for my 290s during the mining craze... Even so, my wife makes a lot of money, so within 2-3 years I'll have plenty to get whatever is best at the time.

I'll join the club and post pics once I get my cards installed. Regards.

EDIT: Also, Witcher 3... I have no problem running this game at 5760x1080 on my current setup (dual 380X). Everything is on Ultra with post-processing off and FXAA, and I get between 55-60 FPS. I think two Furys will probably crush it.


----------



## bluezone

Quote:


> Originally Posted by *neurotix*
> 
> Hi guys.
> 
> Just ordered two Sapphire R9 Fury Nitro from newegg for $309 each.
> 
> I had a lot of money saved up for Vega, but I'm not very optimistic about it, and might skip Vega entirely and wait for Navi (at this rate I'll be waiting till 2019). I wanted something more powerful in the meantime. Up till this year I had two R9 290 Tri-X and then two R9 290 Vapor-X (best cards I ever owned). The Fury's should be even better than my 290s. I don't plan on upgrading again for a long time. I'm pretty excited. The price on these things is amazing and they're only about a year old. Much better than the $600 each I paid for my 290s during the mining craze... Even so, my wife makes a lot of money so within 2-3 years I'll have plenty to get whatever is best at the time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll join the club and post pics once I get my cards installed. Regards.
> 
> EDIT: Also Witcher 3.... I have no problems running this game at 5760x1080 on my current setup (dual 380X). Everything is on Ultra with post processing off and FXAA and I get between 55-60 FPS... I think two Fury's will probably crush it.


Welcome neurotix.

So far I'm liking the rumors I hear on Vega. The Fury series cards are roughly equal to 2x 7950s, so I think you will enjoy the Nitros.


----------



## bluezone

OK, so I think I've got the VRM temperatures on my Nano under control now.

Here's a HWiNFO screen cap after a run of FS.


Spoiler: Warning: Spoiler!







To get these temperatures I'm running 2 extra fans and I've mounted a small laptop cooler to the aluminum stiffening bracket/heat sink, with airflow from one fan on the back blowing through the small cooler.


Spoiler: Warning: Spoiler!


----------



## neurotix

Quote:


> Originally Posted by *bluezone*
> 
> Welcome neurotix.
> 
> So far I'm liking the rumors that I hear on Vega. The Fury series cards are roughly = 2 X 7950. So I think you will enjoy the Nitro's.


Yep, just like my old 290s. The little brother cards.

The 290 was within like 5% of a 290X so I'm guessing Fury is within 10% or so of a Fury X.

Ultimately, for the price, these twin cards should work for me for a long time. If I wanted I could easily get two 1080s, but... why? Two of these should max out everything I have at my resolution for probably 1/3rd the price.


----------



## gupsterg

A Fury at times is so close to a Fury X. If you gain an SP unlock, a Fury with 3840 SPs benched pretty much the same as a genuine Fury X with 4096 SPs for me.

SP unlock is rare on the Nitro, though some have had unlockable cards; even without it, it's still a nice Fiji card.


----------



## gupsterg

Quote:


> Originally Posted by *mynm*
> 
> You can use D3Doverrider to unlock the 30 fps limit. Also you have to enable open gl triple buffering on radeon settings.


Yeah, I did find info on D3DOverrider when googling but just didn't want to use it. Forcing OpenGL triple buffering won't help, as it's a DirectX title from what I recall.

Anyway, +rep for the info. Rolling with FreeSync in DS1 is simple and easy, TBH.


----------



## NightAntilli

The SteamVR performance test is so unreliable. I decided to let Plays.TV record during the test to see how much my performance would drop on my Fury Nitro. To my surprise, my score went up from 9 to 9.6, even though before, my run had 0 frames CPU-bound, and with Plays.TV recording it had 5 frames CPU-bound...


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> Yep, just like my old 290s. The little brother cards.
> 
> The 290 was within like 5% of a 290X so I'm guessing Fury is within 10% or so of a Fury X.
> 
> Ultimately, for the price, these twin cards should work for me for a long time. If I wanted I could easily get two 1080's but... why? Two of these should max everything I have out at my resolution for probably 1/3rd the price.


The Fury is not that far off the 290, so don't expect a large jump from your 290s.


----------



## LionS7

Quote:


> Originally Posted by *diggiddi*
> 
> The Fury is not that far off the 290, so don't expect a large jump from your 290s.


Well, it's about 25 to 40% for the R9 Fury X (R9 Fury X @ 1100/1000 vs R9 290 @ 1100/6100). I was on an R9 290 before my R9 Fury X.


----------



## gupsterg

@diggiddi

It isn't a large jump ....

I subscribe to a PC mag in the UK called Custom PC; some of the magazine's reviews are also on Bit-Tech. Their RX 480 review interested me as they used newer drivers than their Fury/X reviews. Except for Hitman, these are the figures I come up with:



Spoiler: Warning: Spoiler!



AOTS (DX12)

1080P Fury X vs 390X has 15% higher min. FPS, Fury X vs 390X has 25% higher aver. FPS.
1440P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS.

Fallout 4 (DX11)

1080P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 17% higher aver. FPS.
1440P Fury X vs 390X has 17% higher min. FPS, Fury X vs 390X has 18% higher aver. FPS.

The Division (DX11)

1080P Fury X vs 390X has 38% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS.
1440P Fury X vs 390X has 33% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS.

Total War: Warhammer (DX12)

1080P Fury X vs 390X has 13% higher min. FPS, Fury X vs 390X has 12% higher aver. FPS.
1440P Fury X vs 390X has 4% higher min. FPS, Fury X vs 390X has 19% higher aver. FPS.

The Witcher 3 (DX11)

1080P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 19% higher aver. FPS.
1440P Fury X vs 390X has 26% higher min. FPS, Fury X vs 390X has 25% higher aver. FPS.
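For context on how figures like these are derived, the uplift percentages are just relative FPS differences; a minimal sketch in Python, using placeholder FPS values rather than the magazine's raw data:

```python
def pct_higher(a: float, b: float) -> float:
    """How much higher a is than b, as a percentage of b."""
    return (a - b) / b * 100.0

# Hypothetical numbers: a Fury X averaging 75 FPS vs a 390X averaging 60 FPS
# would read as "Fury X has 25% higher aver. FPS".
print(f"{pct_higher(75.0, 60.0):.0f}% higher")
```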



We also know from Hawaii BIOS modding that, even when a 290/X ROM is modded or flashed to a 390/X ROM, a true 390/X seems to keep a distinct advantage. So in a way, 390/X vs Fury/X is the best of Hawaii vs Fiji. I won't do the maths for Fury X vs 290X; the Fury X launch review has that.

This section of TPU RX 480 review also shows relative performance.

Not a groundbreaking gap, but it's there nonetheless IMO.

Dunno if neurotix has a custom WC loop for his Hawaii cards, but one thing I noted was that none of the aftermarket air-cooled Hawaii cards I owned (DCUII, Tri-X, Vapor-X) were as quiet as the Fury Tri-X, let alone the Fury X. The Fury Nitro has pretty much the same cooler as the Fury Tri-X, so I expect it to be quiet.

As a "out of the box" experience Fiji for me has been better than Hawaii and I'll be honest at first I wasn't impressed but the more I used them, the more I knew I'd made the right GPU swap. Currently luv'ing Fiji at 1440P, may or may not get Vega TBH.

The only thing Fiji lacks vs Hawaii/Grenada is consistent OC headroom, but it scales pretty much 1:1 with GPU clock increases.

Here is 3DM FS compare of MAX OC of my VX290X 4GB 1150/1575 with 390 MC + RAM Timings mod vs my daily Fury X OC.

Here is 3DM FS compare of VX290X 4GB daily OC vs Fury X daily OC.


----------



## neurotix

Quote:


> Originally Posted by *gupsterg*
> 
> @diggiddi
> 
> It isn't a large jump ....
> 
> I subscribe to a PC mag in the UK called Custom PC, some of the reviews from the mag are also on Bit Tech, this RX 480 review interested me as they used newer drivers than their Fury/X reviews. Except for Hitman these are the figures I come up with:-
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> AOTS (DX12)
> 
> 1080P Fury X vs 390X has 15% higher min. FPS, Fury X vs 390X has 25% higher aver. FPS.
> 1440P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS
> 
> Fallout 4 (DX11)
> 
> 1080P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 17% higher aver. FPS.
> 1440P Fury X vs 390X has 17% higher min. FPS, Fury X vs 390X has 18% higher aver. FPS.
> 
> The Division (DX11)
> 
> 1080P Fury X vs 390X has 38% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS.
> 1440P Fury X vs 390X has 33% higher min. FPS, Fury X vs 390X has 23% higher aver. FPS.
> 
> Total War: Warhammer (DX12)
> 
> 1080P Fury X vs 390X has 13% higher min. FPS, Fury X vs 390X has 12% higher aver. FPS.
> 1440P Fury X vs 390X has 4% higher min. FPS, Fury X vs 390X has 19% higher aver. FPS.
> 
> The Witcher 3 (DX11)
> 
> 1080P Fury X vs 390X has 19% higher min. FPS, Fury X vs 390X has 19% higher aver. FPS.
> 1440P Fury X vs 390X has 26% higher min. FPS, Fury X vs 390X has 25% higher aver. FPS.
> 
> 
> 
> We also know from Hawaii BIOS modding that, even when a 290/X ROM is modded or flashed to a 390/X ROM, a true 390/X seems to keep a distinct advantage. So in a way, 390/X vs Fury/X is the best of Hawaii vs Fiji. I won't do the maths for Fury X vs 290X; the Fury X launch review has that.
> 
> This section of TPU RX 480 review also shows relative performance.
> 
> Not a groundbreaking gap, but it's there nonetheless IMO.
> 
> Dunno if neurotix has a custom WC loop for his Hawaii cards, but one thing I noted was that none of the aftermarket air-cooled Hawaii cards I owned (DCUII, Tri-X, Vapor-X) were as quiet as the Fury Tri-X, let alone the Fury X. The Fury Nitro has pretty much the same cooler as the Fury Tri-X, so I expect it to be quiet.
> 
> As an "out of the box" experience, Fiji for me has been better than Hawaii. I'll be honest, at first I wasn't impressed, but the more I used them, the more I knew I'd made the right GPU swap. Currently luv'ing Fiji at 1440P; may or may not get Vega, TBH.
> 
> The only thing Fiji lacks vs Hawaii/Grenada is consistent OC headroom, but it scales pretty much 1:1 with GPU clock increases.
> 
> Here is 3DM FS compare of MAX OC of my VX290X 4GB 1150/1575 with 390 MC + RAM Timings mod vs my daily Fury X OC.
> 
> Here is 3DM FS compare of VX290X 4GB daily OC vs Fury X daily OC.


I used the Tri-X cards with the Tri-X cooler. I had to RMA them because the fans made a very loud, obnoxious grinding noise under load. They sent me back two matching Vapor-X cards (it probably helped that I sent them candy/Swedish fish, or so my theory goes). The Vapor-X cards were amazing: excellent build quality, no fan rattling, and they ran pretty cool (usually under 70C). However, they *were* quite loud. The Fury Nitros will also be left on air; the reason I got cards with high-end air coolers is to use those coolers. (I won't go into all my reasons for not wanting a water loop.) But if they really are quieter, I will be quite happy.


----------



## gupsterg

At idle the fans switch off on the Fury Tri-X/Nitro. They can be made to come on a little earlier via BIOS mod, or via a custom fan curve in MSI AB, etc.

On the stock fan profile/clocks, I had to keep shining a torch through my mesh side panel to know if the Fury Tri-X fans were spinning under load. IIRC room ambient was 23°C. My SilverStone TJ06 has been modded over the years, so it has improved airflow. I may have a photo with the Fury Tri-X, but here's one of the Fury X just after installing it.


----------



## Drake87

Just ordered my Fury Nitro and it should be here next week! I noticed that the recommended PSU is 750W. I have a Capstone 650. Am I going to be OK with that, or should I look for a higher-wattage power supply?


----------



## gupsterg

The recommended PSU wattage on graphics card manufacturers' pages is more of a guideline, allowing for poorer-quality PSUs that are not rated for continuous power output.

If the Capstone 650W is the same platform as the Capstone 750W (reviewed on JonnyGuru) then it's a pretty decent PSU and I'd see no issue. I've run an OC'd Fury/X (not at the same time) on a 650W PSU.


----------



## Drake87

Quote:


> Originally Posted by *gupsterg*
> 
> The recommended PSU wattage on graphics card manufacturers' pages is more of a guideline, allowing for poorer-quality PSUs that are not rated for continuous power output.
> 
> If the Capstone 650W is the same platform as the Capstone 750W (reviewed on JonnyGuru) then it's a pretty decent PSU and I'd see no issue. I've run an OC'd Fury/X (not at the same time) on a 650W PSU.


I'm pretty sure that it is. That's good news. I really didn't want to get my old 850W Corsair out, since its fan is freaking loud.


----------



## gupsterg

You'll be fine. You see the Q6600 rig in my sig? I placed spare Fury/X cards (when I had them) in that, running tests like F@H with CPU/GPU loaded for 24hrs+ continuous at a time, plus Heaven/Valley/3DM FS looped while I tested the OC ability of the cards. I tested about 7 Fiji cards with probably 10+ OC profiles each, so you could say 70 runs each of F@H / Heaven / Valley / 3DM FS. I'll let you do the maths on how many hours of uptime that was on the 650W PSU.
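Sketching out the maths left to the reader above; the per-run durations here are my assumptions (24 h for an F@H soak, 1 h per synthetic benchmark loop), not figures from the post:

```python
# ~7 Fiji cards x ~10 OC profiles = ~70 runs of each workload.
cards = 7
profiles = 10
runs = cards * profiles

# Assumed hours per run (not stated in the post).
hours_per_run = {"F@H": 24, "Heaven": 1, "Valley": 1, "3DM FS": 1}

total_hours = runs * sum(hours_per_run.values())
print(f"{runs} runs each -> roughly {total_hours} hours on the 650W PSU")
```

Even with conservative run lengths, that works out to well over a thousand hours of sustained load on the one 650W unit.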


----------



## Thoth420

Quote:


> Originally Posted by *Drake87*
> 
> Just ordered my Fury Nitro and it should be here next week! I noticed that the recommended PSU is 750. I have a Capstone 650. Am I going to be ok with that or should I look for a higher wattage power supply?


Agreed with the others, you are more than fine. My Fury X and 6700K are running fine with a modest CPU OC and the GPU at stock, with headroom to push it further, on a 750W EVGA SuperNOVA P2.


----------



## miklkit

Hi all! I installed a Sapphire Fury Nitro a few days ago and have been playing The Witcher 3 since. It replaced a Sapphire 8GB 290X with a mild OC and has really bumped up frame rates. I put everything on ultra, and the frame rate has never dropped below 60 yet, but has hit the 90s a few times.

The first thing I noticed was that the temp went straight to 80C and stayed there. I opened up Afterburner and made a more aggressive fan profile, and since then it stays in the mid-to-high 50s, although it did hit 60C once.

I was surprised to see higher CPU loads with it. In TW3 the CPU load went from 20-30% to 30-40%, with average temps sitting at 52C.

So, is there anything else I should know about this beast?


----------



## neurotix

I got my two R9 Fury Nitros and they're in. I'll have to do more extensive testing and bench them for HWBOT. I'll take and post pics tomorrow.


----------



## pengs

Quote:


> Originally Posted by *miklkit*


This is exactly what I was looking at a few days ago: a 290X Tri-X sitting next to the Fury Nitro. It turned out that the Fury was literally two hair-widths longer, which made no difference.

The cooler on the Nitro is a tank. I trust it and the stock fan profile, which is saying a lot. The time it takes to rise in temperature is extreme, and that signifies just how over-engineered it is. If the 290X Tri-X had an excellent cooler, this one is a hit out of the park.

My case has normal-to-good airflow and I haven't seen anything over 77°C tops, 73-75°C normally. At 75°C the fan spins up to around 30% and kicks the temperature back to 70°C. Easy peasy, as if it encountered a mosquito and shooed it gently away with a whisk of breath. Best cooler I've seen on a GPU, hands down. Extremely quiet, also.

Definitely recommend the Nitro.


----------



## neurotix

I love mine so far too. I have VSync on, power efficiency on, frame pacing on, and Frame Rate Target Control set to 60 fps. In Sleeping Dogs (5760x1080) my top card only goes to 50C and the bottom to 45C. Even in Witcher 3 and ROTTR my top card only hits 57C and the bottom 48C, while maintaining a constant 60fps in all games. Moreover, I have a pretty aggressive fan profile (65% fan at 55C, 100% fan at 70C) but the cards are pretty quiet. The cooler on these things is phenomenal.

Going from dual 380X, I got 52fps in Valley at 5760x1080; that's up to 97.9 fps now with the Furys, and I haven't even overclocked yet. I had dual 290s last year and those got right around 60 fps in Valley Extreme at 5760x1080, so the Furys are actually about 37 fps more in Valley. Just amazing for the $600 I spent on the pair, and still probably much more powerful than dual RX 480s, especially considering I spent over $600 alone for one 290 Tri-X during the mining craze when they were in short supply and had JUST come out.

I had both dual 290 Tri-X and 290 Vapor-X, and both ran hotter. The Vapor-X looked and performed amazingly, though.


----------



## Drake87

I've been using Afterburner for the last few years and have been pretty happy with it. Would I be OK using that for my Nitro, or should I use the Sapphire software (forgot the name of it) instead? My card hasn't arrived yet, so I haven't been able to play around with the OC software to see firsthand which is better.


----------



## gupsterg

No issue using MSI AB vs Trixx on the Nitro; I used MSI AB with the Tri-X. Neither application has anything proprietary for the Nitro, so you will get the same "tweaking" experience. I'd say it's all down to what you prefer using.


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> No issue using MSI AB vs Trixx on the Nitro; I used MSI AB with the Tri-X. Neither application has anything proprietary for the Nitro, so you will get the same "tweaking" experience. I'd say it's all down to what you prefer using.


@gupsterg, what voltage does your card want for 1145MHz core?


----------



## gupsterg

Stock VID is 1.212V. I set DPM 7 to 1.268V for 1145. You could say +56mV, if I were using OS OC software.

Since setting this OC in the ROM months back, it has been stable for 1080P 120Hz gaming on the Eizo FG2421, 1440P 60Hz gaming on a Dell U2515H, and now an Asus MG279Q at 1440P with FreeSync.
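The +56 mV figure is just the difference between the custom DPM 7 voltage and the stock VID; a quick sketch of that conversion (the function name is mine, not from any tool):

```python
def offset_mv(custom_vid_v: float, stock_vid_v: float) -> int:
    """Express a custom DPM VID as the mV offset an OS OC tool would show."""
    return round((custom_vid_v - stock_vid_v) * 1000)

print(offset_mv(1.268, 1.212))  # matches the +56mV quoted above
```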


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> Stock VID is 1.212V. I set DPM 7 as 1.268V for 1145. You could say +56mV if I was using OS OC SW.
> 
> Since setting this OC months back in ROM, it has been stable for 1080P 120Hz gaming on the Eizo FG2421, 1440P 60Hz gaming on Dell U2515H and now Asus MG279Q 1440P with FreeSync
> 
> 
> 
> 
> 
> 
> 
> .


Ok, thank you. That sounds fine, I think. I'm on +60mV (1.258V) for now, for 1125/520; VID is 1.20V. The interesting thing is that even when I put 1.35V on the HBM voltage, the memory crashes at 525, so no voltage scaling for HBM for me, I think. In games it might hold even at 550MHz, but I don't know if that's really stable. I'm testing with the Time Spy stress test, since the load on the memory is very heavy, and there it crashes at 525.

I'm just keeping some statistics in my head for now, on the "silicon lottery".


----------



## neurotix

Quote:


> Originally Posted by *Drake87*
> 
> I've been using afterburner for the last few years and have been pretty happy with it. Would I be ok using that for my nitro or should I use the sapphire software (forgot the name of it) instead? My card hasn't arrived yet so I haven't been able to play around with the oc software to see firsthand which is better.


I prefer Trixx.

If you have more than one card, Trixx can control the voltage and clocks for each card individually. You can also set separate fan profiles for each card. I have never found a way to do this with Afterburner; it seems you can only control both cards at once (at least, the version I have works this way; I don't know if it's different in a newer version). However, Afterburner DOES let you adjust the aux voltage, which you can't do in Trixx, though I'm not sure that even does anything on AMD cards; it's more of an Nvidia thing.

Ultimately, I have both; there's no reason you can't use both, and you can even have them open at the same time with no issues AFAIK. (I did this with my 380s and 290s no problem; haven't tried with the Furys.)


----------



## prom

Got my Nano back from Sapphire RMA, and while my original artifact problem (3D spikes in games) seems to be solved, another has appeared. Now I get random 'snow' flickering in and out sometimes, regardless of what I'm doing, at stock settings.

I've already DDU'd a couple of times across a variety of Crimson drivers, and I'm using AMD's latest BIOS.

It's worth noting that they gave me back my _original card_, so I'm really curious to know what they did.
I'm not sure I want to go through the process _again_, only to get the same card back.


----------



## lanofsong

Hey R9 Radeon Fury/Nano/X/Pro DUO FIJI owners,

We are having our monthly Foldathon from Monday the 17th through the 19th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

October Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## xkm1948

Hoping to grab a second Fury X over Black Friday, assuming there will be more price drops. Will my 5820K and 850W PSU be enough to handle two Fury Xs?


----------



## diggiddi

I'd say yes


----------



## neurotix

Benches done.


----------



## Thoth420

Quote:


> Originally Posted by *lanofsong*
> 
> Hey R9 Radeon Fury/Nano/X/Pro DUO FIJI owners,
> 
> We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> October Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


Cheers! I was interested in doing this the next time I got an AMD card, but I have never done folding before. Can I just leave my system on doing this 24/7? It is in a loop and stays very cool even under heavy stress for long gaming periods. I'd love to stress this system to the max as often as possible (I don't have time to game every day) to see if any of the hardware has a weak point; I'd rather it fail now and be replaced than later. I can't stand letting my system sit on a bench all day because it causes too much coil whine, and the system is in my living/work area. Does folding have the same effect with regard to coil whine?


----------



## NightAntilli

It's nice to see people put their old cards next to their Fury Nitro. I guess I experienced the biggest upgrade xD;


----------



## neurotix

Quote:


> Originally Posted by *NightAntilli*
> 
> It's nice to see people put their old cards next to their Fury Nitro. I guess I experienced the biggest upgrade xD;


I have pretty much the exact same card, except it's a 6870. Same cooler though, and Sapphire.

Unfortunately, it's dead... been dead for years. It was in another computer, and my brother played games on it without turning on the fan profile; it overheated and the card died (no worries, he replaced it with a 270X).





There's a before and after for me.


----------



## JunkoXan

My Sapphire 280X is out and my Sapphire Fury Nano is in, and the RMA'd Nano has voltage control...
















Update: OK, I know these cards run hot, but 1.7 million Celsius isn't exactly "hot", that's scorched-earth hot... lol, Unigine... >_> Really though, it topped out at 65C.


----------



## mynm

Quote:


> Originally Posted by *gupsterg*
> 
> Yeah, did find info on D3Doverider when googling but just didn't wanna use it. Forcing OpenGL triple buffering won't help as it's a DirectX title from what I recall.
> 
> Anyway +rep for info, rolling with FreeSync in DS1 is simple and easy TBH
> 
> 
> 
> 
> 
> 
> 
> .


Thanks for the +rep. But I don't know what's happening, because D3DOverrider sometimes doesn't seem to work. I tested it, and it worked once and then never again, though years ago on Win7 it was working. It was the only way to get more than 30fps (or half the FPS of the refresh rate) without tearing in DS1. But from what I've read about FreeSync, with vsync off it doesn't show tearing, so it isn't a problem for you.


----------



## tabs

So I'm trying to decide between a Nano and Fury X and I would really appreciate some advice and opinions.

I will need to cool the processor and the graphics together on a single 120mm radiator (I know this isn't ideal, but this is mostly for aesthetics). The processor is a Broadwell 5775C, slightly undervolted at stock clocks. I'm pretty sure the Nano will work in this cooling setup.

What I need your help with is the Fury X. Its listed TDP is much higher at 275W, but reviews seem to show it drawing 225W to 250W. Is the Nano really much more power efficient due to chip binning, or is much of its lower TDP down to clocks and throttling? If I undervolt and underclock a Fury X, do I get something close to a Nano in terms of performance? Exactly how would I go about undervolting and underclocking it to achieve this?

With the Fury X, I know I have the option to air-cool the processor and get the full performance out of the graphics card. Could I get Fury X performance out of a Nano somehow? If so, how? The Nano is also known for coil whine. Is coil whine a problem on the Fury X?

Thanks


----------



## Tgrove

Quote:


> Originally Posted by *xkm1948*
> 
> Hopefully to grab a second FuryX over Blackfriday, assuming there will be more price drops. Will my 5820K and 850Watt PSU be enough to handle two FuryX?


Ended up having to upgrade my HX850 to an AX1200i with my rig.


----------



## Thoth420

Quote:


> Originally Posted by *Tgrove*
> 
> Ended up having to upgrade my hx850 to ax1200i with system rig


Edit:

How old was that hx?


----------



## prom

Looks like I'm going to have to RMA my card a second time.
Different artifacts this time around: now they take the form of "snow" randomly appearing/disappearing on either monitor, under load OR at idle, plus occasional checkerboard artifacts on parts of my screens.


----------



## JunkoXan

Quote:


> Originally Posted by *prom*
> 
> Looks like I'm going to have to RMA my card a second time.
> Different artifacts this time around. Now they are in the form of "snow" randomly appearing/disappearing on either monitor, under load OR idle, and the occasional checker artifacts on parts of my screens


they just don't like you, do they?


----------



## neurotix

Quote:


> Originally Posted by *prom*
> 
> Looks like I'm going to have to RMA my card a second time.
> Different artifacts this time around. Now they are in the form of "snow" randomly appearing/disappearing on either monitor, under load OR idle, and the occasional checker artifacts on parts of my screens


Good luck. Hope you get it back fast.


----------



## bluezone

Update your HWiNFO64; there is a new version out, with corrected wattage for GPU Core Power and a newly added GPU Chip Power reading.

Cool.

EDIT: Amperage seems corrected too.


----------



## Thoth420

Had a random Kernel-Power 41 system reboot doing the "run all tests" option in 3DMark (never done that before, FYI). Is it possible this occurred because one of the tests is meant for 4K and my panel is only 1440p? I have not had this issue any other time with this configuration, and the PSU is only 6 months old. I also have my system plugged into a pure-sine-wave, active-PFC-compliant battery backup.

I already cleaned my drivers... I suspect either dirty power in my house or the PSU crapping out; either one is pretty infuriating. I have confirmed it is not the RAM or the mobo. Any other things I might be overlooking?

I haven't tried "run all tests" since, but I did run a custom test with everything maxed at my native 1440p resolution with Fire Strike in a loop overnight, and it was totally fine.


----------



## Drake87

Quote:


> Originally Posted by *NightAntilli*
> 
> It's nice to see people put their old cards next to their Fury Nitro. I guess I experienced the biggest upgrade xD;


What clocks are you getting on your fury? I haven't played around with mine much just yet. Set it to 1100 core and ran the heaven bench and been playing Deus Ex since.


----------



## diggiddi

Anybody with an FX 83XX/9XXX CPU and a Fury care to run some benches with me?
I'd like to see how my 290X Lightning stacks up. Reps will be given for your time, thx.


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> Anbody with an FX 83XX/9XXX cpu and a Fury care to run some benches with me?
> I'd like to see how my 290x lightning stacks up, reps will be given for your time, thx


what bench do you want?

Here is a few for you to compare to...

http://www.3dmark.com/fs/10392921 OC
http://www.3dmark.com/spy/552130 OC
http://www.3dmark.com/fs/9997725 full stock I ran just to see.
http://www.3dmark.com/spy/367989 card at stock, cpu at my daily clock of 5ghz

I'm not sure how much this will help, as I run a normal clock speed of 5GHz nearly permanently on my CPU, so it's a bit of a stretch from stock... and even my stock may be a bit off from other Visheras because of the high stock clocks.

This is the highest score on FS I was able to get on fury x... http://www.3dmark.com/fs/10311866


----------



## Tgrove

Quote:


> Originally Posted by *Thoth420*
> 
> Edit:
> 
> How old was that hx?


It was a few years old


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> what bench do you want?
> 
> Here is a few for you to compare to...
> 
> http://www.3dmark.com/fs/10392921 OC
> http://www.3dmark.com/spy/552130 OC
> http://www.3dmark.com/fs/9997725 full stock I ran just to see.
> http://www.3dmark.com/spy/367989 card at stock, cpu at my daily clock of 5ghz
> 
> I'm not sure how much this will help as I do run a normal clock speed of 5ghz nearly permanently on my cpu so its a bit of a stretch from stock... and even my stock may be a bit off from other visheras because of the high stock clocks.
> 
> This is the highest score on FS I was able to get on fury x... http://www.3dmark.com/fs/10311866


Thanks for the quick reply. Can you clock down to 4.6GHz with no turbo, and 1600MHz RAM, so we're on an even plane?
Also, do you have PCars, Assetto Corsa, BF4/3, or Cry3?


----------



## Minotaurtoo

Sorry, I have none of those... I'll see about the downclocking though. Last time I tried to downclock the RAM to that, I couldn't get the timings to work out right; for some reason the only timings it wanted to run at were terrible.


----------



## Krzych04650

As I said somewhere before, I moved my PC to the attic to avoid coil whine noises, and the attic is not thermally insulated, so ambient temps vary a lot. Today it is 11 C up there, so I ran some temperature tests on the Fury Nitro to see how much ambient affects GPU temps. With a 24 C ambient I got 71 C at a fixed 35% fan speed with everything at stock. With stock clocks and a -90mV undervolt I got 56 C. With 11 C ambient I got 58 C and 43 C respectively: a 13 C difference in both cases for a 13 C difference in ambient temp, so the temp change seems to be perfectly linear.

I get 13 C after booting the PC. The CPU stays at 19 C during web browsing on an air cooler. But under lighter load, like Valley or games, CPU temps don't seem to be affected by ambient temp; they are around 40-45 C, just like before. In the OCCT test CPU temps are about 7-9 C lower, so ambient temp affects CPU temps less.
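As a sanity check on the "perfectly linear" observation, the four GPU readings quoted above give a slope of exactly 1 °C of GPU temperature per 1 °C of ambient in both configurations:

```python
# (ambient_C, gpu_C) pairs from the post: stock, then -90mV undervolted.
stock = [(24, 71), (11, 58)]
undervolted = [(24, 56), (11, 43)]

def slope(points):
    """Degrees of GPU temperature change per degree of ambient change."""
    (x1, y1), (x2, y2) = points
    return (y2 - y1) / (x2 - x1)

print(slope(stock), slope(undervolted))  # 1.0 for each: 1:1 tracking
```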


----------



## gupsterg

Quote:


> Originally Posted by *tabs*
> 
> So I'm trying to decide between a Nano and Fury X and I would really appreciate some advice and opinions.
> 
> I will need to cool the processor and the graphics together on a single 120mm radiator (I know this isn't ideal but this is mostly for aesthetics). The processor is a Broadwell 5775C which is slightly undervolted with stock clocks. I'm pretty sure the Nano will work in this cooling setup. What I need your help with is the Fury X. Its listed TDP is much higher at 275W but reviews seem to show it drawing 225W to 250W. Is the Nano really much more power efficient due to chip binning, or is much of its lower TDP due to clocks and throttling? If I undervolt and underclock a Fury X, do I get something close to a Nano in terms of performance? Exactly how would I go undervolting and underclocking it to realize this? With the Fury X, I know I have the option to air cool the processor and get the full performance out of the graphics card. Could I get Fury X performance out of a Nano somehow? If so, how? The Nano is also known for coil whine. Is coil whine a problem on the Fury X?
> 
> Thanks


The Nano & Fury X are both the reference PCB using the same VRM components, so the odds of experiencing whine should be the same IMO. I've had 1x Fury Tri-X and 6 Fury Xs and none had obtrusive coil whine IMO, but it is a subjective thing. The Nano does have 4 fewer GPU phases on the VRM, but since it seems you're not going to OC, that's not something to consider. Just bear in mind that Fiji is not known for its OC ability.

I would also think an undervolted Fury X would have about the same power usage as a Nano. The Nano may have a slight edge due to binning, but not much IMO. You can undervolt/underclock cards through OS software or a ROM mod.

As it seems your needs lean more towards low power usage (plus how you're setting up the cooling), I would think the Nano is better suited, as long as it's cheaper to buy. If they're the same price I would probably opt for the Fury X, use the stock AIO, and get an air cooler for the CPU. This may save you hassle and be cost-effective vs a Nano plus your custom small-rad cooling. Also, your warranty would stay intact from not swapping the cooling on the GPU.


----------



## NightAntilli

Quote:


> Originally Posted by *Drake87*
> 
> What clocks are you getting on your fury? I haven't played around with mine much just yet. Set it to 1100 core and ran the heaven bench and been playing Deus Ex since.


I only have a measly 1080p monitor right now, so its capabilities are way beyond what I need. Therefore I've decided not to OC it yet. Although OC-ing is fun, I only do it when it's necessary. Maybe when I get my ultrawide monitor I'll try OC-ing it.
Quote:


> Originally Posted by *diggiddi*
> 
> Thx for Quick reply can you clock down to 4.6ghz no turbo, and 1600 mhz Ram so we can be on even plane?
> Also do you have Pcars, Assetto Corsa,BF4/3, Cry3?


I would help you but I have none of those games, and my FX-8320 cannot surpass 4.5 GHz. Most likely because I live in the tropics and it's always hot here.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> sorry I have none of those... I'll see about the down clocking though... last time I tried to down clock the ram to that I couldn't get the timings to work out right... for some reason the only timings it wanted to work at was terrible.


Try DOCP in BIOS and set it to 1600. I noticed you are at 4K; any way to downscale to 1080 also?
Quote:


> Originally Posted by *NightAntilli*
> 
> I only have a measly 1080p monitor right now, so its capabilities are way more than what I need right now. Therefore I've decided not to OC it yet. Although OC-ing is fun, I only do it when it's necessary. Maybe when I get my ultrawide monitor I'll try OC-ing it.
> I would help you but I have none of those games, and my FX-8320 cannot surpass 4.5 GHz. Most likely because I live in the tropics and it's always hot here.


OK then, let's settle for Fire Strike, Time Spy, Heaven and Valley using driver Crimson 16.10, since you all don't have any of these games.
I will clock down to 4.5 and run these with my 290x @ stock and 1200/1600 1080P
I wouldn't mind having 390/X join in too


----------



## ManofGod1000

Quote:


> Originally Posted by *Drake87*
> 
> What clocks are you getting on your fury? I haven't played around with mine much just yet. Set it to 1100 core and ran the heaven bench and been playing Deus Ex since.


I have a Sapphire R9 Fury Nitro+ and an FX 8300 running at 4.5 Ghz. (That is the best my processor can do stably and I have not overclocked my card.) Want me to do those benchmarks you mentioned and at what resolution? I have a 4k Samsung 28 inch monitor.


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> Try DOCP in bios and set it to 1600, I noticed you are at 4k any way to downscale to 1080 also?
> OK then lets settle for Fire strike, time spy, heaven and valley using driver Crimson 16.10 since you all don't have any of these games
> I will clock down to 4.5 and run these with my 290x @ stock and 1200/1600 1080P
> I wouldn't mind having 390/X join in too


ok I set ram to 1600, cpu to 4.6 and my card back to 1050 clocks.... here are the tests I did...

http://www.3dmark.com/spy/588190
http://www.3dmark.com/3dm/15444275?


hope this will suffice... now back to my normal clocks lol.


----------



## diggiddi

[Benchmark screenshots: 1200/1600 and stock runs]

Thx Mino

Looks like you are ahead of my overclocked Lightning 1200/1600 by 13% (in both Time Spy and Fire Strike)


----------



## gupsterg

Quote:


> Originally Posted by *diggiddi*
> 
> Looks like you are ahead by 13% of my overclocked Lightning 1200/1600 (in both timespy and firestrike)


Quote:


> Originally Posted by *Minotaurtoo*
> 
> ok I set ram to 1600, cpu to 4.6 and my card back to 1050 clocks.... here are the tests I did...


13% with the stock Fury X GPU clock, but HBM is shown as 550MHz in 3DM FS/TS; this will add approx 1-2%, which could be lost in run-to-run variance, so not a biggie.
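For context on why a 545 vs 500MHz HBM clock only adds ~1-2% in the benchmark: raw bandwidth scales linearly with clock, but Fiji is rarely bandwidth-bound at these settings. A sketch of the raw numbers (the 4096-bit bus and double-data-rate signalling are Fiji's published HBM1 specs):

```python
# Raw HBM bandwidth for Fiji's 4096-bit bus with DDR signalling (2 transfers/clock).
def hbm_bandwidth_gbs(clock_mhz, bus_bits=4096):
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

print(hbm_bandwidth_gbs(500))  # 512.0 GB/s at the stock 500 MHz
print(hbm_bandwidth_gbs(545))  # 558.08 GB/s: ~9% more bandwidth, yet only ~1-2% FPS
```

The gap between +9% bandwidth and +1-2% score is the tell that the graphics tests are mostly shader-bound here.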

I know I'm on an i5 4690K, clocked pretty well on air, but the GS score is a valid comparison IMO: 17.6K with the older v16.3.2 WHQL driver; any of the newer drivers will bench somewhere between 17.2K and 17.4K. The thing I'm finding with TS is that newer drivers score better than older ones; this is not the latest driver but quite a recent one. I've not got any DX12 games yet, so I can't bench those to see if newer drivers = better, but it seems to me they're going for DX12 performance improvements more than DX11 in the Fiji Win 10 drivers.

You may recall the 3D Fanboy competition several months back. I bought Fiji from an etailer a little before that, from where I could return it within 30 days, no quibble. When I saw some of the Hawaii/Grenada benches, those were pretty much members going all out; Fiji was benching higher for me without going all out, and with just the stock AIO cooler, so it seemed worth keeping. Especially as I could sell my Hawaii card at no loss and swap to Fiji for a pretty small outlay (due to a promo on it). You'll see I was at something like 1120 for those benches IIRC; 1145/545 has now been my 24/7 OC for months. Did a lot of hours of [email protected] several months back; Feb to June '16 it was basically 1-2 Fiji cards running off and on.


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> 13% with stock Fury X GPU clock, but HBM is shown as 550MHz in 3DM FS/ S, this will add approx 1-2%, which could be lost in run to run variance so not a biggie
> 
> I know I'm on i5 4690K, clocked pretty well on air, but the GS score is valid compare IMO, 17.6K with older v16.3.2 WHQL driver, any of the newer drivers will bench somewhere between 17.2/4K. The thing I'm finding with TS is newer drivers score better than older, here is not the latest driver but quite recent. I've not got any DX12 games yet, so can't bench those to see if newer drivers = better, but seems to me they're going for DX12 performance improvement more than DX11 on Fiji Win 10 drivers.
> 
> You may recall the 3D Fanboy competition several months back. I bought Fiji from etailer just a little time before that, from where I could return within 30 days no quibble. When I saw some of the Hawaii/Grenada benches and these were pretty much members going all out. Fiji was benching higher for me without "going all out" plus with just stock AIO cooler, so it seemed worthy of keeping. Especially as I could sell my Hawaii card at no loss and swap to Fiji for a pretty smal outlay (due to promo on it). You'll see I was at something like 1120 for those benches IIRC, now 1145/545 has been my 24/7 OC for months. Done a lot of hours of [email protected] several months back, Feb to June 16 was basically 1-2 Fiji cards running off and on.


glad you said that... I forgot when I switched bios for him that I even had that one at 550... that's one of my old ones lol... but yeah, it didn't really make much of a difference... set to 545 now that I've returned to the bios you made for me : )


----------



## Drake87

Quote:


> Originally Posted by *ManofGod1000*
> 
> I have a Sapphire R9 Fury Nitro+ and an FX 8300 running at 4.5 Ghz. (That is the best my processor can do stably and I have not overclocked my card.) Want me to do those benchmarks you mentioned and at what resolution? I have a 4k Samsung 28 inch monitor.


I was curious what type of overclock you were able to get on the card. I've been playing Deus Ex non stop and haven't spent any time playing around with it.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Drake87*
> 
> I was curious what type of overclock you were able to get on the card. I've been playing Deus Ex non stop and haven't spent any time playing around with it.


maybe his Fury-based card is better, but it was a hard fight to get even 1111 MHz stable on mine without a huge increase in voltage. 1075 was all I could get at stock volts with the stock bios...


----------



## LionS7

Quote:


> Originally Posted by *Minotaurtoo*
> 
> maybe his fury based card is better, but it was a hard fight to get even 1111 mhz stable on mine without a huge increase in voltage. 1075 was all I could get at stock volts with stock bios...


It's totally normal. Mine went from 1050 to 1100MHz, 1.20V to 1.23V.


----------



## LionS7

For deleting, sorry.


----------



## gupsterg

Quote:


> Originally Posted by *Minotaurtoo*
> 
> glad you said that...


No worries.


----------



## NightAntilli

Quote:


> Originally Posted by *diggiddi*
> 
> 1200/1600
> 
> Stock
> 
> 
> 1200/1600
> 
> 
> Stock
> 
> 
> Thx Mino
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like you are ahead by 13% of my overclocked Lightning 1200/1600 (in both timespy and firestrike)


This is of my stock Fury Nitro, with 4.5 GHZ FX-8320.




You're quite close in TimeSpy I see.


----------



## jdorje

Fury X arrived and was installed yesterday.

First impressions are not super great. Excellent noise level under load; water temps can get up to 60C, core temps a bit higher than that, but the rad (on rear exhaust) does its job and VRM temps are basically the same as core. But idle noise is worse than my former cards, and the noise profile is annoyingly whiny - and my system is pretty damn quiet otherwise at idle. Sounds like this is coming from the pump.

Did a quick OC up to 1090 core and 570 memory, which only gave about a 5% boost in benchmarks. Afterburner wouldn't allow adjustment of voltage (?). Downloaded Trixx, which does, but I haven't pushed voltage yet. Pushing past 1090 core results in continued performance gains without artifacting, but occasionally it just crashes. Going to leave it at stock for a few days to make sure those crashes aren't happening at stock at all.

Performance is good. I'm on [email protected] with an overclocked 4690k; freesync and the $325 price tag are obviously the main reasons for choosing this card. Now I need to buy a new game to play with it...too bad I didn't get Doom when it was 1/3 off a few weeks ago.

I have had, for some time now, no ability to get "validated" 3DMark scores. Why is this? "Graphics driver not approved" is always the error. I've updated now and double-checked I have WHQL drivers, but they are from roughly 2 days ago so wouldn't be approved yet anyway. But I haven't had an approved run in many months, even with my old card.

GPU-z: http://gpuz.techpowerup.com/16/10/15/hug.png
stock timespy 5200/3878: http://www.3dmark.com/3dm/15445952
stock firestrike extreme 7782/9027: http://www.3dmark.com/3dm/15457859
1090/570 timespy 5430/3782: http://www.3dmark.com/spy/588465
The card appears to throttle at exactly 300W even with +50% on power limit in afterburner. Is that normal?

Did some fun bios mods on my previous card. Is that a thing on Fiji also?


----------



## tabs

Quote:


> Originally Posted by *gupsterg*
> 
> Nano & Fury X are both ref PCB using same VRM components, so aspect of experiencing whine be the same IMO. I've had 1x Fury Tri-X and 6 Fury X and none had obtrusive coil whine IMO, but is a subjective thing. Nano has though 4 less GPU phases on VRM, but as it seems your not going to OC than not something to consider. Just bare in mind Fiji is not known for it's OC ability.
> 
> I would also think a undervolted Fury X would be the same power usage as Nano. Nano may have slight edge due to binning but not much IMO. You can under volt/clock cards through OS SW or ROM mod.
> 
> As it seems your needs are more towards low power usage (plus how your setting up cooling), I would think the Nano would be better suited. As long as it's cheaper to buy, if the same price I would opt for Fury X probably and use the stock AIO and get air cooler for CPU. This may save you hassle and be cost effective vs Nano then adding your custom small RAD cooling. Also your warranty would stay entact from not swapping cooling on GPU.


Thanks for the comprehensive reply. I went with the Nano because it was cheaper and I also like how it looks with the Kryographics block.


----------



## diggiddi

Quote:


> Originally Posted by *NightAntilli*
> 
> This is of my stock Fury Nitro, with 4.5 GHZ FX-8320.
> 
> 
> 
> 
> You're quite close in TimeSpy I see.


OK, no need for me to run at 4.5GHz seeing you are ahead. Thx, all repped up


----------



## gupsterg

Quote:


> Originally Posted by *tabs*
> 
> Thanks for the comprehensive reply. I went with the Nano because it was cheaper and I also like how it looks with the Kryographics block.


No worries, would be cool to see photos of the rig.


----------



## NightAntilli

Quote:


> Originally Posted by *diggiddi*
> 
> OK no need for me to run the 4.5ghz seeing you are ahead, thx all repped up


You still want a Heaven and Valley benchmark?


----------



## diggiddi

Yeah if you want to


----------



## jdorje

After a few days of use I can run 1100 MHz instead of 1090 MHz on stock voltage. Bit strange.

I also have noticed that unlike my 390, raising the clock affects the VID. At stock DPM7 is reported by aida at 1.200V, but if I raise clock to 1100 then do the same registry dump it comes out as 1.250V. The VDDC (average under load per hwinfo) rises from 1.162 to 1.209. Pretty annoying actually.

Otherwise, overclocking does seem a bit weird and inconsistent. First of all, there's little gain from actually raising the clock; without changing voltage, running 1100/560 on core/memory raises my Valley score a rather middling 6.5%. As for upping voltage... well, I haven't gone too far there.

I'm tempted to go back and write a Hawaii oc guide before I give away my 390. Hard to abandon that card.


----------



## gupsterg

Yes, using SW OC will raise VID even without you increasing it; some kind of "dynamic" state setting is going on (I have seen this on Hawaii as well). Also, depending on how far you raise the GPU clock, you will see the lower DPM clocks' VID increase. This was why ROM OC was better IMO. You may find that if you set the stock VID in the ROM for an OC you achieve at stock VID, and then test a smaller further increase using SW, you may gain more (all of this is card dependent). I went about fine-tuning my OC this way.

If you conduct some GPU memory benches like the AIDA64 GPGPU bench, you'll find the HBM clocks in steps: 500, 545, 600 & 666. I was a skeptic about this but it seems to be that way; an AMD techie (AMD Matt) posted this info ages ago on several forums.

Then there is also a negative scaling effect on Fiji; depending on how much VDDC the GPU is getting, the scaling will be impacted. It all depends on the GPU sample; I've tried various drivers/ROMs/ROM mods and cannot counter this effect.


----------



## NightAntilli

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah if you want to


Here you go, both on the highest preset without touching any settings (not even resolution);


----------



## jdorje

Those scores seem low. 2888 in Valley EHD is about what I got with my 390 on a stock-voltage overclock (2863). Is it because of the lower CPU speed? I have heard, but never really seen evidence, that Unigine can be single-core bottlenecked.


----------



## NightAntilli

It could be... If I can trust this site, it doesn't do that much worse though, considering their Fury Tri-X does 2938, and the Fury Strix does 2940.


----------



## jdorje

I feel like the Fiji scores on these 1600x900 and 1920x1080 benchmarks are actually terrible compared to those of weaker cards. I got 3327 at stock in Valley EHD with my X, but that's still only a 16% improvement over the 390. But in games at 1440p the results are massively better.

Uh, so after a few days my x is a lot quieter at idle now. Might have moved some air out of the block?


----------



## dagget3450

Valley really shines with a fast CPU/RAM; given the CPU overhead on Fiji vs others, that doesn't help. Run the same systems at 4K and there should be a difference. Fiji doesn't exactly impress or shine at 1080p or lower.

I am on a xeon e5 2683 and the cpu overhead is killing me on Mgpu fiji :-( I am waiting on an RMA of my 5950x.


----------



## neurotix

Valley also runs tremendously better on anything Nvidia.

In fact, all the benchmarks are coded for Nvidia; this includes 3DMark.


----------



## jdorje

Quote:


> Originally Posted by *neurotix*
> 
> Valley also runs tremendously better on anything Nvidia.
> 
> In fact, all the benchmarks are coded for Nvidia, this includes 3dmark.


My 390 crushed 970s in Valley. You can see scores at https://www.reddit.com/r/2pjdoo/gtx_970_owners_how_does_this_unigen_valley/ ; the highest there is 2700 on the Extreme HD preset. I had 2900 on stock voltage. That doesn't seem to be the case with Fiji, though, amazingly, Unigine scores seem to scale linearly with clock.

But most benchmarks simply run at too low a resolution for either the Hawaii or Fiji cards to compete with Nvidia. Passmark and UserBenchmark (lol) run at like 640x480.

So back to my card. A few days of (I assume) bubble relocation has it quieter at idle now, though still a bit louder than an air-cooled card. Under load it's much happier. Been playing gta v while doom downloads ($30 sale!) and averaging 70 fps on basically ultra at 1440.

Overclocking headroom is... basically nonexistent. From 500 MHz the HBM goes up to 570 with no trouble; at 575 it artifacts. That's a decent boost. Core is stable at 1090 MHz at -24 mV (1226 mV VID), or 1100 at +18 mV (1268). That's 42 mV per 10 MHz, aka absolutely abysmal scaling even by AMD standards. The flip side, as I said, is that performance seems to scale near-linearly with clock.
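That 42 mV per 10 MHz figure can be turned into a quick linear extrapolation of where the voltage ceiling bites. A sketch using the two stable points quoted; the linearity itself is an assumption (V/f curves usually get steeper, so this is optimistic):

```python
# Fit a line through the two stable (clock MHz, VID mV) points quoted above.
f0, v0 = 1090, 1226
f1, v1 = 1100, 1268
slope = (v1 - v0) / (f1 - f0)  # mV per MHz
print(slope)  # 4.2 mV/MHz, i.e. 42 mV per 10 MHz

# Optimistically extrapolating to a 1325 mV VID ceiling:
max_clock = f1 + (1325 - v1) / slope
print(round(max_clock))  # ~1114 MHz before running out of voltage
```

Which lines up with the later reports in the thread of ~1100-1165 MHz being the practical wall on these cards.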

On a probably-unstable max-voltage overclock (1325mV vid I think), I pushed a 5230 in timespy. Still 100 points behind the 4690k+furyx leader, @gupsterg. Might be within striking range but you're 100 mhz higher on CPU clock than I can bench at.

Next up: modding the bios.

One question: I have a spare phanteks MP and XP fans. To throw that on the back of the rad in push-pull what size screws do I need? Are rad screws standard? Strangely the radiator did not come with an extra pair.


----------



## gupsterg

Quote:


> Originally Posted by *jdorje*
> 
> Still 100 points behind the 4690k+furyx leader, @gupsterg. Might be within striking range but you're 100 mhz higher on CPU clock than I can bench at.


From viewing my result vs yours, you're 200MHz behind on CPU, as that section shows 4898MHz vs 4698MHz?

Surprised that ~4% higher CPU = a 9.5% scaling difference in the CPU test; so the TS CPU test must be affected by the GPU clock difference as well?

Your GPU clocks have been picked up skew-whiff; what were your GPU/HBM clocks? I'm also on Crimson v16.9.1 vs your v16.10.1; from some testing I did, newer Crimsons seem to bench better in TS (will test v16.10.1).
Quote:


> Originally Posted by *jdorje*
> 
> One question: I have a spare phanteks MP and XP fans. To throw that on the back of the rad in push-pull what size screws do I need? Are rad screws standard? Strangely the radiator did not come with an extra pair.


Was planning on doing push/pull fans; not done it yet. I got the XSPC Radiator Screw Set 6-32 UNC (fits perfectly) and the Gelid 4-Pin PWM Adaptor Cable for Video/VGA cards (CA-PWM-02). Gonna use the PWM signal from the card but power both fans on the rad via molex when I get around to it.


----------



## jdorje

Quote:


> Originally Posted by *gupsterg*
> 
> From viewing my result vs yours, your 200MHz behind on CPU as that section shows 4898MHz vs 4698MHz?
> 
> Surprised for ~4% higher CPU = 9.5% scaling difference in CPU test, so TS CPU test must be effected by GPU clock difference as well?
> 
> Your GPU clocks have been picked up "skewiff", what was you your GPU/HBM clocks? I'm also on Crimson v16.9.1 vs your v16.10.1, from some testing I did newer Crimson seem to bench better on TS (will test v16.10.1).
> Was planing on doing push/pull fans, not done it yet. I got XSPC Radiator Screw Set 6-32UNC (fit perfectly) and Gelid 4-Pin PWM Adaptor Cable for Video/VGA cards (CA-PWM-02). Gonna use the PWM signal from card but power both fans on RAD via molex when get around to it


That result is at 4.7 CPU with 4.2 uncore and 2133/11 RAM. I can bench at 4.8/4.4, but the CPU score is still further behind than that. It's 1130/570 on the GPU for the highest score. It's always misreported my GPU clocks for some reason.

I was just going to hook up my second fan to a mobo header. Undecided if i use speedfan to control speed or just leave it low. Seems like the gentle typhoon has a much worse noise profile than my phanteks fans.


----------



## gupsterg

Cheers for the info on your run.

I found one of the Fury X's fans whined more than the others I had (they all seemed different to me at the time IIRC). I slackened the screws off and the noise profile seemed better to me (dunno if placebo). I then placed some AC pipe insulation tape (spongy) on the corners of the fan and just nipped the screws up, and it seemed better to me. The one I kept did not need this mod; it seemed fine to me.


----------



## jearly410

Quote:


> Originally Posted by *jdorje*
> 
> My 390 crushed 970s in Valley. You can see scores
> 
> __
> https://www.reddit.com/r/2pjdoo/gtx_970_owners_how_does_this_unigen_valley/
> ; the highest is 2700 on extreme hd preset. I had 2900 on stock voltage. Doesn't seem to be the case with fiji, though amazingly, unigine scores seem to scale linearly with clock.
> 
> But most benchmarks simply run in too low resolution for either the hawaii or fiji cards to compete with nvidia. Passmark and userbenchmark (lol) run at like 640x480 resolution.
> 
> So back to my card. A few days of (I assume) bubble relocation has it quieter at idle now, though still a bit louder than an air-cooled card. Under load it's much happier. Been playing gta v while doom downloads ($30 sale!) and averaging 70 fps on basically ultra at 1440.
> 
> Overclocking is...basically nonexistent. From 500 mhz the hbm goes up to 570 with no trouble. At 575 it artifacts. That's a decent boost. Clock is stable at 1090 mhz at -24 mV (1226 VID), or 1100 at +18 mV (1268). That's 42 mV per 10 mhz, aka absolutely abysmal scaling even by amd standards. The flip side, as i said, is that performance seems to scale near linearly with clock.
> 
> On a probably-unstable max-voltage overclock (1325mV vid I think), I pushed a 5230 in timespy. Still 100 points behind the 4690k+furyx leader, @gupsterg. Might be within striking range but you're 100 mhz higher on CPU clock than I can bench at.
> 
> Next up: modding the bios.
> 
> One question: I have a spare phanteks MP and XP fans. To throw that on the back of the rad in push-pull what size screws do I need? Are rad screws standard? Strangely the radiator did not come with an extra pair.


I replaced the stock fan with dual Cougar Vortex fans for a while, then with a Phanteks XP. If you're looking for nearly silent, get the Phanteks. It doesn't actually help with temps, but it does with the noise. I have them and all my fans hooked up to a Grid2+, and the speed ramps (with GPU temp) pretty aggressively.

I'm on a quest to lower the VR VDDC temps on my Fury X: replaced the thermal paste, took off the plastic shroud, attached copper heatsinks to the backside of the VRM, but no real improvements. I have noticed the card is held back in a major way by temps, and I'm seeing whether it's the VRM or something else.

Stock DPM7 is 1.250V (the highest possible) and adding +48mV seems to be the sweet spot. Pushing over 1.3V heats up the VRM too much. At about 54°C VR VDDC the stability starts going. I've managed up to 60°C and 1148MHz/545MHz but it's unstable in games. With BF1 being temperamental with OC'ing, I'm back to stock waiting for new drivers and a patch.


----------



## jdorje

GPU core temps are only 3-4C above the loop temp though. Assuming the loop temp is measured inside the metal of the water block, that leaves very little room for improvement.

To get the temp down, then, you need to lower the water temp. A second fan may help slightly here, but increasing the rad space would be the only real way.

VRM temps are hotter, and I'm guessing replacing the pads there with top-end Fujipoly ones would lower their temp delta. But that delta is still measured from the water temp, which is very high.
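The reasoning above can be sketched with a back-of-envelope loop model: water temperature sits at ambient plus heat load times the radiator's effective thermal resistance, so doubling rad area roughly halves the delta. The resistance and wattage values below are illustrative assumptions, not measurements:

```python
# Water temp = ambient + heat load * radiator thermal resistance (illustrative values).
def water_temp_c(ambient_c, heat_w, rad_c_per_w):
    return ambient_c + heat_w * rad_c_per_w

single_rad = water_temp_c(25, 250, 0.10)  # hypothetical single 120 mm rad at ~0.10 C/W
double_rad = water_temp_c(25, 250, 0.05)  # double the rad area, roughly half the resistance
print(round(single_rad, 1), round(double_rad, 1))  # 50.0 37.5
```

Component temps then stack small fixed deltas (block, pads) on top of that water temp, which is why better pads can't fix a hot loop.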

I have an MP (on another cooler) and an XP. I guess I'll get some screws and see how the XP works. Might end up just putting on two MPs though. I thought the Typhoon was supposed to be the best rad fan? How is it so much louder than my MPs?


----------



## jearly410

Quote:


> Originally Posted by *jdorje*
> 
> Gpu core temps are only 3-4c above the loop temp though. Assuming the loop temp is measured inside the metal of the water block that leaves very little room for improvement.
> 
> To get the temp down then you need to lower the water temp. A second fan may help here slightly but increasing the rad space would be the only real way.
> 
> Vrm temps are hotter and I'm guessing replacing the pads there with top end fujipoly ones would lower their temp delta. But that delta is still measured from the water temp which is very high.
> 
> I have an mp (on another cooler) and an xp. I guess I'll get some screws and see how the xp works. Might end up just putting on two mps though. I thought the typhoon was supposed to be the best rad cooler? How is it so much louder than my mps?


I thought the Typhoons were supposed to "be the best", however I could hear mine clearly above the rest of my system, which isn't even meant to be silent. At full tilt the Typhoon was unbearable. Maybe it's just my bad luck? And how will you hook up the Phanteks, with a controller or directly to the GPU?

I have some fujipoly already, haven't applied it yet due to being lazy, however if I have time tonight or soon, I'll replace the stock pads and see what happens. My usual stress game is battlefront so that's what I'll test with.


----------



## jdorje

I was originally planning to just put the xp on and hook it up to mobo header.

However, now I'm thinking maybe I'll replace the Typhoon completely. But that needs splicing or a special adapter, as the GPU header isn't a standard 4-pin?

Edit: to hijack this post, I uploaded my stock bios rom to gpu-z, and amazingly it was not already in the database. Has XFX just released a new bios on these? https://www.techpowerup.com/vgabios/186797/186797

Edit again: CA-PWM-02 (5-pin to 4-pin adapter) is something like $7 online. For $8 there's a moddiy one that's a 3-way splitter. Think that fan header can handle 3 fans?


----------



## gupsterg

Quote:


> Originally Posted by *jearly410*
> 
> I'm on a quest to lower my vr vddc temps on my fury x, replaced thermal paste, took off plastic shroud, attached copper heatsinks to backside of vrm but no real improvements. I have noticed the card is held back in a major way by temps and I'm seeing if it's the vrm or something else.


On Fury/X, the PowerPlay temp limit for both VRMs is 105°C; if reached, it results in a GPU/HBM clock drop. In the IR3567B's registers, VR_Hot for both VRMs is 127°C; if reached, GPU/HBM clocks drop. OTP is 134°C; if reached, the card shuts down.
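For reference, the limit ladder just described can be laid out as data (values as stated in the post; the labels are informal names, not official register identifiers):

```python
# Fiji VRM thermal thresholds from the post, lowest to highest, with the stated response.
fiji_vrm_limits = [
    ("PowerPlay VRM temp limit", 105, "GPU/HBM clocks drop"),
    ("IR3567B VR_Hot",           127, "GPU/HBM clocks drop"),
    ("IR3567B OTP",              134, "card shuts down"),
]
for name, limit_c, action in fiji_vrm_limits:
    print(f"{name}: {limit_c} C -> {action}")
```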

I have not changed TIM/pads on my Fury X, I see on GPU VRM highest temp as ~80°C, HBM VRM highest temp ~50°C. I reduced PowerPlay VRM temp limits to 90°C and 60°C to protect card, below is recent 1hr+ of 3DM FSE Demo looped.



I did see slightly better GPU VRM temps on the Fury Tri-X, as those mosfets are cooled by an independent heatsink. On the Fury X I reckon the GPU VRM temps are slightly higher due to how coolant flows from the GPU to the VRM.

The GPU/HBM VRM mosfets are the same, IRF6811 & IRF6894. I don't believe we are pushing them to temp "limits" that affect OC'ing, even with stock AIO/TIM/pads.
Quote:


> Originally Posted by *jearly410*
> 
> Stock dpm7 is 1.250 (highest possible) and adding +48mv seems to be the sweet spot. Pushing over 1.3 heats up vrm too much. At about 54 Celsius vr vddc the stability starts going. I've managed up to 60 Celsius and 1148mhz 545mhz but unstable in games. With bf1 being temperamental with oc'ing I'm back to stock waiting for new drivers and patch.


IMO 54/60°C VRM temps would not affect the OC at all (page 5, fig 12 of the datasheet has a graph). I reckon the GPU itself is temperamental with the OC; I found some of my cards did the same. The one I kept has performed flawlessly for every purpose @ 1145 (1.268V) / 545 (1.325V).


----------



## jearly410

@gupsterg I've never seen my VRM over 70°C, so I have never experienced throttling. After replacing pads and TIM, again no noticeable difference. Will have numbers tomorrow hopefully. Weirdly, I think my mobo died during all my testing (Gigabyte Gaming 3 Z170, has been problematic before), so things are on hold. Trying to solve one problem creates two more.

That's sad to hear about my OC dreams being crushed. At stock it still rocks so I'm not upset, just disappointed.

Back to the drawing board!


----------



## gupsterg

You have sweet VRM temps IMO.

Sorry to read about your mobo issues; I hope the RMA is a swift/easy process.

My "wild guess" is that due to the transistor count/die size, Fiji is a poorer OCer than other generations of AMD cards.


----------



## gupsterg

Quote:


> Originally Posted by *jdorje*
> 
> I was originally planning to just put the xp on and hook it up to mobo header.
> 
> However now I'm thinking maybe replace the typhoon completely. But that needs a splicing or special adapter as the gpu header isn't a 4 standard 4 pin?
> 
> Edit: to hijack this post, I uploaded my stock bios rom to gpu-z, and amazingly it was not already in the database. Has XFX just released a new bios on these? https://www.techpowerup.com/vgabios/186797/186797
> 
> Edit again: CA-PWM-02 (5-pin to 4-pin adapter) is something like $7 online. For $8 there's a moddiy one that's a 3-way splitter. Think that fan header can handle 3 fans?


In post 9792 I've added manufacturer links. It's a standard mini 4-pin PWM plug/setup on the PCB.



Don't know how much power the GFX PCB can give to the fan header, so my plan was to remove the +/- pins from the mini 4-pin plug and connect them to molex to power the fans. On the standard 4-pin end of the Gelid lead I was going to use a Y splitter I have, which would share the PWM signal from the GFX PCB to each fan, with only 1 fan on the RPM signal wire to get monitoring data for it in apps.

The XFX ROM is old; this is still the latest (ref ROM P/N, version, and compile date in hex editors).


----------



## jdorje

@gupsterg I did break your timespy 4690k+furyx record. You've got some work to do to reclaim it!

http://www.3dmark.com/compare/spy/612886/spy/467838

* CPU 48x, uncore 44x, ram 2400-11. Voltages on all of them were high.

* GPU was 1165 MHz and HBM 570. VID is 1250 (auto) + 66, so theoretically about 1316 mV. The key here was setting the fan speed to maximum, which kept temps considerably down. I disabled HWiNFO for the main run, but it was probably somewhere in the 40s on core and 50s on VRMs. The lower temps (probably mainly VRM) meant a considerably higher clock ended up being stable.

* I actually forgot to do this on every run, but disabling freesync and turning my monitor down to 60 hz helped me in the past.

* Looks like the 6600k, 4670k and probably all other i5 scores are lower. Yay!

* Next up: firestrike. Or maybe it's straight to bios modding.

Thanks for all your help so far.


----------



## bluezone

Ooooooo. new drivers and it's even my B Day.

Crimson 16.10.2

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16-10-2-Release-Notes.aspx

Win 10 64,

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Arizonian

So awesome having a Battlefield 1 game-day-ready driver! I'll download it at midnight. Perfect timing.


----------



## bluezone

This is an interesting read in the GPU section for possible VEGA features.

http://www.eurogamer.net/articles/digitalfoundry-2016-inside-playstation-4-pro-how-sony-made-a-4k-games-machine

For example:
Quote:


> "One of the features appearing for the first time is the handling of 16-bit variables - it's possible to perform two 16-bit operations at a time instead of one 32-bit operation," he says, confirming what we learned during our visit to VooFoo Studios to check out Mantis Burn Racing. "In other words, at full floats, we have 4.2 teraflops. With half-floats, it's now double that, which is to say, 8.4 teraflops in 16-bit computation. This has the potential to radically increase performance."


So is that why the Vega leak indicates 24 TFLOPS performance?
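The doubling quoted above is straightforward arithmetic: peak FLOPS = shaders x clock x 2 (FMA counts as two ops), and packed FP16 doubles that again. A rough sketch using the PS4 Pro figures (2304 shaders at ~911MHz is my assumption, taken from public specs):

```python
def peak_tflops(shaders: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Peak single-precision TFLOPS; FMA counts as 2 ops per shader per clock."""
    return shaders * clock_ghz * ops_per_clock / 1000.0

fp32 = peak_tflops(2304, 0.911)        # ~4.2 TFLOPS at full floats
fp16 = 2 * fp32                        # packed half-floats double the throughput
print(round(fp32, 1), round(fp16, 1))  # 4.2 8.4
```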


----------



## gupsterg

Quote:


> Originally Posted by *jdorje*
> 
> @gupsterg I did break your timespy 4690k+furyx record. You've got some work to do to reclaim it!
> 
> http://www.3dmark.com/compare/spy/612886/spy/467838


Nice score.

My daily usage ROM is 1145/545 with just a bump to 1175 via MSI AB, Crimson v16.10.2; result (driver not approved by 3DM yet). The run was artifact free (demo+tests), but it isn't a clock I have stability tested like 1145/545, which has a lot of hours of [email protected], 3DM FS/E, Heaven, Valley and gaming.
Quote:


> Originally Posted by *jdorje*
> 
> Thanks for all your help so far.


No worries, you're welcome.


----------



## jearly410

@jdorje @gupsterg Mind if I join in? http://www.3dmark.com/compare/spy/616463/spy/617187/spy/612886#


----------



## jdorje

Nice.

How'd you get a better CPU score while still at 4800?


----------



## jearly410

@jdorje
6600k vs 4690k?
Does ram speed matter?
I'm new to 3DMark so I don't know all of the variables yet.


----------



## jdorje

Oh, 6600k. Yeah you can't join in then; no way a 4690k can match the Skylake. Go found your own club.

Though I think you can move up in clubs, just not down. If you have a Fury you can beat Fury Xs and brag, but not vice versa.


----------



## gupsterg

Quote:


> Originally Posted by *jearly410*
> 
> @jdorje @gupsterg Mind if I join in? http://www.3dmark.com/compare/spy/616463/spy/617187/spy/612886#


Good score, I like seeing DC vs SL benches. Viewing the Graphics score/tests there is no extra scaling on SL vs DC at the same clocks (ie you vs jdorje), so basically Fiji on DC is not being held back by it. Means I'll hang on to mine still yet (+rep). Now on the CPU tests there are some nice gains on SL vs DC, clock for clock.

I'll match my i5 4690K to your 6600K @ 4.8GHz and bench at the same GPU clock as you; I'd be interested to see Heaven/Valley bench results.
Quote:


> Originally Posted by *jearly410*
> 
> Does ram speed matter?


No idea. I did once try tweaking my RAM timings, but as I had errors in a memory test I just left them at XMP 2400MHz; the only thing I tweak from the profile is 2T to 1T.
Quote:


> Originally Posted by *jdorje*
> 
> Oh, 6600k. Yeah you can't join in then; no way a 4690k can match the skylake. *Go found your own club.*


It's just some friendly comparing between owners. The bold text is uncalled for IMO; this is the Fiji owners club, so CPU does not come into it.


----------



## Willius

Quote:


> Originally Posted by *jdorje*
> 
> Oh, 6600k. Yeah you can't join in then; no way a 4690k can match the skylake. Go found your own club.


I hope that was a joke. In any case a little more nuance goes a long way; even an emoticon will make you sound less like an ...

Back on topic, I believe I've read somewhere that ram timings do matter in 3DM Time Spy.

Edit:

http://overclocking.guide/optimizing-ram-performance-time-spy/


----------



## jdorje

Yes, a joke. I actually forgot the board supports emoticons. What I meant was with a 6600k you need to top out on this chart, which you'll do. A 6600k will score 10% higher on CPU than a 4690k, so we can only really compare graphics scores.

But the latest drivers are non-WHQL, which means (?) they won't get approved. Need to downgrade to last week's WHQL drivers or wait another couple weeks.

Or... the top FS score is so far above what I can get. And it's from spring. I bet the driver from that day, if I could find it, would allow some good scoring.


----------



## Thoth420

Quote:


> Originally Posted by *jdorje*
> 
> Yes a joke. I actually forgot the board supports emoticons. What i meant was with a 6600k you need to top out on this chart . Which you'll do. 6600k will score 10% higher on cpu than 4690k so we can only really compare graphics scores.
> 
> But the latest drivers are non whql, which means (?) they won't get approved. Need to downgrade to last week's whql drivers or wait another couple weeks.
> 
> Or...the top fs score is so far above what i can get. And it's from spring. I bet the driver from that day, if i could find it, would allow some good scoring.


Here you go sir (bookmark and enjoy):

http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064


----------



## jearly410

Anyone have experience crossfiring a Fury X and a Fury (non-X)? I have a 750W PSU, and with my Fury X OC'd I see ~450W from the wall. Add in another Fury and I'm guessing it would reach up to 650W. Can anyone verify?

I can always undervolt both to bring the power draw way down if needed; will it affect performance too much?

Thanks!

Edit: a -66mV undervolt at stock clocks dropped ~100W from the wall.
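A back-of-the-envelope sketch of that headroom question; the second-card draw and PSU efficiency below are assumptions, not measurements:

```python
def psu_headroom_w(wall_draw_w: float, added_card_w: float,
                   efficiency: float, psu_rating_w: float) -> float:
    """Estimate DC-side headroom after adding a card.

    The PSU rating is for DC output, but wall readings include conversion
    loss, so convert the measured wall draw to DC via efficiency first.
    """
    dc_now = wall_draw_w * efficiency
    return psu_rating_w - (dc_now + added_card_w)

# ~450W at the wall today; assume a second Fury adds ~275W DC at ~90% efficiency
print(psu_headroom_w(450, 275, 0.90, 750))  # ~70W of margin left on a 750W unit
```

Transient spikes can exceed averages by a lot (as Alastair's OCP trips later in the thread show), so a thin margin like this is why a bigger PSU is the comfortable call.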


----------



## gupsterg

WOW, nice reduction in power for an undervolt. What clocks do you use with that undervolt?


----------



## jearly410

Stock 1050/500. I won't budge on the clocks.
The Fury Tri-X OC+ arrives this week and has stock 1050/500 as well, which makes my life easier. I'll check to see if cores can unlock (doubt it), then undervolt and crossfire. If all goes well I'll be hitting a steady 75fps on ultra at 3440x1440 in Battlefield 1.









I'm also putting an order in for a 1000W psu for peace of mind. I'm fairly certain the 750 can handle it after undervolting but I'd like to have a comfy cushion.

With a 1000W PSU I will be pushing these two to their limits and not care about the power draw.


----------



## gupsterg

My Fury X only went down by -25mV at 1050/500. My stock VID for that DPM is 1.212V, so basically 1.187V. I recall you stating yours was DPM 7 1.250V? The -66mV would really be -68.75mV, so you'd be at 1.181V IMO.

Yeah, it'd be great if that Tri-X unlocks; rooting for it, chap.
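The step-rounding gupsterg describes can be sketched like this, assuming Fiji's 6.25mV VID granularity and rounding toward a larger undervolt (both assumptions based on the behaviour reported above):

```python
import math

STEP_MV = 6.25  # assumed SVI2 VID step size on Fiji

def applied_offset_mv(requested_mv: float) -> float:
    """Quantize a requested voltage offset to the regulator's step size,
    rounding toward the next larger undervolt (as observed above)."""
    return math.floor(requested_mv / STEP_MV) * STEP_MV

offset = applied_offset_mv(-66)   # -66 mV requested -> -68.75 mV applied
final_vid = 1250 + offset         # from a DPM7 stock VID of 1.250 V
print(offset, final_vid / 1000)   # -68.75 1.18125
```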


----------



## jearly410

Exactly right gup. You know your stuff!

I have to redact what I said about the 100W decrease. Seems I jumped the gun, and further testing must be done. I'm rock solid at -60mV through TriXX in Battlefront and Battlefield 1 at least.


----------



## gupsterg

Cheers, still learning mate.

So surprising that even though our stock DPM 7 differs by a large amount, we're very similar on the final undervolted VID.

You may end up with differing W readings depending on what is loading the GPU, regardless of whether the GPU is at its highest state (ie 1050/500). You may note in monitoring that when you run 3DM/Heaven/Valley, each may end up at a different average VDDC even though clocks were the same. Even if we set the DPM VID manually, there is variable voltage occurring.


----------



## eperelez

Can anyone here verify if this adapter will work on Fury cards to get 60Hz @ 4K on HDMI 2.0?
http://www.newegg.com/Product/Product.aspx?Item=N82E16812232065&cm_re=vantec_displayport_to_hdmi-_-12-232-065-_-Product


----------



## LionS7

Quote:


> Originally Posted by *jearly410*
> 
> Exactly right gup. You know your stuff!
> 
> I have to redact what I said about 100w decrease. Seems I jumped the gun and further testing must be done. I'm rock solid -60mv through trixx in battlefront and battlefield 1 at least.


Do you have DX errors in Battlefront with unlocked fps? Like "graphics device is removed", something like that? It is not only with Radeon cards.


----------



## jearly410

Quote:


> Originally Posted by *LionS7*
> 
> Do you have DX errors in Battlefront with unlocked fps ? Like "graphics device is removed", something like that ? It is not only with Radeon cards.


I get that error in both Battlefront and Battlefield 1. BF4 also has it. It's inherent to the Frostbite engine and happens when my overclock is unstable. That's why I test stability with Battlefront; it has proven to be the toughest on GPU overclocks.

I once got it in battlefield 1 while in dx12 mode, never when in dx11 at stock clocks.

If I get a chance to test with unlocked fps at stock clocks in battlefront I'll let you know.


----------



## LionS7

Quote:


> Originally Posted by *jearly410*
> 
> I get that error in both Battlefront and Battlefield 1. BF4 also has it. It's inherent to the Frostbite engine and happens when my overclock is unstable. That's why I test stability with Battlefront; it has proven to be the toughest on GPU overclocks.
> 
> I once got it in battlefield 1 while in dx12 mode, never when in dx11 at stock clocks.
> 
> If I get a chance to test with unlocked fps at stock clocks in battlefront I'll let you know.


An unstable overclock on the video card? Because the error shows for me at 1.23V 1100MHz core, and at 1.26V too. I don't think it is 100% an unstable OC, I don't know... The EA forums don't tell me much, other than that it is for sure an unstable OC.


----------



## Fguarezi

Fury Tri-X OC+ @ 1115/560 - i5 @ 4.8GHz (uncore 4.7)

Fire Strike 1.1

http://www.3dmark.com/3dm/15620112

https://uploaddeimagens.com.br/imagens/fire_4800_uc47_1115-560_3840_8x-jpg

Time Spy 1.0

http://www.3dmark.com/3dm/15620283?

https://uploaddeimagens.com.br/imagens/spy_4800_uc47_1115-560_3840_8x-jpg


----------



## ressonantia

Quote:


> Originally Posted by *jearly410*
> 
> I get that error in both Battlefront and Battlefield 1. BF4 also has it. It's inherent to the Frostbite engine and happens when my overclock is unstable. That's why I test stability with Battlefront; it has proven to be the toughest on GPU overclocks.
> 
> I once got it in battlefield 1 while in dx12 mode, never when in dx11 at stock clocks.
> 
> If I get a chance to test with unlocked fps at stock clocks in battlefront I'll let you know.


If you don't mind me asking, how do you test stability with SW:Battlefront? Do you just fire it up and play a couple of rounds or something? I managed to get my Fury undervolted to [email protected], but it's not entirely stable in Battlefront. It passes the 3DMark Fire Strike Extreme stress test though.


----------



## diggiddi

Guys, which is faster with these cards: more stream processors or higher frequency? Also, which Fury non-X overclocks the highest?


----------



## jearly410

Quote:


> Originally Posted by *ressonantia*
> 
> If you don't mind me asking, how do you test stability with SW:Battlefront? Do you just fire it up and play a couple of rounds or something? I managed to get my Fury undervolted to [email protected], but it's not entirely stable in Battlefront. It passes the 3DMark Fire Strike Extreme stress test though.


Yep, that's it. I fire up a solo battle on Endor and park my character inside, looking out the window. If it hasn't crashed in 5 min, I'll play until I'm satisfied (I will usually have a value I'm keeping an eye on). HWiNFO + RivaTuner to monitor my stats. Ultimately I overclock to gain fps in games, and if a game I play crashes because of an overclock, then it isn't stable and I try something new.

Using benchmarks to test stability hasn't equated to better/stable performance in games. What benchmarks ARE good for is a way to compare yourself against what is already known and an empirical way to see if the overclock is helping. Fiji is notorious for negative voltage scaling, so I find a benchmark to be the best way to see the incremental improvements.

Combine those two and you should have a very stable overclock at the best settings, IMO. There is never a point where I am 100%, without a doubt, certain an overclock is stable with everything; that's just how it is. Figuring out where to draw the line is a question I think every overclocker must answer for themselves.


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> Do you have DX errors in Battlefront with unlocked fps ? Like "graphics device is removed", something like that ? It is not only with Radeon cards.


I've got 75hrs+ on SWBF and never once had a crash out with that error.


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> I've got 75hrs+ on SWBF and never once had a crash out with that error.


So is it normal for The Witcher 3, which is a very heavy game, to want 1.23V for stable work, but for Battlefront to crash even with 1.26V?


----------



## gupsterg

Unfortunately I don't have The Witcher 3 yet (I have only 1 & 2); hoping to grab it in the Christmas Steam sale. I can't say I've had any issues relating to the OC in any games I've played. I set the ROM when I was using a 1080P screen; now I'm on 1440P with FreeSync and haven't had an issue. Recently replayed Crysis 2 & 3 and grabbed Tomb Raider 2013 on promo. I don't increase or decrease voltage depending on what I'm gonna load the GPU with either. I can only presume your GPU needs a higher VID to stabilise the OC.


----------



## Skyl3r

Hello fellow Fiji people!
Just figured I'd share my TimeSpy results as well.
http://www.3dmark.com/spy/630564
*Overall:* 8,843
*Graphics:* 11,471
*CPU:* 3,848

I can't seem to beat those pesky i7 users...
I'm currently running an 850W PSU, so my overclocking is relatively limited. I'll be going dual PSU before the end of the week in preparation for 4-way XFire. I hope to increase my score significantly on two cards before moving up to 4, though.


----------



## jearly410

Quote:


> Originally Posted by *Skyl3r*
> 
> 
> Hello Fellow Fiji people
> 
> Just figured I'd share my TimeSpy results as well.
> http://www.3dmark.com/spy/630564
> *Overall:* 8,843
> *Graphics:* 11,471
> *CPU:* 3,848
> 
> I can't seem to beat those pesky i7 users...
> I'm currently running with an 850w PSU, so my overclocking is relatively limited. I'll be going dual PSU before the end of the week in preparation for 4-way XFire. I hope to increase my score significantly on two cards before moving up to 4 though.


Awesome core clocks! How much power is your rig pulling during the test? I'm curious if my 750W PSU can handle a Fury X + Fury non-X.


----------



## diggiddi

Quote:


> Originally Posted by *jearly410*
> 
> Awesome core clocks! How much power is your rig pulling during the test? I'm curious if my 750w psu can handle a fury x + fury non-x.


Dude, you should be OK; my Antec HCG 750W was able to handle an OC'd 8350 and OC'd 290X Lightnings in crossfire.


----------



## Skyl3r

Quote:


> Originally Posted by *jearly410*
> 
> Awesome core clocks! How much power is your rig pulling during the test? I'm curious if my 750w psu can handle a fury x + fury non-x.


I can tell you that my 850w won't handle much overvolting. I don't know the exact figures off hand.
I'm running +40mV on the GPUs and 1.46v on the CPU.

If you're gonna do that, you'd definitely want to look at getting a new PSU sooner rather than later. You're gonna be running very close to capacity.


----------



## jearly410

Quote:


> Originally Posted by *Skyl3r*
> 
> I can tell you that my 850w won't handle much overvolting. I don't know the exact figures off hand.
> I'm running +40mV on the GPUs and 1.46v on the CPU.
> 
> If you're gonna do that, you'd definitely want to look at getting a new PSU sooner rather than later. You're gonna be running very close to capacity.


I've been testing overvolting/undervolting my single Fury X, and with an OC'd 6600k plus tons of peripherals I'm seeing peaks near 550W with the average near 440W. I'm almost certain I'll be OK, but I put in an order for a 1000W to be safe. Thanks.


----------



## diggiddi

Quote:


> Originally Posted by *jearly410*
> 
> I've been testing overvolting/undervolting my single fury x and with an oc 6600k plus tons of peripherals in seeing peaks near 550ish with avg near 440. I'm almost certain I'll be ok but I put in an order for 1000w to be safe. Thanks.


Yeah, a 1000W PSU is best, but you can work with the 750W till it comes; I am combining my 750 with a 550W atm.


----------



## Alastair

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jearly410*
> 
> Awesome core clocks! How much power is your rig pulling during the test? I'm curious if my 750w psu can handle a fury x + fury non-x.
> 
> 
> 
> I can tell you that my 850w won't handle much overvolting. I don't know the exact figures off hand.
> I'm running +40mV on the GPUs and 1.46v on the CPU.
> 
> If you're gonna do that, you'd definitely want to look at getting a new PSU sooner rather than later. You're gonna be running very close to capacity.
Click to expand...

It won't make it on 850 watts. An 8370 at 5GHz and two Furys, and my Be Quiet! Dark Power Pro P10 850W couldn't manage; I kept tripping OCP.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> It wont make it 850 watt. 8370 at 5GHz and two Fury's and my Be Quiet! Dark Power Pro P10 850W couldn't manage. I kept tripping OCP.


If that unit can't do the job, no 850W one will. Great power supply; same guts as mine if I am not mistaken.


----------



## ManofGod1000

Well, I am now running a Sapphire R9 Fury Nitro+ and a Sapphire R9 Fury Tri-X in Crossfire. I also am running an FX 8300 at 4.5GHz, which is set to 1.425V and runs at 1.440V. Except for Crysis 3 itself, which runs up to 730W at 4K with all settings at very high, everything else I have runs at 600 watts or less. (Seems only Crysis 3 uses all the hardware fully, and no, I do not and never will run Furmark.)
I am running all this off a Thermaltake M850W and it is running very well. (I do not overclock the cards, since I have tried and they don't do very much.) Also, I am playing around with undervolting anyway.

The only thing I have learned recently is, if you use a UPS / battery backup, it must be able to handle everything without an overload condition. This forced me to move a CyberPower 850VA unit and buy an APC 1500VA unit. I purchased what I thought was a sufficient UPS when my other one died last month, before I did Crossfire. Even without Crossfire I had a couple of overload conditions, and let me tell you all: when it says overload, do not ignore it. (The thing shut off last night because of that.)


----------



## jdip

Hey everyone,

After holding off on upgrading my trusty old 6950 for the longest time (and at 1440p!), I finally pulled the trigger and ordered a Sapphire R9 Fury Nitro! Looking forward to the huge boost in performance and not playing games on low anymore


----------



## josephimports

Quote:


> Originally Posted by *jdip*
> 
> Hey everyone,
> 
> After holding off on upgrading my trusty old 6950, I finally pulled the trigger and ordered a Sapphire R9 Fury Nitro! Looking forward to the huge boost in performance and not playing games on low anymore


Sounds like good times ahead. Enjoy.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> It wont make it 850 watt. 8370 at 5GHz and two Fury's and my Be Quiet! Dark Power Pro P10 850W couldn't manage. I kept tripping OCP.
> 
> 
> 
> If that unit can't do the job no 850W one will. Great power supply same guts as mine if I am not mistaken.
Click to expand...

Yes, I actually thought at first my PSU was malfunctioning, so I sent it off on warranty. I then received it back with an entire report on the tests they did and how solidly the Seasonic-made unit passed said tests. (Not surprising, though.)


----------



## Alastair

Quote:


> Originally Posted by *ManofGod1000*
> 
> Well, I am now running a Sapphire R9 Fury Nitro + and Sapphire R9 Fury Tri X in Crossfire.
> 
> I also am running an FX 8300 at 4.5 Ghz which is set to 1.425v and it runs at 1.440v. Except for Crysis 3 itself, which runs up to 730w at 4k and all settings at very high, everything else I have runs at 600 Watts or less. (Seems only Crysis 3 uses all the hardware fully and no, I do not and never will run Furmark.)
> I am running all this off a Thermaltake M850w and it is running very well. (I do not overclock the cards since I have tried and they don't do so very much.) Also, I am playing around with under volting anyways.
> 
> The only thing I have learned recently is, if you use a UPS / Battery Backup, it must be able to handle everything without an overload condition. This forced me to move a Cyberpower 850VA unit and buy a APC 1500VA unit. I purchased what I thought was a sufficient UPS when my other one died last month and before I did crossfire. Even without crossfire, I had a couple of overload conditions and let me tell you all, when it says overload, do not ignore it. (Thing shut off last night because of that.


The thing is, according to my wall meter I was normally consuming 680-740 watts at the wall while gaming. CPU- and GPU-intensive games like Battlefield 4 pushed that up to 800 watts at the wall, so once taking efficiency into account I probably had another 100 watts of headroom. But it was still tripping. Turns out there were very, very short spikes that the wall meter couldn't detect that were tripping OCP. When I got my HCP-1300 and did tests with CPU and GPU loaded together (Prime + Heaven), I was hitting 1100 watts; gaming is still normally in the ~800 watt range. Although keep in mind I have close on 100 watts worth of fans in my rig and a D5 on setting 5.


----------



## gupsterg

Quote:


> Originally Posted by *jdip*
> 
> Hey everyone,
> 
> After holding off on upgrading my trusty old 6950 for the longest time (and at 1440p!), I finally pulled the trigger and ordered a Sapphire R9 Fury Nitro! Looking forward to the huge boost in performance and not playing games on low anymore


Enjoy.

This springs to mind for how you will feel with the upgrade!


----------



## ManofGod1000

Quote:


> Originally Posted by *Alastair*
> 
> The thing is according to my wall meter, I was normally consuming 680-740 watts at the wall gaming. CPU and GPU intensive games like Battlefield 4 pushed that up to 800 watts at the wall. So I mean once taking efficiency into account I probably had another 100 watts. But it was still tripping. Turns out there were very very short spikes that the wall meter couldn't detect that was tripping OCP. When I got my HCP-1300 and did tests both cpu and GPU combined (prime + heaven I was hitting 1100 watts. Gaming the normal is still in the ~800 watt range.) Although keeping in mind I have close on 100 watts worth of fans in my rig and a D5 on setting 5.


Thanks. Yeah, I only have 4 case fans, a Noctua NH-D15 with its 2 fans, 2 SSDs and 2 hard drives. I did not even bother installing my Blu-ray optical drive in the new case I also bought (Cooler Master MasterCase 5, the one just below the Pro).

Although, if I end up with the same issues you did, I will upgrade my PSU as well. (Thankfully I have not, but then again I am also not overclocking to 5.0GHz and do not have anywhere near the number of case fans you do.)


----------



## jdorje

I hooked up my Phanteks MP in series with the Typhoon on the exhaust X rad.

Overall it's a bit disappointing. Based on a tedious number of benchmarks, I estimate the MP at max RPM cools about as well as (pushes as much air as) the Typhoon at 35-40% RPM. The MP is much, much quieter on max than the Typhoon on 35%, but it's pretty hard to make matching fan curves that balance noise with cooling. Ideally I think I'd want to ramp the MP up to 100% first - as soon as the card starts getting warm - and then only ramp up the Typhoon when it gets actually hot.

Is there any way to have the fan curve controlled by the water temp? It's pretty annoying that the fans turn down whenever I tab out of games even though the water temp is still 50+.
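For the water-temp idea: software or controllers that expose a liquid-temp sensor can drive a curve from it, and the two-stage ramp described above might look like this rough sketch (all breakpoints here are assumptions, not tuned values):

```python
def fan_duty(water_temp_c: float, start_c: float, full_c: float) -> float:
    """Linear fan ramp: 0% duty at start_c, 100% at full_c, clamped outside."""
    if full_c <= start_c:
        raise ValueError("full_c must exceed start_c")
    frac = (water_temp_c - start_c) / (full_c - start_c)
    return max(0.0, min(1.0, frac))

# Quiet MP ramps early (30-40C water); loud Typhoon only joins late (40-55C)
for t in (28, 35, 45, 55):
    mp, typhoon = fan_duty(t, 30, 40), fan_duty(t, 40, 55)
    print(f"{t}C water -> MP {mp:.0%}, Typhoon {typhoon:.0%}")
```

Keying off liquid temperature instead of core temperature also gives natural hysteresis, since the loop cools down slowly after you tab out of a game.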

I will also put in a plug for the MP. Its max RPM is only about 1800, but it's incredibly quiet at that speed. And it's not just the noise level that's awesome, it's the noise profile itself - a gentle whoosh. Tons better than the Typhoon's buzzsaw or any other fan I've owned.


----------



## AngryLobster

Quote:


> Originally Posted by *jdorje*
> 
> I hooked up my phanteks MP in series with the typhoon on the exhaust x rad.
> 
> Overall it's a bit disappointing. Based on a tedious number of benchmarks I did, I estimate the MP on max RPM cools about as well (pushes as much air as) the typhoon on 35-40% RPM. The MP is much, much quieter on max than the typhoon on 35%, but, it's pretty hard to make matching fan curves to balance noise with cooling. Ideally I think I'd want to ramp up the MP to 100% first - as soon as the card starts getting warm - and then only ramp up the typhoon when it gets actually hot.
> 
> Is there any way to have fan curve controlled by the water temp? Pretty annoying the fans turn down all the time when I tab out of games even though the water temp is still 50+.
> 
> I will also put in a plug for the MP. It's max RPM is only about 1800 rpm, but it's incredibly quiet at that RPM. But it's not just the noise level that's awesome, it's the noise profile itself - a gentle whoosh. Tons better than the typhoon's buzzsaw or any other fan I've owned.


Have you tried undervolting and tested how much fan RPM it allows you to knock off? I remember mine managed -72mV and maintained 65C (4K) around 1200RPM. Any higher I found too loud. You can also lower the minimum RPM a few hundred RPM below what AMD sets out of the box.

Another option, because of how terrible GTs can be at certain RPMs, is to just set a constant tolerable speed (when undervolting), like 1300RPM, so you avoid the drops when alt-tabbing and it keeps temps in check under load.


----------



## jdorje

How do you lower the min RPM? That's actually the most annoying thing. When the card is under load I'm usually wearing headphones, and the minor Typhoon noise at the 65C temp target isn't bad at all.

The card has terrible voltage scaling - 30+ mV per 10 MHz. So lowering voltage will greatly increase performance per watt. But it's not really necessary now, as it's cold here this time of year.


----------



## AngryLobster

Quote:


> Originally Posted by *jdorje*
> 
> How do you lower the min rpm? That's actually the most annoying thing. When the card is under load I'm usually wearing headphones and the minor typhoon nose of the 65c temp target isn't bad at all.
> 
> The card has terrible voltage scaling - 30+ mV per 10 mhz. So lowering voltage will greatly increase performance per watt. But not really necessary now as it's cold here this time of year.


You can set a custom fan curve in afterburner to keep the RPM around 800 at idle/low loads.

IMO these cards have no business being OCd given the pathetic performance gains relative to the heat and power output.

I kept mine at -72mV stock clocks, dropping its TDP by a good 40W and allowing me to make it much quieter.


----------



## xkm1948

I was trying some new adult content on my Vive tonight. Launching the Vive causes an instant graphics driver lock and reset on my Fury X. Tried again with other VR apps; I can confirm 16.10.2 breaks VR completely, at least on my Fury X.

Not cool AMD, not cool at all.


----------



## jdorje

Quote:


> Originally Posted by *AngryLobster*
> 
> You can set a custom fan curve in afterburner to keep the RPM around 800 at idle/low loads.
> 
> IMO these cards have no business being OCd given the pathetic performance gains relative to the heat and power output.
> 
> I kept mine at -72mv stock clocks dropping it's TDP by a good 40w allowing me to make it much quieter.


The clock-to-voltage ratio is terrible. But the power use actually isn't that bad at all. At 1100 mhz and 1268 mV and with my 4690k at 4.6 ghz and 1305 mV, I barely break 400W at the wall (375-405) running heaven and x264 at the same time. This is less than my 390 drew at stock.

Bump that to 1135 mhz and 1322 mV and it becomes 430-460W. So that scaling isn't good.

Drop it to stock 1050 mhz and 1200 mV and it becomes 370-390W. Not that big a savings.

At 1050 mhz and 1134 mV it's 350-355W. At 1128 mV (-72) it crashed heaven within a minute or so. Haven't really stability tested this. So it might be around 1 watt per mhz within the likely mild-oc range.
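Those measurements line up roughly with the usual dynamic-power model, P proportional to f x V^2; a quick sketch checking jdorje's 1100MHz/1268mV and 1135MHz/1322mV points against it (treating the first as the baseline is my choice):

```python
def dynamic_power_ratio(f0: float, v0: float, f1: float, v1: float) -> float:
    """Relative dynamic power under the P ~ f * V^2 model."""
    return (f1 / f0) * (v1 / v0) ** 2

# 1100 MHz @ 1268 mV vs 1135 MHz @ 1322 mV
r = dynamic_power_ratio(1100, 1268, 1135, 1322)
print(f"{(r - 1) * 100:.0f}% more GPU power for a 3% clock bump")  # ~12% more
```

That matches the wall readings above: negative voltage scaling means the voltage term, squared, dominates the power cost of the last few MHz.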


----------



## LionS7

Quote:


> Originally Posted by *jdorje*
> 
> The clock-to-voltage ratio is terrible. But the power use actually isn't that bad at all. At 1100 mhz and 1268 mV and with my 4690k at 4.6 ghz and 1305 mV, I barely break 400W at the wall (375-405) running heaven and x264 at the same time. This is less than my 390 drew at stock.
> 
> Bump that to 1135 mhz and 1322 mV and it becomes 430-460W. So that scaling isn't good.
> 
> Drop it to stock 1050 mhz and 1200 mV and it becomes 370-390W. Not that big a savings.
> 
> At 1050 mhz and 1134 mV it's 350-355W. At 1128 mV (-72) it crashed heaven within a minute or so. Haven't really stability tested this. So it might be around 1 watt per mhz within the likely mild-oc range.


@jdorje, I see that you have a bad chip like mine. I'm at 1.23V for 1100MHz, and maybe at least 1.28V for 1125MHz. I'm not sure haha, but at 1.27V it crashes. What happened with these cards and OC...?


----------



## AngryLobster

Quote:


> Originally Posted by *jdorje*
> 
> The clock-to-voltage ratio is terrible. But the power use actually isn't that bad at all. At 1100 mhz and 1268 mV and with my 4690k at 4.6 ghz and 1305 mV, I barely break 400W at the wall (375-405) running heaven and x264 at the same time. This is less than my 390 drew at stock.
> 
> Bump that to 1135 mhz and 1322 mV and it becomes 430-460W. So that scaling isn't good.
> 
> Drop it to stock 1050 mhz and 1200 mV and it becomes 370-390W. Not that big a savings.
> 
> At 1050 mhz and 1134 mV it's 350-355W. At 1128 mV (-72) it crashed heaven within a minute or so. Haven't really stability tested this. So it might be around 1 watt per mhz within the likely mild-oc range.


Well, your interpretation of the numbers is different from mine. A 50-60 watt drop in consumption and the associated temps greatly outweigh the pathetic 7-9% performance gain. I play at 4K, so we're talking 3 FPS gains.

I don't "play" 3DMark and have never installed that trash in my life. After testing in games between OC and stock, it just wasn't worth it to me.


----------



## Performer81

Quote:


> Originally Posted by *LionS7*
> 
> @jdorje, I see that you have bad chip like mine. Im at 1.23V 1100Mhz and maybe at least 1.28V for 1125Mhz. Im not sure haha, but at 1.27 it crashes. What happened with these cards and OC... ?


Is this the voltage under load (after vdroop) or the max voltage (VID)?


----------



## LionS7

Quote:


> Originally Posted by *Performer81*
> 
> Ist this the voltage under load (after vdrop) or the max voltage (Vid)?


It is the typical voltage under 100% core load.


----------



## Flamingo

Anyone checked their driver version under Crimson?

Mine says 16.6, apparently being pushed out by Windows (10) Update.

I reinstalled yesterday, only to wake up this morning to find it at 16.6 again.


----------



## diggiddi

Have you prevented win 10 from upgrading other software?


----------



## Flamingo

Quote:


> Originally Posted by *diggiddi*
> 
> Have you prevented win 10 from upgrading other software?


Nope. I did turn it off, but then... these drivers resolved a long-running OpenCL issue I had with LuxRender. I even ran DDU and let Windows 10 install its own drivers to make sure. So I turned it back on again lol.

Can someone compare the revisions I showed in the image above with 16.10.2? I want to figure out how far (if at all) I am from the most recent release.

If the Radeon Current Settings in the 2nd image corresponds to a date, then 16-09-2016 puts it on version Crimson Edition 16.9.1 Hotfix (13-09-2016?)


----------



## diggiddi

Spoiler: Warning: Spoiler!

Here is mine for 16.10; sorry it's very blurry, the snipping tool is slipping up.


----------



## LionS7

Quote:


> Originally Posted by *Flamingo*
> 
> Anyone check their driver version under Crimson?
> 
> Mine says 16.6, apparently being push out by Windows (10) update.
> 
> Had reinstalled yesterday, only to wake up this morning to find it 16.6 again.


Just manually keep your drivers up to date to the latest beta and Windows 10 cannot do anything.


----------



## jdip

When my Fury arrives can I just replace my 6950 with it without doing anything in Windows? Or should I uninstall the 6950 drivers and software first and do a fresh install of drivers for the Fury?


----------



## Flamingo

Quote:


> Originally Posted by *diggiddi*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here is mine for 16.10. Sorry it's very blurry, the snipping tool is slipping up.


Thanks!

The OpenCL versions don't follow the same format, but yeah, 16.6 is an older version. I won't be upgrading to the 16.10.3 that released today. Why upgrade when nothing is broken?









(and most likely OpenCL is broken in 16.10.3; dunno, I'll try it some day)


----------



## diggiddi

Quote:


> Originally Posted by *jdip*
> 
> When my Fury arrives can I just replace my 6950 with it without doing anything in Windows? Or should I *uninstall the 6950 drivers and software first and do a fresh install of drivers for the Fury?*


This


----------



## pengs

Quote:


> Originally Posted by *LionS7*
> 
> Do you have DX errors in Battlefront with unlocked fps ? Like "graphics device is removed", something like that ? It is not only with Radeon cards.


I've had that in BF4, and crashing in BF3 while alt-tabbing, on both Nvidia and AMD; it was reminiscent of the same type of crash. I have a feeling it happens when the GPU changes power states; that, in combination with an overclock, over/undervolt and fullscreen mode, produces a display driver crash. In Event Viewer > System you may see a display driver crash; the game is just reporting that when it tried to recover, the graphics card was down.

Windowed borderless mode should alleviate that. I've noticed that many games are more stable in Windows 10 using borderless, and it helps if you're testing overclocks, since a driver crash is much more recoverable (compared to fullscreen mode). Unless you're on FreeSync, in which case you apparently need fullscreen for it to work.


----------



## LionS7

Quote:


> Originally Posted by *pengs*
> 
> I've had that in BF4, and crashing in BF3 while alt-tabbing, on both Nvidia and AMD; it was reminiscent of the same type of crash. I have a feeling it happens when the GPU changes power states; that, in combination with an overclock, over/undervolt and fullscreen mode, produces a display driver crash. In Event Viewer > System you may see a display driver crash; the game is just reporting that when it tried to recover, the graphics card was down.
> 
> Windowed borderless mode should alleviate that. I've noticed that many games are more stable in Windows 10 using borderless, and it helps if you're testing overclocks, since a driver crash is much more recoverable (compared to fullscreen mode). Unless you're on FreeSync, in which case you apparently need fullscreen for it to work.


But is it an unstable overclock? It crashes at 1.23V (which is stable in every game), and at 1.26V too. Is it the game engine...?


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> But is it an unstable overclock? It crashes at 1.23V (which is stable in every game), and at 1.26V too. Is it the game engine...?


Define "every game". With my insanely sized library, ranging from the 90's to now, I find I can get a game crash, driver crash, or something-or-other with any form of OC (and some anyway with everything at stock), on a myriad of systems over the years and a myriad of different operating systems, if I try enough of them. That said, BF1 is a hot new title, and that isn't the same as trying to supersample SWAT 4 at 1440p







but I want to get an idea of what other games you are running frequently without issue, to get a clearer picture. I often say "all games", then realize after the fact that that is subjective to each user.
Quote:


> Originally Posted by *pengs*
> 
> I've had that in BF4 and crashing in BF3 while alt tabbing using both nVidia and AMD which was reminiscent of the same type of crash -- I have a feeling that it happens when the GPU changes power states, that in combination with an overclock, over/undervolt and fullscreen mode - it's a display driver crash. In Event Viewer>System you may see a display driver crash, the game is just reporting that when it tried to recover the graphics card was down.
> 
> Windowed borderless mode should alleviate that. I've noticed that many games are more stable in Windows 10 using borderless and helps if your testing overclocks as a driver crash is much more recoverable (compared to fullscreen mode), unless your Freesync and then you apparently need to use fullscreen to have it work.


I am a FreeSync user, and the only problem I have had running everything in fullscreen (single-monitor setup, though, which probably has some bearing on the situation) is a game crash in Hitman Absolution on this rig, which Steam auto-repaired; the game worked fine after that. That was after a driver change and the Anniversary Update that screwed up my GPU driver, though. This is all on a Windows 10 Pro system, but I had forgotten to disallow hardware updates prior to the Anniversary Update; I haven't had any issue since I changed that setting. The CPU is at 4.6 GHz on stock voltage and the GPU is not OC'd at all... I see no need, as the Fury X is not great at that. However, in my custom loop it never goes over 45C, which has me a bit curious, but with the constant driver swaps I would prefer to wait a bit.


----------



## eperelez

I can confirm this adapter works to get 4K 60Hz from DisplayPort to HDMI 2.0 with my Fury!

Vantec VLink CB-HD20DP12 DisplayPort 1.2 to HDMI 2.0 UHD 4K@60Hz Active Adapter:
http://www.newegg.com/Product/Product.aspx?Item=N82E16812232065&cm_re=vantec_displayport_to_hdmi-_-12-232-065-_-Product

IT IS GLORIOUS!!!!!


----------



## Thoth420

Quote:


> Originally Posted by *eperelez*
> 
> I can confirm this adapter works to get 4K 60Hz from DisplayPort to HDMI 2.0 with my Fury!
> 
> Vantec VLink CB-HD20DP12 DisplayPort 1.2 to HDMI 2.0 UHD 4K@60Hz Active Adapter:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16812232065&cm_re=vantec_displayport_to_hdmi-_-12-232-065-_-Product
> 
> IT IS GLORIOUS!!!!!


Glad it worked out!







What do you have it plugged into?


----------



## eperelez

Quote:


> Originally Posted by *Thoth420*
> 
> Glad it worked out!
> 
> 
> 
> 
> 
> 
> 
> What do you have it plugged into?


Thanks! I picked up a Samsung UN40KU6290FXZA 40" 4K VA LED TV for $347.99 shipped from Amazon! My PC is the first to get the 4K upgrade.


----------



## Thoth420

Quote:


> Originally Posted by *eperelez*
> 
> Thanks! I picked up a Samsung UN40KU6290FXZA 40" 4K VA LED TV for $347.99 shipped from Amazon! My PC is the first to get the 4K upgrade.


Nice, that thing looks really sweet, and that price is a steal. I may pick one up, as I plan to swap my Xbox One in for the Scorpio. I love the console as a backup device, or for the occasional bad PC port that I would play with my controller anyway. Enjoy, duder!


----------



## LionS7

Quote:


> Originally Posted by *Thoth420*
> 
> Define "every game". With my insanely sized library, ranging from the 90's to now, I find I can get a game crash, driver crash, or something-or-other with any form of OC (and some anyway with everything at stock), on a myriad of systems over the years and a myriad of different operating systems, if I try enough of them. That said, BF1 is a hot new title, and that isn't the same as trying to supersample SWAT 4 at 1440p
> 
> 
> 
> 
> 
> 
> 
> but I want to get an idea of what other games you are running frequently without issue, to get a clearer picture. I often say "all games", then realize after the fact that that is subjective to each user.


Games like The Witcher 3, The Evil Dead, Dark Souls III, and Rise of the Tomb Raider are fine. This absurdity happens only with Battlefront, with this DX error. If someone told me it is an error in the game engine, I would believe it, because the problem is widespread.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> Games like The Witcher 3, The Evil Dead, Dark Souls III, and Rise of the Tomb Raider are fine. This absurdity happens only with Battlefront, with this DX error. If someone told me it is an error in the game engine, I would believe it, because the problem is widespread.


Do you have the Origin overlay (or in-game overlay, or whatever it is called) turned on? Also, do you keep your browser open while playing? I assume you already tried reinstalling the DX redist packages included with the game.


----------



## LionS7

Quote:


> Originally Posted by *Thoth420*
> 
> Do you have the Origin overlay (or in-game overlay, or whatever it is called) turned on? Also, do you keep your browser open while playing? I assume you already tried reinstalling the DX redist packages included with the game.


The overlay is off. Yes, my browser is active, playing some stream. Well, I didn't find a DX package in the game dir... where can I find it?


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> The overlay is off. Yes, my browser is active, playing some stream. Well, I didn't find a DX package in the game dir... where can I find it?


I would guess in the game folder inside the Origin folder, unless you installed elsewhere. I always close my browser once the game has loaded; I've found it increases performance a bit, and I have never had a crash in BF4 or BF3 since. You can also try the repair game option in Origin, as that should install anything that may be missing; AFAIK that includes the required redists. I doubt that is the problem here, but it is always the first thing I do if I start having problems with a game.


----------



## miklkit

Hi all!

My Fury is almost a month old now and doing well, but I have a question.

I tested it in Passmark and it did poorly in DX9, great in DX10, and maybe OK in DX11. It really does do poorly in DX9, as my 290X actually did better there. I have no DX10 games, so no opinion there, and in DX11 it seems to be a little better than the 290X while running much cooler.

My question is: can the DX11 performance be improved? I don't care about DX9 and DX10, but DX11 and later DX12 are important.

The BIOS is 015.049.000.016, dated 7-20-2016.


----------



## xkm1948

Quote:


> Originally Posted by *miklkit*
> 
> Hi all!
> 
> My Fury is almost a month old now and doing well, but I have a question.
> 
> I tested it in Passmark and it did poorly in DX9, great in DX10, and maybe OK in DX11. It really does do poorly in DX9, as my 290X actually did better there. I have no DX10 games, so no opinion there, and in DX11 it seems to be a little better than the 290X while running much cooler.
> 
> My question is: can the DX11 performance be improved? I don't care about DX9 and DX10, but DX11 and later DX12 are important.
> 
> The BIOS is 015.049.000.016, dated 7-20-2016.


Fury is designed for DX12...


----------



## Arizonian

Yeah, in mainstream games like BF1 which offer both, it's already a moot point. I'm getting 44-57 FPS at default Ultra @ 3840x2160 with DX12 on my Nitro Fury at 1100 MHz core; it's a few FPS less in DX11. I'm a happy camper inside my monitor's adaptive sync range of 40-60. Despite what some of the naysayers said, DX12 is coming faster than expected. It may just take a while to usher out DX11, due to Nvidia's market share, while they try to catch up.


----------



## miklkit

Oh.

Since this card has been out a while, I thought someone else had noticed the big difference in performance between the different versions of DX and had done something about it. Guess not........


----------



## Minotaurtoo

2D sucks too, lol... I only noticed it, though, when I ran Passmark.


----------



## miklkit

To me, 2D is browsing, like right now, and it is fine.

There is nothing wrong with its performance outside of DX9; it runs higher settings and higher fps than the 290X in DX11. It's just the drop relative to DX10 that has me wondering. Maybe it will get better when I get a better monitor, as it's at 1080p now.


----------



## huzzug

Quote:


> Originally Posted by *miklkit*
> 
> To me, 2D is browsing, like right now, and it is fine.
> 
> There is nothing wrong with its performance outside of DX9; it runs higher settings and higher fps than the 290X in DX11. It's just the drop relative to DX10 that has me wondering. Maybe it will get better when I get a better monitor, as it's at 1080p now.


With DX9, any modern card can rip through at hundreds of fps, and the same goes for DX10; a few fps (even tens of them) won't affect anyone's gameplay. A good example I see of this is Skyrim: most of the time the fault lies with the game's inability to use more threads, and Gamebryo's issues with running above 60 fps make the game's AI go stupid. With DX11 you need to drive AMD cards at higher resolution; a Fury at 1080p really shines if you use VSR at 1440p. DX12 is where these cards show their strength, given how the API is designed to work.


----------



## miklkit

Since I only have one DX9 game left, it is really a moot point, but the difference is kind of big. My system is globally capped at 150 fps, and that is where that one game runs with the 290X. The Fury can't do that, but it still keeps it over 110 fps. In fact, it failed the DX9 test at first and I had to tweak some settings to get it to pass.

You shouldn't use Skyrim as an example of anything anymore, as it is so last-century it is no longer relevant. Why, it is still loaded with x87 code from the 1980s!

So overall it seems I need to start working on my wife for an early Christmas present in the form of a new monitor. How about a 144 Hz 27" 1440p number?


----------



## Minotaurtoo

Quote:


> Originally Posted by *miklkit*
> 
> Since I only have one DX9 game left, it is really a moot point, but the difference is kind of big. My system is globally capped at 150 fps, and that is where that one game runs with the 290X. The Fury can't do that, but it still keeps it over 110 fps. In fact, it failed the DX9 test at first and I had to tweak some settings to get it to pass.
> 
> You shouldn't use Skyrim as an example of anything anymore, as it is so last-century it is no longer relevant. Why, it is still loaded with x87 code from the 1980s!
> 
> So overall it seems I need to start working on my wife for an early Christmas present in the form of a new monitor. How about a 144 Hz 27" 1440p number?


I could tell you how I talked my wife into "permission" to get the 49" 4K 120 Hz monitor I got... :) I gave her the whole "I don't want anything for Christmas" bit, followed by a list of all the usual suspects, saying I didn't need any more of them... mumbled something here and there about my small monitor driving me nuts... and then she had the brilliant idea: get a bigger monitor. I said it'd have to be higher res or it wouldn't work... voila, she told me to get the biggest, highest-res monitor I could afford... so I did. :) Downside: I had to get her a present of equal value... but I knew that, so I cut my budget in half before I bought, or I'd have had a 60" UHD.


----------



## miklkit

LOL! My wife knows I want a new monitor, but the problem is she is so cheap she would make a Scotsman turn green with envy. I already told her that this Fury is only half of the package and a new monitor is next, but she is looking at $100 stuff instead of what I need.


----------



## Minotaurtoo

Quote:


> Originally Posted by *miklkit*
> 
> LOL! My wife knows I want a new monitor but the problem is she is so cheap she would make a Scotsman turn green with envy. I already told her that this Fury is only half of the package and a new monitor is next, but she is looking at $100 stuff instead of what I need.


I feel for you. Good luck... A friend once said... it's easier to ask for forgiveness than permission....


----------



## jdip

Slight upgrade for me in the GPU department...











The install was quite the ordeal. The computer refused to boot with the new card in. I was stumped and thought it was DOA, but I ended up updating my motherboard BIOS, and thank god that worked.

The card is so quiet! My 6950 used to be the loudest thing in my case. Now it's my PSU. Performance wise, it's night and day. Feels good to play things on high/ultra.

I'm trying undervolting and it seems fine at -72 mV so far. I still hit 75-80C under load. Any suggestions on a custom fan curve to keep it cooler? And do any of you overclock your cards while undervolting at the same time?


----------



## gupsterg

My opinion on why the temps are still 75-80C is down to how the "fuzzy logic" cooling profile works. If you dump your ROM and open it in the Fiji BIOS editor, on the cooling tab you'll see a target GPU temp; the "fuzzy logic" adjusts the fan profile on the fly to maintain that temp. As you've undervolted, I reckon that if you lowered that value you wouldn't see a huge rise in fan noise to maintain the lower target GPU temp.

The other option is a custom fan profile using MSI AB, etc. I prefer the ROM mod, TBH.

Yeah, some owners can undervolt and OC a bit, but results will depend on the particular card; YMMV.
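The "fuzzy logic" behaviour described above (ramp the fan just enough to hold a target GPU temperature) is essentially a feedback controller. A minimal proportional-control sketch in Python, purely illustrative; the function name and constants are made up and are not AMD's actual firmware logic:

```python
def fan_duty(temp_c: float, target_c: float = 75.0,
             min_duty: float = 20.0, max_duty: float = 100.0,
             gain: float = 8.0) -> float:
    """Proportional controller: raise fan duty by `gain` percent per degree
    the GPU sits above the target temperature, clamped to [min_duty, max_duty].

    Illustrative only; the real Fiji cooling logic and its constants are not public.
    """
    duty = min_duty + gain * (temp_c - target_c)
    return max(min_duty, min(max_duty, duty))

# Below target the fan idles at the floor; above target it ramps toward 100%.
assert fan_duty(60.0) == 20.0    # well under the 75C target
assert fan_duty(77.5) == 40.0    # 2.5C over target: 20 + 8 * 2.5
assert fan_duty(90.0) == 100.0   # clamped at the ceiling
```

Lowering `target_c` shifts the whole curve upward, which matches the point above: a ROM-modded lower target only needs a modest fan-speed bump on an undervolted card.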


----------



## kubac4

Hi, so I got my XFX R9 Fury today and I was so excited to try it out, but I'm having some problems with it. First of all, I could barely fit this monster in; I mean, I have enough space, but the plastic clip thingy on the PCIe slot wouldn't click, so I just pushed as hard as I could, then screwed it in and turned my PC on. Everything went smoothly; it worked, I uninstalled the old drivers and installed the new ones. I noticed straight away a little coil whine when moving the mouse, but I could barely hear it. Then I wanted to run something to see how it performs, but the only game on my PC at the moment is WoW. Anyway, I started the game and then heard the worst sound coming out of my PC, buzzing like crazy. I minimized the game and the buzzing was gone; I went back into the game and it was back, with some artifacts as well. Should I RMA straight away, or could it be something I can fix? Like the card not being properly connected? Or should I try a new BIOS? Different drivers? The rest of my system: Z170M ASUS ITX mobo, i5 6600K, 16GB Corsair 3000, and a Silverstone 500W Gold.


----------



## bluezone

Crimson 16.11.1

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.1-Release-Notes.aspx

Win 10 64 bit.

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#

Win 7 64 bit.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

Wow I had not yet even installed 16.10.3.


----------



## jdip

Quote:


> Originally Posted by *gupsterg*
> 
> My opinion on why the temps are still 75-80C is down to how the "fuzzy logic" cooling profile works.


Yeah, that makes sense. In the Tom's Hardware article about undervolting they also say that the card will run hotter, because it's putting out less heat overall and the fans are less aggressive because of it.

Quote:


> Originally Posted by *kubac4*
> 
> Hi, so I got my XFX R9 Fury today and I was so excited to try it out, but I'm having some problems with it. First of all, I could barely fit this monster in; I mean, I have enough space, but the plastic clip thingy on the PCIe slot wouldn't click, so I just pushed as hard as I could, then screwed it in and turned my PC on. Everything went smoothly; it worked, I uninstalled the old drivers and installed the new ones. I noticed straight away a little coil whine when moving the mouse, but I could barely hear it. Then I wanted to run something to see how it performs, but the only game on my PC at the moment is WoW. Anyway, I started the game and then heard the worst sound coming out of my PC, buzzing like crazy. I minimized the game and the buzzing was gone; I went back into the game and it was back, with some artifacts as well. Should I RMA straight away, or could it be something I can fix? Like the card not being properly connected? Or should I try a new BIOS? Different drivers? The rest of my system: Z170M ASUS ITX mobo, i5 6600K, 16GB Corsair 3000, and a Silverstone 500W Gold.


Is the buzzing coil whine?

For the artifacting, you could try reinstalling the drivers (use something like Driver Sweeper to wipe them first). You could also try reflashing the BIOS, I suppose.

If you have access to another PC or a friend's PC, maybe you could try the card in there first? That seems easier than reinstalling drivers and reflashing the BIOS.


----------



## kubac4

I have the newest drivers. I took the mobo out of the case to make sure the card is connected correctly, and this buzzing is still there; it's so loud it's ridiculous, like a cricket. Temps are fine: 33C idle, 43C load. I think I'll just send it back tomorrow, but should I mention why? Or just say I don't need it anymore?


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Crimson 16.11.1
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.1-Release-Notes.aspx
> 
> Win 10 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#
> 
> Win 7 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> Wow I had not yet even installed 16.10.3.


I just installed the driver before this one yesterday... hard to keep up, which is great. I love options, and I also love that some of these hotfix drivers get WHQL after the fact (helps with clients' systems, generally prebuilts with hellish firmware). The driver I installed was the BF1 driver, and I was hoping it would solve my random serious framerate dives in BF4 (144 down to like 5-10 fps), which so far it has. This new one has nothing that affects me at the moment, so I'm going to stick with it until the BF1 known issue listed is fixed.


----------



## bluezone

On a side note, GTribe (AMD Gaming Evolved) is having a giveaway: a free PC and gear. The combined value of the prizes is $8,610.

https://www.gamingtribe.com/giveaway/uprising/gt_8010859197471292

AND VRMark is out.

http://www.futuremark.com/benchmarks/vrmark

My score.

http://www.3dmark.com/vrm/15828314


----------



## jdip

I'm having some issues with my new Fury as well (coming from a 6950). I'm using Windows 7 Ultimate.

1. I have Windows Power Management set to sleep my display after 5 min of inactivity, but since installing my Fury it doesn't work anymore. I tried wiping the drivers and AMD software and reinstalling again, but that didn't help. I also checked whether the display would sleep with the generic Windows GPU driver (after wiping the AMD one), and it didn't either, so I don't think it's the AMD drivers. FWIW, the video card is the only thing I changed since it stopped working. I tried turning on the screen saver and sleep-after-inactivity just to test (I don't usually use them), but they don't work either!

However, the display sleep timer DOES work if I reboot and stay at the log-on screen (before any applications have loaded). But once I log in and applications load, it stops working again. I tried force-quitting all AMD programs, but it didn't help :/ *Solved*

2. Also, the colors on my display are not quite right; the picture is too warm and has a slight tint to it. I tried changing the color temperature in Crimson, and I can get close, but it's still not perfectly neutral like I would like (and like it was before). I also tried loading the color profile (.icm) I was using with my monitor before, but it changed nothing.

I'm loving the performance the card is giving me in games, but these two things are really spoiling it for me. Any ideas?


----------



## bluezone

I thought the sleep bug had been fixed; try reporting it to AMD. I do not use sleep myself.

Try adjusting these settings. These are the settings for my display, though.


Spoiler: Warning: Spoiler!


----------



## pengs

Quote:


> Originally Posted by *jdip*
> 
> Slight upgrade for me in the GPU department...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The install was quite the ordeal. Computer refused to boot with the new card in. I was stumped and thought it was DoA, but I ended up updating my BIOS and thank god that worked.
> 
> The card is so quiet! My 6950 used to be the loudest thing in my case. Now it's my PSU. Performance wise, it's night and day. Feels good to play things on high/ultra.
> 
> I'm trying undervolting and it seems fine at -72 mV so far. I still get to 75 -80 C at load. Any suggestions on a custom fan curve to keep it cooler? And do any of you overclock your cards at the same time as undervolting?


Yeah, it targets 75C regardless of voltage, though you should see a reduction in fan noise. Not that it's really needed.

Make sure the 260W BIOS is switched on; the 300W one targets 80C. Mine moves between 70C and 76C max.


----------



## jdip

Quote:


> Originally Posted by *bluezone*
> 
> I thought the sleep bug had been fixed. try to report it to AMD. I do not use sleep myself.
> 
> try adjusting these settings. These are settings for my display though.


Thanks for taking the time to reply man









I was finally able to pinpoint what was stopping my display from sleeping: it was actually the Battle.net launcher. Weird, I know. But I recall that the first time I got into Windows after installing the Fury, the B.net launcher acted as if it had just been installed, as all the settings had returned to default. I found that odd but just went with it. That must have messed something up. I reinstalled it and all is well on that front. Feels good to finally get to the bottom of it; I'd been at it for a couple of days and it was driving me insane.

As for the colors, I tried tinkering with the Radeon Additional Settings and still couldn't get them quite right. With my old video card I used a calibrated color profile (.icm file) to get things right. However, when I try to apply it now, it doesn't change the color profile at all; it just stays the same. Do you know anything about that, by any chance?


----------



## jdip

Quote:


> Originally Posted by *pengs*
> 
> Yeah, it targets 75C regardless of voltage, though you should see a reduction in fan noise. Not that it's really needed.
> 
> Make sure the 260W BIOS is switched on; the 300W one targets 80C. Mine moves between 70C and 76C max.


Yes, that makes sense about the target temp staying the same. I made a custom fan profile to keep it a bit cooler. It's still very quiet.

As for the BIOS switch, do you know which is which? When the switch is lower (clicked in), is that the 260W one?


----------



## jearly410

Quote:


> Originally Posted by *jdip*
> 
> As for the BIOS switch, do you know which is which? When the switch is lower (clicked in) is that the 260W one?


The one closest to the PCB, or "clicked in," is the 300W one. My Sapphire logo also lights up blue when it is pushed in.

Another way to check is to open Radeon OverDrive: if the temp target is 80 or 85 (can't remember which), it is the 300W profile; if the temp target is 75 degrees, it is the 260W profile.


----------



## jdip

Quote:


> Originally Posted by *jearly410*
> 
> The one closest to the PCB, or "clicked in," is the 300W one. My Sapphire logo also lights up blue when it is pushed in.
> 
> Another way to check is to open Radeon OverDrive: if the temp target is 80 or 85 (can't remember which), it is the 300W profile; if the temp target is 75 degrees, it is the 260W profile.


Oops! I thought pushed-in was the default lower-power BIOS, because that's how my card came. Or I probably hit it by accident when installing.









Thanks, +rep


----------



## kubac4

So my XFX R9 Fury is on its way back to where I got it from, and now I'm wondering if I should get another XFX or try the Sapphire Nitro. Is coil whine a common issue with XFX, or with all the Fury cards? Or was I just unlucky? Both the Sapphire and the XFX are the same price and a good deal. I want something that's gonna run cooler in my ITX toaster of a case ^^ What do you guys think?


----------



## miklkit

Quote:


> Originally Posted by *bluezone*
> 
> Crimson 16.11.1
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.1-Release-Notes.aspx
> 
> Win 10 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#
> 
> Win 7 64 bit.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> Wow I had not yet even installed 16.10.3.


I cleaned out all the AMD stuff and installed that driver (Win 10 64-bit), then played some TW3. Uh oh, 36 fps. It seems the driver wiped out the Afterburner profiles and it was running at 500 MHz. Resetting it to 1050 MHz got the frame rates back, but the temps were 77-78C.

I reset the fan profile in Afterburner, moving the point where it hits 100% from 70C down to 65C, and temps dropped 20C to 57-58C.

I never touched the button on my Nitro and the blue light is lit. I do not have any coil whine; all I hear is the fans.


----------



## LeadbyFaith21

Quote:


> Originally Posted by *kubac4*
> 
> So my XFX R9 Fury is on its way back to where I got it from, and now I'm wondering if I should get another XFX or try the Sapphire Nitro. Is coil whine a common issue with XFX, or with all the Fury cards? Or was I just unlucky? Both the Sapphire and the XFX are the same price and a good deal. I want something that's gonna run cooler in my ITX toaster of a case ^^ What do you guys think?


I've got the Sapphire Tri-X and one of my friends has the Nitro. Both are great coolers, but they do need ample air feeding the fans to keep them cool... I know you said an ITX case, but how much airflow do you have in it?

Also, the Nitro is ever-so-slightly thicker than two slots and a little taller than most cards, just so you know!


----------



## kubac4

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> I've got the Sapphire Tri-X and one of my friends has the Nitro, both are great coolers, but they do need ample air to feed the fans to keep them cool...I know you said an ITX case, but how much airflow do you have in it?
> 
> Also, the Nitro is ever-so-slightly bigger than 2 slot and a little taller than most cards, just so you know!


Like I said, I already had the XFX Fury inside my case and the cooling was okay; I want to know if the Nitro is the same. I think the Nitro is slightly smaller than the XFX, so it should fit.







I'm just worried about the coil whine. That XFX card made so much noise it was 2x louder than its fans at 100% speed xD


----------



## Alastair

Quote:


> Originally Posted by *kubac4*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LeadbyFaith21*
> 
> I've got the Sapphire Tri-X and one of my friends has the Nitro, both are great coolers, but they do need ample air to feed the fans to keep them cool...I know you said an ITX case, but how much airflow do you have in it?
> 
> Also, the Nitro is ever-so-slightly bigger than 2 slot and a little taller than most cards, just so you know!
> 
> 
> 
> Like I said, I already had the XFX Fury inside my case and the cooling was okay; I want to know if the Nitro is the same. I think the Nitro is slightly smaller than the XFX, so it should fit.
> 
> 
> 
> 
> 
> 
> 
> I'm just worried about the coil whine. That XFX card made so much noise it was 2x louder than its fans at 100% speed xD

The Nitro has a longer PCB than the reference PCB used by the XFX. As for total length, I can't tell you, but just from eyeballing it, the Nitro looks the same overall size as the Tri-X.


----------



## xkm1948

I am getting better Time Spy performance with the newer Crimson drivers. I used to get ~5300 at 1100 MHz core; now, with a similar OC, I am looking at ~5500.


----------



## bluezone

Another new driver?
Crimson 16.11.2

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.2-Release-Notes.aspx

Win 10 64 bit

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64 bit

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Flamingo

Is anyone posting VRMark scores? (This is with the 16.6 Windows driver, though.)

Here are my scores on stock CPU freq (6700k)

6953 @ stock settings, fan allowed up to 100%
http://www.3dmark.com/vrpor/21106

7065 @ PL +50% (to keep the core clock from throttling), fan allowed up to 100%
http://www.3dmark.com/vrpor/21115

7387 @ +50% PL, 1050MHz, fan allowed up to 100%
http://www.3dmark.com/vrm/15858843

Edit: CPU overclocked to 4.5GHz

7439 @ 1050MHz
http://www.3dmark.com/vrpor/21177

7654 @ 1080MHz
http://www.3dmark.com/vrpor/21245
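For the curious, the scaling between those last two runs is almost exactly linear with core clock. A quick sanity check (Python, figures taken straight from the scores above, assuming nothing else changed between runs):

```python
# Scaling check for the two overclocked VRMark runs above:
# the score should rise roughly in step with core clock if the test is GPU-bound.
base_clock, base_score = 1050, 7439
oc_clock, oc_score = 1080, 7654

clock_gain = (oc_clock - base_clock) / base_clock * 100   # ~2.86 %
score_gain = (oc_score - base_score) / base_score * 100   # ~2.89 %

print(f"clock +{clock_gain:.2f}%, score +{score_gain:.2f}%")
```

Near 1:1 scaling like that suggests the bench is core-clock bound at these settings.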


----------



## pdasterly

Which newer games work with CrossFire?
I have Star Wars Battlefront already.


----------



## Fguarezi

My Sapphire Tri-X OC unlocked to 4032 SP @ 1097/500.

GPU score: 5279.
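For reference, GCN parts carry 64 stream processors per compute unit, so a 4032 SP unlock works out to 63 of Fiji's 64 CUs enabled. A quick Python check of the math (CU counts are the usual published Fury/Fury X specs):

```python
# GCN has 64 stream processors (SPs) per compute unit (CU),
# so Fiji's configurations work out as follows.
SP_PER_CU = 64

fury_x = 64 * SP_PER_CU    # full Fiji: 64 CUs -> 4096 SPs
fury = 56 * SP_PER_CU      # stock Fury: 56 CUs -> 3584 SPs
unlocked = 63 * SP_PER_CU  # a 4032 SP unlock = 63 of 64 CUs enabled

print(fury_x, fury, unlocked)
```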



Quote:


> Originally Posted by *xkm1948*
> 
> I am getting better Time Spy performance with newer Crimson drivers. I used to get ~5300 for 1100MHz core. Now similar OC I am looking at ~5500.


----------



## neurotix

Quote:


> Originally Posted by *pdasterly*
> 
> which newer games work with crossfire?
> I have star wars battlefront already


Rise of the Tomb Raider supports it very well, in my experience.

The first Tomb Raider (2013) does as well.

Not sure if that's new enough for you, though.


----------



## pdasterly

Quote:


> Originally Posted by *neurotix*
> 
> Rise of the Tomb Raider supports it very well, in my experience.
> 
> The first Tomb Raider (2013) does as well.
> 
> Not sure if that's new enough for you, though.


2016 please


----------



## Alastair

Guys, 1440p FreeSync monitors. Suggestions?

Not sure if I should go normal or ultrawide? Help me out.


----------



## Cyants

I currently run a 1080p ultrawide with FreeSync (34UM67) and I love it.

While some games don't like to play nice with a non-16:9 ratio, more and more devs take the five minutes needed to add support for it. For productivity there is no way I'd want less than ultrawide now.

Looking for this when I switch from my Fury X to Vega next year:
http://www.lg.com/us/monitors/lg-38UC99-W-ultrawide-monitor


----------



## Alastair

Quote:


> Originally Posted by *Cyants*
> 
> I currently run a 1080P Ultra wide with freesync (34UM67) and I love it.
> 
> While some games and don't like to play nice with a different than 16:9 ratio,more and more devs take the 5 min needed to add support for it. For productivity there is no way i'd want less than ultra wide now.
> 
> Looking for this when I switch from my Fury X to Vega next year:
> http://www.lg.com/us/monitors/lg-38UC99-W-ultrawide-monitor


Well, after doing some research, it seems ultrawides above the 1080p level are around R15K in my country, and my budget is around R10K. I found a 4K Sammy FreeSync screen, but it's TN. I can also find the Acer XG27HU, but again that's TN. It seems 1440p IPS FreeSync screens are harder to find than I would have thought.


----------



## Tgrove

Quote:


> Originally Posted by *Alastair*
> 
> Guys 1440P freesync monitors. Suggestions?
> 
> Not sure if I should go normal or ultra wide? Help me.


1. What size are you looking for?
2. How much are you willing to spend?
3. You can still create ultrawide resolutions with a 4K monitor; what refresh rate are you looking for?
4. Are you open to Korean monitor brands? (This is the main question.)

Don't forget Korean monitors ushered us into the 1440p era and for years were the best monitors you could get. You can also get a SquareTrade warranty for these monitors. I've been using a 49" IPS 4K FreeSync monitor with zero BLB or glow for about a year now. Pure bliss.


----------



## Alastair

Quote:


> Originally Posted by *Tgrove*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys 1440P freesync monitors. Suggestions?
> 
> Not sure if I should go normal or ultra wide? Help me.
> 
> 
> 
> 1. what size are you looking for
> 2. How much are you willing to spend
> 3. You can still create ultra wide resolutions with a 4k monitor, what refresh rate are you looking for?
> 4. are you open to korean monitor brands? (This is the main question)
> 
> Dont forget korean monitors ushered us into the 1440p era and for years were the best monitors get. you can also get a squaretrade warranty for these monitors. Been using a 49" ips 4k freesync 0 blb or glow monitor for about a year now pure bliss
Click to expand...

1. I reckon in the 27" range, so 27-30.
2. R10,000 ZAR.
3. Above 60Hz, preferably 144.
4. Yes, but can I get them in South Africa?


----------



## pdasterly

Quote:


> Originally Posted by *Alastair*
> 
> Guys 1440P freesync monitors. Suggestions?
> 
> Not sure if I should go normal or ultra wide? Help me.


acer x34ck1


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Guys 1440P freesync monitors. Suggestions?
> 
> Not sure if I should go normal or ultra wide? Help me.


If you don't mind the little extra work to get some games to play right at 21:9, then def go with the 3440x1440 panels. Especially with xfire Furys you will def be able to drive that.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys 1440P freesync monitors. Suggestions?
> 
> Not sure if I should go normal or ultra wide? Help me.
> 
> 
> 
> If you don't mind the little extra work to get some games to play right at 21:9 then def go with the 3440 x 1440 panels. Especially with xfire Furys you will def be able to drive that.
Click to expand...

From what it seems, I can't afford it.







So I think I will have to stick with standard 1080p ultrawides, 1440p IPS (maybe 144Hz), or 4K TN 60Hz.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> From what it seems I cant afford it.
> 
> 
> 
> 
> 
> 
> 
> So I think I will have to stick with standard 1080P ultra wides, 1440P IPS and maybe 144Hz or 4K TN 60Hz.


I was in the same boat. I am totally happy with the BenQ I have, but the typical AUO panel issues exist with anything you choose, so it comes down to a bit of luck. 600 bucks was my upper limit for a panel, so I had to scrap the widescreens from consideration right off the bat.

The TNs benefit greatly from calibration but will never have great colors compared to IPS. I found the IPS screens to have way too much glow though, so darks look like crap and the color bonus is lost. TN til OLED or QD for me.


----------



## miklkit

I too am looking into a 144Hz monitor and have about settled on a 27" model. I personally do not like the ultrawides because of the fisheye distortion I see in most screenshots.

Your price range is about $400 USD above mine, and around here there are some really nice 144Hz monitors in both 16:9 and 21:9 at that price.


----------



## Alastair

Well, if I can find one used, I would love to settle for an ultrawide 1440p FreeSync like the Acer XR341CK. If not, I'm thinking the Acer XB270HU, which is 1440p 144Hz TN, or the Samsung U28E590D, which is also TN, 4K 60Hz.


----------



## Johan45

I have a question for you guys who have had these for a while. Memory OC: is that driver-dependent? I seem to lose the ability to overclock the mem when I update drivers to anything current.

EDIT: I have the Fury Strix, using GPU Tweak.


----------



## bluezone

Quote:


> Originally Posted by *Johan45*
> 
> I have a question for you guys who have had these for a while.Memory OC , is that driver dependant/ i seem to lose the ability to overclock the mem when I update drivers to anything current.
> 
> EDIT: I have the Fury strix using GPUTweak


In this order: do a complete uninstall of all GPU utilities (clear overclocks and such first) and a clean uninstall of drivers. Power down and disconnect or turn off the power supply. Wait 30 seconds, then power the PSU back on and turn on the PC. Then install drivers, followed by GPU utilities. This should clear the problem.

Try to make sure in future that utility-based overclocks and modifiers are disabled when installing drivers. This is sometimes the problem.

Good luck.


----------



## Johan45

I'll give it a shot. The only thing in that list I wasn't doing was killing the power.
This card has reminded me why I prefer Nvidia


----------



## Skyl3r

Quote:


> Originally Posted by *Johan45*
> 
> I'll give it a shot. The only thing in that list I wasn't doing was killing the power.
> This card has reminded me why I prefer Nvidia


I've been using Sapphire TriXX and have not had problems. The only time I did have a problem was when I tried installing TriXX with Afterburner already installed. If you have another overclocking utility, remove it.

Also, no need to try to flame the AMD card owners.


----------



## Johan45

I wasn't trying to flame anyone; this thing has been a PITA, especially with Win10. I'm a bencher pure and simple and can't stand weird little driver/OS issues, which I'm assuming this is. Drives me batty. I wanna set things up, dial it in, and bench, not piss around for two nights with software issues.


----------



## bluezone

No worries. I do not feel flamed. Green and Red both have their issues.

This particular issue only affects a relative few people on rare occasions. You and I just happen to be among those.


----------



## Johan45

It makes it really hard to test drivers for performance that's for sure.


----------



## Alastair

I have never in all my years of team red experienced this issue.


----------



## Skyl3r

Quote:


> Originally Posted by *Alastair*
> 
> I have never in all my years of team red experienced this issue.


I was very firmly against AMD for a while and very hesitant to get my 390X. What I've found, though, is every time I thought I found a crazy driver issue, it turned out to be bad RAM or operator error.

When setting up my Fury X and adding my second, I just popped them in and I was off to the races. No problems at any point.


----------



## Sleazybigfoot

Quote:


> Originally Posted by *Johan45*
> 
> I wasn't trying to flame any one, this thing has been a pita especially with Win10. I'm a bencher pure and simple and can't stand weird little driver/os issues which I'm assuming this is. Drives me batty. I wanna set things up dial it in and bench not piss around for two nights with software issues.


I'm not trying to start a flame war or anything, but this isn't just the red team. I remember reading quite a few news items about Nvidia's drivers causing crashes (for a lot of people, mind you) in Windows 10.

Anyway, hope you get it solved. Love my Fury, runs like a charm.


----------



## Thoth420

I have only had one device_removed crash and one random Hitman: Absolution software crash on my Fury X, both on beta drivers. I have had to uninstall overclocking software prior to driver swaps for both AMD and Nvidia. I believe it was OS system settings or some other user error on my part.


----------



## neurotix

Johan, is Windows 10 even allowed on hwbot yet? I still run Win7 for this reason. No problems with my dual Furys in Win7.


----------



## Johan45

OK, I have to say it wasn't AMD's fault this time. I think it was the latest version of GPU Tweak. When I use the one off the disc it works fine, except I still don't have fan control, so I was using TriXX for that.



Quote:


> Originally Posted by *neurotix*
> 
> Johan is Windows 10 even allowed on hwbot yet? I still run Win7 for this reason. No problems with my dual Fury's in Win7.


Yes, Win 10 is allowed for all 3DMark benches, but you need to include a link to the results page. There's a chart for exceptions in the rules.


----------



## jdip

Quote:


> Originally Posted by *Johan45*


What was your score before OCing the memory? Does it make a big difference? I've read that OCing the HBM Memory on the Fury doesn't really do much (haven't tried it myself, just what I see being said everywhere).


----------



## u3a6

Quote:


> Originally Posted by *Johan45*
> 
> OK I have to say it wasn't AMD's fault this time. I think it was the latest version of GPUtweak. When I use the one off the disc it works fine except I still don't have fan control so I was using Trixx for that.
> 
> 
> Yes Win 10 is allowed for all 3DMark benches but you need to include a link to the results page. There's a chart for exceptions in the rules.


That is a nice OC! Mine won't do more than 1070MHz on the core at 65°C (I have the full unlock). I have to push the fans up to 100% to do more. My HBM runs fine at the first step above 500MHz (545.5MHz). It will run at 600MHz, but I get a small rainbow-like artifact at that speed.

EDIT: What is your ASIC Quality?


----------



## Johan45

Quote:


> Originally Posted by *jdip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What was your score before OCing the memory? Does it make a big difference? I've read that OCing the HBM Memory on the Fury doesn't really do much (haven't tried it myself, just what I see being said everywhere).
Click to expand...

Total score was 8509; graphics portion 9067 before, 9218 after.
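To put the HBM overclock's contribution in perspective, a quick calculation from those graphics scores:

```python
# Graphics-score uplift from the HBM overclock, using the numbers above.
before, after = 9067, 9218
gain = (after - before) / before * 100
print(f"+{gain:.1f}% graphics score")  # roughly +1.7%
```

A measurable bump, but it backs up the common view that Fiji's HBM OC headroom is worth far less than core clock.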

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> OK I have to say it wasn't AMD's fault this time. I think it was the latest version of GPUtweak. When I use the one off the disc it works fine except I still don't have fan control so I was using Trixx for that.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes Win 10 is allowed for all 3DMark benches but you need to include a link to the results page. There's a chart for exceptions in the rules.
> 
> 
> 
> That is a nice oc! Mine won't do more than 1070MHz (I have full unlock) on the core at 65ºC. I have to push the fans up to 100% to do more. My HBM runs fine at the first step above 500MHz (545.5 MHz). It will run at 600MHz, but I have a small rainbow like artifact at that speed.
> 
> EDIT: What is your ASIC Quality?
Click to expand...

No idea of my ASIC, never checked; I've never concentrated on that too much TBH.
I was using 1.33V for that clock; there might still be more in the tank, but I think it needs water. I can unlock this one but get nasty weird artifacts benching. Looks like flashlights running toward you, a whole screen full of them.


----------



## u3a6

Quote:


> Originally Posted by *Johan45*
> 
> total score was 8509, graphics portion 9067 before 9218 after.
> No idea of my ASIC never checked, I've never concentrated on that too much TBH.
> I was using 1.33v for that clock, might still be more in the tank but I think it needs water. I can unlock this one but get nasty weird artifacts benching. Looks like flashlights running toward you, a whole screen full of them.


Mine is set at the auto-calculated DPM7 voltage of 1.25V. But I noticed that my card scales really well with temperature; if yours is similar it might scale like mine.

http://hwbot.org/submission/3341525_

That was with +20mV or something on the core with fans cranked up to 100%! (core temp below 35°C)


----------



## Johan45

I haven't tried GB4 yet. Is the GPU test similar to GPUPI, where you can really crank up the core?


----------



## ht_addict

Quote:


> Originally Posted by *u3a6*
> 
> That is a nice oc! Mine won't do more than 1070MHz (I have full unlock) on the core at 65ºC. I have to push the fans up to 100% to do more. My HBM runs fine at the first step above 500MHz (545.5 MHz). It will run at 600MHz, but I have a small rainbow like artifact at that speed.
> 
> EDIT: What is your ASIC Quality?


You really have got to put the Fury X under a proper watercooling loop. I use an EKWB Predator 360 with GPU blocks. With the original watercooling block I hit high 40s to 50s °C gaming at 1150/550. With the EKWB setup I hit 37-39°C, idling under 30°C. On top of that, it's much quieter.


----------



## Thoth420

Quote:


> Originally Posted by *ht_addict*
> 
> Really got to put the FuryX under proper watercooled loop. I use EKWB Predator 360 with GPU blocks. With original watercooling block I hit high 40's to 50's gaming at 1150/550. With the EKWB setup I hit 37-39, idling under 30oC. On top of that its much quieter.


Damn those predators are impressive.


----------



## Johan45

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> OK I have to say it wasn't AMD's fault this time. I think it was the latest version of GPUtweak. When I use the one off the disc it works fine except I still don't have fan control so I was using Trixx for that.
> 
> 
> Yes Win 10 is allowed for all 3DMark benches but you need to include a link to the results page. There's a chart for exceptions in the rules.
> 
> 
> 
> That is a nice oc! Mine won't do more than 1070MHz (I have full unlock) on the core at 65ºC. I have to push the fans up to 100% to do more. My HBM runs fine at the first step above 500MHz (545.5 MHz). It will run at 600MHz, but I have a small rainbow like artifact at that speed.
> 
> EDIT: What is your ASIC Quality?
Click to expand...

ASIC is 63.5%; got it up to 1140 tonight.


----------



## u3a6

Quote:


> Originally Posted by *Johan45*
> 
> ASIC is 63.5% got it up to 1140 tonight


Mine's 60%. I will try to push it through Fire Strike over the weekend.

*EDIT:* *Have any of you guys tried to apply thermal paste on the R9 Fury Strix? I know that in stock form it has some kind of metal thermal pad; dunno if it is worth it to remove it and apply some NT-H1. The direct-touch heatpipes may pose some problems... Also, I got a NOS Thermalright Shaman for cheap; would it outperform the stock cooler?*


----------



## Johan45

Pretty sure the Strix has paste for the die, but I've taken many cards apart and better paste usually helps a bit. http://www.vortez.net/articles_pages/asus_strix_r9_fury_dc3_oc_review,5.html


----------



## kfxsti

Finally got my hands on another Sapphire Tri-X OC. BUT... it's not running at the same clocks as my first one. Even with the DIP switch set towards the I/O cover it's not at 1040 on the core; it's running at 1000. Do you guys think I should send it back, or just copy the BIOS from the first and flash the second to match it?


----------



## Alastair

Quote:


> Originally Posted by *kfxsti*
> 
> Finally got my hands on another Sapphire Tri-X OC. BUT.. it's not running at the same clocks as my first one . Even with the dip switch set towards the i/o cover it's not at 1040 on the core. Its running at 1000. Do you guys think I should send it back. Or just copy the bios from the first and flash the second to match it?


Really? RMA your card over 40MHz? Why not just open TriXX and add +40 to the core speed, or flash it with an OC BIOS?


----------



## Johan45

IMO if it's a brand new card from a reputable shop I would take it back. It's new and not what you bought. I would have to wonder if it was a "recertified" card or similar. The Tri-X is 1040 and the Nitro is 1050 from the factory, and if your card isn't meeting that, I would not flash it or do anything that could void the warranty. Hold the retailer responsible for what they sold you versus what you got. Just my two cents.


----------



## kfxsti

40MHz isn't the main issue; the card NOT running at its specified speed is. I do understand what you're saying, Alastair. It's the principle: I spent the time to find this version, paid for it, and despite the box and matching numbers it's not what I paid for.


----------



## gupsterg

To me it reads like one BIOS switch position is the Tri-X STD edition and the other the OC edition. Basically the card has two differently clocked ROMs, when it should be same-clocked ROMs with differing PowerLimit. An easy way of checking is seeing the default clock for both ROM positions using GPU-Z. I once had an Asus DCUII 290X STD (purchased new by me) where one ROM was the STD and the other the OC edition. As Asus ROMs contain a serial for the card, I was able to see the serials also differed.


----------



## Alastair

I dunno, going through the effort of an RMA seems such a lengthy and time-consuming process when the problem is very easily fixable. I don't know how the warranty would end up void, since most retailers wouldn't have a way to know the BIOS has been flashed, and I don't even know if AIBs have a way of telling if it's been flashed. Just seems like a waste of time to me. But I guess I am the kind of guy where if it's something I can fix myself, I might as well do it myself.


----------



## gupsterg

The only ROMs I've seen that store a serial matching the serial on the card's label are Asus's, no others. I have yet to come across any info about a date stamp/counter that would show a card has been flashed x times.


----------



## u3a6

Quote:


> Originally Posted by *Johan45*
> 
> Pretty sure the Strix has paste for the die,but I've taken many cards apart and better paste usually helps a bit. http://www.vortez.net/articles_pages/asus_strix_r9_fury_dc3_oc_review,5.html


W1zzard's sample had some kind of liquid metal thermal pad:

https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/5.html

I'm wondering what mine could have... will check it out over the weekend. About the Shaman vs. the stock cooler, what do you think? I have seen some reviews where the Shaman beats the Arctic Cooling Accelero Xtreme Plus III. I know that I would have to redesign the retention system; that would be no problem. The question is: would it be worth the hours?


----------



## kfxsti

Well, after thinking on both sides... I took both sides. I went ahead and flashed it, but contacted both Sapphire and the seller to let them know what I have done, to get verification that it is indeed a Tri-X OC card and that the warranty will not be voided by flashing it myself. Thanks a bunch, guys, for the insight and thoughts.


----------



## bluezone

Get it while it is hot. Crimson 16.11.3.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.3-Release-Notes.aspx

Win 10 64.

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#

Win 7 64.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Thoth420

Quote:


> Originally Posted by *bluezone*
> 
> Get it while it is hot. Crimson 16.11.3.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-Edition-16.11.3-Release-Notes.aspx
> 
> Win 10 64.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#
> 
> Win 7 64.
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


Damn, no device_removed fix for Battlefield yet (it isn't just occurring in BF1; any of the 16.11.xx drivers cause it to occur in BF4 for me). Guess I will wait and hope for the best... the driver I am on now doesn't crash the game, but sometimes I get massive fps dives out of nowhere.


----------



## LionS7

Quote:


> Originally Posted by *xkm1948*
> 
> I am getting better Time Spy performance with newer Crimson drivers. I used to get ~5300 for 1100MHz core. Now similar OC I am looking at ~5500.




Well, you should get above 5400 at 1100/1000MHz. I'm getting 5420+ graphics points.
Quote:


> Originally Posted by *Thoth420*
> 
> Damn no device_removed fix for Battlefield yet (it isn't just occuring in BF1, any of the 16.11.xx drivers cause it to occur in BF4 for me). Guess I will wait and hope for the best...the driver I am on now doesn't crash the game but sometimes I get massive fps dives out of nowhere.


Some people said that it's an unstable system, but I don't think so. It continues with Battlefront.


----------



## Johan45

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> Pretty sure the Strix has paste for the die,but I've taken many cards apart and better paste usually helps a bit. http://www.vortez.net/articles_pages/asus_strix_r9_fury_dc3_oc_review,5.html
> 
> 
> 
> W1zzard's sample had some kind of liquid metal thermal pad:
> 
> https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/5.html
> 
> I'm wondering what mine could have... will check it out over the weekend. About the shaman vs stock cooler, what do you think? I have seen some reviews where the shaman beats the arctic cooling accelero xtreme plus III. I know that I would have to redesign the retention system, that would be no problem. The question is: would it be worth the hours?
Click to expand...

Well now isn't that interesting. Might be something with retail vs review sampling?


----------



## u3a6

Quote:


> Originally Posted by *Johan45*
> 
> Well now isn't that interesting. Might be something with retail vs review sampling?


It appears that the retail samples have the pad. A user in the "Activation of cores in Hawaii, Tonga and Fiji (unlockability tester ver 1.6 and atomtool)" thread disassembled his card and found the pad; he claims that there is a gap between the core/HBM and the cooler, so he can't use thermal paste :/

In the discussion of the TPU review of the Fury Strix, W1zzard claims to have replaced the pad (which is not metal but some kind of mixture of polymer and carbon powder?) with thermal paste and the temps were only 1°C worse... I think I will not disassemble mine in the near future; I don't know if mine will allow thermal paste or not.


----------



## Johan45

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> Well now isn't that interesting. Might be something with retail vs review sampling?
> 
> 
> 
> It appears that the retail samples have the pad, a user in the "Activation of cores in Hawaii, Tonga and Fiji (unlockability tester ver 1.6 and atomtool)" disassembled his card to find the pad, he claims that there is a gap between the core/HBM and the cooler so he cant use thermal paste :/
> 
> In the discussion of the TPU review of the Fury Strix W1zzard claims to have replaced the pad (which is not metal but some kind of mixture of polymer and carbon powder?) with thermal paste and the temps were only 1ºC worse... I think will not disassemble mine in the near future, I don't know if mine will allow thermal paste or not.
Click to expand...

That would make reassembly troublesome for sure. Eventually I will be taking mine apart, but thanks to this discussion I now know I need a suitable replacement.

EDIT: Found these, but I don't know if liquid metal would be ideal in this use; there might be problems getting it clean again if you needed to remove the cooler later. http://www.newegg.com/Product/Product.aspx?Item=9SIA4YU2AJ2191


----------



## Alastair

I'm not sure if liquid metal or a metal pad is the best idea. I'd hate to create some sort of short, especially with the HBM and core so close to each other.


----------



## Johan45

I don't think the HBM would be an issue; the stacks look sealed. It's the SMDs that surround the die I would be concerned about. I guess nail polish would fix that, though.


----------



## Performer81

Yes, I think there's a thin plastic layer over the orange space. It's the pressure (cleaning etc.) that does the damage.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> Well, you should get above 5400 on 1100/1000Mhz. Im getting 5420+ graphic points
> Some people said, that it is unstable system, but I don't think so. It's continues with Battlefront.


My system is stock and tested 100% stable, so that isn't my issue.
Witcher 3 doesn't crash, nor does Tomb Raider... nothing other than BF4.


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> Well, you should get above 5400 on 1100/1000Mhz. Im getting 5420+ graphic points
> Some people said, that it is unstable system, but I don't think so. It's continues with Battlefront.


Not had this issue on older or newer drivers on SWBF.
Quote:


> Originally Posted by *Thoth420*
> 
> My system is stock tested 100% stable so that isn't my issue.
> Witcher 3 doesn't crash nor Tomb Raider...or anything other than BF4.


If you mean TR13 then I've also not had an issue on that. I don't have BF4 but hope to have TW3 soon (hopefully crimbo sales).

Are both you guys gaming on Win 10? I'm still Win 7 and only reason to go into Win 10 has been 3DM TS benches TBH for me.


----------



## Thoth420

Quote:


> Originally Posted by *gupsterg*
> 
> Not had this issue on older or newer drivers on SWBF.
> If you mean TR13 then I've also not had an issue on that. I don't have BF4 but hope to have TW3 soon (hopefully crimbo sales).
> 
> Are both you guys gaming on Win 10? I'm still Win 7 and only reason to go into Win 10 has been 3DM TS benches TBH for me.


Yep, I am on 10 with my gaming-only PC, which is the one in my sig. I run an old Win 7 box hooked to my TV for media, browsing, schoolwork, etc.

I think I figured out what was causing the problem: shader cache being set to the default "AMD Optimized". I turned it off globally, as I do not need it since the games I am playing at the moment (TW3 and BF4) are stored on my SSD.


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> Are both you guys gaming on Win 10? I'm still Win 7 and only reason to go into Win 10 has been 3DM TS benches TBH for me.


Yes, Im on Windows 10.


----------



## KR0SSED0UT

Going to likely pick up the Sapphire Fury Tri-OC in the next few days to replace my XFX 380x. I've been waiting for long enough to get 1080p 60fps and the card itself is under $300 now, I can't find an excuse not to get one.

I just wanted to ask here, where I'll probably get the best answer - how has the Fury (specifically the sapphire variant, but other versions should see similar gains) matured with drivers? I've been doing some looking but most initial video reviews are with Crimson 15.7, which is extremely aged by now.

I'm mostly curious because I had the 970 SSC and wanted Freesync + no more driver gimps (nvidia drivers NEVER improved performance for me, seriously) and I'd like to see the Fury GAINING performance, not losing it like my 970 seemed to.

(I'll be playing 1080p on the NX-VUE24A 30-144hz Freesync panel)

thoughts?


----------



## jearly410

Quote:


> Originally Posted by *KR0SSED0UT*
> 
> Going to likely pick up the Sapphire Fury Tri-OC in the next few days to replace my XFX 380x. I've been waiting for long enough to get 1080p 60fps and the card itself is under $300 now, I can't find an excuse not to get one.
> 
> I just wanted to ask here, where I'll probably get the best answer - how has the Fury (specifically the sapphire variant, but other versions should see similar gains) matured with drivers? I've been doing some looking but most initial video reviews are with Crimson 15.7, which is extremely aged by now.
> 
> I'm mostly curious because I had the 970 SSC and wanted Freesync + no more driver gimps (nvidia drivers NEVER improved performance for me, seriously) and I'd like to see the Fury GAINING performance, not losing it like my 970 seemed to.
> 
> (I'll be playing 1080p on the NX-VUE24A 30-144hz Freesync panel)
> 
> thoughts?


I had done some benchmarks in Total War: Warhammer that showed an fps improvement from one driver to the next. It's in the Fury BIOS editing thread somewhere; I think it was from 16.9.1 to 16.9.2. Anyway, get the Fury. You will have a lot of fun with your monitor.


----------



## u3a6

BTW @Johan45, just noticed something in the Fury Strix cooler pic that you posted above: http://www.vortez.net/articles_pages/asus_strix_r9_fury_dc3_oc_review,5.html

That picture is of the 980 Ti Strix, not the Fury Strix (I don't know what those reviewers were thinking -.-). The screws that hold the cooler near the VRM section are aligned differently; also, the Fury Strix only has 10 phases while the 980 Ti Strix has 12 (count the indentations on the thermal pad). Just realized this because I'm going to make a custom bracket to fit my Thermalright Shaman on my Fury Strix.

Pics for comparison:
https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/cooler2.jpg
https://www.techpowerup.com/reviews/ASUS/GTX_980_Ti_STRIX_Gaming/images/cooler2.jpg


----------



## Thoth420

Quote:


> Originally Posted by *KR0SSED0UT*
> 
> Going to likely pick up the Sapphire Fury Tri-OC in the next few days to replace my XFX 380x. I've been waiting for long enough to get 1080p 60fps and the card itself is under $300 now, I can't find an excuse not to get one.
> 
> I just wanted to ask here, where I'll probably get the best answer - how has the Fury (specifically the sapphire variant, but other versions should see similar gains) matured with drivers? I've been doing some looking but most initial video reviews are with Crimson 15.7, which is extremely aged by now.
> 
> I'm mostly curious because I had the 970 SSC and wanted Freesync + no more driver gimps (nvidia drivers NEVER improved performance for me, seriously) and I'd like to see the Fury GAINING performance, not losing it like my 970 seemed to.
> 
> (I'll be playing 1080p on the NX-VUE24A 30-144hz Freesync panel)
> 
> thoughts?


I have never owned an AMD card that did not age well with drivers, often improving for years after release. Nvidia gimps the last generation as soon as the next gen comes out. My 780 Ti was amazing until the first 9xx driver and got worse with every single one after; same thing with my 570. My 8800 was the only exception.


----------



## Johan45

Quote:


> Originally Posted by *u3a6*
> 
> BTW @Johan45 just noticed something in the fury strix cooler pic that you posted above: http://www.vortez.net/articles_pages/asus_strix_r9_fury_dc3_oc_review,5.html
> 
> That picture is of the 980ti Strix version not the Fury Strix one, (I don't know what were those reviewers thinking -.-). The screws that hold the cooler near the vrm section are aligned differently also the fury strix only has 10 phases while the 980ti strix has 12, (count the indentations on the thermal pad). Just realized this because I'm gonna make a custom bracket to fit my Thermalright Shaman on my Fury X Strix.
> 
> pics for comparison.
> https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/images/cooler2.jpg
> https://www.techpowerup.com/reviews/ASUS/GTX_980_Ti_STRIX_Gaming/images/cooler2.jpg


Nice catch, I wondered about that


----------






## Semel

Does anyone here *have* to keep their OCed Fury below the default target temperature (~75C)? If I disable my custom fan profile (up to 50% fan speed at 50C+) and use the default fan mode (the GPU then tries to stay at 75C), my 1100/550 Fury just crashes. I tested, and in order to keep my OC stable I have to keep it <60C, which tbh has never been a problem with the Strix's awesome fans.


----------



## Performer81

All graphics cards like cool temps; the higher the temp, the less efficient the chip is.

BTW, my fan curve is very strange on auto: it rises until the temp reaches about 62 degrees, then drops to idle speed again and stays there forever. I tested this several times and let it rise to 75-78, then stopped and activated the manual curve again because the fan just stayed at idle speed. XFX or Sapphire BIOS, no matter. (XFX R9 Fury Triple Dissipation)


----------



## bluezone

Quote:


> Originally Posted by *Semel*
> 
> Does anyone here *have* to keep their OCed Fury at the temperature below the default (~75C) one? If I disable my custom fan profile (up to 50% fan speed at 50C+) and use the default fan mode(the gpu then tries to stay at 75C) then my 1100\550 fury just crashes. I tested and in order to keep my OC stable I have to keep it <60C which tbh has never been a problem with trix awesome fans.


Have not seen you here in a while, Semel. I keep my Nano set via BIOS to below 75C; I'm set to 70-71C. Long-term high current draw (not short-term super-high current draw) will cause crashes for me. I think what is happening in my case is OCP cutting in: the VRMs become less efficient at higher temperatures, in my case +77C (VRM temperature) and up, and OCP protection kicks in. Short-term high draw can reach the high 80s without a crash problem. I was using 1100/550 settings but have dropped to 1050-1080/550 to reduce temperatures slightly (GPU 58-67C @ 65% fan speed on custom BIOS auto fan settings). The lower clock settings produce VRM temperatures roughly equal to GPU temperatures, as opposed to 1100/550, which can produce VRM temperatures 5-10C higher than the GPU range of 68-73C on the old BIOS set-up.
I have found, not surprisingly, that upscaling via super resolution produces higher temperatures and requires lower sustained clocks. Temperature-wise, 1080p is good for 1080/500, 1440p works best @ 1050/550, and 4K wants a drop down to 1100/500 for good temperature maintenance and quiet operation on an air-cooled Nano. Obviously GPU performance drops, but not as much as you would think.
I'm still debating a water block, but I will have to wait because I just finished investing in a PS VR and PlayStation 4 Pro set-up.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> If I disable my custom fan profile (up to 50% fan speed at 50C+) and use the default fan mode(the gpu then tries to stay at 75C) then my 1100\550 fury just crashes.


Factory ROM employs a "fuzzy logic" fan mode. There is a value in the ROM which dictates what temperature the cooling will maintain the GPU at; set it to what you want and the cooling will hold that temp. I have mine at 50C, and I increase the granularity by +100% (4836+4836=9672).
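For anyone wanting to script this instead of hand-editing in a hex editor, a minimal sketch of patching a little-endian 16-bit value in a ROM dump might look like the following. The offset here is a placeholder, not the real location of the fan target in a Fiji BIOS; you would need to find the actual offset with a hex editor or a Fiji BIOS tool first, and always keep a backup of the original ROM:

```python
# Sketch: read/write a little-endian 16-bit value in a BIOS image.
# FAN_TARGET_OFFSET is a HYPOTHETICAL placeholder, not a real Fiji address.
import struct

FAN_TARGET_OFFSET = 0x1234  # placeholder offset for illustration only

def read_u16(rom: bytes, offset: int) -> int:
    """Read a little-endian 16-bit value from the ROM image."""
    return struct.unpack_from("<H", rom, offset)[0]

def write_u16(rom: bytearray, offset: int, value: int) -> None:
    """Write a little-endian 16-bit value into the ROM image."""
    struct.pack_into("<H", rom, offset, value)

def set_fan_target(rom: bytearray, target_c: int) -> None:
    """Set the (assumed) fan target-temperature field."""
    write_u16(rom, FAN_TARGET_OFFSET, target_c)

# Example: lower the target from 75C to 50C in an in-memory copy.
rom = bytearray(0x10000)               # stand-in for a real ROM dump
write_u16(rom, FAN_TARGET_OFFSET, 75)  # pretend the factory value is 75
set_fan_target(rom, 50)
print(read_u16(rom, FAN_TARGET_OFFSET))  # 50
```

A real patch would also need the BIOS checksum corrected before flashing; that step is omitted here.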



Quote:


> Originally Posted by *Performer81*
> 
> BTW my fan curve is very strange at auto, rises till temp reaches about 62 degrees then drops to idle speed again and stays there forever. Tested several times and let it rise to 75-78, then stopped and activated manual curve again because fan just stayed at idle speed. XFX and sapphire bios no matter. (XFX R9 Futry triple dissipation)


Mod the above 2 values in the image and you should be able to make the "auto" fan react how you want; there are other tweaks that can be done to the fan profile, but they require manual hex editing.


----------



## bluezone

Apparently the next colour version of the AMD drivers (anniversary edition?) is arriving soon.

EDIT: They announced it and then pulled the announcement.


----------






## phantommaggot

Just got a Nano.
It's in a Node 304.
I was hoping it would fit with the HDD cage in... but I doubt it's going to.
Feel kind of dumb for not just buying the Fury Nitro now.

Anyway, what are some good tweaks to keep it cool, quiet, and above 1000 MHz?


----------



## Performer81

Quote:


> Mod above 2 values in image and you should be able to make "auto" fan react how you want, there are other tweaks that can be done to fan profile but require manual hex editing.


Works nicely. I set it to a 62-degree target temp and the fans stay quiet. Better than a manual fan curve.


----------



## Simmons572

Quote:


> Originally Posted by *phantommaggot*
> 
> Just got a Nano.
> It's in a node 304
> I was hoping it would fit with the HDD cage in.. but I doubt it's going to.
> Feel kind of dumb for not just buying the fury nitro now..
> 
> Anyways.
> What are some good tweaks to keep it cool/ quite/ and above 1000 mhz?


Completely unrelated to your question, but how much did you pay for that Nano? I am looking to get one, and I am not quite sure what a "good price" is for this card.


----------



## weespid

Hello, I just picked up a used Nano for a great price ($250 CAD); for the guy above me, I think $250 USD is a reasonable price. I'm looking to get all my displays connected to the card, but my google-fu has landed me nowhere further than what I knew from the 290 I had:

top TV - HDMI
middle ultrawide - DisplayPort
left 1080p - top DVI
right 1080p - other DVI

The Nano has 3 DisplayPorts and one HDMI, as you all know. I know the 290 had 3 TMDS clocks (HDMI and DVI each require one of these to work), but nowhere can I find how many the Nano, or any Fury card for that matter, has; I'm assuming it has at least 2 like the Fury.
http://www.sapphiretech.com/productdetial.asp?pid=B962294E-9DBC-470E-A817-46EC0AA5B14A&lang=eng
says it can do 5 displays without active DisplayPort adapters, and there is confirmation from AMD Matt here, though it's not really descriptive:
https://community.amd.com/thread/197500

Since I don't think the display controller changed between Fury X / Nano / Fury, the cheapest possible way to connect my monitors is:

top - passive DisplayPort-to-HDMI
center - native DisplayPort
right - native HDMI
left - active DisplayPort-to-VGA (it's an old Dell panel, so the ADC is actually quite good; not as good as straight digital, but not bad)

It would be great to know whether these Furys share the same output capabilities as the 290, with 3 TMDS clocks (allowing 2 passive DisplayPort-to-HDMI adapters), or whether they have even fewer than the HD 5xxx series, with only one (for the HDMI), in which case I'd have to use two active VGA adapters.

Any of the three ways, this will be around $7 in adapters, which seems worth spending on these old, now sub-$100 panels; their image quality is not all that great to begin with, so spending $40 on the "proper" adapters is too much to justify for side panels.

Also, for the people with dual-link 1440p monitors: you can run DVI from the HDMI port and unlock the pixel clock; the tool you're looking for is the AMD/ATI Pixel Clock Patcher (https://www.monitortests.com/forum/Thread-AMD-ATI-Pixel-Clock-Patcher) to get your glorious refresh rates. Or get a passive DisplayPort-to-DL-DVI adapter (this should theoretically work, although I have never seen anyone try it, or seen a passive adapter marked as single- or dual-link rather than just DVI).

There is a lot of misinformation going around on this subject, and I'm just trying to help clear it up (as well as connect my displays).

TL;DR: does the R9 Nano support 2 legacy connections, through 1 passive adapter plus the HDMI port?
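A rough sense of why single-link limits and the pixel-clock patcher matter can be had from simple arithmetic: pixel clock is roughly total pixels per frame (active plus blanking) times refresh rate. The blanking values below are loose assumptions, not exact CVT timings:

```python
# Rough pixel-clock estimate; blanking intervals are assumptions, not
# exact CVT figures, so treat the output as ballpark only.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=40):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# Approximate TMDS link limits:
SINGLE_LINK_DVI = 165.0  # MHz
DUAL_LINK_DVI = 330.0    # MHz

for (w, h), hz in [((1920, 1080), 60), ((2560, 1440), 60), ((2560, 1440), 96)]:
    clk = pixel_clock_mhz(w, h, hz)
    link = ("single-link" if clk <= SINGLE_LINK_DVI
            else "dual-link" if clk <= DUAL_LINK_DVI
            else "beyond DL-DVI")
    print(f"{w}x{h} @ {hz} Hz ~ {clk:.0f} MHz ({link})")
```

With these assumptions, 1080p60 lands around 140 MHz (fine on single-link), while 1440p60 needs roughly 240 MHz, which is why 1440p panels need dual-link DVI or an unlocked pixel clock for high refresh.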


----------



## Alastair

I am busy redoing my loop to install my new rads and pump. And while reading through the manual of my EK Fury X block and backplate (I needed to find the thickness of pads needed) I saw it recommended applying thermal compound between the thermal pad and the heat source. Does this improve performance?


----------



## lanofsong

Hey AMD R9 Radeon Fury / NANO / X / Pro DUO FIJI owners,

We are having our monthly Foldathon from Monday the 21st to the 23rd, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those two days? If so, come sign up and fold with us - see the attached link.

November Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number - 37726.

later
lanofsong


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> I am busy redoing my loop to install my new rads and pump. And while reading through the manual of my EK Fury X block and backplate (I needed to find the thickness of pads needed) I saw it recommended applying thermal compound between the thermal pad and the heat source. Does this improve performance?


I can ask the shop that assembled my loop whether they did or not, if you want; I am using a full-coverage EK block on my Fury X. Temps have never gone above 47C on the Fury X using 240mm (exhaust) and 360mm (intake) rads and a D5 pump/res combo (250ml), which is cooling just the GPU and an i7 6700K at a 4.4 GHz static clock (basically set to turbo speeds manually) running stock voltage (the CPU maxes out around 55C). I am guessing that if it isn't normal practice, they most likely never read the manual and blocked it business as usual. I have never had my VRMs heat up to a point of concern either, at least not according to what my software is telling me.


----------



## damarad21

Do you feel a brand-new Sapphire Nitro Fury is still worth it at €280?
It should do much better than my 290X; I'm thinking of selling it and getting a Fury now that there are some good offers.
Or would it be much better to go for an Nvidia 1070?
Thanks


----------



## PontiacGTX

Do you think it's worth paying $80 more for a new R9 Fury over an RX 480 4GB for 1080p?


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *PontiacGTX*
> 
> do you think is worth paying 80usd more for a a new R9 Fury over a RX 480 4GB for 1080p?


Do you intend to stay at 1080p for years? As long as you don't reach VRAM limits Fiji will benefit you more at higher resolutions.


----------



## damarad21

In my case it's for playing at 2560x1600. I hope the Fury will do the job.


----------



## neurotix

It's worth it for higher resolutions.

2560x1600 probably counts as a higher resolution. That's 4 million pixels, only 2 million less than my Eyefinity setup (5760x1080). 1080p is roughly 2 million pixels, so your resolution is close to double that of 1080p.
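The pixel counts above can be checked with quick arithmetic:

```python
# Sanity check of the resolution comparisons in the post.
res_1080p = 1920 * 1080      # 2,073,600 pixels (~2 million)
res_1600p = 2560 * 1600      # 4,096,000 pixels (~4 million)
res_eyefinity = 5760 * 1080  # 6,220,800 pixels (~6 million)

print(res_1600p / res_1080p)      # ~1.98x, close to double 1080p
print(res_eyefinity - res_1600p)  # ~2.1 million fewer than Eyefinity
```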

The 1070 is $100 more, but the Fury is still competitive with it, and from what I hear DirectX 12 evens the playing field, making the Fury/Fury X as good as, or in some cases better than, the GTX 1070. Whether this is true, or whether it will last, only time will tell. But considering that AMD has a long history of continuing to optimize its cards, whereas Nvidia stops after releasing the next generation (it won't be long until Volta), I won't be surprised if the Fury and Fury X only get better with time. Not bashing Nvidia; even most objective Nvidia users agree this happens. Nvidia purposely gimps older cards not just through drivers, but through stuff like GameWorks.

As far as compared to the RX 480/Polaris... there's no contest. The Fury is still faster, especially at higher resolutions, however the low amount of VRAM may eventually become an issue.

Feel free to look through the graphs in this review of the RX 480; they mostly show the RX 480 being the better card at 1080p, as others are saying, while once you raise the resolution the Fury dominates the RX 480, often getting 10-15 fps more at 1440p and 4K.

I got my two Fury Nitros for less than the price of a GTX 1080 at the time, and they provide equivalent or better performance in most cases. Since I use Eyefinity, I HAVE to have two cards. The games that don't support Crossfire are usually low-spec games anyway that you could run on a potato (Japanese console ports and the like). This setup literally maxes out The Witcher 3 and Rise of the Tomb Raider, a constant 60 fps with every setting cranked to the max at 5760x1080. Those games seem to be pretty much the high-water mark as far as demand goes, so I think I'm set for a couple of years. I will probably wait for AMD Navi to upgrade (2019?).

Anyway, even with one of these cards you won't regret it. They have one of the best-performing, quietest coolers on the market. I think you'll be happy.


----------



## Thoth420

Quote:


> Originally Posted by *neurotix*
> 
> It's worth it for higher resolutions.
> 
> 2560x1600 probably counts as higher resolution. That's 4 million pixels. Only 2 million less than my Eyefinity (5760x1080). 1080p is roughly 2 million pixels so your resolution is close to double that of 1080p.
> 
> The 1070 is $100 more but the Fury is still competitive with it, and from what I hear DirectX12 only evens the playing field and makes the Fury/Fury X as good or in some cases better than the GTX 1070. Whether this is true or not, or whether it will last, only time will tell. But considering that AMD has a long history of continuing to optimize it's cards, whereas Nvidia stops after releasing the next generation (It won't be long until Volta), I won't be surprised if the Fury and Fury X only get better with time. Not bashing Nvidia, but even most objective Nvidia users agree with this happening. Nvidia purposely gimps older cards through not just drivers, but stuff like GameWorks.
> 
> As far as compared to the RX 480/Polaris... there's no contest. The Fury is still faster, especially at higher resolutions, however the low amount of VRAM may eventually become an issue.
> 
> You can feel free to look through the graphs in this review of the RX 480, they will mostly show the RX 480 being the better card at 1080p as others are saying, while once you raise the resolution higher, the Fury dominates the RX 480, often getting 10-15fps more at 1440p and 4K.
> 
> I got my two Fury Nitros for less than the price of a GTX 1080 at the time, and they provide equivalent or better performance in most cases. Since I use Eyefinity, I HAVE to have two cards. For the games that don't support Crossfire, well those are usually low spec games anyway and you could run them on a potato. (Japanese console ports and the like). This thing literally maxes out Witcher 3 and Rise of the Tomb Raider, 60 fps constant with every setting cranked to the max at 5760x1080. Those games seem to be pretty high water mark as far as demand goes so I think I'm set for a couple years. I will probably wait for AMD Navi to upgrade. (2019?)
> 
> Anyway, even with one of these cards you won't regret it. And they have one the best performing, quietest coolers on the market. I think you'll be happy.


Posts like this make me regret not just getting a vanilla Fury and staying air/clc's...draining a loop is a pain...


----------



## neurotix

Quote:


> Originally Posted by *Thoth420*
> 
> Posts like this make me regret not just getting a vanilla Fury and staying air/clc's...draining a loop is a pain...


I have a pretty high-end rig, I think; I've put a lot of money into it. I've never built a water loop, but I've studied it extensively and could probably afford to do one if I wanted to. I'm pretty confident I could put it together right; if you doubt me, go look at pictures of Big Red and how nice it looks. A water loop would be no problem for me. Not trying to gloat, just trying to make a point: I could most likely put a loop together.

However, I *haven't* gone water, even when friends on this site have told me to, because I feel it's not worth the hassle: draining it every few months, the extra weight (this thing is heavy already and I have a bad back!), and the fact that you're *locked in to whatever cards you're using*, since you'll have to buy new waterblocks when you get a new graphics card. GPUs are expensive enough already, and it seems like every year they get more expensive; look at the GTX 1080 at launch, sheesh. Adding $100 on top of that for a block is ridiculous imo, especially when you need more than one. I would only put a water loop in a system if it was a permanent, high-end thing, something like quad-SLI Titans (where, due to the spacing between cards, a quad configuration pretty much necessitates water cooling).

I've also read that, at least for the CPU, a CLC or a very high-end air cooler gives more or less the same temps as a CPU loop with a 240mm rad, within a few degrees. And honestly, most CPUs are designed to run fine up to 90C or so, so does it matter if your CPU is 60C or 40C when gaming? Most CPUs also have a voltage wall, where they won't go higher than a certain frequency no matter how much voltage you add; mine is like this @ 4.8 GHz, and it won't do higher than that no matter how much voltage I give it, I've tried. In this case a water loop wouldn't even help me overclock higher!

Anyway, if I'm getting this right, I think what you're trying to say is that you feel you should have gotten two regular Fury's instead of one Fury X and a water loop. And I certainly didn't intend to make you feel that way. Either that or it's some kind of passive aggressive jab at my last post. (Sorry, sometimes I can't tell. People tend not to like what I have to say most of the time. Apologies if I'm wrong.)


----------



## damarad21

Neurotix, great post! Many thanks


----------



## Thoth420

Quote:


> Originally Posted by *neurotix*
> 
> I have a pretty high end rig, I think, I've put a lot of money in it, I've never built a water loop but studied it extensively, and I could probably afford to do one if I wanted to. I'm pretty confident I could put it together right, if you doubt me then go look at pictures of Big Red and how nice it looks. A water loop would be no problem for me. Not trying to gloat, just trying to make a point, I could most likely put a loop together.
> 
> However, I *haven't* gone water, even when friends on this site have told me to, because I feel it's not worth the hassle. Draining it every few months, the extra weight (this thing is heavy already and I have a bad back!), and the fact that you're *locked in to whatever cards you're using*. Since you'll have to buy new waterblocks when you get a new graphics card. GPUs are expensive enough already and it seems like every year they get more expensive, look at the GTX 1080 at launch, sheesh. Adding $100 on top of that for a block is ridiculous imo, especially when needing more than one. I think I would only put a water loop in a system if it was a permanent, high end thing, something like Quad SLI Titans (which due to the spacing between cards, a Quad configuration pretty much necessitates water cooling).
> 
> I've also read that at least for the CPU, a CLC or very high end air cooler is actually more or less the same temps as a CPU loop with a 240mm rad. Within a few degrees of each other. And honestly, most CPUs are designed to run fine up to 90C or so, so does it matter if your CPU is 60C or 40C when gaming? Most CPUs also have a voltage wall, where they won't go higher than a certain frequency no matter how much voltage you add... mine is like this @ 4.8ghz, it won't do higher than that no matter how much voltage I give it, I've tried. In this case a water loop wouldn't even help me overclock higher!
> 
> Anyway, if I'm getting this right, I think what you're trying to say is that you feel you should have gotten two regular Fury's instead of one Fury X and a water loop. And I certainly didn't intend to make you feel that way. Either that or it's some kind of passive aggressive jab at my last post. (Sorry, sometimes I can't tell. People tend not to like what I have to say most of the time. Apologies if I'm wrong.)


I didn't build this loop... I'm not confident in my abilities. It is absolutely an unnecessary hassle and not worth the thermal headroom to me in hindsight. Your system is what I wish I had right now (less any mobo by ASUS, but that is a personal preference)... all good if you misunderstood me; things are lost in text.









I understand the mistake; lots of people on these boards can be, well... nasty, I guess. The justification of spending a lot of cash for marginal to no benefit, I guess....

The only thing this rig does that the ones I'd feel comfortable building on my own cannot is look a bit sexier, and that doesn't do a thing for raw performance, stability, ease of use, or maintenance, so in the end it was kind of not worth it. Thankfully I have a console (Xbox One) and another Win 7 1080p gaming/HTPC to hold me over if this thing has issues. Downtime is not fun, and having a rig like yours means you will see little to none. Good decision, and I am nothing but envious; my post was just reflective of that. If you made me feel that way, it is a good thing; I come here mostly to learn, and sometimes you have to learn from a mistake. I wouldn't want anyone else to make the same mistake for the same reason. Either way, this system will always make a good showpiece. I am already dying to build, since I skipped this one and had a shop do it, so I may be putting something else together soon just to do it. I am glad you are happy with your system, especially a multi-GPU AMD system; they need more love and certainly deserve it, as AMD has stepped up their driver and software suite game and their hardware architecture has always been on point. If I have a bone to pick with anyone, it is Intel and Nvidia the past few years; total downhill from both since the Sandy Bridge days.

Seriously though, that rig is so perfect down to every last hardware choice. Regardless of what you spent, you got value for every penny, whereas the cost of mine was half bling.


----------



## neurotix

Quote:


> Originally Posted by *Thoth420*
> 
> I didn't build this loop...not confident in my abilities. It is absolutely an unnecessary hassle and not worth the thermal headroom to me in hindsight. Your system is what I wish I had right now....all good if you misunderstood me...things are lost in text.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I understand the mistake lots of people on these boards can be very well...nasty I guess. The justification of spending alot of cash for marginal to no benefit I guess....
> 
> The only thing this rig does that the ones I can feel comfortable building on my own cannot is look a bit sexier and that doesn't do a thing for raw performance, stability or ease of use or maintenance so in the end it was kind of not worth it. Thankfully I have a console(xbox one), another win 7 1080p gaming/htpc to hold me over if this thing has issues. Downtime is not fun and having a rig like yours means you will see little to none. Good decision and I am nothing but envious and my post was just reflective of that.


Finally someone with his head on right. Good on you.

I agree, I basically don't want the hassle and extra cost.

I can understand if you're into it because it's your hobby (e.g. you are an enthusiast of PC water cooling parts), but it seems like some people like the water cooling parts more than the actual PC components. I personally am into graphics cards, but at the same time there's no way I can afford a quad setup, or 2 GTX 1080s or whatever. Let alone the money to water cool them. Besides, I have a huge retro gaming hobby, I have a wife who likes to go outside, and so forth.














If I won the lottery I'd build a fully watercooled quad Crossfire setup in a heartbeat, or even a dual card watercooled setup. Because then I could still have money left to upgrade the cards, as well as buy new blocks and so on.

But, I still probably wouldn't want the extra hassle of maintenance on the loop. When you think about it, my PC gives me enough hassle already with stuff like random software issues, Windows being stupid and so forth. There's also 4 other PCs I built in the house. There isn't a week that goes by that I'm basically not doing tech support on something here. Some of it is getting old. If I also had to do maintenance on my loop so often... it's just extra unpaid work I don't need. I already take this thing down and clean the radiator/fans every few months, that's enough (that's why it looks so clean...)

Maybe if it's too much for you, sell that rig and use the proceeds to build a new, simpler one? That looks like a lot of money in watercooling parts. You can still make it look good without water cooling. Get some sleeved PSU extensions. Get colored shrink tubing for the radiator hoses on your H100 what-have-you. Paint your H100 radiator. If you have a case with grills, paint them. If you have access to the tools and the knowledge, get into case modding and do case mods. I might actually do some of this stuff to mine.

Anyway if you feel like it, stop by my build log and look around. The links in my sig.

As far as the Fury goes.... not much more to say other than I love mine. Oh and maybe this.


----------



## Thoth420

Quote:


> Originally Posted by *neurotix*
> 
> Finally someone with his head on right. Good on you.
> 
> I agree, I basically don't want the hassle and extra cost.
> 
> I can understand if you're into it because it's your hobby (e.g. you are an enthusiast of PC water cooling parts), but it seems like some people like the water cooling parts more than the actual PC components. I personally am into graphics cards, but at the same time there's no way I can afford a quad setup, or 2 GTX 1080s or whatever. Let alone the money to water cool them. Besides, I have a huge retro gaming hobby, I have a wife who likes to go outside, and so forth.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I won the lottery I'd build a fully watercooled quad Crossfire setup in a heartbeat, or even a dual card watercooled setup. Because then I could still have money left to upgrade the cards, as well as buy new blocks and so on.
> 
> But, I still probably wouldn't want the extra hassle of maintenance on the loop. When you think about it, my PC gives me enough hassle already with stuff like random software issues, Windows being stupid and so forth. There's also 4 other PCs I built in the house. There isn't a week that goes by that I'm basically not doing tech support on something here. Some of it is getting old. If I also had to do maintenance on my loop so often... it's just extra unpaid work I don't need. I already take this thing down and clean the radiator/fans every few months, that's enough (that's why it looks so clean...)
> 
> Maybe if it's too much for you, sell that rig and use the proceeds to build a new, simpler one? That looks like a lot of money in watercooling parts. You can still make it look good without water cooling. Get some sleeved PSU extensions. Get colored shrink tubing for the radiator hoses on your H100 what-have-you. Paint your H100 radiator. If you have a case with grills, paint them. If you have access to the tools and the knowledge, get into case modding and do case mods. I might actually do some of this stuff to mine.
> 
> Anyway if you feel like it, stop by my build log and look around. The links in my sig.
> 
> As far as the Fury goes.... not much more to say other than I love mine. Oh and maybe this.


I just wanted the bling...lol.








A fool's errand with my level of knowledge in regard to custom water cooling... I am also a bit paranoid when it comes to modding, so that weighs in. I think I could have built this, technically, but I don't have the confidence to try. I should also mention some of this was a gift, so that also factored into my eyes being bigger than my stomach, so to speak. Lesson learned, I guess, and worst case I have something to experiment with, as I find I learn best from reverse engineering.

Nice rankings; those Nitros are exactly what I would buy or recommend for anyone looking for a Fury, no contest. Sapphire and XFX have the best warranties and usually the best cards. I had some bad luck with MSI Lightnings, but others seem to like them, so maybe just bad luck there.

Money isn't an issue with a new build, just a matter of finding room to fit it and justifying it to the womenfolk, haha. They all say the same thing when I ask why something vanished here: "I was tired of looking at it...", and that is the one thing none of them say about Kung Fury, so it has its merit there. Again... aesthetics over perf.









I am also strong on certain retro titles, and thankfully, outside of a few resolution issues, they mostly play OK on this system; it is more of an issue with newer titles, and I am getting to the point where re-seating the GPU is going to be last on the list. If it wasn't piped in, this would already have been checked off the troubleshooting list. The worst part is that the system is butter smooth about 99% of the time, my frame stability shows 99.8% in 3DMark, and AIDA64 shows all green with some impressive numbers, but for that other 1% the perf goes straight into the gutter. No crashes, but some hard freezes, and definitely not a problem you want with a CPU and GPU hard-tubed in.


----------



## neurotix

When I was talking about retro titles I was talking more about... NES and Sega Master System. PC Engine. Sega Genesis. Super Nintendo.







23 consoles and counting... I run a club for it in my sig. (That stuff can be very expensive.) It's my other hobby.

The crashes and freezes seem weird, unless your CPU overclock isn't stable. Personally, I never really crash or freeze unless I've changed something recently. The rest of the time, this thing can stay up for days or weeks at 4.5GHz, with 2400 RAM at loose timings. I only run 4.8 for benching and very demanding games that need it. (With 2x Furys, I'm yet to find a game that does.) This thing also acts as my fileserver; I got a 4TB drive recently and I might get another. I use Plex Media Server to stream anime, movies and TV shows to my Chromecast/PS3 on my 55", and sit in my recliner. I also have it set up to stream my music. My wife got a new laptop recently (it's rather low end but nice), and it's one of those that flips around and acts like a tablet. She's even tried streaming Plex to it over the internet, and it works. So, my point is that this thing acts as a server too; it needs to be stable.

Tbh I don't use it for gaming much, even after getting the Fury's, and new monitors (3x 1080p IPS thin bezel...) The last PC game I think I actually finished was Dragon Age Inquisition at the end of 2014. Unfortunately I have fibromyalgia, bad back, and my neck, head and jaw were messed up from ECT this year. So a lot of the time it's very uncomfortable if not painful to sit up in my computer chair.

It's rather stupid really but when I do play games I rarely play anything demanding, or in Eyefinity. I was playing Attack on Titan Wings of Freedom a few weeks back, it could run on a GTX 550 with probably no issues. I wanna play Shadowrun Returns and try and complete all three scenarios, and that's even lower spec. There's a game called I Am Setsuna I have that is really similar to what I liked as a kid, I wanna play that too. It's dumb, but I'm just not really into all the Western action games, I'm no good at fps, I don't like multiplayer games (especially competitive)... the recent Tomb Raider games seem good, I've played them a bit, but I usually end up playing something else instead. Maybe I'm in the wrong hobby. I dunno.


----------



## Thoth420

Quote:


> Originally Posted by *neurotix*
> 
> When I was talking about retro titles I was talking more about... NES and Sega Master System. PC Engine. Sega Genesis. Super Nintendo.
> 
> 
> 
> 
> 
> 
> 
> 23 consoles and counting... I run a club for it in my sig. (That stuff can be very expensive.) It's my other hobby.
> 
> The crashes and freezes seem weird, unless your CPU overclock isn't stable. Personally, I never really crash or freeze unless I've changed something recently. The rest of the time, this thing can stay up for days or weeks at 4.5ghz, 2400 RAM with loose timings. I only run 4.8 for benching and very demanding games that need it. (With 2x Fury's I'm yet to find a game that does.) This thing acts as my fileserver, I got a 4TB drive recently and I might get another. I use Plex Media Server to stream anime, movies and TV shows to my Chromecast/PS3 on my 55", and sit in my recliner. I also have it set up to stream my music. My wife got a new laptop recently (it's rather low end but nice), and it's one of those that flips around and acts like a tablet. She's even tried streaming Plex to it over the internet, and it works. So, my point is that this thing acts as a server too, it needs to be stable.
> 
> Tbh I don't use it for gaming much, even after getting the Fury's, and new monitors (3x 1080p IPS thin bezel...) The last PC game I think I actually finished was Dragon Age Inquisition at the end of 2014. Unfortunately I have fibromyalgia, bad back, and my neck, head and jaw were messed up from ECT this year. So a lot of the time it's very uncomfortable if not painful to sit up in my computer chair.
> 
> It's rather stupid really but when I do play games I rarely play anything demanding, or in Eyefinity. I was playing Attack on Titan Wings of Freedom a few weeks back, it could run on a GTX 550 with probably no issues. I wanna play Shadowrun Returns and try and complete all three scenarios, and that's even lower spec. There's a game called I Am Setsuna I have that is really similar to what I liked as a kid, I wanna play that too. It's dumb, but I'm just not really into all the Western action games, I'm no good at fps, I don't like multiplayer games (especially competitive)... the recent Tomb Raider games seem good, I've played them a bit, but I usually end up playing something else instead. Maybe I'm in the wrong hobby. I dunno.


I'm really OCD, so most of these issues are most likely just crappy game performance compared to my expectations, or a driver not playing nice because I have some old junk lying around from a beta I installed. I have never had a BSOD with this rig, and it stays on all the time with a very clean Event Viewer log, but every once in a while, with games that use the net (be it for cloud syncing like Hitman auto saves or multiplayer shooters like BF), my performance goes into the gutter. I suspect it is an OS, BIOS config or software issue, as I have tested this system with every last program that could make it crash, to no avail. I even have a full copy of AIDA64 monitoring it, and that shows everything in the green.

I think my expectations for PC gaming, after downtime spent playing the console, were just beyond reality, because like I said it is very rare that I run into an issue, and my physical location could be the problem, so I don't want to blame the hardware. There are major power issues on this street, enough that I run a battery backup on all my electronics because of brownouts, and the ISP service is also quite spotty... this could also all be server-side issues with games. Anything I play that stores locally seems to work fine, but it is also so old that my hardware just laughs at it, so it's kind of hard to tell unless I marathon something.

I am 36, but I tend to come off younger here because I got into PC building and overclocking later than most people my age who are into this stuff, though I have always been a gamer since the NES. I love a myriad of games, but the PC is mostly my multiplayer fragbox and something to play those old PC-only titles on. I do love my first-person shooters, though, all kinds. I miss the tactical shooters of old such as Rainbow Six 3 and SWAT 4 and still play them today hoping for their return, but I also enjoy the more arcade-style ones that are out now to some degree... mostly due to the lack of anything more hardcore with the support we used to see from the so-called AAA gaming companies.









I am a firm believer in the "play whatever you enjoy" mantra. If you enjoy what you are playing there is no way that possibly could be the wrong hobby...in fact that is what makes gaming so awesome...the scope is wide and the genres abound.


----------



## neurotix

If it says anything, I'm a huge fan of JRPGs. The PC isn't really the best system for those; right now the 3DS, Vita or PS4 is (I have a 3DS and Vita but no PS4). I grew up with the classic Final Fantasy games, Chrono Trigger, etc. That's the stuff I love the most, and why I wanna play I Am Setsuna.

And by all means, if you still have any of those old consoles and play them, or even a retro gaming PC, then come check my club out. You'll fit in well.

Heading to bed.

neuro


----------



## Thoth420

Quote:


> Originally Posted by *neurotix*
> 
> If it says anything, I'm a huge fan of jRPGs. PC not really the best system for those, right now the 3DS, Vita or PS4 is (I have a 3DS and Vita but no PS4). I grew up with the classic Final Fantasy, Chrono Trigger etc. That's the stuff I love the most. That's why I wanna play I Am Setsuna.
> 
> And by all means, if you still have any of those old consoles and play them, or even a retro gaming PC, then come check my club out. You'll fit in well.
> 
> Heading to bed.
> 
> neuro


I loved FF up to 7, and Chrono Trigger is a masterpiece! I certainly will check it out. Night duder


----------



## Nameless101

Hi all,
So I'm planning on building a custom water loop and integrating my Fury X into it. I've already ordered the Aquacomputer block for it, but they don't have a backplate available. The EK or Watercool backplate might well work in tandem with it, but how necessary is it really? Will it affect VRM cooling in any significant way? Any help would be appreciated!


----------



## Radox-0

Quote:


> Originally Posted by *Nameless101*
> 
> Hi all,
> So I'm planning on building a custom water loop and integrating my Fury X into it. I've already ordered the Aquacomputer block for it, but they don't have a backplate available. The EK or Watercool backplate might well work in tandem with it, but how necessary is it really? Will it affect VRM cooling in any significant way? Any help would be appreciated!


Possibly a slight benefit, but nothing significant; going without one wouldn't cause an issue. I used a Fury X for a while and then a Nano, both under water, and due to being on a riser card I never needed a back plate and had zero issues in terms of overclocking either.

Mostly an aesthetic enhancement, I believe.


----------



## Nameless101

Quote:


> Originally Posted by *Radox-0*
> 
> Possibly a slight benefit, but nothing significant; going without one wouldn't cause an issue. I used a Fury X for a while and then a Nano, both under water, and due to being on a riser card I never needed a back plate and had zero issues in terms of overclocking either.
> 
> Mostly an aesthetic enhancement, I believe.


That's great information, thanks. I think I'll save myself the cost and potential hassle then. In any case, it can always be added later!


----------



## Khr1s

I ordered a Sapphire Fury Nitro today for 300 euros







I can't wait to fit this beast in my Evolv ATX! I would have gone for an RX 480, but because I play at 1080p I went for the Fury!

My last card was a 7790.


----------



## ht_addict

Quote:


> Originally Posted by *Nameless101*
> 
> Hi all,
> So I'm planning on building a custom water loop and integrating my Fury X into it. I've already ordered the Aquacomputer block for it, but they don't have a backplate available. The EK or Watercool back plate might well work in tandem with it, but how necessary is it really? Will it affect VRM cooling in any significant way? Any help would be appreciated!


I went with the EKWB (44% off) for my Fury X's with the back plates. Nothing like sitting at mid 20s when on the web to mid/upper 30s when gaming at Ultra settings. Personally, go with the back plate for the passive cooling of the VRMs. The more cooling the better in my book. EKWB sells them for $35 on their website. Worth the money.


----------



## Nameless101

Quote:


> Originally Posted by *ht_addict*
> 
> I went with the EKWB (44% off) for my Fury X's with the back plates. Nothing like sitting at mid 20s when on the web to mid/upper 30s when gaming at Ultra settings. Personally, go with the back plate for the passive cooling of the VRMs. The more cooling the better in my book. EKWB sells them for $35 on their website. Worth the money.


That's fair enough. I will definitely check how my temps do without a backplate and then act accordingly. Unfortunately, I think I may only get everything up and running in January.


----------



## JunkaDK

Quote:


> Originally Posted by *ht_addict*
> 
> I went with the EKWB (44% off) for my Fury X's with the back plates. Nothing like sitting at mid 20s when on the web to mid/upper 30s when gaming at Ultra settings. Personally, go with the back plate for the passive cooling of the VRMs. The more cooling the better in my book. EKWB sells them for $35 on their website. Worth the money.


Fury X under water is just SEXY AF!







Recently finished my build for now


----------



## Thoth420

Backplate is optional but the card looks much better with one. I had mine painted white and it looks fantastic that way.


----------



## neurotix

Quote:


> Originally Posted by *JunkaDK*
> 
> Fury X under water is just SEXY AF!
> 
> 
> 
> 
> 
> 
> 
> Recently finished my build for now


This looks sick. Nice job man.

And to the guy coming from a 7790 pfft LOL you're gonna be blown away by the Fury Nitro. Congrats!


----------



## Arizonian

Very nice JunkaDK. Cable work is sweet. Enjoy.









Bought a Nitro Fury for $480 over a year ago and it's been the best price/performance move I've made since the GTX 580 days. I have no problem waiting on Vega while I'm playing @ 4K UHD. Saw them going for $270 a bit ago; I almost wanted to Crossfire mine, but I'm sticking with a strong single-GPU solution.


----------



## damarad21

Won't crossfiring 2 Fury Nitros generate a lot of heat? I'm tempted today.


----------



## Thoth420

I really want to just switch up to 4K 60Hz for now, since the only game I play that could make any argument for benefiting from 144Hz is BF4, and I can roll scrubs all day even with a 60Hz panel in that. I am just concerned that a single Fury X will not satisfy me, and I have no plan to swap this GPU or add a second Fury X or Fury to this build. I don't mind dropping settings, removing AA, etc., but I still just don't see my performance being that great.


----------



## neurotix

Quote:


> Originally Posted by *damarad21*
> 
> Won't crossfiring 2 Fury Nitros generate a lot of heat? I'm tempted today.


No, but it depends on a few things: the size of your case, your ambient temps, your airflow, positive vs negative pressure, driver settings, fan profile, overclocking and so on.

I have a HUGE Corsair 780T open air case, it has great airflow, you can see pics in my sig rig (Big Red). The cards have the two front fans blowing fresh air on them. Regardless, they're pretty close to each other, only about an inch apart.

I use some settings in my drivers to keep temps down: Vsync ON, Frame Rate Target Control at 60 fps, Power Efficiency, and Frame Pacing. This helps limit the number of frames the cards draw; if they hit 60, they don't draw any more than that, even though they're capable of more (these things get up to 230 fps in some scenes in Valley with all that stuff off for benching, so yeah).

I generally don't overclock at all for gaming. No need with two of these things. I also came close to maxing out the power draw from my PSU when I overvolted them by +50mV or so for 1125MHz: I saw 998 watts on my Kill-a-watt meter. That's bad, as I only have a 1000W power supply, so I decided not to push further. Maybe they can overclock higher, but I'm not willing to risk it. Even with the pair at 1100MHz I get good bench scores anyway.

Anyway, with those driver settings on, and my case layout, and a fairly aggressive fan profile (100% at 70C), my top card only hits 60C and bottom 52C in The Witcher 3 at 5760x1080. Every single setting on Ultra. And a constant 60 fps. It's the same or less temps for other games, Tomb Raider 2013 Eyefinity my top card is around 52C and bottom card is 45C. Oh yeah, and my ambient temps are generally right around 21C.

So, if your motherboard has enough space between the slots, you have a good 1000W PSU, you have good airflow, your ambients aren't too high, and you turn those driver options on I mentioned, you can expect very good temps and high performance. If any of the stuff I mentioned is missing, well, I'm not sure how it'll be.

These things can draw massive amounts of power when overclocking so be prepared for that. If I overclock mine at all, I generally just do 1100/500mhz. What I listed in my sig is basically my bench settings.


----------



## damarad21

I'm afraid I only have a very good 850W PSU. My whole rig is in a nice Cooltek W2 case. Considering the 5960X + Sapphire Fury Nitro and plenty of other stuff, I think it is complete and well balanced for 2560x1600 gaming. I'm also getting old, and being a "new father" I have less and less time to enjoy it...


----------



## neurotix

You might be able to get by with an 850W as long as you don't overclock the cards or raise the power limit. It depends somewhat on your CPU, too; you probably shouldn't OC it either, or at least not to the moon. With an AMD CPU, probably no way; with an i3 or i5, you might be able to OC and run the Furys stock. You could also decrease the power limit, undervolt and underclock.


----------



## Thoth420

Quote:


> Originally Posted by *damarad21*
> 
> I'm afraid I only have a very good 850W PSU. My whole rig is in a nice Cooltek W2 case. Considering the 5960X + Sapphire Fury Nitro and plenty of other stuff, I think it is complete and well balanced for 2560x1600 gaming. I'm also getting old, and being a "new father" I have less and less time to enjoy it...


Welcome to the club (of being a parent, not of owning a GPU)! I find I have way less time these days with family, school and work, so of course I have less time than in the days of marathon WoW Gladiator/best-in-slot PvE raiding, but I find that when I do have time to game, even the most casual stuff feels much more rewarding. I recently went back to WoW, and even with all the time-saving changes it still seems daunting; I cannot even fathom how much free time I used to have to put into it, or into gaming in general.

That rig will certainly do the trick for a hardcore gamer, let alone someone in our shoes these days. I run 2560x1440 @ 144Hz on a 6700K and Fury X, so you should be more than fine for quite a while, as even the latest titles run just fine. I don't pull 144fps or anything on maxed settings, but the average FPS is above 60, which is good enough for me, as the monitor will outlive the system.

I am running a 750 watt EVGA SuperNOVA P2, and even with heavy CPU and GPU OCs the max draw didn't come close to maxing it out. I wouldn't worry at all as long as the quality of the PSU is good. I only use Super Flower units.


----------



## diggiddi

Quote:


> Originally Posted by *damarad21*
> 
> I'm afraid I only have a very good 850W PSU. My whole rig is in a nice Cooltek W2 case. Considering the 5960X + Sapphire Fury Nitro and plenty of other stuff, I think it is complete and well balanced for 2560x1600 gaming. I'm also getting old, and being a "new father" I have less and less time to enjoy it...


According to TPU, the Fury Strix pulls 226~230W by itself and the 290X pulls 294~300W by itself under max gaming conditions.
I was able to Crossfire 290X Lightnings on an Antec 750W HCG, so your 850W should be fine; just keep the GPU at stock, or you can even reduce the power limit.
Mild to moderate CPU overclocking won't cause issues, but to be on the safe side, if you can get a 1000W PSU, do it.
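Putting rough numbers on that advice, here is a back-of-the-envelope sketch. Only the ~230W GPU figure comes from the TPU number above; the CPU and "rest of system" draws are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope PSU budget for a single-Fury rig.
# Only the GPU figure comes from the TPU measurement quoted above;
# the CPU and "rest" numbers are illustrative assumptions.
def psu_headroom(psu_watts, draws):
    """Watts left over after summing the component draws."""
    return psu_watts - sum(draws.values())

budget = {
    "fury_nitro": 230,  # TPU max-gaming figure for the Fury
    "cpu_5960x": 140,   # assumed load draw for a mildly clocked 5960X
    "rest": 75,         # assumed board, RAM, drives, fans
}

headroom = psu_headroom(850, budget)
print(f"~{headroom} W of headroom on an 850 W unit")
```

Even with pessimistic component assumptions, that leaves roughly 400W spare, which is why a stock GPU on a quality 850W unit looks comfortable.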


----------



## neurotix

Two stock Furys might pull that much, but OC'ed?

I've seen some crazy high numbers from mine, however I do have other stuff plugged into the power strip... maybe I should change that.


----------



## neurotix

So, I did some tests for you guys, and I'm going to have to retract my statements: you should be fine with an 850W, or maybe even a 750W if you're careful about what you run (no Fire Strike).



I hooked up my Kill-a-watt to my tower only.

Here's my methodology:

I ran everything at 5760x1080 except for Fire Strike Ultra, which obviously renders at 4K internally and downsamples to 1080p. It actually centers on my center monitor so that's great.

I only used games that support Crossfire and Eyefinity well.

In the AMD drivers, I turned Vsync to ON, Frame Rate Target Control 60fps, Power Efficiency ON and Frame Pacing ON.

My CPU was at 4.5ghz, RAM at 2400 CAS11. I can go higher, but I was lazy and didn't want to reset and go to 4.8. I think this gives a more "average" idea of how much power the cards draw. The i7 draws 100w or so at idle with these settings.

For Valley I simply ran it at 5760x1080 with 4x AA. As per the rules of the Valley thread here for a multi monitor submission.

Fire Strike Ultra was run, well, the only way you can.

Witcher 3, everything was on Ultra, I simply stood in the first town for a while, and watched the readings on the Kill-a-watt.

Rise of the Tomb Raider, the first snowy area, again I stood for a while.

Sleeping Dogs... this is an older game. 2012. But it's really fun and it supports Crossfire and Eyefinity fantastically. Even centers the HUD on the center screen. Anyway, for this I simply went out of my apartment, went to the garage, got on a motorcycle and then started driving fast through the city. While also looking at my Kill-a-watt. If you like open world games or Kung fu movies, get this game. A single Fury could probably run it at 4k 60 no problem.

So, when watching the Kill-a-watt, I only looked for *PEAK* power draw. So the highest number I saw is what I recorded.

Anyway, here's the results:

*stock (1050MHz):*

Valley: 550
Fire Strike Ultra: 650
Witcher 3: 570
Rise of the Tomb Raider: 495
Sleeping Dogs: 324

*1100/550 +38mv*

Valley: 620
Fire Strike Ultra: 827
Witcher 3: 626
Rise of the Tomb Raider: 511
Sleeping Dogs: 343

*1125/550 +52mv*

Valley: 635
Fire Strike Ultra: 870
Witcher 3: 639
Rise of the Tomb Raider: 525
Sleeping Dogs: 350

There you go. Obviously, Fire Strike Ultra, and maybe any game at 4K or rendered at 4K and downsampled, is extremely brutal on the card and will produce not only a high power draw, but also lots of heat. Especially when overclocking.

I was surprised at the result for ROTTR. It has some nice graphics, and I'm running it at high resolution, I thought it would draw more power to run. I felt the same way about Witcher 3.

It seems that if you just want to game, you don't have to worry much with an 850w or lower PSU, even when overclocking. I would be careful with stuff like Fire Strike Ultra though if you have an insane overclock (higher than 1125mhz, high voltage).
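A quick sketch of the headroom math from these peak readings. These are wall-socket numbers, so the PSU's DC-side load is lower; the 90% efficiency used below is an assumption for a decent unit, not a measured figure:

```python
# Worst-case wall draw per clock step, from the readings above (watts).
peaks = {
    "stock 1050mhz":  [550, 650, 570, 495, 324],
    "1100/550 +38mv": [620, 827, 626, 511, 343],
    "1125/550 +52mv": [635, 870, 639, 525, 350],
}

def dc_load(wall_watts, efficiency=0.90):
    """Estimate the DC-side load the PSU actually supplies.
    The 90% efficiency figure is an assumption, not measured."""
    return wall_watts * efficiency

for step, readings in peaks.items():
    worst = max(readings)  # Fire Strike Ultra in every case here
    print(f"{step}: {worst} W at the wall ~= {dc_load(worst):.0f} W DC")
```

Even the 870W wall peak works out to roughly 780W on the DC side, which is why the 850W verdict holds outside of heavily overvolted synthetic runs.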

I didn't try at 1150mhz or anything because honestly, I haven't ever pushed my cards past it, since I thought I might blow something up. I suppose I have some overclocking to do...

My apologies for my assumptions, I suppose now the record's been set straight.

neuro


----------



## miklkit

Reading the last 2 pages scared me. I'm running an AMD FX at 5GHz plus the Fury Nitro at stock. I have seen just over 700 watts on my UPS while playing The Witcher 3, and I have an 850 watt Seasonic. The CPU runs at 30-40% load while the Fury is at 100%.

Anyway, while I'm here I have a question. I have DX7, 8, 9 and 11 games; no DX10 or 12. The system is globally capped at 150fps, and that is where the older games run.

But I noticed that in one DX9 game, the 290X the Fury replaced delivered better frame rates. So I checked in PassMark and found that the DX9 performance is pretty bad, the DX10 performance is very good, and the DX11 performance is again not good.

I'm not concerned about DX9 but do want more DX11 performance. Is there some way to tweak the bios for better DX11 performance? I should mention that I'm still using an 8 year old 1080p monitor.


----------



## damarad21

Ha ha. You are right. Hopefully I will be able to save some time for me in the future, at least 1 hour/week for gaming. Little baby is absorbing all my time and working the rest of it 
My case is not too big, and having the [email protected],5 and Fury Nitro, I think I will stay put for a while, although sometimes I'm thinking of selling everything as I'm not using it that much.


----------



## jdorje

4GB isn't enough for Tomb Raider (edit: Rise of the Tomb Raider), it seems.

If I run it on Very High (the highest preset), it averages ~64 fps in the benchmark. The minimum frame rate tumbles as the 4GB is exceeded. Worse, in actual gameplay, some areas perform very badly, since that minimum frame rate becomes the norm there.

If I drop textures only, from Very High down to High, then it stays around 3600 MB of VRAM. Average fps remains near identical, but the minimum frame rate stays strong. Disappointing, because the HBM makes high-texture gaming incredibly fast, but if you go over 4GB then it's bad news (at least in this situation).

Of course, I can't really tell the difference between High and Very High textures.

Dropping the whole thing down to High brings average fps (again in the benchmark) up to ~85, which isn't bad.

This is all at [email protected] with 40-145 FreeSync. This game doesn't really need high frame rates, but stuttering is obviously to be avoided. I haven't gamed at sub-40 fps in a while, but FreeSync seems to handle it fine these days.


----------



## ManofGod1000

Quote:


> Originally Posted by *jdorje*
> 
> 4gb isn't enough for tomb raider it seems.
> 
> If I run it on very high (the highest preset), it averages ~64 fps in the benchmark. Minimum frame rate tumbles as the 4gb is exceeded. Worse, in actual gaming, some areas become very bad performance as that minimum frame rate will be the norm.
> 
> If I drop textures only from very high down to high then it stays around 3600 mB of vram. Average fps remains near identical but the min frame rate remains strong. Disappointing because the HBM makes the high-texture gaming incredibly fast, but if you go over 4GB then it's bad news (at least in this situation).
> 
> Of course, I can't really tell a difference in high versus very high textures.
> 
> Dropping the whole thing down to high brings average fps (again in the benchmark) up to ~85. Which isn't bad.
> 
> This is all at [email protected] with 40-145 freesync. This game doesn't really need high frame rates, but stuttering is to be avoided obviously. I haven't gamed at sub-40 fps in a while but freesync seems to do fine with it these days.


I am going to assume you mean Rise of the Tomb Raider and not Tomb Raider 2013, right?


----------



## ManofGod1000

Quote:


> Originally Posted by *neurotix*
> 
> So, I did some tests for you guys, I'm going to have to say I retract my statements, you should be fine with a 850w, or maybe even a 750w if you're careful about what you run (no Fire Strike).
> 
> 
> 
> I hooked up my Kill-a-watt to my tower only.
> 
> Here's my methodology:
> 
> I ran everything at 5760x1080 except for Fire Strike Ultra, which obviously renders at 4K internally and downsamples to 1080p. It actually centers on my center monitor so that's great.
> 
> I only used games that support Crossfire and Eyefinity well.
> 
> In the AMD drivers, I turned Vsync to ON, Frame Rate Target Control 60fps, Power Efficiency ON and Frame Pacing ON.
> 
> My CPU was at 4.5ghz, RAM at 2400 CAS11. I can go higher, but I was lazy and didn't want to reset and go to 4.8. I think this gives a more "average" idea of how much power the cards draw. The i7 draws 100w or so at idle with these settings.
> 
> For Valley I simply ran it at 5760x1080 with 4x AA. As per the rules of the Valley thread here for a multi monitor submission.
> 
> Fire Strike Ultra was run, well, the only way you can.
> 
> Witcher 3, everything was on Ultra, I simply stood in the first town for a while, and watched the readings on the Kill-a-watt.
> 
> Rise of the Tomb Raider, the first snowy area, again I stood for a while.
> 
> Sleeping Dogs... this is an older game. 2012. But it's really fun and it supports Crossfire and Eyefinity fantastically. Even centers the HUD on the center screen. Anyway, for this I simply went out of my apartment, went to the garage, got on a motorcycle and then started driving fast through the city. While also looking at my Kill-a-watt. If you like open world games or Kung fu movies, get this game. A single Fury could probably run it at 4k 60 no problem.
> 
> So, when watching the Kill-a-watt, I only looked for *PEAK* power draw. So the highest number I saw is what I recorded.
> 
> Anyway, here's the results:
> 
> *stock: (1050mhz)*
> 
> Valley: 550 watt
> Fire Strike Ultra: 650
> Witcher 3: 570
> Rise of the Tomb Raider: 495
> Sleeping Dogs: 324
> 
> *1100/550 +38mv*
> 
> Valley: 620
> Fire Strike Ultra: 827
> Witcher 3: 626
> Rise of the Tomb Raider: 511
> Sleeping Dogs: 343
> 
> *1125/550 +52mv*
> 
> Valley: 635
> Fire Strike Ultra: 870
> Witcher 3: 639
> Rise of the Tomb Raider: 525
> Sleeping Dogs: 350
> 
> There you go. Obviously, Fire Strike Ultra, and maybe any game at 4K or rendered at 4K and downsampled, is extremely brutal on the card and will produce not only a high power draw, but also lots of heat. Especially when overclocking.
> 
> I was surprised at the result for ROTTR. It has some nice graphics, and I'm running it at high resolution, I thought it would draw more power to run. I felt the same way about Witcher 3.
> 
> It seems that if you just want to game, you don't have to worry much with an 850w or lower PSU, even when overclocking. I would be careful with stuff like Fire Strike Ultra though if you have an insane overclock (higher than 1125mhz, high voltage).
> 
> I didn't try at 1150mhz or anything because honestly, I haven't ever pushed my cards past it, since I thought I might blow something up. I suppose I have some overclocking to do...
> 
> My apologies for my assumptions, I suppose now the record's been set straight.
> 
> neuro


Interesting. I just ran the Fire Strike Ultra test, and with an FX 8350 at 4.5GHz, 1x Sapphire Fury Nitro+ and 1x Sapphire Fury Tri-X, the highest load value I saw was briefly 750 watts, and that was just on the first test. Everything else was just over 600 watts, except of course for the physics test, which was at 332 watts. (This is the front panel reading of my APC 1500VA UPS, with only the computer connected through the battery side, not even the monitor.)


----------



## bluezone

While I would leave this to a specialty shop, here is a video showing the process of replacing/repairing a surface-mount GPU/CPU BGA package on a PS4. It would apply to just about any device. Interesting process.






Cheers

EDIT: Warning long video.


----------



## Alastair

What is a good Time Spy score for 3840-shader Furys? I got 8511 with a 4.95GHz CPU and 1120/550 on the cards. This would likely be the highest-placing AMD-based system in the HOF, it seems. If only beta drivers were approved.
http://www.3dmark.com/3dm/16356186?


----------



## josephimports

Quote:


> Originally Posted by *Alastair*
> 
> What is a good timespy score for 3840 shader Fury's? I got this 8511 with 4.95GHz CPU and 1120/550 on the cards. This would likely be the highest placing AMD based system in the HOF it seems. If only Beta drivers were approved.
> http://www.3dmark.com/3dm/16356186?


Strong scores







...for comparison


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *Alastair*
> 
> What is a good timespy score for 3840 shader Fury's? I got this 8511 with 4.95GHz CPU and 1120/550 on the cards. This would likely be the highest placing AMD based system in the HOF it seems. If only Beta drivers were approved.
> http://www.3dmark.com/3dm/16356186?


Man, I couldn't even overclock my FX-9590 that high stably with my cooling (H220-X). That shows how much better custom liquid will always be. I'll try Time Spy later with my Fury X, and hopefully you can interpret the results imagining a lower CU count.


----------



## catbebi

Hi all, my Sapphire R9 Fury Nitro just arrived, but I have some concerns I hope you tech sages can help me with:

Upon installing the card, I noticed *GPU temps* gradually rose to *70 to 80 degrees C just from web browsing!* The problem I have is that the GPU is at 100% utilization at all times while the fan speed is at around 20%.

I had to manually enable a custom fan curve in the Sapphire TriXX software to get my temps under control, as the AUTO setting wasn't cutting it.

In AMD Radeon Settings, Power Efficiency is set to ON under the global graphics settings, as if that does anything.

*QUESTION: Is there any way to lower non-gaming GPU utilization so I'm not wasting electricity when idle or web browsing?*

Any input would be appreciated!
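For context on the fan-curve workaround: a custom curve like TriXX's is just a piecewise-linear temperature-to-duty mapping. Here is a sketch; the breakpoints are made-up illustrative values, not Sapphire's defaults:

```python
# Piecewise-linear fan curve of the kind TriXX lets you define.
# Breakpoints are illustrative assumptions: (temp in C, fan duty in %).
CURVE = [(40, 20), (55, 35), (65, 55), (75, 80), (80, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Interpolate the fan duty for a temperature, clamped at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70))  # interpolated between the 65C and 75C points
```

Of course, this only treats the fan symptom; the 100% utilization while idle still looks like a driver or background-process issue rather than a cooling problem.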


----------



## MrKoala

You'll need to find out which application is loading the GPU. You can't really blame the card if it's being given (useless) work to do.

Or maybe this is a driver error of some sort.


----------



## Cyants

Maybe something else is running in the background and pushing the GPU to 100% usage? If you use Firefox, have you tried disabling hardware acceleration in about:preferences#advanced?

The fan speed stuck at 20% is odd; maybe a GPU manager like Afterburner had incorrect settings and was stuck at 20% manual?


----------



## Thoth420

My Fiji doesn't go to max load browsing anything ever.
Cores might hop if hardware accel is on in the browser, but it doesn't peg full usage even when playing video.

My guess is a driver issue or OC profile, or a combination of the two (DDU them or use the guide on the forums here; fielder's choice). Also, you should wipe any OC software and profiles prior to this.


----------



## DedEmbryonicCe1

Man I hope they eventually find and squash this forever. I'm pretty sure it's some weird interaction with HW Acceleration in Firefox. As you can see throughout the video the GPU is never loaded beyond tiny spikes here and there but the fan continually ramps up until reaching 99% despite no need to. Resetting it back to the default behavior only stops it momentarily. Sometimes it actually stops ramping partway and even goes back down by 1 % only to start increasing after a few more seconds. Also, the time it takes to resume this behavior after resetting the speed with TRIXX is random beyond my ability to discern.

This hasn't happened to me in months so it has to be some weird combination they haven't noticed/replicated yet. Now to fill out a rather lengthy bug report linking every last tab I have open in two Firefox windows.


----------



## catbebi

Quote:


> Originally Posted by *Cyants*
> 
> Maybe something else is running in the background and pushing the GPU to 100% usage? If you use Firefox, have you tried disabling hardware acceleration in about:preferences#advanced?


I ran Process Explorer to check for background processes and nothing really shows much GPU utilization.
Quote:


> Originally Posted by *Thoth420*
> 
> My guess is a driver issue or OC profile, or a combination of the two (DDU them or use the guide on the forums here; fielder's choice). Also, you should wipe any OC software and profiles prior to this.


I also followed the guide below for Win7 Crimson, and it's still @ 100% utilization after installing drivers again (16.11.5):

http://www.overclock.net/t/988215/how-to-remove-your-amd-gpu-drivers-new-2016

Other than trying different drivers, I'm scratching my head here.


----------



## neurotix

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> 
> 
> 
> 
> Man I hope they eventually find and squash this forever. I'm pretty sure it's some weird interaction with HW Acceleration in Firefox. As you can see throughout the video the GPU is never loaded beyond tiny spikes here and there but the fan continually ramps up until reaching 99% despite no need to. Resetting it back to the default behavior only stops it momentarily. Sometimes it actually stops ramping partway and even goes back down by 1 % only to start increasing after a few more seconds. Also, the time it takes to resume this behavior after resetting the speed with TRIXX is random beyond my ability to discern.
> 
> This hasn't happened to me in months so it has to be some weird combination they haven't noticed/replicated yet. Now to fill out a rather lengthy bug report linking every last tab I have open in two Firefox windows.


The latest version of Trixx wouldn't let me control my fan speeds at all with a Fury. It's only meant for Polaris or something I think and doesn't play well with our cards.

Get Trixx 5.2.1. Try that and see if it fixes your problem. Try and manually set the fan speed (use Fixed and set it low).


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *neurotix*
> 
> The latest version of Trixx wouldn't let me control my fan speeds at all with a Fury. It's only meant for Polaris or something I think and doesn't play well with our cards.
> 
> Get Trixx 5.2.1. Try that and see if it fixes your problem. Try and manually set the fan speed (use Fixed and set it low).


It is the latest version. They brand it as 3.0 at the top and it says v6.3.0 at the bottom. Why they do that, I have no idea. Trixx was not running when this problem started; I only opened it afterwards so that I could use the reset button to return the fan speed to defaults.


----------



## neurotix

Sorry, I meant try the older, 5.2.1 Trixx instead.

I tried the new one and it messed the fan speeds up on my Fury's.


----------



## kondziowy

Quote:


> Originally Posted by *u3a6*
> 
> W1zzard's sample had some kind of liquid metal thermal pad:
> 
> https://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/5.html
> 
> I'm wondering what mine could have... will check it out over the weekend. About the shaman vs stock cooler, what do you think? I have seen some reviews where the shaman beats the arctic cooling accelero xtreme plus III. I know that I would have to redesign the retention system, that would be no problem. The question is: would it be worth the hours?


So my Fury Strix also had this black thermal pad. I replaced it with MX-2 thermal paste and the core was not making full contact with the heatpipe (only about 10% of the core was touching it!). So I applied 4x more paste than I normally do, and then it was making full contact, HBM too.

With MX-2, temperatures are over 10°C worse than with the standard thermal pad. The standard pad had excellent temperatures, maxing out at 72°C; now I get 80°C after 3 minutes of full load (don't want to test more).

This pad was at least 0.5mm thick, but less than 1mm for sure, and smelled like a brand new car mixed with some kind of metallic smell plus cinnamon. What was that thing (Asus support has no idea) and what should I replace it with?

Coollaboratory Liquid MetalPad is too thin (0.2mm). Should I use Fujipoly SARCON GR80A or SARCON XR-m? Is there anything else/better?


----------



## u3a6

Quote:


> Originally Posted by *kondziowy*
> 
> So my Fury Strix also had this black thermal pad. I replaced it with MX-2 thermal paste and the core was not making full contact with the heatpipe (only about 10% of the core was touching it!). So I applied 4x more paste than I normally do, and then it was making full contact, HBM too.
> 
> With MX-2, temperatures are over 10°C worse than with the standard thermal pad. The standard pad had excellent temperatures, maxing out at 72°C; now I get 80°C after 3 minutes of full load (don't want to test more).
> 
> This pad was at least 0.5mm thick, but less than 1mm for sure, and smelled like a brand new car mixed with some kind of metallic smell plus cinnamon. What was that thing (Asus support has no idea) and what should I replace it with?
> 
> Coollaboratory Liquid MetalPad is too thin (0.2mm). Should I use Fujipoly SARCON GR80A or SARCON XR-m? Is there anything else/better?


Well, I have also disassembled my Fury Strix and honestly I am disappointed with Asus... The black pad on my card was not making full contact with the GPU and HBM, and two of the VRM caps (high side) fell straight off the card... With that said, I soldered them back into place and put a custom cooling solution on it.

Fujipoly/Sarcon XR-m are as good as it gets in terms of thermal pads afaik! You might get it to work with one of those, but I think the best option for you would be a Raijintek Morpheus II Core Edition, since it is listed as compatible with the Fiji core (you should check the clearance too). If you want to go crazy, I think Alphacool makes some blocks for the Fury Strix. That would be my advice...

I made a bracket to slap the Thermalright Shaman on the card, and I've also made a custom VRM cooler out of a RAM cooler with a heatpipe after a couple of hours on my milling machine. Here are some pics:

stock [email protected](stock power [email protected]% power limit, 1050MHz and 64CU's)










Custom 350W 325A bios (beyond 150% stock) + full shader unlock + higher vcore. Look at those temps; the cooler is also quieter than the stock one. I will get a 2500rpm fan soon!!!


----------



## kondziowy

Well, those temps are sick indeed







(those caps though, like ***?)
But I have to try this Sarcon XR-m first. I wonder if it can be as good as or better than the stock one.
If there is nothing better, I will just order it and maybe in a week or two I will post results.

As for unlocking: this is not a perfect candidate, but yours wasn't either, right? Would you try to unlock this card:

SE1 hw/sw: 00030000 / 00000000 [..............xx]
SE2 hw/sw: 00030000 / 00000000 [..............xx]
SE3 hw/sw: 04800000 / 00000000 [.....x..x.......]
SE4 hw/sw: 02010000 / 00000000 [......x........x]
56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
8 CU's are disabled by HW lock, override is possible at your own risk.

It doesn't have dual bios. Isn't it too risky?
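For anyone wondering how the tool gets "56 of 64" from those hex masks: each set bit in a Shader Engine's hardware lock mask is one disabled CU. A quick Python sketch (the 4 SE x 16 CU layout is Fiji's full-die configuration):

```python
# Decode the per-shader-engine hardware lock masks quoted above.
# Every set bit in a mask marks one Compute Unit disabled by a HW lock;
# a full Fiji die has 4 Shader Engines x 16 CUs = 64 CUs.
se_hw_masks = {
    "SE1": 0x00030000,
    "SE2": 0x00030000,
    "SE3": 0x04800000,
    "SE4": 0x02010000,
}

locked_per_se = {se: bin(mask).count("1") for se, mask in se_hw_masks.items()}
total_locked = sum(locked_per_se.values())

print(locked_per_se)                            # 2 locked CUs per shader engine
print(f"{64 - total_locked} of 64 CUs active")  # matches the tool: 56 of 64
```

Whether those 8 hardware-locked CUs are actually functional is exactly the gamble the unlock override takes.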


----------



## u3a6

Quote:


> Originally Posted by *kondziowy*
> 
> Well, those temps are sick indeed
> 
> 
> 
> 
> 
> 
> 
> (those caps though, like ***?)
> But I have to try this Sarcon XR-m first. I wonder if it can be as good as or better than the stock one.
> If there is nothing better, I will just order it and maybe in a week or two I will post results.
> 
> As for unlocking. This is not a perfect candidate but yours wasn't also right? Would you try to unlock this card:
> 
> SE1 hw/sw: 00030000 / 00000000 [..............xx]
> SE2 hw/sw: 00030000 / 00000000 [..............xx]
> SE3 hw/sw: 04800000 / 00000000 [.....x..x.......]
> SE4 hw/sw: 02010000 / 00000000 [......x........x]
> 56 of 64 CUs are active. HW locks: 8 (R/W) / SW locks: 0 (R/W).
> 8 CU's are disabled by HW lock, override is possible at your own risk.
> 
> It doesn't have dual bios. Isn't it too risky?


Well, that card does not look likely to unlock indeed... It is possible, but I personally would not push my luck on that one (no BIOS switch :/). Mine had a perfect lower row and the upper had the 3rd CU out of the row. Tried 3840 and it worked, then pushed my luck to 4096 and it is working fine. Overclocking is not super duper impressive, but I can easily get it to do 1100/545.5 stable through Firestrike with a voltage bump. With this said, if I keep the temps in the 20's I can pass Geekbench 4 Compute at 1210/600.


----------



## kondziowy

This guy is showing Fujipoly SARCON XR-m thermal pad -> 



and he said it's good for chips up to 80W of power. So that is probably going to be a spectacular failure with 300W Fury.
Asus used some kind of magic pad that doesn't exist. And I dumped it in the garbage bin


----------



## u3a6

Quote:


> Originally Posted by *kondziowy*
> 
> This guy is showing Fujipoly SARCON XR-m thermal pad ->
> 
> 
> 
> and he said it's good for chips up to 80W of power. So that is probably going to be a spectacular failure with 300W Fury.
> Asus used some kind of magic pad that doesn't exist. And I dumped it in the garbage bin


I think at this point you would be better off getting an aftermarket cooler :/ (the Morpheus II seems to be quite good and it is listed as compatible with the Fiji core, but I have never tried such a cooler). 


----------



## Alastair

Quote:


> Originally Posted by *kondziowy*
> 
> This guy is showing Fujipoly SARCON XR-m thermal pad ->
> 
> 
> 
> and he said it's good for chips up to 80W of power. So that is probably going to be a spectacular failure with 300W Fury.
> Asus used some kind of magic pad that doesn't exist. And I dumped it in the garbage bin


You could always try a metal pad like the Coollaboratory MetalPad?


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *kondziowy*
> 
> This guy is showing Fujipoly SARCON XR-m thermal pad ->
> 
> 
> 
> and he said it's good for chips up to 80W of power. So that is probably going to be a spectacular failure with 300W Fury.
> Asus used some kind of magic pad that doesn't exist. And I dumped it in the garbage bin


Yeah I wouldn't use that for the die and HBM package but it would be great (in the 1.5-2mm thick variant) for the VRMs. It's also probably expensive as all hell for a full sheet that you can cut to size.


----------



## kondziowy

Quote:


> Originally Posted by *Alastair*
> 
> You could always try a metal pad like the Coollaboratory MetalPad?


I don't know why, but I don't like this idea. Does the liquid metal pad damage the heatsink a little when melting?
My goal is to actually leave it as stock as possible. The card still has 2.5 years of warranty









Raijintek Morpheus 2 is also not a bad idea









But

I think I finally found it, or something similar: *phase change material Hi-Flow 225*

225U Photo
225UT Photo

It looks the same. It can be used for CPUs and GPUs. When changing phase it resists dripping (even vertically). It has to be heated up to allow heatsink disassembly, exactly like mine was. If it smells like cinnamon, I'm golden







I just need to figure out which one it is of the four Hi-Flow 225 versions.

It is single use only. Reviewers who disassembled the card (Asus R9 Fury) probably did not get the best thermal results during testing.


----------



## u3a6

Quote:


> Originally Posted by *kondziowy*
> 
> I don't know why I don't like this idea. Does Liquid metal pad damage heatsink a little bit when melting?
> My goal is to actually leave it as stock as possible. Card still has 2.5 years of warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Raijintek Morpheus 2 is also not bad idea
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But
> 
> I think I finally found it - or something simillar. *Phase change material Hi-Flow 225*
> 
> 225U Photo
> 225UT Photo
> 
> It looks the same. Can be used for cpus and gpus. When changing phase - it resists dripping (even vertically). Has to be heated up to allow heatsink disassembly - exactly like mine was. If it smells like cinnamon - im golden
> 
> 
> 
> 
> 
> 
> 
> I just need to figure out which one is it from four hi-flow 225 versions.
> 
> It is single use only. Reviewers who disassembled the card (Asus R9 Fury) probably did not get the best thermal results during testing.


Hahah, that seems to be it! How did you find it?


----------



## kondziowy

Today the first thing I googled was "best metal thermal pad", heatsink-guide.com came up and from there it was easy







Turns out google knows everything.

It's probably Hi-Flow 225U; here are all the specs in the PDF. Let's hope it doesn't ship in 12x12 inch sheets







That's 88 thermal pads at one time.

Edit: naah, it's way too thin. My card needs something at least 0.5mm thick, and I can't find it anywhere. Maybe Asus has it custom made.


----------



## bluezone

OK, this isn't Fury related. I was checking whether I could spot the PS4 Pro resolution downgrade after update 1.08 in The Last of Us (roughly 1440p down to 1080p). Yes I can.

This screen shot made me LOL.



Anyone think that Joel has become too attached to Ellie? Yes, I know, bad collision detection.


----------



## xkm1948

Looks like Wattman is coming to Fury in the next major Crimson update.


----------



## NightAntilli

I haven't looked at the whole presentation yet, but, it's quite unclear to me whether everything (RadeonChill, Relive, etc) will be supported on the Fury cards as well, or just Polaris. I hope it's for the Fury line as well... Otherwise I might as well have bought an RX 480 instead of the Fury Nitro.


----------



## bluezone

Quote:


> Originally Posted by *NightAntilli*
> 
> I haven't looked at the whole presentation yet, but, it's quite unclear to me whether everything (RadeonChill, Relive, etc) will be supported on the Fury cards as well, or just Polaris. I hope it's for the Fury line as well... Otherwise I might as well have bought an RX 480 instead of the Fury Nitro.


As a Nano owner, the RadeonChill slides look very interesting. The short version is less heat.

EDIT: I wonder if this is a refinement of frequency FastSwitching

What do you think Gupsterg?


----------



## NightAntilli

Ok so I've been going through the PDF file...

The following is *confirmed to be supported on the Fury line*:

Automatic bad HDMI signal detection and fallback
VP9 decode acceleration
Dolby Vision and HDR10 support
Borderless FreeSync
Radeon Chill
Wattman
XConnect
Radeon ReLive (AMD's ShadowPlay basically)

*NOT included for the Fury line (Polaris exclusive);*
DisplayPort HBR3 support (Single Cable 4K 120Hz, 5K 60Hz, 8K 30Hz)

Only one feature will be missing for the Fury cards. Very good... And... Considering that Radeon Chill can actually improve frametimes, and reduce input lag, this is huge.


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *NightAntilli*
> 
> *NOT included for the Fury line (Polaris exclusive);*
> DisplayPort HBR3 support (Single Cable 4K 120Hz, 5K 60Hz, 8K 30Hz)


I don't see that as much of a downside considering none of these cards are fast enough to game at these resolutions + refresh rates and it will be years before there is video content widely available either.


----------



## Alastair

Quote:


> Originally Posted by *NightAntilli*
> 
> Ok so I've been going through the PDF file...
> 
> The following is *confirmed to be supported on the Fury line*:
> 
> Automatic bad HDMI signal detection and fallback
> VP9 decode acceleration
> Dolby Vision and HDR10 support
> Borderless FreeSync
> Radeon Chill
> Wattman
> XConnect
> Radeon ReLive (AMD's ShadowPlay basically)
> 
> *NOT included for the Fury line (Polaris exclusive);*
> DisplayPort HBR3 support (Single Cable 4K 120Hz, 5K 60Hz, 8K 30Hz)
> 
> Only one feature will be missing for the Fury cards. Very good... And... Considering that Radeon Chill can actually improve frametimes, and reduce input lag, this is huge.


Is it exclusively implemented by the driver? Do certain games need to support this feature? Like CSGO for example, which is a much crummier older engine, runs at like 300FPS for me usually; BF4 also being a slightly older title I play online, R6 Siege etc., you get my drift. Do the games need to support "Chill" or will the driver do all the work?

Another thing: it seems my choice of curved FreeSync monitors in South Africa is rather limited to almost non-existent. But I am sure for a similar amount of money I can build a 5760x1080 triple-monitor Eyefinity setup. Does Eyefinity still work well? Eyefinity + FreeSync at the same time? Any good 1080P FreeSync monitors anyone wants to recommend?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> As a Nano owner, the RadeonChill slides look very interesting. The short version is less heat.
> 
> EDIT: I wonder if this is a refinement of frequency FastSwitching
> 
> What do you think Gupsterg?


No idea.
Quote:


> This is a new power saving feature that dynamically regulates frame-rate based on in-game movement.


Quote:


> In one example taken from World of Warcraft, the FPS average is measurably higher with Radeon Chill turned off, at 125 FPS, versus 62 FPS with Radeon Chill enabled.


Radeon Chill is an interesting feature. To me it comes across as FRTC, but monitoring input to know when to go low.

WattMan is the one I'm looking forward to more. To me:

a) its biggest advantage over the OC SW we currently use is modification of VID per DPM, rather than a VDDC offset which raises VID/VDDC for all states. Plus it will allow GPU clock editing per DPM.
b) it's the first OC SW that allows "Advanced"/"Fuzzy Logic" fan mode modification; current OC SW basically only allows "Lookup table" fan mode adjustment.

The other feature I'm looking forward to is the ReLive screen capture tool. I tried Plays.Tv once recently and IIRC it supports 1080P, which was no use to me once I'd gone 1440P on my monitor.
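For what it's worth, the "Lookup table" fan mode described above is just linear interpolation between (temperature, fan duty) points. A minimal sketch, with made-up curve points (not Fiji's stock values):

```python
# A "lookup table" fan curve: linear interpolation between
# (temperature °C, fan duty %) breakpoints. The points below are
# invented for illustration only.
CURVE = [(30, 20), (50, 30), (70, 55), (85, 100)]

def fan_duty(temp_c: float) -> float:
    """Return fan duty % for a GPU temperature, clamped to the curve ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(60))  # -> 42.5, halfway between the 50°C and 70°C points
```

A "fuzzy logic" mode would instead derive the duty continuously from the temperature and its trend rather than from fixed breakpoints.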


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *Alastair*
> 
> Is it exclusively implemented by the driver? Do certain games need to support this feature? Like CSGO for example, which is a much crummier older engine, runs at like 300FPS for me usually; BF4 also being a slightly older title I play online, R6 Siege etc., you get my drift. Do the games need to support "Chill" or will the driver do all the work?


http://cdn.videocardz.com/1/2016/12/AMD-Crimson-ReLive-VideoCardz-56.jpg
There is a whitelist of games AMD has tested and allows Chill with. More will be added over time.


----------



## Alastair

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Is it exclusively implemented by the driver? Do certain games need to support this feature? Like CSGO for example, which is a much crummier older engine, runs at like 300FPS for me usually; BF4 also being a slightly older title I play online, R6 Siege etc., you get my drift. Do the games need to support "Chill" or will the driver do all the work?
> 
> 
> 
> http://cdn.videocardz.com/1/2016/12/AMD-Crimson-ReLive-VideoCardz-56.jpg
> There is a whitelist of games AMD has tested and allows Chill with. More will be added over time.
Click to expand...

So essentially, then, we need a profile for it? Like a crossfire profile? Somehow I find myself thinking, "this may not work out too well for us."


----------



## MrKoala

Quote:


> Originally Posted by *Alastair*
> 
> So essentially, then, we need a profile for it? Like a crossfire profile? Somehow I find myself thinking, "this may not work out too well for us."


This one kind of makes sense, because it relies on reading the motion in game, so some type of telemetry is required. Such information is obviously not as well defined as graphical frames and may need tweaking for each game engine.
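As a toy illustration of that idea: keep the frame cap high while input is active and decay it when input goes quiet. All numbers here are invented, nothing like AMD's actual heuristic:

```python
# Sketch of an input-driven frame-rate cap in the spirit of Radeon Chill:
# full cap while the player is actively moving, easing down to a low cap
# after a couple of seconds of inactivity. Thresholds are made up.
CHILL_MIN_FPS = 40
CHILL_MAX_FPS = 144

def target_fps(seconds_since_input: float) -> float:
    """Decay the FPS cap toward CHILL_MIN_FPS as input goes quiet."""
    if seconds_since_input <= 0.1:   # mouse/keys actively moving
        return CHILL_MAX_FPS
    # ease down linearly over ~2 s of inactivity, then hold the floor
    t = min(seconds_since_input / 2.0, 1.0)
    return CHILL_MAX_FPS - (CHILL_MAX_FPS - CHILL_MIN_FPS) * t

print(target_fps(0.0))   # -> 144
print(target_fps(5.0))   # -> 40.0
```

The per-game profiles would then tell the driver where to read "motion" from, since that signal differs per engine.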


----------



## Alastair

So now AMD needs to worry about crossfire profiles and radeon chill profiles. Why do I feel crossfire users will get the short end of this stick?


----------



## MrKoala

Well, I don't see what the user has to lose. The worst case is just not supporting a game and doing full performance rendering as usual.

If you're worried about losing development time, those are dramatically different subsystems so they are probably not programmed by the same guy anyway.


----------



## Medusa666

I hope this will be compatible with my Radeon Pro Duo, should be.


----------



## Alastair

Quote:


> Originally Posted by *MrKoala*
> 
> Well, I don't see what the user has to lose. The worst case is just not supporting a game and doing full performance rendering as usual.
> 
> If you're worried about losing development time, those are dramatically different subsystems so they are probably not programmed by the same guy anyway.


I do. If RTG needs to spend more time developing these Chill profiles for games, I can only see it hurting crossfire users, as they may end up spending more time on Chill profiles and leaving us crossfire users in the dark.


----------



## looncraz

Quote:


> Originally Posted by *Alastair*
> 
> I do. If RTG needs to spend more time developing these Chill profiles for games, I can only see it hurting crossfire users, as they may end up spending more time on Chill profiles and leaving us crossfire users in the dark.


Crossfire and Chill profiles should have a lot of overlap, you would work on them at the same time if anything, though Chill should also require less work. Getting the two to work in unison could be either problematic or wonderfully beautiful.


----------



## u3a6

The new Crimson Radeon ReLive drivers are live:

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64


----------



## NightAntilli

According to ComputerBase.de, the Fiji line has an advantage in ReLive compared to Polaris.

Fiji can record at 1440p/60fps and 4K/30fps, while Polaris is limited to 1440p/30fps max.


----------



## xkm1948

Can't test new driver until later tonight. Can some Fury owners test out the now available Wattman??


----------



## gupsterg

No issue using the driver with custom ROMs. I've seen on the AMD Reddit that some RX 480 users report the Crimson ReLive Edition driver not allowing custom ROMs (Hawaii owners and older are OK with a custom ROM + new driver).

VID per DPM and GPU clock per DPM modification works.

"Advance"/"Fuzzy Logic" fan profile modification is not shown, even though this feature is on Polaris in WattMan and Fiji uses same PowerPlay in ROM







. So still gotta do ROM mod for this







.

WattMan on the RX 480 has memory clock/states/voltage (the voltage is the IMC's, not the RAM's); this is not available on Fiji. I tried stock and custom ROMs; on previous drivers I could get an HBM OC via the slider in OverDrive if I modded the OverDrive HBM limit in the ROM. I was hoping they would have implemented HBM voltage control in WattMan for Fiji, as the IR3567B supports this via i2c/ROM. So a ROM mod is still the way for this.

3DM FS benches are consistently good for me, very similar to Crimson v16.3.2, which benched the best for me. I had found that everything after v16.3.2 and up to v16.11.1 always benched ever so slightly lower in 3DM FS; v16.11.4 and 5 are pretty much on par with v16.12.1.

MSI AB now shows no active sliders with Crimson ReLive; some monitoring data like GPU temp and fan stuff is also missing.

HWiNFO VRM temps are now missing due to the new driver (Mumak has reported it to AMD).

GPU-Z still works for all aspects with new driver as it did with older ones.

Yet to test ReLive, OCAT and Radeon Chill.


----------



## Performer81

No problems here with Afterburner, Fiji and the new drivers. Everything works.


----------



## damarad21

Does anyone know how to get Doom working over Steam Link with the Fury? The same thing happened with my old 290X; now with the Fury Nitro, same issue: Doom gets stuck at 99% loading when sending to the TV from Steam Link. Please, any help?


----------



## gupsterg

Quote:


> Originally Posted by *Performer81*
> 
> No problems here with Afterburner, Fiji and the new drivers. Everything works.


Are you using Afterburner 4.3.0? I'm on 4.2.0 last time I tried 4.3.0 it did not allow voltage offset manipulation even on stock ROM. I'm on Win 7 Pro x64, I have Win 10 Pro x64 but use that rarely.


----------



## CALiteral

Quote:


> Originally Posted by *damarad21*
> 
> Does anyone know how to get Doom working over Steam Link with the Fury? The same thing happened with my old 290X; now with the Fury Nitro, same issue: Doom gets stuck at 99% loading when sending to the TV from Steam Link. Please, any help?


I don't think the Steam Link works with DX12 or Vulkan. Frustrating, I know.


----------



## Performer81

Quote:


> Originally Posted by *gupsterg*
> 
> Are you using Afterburner 4.3.0? I'm on 4.2.0 last time I tried 4.3.0 it did not allow voltage offset manipulation even on stock ROM. I'm on Win 7 Pro x64, I have Win 10 Pro x64 but use that rarely.


Win 10 x64 and Afterburner 4.3.0.


----------



## gupsterg

Cheers, just did a fresh download of 4.3.0 and yes, it is working fully; I may have had an older beta build or something.


----------



## Semel

Quote:


> Originally Posted by *NightAntilli*
> 
> According to ComputerBase.de, the Fiji line has an advantage in ReLive compared to Polaris.
> 
> Fiji can record at 1440p/60fps and 4K/30fps, while Polaris is limited to 1440p/30fps max.


Don't get your hopes up.

It looks like Fiji really struggles recording games that are VRAM hungry (and probably heavy on the GPU in general). If your VRAM is hitting ~3700-3900MB+ usage, then recording even at 1920x1080 30fps can cause problems, like it did for me in Dishonored 2, RoTTR and Deus Ex. I was getting 1-3 second freezes and the recording itself was obviously all choppy.

4GB of VRAM is really holding back this pretty good card, even at 1080p.

If you have an Intel CPU with Quick Sync, your best bet would be using OBS + Quick Sync. It's a shame, really...


----------



## damarad21

Quote:


> Originally Posted by *CALiteral*
> 
> I don't think the Steam Link works with DX12 or Vulkan. Frustrating, I know.


Indeed, it doesn't work with OpenGL either. Considering changing the Fury for a GTX 1070 for that reason. Although I'm happy with the card for 280€ (silence, temps, and how well it's built and performs), I really wanted to play Doom on the TV with Steam Link.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> It looks like Fiji really struggles recording games that are VRAM hungry (and probably heavy on the GPU in general). If your VRAM is hitting ~3700-3900MB+ usage, then recording even at 1920x1080 30fps can cause problems, like it did for me in Dishonored 2, RoTTR and Deus Ex. I was getting 1-3 second freezes and the recording itself was obviously all choppy.


I just used ReLive with Lords of the Fallen, my in game settings below.



I did 7 mins of recording with no issues; it resulted in a 2.25GB file, so I won't upload it to YT. I will try a smaller run or tweak ReLive's settings to get a smaller file. I was also running MSI AB in the background whilst doing ReLive in LOTF (HML attached below; currently meddling on the stock ROM).

ReLiveLOTF1440Pmaxsetting.zip 21k .zip file


Texture Quality = Very High gives a warning about stability on cards with <6GB. I have seen this recently showing up to 6GB VRAM usage at 1080P. At 1440P I touch 3.7GB on TQ = High and 3.9GB on TQ = V.High. Gonna see what kind of VRAM usage I get on Fiji at 1080P; to me it seems VRAM usage is just allocation and not actual usage, as I found out a few months back when researching this aspect.
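As a rough sanity check on that file size: 2.25GB over 7 minutes works out to around 46 Mbit/s average, which seems plausible for a high-bitrate 1440p capture. A quick back-of-the-envelope in Python:

```python
# Average bitrate of the ReLive capture mentioned above:
# 2.25 GB (treated as GiB) over 7 minutes of recording.
size_bits = 2.25 * 1024**3 * 8   # GiB -> bits
duration_s = 7 * 60

mbps = size_bits / duration_s / 1e6
print(f"{mbps:.0f} Mbit/s")      # -> 46 Mbit/s
```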


----------



## Semel

Well, there is a difference between allocation and actual requirement, like in the latest games where your RAM gets filled with textures that couldn't be stored in VRAM. Deus Ex and RoTTR are especially notorious when it comes to this. I had RAM usage spiking to 12-13+GB when changing settings that heavily affect texture quality.

When you start recording, your GPU VRAM usage increases, and if it was already exceeded even before that, then you get what you get...


----------



## gupsterg

1080P with the same settings as the 1440P screenie above, except TQ = High results in 3.5GB and TQ = V.High in 3.9GB. I wish drivers would expose actual VRAM usage to GPU-Z, etc...


----------



## xkm1948

Any idea on how to use Wattman to overclock FuryX? I feel so stupid now.


----------



## ressonantia

Quote:


> Originally Posted by *xkm1948*
> 
> Any idea on how to use Wattman to overclock FuryX? I feel so stupid now.


Something like this maybe? You just change the clocks and corresponding voltages if not stable with defaults.


----------



## xkm1948

Is undervolting FuryX a thing? I am gonna try undervolting at stock frequency.


----------



## gupsterg

Yep, undervolting on Fury/X is out there.

Depending on the game, you sometimes don't see performance scale with the set OC, so in those cases I'd be inclined to run the card stock/undervolted. You'll see lower power/temps, etc. Nano owners on stock cooling gain better average clocks when undervolting.

This is where SW OC is more flexible than ROM: you're not doing ROM swaps/driver resets.

I have found a few times that applying a SW OC has made the driver/WattMan crash, even when I'm applying known good OC settings which I've done via ROM.


----------



## Maximization

Stinks for me: the new driver froze my system into a reboot loop. Make sure you do not have Afterburner or other overclocking software loading automatically at boot; the old overclocking tools are all nullified. Fury X crossfire. It actually froze my USB ports so I could not do a restore or safe-mode boot... thank god I had a clone image lying around from last week


----------



## Semel

*gupsterg*

When I tried to record Dishonored 2 using Relive @1920x1200 60\30 fps my game kept freezing for 1-3 seconds often.

Using OBS and the updated AMD plugin, I could record it just fine. The performance hit was a bit higher though, about 5 fps (OBS has a performance hit even when not recording anything). Yeah, there were some lagged frames according to the OBS stats, but at least the game didn't freeze here and there and the recording didn't stutter noticeably.

I wonder why is that..


----------



## ManofGod1000

Time to do some undervolting on my Crossfire Sapphire R9 Furies. (I tried it with Afterburner, but I have never really liked that software very much.) With the Sapphire Trixx software you have to disable ULPS, so I hope that Wattman does not require that; we will see.


----------



## gupsterg

Quote:


> Originally Posted by *Semel*
> 
> *gupsterg*
> 
> When I tried to record Dishonored 2 using Relive @1920x1200 60\30 fps my game kept freezing for 1-3 seconds often.
> 
> Using OBS and the updated AMD plugin I could record it just fine.Performance hit was a bit higher though about 5 fps(OBS has a performance hit even when not recording anything) . Yeah there were some lagged frames according to OBS stats but at least the game didn't freeze here and there and the recording didn't stutter noticeably.
> 
> I wonder why is that..


No idea mate. I will try other games besides LOTF with ReLive. I've never used OBS but plan to try it. I would say report the bugs, so hopefully AMD fix them.


----------



## Thoth420

Quote:


> Originally Posted by *Maximization*
> 
> stinks for me new driver, froze system into reboot loop, make sure you do not have afterburner or other overclocking software load automatically with boot up. old overclocking tools are all nullified. fury x crossfire. it actually froze my usb ports so i could not do a restore or safe mode boot.... thank god i had a clone image laying around from last week


Yep, I have always run into snags on cards from both camps, but it tends to be worse on AMD ones if I install a different driver with any GPU OC software set to run at boot. I prefer to use that setting, so I just make sure to remove the software beforehand (profiles as well; since I only use one, maybe you can leave those) and then check for a new version, as I leave that setting off in AB. Install AB or whatever after the driver, of course, and re-apply my settings. Kind of a pain, but I haven't had a driver issue since I started using that method.


----------



## bluezone

Ok, I just had a weird lock-up of my PC and had to restart. During shutdown I got a warning dialog informing me that some portion of ReLive couldn't save to memory. Very strange, because I do not have the capture portion of the driver installed.


----------



## xkm1948

Turned on Fury X power efficiency and lowered the state 7 voltage by 13mV. Benchmarks seem to be unaffected. The system still passes the Futuremark stress test at 99.7%. I will try to undervolt more tonight.


----------



## aDyerSituation

I recently got a Fury Nitro card and I have been having issues. My screen flickers from time to time on 16.11. I got a new DisplayPort cable today and am going to try that and the new driver. But I have been reading online that there are a lot of artifact and flickering issues with these cards that AMD has ignored. Is there another solution or something else I can try? The flickering is pretty distracting, and sometimes I have to unplug the cable and plug it back in. My monitor is the AOC G2460PF.


----------



## Deadroger

Installed my new R9 Fury Nitro today after finally retiring the HD 7970. She was heavily abused for 4 years and 2 months; what a great card. I'm not expecting this Fury to still be with me for that long, but I was going to go for a 1070 until I saw this Fury for £130 cheaper a couple of days ago.

Needed to replace my 4.5-year-old 1080p 120Hz monitor too, as it's pointless going for the 1070 to start with unless moving to a 1440p display. Trouble is, G-Sync is expensive it seems: over £100 more than a FreeSync unit with similar specs.

I know the GTX 1070 is a better card, but going for the Fury and a nice 27" 1440p 144Hz FreeSync monitor (AOC AGON AG271QX) saved me over £230 versus going the Nvidia route. Worth the trade-off IMO.

Very happy with the card and display so far. First time playing Doom, with Vulkan, @ 1440p with FreeSync on: so smooth with everything maxed, it's unreal. I don't know what magic this FreeSync thing is but I really like it lol!

I'm late to the party but I'm glad I have a Fury; even the name sounds good.


----------



## Kana-Maru

Quote:


> I know the GTX1070 is a better card


That is not necessarily true. FPS charts don't mean "everything". As you stated, you ended up saving and getting more bang for your buck at a cheaper price. Also, the Fury's competitor was Maxwell, not Pascal. For instance, if I were to spend my cash on a Titan X Pascal for $1200+, it would be overkill for my setup. My monitor is limited to only 80Hz @ 1440p, which is the resolution I normally use for my games, and my Fury X handles my current setup well. Now if I start pushing for 120Hz-144Hz @ 1440p-1600p, then Vega and Nvidia's high-end GPUs will need to be compared before purchase. I can get 144Hz @ 1080p and 60Hz @ 4K.

I'm planning on getting a pair of Freesync monitors to replace my Nvidia monitors so I'll probably be sticking with AMD regardless. Who knows at this point.

Quote:


> Very happy with the card and display so far. First time playing Doom, with Vulkan, @ 1440 with freesync on is so smooth with everything maxed is unreal. I don't know what magic this freesync thing is but i really like it lol!


Not magic, but actual [competent] developers using Async Compute & other features in the Vulkan API properly, just like devs do on the consoles [Xbox One\PS4]. Unlike DX12, Vulkan won't let you program in the old DX11 way. I'm literally averaging 60fps @ 4K with my Fury X at 100% max settings + Ultra + TSAA.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> Turned on Fury X power efficiency, lowered state 7 voltage by 13mv. Benchmarks seem to be uneffected. System still passes the future mark stress test at 99.7%. I will try to undervolt more tonight.


I'm now creating a tuned undervolted ROM for my Fury X as well. Got to get a wall-plug power meter and/or measuring equipment to get amps at the PCI-E plugs for a better idea of the power saving.

As the data which HWiNFO shows from the driver isn't totally accurate, I've been lowering OCP in the ROM and then firing up the 3DM FSE demo looped to see when OCP kicks in. On my 1145MHz @ 1.268V ROM, I can lower GPU-phase OCP from the stock 240A down to 216A, but going down to 204A equals card shutdown. When at stock clocks/VID (1050MHz/1.212V) I can have OCP set to 204A; anything lower and the card shuts down. So the voltage control chip is seeing roughly below ~274W on the OC ROM and ~247W at stock when monitoring the GPU phases. Hopefully undervolting will result in a decent drop in amps; testing some ROMs today hopefully.
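Back-of-envelope check of those OCP figures: the regulator trips on current, so the implied power ceiling is just the OCP limit times the core VID. A minimal sketch with the numbers from the post (the helper name is mine):

```python
# Implied power at the GPU phases when OCP trips: P = I_limit x VID.
def ocp_power_w(ocp_amps: float, vid_volts: float) -> float:
    return ocp_amps * vid_volts

oc_rom = ocp_power_w(216, 1.268)  # lowest OCP that still runs the 1145MHz OC ROM
stock  = ocp_power_w(204, 1.212)  # lowest OCP that still runs at stock 1050MHz
print(round(oc_rom), round(stock))  # ~274 W and ~247 W, matching the post
```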


----------



## Kana-Maru

Here is my older post from August when I undervolted my Fury X:

"Stock voltage for my Fury X would peak at 1.225mV and the average would hover around 1.19mV - 1.21mV.

I undervolting the card using these settings:
Core Voltage: -36
Power Limit %: -25
Core Clock 1050Mhz

The voltage now peaks at 1.16mV and the average is roughly 1.14mV - 1.15mV. I saw it dip as low as 1.10mV during less stressful parts in the benchmark test.

I ran Heaven Benchmark 4 [Tessellation = Extreme 1440p + 4K], Valley Benchmark 1 [Ultra: 1440p + 4K] and Fire Strike Extreme & Ultra. No artifacts and no crashing.

The temps were around 35c-40c! Wow that's around 5c - 8c lower than what I normally see!
I might try to go lower, but at some point I'm sure I'll need to lower my core clock a little bit."

So far so good. I normally run my Fury X at stock unless I'm going for a high benchmark. It's been a while since I've overclocked my GPU, but I was able to hit 1180MHz on the core easily at one point. I also haven't played around with Wattman just yet.

I have a big post coming up on my blog with the new Crimson ReLive 16.12.1 drivers. I was planning to do this at the end of the year anyway, but AMD continues to impress me with their drivers. I'm benchmarking several of the best-looking and most stressful titles we've had over the past couple of years @ 4K with the Fury X running stock settings. Normally the in-game settings are 100% maxed or the highest preset with no AA. So far I've completed 8 games @ 4K.

The games benched at 4K are:
-*Doom* Vulkan
-*TitanFall* 2 DX11
-*Hitman* DX12
-*Metro: Last Light Redux* DX11
-*Ryse: Son of Rome* DX11
-*The Witcher 3* DX11
-*Rise of the Tomb Raider* DX12 [Crystal Dynamics, after SO MANY UPDATES FINALLY has DX12+ Async Compute working just as good as DX11]
-*Metal Gear Solid V: The Phantom Pain DX11*

What do you guys think about that lineup? I have other games I'm going to benchmark as well, such as The Evil Within @ 4K 100% maxed w\ AA disabled, maybe Metro 2033 and possibly Deus Ex: Mankind Divided. I have other games I've tested @ 4K in the past as well; I was just re-doing some of those tests with the newer ReLive driver.

I can easily tell you that AMD has impressed me with their drivers and the Fury X. I can only speak for the Fury X because it's the only card in my gaming rig. Some people require 60fps or nothing, and although I do hit a 60fps average in a few titles @ 4K, I have no issues with 40fps and higher, especially when the games look gorgeous and don't have microstutter or screen tearing. I'm hoping to have the page up later today.
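For what it's worth, that Core Voltage offset roughly lines up with the observed VIDs. A quick sketch, assuming the -36 slider is millivolts (as Afterburner/Trixx-style sliders are usually labelled):

```python
# Requested peak VID after a flat -36 mV core offset (figures from the post).
stock_peak_v = 1.225
offset_mv = -36
requested_peak_v = stock_peak_v + offset_mv / 1000  # 1.189 V requested
# The ~1.16 V the card actually reports also reflects droop under load,
# so the telemetry reads a little below the requested value.
print(requested_peak_v)
```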


----------



## damarad21

Well, just tested my new fury nitro.

http://www.3dmark.com/fs/11039894

Not bad at all for a "bargain" video card .


----------



## diggiddi

Quote:


> Originally Posted by *Kana-Maru*
> 
> Here is my older post from August when I underclocked my Fury X:
> 
> "Stock voltage for my Fury X would peak at 1.225mV and the average would hover around 1.19mV - 1.21mV.
> 
> I undervolting the card using these settings:
> Core Voltage: -36
> Power Limit %: -25
> Core Clock 1050Mhz
> 
> The voltage now peaks at 1.16mV and the average is roughly 1.14mV - 1.15mV. I saw it dip as low as 1.10mV during less stressful parts in the benchmark test.
> 
> I ran Heaven Benchmark 4 [Tessellation = Extreme 1440p + 4K], Valley Benchmark 1 [Ultra: 1440p + 4K] and Fire Strike Extreme & Ultra. No artifacts and no crashing.
> 
> The temps were around 35c-40c! Wow that's around 5c - 8c lower than what I normally see!
> I might try to go lower, but at some point I'm sure I'll need to lower my core clock a little bit."
> 
> So far so good. I normally run my Fury X at stock unless I'm going for a high benchmark. It's been awhile since I've overclocked my GPU, but I was able to hit 1180Mhz on the Core easily at one point. I also haven't played around with Wattman just yet.
> 
> I have a big post coming up on my blog with the new Crimson ReLive 16.12.1 drivers. I was planning to do this at the end of the year anyways, but AMD continues to impress me with their drivers. I'm benchmarking several of the best looking and most stressful titles we've had over the past couple of years @ 4K with the Fury X running stock settings. Normally the in-game settings are 100% maxed or the highest preset with no AA. So far I've completed 8 games @ 4K.
> 
> The games benched at 4K are:
> -*Doom* Vulkan
> -*TitanFall* 2 DX11
> -*Hitman* DX12
> -*Metro: Last Light Redux* DX11
> -*Ryse: Son of Rome* DX11
> -*The Witcher 3* DX11
> -*Rise of the Tomb Raider* DX12 [Crystal Dynamics, after SO MANY UPDATES FINALLY has DX12+ Async Compute working just as good as DX11]
> -*Metal Gear Solid V: The Phantom Pain DX11*
> 
> What do you guys think about that lineup? I have other games I'm going to benchmark as well such as The Evil Within @ 4K 100% maxed w\ AA Disabled, maybe Metro: 2033 and possibly Deus Ex: Mankind Divided. I have other games I've tested @ 4K in the past as well. I was just re-doing some of those test with the newer Relive driver.
> 
> I can easily tell you that AMD has impressed me with their drivers and Fury X. I can only speak for the Fury X because it the only card in my gaming rig. Some people require 60fps or nothing and although I do hit 60fps average in a few titles @ 4K, I have no issues with 40fps and higher. Especially when the games look gorgeous and doesn't have microstutter or screen-tearing. I'm hoping to have the page up later today.


Could you test Crysis 3 and Project Cars / Assetto Corsa if you have them? The Assetto Pagani version is free on Steam.


----------



## Kana-Maru

Quote:


> Originally Posted by *diggiddi*
> 
> Could you test Crysis 3 and Project car /Assetto corsa if you have them. Assetto pagani version is free on steam


No problem. I own Crysis 3, but I do not own Project Cars or Assetto Corsa at the moment. I can try to catch them on sale during the Steam winter sale or something. I'll download the [Assetto Corsa] Pagani version from Steam now; looks like it'll only take 3 minutes.

I have already uploaded the results to my blog.

You can Google the title:
"Crimson ReLive 16.12.1 - Several Games Benchmarked @ 4K"

Apparently I can't post the direct link to my benchmarks on this site without getting modded. I've already tested 8 games maxed out @ 4K. I've also compared Crimson 16.12.1 to older Crimson drivers in a few games. I'll be updating the article soon.

I have finally gotten around to testing the ReLive feature. I used the Instant Replay feature with Metal Gear Solid V: The Phantom Pain, 100% maxed settings @ 4K. This was an initial run, so no warm-ups, and I think the game ran and played well overall. I'm in the process of uploading the video to YouTube now; I'll post the link soon.


----------



## Performer81

Wattman is really nice. I can undervolt to regions where Afterburner would give me instant crashes, because not every clock/voltage domain is affected by undervolting.

Down from 1.25V on my XFX Fury. Fans barely go over idle rpm although I have a temp limit of 62 degrees. Super cool and quiet.


----------



## Kana-Maru

Here is my Metal Gear Solid V: TPP 4K recording using the new ReLive Instant Replay feature. I chose this level since there was a lot happening and plenty of physics; it makes a great benchmark. The GPU Power Saving\Efficiency setting was enabled during this test, btw.

The Fury X is limited to 4K @ 30fps recording, but I included my FPS on-screen so you guys can get an idea of what FPS I'm actually getting during actual gameplay.

Recording Profile: High [Bitrate 30Mbps]
Graphics settings are 100% maxed.


----------



## neurotix

Guys who are having trouble with the new ReLive drivers and Trixx/Afterburner: try using Trixx 5.2.1. Don't use Trixx 6.x.x (the newest one with the stupid new interface) for Fury cards; on 16.11, if I used the latest Trixx I couldn't control my card's fan speed. I'm on the ReLive drivers with no issues at all on Win7 x64 SP1 using Trixx 5.2.1. I can overclock, control fans, and everything is working great!

Radeon Chill is awesome: my temps are 10-15C lower using the same OC and fan profile without any performance loss! Tomb Raider 2013 and Skyrim only max out my top card at 45C in Eyefinity! ROTTR and Witcher 3 are around 51C max! 60fps all around.


----------



## ressonantia

Hey @kana-maru, nice benchmarks, just a quick note. In your Doom benchmarks, I think you've got your max FPS and your min FPS mixed?
Quote:


> Doom
> FPS Avg: 60.09
> FPS Max: 46.64
> FPS Min Caliber: 109


----------



## Kana-Maru

Quote:


> Originally Posted by *ressonantia*
> 
> Hey @kana-maru, nice benchmarks, just a quick note. In your Doom benchmarks, I think you've got your max FPS and your min FPS mixed?


Thanks. I've corrected that.


----------



## gupsterg

Quote:


> Originally Posted by *Performer81*
> 
> Wattman is really nice. I can undervolt to regions Afterburner would give me instant crashes, because not every clock/voltage domain is affected by undervolting.
> 
> 
> 
> Down from 1,25V on my XFX Fury. Fans barely go over idle rpm although i have a temp limit of 62 degrees.
> 
> 
> 
> 
> 
> 
> 
> Super cool and quiet.


Yes, this is a big benefit of WattMan. No other OC SW allows GPU clock/VID manipulation per DPM state.

You can also use the WattMan VIDs in a ROM.

I'm doing it by ROM. My DPM 7 is 1.212V and can go down to 1.175V @ 1050MHz. DPM 4-6 I have lowered by -18.75mV from stock. Will be posting results/data in the Fiji BIOS mod thread when I've completed all testing.

I will also post a method to keep the card at a given DPM so you can stability test it; it has been discussed in the BIOS mod thread before, but I will do a section in the OP. Perhaps this may help you get DPM 6 down from the 1200mV shown in your screenshot.
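To make the per-DPM idea concrete, here's a hypothetical sketch. These VID tables move in 6.25 mV steps, so -18.75 mV is exactly three steps, and the DPM 7 drop (1212.5 mV stock, which "1.212V" presumably rounds, down to 1175 mV) is six. The DPM 4-5 stock VIDs below are illustrative placeholders, not values dumped from a real ROM:

```python
STEP_MV = 6.25  # VID granularity on these voltage controllers

# Stock VIDs per DPM state in millivolts. DPM 4-5 are made-up examples;
# DPM 6-7 follow the figures in the post.
stock_vid_mv = {4: 1100.0, 5: 1150.0, 6: 1200.0, 7: 1212.5}
offset_steps = {4: -3, 5: -3, 6: -3, 7: -6}  # -18.75 mV on 4-6, -37.5 mV on 7

tuned_vid_mv = {dpm: vid + offset_steps[dpm] * STEP_MV
                for dpm, vid in stock_vid_mv.items()}
print(tuned_vid_mv[7], tuned_vid_mv[6])  # 1175.0 and 1181.25
```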


----------



## Semel

Is anyone planning on selling their Fury and getting Vega?


----------



## gupsterg

Heart says sell.

Brain says no sell.

I reckon it's gonna be priced as steep as the GTX 1080; the Fury X was very closely priced to the 980 Ti. I paid cheap for my Fury X, ~£250 (Mar '16), so if Vega is 50% faster it needs to be @ £375 (launch day) for me to consider it. No way am I ploughing £500+ into a single GPU.


----------



## bluezone

Yes Vega looks very interesting.


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> Heart says sell.
> 
> Brain says no sell.
> 
> I reckon it's gonna be priced as steep as GTX 1080, Fury X was very closely priced to 980 Ti. I paid cheap for my Fury X ~£250 (Mar 16), so if Vega is 50% faster it needs to be @ £375 (launch day) for me to consider. No way am I thinking of ploughing £500+ into a single GPU.


That's me... I got lucky on my Fury X and got it new for a very good price, which is now pretty much the going price ($350-400). So for me to consider Vega it would have to be under $600 for sure, especially since the Fury X currently drives all the games I play but one at 4K 60fps quite well; for that one I have to back down to 1440p or turn the graphics down at 4K to keep 60fps.


----------



## neurotix

Was saving for Vega and had over $1k saved; dropped it all on two Furys ($300 each) and 3x 1080p IPS monitors.

The Fury at the $300 price point is just amazing... I mean, I got two for less than the price of a GTX 1080, and they get similar (probably better) performance in games that support Crossfire, and most of the AAA titles I have do. The GTX 1080 benches higher, but in games I really don't think it can beat Crossfire Furys.

I do HWBOT, and I saw bench scores of SLI 1080s at 1800MHz in Fire Strike (regular) that were basically equivalent to my Furys at 1100MHz: the 23k range.

So happy with my setup now.


----------



## Jflisk

I have 2 Fury Xs, so any new card I buy is going to have to top them; I mean one card would have to equal both of them for me to make the switch. When I see the real performance of a card, that's when I make my decision. Oh, and a water block being available helps the decision as well.


----------



## neurotix

Same, it would need to be literally twice as powerful, or close...

6000+ shaders
128 ROPs
400+ TMUs
16GB memory

Something along those lines. I really doubt Vega will be anything near this.


----------



## neptunex

Well, being X% faster than 'previous gen' NVIDIA GPUs doesn't help it either. If I were AMD I'd just bang out an 8GB Fury with higher base clocks and spend the time trying to break into the high-end market with a good product.


----------



## phantommaggot

Hey guys, quick question.
I can't seem to find it answered when I search.

I'm gonna re-paste the air cooler on my Sapphire R9 Nano with GC-Extreme. I need to know what thickness thermal pads to order so I can replace those as well.


----------



## xkm1948

It seems Virtual Reality puts a heavier load on the GPU compared with most GPU stress tests. I can get by with 1175mV for 1050MHz (stock voltage for 1050MHz was 1243mV) during most stress tests. However, the moment I fire up any VR game my system instantly crashes. I need to step it up to ~1230mV for the card to be completely stable in VR.

At the same time, I don't quite like the Chill feature in Fallout 4. Minimum FPS dipped down to ~40, which makes it almost unplayable. I need a good constant 55-60 FPS to enjoy the game. So I am not sure whether Radeon Chill will be enabled most of the time for me.

Edit: On a side note, I believe the new Vega will also come with 4096 processing units. In terms of frequency and efficiency it should do way better, but it won't be a good enough reason for us Fury X owners to upgrade.


----------



## diggiddi

Quote:


> Originally Posted by *xkm1948*
> 
> It seems Virtual Reality puts a heavier load on GPU comparing with most GPU stress test. I can get by with 1175mV for 1050MHz(stock voltage for 1050 was 1243mV) during most stress test. However the moment I fire up any VR game my system would instant crash. I need to step it up to ~1230mV for the card to be completely stable on VR.
> 
> At the same time I don't quite like the Chill feature during Fallout4. Minium FPS dipped down to ~40 which makes it almost unplayable. I need a good constant 55~60FPS to enjoy the game. So I am not sure whether the Radeon Chill will be enabled for most of the time for me.
> 
> Edit: On a side note I believe the new Vega will also come with 4096 processing unit. In terms of frequency and efficiency it should do way better *but it won't be a good enough reason for us Fury X owners to upgrade to*.


How do you draw that conclusion?


----------



## neurotix

Quote:


> Originally Posted by *diggiddi*
> 
> How do you draw that conclusion?


Maybe because the same specs, just on a smaller process node, will still probably give similar performance. At best, probably +20%. The GTX 1080 is only around 30% faster than the 980 Ti... newer flagships have traditionally only given 20-30% more performance. It was this way from the HD 7970 to the R9 290X, and from the 290X to the Fury X. Polaris (RX 480) is still around the level of the 290X, maybe +20%. It is more efficient and uses less power, though. So Vega might have the same specs and a little more performance while using half the energy because of the node shrink. But this is all speculation and could be totally wrong; we won't know till it comes out.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> ... but it won't be a good enough reason for us Fury X owners to upgrade to.


Quote:


> Originally Posted by *diggiddi*
> 
> How do you draw that conclusion?


Still toying with what I may do; it depends a lot on the launch price and performance of Vega. For me, changing from Hawaii to Fiji was a very, very small cost. Every Hawaii card I've had I've sold without loss: a) due to savvy buying, b) selling during an eBay promo, c) some I even made some £ on selling after use. I envisage Fiji to Vega would be more of an upgrade cost at launch, but maybe a smaller outlay than if I wait until after launch (say up to ~1yr), as even if Vega is cheaper by then, Fiji will sell for less IMO.

I wasn't blown away by the Fury X originally after purchase, but the longer I used it and looked at the "whole" package, the more it felt like a sound purchase and a decent upgrade for several reasons.

At 1080P, IMO the Fury X stomped FPS in every game I have, so going to 1440P became logical. The only 2 titles I have which get somewhere around 60 FPS are Crysis 3 and Lords of the Fallen. I can't think of one game where I'm not using maximum settings; the only setting I may tweak from max is AA, TBH. Also, going to a FreeSync monitor a few months back has made the longevity of the card better IMO as well. I also fold, and Folding@Home performance is phenomenal IMO: anywhere from 400K to 500K on a single card over a ~24hr straight run, whereas Hawaii is around 200K. I was not impressed by Polaris on perf/W; Fiji I am, and I'm hoping Vega is better than Fiji on perf/W.

Even with aftermarket air coolers, the Hawaii cards I owned were not as quiet as the AIO on the Fury X, or even the Tri-X on the Fury, IMO. I really like the smaller form factor of the Fury X, even if the AIO is a bit of a PITA to install IMO. I did contemplate going WC on Hawaii; now that I'm not getting anything GPU-wise, I don't feel the need to WC vs the AIO on the Fury X = cost efficiency IMO.

The Fury X is probably the first card I'm considering keeping as a memento of PC hardware.


----------



## xkm1948

I have always wondered: do they pay you for performing Folding@Home? If not, how do you justify the cost of electricity? As a cheap, broke biology researcher I find it amazing that people willingly contribute their computing power and pay the electric bill for our research. I personally would never do it.


----------



## neurotix

Quote:


> Originally Posted by *xkm1948*
> 
> I have always wondered. Do they pay you for performing Folding@Home? If not how do you justify the cost of electricity? As a cheap broke biology researcher I find it amazing that people willingly contribute their computing power and pay the electric bill for our research. I personally would never do it.


I have like 55 million points in Folding@Home, so let me take this.

We eat the cost to support the research and the cause.

However... folding in the winter months, the cards produce a lot of heat. This means you can turn the heat down for that room. This offsets the cost and makes folding cost a lot less, sometimes nothing extra. My physicist friend explained this to me, and also said it costs nothing due to the polar vortex of cold air (not sure I understand this). But there ya go. =P


----------



## MrKoala

If you use electric heaters in winter, the price you pay for electricity going through the heater or the computer is exactly the same for the same amount of heat produced, so you might as well do something productive with it. Besides wear on the computer hardware (gamers will replace it quickly anyway), you don't lose anything.
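The equivalence is just conservation of energy: a computer and a resistive heater drawing the same watts dump the same heat into the room for the same cost. A toy calculation with assumed numbers:

```python
# A 300 W folding GPU vs. a 300 W resistive heater over one winter evening.
power_w = 300
hours = 10
price_per_kwh = 0.15  # assumed electricity tariff, $/kWh

energy_kwh = power_w * hours / 1000   # 3.0 kWh either way
cost = energy_kwh * price_per_kwh     # $0.45 either way
# Same joules in, same heat out, same bill; the GPU just folds proteins too.
print(energy_kwh, cost)
```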


----------



## jelin1984

How can I install the Sapphire R9 Fury BIOS on an ASUS R9 Fury?


----------



## xkm1948

Quote:


> Originally Posted by *neurotix*
> 
> I have like 55 million points in Folding@Home so let me take this.
> 
> We eat the cost to *support the research and the cause*.
> 
> However...folding in the winter months, the cards produce a lot of heat. This means you can turn the heat down for that room. This offsets the cost and makes the folding cost a lot less, sometimes nothing extra. My physicist friend explained this to me, and also said it costs nothing due to the polar vortex of cold air ( not sure I understand this). But there ya go. =P


This does make me smile. I used to have a collaboration with the Stanford folding team to look for transient protein-protein interactions. It was for a minor project regarding some maize chloroplast work I did in my old lab. I never got to know which fine gentleman's/lady's GPU helped me out, but I definitely appreciate the effort. It made up about 1/100th of a paper published in a 3rd-tier molecular genetics journal.

Instead of tapping average gamers for computing power, most public research universities have either started building their own supercomputing clusters or purchase virtual clusters from a private source. Most of my graduate student friends, who were totally gamers, never picked up the idea of doing Folding@Home. Researchers are paid a pathetic salary. Hell, one of our lab postdocs has to go to McDonald's to wash dishes every night after work so he can feed his daughter. We can barely live a dignified life. The structure of the entire research system in the US is broken.

On one hand you have senior researchers/postdocs/grad students working like slaves pushing the boundary of science. On the other hand, the universities tap the average joe for home computing power to back up public research. So where did the billions of dollars' worth of funds that universities are supposed to spend on scientific research go? First, it was not spent on improving the conditions of miserable researchers. Second, it was not spent on improving infrastructure. Most of the money went into the endless abyss called the university administration system.

Anyway, enough ranting. Thank you for supporting research. That answered all of my questions.


----------



## gupsterg

@xkm1948

You receive no payment for doing Folding@Home, and I don't do it for room heating either; it feels like being charitable, socially responsible and part of a bigger cause forwarding humankind.

How I got into folding is sort of like this. I've been a subscriber to Custom PC magazine in the UK since issue 1 (Oct 2003), and liking the magazine, as soon as they had a Folding@Home team I joined. In the magazine they run a folder-of-the-month interview, which I've never been. They also mention top producers and when members reach a milestone. I've been in the mag several times due to milestones, which is cool since I collect the mag.

As I meddle around with tweaking my rig a lot, it's a great stability test IMO and produces something of use. You sort of get addicted to the points, or to climbing in your team. Feb '16 I was 150th in the CPC team, 135th in Mar '16, currently 106th (my profile link).

My brother has been involved in biological research; most of it is above my level of understanding. He has also worked FOC on projects, and we as a family have supported him. He has done research on human embryos for disease detection IIRC; currently he is involved in a cancer research lab and going for a PhD. Proud of him, and happy he is following something he has a passion for and the betterment of humans.


----------



## neurotix

Quote:


> Originally Posted by *xkm1948*
> 
> This does make me smile. I used to have a collaboration via the Standford folding team to look for protein-protein transient interactions. It was for a minor project regarding some maize chroloplast projects I worked on in my old lab, I never get to know which fine gentlemen/lady's GPU helped me out but I definitely appreciate the effort. It made up about 1/100th portion of a paper published in a 3rd tier molecular genetics journal.
> 
> Instead of tapping average gamers for computing power most public research Universities have either started building their own super computing cluster, or purchasing virtual cluster from a private source. Most of my graduate students friends, who were totally gamers, never picked up the idea of doing Folding@Home. Researchers are paid a pathetic salary. Hell one of our lab postdoc have to go to McDonalds to wash dishes every night after work so he can feed his daughter. We can barely live a dignified life. The structure of the entire research system in US is broken.
> 
> On one hand you have senior researchers/postdocs/grad students working like slaves pushing the boundary of science. On the other hand the Universities tap average joe for their home computing power to backup public research. So where did the billion dollar worth of funds that Universities suppose to spend on scientific research go? First it was not spent on improving the condition of miserable researchers. Second it was not spent on improving infrastructures. Most of the money went into the endless abyss called the administration system of the University.
> 
> Anyway enough ranting. Thank you for supporting research. That answered all of my questions.


No problem, glad to help you out. If you have any applicable hardware and want to support the cause, head over to Brass Bottom Boys, my old folding team, and let them know. They really need the help right now.


----------



## Jflisk

Do you mean how to install a bios

Run on cmd line as administrator in windows. You don't need to exit windows to flash the card . put everything including bios into a folder on C:\ call it atiflash

CMD opens command prompt box . type cd c:\ then cd atiflash then type
atiflash -p 0(0 top or 1 second depending on card position) biosname.rom

The proper usage of atiflash 2.71 is

atiflash -p 0(0 top or 1 second card depending on card position) biosname.rom

All the switches can be found here
http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

More here
http://www.overclock.net/t/640063/how-to-flash-ati-cards

or this

THIS IS FROM A PREVIOUS POST

Look up the ATIflash directions; you can use it from an administrative command line. There are two flashers in the one package, atiwinflash and atiflash; use the atiflash command. It should look like this:

Instructions:
http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

Download:
https://www.techpowerup.com/downloads/2531/atiflash-2-71/

From an elevated command prompt:

atiflash -p 0 biosname.bin

(0 = first card, 1 = second card, and so on.)

Hope this helps
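The steps above can be collected into one small script with a dry-run guard. This is only a sketch: it assumes atiflash 2.71 is in the current folder, and the adapter number, backup.rom and biosname.rom are placeholders you would substitute yourself. Flash only from an elevated prompt.

```shell
#!/bin/sh
# Dry-run sketch of the flashing steps above. The adapter number and
# ROM names are placeholders from the post, not verified values.
DRY_RUN=1
ADAPTER=0            # 0 = first/top card, 1 = second card, and so on
ROM="biosname.rom"   # your BIOS file, placed in the atiflash folder

run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "$@"        # just print the command instead of executing it
  else
    "$@"
  fi
}

run atiflash -s "$ADAPTER" backup.rom   # save the current BIOS first
run atiflash -p "$ADAPTER" "$ROM"       # program the new BIOS
```

With DRY_RUN=1 it only prints the two atiflash commands so you can check them before committing; set DRY_RUN=0 to actually flash.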


----------



## gupsterg

Quote:


> Originally Posted by *jelin1984*
> 
> how i can install at asus r9 fury
> saphire r9 fury bios?????


I would not recommend flashing Fury Strix with Tri-X ROM. PCB design differs, several members that have flashed a Fury Tri-X or Fury X with Strix ROM have had issues.

The Strix has more VRM phases than the Fury Tri-X or Fury X. The IR3567B is a dual-output 6+2 phase voltage controller, so the Strix employs doublers to make its 10 GPU phases work with the IR3567B configured for 5-phase output on loop 1.

What do you hope to gain by flashing Strix with Tri-X ROM?


----------



## Performer81

Maybe he wants the timings/voltage mod.


----------



## Johan45

Whatever the reason he wants it, gupsterg is right: it's a very bad idea. On a reference design it's usually OK, but the Strix is different, just like a Kingpin is different from a Classified; they're the same brand but it still wouldn't work.


----------



## neurotix

Quote:


> Originally Posted by *Johan45*
> 
> Whatever the reason he wants it, gupsterg is right: it's a very bad idea. On a reference design it's usually OK, but the Strix is different, just like a Kingpin is different from a Classified; they're the same brand but it still wouldn't work.


Listen to this guy. He knows what he's talking about.


----------



## Thoth420

I am cannibalizing Kung Fury X soon and stockpiling parts for an AM4 build. I was curious what you guys think I could get for my Fury X with a full-coverage Supremacy block and a custom white EK backplate (the paint is the same used on expensive vehicles; it is very glossy and looks amazing, and makes my NZXT chassis paint look like crap in comparison). Just looking for a ballpark; I plan on listing it in the marketplace this weekend.


----------



## JonDuma

I upgraded my Sapphire R9 Fury Tri-X driver to Crimson ReLive, then after playing BF1 I started to get the errors "Default Radeon Wattman settings have been restored due to an unexpected system failure" and "VIDEO_TDR_FAILURE".

1. I tried to switch BIOS; it is the same.
2. I tried to use old drivers; it is still the same.
3. I tried to reset my PC; it is still the same.
4. I updated the motherboard to the latest BIOS (Asus Maximus VII Hero) and it is still the same.

And the unfortunate thing is my warranty just expired last Nov 30.
Any ideas or recommendations on how to fix it would really be appreciated.

Thanks,
Jon


----------



## AngryLobster

Don't use Wattman and other OC software like Afterburner in conjunction. Better yet, don't use Wattman at all. DDU the drivers and reinstall them; just don't ever go into the Wattman tab, so it's never active.

I had the same issue with my Fury. Wattman is not ready for prime time on our cards.


----------



## bluezone

Quote:


> Originally Posted by *JonDuma*
> 
> I upgraded my Sapphire R9 Fury Tri-X driver to Crimson ReLive, then after playing BF1 I started to get the errors "Default Radeon Wattman settings have been restored due to an unexpected system failure" and "VIDEO_TDR_FAILURE".
> 
> 1. I tried to switch BIOS; it is the same.
> 2. I tried to use old drivers; it is still the same.
> 3. I tried to reset my PC; it is still the same.
> 4. I updated the motherboard to the latest BIOS (Asus Maximus VII Hero) and it is still the same.
> 
> And the unfortunate thing is my warranty just expired last Nov 30.
> Any ideas or recommendations on how to fix it would really be appreciated.
> 
> Thanks,
> Jon


It's a long shot, but how is the thermal paste on the CPU? Are the temps OK? How about GPU fan speed?


----------



## kfxsti

Does anyone know of a good tutorial on repasting a Fury? I am a bit hesitant because of some of the pics I have seen showing some people putting paste on both the core and the HBM, and some putting it on the core alone. It may sound silly to ask, but I would rather ask and have it done correctly than not ask and really mess some stuff up lol. It will be Gelid GC-Extreme going back on it, btw. Thanks for any help guys.


----------



## bluezone

Quote:


> Originally Posted by *kfxsti*
> 
> Does anyone know of a good tutorial about repasting a Fury? I am A bit hesitant because of some of the pics I have seen showing some putting paste on the the core and the HBM and some just putting it on the core alone. It may sound silly to ask, but I would rather ask and it be done correctly than not ask and reallllly mess some stuff up lol. It will be Gelid GC Extreme going back on it btw. Thanks for any help guys.


I do not know of any, but I do have a few suggestions.

1. Remove and replace the heatsink screws a little at a time, moving from one screw to the one on the opposite corner, then to the two other opposing screws, back and forth.
2. Be careful in cleaning the interposer. The circuit traces are exposed; in other words, do not use anything hard or sharp to clean between the GPU and HBM dies. Nice and gentle. I used alcohol and Q-tips. Also, I did not try to remove every last speck of TIM from the interposer, just the worst of it.
3. Clean the dies and cooler very well. I use alcohol and a microfiber cloth.
4. TIM needs to be on both the HBM and the GPU dies. I used the extended "X" method; upon re-disassembly this seemed to have worked well for me.
5. See item 1. You can crack the interposer (very rare) if you do not exercise due care: tighten in a crisscross pattern, a little bit at a time.

When using Gelid GC, warm its container in hot water before use; it is pretty thick when cold.

Cheers


----------



## JonDuma

Quote:


> Originally Posted by *bluezone*
> 
> It's a long shot, but how is the thermal paste on the CPU? Are the temps OK? How about GPU fan speed?



Hi Blue,

I am using a Be Quiet! Dark Rock Pro 3 with 3 fans and Arctic Silver 5; the CPU temps in BF1 are between 49 and 55°C. Also, the GPU temp is only around 70 degrees.

One thing I have noticed is that all of the blue LED indicators on the Fury were lit up even without the driver installed.

Thanks,
Jon


----------



## kfxsti

Quote:


> Originally Posted by *bluezone*
> 
> I do not know of any, but I do have a few suggestions.
> 
> 1. Remove and replace the heatsink screws a little at a time, moving from one screw to the one on the opposite corner, then to the two other opposing screws, back and forth.
> 2. Be careful in cleaning the interposer. The circuit traces are exposed; in other words, do not use anything hard or sharp to clean between the GPU and HBM dies. Nice and gentle. I used alcohol and Q-tips. Also, I did not try to remove every last speck of TIM from the interposer, just the worst of it.
> 3. Clean the dies and cooler very well. I use alcohol and a microfiber cloth.
> 4. TIM needs to be on both the HBM and the GPU dies. I used the extended "X" method; upon re-disassembly this seemed to have worked well for me.
> 5. See item 1. You can crack the interposer (very rare) if you do not exercise due care: tighten in a crisscross pattern, a little bit at a time.
> 
> When using Gelid GC, warm its container in hot water before use; it is pretty thick when cold.
> 
> Cheers


Thank you so much !!! About to start on both cards right now!!!


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *phantommaggot*
> 
> Hey guys, quick question.
> I can't seem to find it answered when I search.
> 
> I'm gonna re-paste the air cooler on my Sapphire r9 Nano with GC Extreme. I need to know what thickness thermal pads to order so I can replace those as well.


Did anyone have an answer for this? I would assume 1.0-1.5mm is plenty but it's rather hard to tell from the photos I've seen.


----------



## Alastair

Anyone know what a 1070, single and dual, manages in Heaven 4.0 at 1440p Extreme? Just wanna compare my Furys.


----------



## Johan45

Quote:


> Originally Posted by *JonDuma*
> 
> I upgraded my Sapphire R9 Fury Tri-X driver to Crimson ReLive, then after playing BF1 I started to get the errors "Default Radeon Wattman settings have been restored due to an unexpected system failure" and "VIDEO_TDR_FAILURE".
> 
> 1. I tried to switch BIOS; it is the same.
> 2. I tried to use old drivers; it is still the same.
> 3. I tried to reset my PC; it is still the same.
> 4. I updated the motherboard to the latest BIOS (Asus Maximus VII Hero) and it is still the same.
> 
> And the unfortunate thing is my warranty just expired last Nov 30.
> Any ideas or recommendations on how to fix it would really be appreciated.
> 
> Thanks,
> Jon


Have you modified the BIOS to unlock cores? I don't know if it applies to Fiji, but I know the new ReLive drivers have been messing with Polaris if it has a modded BIOS. The driver is checking the BIOS checksum:
https://www.techpowerup.com/forums/threads/amd-bios-signature-check-re-enabled-with-relive-locks-out-polaris-bios-modders.228536/#post-3567588


----------



## bluezone

Quote:


> Originally Posted by *JonDuma*
> 
> I upgraded my Sapphire R9 Fury Tri-X driver to Crimson ReLive, then after playing BF1 I started to get the errors "Default Radeon Wattman settings have been restored due to an unexpected system failure" and "VIDEO_TDR_FAILURE".
> 
> 1. I tried to switch BIOS; it is the same.
> 2. I tried to use old drivers; it is still the same.
> 3. I tried to reset my PC; it is still the same.
> 4. I updated the motherboard to the latest BIOS (Asus Maximus VII Hero) and it is still the same.
> 
> And the unfortunate thing is my warranty just expired last Nov 30.
> Any ideas or recommendations on how to fix it would really be appreciated.
> 
> Thanks,
> Jon


I just noticed this post on GURU3D:
Quote:


> Originally Posted by streetwolf View Post
> Repeating my previous post which no one replied to.... Does anyone else find that setting the GPU fan to manual isn't working as expected? Either setting it to manual takes a few tries and once you do the fan speed reverts to the automatic speed at some point even though it is still set to manual? A restart of Windows will also set the speed to the automatic one. Just looking to confirm so I can report it to AMD.
> 
> Yes, I've found the fan settings not working as expected, but I have not had any issues like you describe. When I click "apply" the new setting takes effect instantly, and remains in effect - no multiple attempts and no reverting to auto, even after a reset.
> 
> When I switch to manual it's set to 1200, and I get 1330 rpms. Increasing the slider to 1300 drops rpms to 1065. I have to set it to 1850 to get the 1330rpms that I started with(when it was initially set to 1200).
> 
> IMO if you can reproduce a bug consistently, then you should report it. A fan setting not sticking could be bad news.


I've noticed a similar problem with odd fan speeds. Keep an eye on fan speeds with the ReLive driver, everyone.


----------



## gupsterg

Quote:


> Originally Posted by *Johan45*
> 
> Have you modified the BIOS to unlock cores? I don't know if it applies to Fiji, but I know the new ReLive drivers have been messing with Polaris if it has a modded BIOS. The driver is checking the BIOS checksum:
> https://www.techpowerup.com/forums/threads/amd-bios-signature-check-re-enabled-with-relive-locks-out-polaris-bios-modders.228536/#post-3567588


The ReLive driver for Polaris is checking the BIOS signature, not the checksum. The BIOS signature is supposed to be a hash of a hash of the protected tables. When we BIOS mod, if the signature is not updated to reflect the updated information in the BIOS tables, the driver knows you have a non-stock ROM. With the ReLive driver on Fiji and earlier GPUs, the BIOS signature is not being checked.
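For reference on the checksum half of that distinction: the legacy option-ROM checksum is just an 8-bit sum of every byte in the image that must come out to 0 mod 256, no cryptography involved, which is why modders could trivially re-fix it after editing, while a signature hash cannot be recomputed without knowing how it is generated. A minimal sketch (rom_checksum is an illustrative helper, not part of any flashing tool):

```shell
#!/bin/sh
# 8-bit option-ROM checksum: all bytes of a valid image sum to 0 (mod 256).
# This is the simple check being contrasted with the new signature hash.
rom_checksum() {
  od -An -v -t u1 "$1" |      # dump every byte as unsigned decimal
    tr -s ' ' '\n' |          # one number per line
    awk 'NF { s += $1 } END { print s % 256 }'
}

# Usage: rom_checksum some.rom  ->  prints 0 for a checksum-valid image
```

Mod tools keep the image valid by adjusting the checksum byte so the total wraps back to zero; a hash-based signature offers no such trivial fix-up.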
Quote:


> Originally Posted by *bluezone*
> 
> I've noticed a similar problem, odd fan speeds. keep an eye on fan speeds using Relive driver everyone.


Fury X with a stock or custom ROM on ReLive: no fan issue for me with the stock AIO.


----------



## bluezone

Quote:


> Originally Posted by *JonDuma*
> 
> Hi Blue,
> 
> I am using a Be Quiet! Dark Rock Pro 3 with 3 fans and Arctic Silver 5; the CPU temps in BF1 are between 49 and 55°C. Also, the GPU temp is only around 70 degrees.
> 
> One thing I have noticed is that all of the blue LED indicators on the Fury were lit up even without the driver installed.
> 
> Thanks,
> Jon


Your temps look good. From what I have read, Arctic Silver is adequate for CPUs, but it is not suggested for GPUs if you are using it for repasting.
I have a Nano and it does not use activity lights, so I have no idea if that is normal behaviour, but it does seem odd.


----------



## gupsterg

Quote:


> Originally Posted by *JonDuma*
> 
> Also the temps of the GPU is around 70 degrees only.


70°C GPU on the Fury Tri-X stock ROM/fan profile is about what I have in my recorded data from when I owned one, so not an issue; my room temp was ~22-24°C from memory.
Quote:


> Originally Posted by *JonDuma*
> 
> One thing I have noticed is that all of the blue LED indicators on the Fury were lit up even without the driver installed.


My Fury X, which uses the same PCB as the Fury Tri-X, does the same when no driver is installed. Also, while the mobo is in the boot stages up to OS loading, and if the OS has the AMD driver, the GPUTach LEDs will have only 1 lit when not under load.



Spoiler: AMD Fury X (and Tri-X as same PCB) GPUTach LED









Googling VIDEO_TDR_FAILURE does not bring up good reading material, i.e. some are RMA'ing cards to resolve it. As some are resolving it with a driver/OS reinstall, I would be tempted to hook up a spare drive and just do a fresh OS install to see how it goes.


----------



## Alastair

What should a pair of Furys get under load when custom cooled with 620mm of rad? I am running about 44°C on both cards. My water delta-T seems a bit high.


----------



## ht_addict

Quote:


> Originally Posted by *Alastair*
> 
> What should a pair of Furys get under load when custom cooled with 620mm of rad? I am running about 44°C on both cards. My water delta-T seems a bit high.


My dual Fury Xs idle in the mid 20s and game at mid to upper 30s.


----------



## u3a6

Quote:


> Originally Posted by *kfxsti*
> 
> Thank you so much !!! About to start on both cards right now!!!


IMHO avoid even touching the interposer...


----------



## Alastair

Quote:


> Originally Posted by *ht_addict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What should a pair of Furys get under load when custom cooled with 620mm of rad? I am running about 44°C on both cards. My water delta-T seems a bit high.
> 
> 
> 
> My dual Fury Xs idle in the mid 20s and game at mid to upper 30s.

how much rad?


----------



## Alastair

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kfxsti*
> 
> Thank you so much !!! About to start on both cards right now!!!
> 
> 
> 
> IMHO avoid even touching the interposer...

The interposer isn't that fragile. Cleaning it with earbuds (Q-tips) and toothpicks is alright if you are careful enough. I've disassembled and reassembled my two cards three times now, so that's six times I've had reason to clean around and on the interposer with no ill effects.


----------



## u3a6

Quote:


> Originally Posted by *Alastair*
> 
> The interposer isn't that fragile. Cleaning it with earbuds (Q-tips) and toothpicks is alright if you are careful enough. I've disassembled and reassembled my two cards three times now, so that's six times I've had reason to clean around and on the interposer with no ill effects.


Afaik the Sapphire cards (at least the Nitro) have some plastic film over the interposer for protection (according to buildzoid). The others are a little bit more fragile (I don't know which cards you or kfxsti own, so I just recommend the safest option). Just wanted to make sure kfxsti did not end up with bricked cards from caring too much about getting that interposer shiny clean.


----------



## gupsterg

Yeah, the whole interposer thing has deterred me from doing a TIM swap. From buildzoid's comments and the few videos I watched, the tension on the stock cross bracket is also something that worries me for reinstallation.


----------



## Alastair

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> The interposer isn't that fragile. Cleaning it with earbuds (Q-tips) and toothpicks is alright if you are careful enough. I've disassembled and reassembled my two cards three times now, so that's six times I've had reason to clean around and on the interposer with no ill effects.
> 
> 
> 
> Afaik the Sapphire cards (at least the Nitro) have some plastic film over the interposer for protection (according to buildzoid). The others are a little bit more fragile (I don't know which cards you or kfxsti own, so I just recommend the safest option). Just wanted to make sure kfxsti did not end up with bricked cards from caring too much about getting that interposer shiny clean.

I took off that plastic film, as thermal paste was getting stuck underneath it. But from what I can see, the interposer also has a glue-like substance over it protecting it as well.


----------



## kfxsti

Quote:


> Originally Posted by *u3a6*
> 
> Afaik the Sapphire cards (at least the Nitro) have some plastic film over the interposer for protection (according to buildzoid). The others are a little bit more fragile (I don't know which cards you or kfxsti own, so I just recommend the safest option). Just wanted to make sure kfxsti did not end up with bricked cards from caring too much about getting that interposer shiny clean.


I have two Sapphire Tri-X OCs. I noticed one card has plastic over the interposer, but one didn't. I have pics of the process if anyone wants them.


----------



## u3a6

Quote:


> Originally Posted by *kfxsti*
> 
> I have two Sapphire Tri-X OC's. I noticed one card has plastic over the interposer.. but one didn't. I have pics of the process if anyone wants them.


Sure, post them!


----------



## Alastair

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kfxsti*
> 
> I have two Sapphire Tri-X OC's. I noticed one card has plastic over the interposer.. but one didn't. I have pics of the process if anyone wants them.
> 
> 
> 
> Sure, post them!


----------



## kfxsti

Quote:


> Originally Posted by *Alastair*


Ninja'd lol. I'll post mine here shortly. The little ones are wanting to play outside for a bit, so Daddy time first lol.


----------



## lanofsong

Hey AMD R9 Radeon Fury/ Fury X/Nano/Pro DUO Fiji owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus; needs a valid email address):
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name), your passkey, and the Team OCN number: 37726

later
lanofsong


----------



## Alastair

How do I know if Radeon Chill is working? I can't seem to find anything in Wattman.


----------



## Performer81

Open Wattman and scroll to the bottom; the Chill option is there. It also does not work in all games.
You can set min and max FPS. If you don't move, FPS should drop to the min in-game.


----------



## Alastair

Quote:


> Originally Posted by *Performer81*
> 
> Open Wattman and scroll to the bottom; the Chill option is there. It also does not work in all games.
> You can set min and max FPS. If you don't move, FPS should drop to the min in-game.


Nothing there for me. Maybe it's disabled for crossfire setups?


----------



## gupsterg

Luv'in the Fiji FPS data in these charts. Are we gonna see more games like this? AMD FineWine™ ...


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> Luv'in the Fiji FPS data in these charts. Are we gonna see more games like this? AMD FineWine™ ...


Wow, almost catching up to the 1080 in 4K... passing up the 1070 quite a bit too... Too bad I don't play these games... Wish DX11 could somehow get the magical love also; most of what I play lately is DX11.

Either way good to see it moving along ...


----------



## ressonantia

Quote:


> Originally Posted by *Alastair*
> 
> Nothing there for me. Maybe disabled for crossfire set ups?


Chill seems to be incompatible with crossfire at the moment.
Quote:


> Originally Posted by *gupsterg*
> 
> Luv'in the Fiji FPS data in these charts. Are we gonna see more games like this? AMD FineWine™ ...


I ran the Division benchmark in both DX12 and DX11, and honestly they were equal in performance with my Nano. I did get roughly the same FPS as those tests, though, so /shrug.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> Wow, catching up to 1080 almost in 4k.... Passing up 1070 quite a bit too... Too bad I don't play these games... Wish dx11 could somehow get magical love also.. most of what I play lately is dx11.
> 
> Either way good to see it moving along ...


Yeah, sweet.

I was viewing this TPU review of a GTX 1080; the Fury/X is on the v16.10.1 driver in that review. Take the 1440p data for BF1, Arkham Knight, COD, Deus Ex, Doom, F1 2016, Far Cry Primal, Hitman, JC3, Mafia III, Rainbow Six, ROTTR, SW2, TW3 and TWW: the Fury/X isn't doing too badly at all against a GTX 1070.

Considering I saw a deal for an XFX Fury Triple Dissipation today at £240, compared with a GTX 1070 at £319 (which is 33% more), and there's a premium for a G-Sync monitor, Fiji isn't bad at all.


----------



## Kana-Maru

Quote:


> Originally Posted by *dagget3450*
> 
> Wow, catching up to 1080 almost in 4k.... Passing up 1070 quite a bit too... Too bad I don't play these games... Wish dx11 could somehow get magical love also.. most of what I play lately is dx11.
> 
> Either way good to see it moving along ...


Check out my Fury X @ stock - 4K 100% graphical settings + ReLive Drivers results:


https://www.reddit.com/r/5ii3uh/relive_16121_fury_x_4k_more_games_benchmarked/

There is a TL;DR there as well; I believe it's worth reading. Next I'll be reviewing ReLive at 1440p, since quite a few people want to see some 1440p results.


----------



## xkm1948

TPU's reviews are already favoring Nvidia in a subtle way. I would take a W1zzard review with a large grain of salt.


----------



## bluezone

Patch Tuesday: ReLive 16.12.2

Patch notes:
http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-16.12.2-Release-Notes.aspx

Win 7 64:
http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

Win 10 64:
http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#


----------



## Tgrove

Quote:


> Originally Posted by *xkm1948*
> 
> Is undervolting FuryX a thing? I am gonna try undervolting at stock frequency.


Quote:


> Originally Posted by *Kana-Maru*
> 
> Check out my Fury X @ stock - 4K 100% graphical settings + ReLive Drivers results:
> 
> 
> https://www.reddit.com/r/5ii3uh/relive_16121_fury_x_4k_more_games_benchmarked/
> 
> There is a TL;DR there as well. It's worth reading I believe. Next I'll be reviewing ReLive and 1440p since quite a few people want to see some 1440p results.


This was a great review. Dat AMD FineWine technology.


----------



## Tgrove

Quote:


> Originally Posted by *bluezone*
> 
> Patch Tuesday: ReLive 16.12.2
> 
> Patch notes:
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-16.12.2-Release-Notes.aspx
> 
> Win 7 64:
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> Win 10 64:
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64#


Damn, I knew something with those last drivers ****ed my wifi up.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Kana-Maru*
> 
> Check out my Fury X @ stock - 4K 100% graphical settings + ReLive Drivers results:
> 
> 
> https://www.reddit.com/r/5ii3uh/relive_16121_fury_x_4k_more_games_benchmarked/
> 
> There is a TL;DR there as well. It's worth reading I believe. Next I'll be reviewing ReLive and 1440p since quite a few people want to see some 1440p results.


THX @Kana for this


----------



## Kana-Maru

Quote:


> Originally Posted by *Tgrove*
> 
> This was a great review dat amd finewine technology


Thanks. AMD has been on top of their drivers. It feels great to gain FPS throughout the life of the GPU instead of depending solely on overclocking.

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> THX @Kana for this


No problem at all. Someone has to continue to show the Fury X some love out here. If I had more GPUs I would bench more, but this is all I have. Heading towards its second year, the Fury X is still doing very well. The Fury X was marketed as a 4K GPU; it looks like Vega might be a beast at 4K.


----------



## bluezone

Quote:


> Originally Posted by *Kana-Maru*
> 
> Thanks. AMD has been on top of their drivers. It feels great to gain FPS throughout the life of the GPU instead of depending solely on overclocking.
> No problem at all. Someone has to continue to show the Fury X some love out here. If I had more GPUs I would bench more, but this is all I have. Heading towards its second year, the Fury X is still doing very well. The Fury X was marketed as a 4K GPU; it looks like Vega might be a beast at 4K.


Yes, very nice write-up.

I know I'm a little early, but HAPPY HOLIDAYS everyone.


----------



## Charcharo

My R9 Fury Nitro OC (1020MHz) from Sapphire's editors contest arrived yesterday!
Now to join this cool club too.

BTW people, if I run stock or at most 1050MHz on the core (+30) and undervolt the GPU, is there a point in even using the 300W BIOS? I just don't know whether it will help in gaming scenarios.


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> Next I'll be reviewing ReLive and 1440p since quite a few people want to see some 1440p results.


Looking forward to that, and I most definitely enjoyed reading your 4K article.
Quote:


> Originally Posted by *Charcharo*
> 
> My R9 Fury Nitro OC (1020MHz) from Sapphire's editors contest arrived yesterday!
> Now to join this cool club too.
> 
> BTW people, if I run stock or at most 1050MHz on the core (+30) and undervolt the GPU, is there a point in even using the 300W BIOS? I just don't know whether it will help in gaming scenarios.


Nice result on the contest win. If you use those clocks and undervolt I think you'll be OK; if there's an issue, a BIOS mod can sort it.
Quote:


> Originally Posted by *bluezone*
> 
> I know I'm a little early but, HAPPY HOLIDAYS everyone .


You too mate, and all members.


----------



## Ne01 OnnA

HAPPY HOLIDAYS to Everyone !


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Kana-Maru*
> 
> Thanks. AMD has been on top of their drivers. It feels great to gain FPS throughout the life of the GPU instead of depending solely on overclocking.
> No problem at all. Someone has to continue to show the Fury X some love out here. If I had more GPUs I would bench more, but this is all I have. Heading towards its second year, the Fury X is still doing very well. The Fury X was marketed as a 4K GPU; it looks like Vega might be a beast at 4K.


It's a very good GPU.

Besides, it is new tech after all:
2.5D stack with HBM
Small, with a good perf/W/FPS ratio (on my setup I've managed 149-220W at 1920x1440)

Happy Holidays to you


----------



## Performer81

Does HWiNFO64 read out VRM temps correctly? I doubt it. It tells me that my VRM temps are always about the same as my GPU temp, mostly 1-2 degrees higher.
I have a temp limit of 65 degrees and my VRMs never went above 66. Shouldn't they be much higher?
(XFX Fury TD)


----------



## bluezone

Quote:


> Originally Posted by *Performer81*
> 
> Does HWiNFO64 read out VRM temps correctly? I doubt it. It tells me that my VRM temps are always about the same as my GPU temp, mostly 1-2 degrees higher.
> I have a temp limit of 65 degrees and my VRMs never went above 66. Shouldn't they be much higher?
> (XFX Fury TD)


It's partially because you have the Temperature Limit at 65°C. Roughly, setting a limit sets the point of active downclocking to maintain temperatures, so power draw is reduced from high clocks, and the increased leakage due to higher temperatures becomes a smaller factor. This should lead to lower stress on the VRMs.


----------



## gupsterg

I think Performer81 has changed the target GPU temperature in the cooling profile of the ROM so the cooling solution maintains that temp on the GPU, as a few posts back I highlighted how to mod the ROM fan profile. In a roundabout way that is helping his VRM temperature. I do not believe he has changed the GPU throttle temperature, the one which would throttle clocks to keep the GPU at x temperature.

I believe HWiNFO does show the correct VRM temperature. I had a Fury Tri-X which, besides the fan/shroud/backplate, uses the same heatsink as the XFX Fury. The temps are better due to how the VRM is cooled independently of the GPU, IMO. IIRC the plate is independent of the GPU cooling.

Look through the side of the cooler. IIRC the mosfet section is not connected to the main heatpipes, or uses its own set. I noted higher GPU VRM temps on the Fury X compared with the Tri-X, as the coolant on the Fury X flows RAD > GPU block > GPU VRM > RAD.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> I think Performer81 has changed the target GPU temperature in the cooling profile of the ROM so the cooling solution maintains that temp on the GPU, as a few posts back I highlighted how to mod the ROM fan profile. In a roundabout way that is helping his VRM temperature. I do not believe he has changed the GPU throttle temperature, the one which would throttle clocks to keep the GPU at x temperature.


The Temperature Limit slider and the GPU throttle temperature (in the ROM) are one and the same. I discovered this a while back but forgot to mention it.

EDIT: meaning they both have the same effect.

Sorry, I meant GPU throttle, not GPU target.


----------



## Performer81

The throttle temp was untouched. The BIOS is unmodded again (Tri-X OC BIOS on my XFX Fury); the temp limit was a little annoying. I tested Unigine for 15 min with stock voltage/clock/fan settings. Temps are still very close together.

I wonder why there's no VRM temp monitoring in GPU-Z.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> Temperature Limit slider and GPU target (in rom) are one and the same. I discovered this a while back but forgot to mention it.
> 
> EDIT: meaning they both have the same effect.


The target GPU temperature for the cooling solution, aka "fuzzy logic/advanced" fan mode, is not exposed in OverDrive/WattMan.



Below is the stock Fury X PowerPlay > Fan table translation.

Code:

08   UCHAR   ucRevId;                                /* Change this if the table format changes or version changes so that the other fields are not the same. */
03 3C   UCHAR   ucTHyst;                                /* Temperature hysteresis. Integer. */
0FA0 40C        USHORT  usTMin;                         /* The temperature, in 0.01 centigrades, below which we just run at a minimal PWM. */
1770 60C        USHORT  usTMed;                         /* The middle temperature where we change slopes. */
1F40 80C        USHORT  usTHigh;                        /* The high point above TMed for adjusting the second slope. */
1388 50%        USHORT  usPWMMin;                       /* The minimum PWM value in percent (0.01% increments). */
1D4C 75%        USHORT  usPWMMed;                       /* The PWM value (in percent) at TMed. */
2710 100%       USHORT  usPWMHigh;                      /* The PWM value at THigh. */
2134 85C        USHORT  usTMax;                         /* The max temperature */
01      UCHAR   ucFanControlMode;                       /* Legacy or Fuzzy Fan mode */
0064 100%       USHORT  usFanPWMMax;                    /* Maximum allowed fan power in percent */
12E4 4836       USHORT  usFanOutputSensitivity;         /* Sensitivity of fan reaction to temperature changes */
0898 2200       USHORT  usFanRPMMax;                    /* The default value in RPM */
00011AD0 724MHz ULONG  ulMinFanSCLKAcousticLimit;       /* Minimum Fan Controller SCLK Frequency Acoustic Limit. */
41 65C  UCHAR   ucTargetTemperature;                    /* Advanced fan controller target temperature. */
0F 15%  UCHAR   ucMinimumPWMLimit;                      /* The minimum PWM that the advanced fan controller can set.    This should be set to the highest PWM that will run the fan at its lowest RPM. */
0064 100        USHORT  usFanGainEdge;
0064 100        USHORT  usFanGainHotspot;
0064 100        USHORT  usFanGainLiquid;
0064 100        USHORT  usFanGainVrVddc;
0064 100        USHORT  usFanGainVrMvdd;
0064 100        USHORT  usFanGainPlx;
0064 100        USHORT  usFanGainHbm;
00000000000000000000    USHORT  usReserved;

usTMax is exposed in OverDrive/WattMan from PowerPlay > Fan table.

ucTargetTemperature is not exposed in OverDrive/WattMan from PowerPlay > Fan table.
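For reference, the USHORT temperature/PWM fields above are stored in 0.01-unit increments, so they are easy to decode. A minimal Python sketch, using the hex values copied from the stock table dump above (note the ROM itself stores these fields little-endian, per the usual ATOM BIOS convention, while the dump shows them big-endian):

```python
# Decode the 0.01-unit temperature/PWM fields from the stock Fury X
# PowerPlay fan table shown above.

def centi(raw):
    """Convert a raw field in 0.01 units (0.01 C or 0.01 %)."""
    return raw / 100.0

# Raw values as listed in the table dump above.
FAN_TABLE = {
    "usTMin":    0x0FA0,  # -> 40.00 C
    "usTMed":    0x1770,  # -> 60.00 C
    "usTHigh":   0x1F40,  # -> 80.00 C
    "usPWMMin":  0x1388,  # -> 50.00 %
    "usPWMMed":  0x1D4C,  # -> 75.00 %
    "usPWMHigh": 0x2710,  # -> 100.00 %
    "usTMax":    0x2134,  # -> 85.00 C
}

for name, raw in FAN_TABLE.items():
    print(f"{name:10s} raw=0x{raw:04X} -> {centi(raw):.2f}")
```

The UCHAR fields (ucTHyst, ucTargetTemperature, ucMinimumPWMLimit) are plain integers, which is why `41` reads directly as 65°C.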


----------



## gupsterg

@Performer81



Temps seem right to me.

Fury Tri-X stock ROM - 3DM FS demo loop 30min



Fury Tri-X custom ROM 1090MHz @ DPM 7 1.250V - 17.5hrs Folding@home


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Target GPU temperature for cooling solution aka "Fuzzy logic/Advance" fan mode is not exposed in OverDrive/WattMan.
> 
> 
> 
> Below is stock Fury X PowerPlay > Fan table translation.
> 
> Code:
> 
> 
> 
> 
> 
> 08   UCHAR   ucRevId;                                /* Change this if the table format changes or version changes so that the other fields are not the same. */
> 03 3C   UCHAR   ucTHyst;                                /* Temperature hysteresis. Integer. */
> 0FA0 40C        USHORT  usTMin;                         /* The temperature, in 0.01 centigrades, below which we just run at a minimal PWM. */
> 1770 60C        USHORT  usTMed;                         /* The middle temperature where we change slopes. */
> 1F40 80C        USHORT  usTHigh;                        /* The high point above TMed for adjusting the second slope. */
> 1388 50%        USHORT  usPWMMin;                       /* The minimum PWM value in percent (0.01% increments). */
> 1D4C 75%        USHORT  usPWMMed;                       /* The PWM value (in percent) at TMed. */
> 2710 100%       USHORT  usPWMHigh;                      /* The PWM value at THigh. */
> 2134 85C        USHORT  usTMax;                         /* The max temperature */
> 01      UCHAR   ucFanControlMode;                       /* Legacy or Fuzzy Fan mode */
> 0064 100%       USHORT  usFanPWMMax;                    /* Maximum allowed fan power in percent */
> 12E4 4836       USHORT  usFanOutputSensitivity;         /* Sensitivity of fan reaction to temperature changes */
> 0898 2200       USHORT  usFanRPMMax;                    /* The default value in RPM */
> 00011AD0 724MHz ULONG  ulMinFanSCLKAcousticLimit;       /* Minimum Fan Controller SCLK Frequency Acoustic Limit. */
> 41 65C  UCHAR   ucTargetTemperature;                    /* Advanced fan controller target temperature. */
> 0F 15%  UCHAR   ucMinimumPWMLimit;                      /* The minimum PWM that the advanced fan controller can set.    This should be set to the highest PWM that will run the fan at its lowest RPM. */
> 0064 100        USHORT  usFanGainEdge;
> 0064 100        USHORT  usFanGainHotspot;
> 0064 100        USHORT  usFanGainLiquid;
> 0064 100        USHORT  usFanGainVrVddc;
> 0064 100        USHORT  usFanGainVrMvdd;
> 0064 100        USHORT  usFanGainPlx;
> 0064 100        USHORT  usFanGainHbm;
> 00000000000000000000    USHORT  usReserved;
> 
> usTMax is exposed in OverDrive/WattMan from PowerPlay > Fan table.
> 
> ucTargetTemperature is not exposed in OverDrive/WattMan from PowerPlay > Fan table.


I believe Performer81 is referring to the temperature slider in OverDrive/WattMan, but I could be wrong. The Temperature Limit slider is in the Crimson OverDrive control panel; in the ReLive driver it is simply referred to as Temperature. It has an Auto/Manual toggle with a slider. In either case, the slider's upper limit temperature matches the GPU throttle setting in the Cooling portion of the BIOS. Adjusting the slider affects throttle temperature just the same as setting GPU throttle in the BIOS.

EDIT: @Performer81 which do you mean Temperature limit or Target GPU temperature?

That should have been GPU throttle in my earlier post. Damn cold meds.


----------



## Performer81

I set the temp limit in WattMan. I also had a temp limit in the BIOS but I removed it because it was unnecessary. It means that the fans try to hold the temp at this limit. It has nothing to do with throttling; the card never throttles.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I believe Performer81 is referring to the temperature slider in OverDrive/WattMan, but I could be wrong. The Temperature Limit slider is in the Crimson OverDrive control panel; in the ReLive driver it is simply referred to as Temperature. It has an Auto/Manual toggle with a slider. In either case, the slider's upper limit temperature matches the GPU throttle setting in the Cooling portion of the BIOS. Adjusting the slider affects throttle temperature just the same as setting GPU throttle in the BIOS.
> 
> EDIT: @Performer81 which do you mean Temperature limit or Target GPU temperature?


No idea which he is referring to, and I agree with your above explanation. Due to what he had asked recently relating to modding the cooling in the ROM, I'd assumed that was what had shown him the temperature discrepancy. After he posted the HWiNFO screenshot, I think he was not aware HWiNFO shows GPU and HBM VRM temperature independently; it fits with how he was highlighting a VRM temp not far above/close to/below the GPU temp.


----------



## Performer81

No, I just wondered why GPU and VRM temps were so close together. In reviews with a thermal probe, VRMs were at like 100 degrees and the GPU at 75. I thought that even with a 65 degree temp limit the VRMs should be at 90 somewhere.


----------



## gupsterg

I think you're stating temps from the LegitReviews article?


----------



## Performer81

http://www.tomshardware.de/sapphire-amd-radeon-r9-fury-tri-x,testberichte-241864-7.html


----------



## gupsterg

Hmmm, no real explanation why their temps are so high. The increased-PL ROMs I have seen from TPU / Sapphire / owners have a Target GPU temperature of 80°C, and the other switch position has 75°C. These profiles would mean very slow running fans, which could lead to higher VRM temps; perhaps the earlier cards had an even higher Target GPU temperature.
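To see why a high target means slow fans, here is a rough sketch of the two-slope legacy curve implied by the TMin/TMed/THigh and PWMMin/Med/High fields from the fan table posted earlier. This is my own approximation, not AMD's firmware logic; the real controller also applies hysteresis, the fuzzy-logic target mode, and the acoustic SCLK limit.

```python
def legacy_pwm(temp_c,
               tmin=40.0, tmed=60.0, thigh=80.0,
               pwm_min=50.0, pwm_med=75.0, pwm_high=100.0):
    """Approximate legacy-mode fan PWM (%) at a GPU temperature (C),
    using the stock Fury X fan table break points as defaults."""
    if temp_c <= tmin:
        return pwm_min  # below TMin: run at minimal PWM
    if temp_c <= tmed:
        # first slope: TMin..TMed
        return pwm_min + (pwm_med - pwm_min) * (temp_c - tmin) / (tmed - tmin)
    if temp_c <= thigh:
        # second slope: TMed..THigh
        return pwm_med + (pwm_high - pwm_med) * (temp_c - tmed) / (thigh - tmed)
    return pwm_high  # above THigh: pinned at PWMHigh

print(legacy_pwm(50.0))  # midway up the first slope
```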

Anyhow, I received another Fury X today which I'm gonna test for OC ability. I'll remove the panels, stick a temperature probe on the rear of the PCB where the VRM is, and compare the temps with HWiNFO.


----------



## Kana-Maru

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> It's a Very Good GPU
> 
> Besides it is New Tek after all:
> 2.5D Stack with HBM
> Small and it has good Perf->tW->FPS ratio (on mY Setup i've managed to get 149-220tW at 1920:1440)
> 
> Happy HolyDays for you


I agree that the Fury X is a very good card. Happy Holidays to you as well.


----------



## Charcharo

I am surprised at just how quiet the Fury Nitro is.

I can barely even hear it. A lot quieter than my (otherwise excellent) R9 390 PCS+.


----------



## kfxsti

Quote:


> Originally Posted by *u3a6*
> 
> Sure, post them!


I haven't forgotten, man. Had a call over the weekend and had to take a 5 hour drive to work on some things; I'm on the way back home now. I'll post the pics tomorrow, as I still have the cards out and taken apart.


----------



## Alastair

Anyone had any strange stability issues with the new ReLive drivers, both the hotfix and the earlier WHQL ones? I seem to be getting blue screens, all of them while browsing the web (mostly this website) in Firefox. This is a fresh install of Windows as well.


----------



## Kana-Maru

Quote:


> Originally Posted by *Alastair*
> 
> Anyone had any strange stability issues with the new relive drivers? both the hotfix and earlier WHQL ones? I seem to be getting blue screens. All of them while browsing the Web (mostly this website) on Firefox? This is a fresh install of Windows as well.


I'm having no issues like that, and I use Firefox as well. I also use multiple browsers at the same time while gaming sometimes. I'm on Win 10 Pro. Have you tried disabling all of your add-ons to ensure there isn't some sort of conflict? If not, you might want to try that and re-enable the add-ons slowly.


----------



## defyoddz

I've found a new R9 Fury for $260. With the new drivers this seems like the best dollar/performance card for 1440p gamers, so why would anyone buy a 480 at the same price, or am I missing something?


----------



## ht_addict

Quote:


> Originally Posted by *Alastair*
> 
> how much rad?


Dual 360's


----------



## Tobiman

Quote:


> Originally Posted by *defyoddz*
> 
> ive found a new r9 fury for $260, with the new drivers this seems like the best dollar/performance card for 1440p gamers, why would anyone buy a 480 at the same price or am i missing something?


The Fury line-up isn't marketed well enough at this point with Vega on the way. Also, review sites emphasize fps over everything else.


----------



## diggiddi

Quote:


> Originally Posted by *defyoddz*
> 
> ive found a new r9 fury for $260, with the new drivers this seems like the best dollar/performance card for 1440p gamers, why would anyone buy a 480 at the same price or *am i missing something?*


No, you are on point


----------



## shadowxaero

Soooooo I snagged an XFX Fury X off craigslist for 200 bucks...three days later.




TimeSpy GPU Score 11,197
http://www.3dmark.com/spy/930147

FireStrike Ultra GPU Score 8393
http://www.3dmark.com/spy/930147

FireStrike GPU Score 35,466
http://www.3dmark.com/fs/11175155


----------



## TheHorse

Quote:


> Originally Posted by *shadowxaero*
> 
> Soooooo I snagged an XFX Fury X off craigslist for 200 bucks...three days later.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> TimeSpy GPU Score 11,197
> http://www.3dmark.com/spy/930147
> 
> FireStrike Ultra GPU Score 8393
> http://www.3dmark.com/spy/930147
> 
> FireStrike GPU Score 35,466
> http://www.3dmark.com/fs/11175155


?


----------



## shadowxaero

Quote:


> Originally Posted by *TheHorse*
> 
> ?


Lol I already had a Sapphire Fury so now I am running crossfire.


----------



## gupsterg

Sweet rig & scores, hard tubing always looks so fantastic. If I ever WC, I'd luv to be able to use that kinda tubing; did it take much practice to get right?

Merry Christmas.


----------



## shadowxaero

Quote:


> Originally Posted by *gupsterg*
> 
> Sweet rig & scores, hard tubing always looks so fantastic. If I ever WC, I'd luv to be able to use that kinda tubing; did it take much practice to get right?
> 
> Merry Christmas.


Thanks! And Merry Christmas.

This is my first hard tubing build (and the first time I had to redo it, in order to add in the second GPU).

Overall it wasn't too hard to get the bends right. I am working with PETG tubing though, so one tube was a bit tricky: going from my CPU to the top radiator, I had to reheat and re-bend as I was about half a centimeter off. Other than that it went pretty smoothly, just kind of eyeballing where to bend.

The hardest part was getting the rubber-like insert into the tubes before you heat and bend, so the tube doesn't collapse on you lol.


----------



## gupsterg

Cheers for the info. I know we've spoken in the past on the forum, but I forget: did your Fury Tri-X unlock cores?


----------



## jassilamba

Hey guys, I can get a Sapphire-branded R9 Fury X (reference card) for 250 new. I have heard a lot of good things about the new AMD drivers; how would you rate the card's performance on a 144Hz 1440p monitor with the latest drivers? It will be a LAN rig, so I think having a really small footprint card would be really, really nice.


----------



## Alastair

Quote:


> Originally Posted by *jassilamba*
> 
> Hey guys, I can get a Sapphire branded R9 Fury X (reference card) for 250 new. I know I have heard a lot of good things about the new AMD drivers, how would you guys rate the cards performance for a 144hz 1440P monitor with the latest drivers? It will be a LAN rig, so I think having a really small footprint card would be really really nice.


They are very solid performers at 1440p.


----------



## Alastair

Quote:


> Originally Posted by *ht_addict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> how much rad?
> 
> 
> 
> Dual 360's

You have 100mm more rad than I do, so I guess I could expect you to have lower temps than me.


----------



## Thoth420

Aside from my performance issue (which I believe is singular), where my fps drops to 2-5 fps for a period of 10 to 15 seconds in almost any game about 1% of the time, the performance is amazing. I still haven't pinned down what is causing it, but it is not normal for a Fury X, as users in this thread can attest. I am about ready to start from scratch with a new system and slowly add hardware from this one until I can figure out what is causing it. I am pretty sure I have ruled out software, but it could be something not playing well with W10 (never used it prior to this rig). I don't think I can even go back to 7 with the Skylake either if the OS is the problem; I cannot get it to install on this system for the life of me.

I also have limited experience, and I think I bit off way more than I can chew with this system (still learning how to configure DDR4 RAM properly, and getting used to this half awesome, half bloatware-nightmare OS; want to love it but kind of hate it). So, more my mistake. I miss my SB rig; it was rock solid stable, and the CPU was an ES that clocked to 5.0 on air.

TLDR: Fury X is a fantastic deal for running 1440p. The small form factor is great as well.


----------



## shadowxaero

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers for the info. I know we've spoken in the past on the forum, but I forget: did your Fury Tri-X unlock cores?


Yeah, it's partially unlocked, 3840 SPs I believe.


----------






## Flamingo

Who wants to hear your GPU make weird ass sounds?

Download GPU Caps Viewer and run the Vulkan demos:

http://www.geeks3d.com/20161107/gpu-caps-viewer-1-32-0-released/

My Nano made so many different sounds lol


----------



## u3a6

Quote:


> Originally Posted by *Flamingo*
> 
> Who wants to hear your GPU make weird ass sounds?
> 
> Download GPU Caps Viewer and run the Vulkan demos:
> 
> http://www.geeks3d.com/20161107/gpu-caps-viewer-1-32-0-released/
> 
> My Nano made so many different sounds lol


It was screaming with joy at 4000 fps!


----------



## neurotix

Bumping this up.


----------



## Alastair

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Flamingo*
> 
> Who wants to hear your GPU make weird ass sounds?
> 
> Download GPU Caps viewer and run the Vulcan demons
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.geeks3d.com/20161107/gpu-caps-viewer-1-32-0-released/
> 
> My Nano made so many different sounds lol
> 
> 
> 
> It was screaming of joy at 4000 fps!

I would like to try running this full screen on my two cards overnight and see if it doesn't help lessen the coil whine at all (not that it has ever bothered me).


----------



## damarad21

Quote:


> Originally Posted by *defyoddz*
> 
> ive found a new r9 fury for $260, with the new drivers this seems like the best dollar/performance card for 1440p gamers, why would anyone buy a 480 at the same price or am i missing something?


There are some people who think 4GB is not, or will not be, enough to run games in good conditions, as it seems 6/8GB will be the new standard. That's the reason why many users prefer the RX 480. In my case, I got a Fury Nitro a month ago for a good price, and still have the option to send it back and exchange it for another card. Although I think, for that price, the Fury is the best at this time.


----------



## shadowxaero

Quote:


> Originally Posted by *damarad21*
> 
> Originally Posted by defyoddz View Post
> 
> ive found a new r9 fury for $260, with the new drivers this seems like the best dollar/performance card for 1440p gamers, why would anyone buy a 480 at the same price or am i missing something?
> 
> There are some people thinking 4gb it is not/will be not enough to run games in good conditions as it seems now on 6/8gb will be the new standard. That's the reason why many users do prefer rx480. In my case a got a fury Nitro 1 month ago for a good price, still have the choice to send it back and change for another card. Although I think for that price, fury is the best in this time..


Well, look at it this way: even with the 480 having 8GB of vRAM, it doesn't have the raw horsepower the Fury does and won't handle 1440p as well because of it. As for vRAM, I don't really run out at 1440p; at 4K vRAM does become somewhat of an issue, but 4GB of HBM is pretty similar to 6GB of GDDR5, at least when comparing to my friend's 980 Ti. But for 260, get the second Fury and Xfire them.

TBH most new titles don't have very good Xfire support, but for the ones that do, dual Furys are fantastic. And hopefully as DX12 is adopted, more devs will include multi-GPU support. I mean, we have Ashes, Tomb Raider and Deus Ex with great DX12 multi-GPU support, so we may be off to a good start of Xfire and SLI being worth it.


----------



## Charcharo

Well I did encounter stutter at 1440P on my Fury when compared to my R9 390. On DOOM with Nightmare settings and in Titanfall 2 with the Insane Texture preset.

Anyways, apart from those 2 issues, my experience so far:

Now the first thing that made an impression on me was just how damn big this GPU is. My huge R9 390 PCS+ looked... somewhat small next to it. That is good for me, I like big tri-cooler video cards a lot. To my subjective aesthetic tastes, this and the Tri-X Fury are the best looking cards out there.

So now my machine is a lot better looking. The other somewhat freaky thing is how quiet it is... now the R9 390 was already quiet, especially if you make your own fan profile... but this thing is absurd. After 20 runs of Metro's benchmark at max settings (bar SSAA) at 1440p, it was still almost inaudible. I am impressed by this incarnation of the Tri-X cooler.

As for game improvements... minimal. Anything less than a 100% increase in performance is barely worthy of my time, but since this was free and looks so damn awesome I did go ahead and do it!

So here they are.

Settings:
AA: Application
Tex Filtering = High
Surface Format = On
Shader cache = AMD optimized
Tess: Application

All games are at 2560x1440 and Ultra unless stated otherwise. Both cards are at factory OC settings.
R9 390 : 1010MHz, R9 Fury = 1020MHz.

Metro Last Light Redux.
+PhysX, no SSAA

R9 390 : 51.7 fps average, 22 min

R9 Fury: 65.45 fps average, 31 min

Clear Sky DX10.1
A-Tested AA 4x

R9 390:
Day 34-58
Rain - 33-62
Sun Rays 27-45

R9 Fury:
Day 41 - 69
Rain 38 - 76
Sun Rays 33 - 56

DOOM
Nightmare settings, above Ultra

R9 390: 85.5

R9 Fury: 108.5

The Talos Principle
Beyond Ultra, Custom settings

R9 390: 55.6

R9 Fury: 87.3 FPS

Witcher 3:
Skellige Custom Benchmark Run

R9 390:
2016-12-11 13:09:00 - witcher3 Ultra without Hairworks
Frames: 3291 - Time: 80781ms - Avg: 40.740 - Min: 30 - Max: 52

2016-12-11 13:21:09 - witcher3 Hairworks and App settings
Frames: 2233 - Time: 80312ms - Avg: 27.804 - Min: 17 - Max: 34

2016-12-11 13:27:18 - witcher3 Hairworks and AMD optimized
Frames: 2679 - Time: 76875ms - Avg: 34.849 - Min: 27 - Max: 44

R9 Fury:
2016-12-22 16:24:23 - witcher3 Ultra, No Hairworks
Frames: 3911 - Time: 76094ms - Avg: 53.397 - Min: 40 - Max: 62

2016-12-22 16:27:47 - witcher3 Ultra with HW and 8xAA
Frames: 3268 - Time: 76484ms - Avg: 43.728 - Min: 34 - Max: 53

2016-12-22 16:32:06 - witcher3 Ultra with HW and 4xAA
Frames: 3351 - Time: 75812ms - Avg: 44.201 - Min: 36 - Max: 56

*Fury was always at Application settings ... the improved tessellation helps it, I guess.
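For anyone curious, the relative uplift in the averages above works out to roughly 26-57% depending on the title. A quick sketch (labels mine, numbers copied from the results listed):

```python
def uplift(base, new):
    """Percent fps gain of `new` over `base`."""
    return (new / base - 1) * 100

# (R9 390 avg, R9 Fury avg) from the benchmark runs above.
results = {
    "Metro LL Redux":      (51.7, 65.45),
    "DOOM (Nightmare)":    (85.5, 108.5),
    "The Talos Principle": (55.6, 87.3),
    "Witcher 3 (no HW)":   (40.740, 53.397),
}
for game, (r390, fury) in results.items():
    print(f"{game:20s} +{uplift(r390, fury):.1f}%")
```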


----------



## damarad21

Quote:


> Originally Posted by *shadowxaero*
> 
> Well look at it this way, even with the 480 having 8GB of vRAM it doesn't have to raw horsepower the Fury does and wont handle 1440p as well because of it. As for vRAM I don't really run out at 1440p, now at 4K vRAM does become somewhat of an issue but 4GB of HBM is pretty similar to 6GB of GDDR5, at least when comparing to my friends 980ti. But for 260 get the second fury and Xfire them.
> 
> TBH most new titles don't have very good Xfire Support, but for the ones that do, dual Fury's are fantastic. And hopefully as dx12 is adopted more devs include multiGPU support. I mean we have Ashes, Tomb Raider and Deus Ex with great dx12 multigpu support so we may be off to a good start of xfire and SLI being worth it.


As far as I can see, the Fury is a great chip with good power, but it will suffer if textures don't have enough memory; it does not matter that the memory is HBM. The question is whether, next year, when we set textures to high, the Fury will still do the work (crossing fingers that it will only have issues at ultra settings).


----------



## Alastair

With my stock Tri-X BIOS I have managed to get my Furys (3840 shaders) to 1150 and 550MHz with +30mV. I do not see any sort of negative scaling on my setup until around +50mV. But I reckon 1150 is a good OC for now; that is 15%, which I am very happy with.


----------



## neurotix

Quote:


> Originally Posted by *damarad21*
> 
> Originally Posted by defyoddz View Post
> 
> ive found a new r9 fury for $260, with the new drivers this seems like the best dollar/performance card for 1440p gamers, why would anyone buy a 480 at the same price or am i missing something?
> 
> There are some people thinking 4gb it is not/will be not enough to run games in good conditions as it seems now on 6/8gb will be the new standard. That's the reason why many users do prefer rx480. In my case a got a fury Nitro 1 month ago for a good price, still have the choice to send it back and change for another card. Although I think for that price, fury is the best in this time..


I regularly run out of/overflow the VRAM on my cards because I game at 5760x1080p (6.2 million pixels)... it doesn't cause my fps to tank or drop though... this is according to Playclaw5 and Afterburner.. there's a couple games that go over the 4GB but I still maintain an almost constant 60 fps. Dragon Age Inquisition is one. Witcher 3 is another. I'm not sure how this works tbh. DA:I actually gave me "out of memory" DirectX errors until I increased the size of my Windows pagefile from 8GB to 16GB (I only have 8GB system RAM). After that it was fine and according to Playclaw has even been using 7GB+ memory. But still gets 60 fps.

Of course, I run two Fury Nitros, so 7000 shaders and 448 texture units. It might be the raw grunt of this setup. Tbh I couldn't be happier. I'll only replace these things if they end up being unable to run new games because of VRAM. Still, I'd think reducing texture quality from Ultra to something lower and more compressed might solve this in the future. A lot of the time you can't tell the difference anyway, unless you put it to low or something. Anyway this is an interesting topic for this thread and these cards.

I'm not sure but it always seems like either 1) cards have too much memory for the time, but not enough GPU power for the long term (6970 anyone?? Try using one of those now....) or 2) Not enough memory but very powerful GPUs (The Fury.. maybe the 970?) What do you guys think are some other examples of this? Why, when we pay so much for these things (usually) can they never strike a good balance? (Disregarding the stack limitations of first-gen HBM... maybe they should have gone with GDDR5 for these...)


----------



## damarad21

Quote:


> Originally Posted by *neurotix*
> 
> I regularly run out of/overflow the VRAM on my cards because I game at 5760x1080p (6.2 million pixels)... it doesn't cause my fps to tank or drop though... this is according to Playclaw5 and Afterburner.. there's a couple games that go over the 4GB but I still maintain an almost constant 60 fps. Dragon Age Inquisition is one. Witcher 3 is another. I'm not sure how this works tbh. DA:I actually gave me "out of memory" DirectX errors until I increased the size of my Windows pagefile from 8GB to 16GB (I only have 8GB system RAM). After that it was fine and according to Playclaw has even been using 7GB+ memory. But still gets 60 fps.
> 
> Of course, I run two Fury Nitros, so 7000 shaders and 448 texture units. It might be the raw grunt of this setup. Tbh I couldn't be happier. I'll only replace these things if they end up being unable to run new games because of VRAM. Still, I'd think reducing texture quality from Ultra to something lower and more compressed might solve this in the future. A lot of the time you can't tell the difference anyway, unless you put it to low or something. Anyway this is an interesting topic for this thread and these cards.
> 
> I'm not sure but it always seems like either 1) cards have too much memory for the time, but not enough GPU power for the long term (6970 anyone?? Try using one of those now....) or 2) Not enough memory but very powerful GPUs (The Fury.. maybe the 970?) What do you guys think are some other examples of this? Why, when we pay so much for these things (usually) can they never strike a good balance? (Disregarding the stack limitations of first-gen HBM... maybe they should have gone with GDDR5 for these...)


Exactly, this is the feeling I have. The RX 480 and GTX 1060 do not have enough muscle for that memory amount. The Fury would be great with 6GB, and other cards are too expensive...


----------



## Charcharo

To be fair, it is not as if Memory and GPU intensive situations scale linearly. Games like Metro Last Light and Witcher 3 really do not need more than 2.5GB of VRAM even at 4K (and their textures are generally really good to boot!) but obliterate the GPU itself. Other titles like Titanfall 2 or DOOM seem to not be too rough on the GPU, but do WANT a LOT of VRAM.

Very fast CPUs, SSDs and RAM can help a bit BTW, but only a bit.

AMD has admitted that before Fiji had released, their memory management was sloppy as they always had enough VRAM to just force things to do well. Supposedly 2-3 experienced engineers work on special driver support for Fury (and I guess that leaks into other GCN cards too) that does better management. Their work definitely can be felt, and these days it seems they can do more with less, but even that has a limit.

Fury having 6GB of VRAM would rock, but alas it does not. At least that made AMD take memory management more seriously, which helps other and future AMD cards too, as well as PCMR modding on AMD GPUs (Glory!). As it is now I already see some VRAM limitations on my Fury, but overall it is a beast! Still, let's hope it is the last flagship with less than 8GB of VRAM.


----------



## neurotix

Quote:


> Originally Posted by *damarad21*
> 
> Exactly, this is the feeling I have. Rx480 and gtx 1060 do not have muscle enough for that memory amount. Fury would be great with 6Gb, and other cards are to much expensive ...


Agreed, not powerful enough for me. I had dual 290s for a while; with a new card I want something more powerful than that, not just more energy efficient and smaller. That's why I went with Furys. At least I only spent around $600 (I was prepared to spend twice that; I got new monitors instead).

Quote:


> Originally Posted by *Charcharo*
> 
> To be fair, it is not as if Memory and GPU intensive situations scale linearly. Games like Metro Last Light and Witcher 3 really do not need more than 2.5GB of VRAM even at 4K (and their textures are generally really good to boot!) but obliterate the GPU itself. Other titles like Titanfall 2 or DOOM seem to not be too rough on the GPU, but do WANT a LOT of VRAM.
> 
> Very fast CPUs, SSDs and RAM can help a bit BTW, but only a bit.
> 
> AMD has admitted that before Fiji had released, their memory management was sloppy as they always had enough VRAM to just force things to do well. Supposedly 2-3 experienced engineers work on special driver support for Fury (and I guess that leaks into other GCN cards too) that does better management. Their work definitely can be felt, and these days it seems they can do more with less, but even that has a limit.
> 
> Fury having 6GB of VRAM would rock, but alas it does not. At least that made AMD take management more seriously which helps other and future AMD cards too as well as PCMR modding on AMD GPUs (Glory!) . As it is now I already see some VRAM limitations on my Fury, but overall it is a beast! Still, lets hope it is the last flagship with less than 8GB of VRAM.


Great insight. I didn't know about the driver team for Fury. I agree that 6GB or even 8GB on Fury would be great and make these cards keepers for a long time. Alas, they went with HBM, I don't think it even really offers that much more performance over GDDR5 (surely it offers more but VRAM speed generally matters little in my experience.) Repped.


----------



## Charcharo

Thank you, Neurotix!

The reasons AMD went with HBM are probably quite complicated, but here are some of them:

1. Power usage. HBM is a LOT more efficient than GDDR5, even modern-day mature GDDR5. Anand had estimated that between 32 and 50 watts went to the memory controller and memory on the 290X alone! By using the power saved with HBM (between 15 and 30 watts), you can clock a bit higher, offer better temps and acoustics, or more easily pack in more shaders.

2. The memory controller itself is simpler to integrate and smaller, so AMD saved at least some die space by choosing HBM. This allowed them to double the L2 cache and/or add more stream processors. If Fury had GDDR5, I am sure it would either have been beyond what TSMC could easily make and/or have had fewer cores and other stuff.

One day, HBM may actually save money over complex and difficult memory controllers.

3. Experience and simple pioneering of technology. By taking the engineering tasks head on, AMD engineers gain knowledge. This means future HBM1/2 technologies and interposers will be cheaper, R&D-wise. The last bit is pure marketing and pride: being the first to get almost prototype-level technology to work, work well, and last long? That definitely adds some bragging rights, at least in front of people who care about technology.

4. More bandwidth + DCC lets AMD's limited (64 on Fury) ROPs perform better all in all. Brute force is a universal solver of problems it seems.

Of course, the issue with HBM1 is down to its slow speed (may cause latency even if it is VERY wide) and limited amount. Both can be somewhat alleviated via drivers as we know, but alas, not fully.
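The bandwidth side of point 4 is easy to sanity-check with a back-of-envelope calculation. The sketch below uses public spec-sheet figures for Fiji (HBM1) and Hawaii (GDDR5); the power numbers are the rough estimates quoted above, not measurements.

```python
# Back-of-envelope comparison of Fiji's HBM1 vs Hawaii's GDDR5.
# Bus widths and per-pin data rates are public spec-sheet figures;
# the power figures are the rough estimates discussed above.

def bandwidth_gbps(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# R9 Fury X: 4096-bit HBM1 at 1 Gbps per pin -> 512 GB/s
hbm1 = bandwidth_gbps(4096, 1.0)

# R9 290X: 512-bit GDDR5 at 5 Gbps per pin -> 320 GB/s
gddr5 = bandwidth_gbps(512, 5.0)

print(f"HBM1:  {hbm1:.0f} GB/s")
print(f"GDDR5: {gddr5:.0f} GB/s")
print(f"Bandwidth gain: {hbm1 / gddr5:.2f}x")

# Rough estimate quoted above: ~32-50 W for Hawaii's memory subsystem,
# with perhaps 15-30 W of that freed up by moving to HBM.
saved_low, saved_high = 15, 30
print(f"Estimated power freed for core/shaders: {saved_low}-{saved_high} W")
```

So even at HBM1's modest 500MHz (1 Gbps effective) clock, the sheer bus width is what delivers the 1.6x bandwidth jump.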


----------



## neurotix

Quote:


> Originally Posted by *Charcharo*
> 
> Thank you, Neurotix!
> 
> The reasons AMD went with HBM are probably quite complicated, but here are some of them:
> 
> 1. Power usage. HBM is a LOT more efficient than GDDR5, even modern-day mature GDDR5. Anand estimated that between 32 and 50 watts went to the memory controller and memory on the 290X alone! By using the power saved with HBM (between 15 and 30 watts), you can clock a bit higher or offer better temps and acoustics. Or more easily pack in more shaders.
>
> 2. The memory controller itself is simpler to integrate and smaller. So AMD saved at least some die space by choosing HBM. This allowed them to double the L2 cache and/or add more stream processors. If Fury had GDDR5, I am sure it would either have been beyond what TSMC could easily make and/or have had fewer cores and other stuff.
>
> One day, HBM will actually save money by doing away with complex and difficult memory controllers.
>
> 3. Experience and simply pioneering the technology. By gaining this experience and taking the engineering tasks head-on, AMD engineers gain knowledge. This means future HBM1/2 technologies and interposers will be cheaper, R&D-wise. The last bit is pure marketing and pride - being the first to get almost prototype-level technology to work, work well, and last long? Definitely adds some bragging rights, at least in front of people who care about technology.
> 
> 4. More bandwidth + DCC lets AMD's limited (64 on Fury) ROPs perform better all in all. Brute force is a universal solver of problems it seems.
> 
> Of course, the issue with HBM1 is down to its slow speed (may cause latency even if it is VERY wide) and limited amount. Both can be somewhat alleviated via drivers as we know, but alas, not fully.


Again, great insight. This explains a lot.

I had actually been saving my money for Vega, but assuming new games that come out will still run on decent settings on my Furys, I'll wait for Navi in 2019. For the price, I'm extremely happy with the performance of my cards. I feel like I got a good value and saved a lot of money with my purchase. And every game I have runs at 60 fps at 5760x1080, with all the settings maxed out.

I think these cards could probably last a lot longer based on raw power, if it weren't for the small amount of memory, but we'll cross that bridge when we come to it.

The only real downside is that AMD cards don't hold their value. I just had to sell a 270X (golden - 84% ASIC) for $100 and an R7 265 (7850) for $50. I say I had to sell them because, if I didn't sell them now, in a year I'd be lucky to get $75 for both. They were my backup cards. I paid $300 each new for my Fury Nitros, and a friend of mine on OCN just got one off eBay that was essentially open box for only $200. So I imagine when Vega comes out I'll only get like $100 a card for these =/ Meanwhile, people still try to sell Nvidia GTX 780 Ti Kingpins for like $200-$300 on eBay.

Try playing any modern game on one of those. Just ridiculous and kind of depressing.

EDIT: Oh, and my 270X and R7 265 were both complete with box and accessories. They took weeks to sell. And I still only got $150.


----------



## Charcharo

True enough, though some people are afraid that AMD cards might have been used for mining. I recently sold my GF's GTX 760 for 100 dollars (got her an RX 470 for her birthday, and yes, she keeps the money lol), which IMHO is a bit of a rip-off for the buyer. To be fair I could probably have asked for more, but it would make me feel dirty, as 280s, 285s and 280Xs can be had for a similar price and easily obliterate the 760 even in old games, let alone new ones. The 470 is basically a near three-fold increase in performance all in all.

My Fury Nitro was free, and a friend now has the awesome meme power of the R9 390 I owned! So all in all, I am very happy. Let's hope time will be kind and the RX 490 is not only good, but finds its way into my house (that way, after a PSU change, I might save my brother from the GTX 760 mediocrity I got him into and ascend him to Fury!).


----------



## neurotix

Things have gone well for me. Even though I've spent a LOT of money on my setup recently, I should be able to afford whatever I want next year. I could probably do dual Vega if I wanted and still have enough for a CPU, motherboard and memory. I may wait even longer for the 8700K or whatever it ends up being called, though. So it's not a big deal.

The 470 is a small beast. Canned heat. Great 1080p card, and I'm sure it overclocks great. Looking/thinking about getting an RX 480 for my wife's rig (Big Blue).

Anyway I like my Fury Nitro's so much that I did this:


----------



## Charcharo

They are real nice looking.

BTW guys, is there any real difference between the 1020 and 1050MHz versions of the Nitro in terms of OC ability? I understand that Sapphire might have used some binning method, but since that is not quite an exact science... should I really attempt to push it hard?

Am asking because, for all my theoretical knowledge of how GPUs work, I usually avoided doing major OCing (at most minor) as I had to make do with a single card for 6-7 years before. Obviously now that I work things are a bit easier on me, but I am still but a student (I don't have enough money to replace a dead Fury, probably right up until the Vega launch)!


----------



## Alastair

I just hope that Fiji, even with 4GB of RAM, has enough power to drive my new 3440x1440 screen when I get it. At least for another 2 years or so while I finish my pilot's license and settle into a better-paying job.


----------



## Alastair

Quote:


> Originally Posted by *Charcharo*
> 
> They are real nice looking.
> 
> BTW guys, is there any real difference between the 1020 and 1050MHz versions of the Nitro in terms of OC ability? I understand that Sapphire might have used some binning method, but since that is not quite an exact science... should I really attempt to push it hard?
>
> Am asking because, for all my theoretical knowledge of how GPUs work, I usually avoided doing major OCing (at most minor) as I had to make do with a single card for 6-7 years before. Obviously now that I work things are a bit easier on me, but I am still but a student (I don't have enough money to replace a dead Fury, probably right up until the Vega launch)!


To be honest I don't think there is much between them. Maybe the 1050 is a slightly lower leakage chip. But honestly, how much of a difference is it really going to make, considering how hard it is to get useful OCs out of our cards anyway?


----------



## neurotix

I would think this might be dependent on ASIC quality as compared to clock speed.

My top card is 63% and my bottom is 60%; neither does more than 1125MHz. And even last night when I was overclocking, they were giving me trouble doing that. I was giving them a ton of voltage too, but once they passed 70C they became unstable in Valley.

Either way, don't expect them to clock much past 1100MHz.

Both of mine are 1050MHz cards btw.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> I would think this might be dependent on ASIC quality as compared to clock speed.
> 
> My top card is 63% and bottom is 60%, neither do more than 1125mhz. And even last night when I was overclocking, they were giving me trouble doing that. I was giving them a ton of voltage too but once they passed 70C they became unstable in Valley.
> 
> Either way don't expect them to clock much past 1100mhz.
> 
> Both of mine are 1050mhz cards btw.


Fiji scales very well with temperature. Custom loop, and I am holding both my Tri-X's (62.9% and 60.2%) at 1150/550 with +30mV at around 40-44C under load, depending on ambient.


----------



## damarad21

What is clear is that nobody would trade a Fury priced around 280€ for an RX 480 or GTX 1060, in spite of the 4GB. Better to count on the more powerful chip. So the only way up is to move to a GTX 1070 or above, or to wait for AMD's new high-end cards...


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Fiji scales very well with temperature. Custom loop and I am holding both my Tri-X's (62.9% and a 60.2%) at 1150/550 with +30mv at around 40C-44C underload depending on ambient


Yep.

Usually I just run mine stock, er, well, 1050MHz. With two of them, I don't need to overclock to maintain 60 fps in everything on Ultra. So not much point.

Running them at 1050MHz they generally don't ever pass 60C.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Fiji scales very well with temperature. Custom loop and I am holding both my Tri-X's (62.9% and a 60.2%) at 1150/550 with +30mv at around 40C-44C underload depending on ambient
> 
> 
> 
> Yep.
> 
> Usually I just run mine stock, er well, 1050mhz. With two of them, I don't need to overclock to maintain 60 fps in everything on Ultra. So not much point.
> 
> Running them at 1050mhz they generally don't ever pass 60c.

If you can block those with universals (from what I have seen of the custom blocks, they do nothing drastic to account for the HBM stacks), just cool the VRMs, and maybe you could see some good clocks?


----------



## Performer81

My XFX Fury does around 1140MHz with a 1300mV VID. Temps are no problem, the big cooler can handle it well; I have a temp limit of 62 degrees. With the stock 1.243V I achieve 1090MHz. It scales pretty well because of the low ASIC.
Don't think the voltage is that bad, even for 24/7.


----------



## Alastair

Quote:


> Originally Posted by *Performer81*
> 
> MY XFX Fury does around 1140MHZ with 1300mv VID. Temps are no problem the big cooler can handle it well, I have a temp limit of 62 degrees. WIth stock 1,243V I achieve 1090MHz. Scales pretty good because of low Asic.
> DOnt think the voltage is that bad, even for 24/7.


With that sort of high voltage, surely you are seeing negative scaling? Also, what's your ASIC? Because 1.24V stock seems pretty damn below average. I thought my 1.225V card was bad, but damn, yours takes the cake. And both mine will do 1120 on stock volts, 1135 with +5mV and 1150 with +30.


----------



## Performer81

ASIC is 56.7. I read that this isn't really a bad thing, especially with AMD, because it runs cool and scales well with voltage.
By the way, how do you test overclocking? Battlefield 4/1, Dishonored 2 and especially Ryse at 1440p are very demanding tests. Also, your GPUs are 20 degrees lower than mine.

PS: No, I don't see negative scaling at 1140. Nearly linear increase in performance.


----------



## neurotix

Both my cards (ASIC 63% and 60%) seem to have a VID of 1.25V for 1050MHz.

Pretty bad in my opinion? I hate the high VID. I've even undervolted them by -12mV and -10% power limit and they seem stable in nearly every game and bench. When I do this they seem to get 1.23V or so.

Hope this helps.


----------



## Alastair

Quote:


> Originally Posted by *Performer81*
> 
> Asic is 56,7. I red that this isnt really a bad thing, especially with AMD, because it runs cool and scales good with voltage.
> By the way how do you test overclocking. Battlefield 4\1, Dishonored 2 and especially Ryse in 1440P are very demanding tests. Also your GPUs are 20 degrees lower than mine.
> 
> PS: No i dont see negative scaling with 1140. Nearly linear increase in Performance.


I find that result interesting, because negative scaling is something that plagued people's first attempts at overclocking. I start to experience negative scaling at around 1.25V on my card. Very minor (88.2 down to 87.9 fps @ 1120), however it grows to nearly a full frame lost at +75mV in the Heaven bench. People were picking up on the losses in performance through Firestrike benches, if I remember correctly. Are you saying that you are getting perfect scaling even at high voltage levels? Could you run some repeatable synthetic benches for me by any chance (Heaven and Firestrike), at fixed clock speeds, just with different voltages applied, running from low to high? If you are in fact not seeing any sort of negative scaling then it is a very interesting observation, because it would mean that it's the custom cards that are somehow able to circumvent it, and it would point to an issue either with the reference design or with the BIOSes on the reference cards.
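The fixed-clock voltage sweep being requested here reduces to simple bookkeeping: hold the clock constant, step the voltage offset, and flag where the fps curve turns over. A minimal sketch; the run data and the 0.2 fps tolerance are made-up placeholders (loosely echoing the 88.2 -> 87.9 fps observation above), not real results.

```python
# Find the voltage offset where negative scaling begins: at a FIXED
# clock, fps should stay flat as voltage rises, so any sustained drop
# indicates negative scaling. Sample data is illustrative only.

def negative_scaling_onset(runs, tolerance=0.2):
    """runs: list of (offset_mV, fps) at one fixed clock, sorted by
    offset. Returns the first offset whose fps falls more than
    `tolerance` fps below the best score seen so far, or None."""
    best = float("-inf")
    for offset_mv, fps in runs:
        if fps > best:
            best = fps
        elif best - fps > tolerance:
            return offset_mv
    return None

# Hypothetical Heaven runs at a fixed 1120 MHz (placeholder numbers):
heaven_1120 = [(0, 88.2), (25, 88.2), (50, 87.9), (75, 87.3)]

onset = negative_scaling_onset(heaven_1120)
print(f"Negative scaling starts around +{onset} mV")
```

Averaging 3+ runs per voltage point before feeding them in would keep normal bench variance from being mistaken for the onset.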

I know my cards are much cooler than yours; it's something I commented on earlier, Fiji seems to like being cold. What my comment was about was your voltage. You have, I think, the highest stock voltage on a Fiji-based card I have ever seen.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Both my cards (ASIC 63% and 60%) seem to have a VID of 1.25v for 1050mhz.
> 
> Pretty bad in my opinion? I hate the high VID. I've even undervolted them by -12mv and -10% power limit and they seem stable in nearly every game and bench. When I do this they seem to get 1.23v or so.
> 
> Hope this helps.


Both my cards have similar ASIC quality ratings to yours, yet my best card runs at 1.187 and my worst card runs at 1.225. Did you do the BIOS update when AMD released the updated BIOS for the Fury X and Nano? Because I know those BIOSes forced 1.25V into both my cards when I applied them. But I returned to my stock BIOS because I couldn't make unlock ROMs from the Fury X BIOS.


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Both my cards have similar ASIC quality ratings to yours yet my best card runs at 1.187 and my worst card runs at 1.225. Did you do the BIOS update when AMD released the updated BIOS for Fury x and Nano? Cause I know those BIOS'es forced 1.25V into both my cards when I applied it. But I returned to my stock BIOS because I couldn't make unlock ROMs with the Fury X BIOS.


My cards won't unlock btw.

I never did the bios update because I bought these cards roughly a month ago new, so I figured they probably had the latest bios.

I'll look into it though.


----------



## Performer81

I tested 1050MHz with 1.2, 1.25 and 1.3V in Firestrike and indeed there was a 1 fps drop at every step.

With linear improvement I meant the bench results at 1050, 1100 and 1150... all with 50% PL.
My PCB is reference, by the way.
In this reddit thread people say that software voltage control causes negative scaling but a BIOS edit does not. So that's wrong and both cause negative scaling, or am I doing something wrong?


https://www.reddit.com/r/4cc7ir/is_it_true_that_adding_voltage_lowers_fps_on_the/


----------



## gupsterg

Negative scaling occurs regardless of which way voltage is applied, be that SW or FW.

I also tried various customisations to the ROM, even knocking out monitoring data. I also tried 2 ROMs prepared by The Stilt. I also tried drivers all the way back to the 1st release that supported Fiji.

I was not able to combat negative scaling with voltage increase. The only thing I have noted is that each GPU may accept a differing level of voltage increase prior to negative scaling occurring.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Negative scaling occurs regardless of which way voltage is applied. May that be SW or FW.
> 
> I also tried various customisations to ROM, even knocking out monitoring data. I also tried 2 ROMs prepared by The Stilt. I also tried drivers all the way back to 1st release that supported Fiji.
> 
> I was not able to combat negative scaling with voltage increase. The only thing I have noted is each GPU may accept a differing level of voltage increase prior to negative scaling occuring.


So far, with the ROMs you have prepared for me, I have not seen any negative scaling as of yet with "roughly" +50mV. Will the Nitro ROMs work on the Tri-X? Has anyone tried?


----------



## Performer81

No, the Nitro ROM doesn't work on the reference design.


----------



## Alastair

Quote:


> Originally Posted by *Performer81*
> 
> No, Nitro Rom doesnt work on reference design.


Le Sigh.


----------



## gupsterg

The Nitro ROM's VoltageObjectInfo can be modified to make it relevant to the VRM design of another card, like I did for the Strix. The only issue then remaining is how the display connectors are arranged on a differing PCB, like you ran into when using the modded Strix ROM; this can be sorted.

Once you've done the above, though, I always think you haven't really got a Nitro / Strix ROM anymore.

You see, all command tables between like GPU families are identical. Even some data tables are the same; the ones that differ tend to only have some data value changes relevant to the cooling solution & profile, PCB, GPU TDP, etc. In the Fiji bios mod thread there is a section on AtomBios with some info/links.

I also explored/discussed with The Stilt how XA managed scaling on LN2 on Fiji. It boiled down to a few things, in our opinion. The driver may have some form of limiting the GPU even if we as normal users are not violating, say, voltage, temps, etc. As the XA ROM was knocking out monitoring, perhaps the driver was unaware of what was occurring on the OC front. Also, hard mods were used; AFAIK the IR3567B does not "see" these (loosely speaking), so the driver may not either. The driver could also have been modified, some of the HWBot guys have access to things like that, and the XA ROMs were prepared by Asus techies. Also, the Strix PCB is designed for hard volts, etc; you'll see pads for some stuff like that in an area on the PCB.

I also contacted some people regarding the driver; this yielded nothing helpful, as no one had a solution.

Yeah, +50mV seems to be the break point. I can get away with +56mV on the card I have, so basically stock VID DPM7 1.212V increased to 1.268V; anything higher is not OK for scaling. I can do 1175MHz with that increased VID, but I regard it as bench stable only. 1145MHz is rock solid: I can throw 3hrs+ each of 3DM FS & FSE, Heaven and Valley at the GPU without hitting an issue. Folding@home can be run for 48hrs+ without a hitch. So general gaming is not an issue.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Nitro ROM VoltageObjectInfo can be modified to make it relevent with VRM design for another card, like I did for Strix. Then only issue remains is how the display connectors are on a differing PCB, like you ran into with using the modded Strix ROM, this can be sorted.
> 
> Once you've done above then I always think I haven't really got Nitro / Strix ROM.
> 
> You see all command tables between like GPU families are identical. Even some data tables are the same, the ones that differ tend to only have some data value changes relevent to cooling solution & profile, PCB, GPU TDP, etc. In Fiji bios mod is section AtomBios some info/links in there.
> 
> I also explored/discussed how XA managed scaling on LN2 on Fiji with The Stilt. It boiled down to few things in our opinion. The driver may have some form of limiting GPU even if we as normal users are not violating say voltage, temps, etc. As XA ROM was knocking out monitoring perhaps driver was unaware of whats occuring on OC front. Also hard mods were used, AFAIK IR3567B does not "see" these (loosely speaking) so driver may not. The driver could have been modified, some of the HWBot guys have access to things like that, XA ROMs were prepared by Asus techies. Also the Strix PCB is designed for hard volts, etc, you'll see pads for some stuff like that on an area on PCB.
> 
> I also contacted some people regarding driver, this yielded nothing helpful as no one had a solution.
> 
> Yeah +50mV seems the break point, I can get away with +56mV on the card I have, so basically stock VID DPM7 1.212V increased to 1.268V, anything higher is not OK for scaling. I can do 1175MHz with that increased VID, I regard it bench stable only. 1145MHz is rock solid, I can throw 3hrs+ each of 3DM FS & FSE, Heaven, Valley at GPU without hitting an issue. [email protected] can be ran be 48hrs+ without a hitch. So general gaming is not an issue.


I have a feeling we are on the right track with the release 2 ROMs you gave me. If we can figure out why they are getting stuck in DPM 7, that would probably work wonders. My next thought would then be to try out the Nitro ROM. The only difference I can see with the display connectors is the DVI port it has.


----------



## Performer81

I benched 1090MHz @ 1.25V and 1140MHz @ 1.3V again and somehow my 1090 scores are always!!!! 1-2 fps faster (Valley bench, Time Spy, Firestrike). I could swear that was not always the case and that the clock advantage was stronger than the negative scaling, damn.
Maybe I should go back to software overclocking or the Sapphire BIOS? I just changed DPM7 to 1300 and left the other ones alone.


----------



## gupsterg

@Performer81


Spoiler: Warning: Spoiler!



It doesn't matter if you use software or firmware to change voltage, clocks, powerlimit, etc. Negative scaling with voltage increase will occur .......

It doesn't matter if you use a combination of software and firmware to change voltage, clocks, powerlimit, etc. Negative scaling with voltage increase will occur .......

I tried combinations of VID changes in PowerPlay and/or VDDC offset using IR3567B registers. Negative scaling with voltage increase will occur .......

I even changed the CAC records (these are voltages within PowerPlay used for setting VID when ROM "auto calculates" VID) with a combination of manual VID changes in PowerPlay and/or VDDC offset using IR3567B registers. Negative scaling with voltage increase will occur .......

The Stilt provided ROMs where he had changed tables used for ASIC profiling for VID in ROM and then I tried combo of CAC records/VID/VDDC offset changes. Negative scaling with voltage increase will occur .......

All the above were also tried with drivers from latest at the time to 1st release of driver supporting Fiji officially. Negative scaling with voltage increase will occur .......

I spent 1-2 weeks on it and went ...



@Alastair


Spoiler: Warning: Spoiler!



Why you have differing voltages between, say, the stock Tri-X ROM vs Tri-X with XA LN2 Strix PowerPlay is:-

a) the GoldenDB for voltages differs slightly on XA LN2 Strix PP.
b) DPM GPU clocks differ; PP is set to "auto calculate" VID per DPM, so clock increases compared to stock PP = more VID.

Take an AIDA64 registers dump on the stock ROM and then on Tri-X with XA LN2 PP (release 2 ROM pack); you should see differing VIDs per DPM. If you set the Tri-X with XA LN2 Strix PowerPlay (release 2 ROM pack) to have the same GPU clocks as Tri-X stock and then set VID as shown in the registers dump when on the stock ROM, the release 2 ROM should give you the same voltages.
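Comparing those AIDA64 register dumps by eye is error-prone; a few lines of script can diff the DPM tables directly. A sketch, assuming the `DPMn: GPUClock = X MHz, VID = Y V` line format the dumps in this thread use (the two sample tables reuse values quoted elsewhere in the thread):

```python
import re

# Diff the DPM tables from two AIDA64 "GPU PStates List" dumps.
# Assumes the "DPMn: GPUClock = X MHz, VID = Y V" line format that
# AIDA64 register dumps posted in this thread use.

LINE = re.compile(r"DPM(\d+): GPUClock =\s*(\d+) MHz, VID = ([\d.]+) V")

def parse_dpms(dump):
    """Return {dpm_index: (clock_mhz, vid_v)}."""
    return {int(n): (int(clk), float(vid))
            for n, clk, vid in LINE.findall(dump)}

stock = """DPM6: GPUClock = 1018 MHz, VID = 1.14300 V
DPM7: GPUClock = 1050 MHz, VID = 1.17500 V"""
modded = """DPM6: GPUClock = 1020 MHz, VID = 1.14300 V
DPM7: GPUClock = 1135 MHz, VID = 1.24300 V"""

for i, (clk_a, vid_a) in sorted(parse_dpms(stock).items()):
    clk_b, vid_b = parse_dpms(modded)[i]
    print(f"DPM{i}: {clk_a} -> {clk_b} MHz, VID {vid_a:.3f} -> "
          f"{vid_b:.3f} V ({(vid_b - vid_a) * 1000:+.0f} mV)")
```

Pasting a full 8-state dump from each ROM into `stock` and `modded` shows at a glance which DPMs the "auto calculate" bumped.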

These videos show some of the differences between Strix stock PP and XA LN2. The Fiji bios mod thread has some posts (early pages) with example images of using the Linux driver pptable_v1_0.h to translate PowerPlay.


----------



## Performer81

No, I am back with my original BIOS and software overvolting with Afterburner again, and now my scores are better at 1140MHz again.
Instead of 1300mV in the BIOS I put +60mV in Afterburner and my scores are up again. I don't know what went wrong.

1090MHz: 109 fps in Valley
1140MHz: 114 fps in Valley

Timespy:
1090: 5031
1140: 5179

Before it was nearly the opposite.


----------



## gupsterg

Take an AIDA64 registers dump on current ROM, without an OC, then we can assess what is final VID with offset using MSI AB.

*** edit ***

From a previous post your stock VID is 1.243V. So you are at ~1.303V. As 1.303V wouldn't be SVI2 compliant I'd think you're at 1.300V.

GPU-Z render test always gives a flat VDDC in monitoring, so I would test what VDDC you get when doing that test:-

i) Stock ROM with SW OC 1140MHz +60mV.

ii) Stock ROM with DPM 7 modified to 1140MHz 1300mV.
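The 1.303V remark above follows from how the voltage controller quantizes requests: SVI2 VIDs move in discrete steps (6.25mV, to my understanding, on SVI2 controllers like the IR3567B), so an arbitrary target gets snapped to a valid step. A sketch of that snapping:

```python
# Snap a requested GPU voltage to the nearest SVI2 VID step.
# Assumes the commonly cited 6.25 mV step size for SVI2 voltage
# controllers such as the IR3567B discussed in this thread.

SVI2_STEP_V = 0.00625  # 6.25 mV

def snap_to_vid(requested_v):
    """Return the nearest SVI2-compliant voltage in volts."""
    return round(round(requested_v / SVI2_STEP_V) * SVI2_STEP_V, 5)

# Stock VID 1.243 V plus a +60 mV offset targets 1.303 V, which is
# not on the 6.25 mV grid, so the controller lands on 1.300 V:
print(snap_to_vid(1.243 + 0.060))  # -> 1.3

# Reported VIDs like "1.243 V" are themselves likely display-rounded
# from a true grid value:
print(snap_to_vid(1.243))  # -> 1.24375
```

This is only the request side; the VDDC actually measured under load still moves around with PowerTune/LLC, as discussed below.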


----------



## Performer81

It's between 1.2625 and 1.2688 under GPU-Z load. With a 1.3V VID it should have been more or less the same. I can't set +57 in MSI Afterburner, only 60. Something else must have caused the performance drop, I don't know. Maybe WattMan itself.

Edit: That's with +48 for 1130MHz, sorry.


----------



## gupsterg

Hmmm, it should not have been fluctuating like "1.2625 and 1.2688". Did you have Power Efficiency disabled? If so, that may have caused the fluctuation in VDDC in the GPU-Z render test.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> @Performer81
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> It doesn't matter if you use software or firmware to change voltage, clocks, powerlimit, etc. Negative scaling with voltage increase will occur .......
> 
> It doesn't matter if you use a combination of software and firmware to change voltage, clocks, powerlimit, etc. Negative scaling with voltage increase will occur .......
> 
> I tried combinations of VID changes in PowerPlay and/or VDDC offset using IR3567B registers. Negative scaling with voltage increase will occur .......
> 
> I even changed the CAC records (these are voltages within PowerPlay used for setting VID when ROM "auto calculates" VID) with a combination of manual VID changes in PowerPlay and/or VDDC offset using IR3567B registers. Negative scaling with voltage increase will occur .......
> 
> The Stilt provided ROMs where he had changed tables used for ASIC profiling for VID in ROM and then I tried combo of CAC records/VID/VDDC offset changes. Negative scaling with voltage increase will occur .......
> 
> All the above were also tried with drivers from latest at the time to 1st release of driver supporting Fiji officially. Negative scaling with voltage increase will occur .......
> 
> I spent a 1-2 weeks
> 
> 
> 
> 
> 
> 
> 
> and went
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> 
> 
> @Alastair
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Why you have differing voltages between say stock Tri-X ROM vs Tri-X with XA LN2 Strix PowerPlay is:-
> 
> a) the GoldenDB for voltages differs slightly on XA LN2 Strix PP.
> b) DPM GPU clocks differ, PP is set to "auto calculate" VID per DPM, due to clock increases compared to stock PP = more VID.
> 
> Take a AIDA64 registers dump on stock ROM and then on Tri-X with XA LN2 PP (release 2 rom pack), you should see differing VID per DPMs. If you set the Tri-X with XA LN2 Strix PowerPlay (release 2 rom pack) to have the same GPU clocks as Tri-X stock and then set VID as shown in registers dump when on stock ROM, the release 2 rom should give you same voltages.
> 
> These videos show some of the differences between Strix stock PP and XA LN2,
> 
> 
> 
> ,
> 
> 
> 
> . Fiji bios mod has some posts (early pages) with example images of using Linux driver pptable_v1_0.h to translate PowerPlay.


Gupsterg, I am sorry, I do not fully understand.

EDIT: Never mind, I do. OK, I already did that. I copied the voltages from my stock Tri-X ROM into the release 2 ROMs through the BIOS editor. I was stuck on DPM 7 (the cards were not clocking down to DPM 1 when idle), and when I would put the cards under load I would get VDROOP way below the levels I had set.


----------



## xkm1948

Wattman overclocking is fun!

1250mv and 1110 core OC.


----------



## AngryLobster

Edit: Wrong thread.


----------



## gupsterg

@Alastair

The difference between VID & VDDC is normal; it's not VDROOP, in a way of speaking. For example, you can set the card to have, say, a VID of 1.250V and then run 3DM FS / Valley / Heaven, and you will end up with differing VDDC levels; this is the PowerTune/LLC effect. The only way you would know if it is VDROOP is if there were an application which told you the exact VID the GPU was requesting at the time and what VDDC it got, which we don't have.

Even though the ROM shows 8 DPM states, the way PowerTune works there are more.
Quote:


> PowerTune is also highly granular in terms of its ability to manage clocks. While previous GPUs had only 3 or 4 power states (idle/low, medium, and peak), a GPU with PowerTune contains hundreds of intermediate states in between the primary power states to maximize performance within the TDP constraint


Quote from AMD PowerTune PDF 2012, page 6, paragraph 2.
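The granularity described in that quote can be illustrated with a toy model: instead of jumping between the 8 DPM clocks, PowerTune can settle on many intermediate levels so the card runs as fast as its power budget allows. Purely illustrative; this is not AMD's actual algorithm, and the level count and power model are made up.

```python
# Toy model of PowerTune-style clock management: pick the highest of
# many fine-grained clock levels whose estimated power fits the TDP.
# Purely illustrative -- not AMD's actual algorithm or numbers.

def build_levels(dpm_clocks, steps_between=32):
    """Expand a coarse DPM clock list into many intermediate levels."""
    levels = []
    for lo, hi in zip(dpm_clocks, dpm_clocks[1:]):
        for i in range(steps_between):
            levels.append(lo + (hi - lo) * i / steps_between)
    levels.append(dpm_clocks[-1])
    return levels

def pick_clock(levels, tdp_w, power_model):
    """Highest clock whose modelled power stays within the TDP."""
    return max((c for c in levels if power_model(c) <= tdp_w),
               default=levels[0])

dpm = [300, 512, 724, 892, 944, 984, 1018, 1050]  # MHz, per the dumps
levels = build_levels(dpm)

# Crude power model: power grows linearly with clock (real silicon is
# worse than linear once voltage has to rise with clock).
power = lambda mhz: 75 + 0.2 * mhz

print(f"{len(levels)} selectable levels vs {len(dpm)} DPM states")
print(f"Clock at 260 W budget: {pick_clock(levels, 260, power):.0f} MHz")
```

The point of the model is just that the arbitration happens between, not only at, the 8 VIDs you see in the ROM, which is why logged clocks and VDDC wander around under load.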

I will do a release 3 ROM; this will be the Tri-X ROM with only PowerPlay modified to carry the differences from the XA LN2 PowerPlay, rather than having that complete PowerPlay in it.

@xkm1948

Sweet.
Quote:


> Originally Posted by *Performer81*
> 
> Some other thing must have caused the Performance drop, i dont know. MAybe wattman itself.


I have been tuning an undervolt ROM for my card these past several days and I've been on Crimson ReLive v16.2.2. I decided to OC with WattMan.


Spoiler: Here is AIDA64 dump on undervolt ROM v7.



Code:

------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  512 MHz, VID = 0.92500 V
DPM2: GPUClock =  724 MHz, VID = 0.93700 V
DPM3: GPUClock =  892 MHz, VID = 1.00000 V
DPM4: GPUClock =  944 MHz, VID = 1.05000 V
DPM5: GPUClock =  984 MHz, VID = 1.10000 V
DPM6: GPUClock = 1018 MHz, VID = 1.14300 V
DPM7: GPUClock = 1050 MHz, VID = 1.17500 V







Spoiler: Next using WattMan to set an OC of 1135MHz ~ 68mV.



Code:


------[ GPU PStates List ]------

DPM0: GPUClock =  300 MHz, VID = 0.90000 V
DPM1: GPUClock =  510 MHz, VID = 0.92500 V
DPM2: GPUClock =  725 MHz, VID = 0.93700 V
DPM3: GPUClock =  890 MHz, VID = 1.00000 V
DPM4: GPUClock =  945 MHz, VID = 1.05000 V
DPM5: GPUClock =  985 MHz, VID = 1.10000 V
DPM6: GPUClock = 1020 MHz, VID = 1.14300 V
DPM7: GPUClock = 1135 MHz, VID = 1.24300 V





1135MHz is an ~8.1% clock increase over 1050MHz; see the 3DM FS results comparison, and the performance-consistency comparison of 3 runs of each.
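The ~8.1% figure, and how much of it a bench result actually realises, is worth computing rather than eyeballing. A small sketch; the Fire Strike graphics scores are hypothetical placeholders, not real results.

```python
# Compare a clock increase with the benchmark gain it produced, to see
# how much of the theoretical headroom was realised. The graphics
# scores below are hypothetical placeholders, not real results.

def pct_gain(new, old):
    return (new / old - 1) * 100

clock_gain = pct_gain(1135, 1050)
print(f"Clock gain: {clock_gain:.1f}%")  # ~8.1%, as stated above

# Hypothetical 3DMark Fire Strike graphics scores at each clock:
score_stock, score_oc = 14000, 15000
score_gain = pct_gain(score_oc, score_stock)

# Scaling efficiency: 100% means the score grew as fast as the clock;
# a falling efficiency as voltage rises is the negative scaling
# discussed throughout this thread.
efficiency = score_gain / clock_gain * 100
print(f"Score gain: {score_gain:.1f}%  (efficiency {efficiency:.0f}%)")
```

Run at a few voltage offsets, the efficiency number makes the onset of negative scaling obvious even when the raw fps delta is under a frame.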

I used to regard v16.3.2 as my favourite driver for 3DM benches, as drivers after that always seemed slightly lower performing, but I've been happy with how v16.11.x and v16.12.x have benched so far. The only issue I have with WattMan is the 1250mV voltage limit. If I use my 24/7 OC 1145MHz ROM with 1268mV DPM 7, WattMan shows that value, but when editing it will still only allow 1250mV. I plan to try a differing mod to the PowerPlay of the ROM to see if WattMan will then allow 1300mV.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> @Alastair
> 
> The difference between VID & VDDC is normal; it's not VDROOP, in a way of speaking. For example, you can set the card to have, say, a VID of 1.250V and then run 3DM FS / Valley / Heaven, and you will end up with differing VDDC levels; this is the PowerTune/LLC effect. The only way you would know if it is VDROOP is if there were an application which told you the exact VID the GPU was requesting at the time and what VDDC it got, which we don't have.
> 
> Even though the ROM shows 8 DPM states, the way PowerTune works there are more.
> Quote:
> 
> 
> 
> PowerTune is also highly granular in terms of its ability to manage clocks. While previous GPUs had only 3 or 4 power states (idle/low, medium, and peak), a GPU with PowerTune contains hundreds of intermediate states in between the primary power states to maximize performance within the TDP constraint
> 
> 
> 
> Quote from AMD PowerTune PDF 2012, page 6, paragraph 2.
> 
> I will do a release 3 ROM this will be Tri-X ROM with PowerPlay only modified to have the difference in XA LN2 PowerPlay rather than have that complete PowerPlay in it.
> 
> @xkm1948
> 
> Sweet
> 
> 
> 
> 
> 
> 
> 
> .
> Quote:
> 
> 
> 
> Originally Posted by *Performer81*
> 
> Some other thing must have caused the Performance drop, i dont know. MAybe wattman itself.
> 
> 
> I have been tuning a undervolt ROM for my card past several days and I've been on Crimson Relive v16.2.2, I decided to OC with WattMan
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> Spoiler: Here is AIDA64 dump on undervolt ROM v7.
> 
> 
> 
> Code:
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.90000 V
> DPM1: GPUClock =  512 MHz, VID = 0.92500 V
> DPM2: GPUClock =  724 MHz, VID = 0.93700 V
> DPM3: GPUClock =  892 MHz, VID = 1.00000 V
> DPM4: GPUClock =  944 MHz, VID = 1.05000 V
> DPM5: GPUClock =  984 MHz, VID = 1.10000 V
> DPM6: GPUClock = 1018 MHz, VID = 1.14300 V
> DPM7: GPUClock = 1050 MHz, VID = 1.17500 V
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Next, using WattMan to set an OC of 1135MHz (~+68mV).
> 
> 
> 
> Code:
> 
> ------[ GPU PStates List ]------
> 
> DPM0: GPUClock =  300 MHz, VID = 0.90000 V
> DPM1: GPUClock =  510 MHz, VID = 0.92500 V
> DPM2: GPUClock =  725 MHz, VID = 0.93700 V
> DPM3: GPUClock =  890 MHz, VID = 1.00000 V
> DPM4: GPUClock =  945 MHz, VID = 1.05000 V
> DPM5: GPUClock =  985 MHz, VID = 1.10000 V
> DPM6: GPUClock = 1020 MHz, VID = 1.14300 V
> DPM7: GPUClock = 1135 MHz, VID = 1.24300 V
> 
> 
> 
> 
> 
> 1135MHz is an ~8.1% clock increase over 1050MHz. Comparing 3DM FS results, with the performance consistency of 3 runs of each:
> 
> I used to regard v16.3.2 as my favourite driver for 3DM benches, as drivers after it always seemed to perform slightly lower, but I've been happy with how v16.11.x and v16.12.x have benched so far. The only issue I have with WattMan is the 1250mV voltage limit: if I use my 24/7 OC 1145MHz ROM with a 1268mV DPM 7, WattMan shows that value, but when editing it will still only allow 1250mV. I plan to try a different mod to the ROM's PowerPlay table to see if WattMan will then allow 1300mV.
Click to expand...

So why am I not getting such dips in voltage with the stock ROM? The voltages that are set (be it stock or an offset increase) remain almost exactly at what I set, but under load the voltage ends up a lot lower than what I want applied with the release 2 ROM. And any idea why release 2 stays stuck in DPM7 even when idling? I am sure with these ROMs we are on the right track to finally beating negative scaling, because all my initial testing right up to +50mV (1.247V) has shown no loss in performance thus far.

As for your 3DMark results, it looks like a bit of negative scaling is occurring, but on average a 5% boost for an 8% OC doesn't seem too bad.
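The scaling math being eyeballed in these posts is easy to script. A minimal sketch: the 1050/1135MHz clocks come from the post above, while the scores are made-up placeholders:

```python
# Compare % clock increase against % score increase to gauge OC scaling.
# Scores below are illustrative placeholders, not anyone's real results.
def pct_gain(base, oc):
    return (oc - base) / base * 100.0

def scaling_efficiency(base_clk, oc_clk, base_score, oc_score):
    clock_gain = pct_gain(base_clk, oc_clk)
    score_gain = pct_gain(base_score, oc_score)
    return clock_gain, score_gain, score_gain / clock_gain

clk, score, eff = scaling_efficiency(1050, 1135, 10000, 10500)
print(f"clock +{clk:.1f}%, score +{score:.1f}%, efficiency {eff:.2f}")
# With these placeholders: clock +8.1%, score +5.0%, efficiency 0.62
```

Efficiency near 1.0 means the score tracked the clock; a negative value would be the outright negative scaling discussed above.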


----------



## xkm1948

Would it be possible to edit the Crimson driver directly to override the voltage limit? Modding software should be easier than modding the BIOS, right?


----------



## gupsterg

@Alastair

Look at the combined test result: it is 8% scaling. That is where I note negative scaling or not. I have 555 3DM FS results from all my Fiji / Hawaii cards, so I think I've seen enough of my own results to know how to interpret them. Please present your 3DM FS results.



I would say the only explanation is the CAC records in release 2 differing from stock Fury Tri-X, creating the difference even with manual VID. Anyhow, when you test release 3 we should know, as that will be wholly the Fury Tri-X ROM with only the edits needed to knock out monitoring.
Quote:


> Originally Posted by *Performer81*
> 
> Does HWiNFO64 read out VRM temps correctly? I doubt it. It tells me that my VRM temps are always about the same as my GPU temp, mostly 1-2 degrees higher.
> I have a temp limit of 65 degrees and my VRMs never went above 66. Shouldn't they be much higher?
> (XFX Fury TD)








I took readings at various intervals, and also pushed down on the probe to make sure contact with the PCB was good. HWiNFO was set to a 500ms polling interval, with a +/-1°C discrepancy between HWiNFO and the temp probe. The GPU had a 1.2V stock VID, with the OC set via MSI AB at ~+50mV and 1140MHz. So IMO @Mumak's SW is sweet.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> Would it be possible to edit the Crimson driver directly to override the voltage limit? Modding software should be easier than modding the BIOS, right?


Not AFAIK.


----------



## Mumak

Quote:


> Originally Posted by *gupsterg*
> 
> I took readings at various intervals, and also pushed down on the probe to make sure contact with the PCB was good. HWiNFO was set to a 500ms polling interval, with a +/-1°C discrepancy between HWiNFO and the temp probe. The GPU had a 1.2V stock VID, with the OC set via MSI AB at ~+50mV and 1140MHz. So IMO @Mumak's SW is sweet.


Thanks


----------



## Pedros

Hey guys,

So here's my dilemma right now: I know that new cards are coming from AMD, but meanwhile I just bought a FreeSync monitor and I have a GTX 980.

So in the meantime, I'm thinking of selling the GTX 980 and buying a Sapphire R9 Fury Nitro OC.

My question is: how is Fury performance with the latest drivers at 1440p?


----------



## Thoth420

Quote:


> Originally Posted by *Pedros*
> 
> Hey guys,
> 
> So here's my dilemma right now: I know that new cards are coming from AMD, but meanwhile I just bought a FreeSync monitor and I have a GTX 980.
> 
> So in the meantime, I'm thinking of selling the GTX 980 and buying a Sapphire R9 Fury Nitro OC.
> 
> My question is: how is Fury performance with the latest drivers at 1440p?


I wouldn't buy any of the AMD cards right now to replace a 980 just for the FS feature. Wait on Vega; it will be well worth it.


----------



## LazarusIV

Quote:


> Originally Posted by *Pedros*
> 
> Hey guys,
> 
> So here's my dilemma right now: I know that new cards are coming from AMD, but meanwhile I just bought a FreeSync monitor and I have a GTX 980.
> 
> So in the meantime, I'm thinking of selling the GTX 980 and buying a Sapphire R9 Fury Nitro OC.
> 
> My question is: how is Fury performance with the latest drivers at 1440p?


Quote:


> Originally Posted by *Thoth420*
> 
> I wouldn't buy any of the AMD cards right now to replace a 980 just over the FS feature. Wait on Vega it will be well worth it.


I dunno if I agree with Thoth420. If you've already got the FreeSync monitor then I'd say sell your 980 if you can get a good price for it and grab the Sapphire R9 Fury NITRO OC+ while they have a good price. Eyes wide open though, you probably won't be able to sell the Fury for what you bought it for once Vega comes out... Upside is that Vega seems like it won't really be out until late 1H 2017. I'm guessing we'll see a June release. That's pure speculation, I have no special insight.


----------



## neurotix

Quote:


> Originally Posted by *LazarusIV*
> 
> I dunno if I agree with Thoth420. If you've already got the FreeSync monitor then I'd say sell your 980 if you can get a good price for it and grab the Sapphire R9 Fury NITRO OC+ while they have a good price. Eyes wide open though, you probably won't be able to sell the Fury for what you bought it for once Vega comes out... Upside is that Vega seems like it won't really be out until late 1H 2017. I'm guessing we'll see a June release. That's pure speculation, I have no special insight.


I paid $300 each for my Fury Nitros Oct 6th 2016. A friend of mine on my old folding team bought one that was open box on Ebay for $200 just recently. If you can find one like that used, then it's hard to beat price/performance wise. I'm happy with my purchase. But you have to understand, once Vega comes out you'll probably be lucky to get $100 for one of these. If Pedros is okay with that, and plans on holding the card for a while, then go for it.


----------



## LazarusIV

Quote:


> Originally Posted by *neurotix*
> 
> I paid $300 each for my Fury Nitros Oct 6th 2016. A friend of mine on my old folding team bought one that was open box on Ebay for $200 just recently. If you can find one like that used, then it's hard to beat price/performance wise. I'm happy with my purchase. But you have to understand, once Vega comes out you'll probably be lucky to get $100 for one of these. If Pedros is okay with that, and plans on holding the card for a while, then go for it.


Quite true. Personally, if I already had the monitor, I'd take the plunge. I got the Sapphire R9 Fury NITRO OC+ in my sig for $250 brand new. A pretty sick deal considering its beastly performance.


----------



## neurotix

Quote:


> Originally Posted by *LazarusIV*
> 
> Quite true. To me personally, if I already had the monitor, then I'd take the plunge. I got the Sapphire R9 Fury NITRO OC+ in my sig for $250 brand new. Pretty sick deal considering its beastial performance.


Agreed.

Really happy with my pair. They run everything I have on Ultra at 60 fps in Eyefinity and run very cool.


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> I dunno if I agree with Thoth420. If you've already got the FreeSync monitor then I'd say sell your 980 if you can get a good price for it and grab the Sapphire R9 Fury NITRO OC+ while they have a good price. Eyes wide open though, you probably won't be able to sell the Fury for what you bought it for once Vega comes out... Upside is that Vega seems like it won't really be out until late 1H 2017. I'm guessing we'll see a June release. That's pure speculation, I have no special insight.


I guess it comes down to how much you can get for the 980. With the 980 Ti and the 1080 out, and the 1080 Ti and Vega on the horizon, I would only do it if the sale of the 980 makes the new GPU almost free.

I also expected to see a Vega release before June, but that may have been wishful thinking. I am busy, so waiting months isn't a huge deal, but if you game daily and want an upgrade now, go for it. I am not doing a thing until Vega, the 1080 Ti, and the Ryzen CPUs are out, and I am far from impressed with the performance of my current setup. Skylake is overpriced junk (I miss my 2600K ES that clocked to 5.0 on air with far fewer errata) and the Fury X was always just a placeholder...


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> I guess it comes down to how much you can get for the 980. With the 980 Ti and the 1080 out, and the 1080 Ti and Vega on the horizon, I would only do it if the sale of the 980 makes the new GPU almost free.


True true, as long as you can get good value for the 980 I'd say it's worth it for the ~ 7 months worth of FreeSync use until Vega is available.


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> True true, as long as you can get good value for the 980 I'd say it's worth it for the ~ 7 months worth of FreeSync use until Vega is available.


Fair enough. I don't find VRR worth spending extra money on; thus G-Sync is a ripoff, and FS is only good if you don't have to spend money to utilize it. I have tons of experience using both.


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> Fair enough. I don't find VRR worth spending extra money on; thus G-Sync is a ripoff, and FS is only good if you don't have to spend money to utilize it. I have tons of experience using both.


I think people tend to have very different sensitivities to VRR and whatnot, just like tearing. I find myself more sensitive to higher resolutions than to higher refresh rates. I haven't used a VRR monitor in person yet, so I have no input in that regard. One nice thing is that FreeSync seems to add hardly any additional cost to a monitor, as opposed to G-Sync, which adds $150-$200 right off the bat.

Regardless, he already has that monitor. If he were thinking of buying the monitor and getting the Fury for it I'd say no, but he's already got the monitor and I figure he can get a decent price for his 980 still.


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> I think people tend to have much different sensitivities to VRR and whatnot, just like tearing. I find myself noticing higher resolutions more than higher refresh rates. I haven't used a VRR monitor in person yet though so I have no input in that regard. One nice thing though is that it seems FreeSync adds hardly, if any, additional cost to a monitor as opposed to G-Sync adding $150-$200 right off the bat.
> 
> Regardless, he already has that monitor. If he were thinking of buying the monitor and getting the Fury for it I'd say no, but he's already got the monitor and I figure he can get a decent price for his 980 still.


Yes, the free part is what makes FS so much better, because I am super sensitive to judder, stutter, micro-stutter, AND screen tearing; I cannot use a 60Hz monitor for online multiplayer or for any game with fast motion (generally in the first-person viewport) at all. That said, both forms of VRR suffer diminishing returns on 144Hz-or-greater panels, because those handle most of these problems already with a fast pixel clock, fast response rate, and the high refresh. I found the few 4K panels that had VRR much more useful than the vast majority of the 1440p 144Hz G-Sync and FreeSync panels I have played with.

1440p doesn't seem to be my cup of tea either; I am going back to 1080p 144Hz for now, and probably a 4K 60Hz monitor with FS when Vega comes out, for the eye-candy games. The OP should take that into account: I play more on the competitive end and design my system to be optimal for that. I do enjoy playing single-player games with a controller and everything cranked as high as possible, but at the end of the day I still prioritize the games I play competitively over the eye-candy ones.


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> Yes the free part is what makes FS so much better because I am super sensitive to judder, stutter, micro-stutter, AND screen tearing and I cannot use a 60hz monitor for online multiplayer or in any game with fast motion(generally in the first person viewport) at all. That said both forms of VRR suffer diminishing returns on 144hz or greater panels because they handle most all of these problems already with a fast pixel clock, fast response rate and the high refresh. I found the few 4K panels that had VRR to be much more useful than the vast amount(almost all) 1440 144hz G and Free Sync panels I have played with.
> 
> 1440 seems to be not my cup of tea either and I am going back to 1080 144hz for now and 4K monitor at 60hz probably with FS when Vega comes out for the eye candy games. That said the OP should take that into account. I play more on the competitive end and design my system to be optimal for that. I do enjoy playing single player games with a controller and everything cranked as high as possible but at the end of the day I still focus on the games on I play competitive over the eye candy ones.


I think we're in similar boats; I don't find myself really caring about super-high refresh rates like 120Hz or 144Hz. I much prefer high resolution and good, rich color. For me, a 1440p or 2160p IPS/VA monitor at around 100Hz is great. I'm reluctant to spend money on a VRR monitor since I haven't seen one in person yet, so I'm not sure I'd be sensitive enough for it to matter much... I really want to see one in person.


----------



## gupsterg

Quote:


> Originally Posted by *Pedros*
> 
> Hey guys,
> 
> So here's my dilemma right now: I know that new cards are coming from AMD, but meanwhile I just bought a FreeSync monitor and I have a GTX 980.
> 
> So in the meantime, I'm thinking of selling the GTX 980 and buying a Sapphire R9 Fury Nitro OC.
> 
> My question is: how is Fury performance with the latest drivers at 1440p?


This TPU review used fairly recent drivers (v16.10.1 WHQL); see its relative performance chart. So IMO you'll see a performance gain going from the 980 to a Fury, plus you'll be able to use FreeSync.

Fiji cards stomp 1080p IMO; 1440p is their sweet spot, and I do think you need FreeSync for the best experience at 1440p. For example, Lords of the Fallen at max in-game settings gives me a minimum of 46 FPS at 1440p, and with FS on the game feels more fluid than with it off, even with the monitor set to 144Hz and V-Sync off. Another good example of FreeSync I experienced was Dead Space: with V-Sync this game caps at 30 FPS, and without V-Sync I was getting very high FPS, but even when I limited FPS slightly below the screen's refresh rate I had an awful tear line rendered on screen. That line went away with FreeSync.

I'm in the UK, so my context is based on Fury buying prices over the past 3 months; they have been available at £250 on promo. Taking the 4 most recent eBay selling prices for a GTX 980, the average is £210 (before fees/shipping deductions). So IMO, for the small performance gain and the ability to use FreeSync at 1440p, the £60 outlay to go Fiji would be worth it for a better gaming experience.

I reckon Vega is still 6 months away and will also be near double the current promo price of Fury cards, so you will still have a good performance-per-£ card. I don't think you will lose a great deal when you come to sell Fiji if you go Vega in 6 months, as you would have bought it at a reduced price. This month I sold a Tri-X 290 for £150; in March 2016 I sold the same model for £200, which was also what another Vapor-X, an Asus DCUII, and a Tri-X fetched when I sold them in mid/late 2015. So in that context I'd think you'd lose at most £50 selling Fiji in 6 months' time.


----------



## catbebi

I recently upgraded my GPU to a r9 FURY and am in the market for a 21:9 ultrawide monitor w/ freesync.

Anyone have any experience running 3440x1440 or 2560x1080 resolutions? If so, which would you recommend for gaming?


----------



## Alastair

Quote:


> Originally Posted by *catbebi*
> 
> I recently upgraded my GPU to a r9 FURY and am in the market for a 21:9 ultrawide monitor w/ freesync.
> 
> Anyone have any experience running 3440x1440 or 2560x1080 resolutions? If so, which would you recommend for gaming?


A single Fury will manage 3440x1440, I am sure. It might struggle with some very demanding titles, but I think for the most part it should be fine. Its performance at normal 1440p is fine, so I reckon it will handle 21:9 1440p OK. I'm also jumping on the 21:9 wagon, and I'm looking forward to seeing how my pair of Furys tackles it.


----------



## Cyants

Quote:


> Originally Posted by *catbebi*
> 
> I recently upgraded my GPU to a r9 FURY and am in the market for a 21:9 ultrawide monitor w/ freesync.
> 
> Anyone have any experience running 3440x1440 or 2560x1080 resolutions? If so, which would you recommend for gaming?


I have a Fury X with a 2560x1080 34-inch from LG (34UM67) and I love it. I wanted the 3440x1440, but the price difference was a bummer for me last year. Sometime next year I'll get a Ryzen/Vega combo with the 3840x1600 38UC99-W http://www.lg.com/us/monitors/lg-38UC99-W-ultrawide-monitor but that's still a couple of months away at least...

If you like the 2560x1080 price point, get the 34UC79G-B; it didn't exist when I bought mine. It has 144Hz with FreeSync capability. http://www.lg.com/us/monitors/lg-34UC79G-B-ultrawide-monitor


----------



## JonDuma

Quote:


> Originally Posted by *gupsterg*
> 
> 70°C GPU on Fury Tri-X stock ROM/fan profile is about what I have in my recorded data when I owned one. So not an issue, my room temps ~22-24°C from memory.
> My Fury X which uses same PCB as Fury Tri-X does the same when no driver is installed. Also when mobo is in boot stages upto OS loading and if OS has AMD driver the GPUTach LEDs will only have 1 on when not under load.
> 
> 
> 
> Spoiler: AMD Fury X (and Tri-X as same PCB) GPUTach LED
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Googling VIDEO_TDR_FAILURE brings up not good reading material, ie some are RMA'ing card to resolve. As some are resolving with driver/OS reinstall I would be tempted to hook up a spare drive and just doing a fresh OS install to see how it goes.


Thanks, I will try to set up a Win7 PC to check this Fury first, since I am building an HTPC.


----------



## Pedros

Hey guys,
thank you so much for your replies.

Basically I'll sell the 980 for a similar price; I was thinking of going with the Fury and then getting Vega next year when it's out. After that I'd use the Fury in my secondary machine.

As for the monitor, it's pretty recent, so the math for the month will include both GPU and monitor. (I got the Omen 32"... I was going for an ultrawide, but after trying one out for a couple of hours I couldn't get used to it at the sitting distance I have at my desk.)

As for getting Vega next year: having the Fury (from what I understand, as it stands now, the Fury gets better results than the 980, am I wrong?) would give me more time to let the Vega products come out and settle, and then I could just choose the best of the bunch without any "pressure" to get a GPU on day one.

So, my question is about Fury performance and drivers: are the drivers mature enough now that performance is superior to the GTX 980, or is it pretty much the same?

Thank you all once again.


----------



## Charcharo

Quote:


> Originally Posted by *Pedros*
> 
> Hey guys,
> thank you so much for your replies.
> 
> Basically i'll sell the 980 for a similar price and i was thinking on going with the Fury and next year get the Vega when it's out. After that i would use the fury for my secondary machine.
> 
> As for the monitor, it's pretty recent so the math for the month will include both GPU and monitor
> 
> 
> 
> 
> 
> 
> 
> ( Got the Omen 32" ... i was going for ultrawide but after trying out for a couple of hours i couldn't get use to the ultrawide monitor from the sitting distance i have on my desk )
> 
> As for getting the Vega next year, having the Fury ( from what i understand, how it stands now, the Fury gets better results than the 980, am i wrong ? ) would give more time to let the Vega products come out and settle and then i could just choose the best of the bunch without having any "pressure" to get a GPU from day 1.
> 
> So, my question is about Fury performance and drivers: is Fury much more mature now that performance is superior to GTX980 or is pretty much the same?
> 
> Again, thank you all once again.


Here is my R9 390 to R9 Fury upgrade:

http://www.overclock.net/t/1547314/official-amd-r9-radeon-fury-nano-x-pro-duo-fiji-owners-club/10220#post_25736073

Note that my R9 390 was OCed +10MHz (1%) out of the box, and that is how I tested.

As you can see here https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Amp_Extreme/29.html it is a similar upgrade to your 980-to-Fury step. To be frank, for me it is small, but FreeSync is great. And the Nitro is... one hell of a cooler/PCB; holy moly is it good.

*Also, it is funny that we now compare the R9 390 to the 980 and not the 970... lol.


----------



## catbebi

Quote:


> Originally Posted by *Alastair*
> 
> A single Fury will manage 3440x1440 I am sure. might struggle with some of the very demanding titles. but I think for the most part it should be fine. It's performance at normal 1440P is fine so I reckon it will handle 21:9 1440p ok. I'm also jumping onto the 21:9 wagon. And I'm looking forward to how my pair of Fury's tackles it.


Yeah, according to reviews 1440p isn't too bad, but it's those demanding titles I'm worried about, plus games releasing in 2017/2018.

And let us know how that setup works out for you.
Quote:


> Originally Posted by *Cyants*
> 
> I have a FuryX with a 2560x1080 34 inch from LG (34UM67) and I love it, I wanted the 3440x1440 but the price difference was a bummer for me last year. Sometime next year I'll get a Ryzen/Vega combo with a 3840 x 1600 38UC99-W http://www.lg.com/us/monitors/lg-38UC99-W-ultrawide-monitor but thats still a couple of months away at least...
> 
> If you like the 2560x1080 price point get the 34UC79G-B, it didn't exist when i bought mine. This one has 144hz with freesync capability. http://www.lg.com/us/monitors/lg-34UC79G-B-ultrawide-monitor


The 34UM67 is one of the displays I'm strongly considering, and the 34UC79 is also on my radar. However, I'm not sure I really need a 144Hz refresh rate, and at that price point I feel I should just get a 3440x1440 display instead.

BTW, that 38" monitor is absolutely sick! I wish I had that kind of budget.


----------



## neurotix

I really wanted an ultrawide, but I can't justify spending close to $800 on a monitor. Or rather, I simply don't have the money if I'm also going to upgrade my CPU, motherboard, memory, and GPUs in the future.

I did, however, go from the 3x ASUS V238H to 3x VC239H slim bezel IPS displays and I'm really happy. And it only cost me around $300. The way I have these set up, the bezel area between each screen is only slightly under half an inch wide. That's a lot less than the bezel on the V238H, which was huge. These monitors being IPS, they also look brilliant, the colors between each display match much better, and the energy consumption with them all on is about what just one of the V238H took to run.

I have no idea what the consensus on my new monitors is, or whether they're any good; I'm not really knowledgeable about monitors. But I love them, and the setup was affordable for me. Really happy with them so far.

EDIT: Apparently the first two reviews I found online of these monitors rate them pretty highly.


----------



## ManofGod1000

Anyone else have a 2x Sapphire Fury setup? I have a Nitro+ and a Tri-X, and the Nitro+ fan speeds seem to go higher than I think they should. Yes, these cards can get hot, but I am using a Cooler Master MasterCase 5 and the fans on the Nitro+ will reach 100% when running Crysis 3. It is not incredibly annoying, since the fans are not rackety, just distracting. Also, it seems to take forever for the fans to come back down to 22% once the game is closed.

Any ideas, or am I just going to have to deal with it? I am using the 16.11.5 drivers because with 16.12.1 and .2 the Nitro+ fan would get stuck at 100% and not come back down until I rebooted the computer.


----------



## Skyl3r

Quote:


> Originally Posted by *neurotix*
> 
> I agree, I basically don't want the hassle and extra cost.
> 
> I can understand if you're into it because it's your hobby (e.g. you are an enthusiast of PC water cooling parts), but it seems like some people like the water cooling parts more than the actual PC components. I personally am into graphics cards, but at the same time there's no way I can afford a quad setup, or 2 GTX 1080s or whatever. Let alone the money to water cool them. Besides, I have a huge retro gaming hobby, I have a wife who likes to go outside, and so forth.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I won the lottery I'd build a fully watercooled quad Crossfire setup in a heartbeat, or even a dual card watercooled setup. Because then I could still have money left to upgrade the cards, as well as buy new blocks and so on.


I actually watercooled my whole system for the ~40°C temperature improvement on my CPU and the 10-20°C improvements on my Fury Xs. I like overclocking and wanted to raise the bar just a little. You'd be surprised how little gain the $600-800 you can spend on a custom loop really nets you. But... oh well, it looks cool.


----------



## neurotix

Quote:


> Originally Posted by *ManofGod1000*
> 
> Anyone else have a 2 x Sapphire Fury setup? I have Nitro + and a Tri X and the Nitro + fan speeds seem to go higher than I think they should. Yes, these cards can get hot but, I am using a Cooler Master Mastercase 5 and the fans on the Nitro + will reach 100% when running Crysis 3. It is not incredibly annoying since the fans are not rackety, just distracting. Also, it seems to take forever for the fans to come back down to 22% once the game is closed.
> 
> Any ideas or am I just going to have to deal with it? I am using the 16.11.5 drivers because, when I used the 16.12.1 and .2, the Nitro + fan would get stuck at 100% and not come back down until I rebooted the computer.


I have 2 Nitro Furys, but I have an enormous Corsair 780T case that is fairly open-air. I don't even need a side fan blowing on the cards to get good temps. Generally my top card doesn't pass 60°C with 21°C ambients. Crysis 3 and some benchmarks will push it to 65°C, but in 99% of my games it doesn't even reach 60°C; 58°C seems most common. So it is probably just Crysis 3 still being quite demanding, because I have the same issue.

Even so, they run Crysis 3 beautifully on Ultra with every setting maxed out, at 60 fps at 5760x1080. No crashes. But my top card runs fairly hot.

What are your ambients?

The fan-speed problem with 16.12.1 is probably due to using the latest version of TriXX (6.x), or possibly Afterburner. I use the older TriXX 5.2.1 and fan control works with it. Don't use the latest TriXX with these cards; it doesn't play well with them, and I think it's only meant for Polaris GPUs.
Quote:


> Originally Posted by *Skyl3r*
> 
> I actually watercooled my whole system for the ~40°C temperature improvement on my CPU and the 10-20°C improvements on my Fury Xs. I like overclocking and wanted to raise the bar just a little. You'd be surprised how little gain the $600-800 you can spend on a custom loop really nets you. But... oh well, it looks cool.


I agree it looks good, but for all the reasons I listed, especially the maintenance and extra cost... I'm not rich by any means, even though I have a decent setup. I can't afford $500-$600 GPUs and then another $150 per GPU for a block. It's just not happening.

Though I have the utmost respect for guys who are into it and make incredible looking rigs that are totally watercooled. It's just not for me.


----------



## JunkaDK

Some love for the Fury X in my latest build.

Check it out here: https://pcpartpicker.com/b/8J7WGX


----------



## Arizonian

Quote:


> Originally Posted by *JunkaDK*
> 
> Some love for the Fury X in my latest build.
> 
> Check it out here: https://pcpartpicker.com/b/8J7WGX


Very nice work.


----------



## Alastair

@gupsterg

Here are some stock-BIOS results: 3840 shaders, at 1000, 1100, and then 1150MHz (+50mV). I only have normal Fire Strike and Time Spy. Overall, my Time Spy graphics score went up around 13.5% from 1000 to 1150. Pretty good scaling for a 15% OC.

Fire Strike was less receptive, though, probably because of 1080p. Graphics test 2 saw the largest gain, at 12%.

Timespy.
http://www.3dmark.com/compare/spy/995854/spy/995764
http://www.3dmark.com/compare/spy/995764/spy/995797/spy/995854

Firestrike
http://www.3dmark.com/compare/fs/11298043/fs/11297966/fs/11297897
http://www.3dmark.com/compare/fs/11298043/fs/11297897


----------



## NightAntilli

Quote:


> Originally Posted by *catbebi*
> 
> I recently upgraded my GPU to a r9 FURY and am in the market for a 21:9 ultrawide monitor w/ freesync.
> 
> Anyone have any experience running 3440x1440 or 2560x1080 resolutions? If so, which would you recommend for gaming?


I use my Fury Nitro at 2560x1080. I think the Fury is slightly overpowered for this resolution in the majority of games, while 3440x1440 is a bit too heavy for it. I'd stick to 2560x1080 if you want longevity from the card; if you're upgrading to Vega soon, for example, 3440x1440 is fine for the time being.

Here's a video of my gameplay with the Fury Nitro at 2560x1080

http://www.overclock.net/t/1541528/official-21-9-owners-appreciation-thread-post-anything-related-to-21-9/900_20#post_25719812


----------



## Alastair

Anyone have any ideas on how to get to the BIOS switch with EK blocks on?


----------



## ressonantia

So, I feel like a complete noob, but this is interesting to me anyway. It seems that modifying your per-DPM VID in WattMan changes it at a more fundamental level, as shown in AIDA64. Basically it's like having a ROM mod, but without the reflashing.

Before:


After:


----------



## gupsterg

@Alastair

I see similar in TS when comparing a stock Fury X result from the 3DM DB against my OC ROM; I will do a stock vs OC run on my own rig ASAP.

Yeah, in FS GS1 & 2 I do not get scaling as an exact % of OC either. They did scale better as a % of OC on Hawaii, but the results I'm looking at also had a VRAM clock difference; I can't do more on Hawaii as I sold it recently. Like on Fiji, the combined test was the best indicator for me on Hawaii. This ROG thread has concise info on the 3DM tests and a link to the tech guide @ Futuremark.

@ressonantia

Yes, WattMan VID control is exactly like modding it in the ROM. What you can do is take a registers dump on the stock ROM, then set an OC via MSI AB and take another registers dump; you will see why WattMan/ROM VID control is better. Note: the more you OC above stock clock in MSI AB & TriXX, the bigger the change you will see in the VID per DPM section of the registers.

I'm now gonna roll with my undervolt ROM with some other mods as my basic setup and then use WattMan to set per game OC profiles. As I do also have a lot of old titles I still enjoy, for example Dead Space using FreeSync/FRTC I get solid 90 FPS with very low GPU clocks = low voltage/power draw.

This HML log is on my OC ROM, so the lower DPM states had stock VID.

DS1_MG279Q.zip 38k .zip file


Currently my undervolt ROM only has DPM 3 to 7 modified; once I finish modding DPM 0 to 2 I should see a drop in VID = less power usage for those scenarios. Gotta get one of those wall plug power meters; I'd think it must be super perf per watt in situations like that.


----------



## battleaxe

Quote:


> Originally Posted by *Alastair*
> 
> anyone have any ideas how you can get to your bios switch when you have EK blocks on?


Toothpick?


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Currently my undervolt ROM only has DPM 3 to 7 modified; once I finish modding DPM 0 to 2 I should see a drop in VID = less power usage for those scenarios. Gotta get one of those wall plug power meters; I'd think it must be super perf per watt in situations like that.


Just a heads up: I found that modifying DPM0 can lead to instability with my Nano, and DPM1 reaps very little improvement.

I wonder how long until this is implemented in TVs.

HDMI 2.1 spec adds 8K/10K video, dynamic HDR and variable refresh

http://www.eurogamer.net/articles/digitalfoundry-2017-hdmi-2-1-specifications-revealed

I would love HDR and VRR on a 4K TV monitor.


----------



## Alastair

Anyone know how I can display both my cards' monitoring data at the same time in WattMan? Or make a window outside of Radeon Settings so I can minimize Radeon Settings?


----------



## Johan45

How about two GPU-Z windows?


----------



## Alastair

Quote:


> Originally Posted by *Johan45*
> 
> How about two GPUz windows


That is what I am using. I just wanted to know if I could do it with wattman.


----------



## Johan45

No idea, never used the WattMan SW.


----------



## xkm1948

After going through the Vega architecture review over at TPU I am 100% sure I will be getting Vega. The Fury X will retire to my HTPC in the living room.


----------



## Flamingo

I really want to get it too, but sadly I will be going to Nvidia because of better 3D rendering support (iray) :[

It will probably be my first Nvidia card lol. Let's hope the competition is good, so that it drives prices down.


----------



## gupsterg

Waiting in anticipation to try my first Fury Nitro.


----------



## Alastair

Guys how long is the warranty on the Sapphire Tri-X Fury?


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Guys how long is the warranty on the Sapphire Tri-X Fury?


Should be 2 years. 2 years for all Sapphire cards. I'm like the biggest Sapphire fanboy on these forums so I would know.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys how long is the warranty on the Sapphire Tri-X Fury?
> 
> 
> 
> Should be 2 years. 2 years for all Sapphire cards. I'm like the biggest Sapphire fanboy on these forums so I would know.

One of my old Furys (I sold one) has gone faulty. The new owner contacted me: it has started to artifact and such. Amazon doesn't want anything to do with it because it is out of the 30-day return window. So I assume I go through Sapphire now? I still have the original receipt and I wanna help the guy out. It was still a fair wad of cash when I sold it.


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> One of my old Furys (I sold one) has gone faulty. The new owner contacted me: it has started to artifact and such. Amazon doesn't want anything to do with it because it is out of the 30-day return window. So I assume I go through Sapphire now? I still have the original receipt and I wanna help the guy out. It was still a fair wad of cash when I sold it.


Try Sapphire RMA yes.

They are called Althon Micro (assuming they haven't changed) and they are very good guys if you're nice to them, but everyone complains about them and how long it takes to get cards back. I'm not sure if you'd be dealing with them though, since you're from South Africa.

I'll tell you my story,

I had two R9 290 Tri-X in Crossfire, and under load when they heated up I would get this awful fan grinding noise. It almost sounded like coil whine but it wasn't; it was the fans. Other people on the net had similar problems with the fans on that card (look it up if interested... YouTube especially... it was common).

Anyway, I sent one of them in to Sapphire, waited about 2 weeks, and they sent me back an R9 290 Vapor-X... a card upgrade, basically.

But I wanted a matching pair, so I RMA'ed my second Tri-X card to them, along with a nice note thanking them for fixing my card, and a bag of candy. (See: Swedish fish theory.) Voila, within a week I had another R9 290 Vapor-X... a matching pair.

Since those cards came out rather late in the life of the 290, not many had them, whereas many peeps had the Tri-X cards. Anyway, even after the 970 and 980s came out, even after the Fury was out, I was able to turn around and sell the 290 Vapor-X cards for about $250 each.

So, anyway, if you're nice to the RMA people in your tickets, say things like thank you, and maybe bribe them a little *cough* you might get yourself something really nice.

Oh, and even though Sapphire only has a 2-year warranty... I buy them because 1) never volt-locked: I've had like 15 Sapphire cards and none have been volt-locked; 2) guaranteed Hynix on high-end cards; 3) usually the best cooler of the generation when compared to coolers from ASUS, Gigabyte, MSI et al.

Anyway good luck with your RMA brother. Hope this helps you out.


----------



## gupsterg

@Alastair

I've always regarded Sapphire cards as good, but more and more I'm inclined to think their warranty may not be as good as other AIBs'. I've seen some nightmares on the Sapphire forum, as the warranty goes through the reseller in most cases.

Check this post by forum mod, above it my post has link to Sapphire support ticket site.


----------



## diggiddi

Quick question: which non-X Fury generally overclocks the highest?


----------



## xkm1948

Watching this, it looks like Vega will not have an AIO. At least the engineering sample running DOOM at 4K Ultra does not even need an AIO to cool itself down. My body is ready for VEGA.


----------



## Thoth420

Quote:


> Originally Posted by *xkm1948*
> 
> Watching this, looks like Vega will not have AIO. At least the engineering sample running DOOM at 4K Ultra does not even need AIO to cool itself down. My body is ready for VEGA,


I was wondering about that, thanks.


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Quik ques Which Fury non X generally overclocks the highest?


It is either going to be the Asus Strix or the Nitro. I have a feeling the Strix, after seeing what Xtreme Addict did to one on LN2. It just doesn't have a dual BIOS.
Quote:


> Originally Posted by *xkm1948*
> 
> Watching this, looks like Vega will not have AIO. At least the engineering sample running DOOM at 4K Ultra does not even need AIO to cool itself down. My body is ready for VEGA,


My wallet isn't. And as much as I love the Furys, these Vegas seem really impressive, especially with 8GB VRAM. I really want my Furys to last at least 3 more years though.


----------



## Arizonian

Quote:


> Originally Posted by *xkm1948*
> 
> Watching this, looks like Vega will not have AIO. At least the engineering sample running DOOM at 4K Ultra does not even need AIO to cool itself down. My body is ready for VEGA,


My Nitro Fury has been great performance and value, I can't deny. A nice run without any issues, including timely game-day-ready driver updates. I'll be passing the Fury to my second rig and getting that off the Nvidia ecosystem too. I'm going to pair up my 970s in my third rig, which I just finished upgrading this holiday, now that it can actually do so without bottlenecking.

After watching the vega doom demo at 4K I'm kinda feeling....


----------



## Performer81

Boring. Doom is the No.1 AMD game. Even the Fury was on par with the 1080 under Vulkan.


----------



## neurotix

My thoughts...

I think Vega will be on par with a 1080 but probably not Titan XP or 1080ti.

What really determines it in my mind is how many ROPs it has. If it has 64 it might suck or at least perform slightly under a 1080. This may be the case because supposedly it's simply a die shrink and rework of the Fury X (if it has 4096SP this is definitely the case). If it has more than 64 ROPs and more texture units than a Fury X, it will probably land squarely between the 1080 and Titan XP. Best case scenario, it somehow has 128 ROPs, in which case it will blow Titan XP out of the water, but who knows about Volta.

I wanted the Fury X to have more ROPs when it came out and that's why I avoided buying it for so long. But the price was right and I'm happy with my cards.

Come on AMD, we've had 64 ROPs since the 290X in the end of 2013. Nvidia's best cards have 96 now. There's no excuse for a high end, 2017 card to have 64 ROPs.
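For context on why the ROP count matters here: peak pixel fillrate is just ROPs times core clock. A quick comparison, using ballpark reference/boost clocks for illustration only:

```python
# Peak pixel fillrate = ROPs x core clock. Clocks below are ballpark
# reference/boost clocks for the cards discussed, for illustration only.
def fillrate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000  # Gpixels/s

fury_x = fillrate_gpix(64, 1050)    # Fury X: 64 ROPs
gtx1080 = fillrate_gpix(64, 1733)   # GTX 1080: 64 ROPs, much higher clock
titan_xp = fillrate_gpix(96, 1531)  # Titan X Pascal: 96 ROPs
print(f"{fury_x:.1f} vs {gtx1080:.1f} vs {titan_xp:.1f} Gpix/s")
```

With the same 64 ROPs, Pascal's clock advantage alone buys a large fillrate lead, which is the gap the post is worried about.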

That's just


----------



## LazarusIV

Quote:


> Originally Posted by *neurotix*
> 
> My thoughts...
> 
> I think Vega will be on par with a 1080 but probably not Titan XP or 1080ti.
> 
> What really determines it in my mind is how many ROPs it has. If it has 64 it might suck or at least perform slightly under a 1080. This may be the case because supposedly it's simply a die shrink and rework of the Fury X (if it has 4096SP this is definitely the case). If it has more than 64 ROPs and more texture units than a Fury X, it will probably land squarely between the 1080 and Titan XP. Best case scenario, it somehow has 128 ROPs, in which case it will blow Titan XP out of the water, but who knows about Volta.
> 
> I wanted the Fury X to have more ROPs when it came out and that's why I avoided buying it for so long. But the price was right and I'm happy with my cards.
> 
> Come on AMD, we've had 64 ROPs since the 290X in the end of 2013. Nvidia's best cards have 96 now. There's no excuse for a high end, 2017 card to have 64 ROPs.
> 
> That's just


It is most certainly not a rehash of the Fury architecture... read this, really interesting info on the Vega architecture and memory systems.


----------



## Charcharo

In all honesty, if Vega is 500+mm^2 and cannot compete with Titan X Pascal (a cut-down 471mm^2 chip)... then AMD is in trouble. I know price matters the most in the end, but this is a major technological hurdle. Even a Fury X at 14nm with a less dense design would be smaller than that and hit the 1080 directly.

So here is hoping that what we saw from Vega is either clocked low, actually a smaller Vega, cut down, or running terrible drivers.


----------



## dagget3450

If it's 1080 performance level I will be so bummed. Might even have to go to the dark side, but will probably just be depressed and not get anything.


----------



## diggiddi

Quote:


> Originally Posted by *Alastair*
> 
> it is either going to be the Asus Strix or the Nitro. I have a feeling the Strix after seeing what Xtreme Addict did to one on LN2. It just doesn't have a dual bios.
> My wallet isn't. And as much as I love the Fury's these Vegas seem really impressive especially with 8GB VRAM. I really want my Fury's to last at least 3 more years though.


Thx. Anyone else? On stock cooling, though, is it able to hit 1100 easily?


----------



## u3a6

Quote:


> Originally Posted by *dagget3450*
> 
> If its 1080 performance level i will be so bummed. Might even have to go to the darkside but will probably just be depressed and not get anything.


Honestly, I think Vega will stomp the 1080. From the performance we are already seeing, this far from launch? It will only get better. Furthermore, the card was kept in a very restricted airflow scenario with a blower-style cooler, which is far from ideal. We do not know if that card had all of the CUs active, and the state of the drivers is questionable. The MI25 specs indicate a clock speed of 1525MHz (imagine a Fury X at those clocks), and the pro cards usually have very conservative clocks.
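To put the "Fury X at 1525MHz" thought experiment in numbers: peak FP32 throughput is shaders x 2 FLOPs per clock (FMA) x clock. Whether Vega keeps Fiji's 4096 shaders is speculation in these posts, so treat the second figure as a what-if:

```python
# Peak FP32 throughput = shaders x 2 FLOPs per clock (FMA) x clock.
# 4096 shaders is Fiji's count; whether Vega matches it is speculation
# in the posts above, so the second number is purely a what-if.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

fury_x = tflops(4096, 1050)   # stock Fury X
what_if = tflops(4096, 1525)  # quoted MI25 clock
print(f"{fury_x:.1f} -> {what_if:.1f} TFLOPS")
```

That clock bump alone is worth roughly 45% more raw compute, which is why the MI25 spec fueled the optimism here.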


----------



## bluezone

Happy New Year from AMD and RTG. 1st new driver of the year.

Release notes:

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-16.12.2-Release-Notes.aspx

I cannot seem to find live links, so follow the links in the notes to the various editions.

Cheers

EDIT: big download 496 MB


----------



## ressonantia

Is that just the WHQL version of the drivers that were released at the end of December last year? The driver version is still the same by the looks of it.


----------



## bluezone

Quote:


> Originally Posted by *ressonantia*
> 
> Is that just the WHQL version of the drivers that were released end of December last year? Driver version is still the same by the looks of it


Yes, it is a WHQL release with bug fixes, not a major update. It's in the release notes.


----------



## Simmons572

Hey folks, just completed my build log, so I figured now would be a good time to join the club!

I can't access the signup sheet here at work, so I will fill it out once I get home.


----------



## Alastair

Quote:


> Originally Posted by *Simmons572*
> 
> Hey folks, just completed my build log, so I figured now would be a good time to join the club!
>
> I can't access the signup sheet here at work, so I will fill it out once I get home.


A pretty powerful little rig ya got there.


----------



## neurotix

Quote:


> Originally Posted by *Simmons572*
> 
> Hey folks, just completed my build log, so I figured now would be a good time to join the club!
>
> I can't access the signup sheet here at work, so I will fill it out once I get home.


This thing. I saw your build log. Excellent work man. I love it.

Bet it's really fast too... what resolution do you play at on it?


----------



## ArturoH4L

Recently bought a Sapphire Nitro R9 Fury and I'm having this issue in GTA V (it only happens in GTA V). Any ideas?


----------



## gupsterg

Quote:


> Originally Posted by *diggiddi*
> 
> Quik ques Which Fury non X generally overclocks the highest?


Quote:


> Originally Posted by *Alastair*
> 
> it is either going to be the Asus Strix or the Nitro. I have a feeling the Strix after seeing what Xtreme Addict did to one on LN2.


Quote:


> Originally Posted by *diggiddi*
> 
> Thx Anyone else? On Stock cooling though is it able to hit 1100 easily?


I've only had 1x Fury Tri-X; it only reached 1090MHz with stock VID, which was 1.250V. Adding up to +50mV did not gain me even 1100MHz on it. It did unlock to 3840SP, so when clocked the same as a genuine Fury X, performance was near identical.

Recently got a Fury Nitro; so far I've gained 1085MHz with stock VID, which is 1.250V. So far I've only added +25mV and did not get 1100MHz on it. It will not unlock any SP, like most of the Nitro owners share in the unlock thread. The HBM on the card seems the best I've had, even compared with the 8x Fury X cards I've owned. It will do 545MHz at stock MVDD; 600MHz was nearly stable with +25mV on MVDD, which I have not come close to on any Fury X card with even +100mV on MVDD.

As a comparison, the Fury X which I've considered my best sample has stock VID 1.212V; 1.268V gains me 1145MHz, and 545MHz HBM needs +25mV on MVDD. This OC is 24/7 stable. I can get 1175/545 bench stable, but if I increase voltage to make that OC 24/7 stable I run into the negative-scaling-with-voltage-increase phenomenon.

I reckon up to 1100MHz is average, up to 1150MHz above average, over 1150MHz a super sample.

Personally I don't believe the Nitro/Strix is a better buy than the Tri-X/XFX Triple D.

Tri-X AFAIK is where more people have gained unlocks, and most probably XFX Triple D too; the Nitro is pretty much a no-go on unlocks, while the Strix has been unlockable. The Strix lacking a dual BIOS is something I don't like; it does have the best VRM out there for Fiji cards, but none of the Fiji cards are pants on that front. The Nitro and Strix are wider cards than the Tri-X/XFX Triple D; also, as the latter use the reference PCB design, a Fury X waterblock can be used.


----------



## fat4l

Heey Gup!

What's up mate? How is it going?
You've had 8 Fury Xs already? That is insane, bro. How do you feel about them?
This negative scaling is really crap, I guess. Why is it even happening? I always wondered.
One would think: up the volts, put it under water = win?


----------



## Bojamijams

Just ordered an R9 Fury Nitro. Very excited to go back to AMD. It looks like it won't unlock to a Fury X, but that's OK; the price was worth it. Any tips for a new Fury owner as far as drivers/settings/overclocking? I'm trying to go back through this thread, but 1034 pages is a LOT.


----------



## Alastair

Quote:


> Originally Posted by *fat4l*
> 
> Heey Gup!
>
> What's up mate? How is it going?
> You've had 8 Fury Xs already? That is insane, bro. How do you feel about them?
> This negative scaling is really crap, I guess. Why is it even happening? I always wondered.
> One would think: up the volts, put it under water = win?


Well, water does = win, I will give you that. 1120MHz at stock volts. Fiji LOVES being cold. And you can add voltage UP TO A POINT. I think the break point for negative scaling varies from chip to chip, but it generally starts in the 1250mV region from what I have seen.

Gups and I have been experimenting (My cards are the guinea pigs) by knocking out the monitoring data from the BIOS. So the cards end up with absolutely no idea how hot they are, how many volts they are receiving etc, etc. We are trying to replicate what Xtreme Addict did to his Fury Strix LN2 ROM which seemed to carry on scaling regardless of voltage.


----------



## fat4l

Quote:


> Originally Posted by *Alastair*
> 
> Well water does = win I will give you that. 1120MHz stock volts. Fiji LOVES being cold. And you can add voltage UP TO A POINT. I think the break point for negative scaling varies from chip to chip, but it generally starts at the 1250mv region from what I have seen.
> 
> Gups and I have been experimenting (My cards are the guinea pigs) by knocking out the monitoring data from the BIOS. So the cards end up with absolutely no idea how hot they are, how many volts they are receiving etc, etc. We are trying to replicate what Xtreme Addict did to his Fury Strix LN2 ROM which seemed to carry on scaling regardless of voltage.


Interesting.
Can you tell me more about what he did to that Strix ?
And why is it negatively scaling ?


----------



## Alastair


Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Well water does = win I will give you that. 1120MHz stock volts. Fiji LOVES being cold. And you can add voltage UP TO A POINT. I think the break point for negative scaling varies from chip to chip, but it generally starts at the 1250mv region from what I have seen.
> 
> Gups and I have been experimenting (My cards are the guinea pigs) by knocking out the monitoring data from the BIOS. So the cards end up with absolutely no idea how hot they are, how many volts they are receiving etc, etc. We are trying to replicate what Xtreme Addict did to his Fury Strix LN2 ROM which seemed to carry on scaling regardless of voltage.
> 
> 
> 
> Interesting.
> Can you tell me more about what he did to that Strix ?
> And why is it negatively scaling ?
Click to expand...

You will have to ask Gupsterg for more details. But it basically sounds like he wiped out the BIOS's ability to monitor temps and voltages. (Which also means you won't have a way to monitor from programs like Afterburner and the like.)


----------



## Bojamijams

Quote:


> Originally Posted by *Alastair*
> 
> You will have to ask Gupsterg for more details. But it basically sounds like he wiped out the BIOS's ability to monitor temps and voltages. (Which also means you wont have a way to monitor from programs like afterburner and the likes.)


When you guys are talking about negative scaling, do you mean performance goes down (3dmark score for example) or stability of the clocks goes down?


----------



## gupsterg

@fat4l

No idea why negative scaling happens with voltage increase, mate.

Temps on the Fury X stock AIO are phenomenal compared with a Hawaii aftermarket air cooler; for example my 24/7 OC of 1145/545 will keep the GPU at 50°C. As it's using Advanced fan control mode, in some games the fan is inaudible and in others it can reach 2500 RPM. I removed the 2200 RPM limit in the ROM; the Gentle Typhoon 120mm to me isn't that loud at 2200 RPM. Even the Tri-X air cooler is great (the Nitro is practically the same HSF), easily maintaining 55°C with some fan profile modding. VRM temps, again, are way better on Fury/X than Hawaii.

I really luv the Fury/X card for the "out of box" experience compared with the Hawaii cards I had with non-ref air coolers. Yeah, 8 Fury Xs, really do like the card mate. Recently when I got the Nitro I was like OMG look at the length! The Fury X is so short and sweet. I saw a deal on an EKWB WB (£50) and was tempted to go custom loop, but then I thought the AIO is decent and I'll save the £££ for if I go VEGA!

@Alastair

Sorry for the delay on the new ROM; I've just been busy with some requests in my inbox and becoming a mod on another forum.

@Bojamijams

Yes, you see an FPS drop = lower bench result. For example, 1175MHz with 1.3V will give the same 3DM FS score as 1145MHz with 1.27V; whereas 1175MHz at 1.27V gains over 1145MHz @ 1.27V.
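To put a rough number on how much that negative scaling eats, here's the arithmetic with a made-up baseline score (clocks and voltages are from the post; the score of 14000 is a placeholder):

```python
# Negative voltage scaling as described above: past some point, extra
# voltage erases the benefit of the extra clock. Clocks/voltages are
# from the post; the baseline score of 14000 is a made-up placeholder.
base = 14000              # 3DM FS score at 1145MHz / 1.27V (placeholder)
per_mhz = base / 1145     # naive score per MHz if scaling were linear

expected_1175 = per_mhz * 1175  # what 1175MHz "should" score
observed_1175_at_1p30v = base   # same score as 1145MHz, per the post
lost = expected_1175 - observed_1175_at_1p30v
print(f"~{lost:.0f} points of expected gain erased by the extra voltage")
```

So at that voltage the extra 30MHz is effectively free-running but scoreless, which is why the 1.27V setting is the better tune.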


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> @fat4l
> 
> No idea why negative scaling happens with voltage increase, mate.
>
> Temps on the Fury X stock AIO are phenomenal compared with a Hawaii aftermarket air cooler; for example my 24/7 OC of 1145/545 will keep the GPU at 50°C. As it's using Advanced fan control mode, in some games the fan is inaudible and in others it can reach 2500 RPM. I removed the 2200 RPM limit in the ROM; the Gentle Typhoon 120mm to me isn't that loud at 2200 RPM. Even the Tri-X air cooler is great (the Nitro is practically the same HSF), easily maintaining 55°C with some fan profile modding. VRM temps, again, are way better on Fury/X than Hawaii.
>
> I really luv the Fury/X card for the "out of box" experience compared with the Hawaii cards I had with non-ref air coolers. Yeah, 8 Fury Xs, really do like the card mate. Recently when I got the Nitro I was like OMG look at the length! The Fury X is so short and sweet. I saw a deal on an EKWB WB (£50) and was tempted to go custom loop, but then I thought the AIO is decent and I'll save the £££ for if I go VEGA!
>
> @Alastair
>
> Sorry for the delay on the new ROM; I've just been busy with some requests in my inbox and becoming a mod on another forum.
>
> @Bojamijams
>
> Yes, you see an FPS drop = lower bench result. For example, 1175MHz with 1.3V will give the same 3DM FS score as 1145MHz with 1.27V; whereas 1175MHz at 1.27V gains over 1145MHz @ 1.27V.


I currently have a 360x120mm external rad on a custom loop for my CPU... overkill. If I were to stick at 5GHz, where my loop never gets noticeably above room temp at the rad's output, I wonder if I could add my Fury X into the loop without creating too much more heat for the CPU. Currently, max long-gaming temps on the CPU peak (spike) in the mid 40's on the cores, with average temps in the mid 30's. I'm thinking seriously about getting a waterblock for my GPU and adding it just after the CPU in the loop, in series. Just wonder if it would actually do any good at all... and the cost too.


----------



## diggiddi

Quote:


> Originally Posted by *gupsterg*
> 
> I've only had 1x Fury Tri-X; it only reached 1090MHz with stock VID, which was 1.250V. Adding up to +50mV did not gain me even 1100MHz on it. It did unlock to 3840SP, so when clocked the same as a genuine Fury X, performance was near identical.
>
> Recently got a Fury Nitro; so far I've gained 1085MHz with stock VID, which is 1.250V. So far I've only added +25mV and did not get 1100MHz on it. It will not unlock any SP, like most of the Nitro owners share in the unlock thread. The HBM on the card seems the best I've had, even compared with the 8x Fury X cards I've owned. It will do 545MHz at stock MVDD; 600MHz was nearly stable with +25mV on MVDD, which I have not come close to on any Fury X card with even +100mV on MVDD.
>
> As a comparison, the Fury X which I've considered my best sample has stock VID 1.212V; 1.268V gains me 1145MHz, and 545MHz HBM needs +25mV on MVDD. This OC is 24/7 stable. I can get 1175/545 bench stable, but if I increase voltage to make that OC 24/7 stable I run into the negative-scaling-with-voltage-increase phenomenon.
>
> I reckon up to 1100MHz is average, up to 1150MHz above average, over 1150MHz a super sample.
>
> Personally I don't believe the Nitro/Strix is a better buy than the Tri-X/XFX Triple D.
>
> Tri-X AFAIK is where more people have gained unlocks, and most probably XFX Triple D too; the Nitro is pretty much a no-go on unlocks, while the Strix has been unlockable. The Strix lacking a dual BIOS is something I don't like; it does have the best VRM out there for Fiji cards, but none of the Fiji cards are pants on that front. The Nitro and Strix are wider cards than the Tri-X/XFX Triple D; also, as the latter use the reference PCB design, a Fury X waterblock can be used.


TY, repped up. The Nitro's pricing, though, makes up for all its deficiencies, especially size-wise.


----------



## FlawleZ

Just received my Sapphire Nitro R9 Fury today from Newegg. Slapped it in and ran a couple of quick Firestrike benchmarks on my outdated 3DMark to see where I stand. Landed right at 13,200 at stock, and a modest overclock of 1150 on the core put me at 13,800. I checked and it's not unlockable, unfortunately, so I'll have to settle for further overclock testing. Should be 14K+ in Firestrike with ease. I guess I can't complain, considering most results don't show huge overclocks for the Fury.

Will try and update with results tomorrow when I have more time.


----------



## fat4l

Quote:


> Originally Posted by *gupsterg*
> 
> @fat4l
>
> No idea why negative scaling happens with voltage increase, mate.
>
> Temps on the Fury X stock AIO are phenomenal compared with a Hawaii aftermarket air cooler; for example my 24/7 OC of 1145/545 will keep the GPU at 50°C. As it's using Advanced fan control mode, in some games the fan is inaudible and in others it can reach 2500 RPM. I removed the 2200 RPM limit in the ROM; the Gentle Typhoon 120mm to me isn't that loud at 2200 RPM. Even the Tri-X air cooler is great (the Nitro is practically the same HSF), easily maintaining 55°C with some fan profile modding. VRM temps, again, are way better on Fury/X than Hawaii.
>
> I really luv the Fury/X card for the "out of box" experience compared with the Hawaii cards I had with non-ref air coolers. Yeah, 8 Fury Xs, really do like the card mate. Recently when I got the Nitro I was like OMG look at the length! The Fury X is so short and sweet. I saw a deal on an EKWB WB (£50) and was tempted to go custom loop, but then I thought the AIO is decent and I'll save the £££ for if I go VEGA!
>
> @Alastair
>
> Sorry for the delay on the new ROM; I've just been busy with some requests in my inbox and becoming a mod on another forum.
>
> @Bojamijams
>
> Yes, you see an FPS drop = lower bench result. For example, 1175MHz with 1.3V will give the same 3DM FS score as 1145MHz with 1.27V; whereas 1175MHz at 1.27V gains over 1145MHz @ 1.27V.


So, 8 of them... cool.

To be honest I also like the card; never had it, but I like it.

What's your experience regarding the OC? You've had 8 of them... any big differences?
Also, is it possible to remove that negative scaling with some BIOS mods?


----------



## gupsterg

Out of the Fury X cards, only 1 did not OC past 1090MHz (stock VID 1.25V); I tested only up to +75mV with that card and saw no gain.

Most reached ~1125MHz without negative scaling and with only needing up to ~+50mV. 1 card reached 1135MHz with ~+56mV; 1 card reached 1145MHz with ~+56mV, which I kept in my main rig and have had since March '16. This data is based on OCs which I regarded as 24/7 stable: up to 3hrs each of 3DM FS, Heaven and Valley looped. I also do runs of F@H for up to 24hrs+ on a card.

Undervolting seems of more benefit with these cards. For example, the most recent Fury X I'm still testing has stock VID 1.20V; that's come down to 1.137V IIRC, and the AIO gets phenomenally quiet with that VID at stock clocks. Perf. per watt must be good at that setting. IIRC that card so far I've run ~150hrs straight of F@H.
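A rough sanity check on why that undervolt quiets the AIO: dynamic power scales roughly with frequency times voltage squared, so at unchanged clocks the quoted VID drop alone is worth about a tenth of the dynamic power (this is a first-order estimate only; leakage and board power are ignored):

```python
# Dynamic power scales roughly with f * V^2, so at unchanged clocks an
# undervolt from 1.200V to 1.137V (the VIDs quoted above) saves about
# 1 - (V_uv / V_stock)^2 of the dynamic power. First-order estimate;
# leakage and board power are ignored.
v_stock, v_uv = 1.200, 1.137
saving = 1 - (v_uv / v_stock) ** 2
print(f"~{saving * 100:.1f}% lower dynamic power at the same clocks")
```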

Recently I've seen benches of some games using DX12/Vulkan where it's on par with or slightly ahead of a GTX 1070.

Each card I snagged for between £200-£300, more often in the past 3 months £250 or less. IMO they represent good value.

Trying to solve the negative scaling but I've not made real headway yet; Alastair's testing may be yielding results.


----------



## Simmons572

Quote:


> Originally Posted by *Alastair*
> 
> A pretty powerful little rig ya got there.


Quote:


> Originally Posted by *neurotix*
> 
> This thing. I saw your build log. Excellent work man. I love it.
> 
> Bet it's really fast too... what resolution do you play at on it?


Thanks guys. I'm running 1080p 144Hz FreeSync... well, I was, before I ran into PSU issues.

I'll have to post back once I get it running again


----------



## FlawleZ

Updated my drivers and 3DMark version. Got 14,368 in Firestrike. Ran another quick and dirty run with the core @ 1150. Haven't had to add voltage yet, nor tried touching the memory. May mess with those a little later on.

http://www.3dmark.com/3dm/17344826


----------



## ressonantia

I envy you guys with high-clocking cards! I got a Gigabyte Fury that would only go to 1080MHz and could only be undervolted by about -42mV. My NANO, on the other hand, doesn't go past 1050MHz even with +100mV, and gets to about 1020MHz I think, but then it thermal throttles after 10 minutes anyway, so it's a moot point.


----------



## ht_addict

Guess I'm lucky. Both my cards do 1150/550 with power limit and voltage at max. With them being water-cooled with EKWB blocks, temps at idle are in the mid 20's and gaming is in the low-to-mid 30's.


----------



## FlawleZ

Are you guys using Afterburner to overclock the memory, or is that even possible?


----------



## Alastair

Quote:


> Originally Posted by *FlawleZ*
> 
> Are you guys using Afterburner to overclock the memory, or is that even possible?


It is yes.


----------



## CptAsian

Quote:


> Originally Posted by *FlawleZ*
> 
> Are you guys using Afterburner to overclock the memory, or is that even possible?


As Alastair said, it indeed is. While I use AB to overclock normally, I generally don't bother with the memory on my Furys because from what I understand, HBM overclocks have minimal impact.


----------



## gupsterg

Quote:


> Originally Posted by *ressonantia*
> 
> I envy you guys with high-clocking cards! I got a Gigabyte Fury that would only go to 1080MHz and could only be undervolted by about -42mV. My Nano, on the other hand, doesn't go past 1050MHz even with +100mV; it gets to about 1020MHz I think, but then it thermal throttles after 10 minutes anyway, so it's a moot point.


Yeah, I had a Fury Tri-X that maxed at 1090MHz and a Fury X at the same clock; neither improved its OC even with voltage added. This Nitro is pants as well. All 3 of these cards had a stock DPM 7 VID of ~1.250V. In initial testing 1085MHz was possible with the stock VID, but F@H started to show bad states. Backed down to 1075MHz and it was "hit & miss": sometimes 10hrs passed without issue in F@H, other times 3x bad states within 6hrs = a lost work unit.

Compared to how much voltage I've given the 9 other Fiji cards, I went nuts with this sample: +100mV yielded 1100MHz (i.e. DPM 7 VID 1.35V). Poorest sample I've had.

Ahh well, back to testing Fury X no. 8 with its 70.4% ASIC rating; I have got 1125MHz stable with +50mV on the stock DPM 7 VID of 1.2V. Awaiting another Nitro OC+.


----------



## neurotix

Guys how do you get voltage control for memory in Afterburner?

Everyone else says they have an arrow you can click and expand to get it, but I don't...



The arrow next to power limit, when clicked just gives me Temp Limit, which is already maxed out at 85C.


----------



## miklkit

Question: I've been running my Fury Nitro stock with nothing more than a custom fan profile in Afterburner to keep it cooler. It had the frame rate and GPU temps showing in the OSD. Then I updated to the latest drivers, which have a lot of new stuff added on, like Wattman.

Now Afterburner doesn't work. There is no fan profile at all and the sliders in the display are all just a dot on the far left. It also no longer shows temperatures at all so I have to look at HWINFO64 to see what is going on. I think I'm figuring out how to set the fan speed to control temps, but it seems to have a mind of its own. After one gaming session I found temps had been in the 95C range while the fans were running at 745rpm! Before with Afterburner temps had peaked at 78C.

What can I do now? Is Afterburner now obsolete? Can Wattman be deleted? Should I just go back to the older drivers?


----------



## 786sam

Hey guys, I'm new to the scene... although most of you use Fiji for gaming, I'm using them for mining coins (Zcash).
I've got 5x Nanos and would like to know how to undervolt by more than -72mV. I'm using TriXX at the moment and that seems to be the minimum I can undervolt. I tried with MSI Afterburner but the voltage slider stays transparent?


----------



## bluezone

Quote:


> Originally Posted by *786sam*
> 
> Hey guys, I'm new to the scene... although most of you use Fiji for gaming, I'm using them for mining coins (Zcash).
> I've got 5x Nanos and would like to know how to undervolt by more than -72mV. I'm using TriXX at the moment and that seems to be the minimum I can undervolt. I tried with MSI Afterburner but the voltage slider stays transparent?


Strange you should ask this. I ran into another forum's thread on mining Zcash, where one of the popular miners had made an optimized BIOS just for Nano Zcash mining. I'll PM you a copy I downloaded. It's strictly for mining, no gameplay: reduced voltages. I need to pass a copy on to Gupsterg if he's interested.

If you want to look for it yourself, google "Nano bios", "Nano mining bios" or "Nano undervolt bios". I don't remember the exact query I used.

Cheers.

UPDATE: I found a bookmarked link.

https://bitcointalk.org/index.php?topic=1424132.0


----------



## ressonantia

Quote:


> Originally Posted by *miklkit*
> 
> Question: I've been running my Fury Nitro stock with nothing more than a custom fan profile in afterburner to keep it cooler. It had the frame rate and GPU temps showing in the OSD. Then I updated to the latest drivers that have a lot of new stuff added on like Wattman.
> 
> Now Afterburner doesn't work. There is no fan profile at all and the sliders in the display are all just a dot on the far left. It also no longer shows temperatures at all so I have to look at HWINFO64 to see what is going on. I think I'm figuring out how to set the fan speed to control temps, but it seems to have a mind of its own. After one gaming session I found temps had been in the 95C range while the fans were running at 745rpm! Before with Afterburner temps had peaked at 78C.
> 
> What can I do now? Is Afterburner now obsolete? Can Wattman be deleted? Should I just go back to the older drivers?


I run both wattman and afterburner together. You might have to reinstall afterburner to get it working. Just uninstall, keep settings and then reinstall. Make sure you've got the latest version.


----------



## diggiddi

Quote:


> Originally Posted by *FlawleZ*
> 
> Updated my drivers and 3DMark version. Got 14,368 in Firestrike. Ran another quick and dirty run with the core @ 1150. Haven't had to add voltage yet, nor tried touching the memory. May mess with those a little later on.
> 
> http://www.3dmark.com/3dm/17344826


Which Model fury?


----------



## neurotix




----------



## bluezone

Quote:


> Originally Posted by *FlawleZ*
> 
> Updated my drivers and 3DMark version. Got 14,368 in Firestrike. Ran another quick and dirty run with the core @ 1150. Haven't had to add voltage yet, nor tried touching the memory. May mess with those a little later on.
> 
> http://www.3dmark.com/3dm/17344826


Very nice. Made me want to retest with the newest drivers and my MOD Bios.

@1050

http://www.3dmark.com/3dm/17372626

@1100

http://www.3dmark.com/3dm/17372898

Your Xeon X5675 beats the pants off my old i7 2600's physics and combined scores.

Makes me wish I had these drivers for the fanboy comp last year.

@1100 No Tess.

http://www.3dmark.com/3dm/17373348

Vs old score on i5 3570K (200 points higher than the i7 2600, likely due to DDR3 RAM clocks)

http://www.3dmark.com/3dm/11424658


----------



## Alastair

Well, while screwing around with my overclocks I actually settled on what I think is a nice balance. I wasn't sure how low I could go, but I used the BIOS editor to lower all my DPM states (excluding 0-2 and 7) by -30ish mV. Then I used a -25mV offset in TriXX, which affects all DPM values, so my mid DPM states are at -50ish. I am able to run 1100/550 with a nice -25mV offset.

(Insert Obama "Not Bad" meme here)
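Alastair's stacking of a per-DPM BIOS undervolt with a global TriXX offset can be sketched like this; the VID table below is hypothetical, made up for illustration rather than dumped from a real Fiji ROM:

```python
# Hypothetical sketch of how a per-DPM BIOS undervolt stacks with a
# global software offset, as described above. The VID table is made up
# for illustration, not dumped from a real Fiji ROM.

bios_vid_mv = {3: 1050, 4: 1100, 5: 1150, 6: 1200, 7: 1250}  # hypothetical stock VIDs
bios_undervolt_mv = -30   # applied to DPM 3-6 only in the BIOS editor
trixx_offset_mv = -25     # global offset from TriXX, hits every state

effective_mv = {}
for dpm, vid in bios_vid_mv.items():
    bios_adj = bios_undervolt_mv if dpm != 7 else 0  # DPM 7 left stock in BIOS
    effective_mv[dpm] = vid + bios_adj + trixx_offset_mv

# Mid states end up ~-55mV below stock, while DPM 7 only sees the -25mV offset.
print(effective_mv)
```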


----------



## miklkit

Quote:


> Originally Posted by *ressonantia*
> 
> I run both wattman and afterburner together. You might have to reinstall afterburner to get it working. Just uninstall, keep settings and then reinstall. Make sure you've got the latest version.


That did it! I now have temperatures and fan speeds again.


----------



## lanofsong

Hello AMD R9 Radeon Fury / Nano / X / Pro DUO FIJI owners,

We are having our monthly Foldathon from Monday the 16th to the 18th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

January 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726
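Those three settings can also go straight into the v7 client's `config.xml` (placed next to FAHClient); a minimal example, with placeholder values you would replace - field names are from memory, so check the client's own docs:

```xml
<!-- Hypothetical minimal config.xml for the v7 Folding@home client,
     matching the steps above. Replace the placeholder values. -->
<config>
  <user value="your_ocn_name"/>
  <passkey value="your_32_char_passkey"/>
  <team value="37726"/>
  <gpu value="true"/>
</config>
```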

later
lanofsong


----------



## Alastair

Anyone know how to get Afterburner to take my memory OCs? I've extended unofficial limits, but still, every time I hit the apply button it doesn't work.


----------



## Performer81

You have to extend official overclocking limits and set unofficial overclocking mode to "with PowerPlay support". Then reboot.
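Those GUI checkboxes map to keys in `MSIAfterburner.cfg` (in the Afterburner install folder); the section and key names below are from memory, so verify against your own file before editing:

```ini
; MSIAfterburner.cfg - edit with Afterburner closed.
; Section/key names are from memory; verify against your own file.
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1   ; assumed: 1 = unofficial OC with PowerPlay support
```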


----------



## Stardust105

Hello, I'm going to be upgrading my GPU soon and was wondering how the pump noise on the R9 Fury X is now. Does it still whine or buzz? (Going for an almost completely silent PC at idle, with a be quiet! Silent Loop and Silent Wings 3.)
Otherwise I will most likely be getting a Sapphire RX 490 or EVGA GTX 1070.


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *Stardust105*
> 
> Hello, I'm going to be upgrading my GPU soon, I was wondering how the pump noise on the R9 Fury X is now? Does it still whine or buzz? (going for a almost completely silent PC at idle with Be Quiet Silent Loop, Silent Wings 3)
> Otherwise I will most likely be getting a Sapphire RX 490 or EVGA GTX 1070.


It varies, but I've never had noise issues with my Fury X (purchased October 2015).


----------



## Stardust105

Well, decided to go with RX 480 instead because of how insanely cheap it is in comparison here in Denmark, thanks for the help though.


----------



## AngryLobster

Doing some experimenting with my Nitro Fury and man is overclocking completely pointless with these cards.

My card does -96mv @ 1050 out the box but even going up to 1125 with very little additional voltage nets literally 3 FPS while adding substantial heat and noise. +500RPM and +30w for 1-3 FPS.


----------



## diggiddi

What about the HBM?


----------



## Alastair

Quote:


> Originally Posted by *AngryLobster*
> 
> Doing some experimenting with my Nitro Fury and man is overclocking completely pointless with these cards.
> 
> My card does -96mv @ 1050 out the box but even going up to 1125 with very little additional voltage nets literally 3 FPS while adding substantial heat and noise. +500RPM and +30w for 1-3 FPS.


Odd; Battlefield 4 and pretty much every other game I play give me pretty good scaling.

Battlefield 4 in the test range. Standing on top of the building overlooking the island. 2560x1440 150% res scaling and ultra settings (no msaa)
1000/500 = 110 fps.
1000/550 = 115 fps.
1050/550 = 122-125fps
1100/550 = 130-133 fps.


----------



## B'Fish

I'm curious. I'm currently using a 980 Ti (Hybrid) overclocked to 1500/4000MHz, but I'm just in love with AMD. I would be happier if I owned a Fury X, I suppose. I can trade this GPU for a Fury X + get some extra cash, but I wanna know if this would be a downgrade for a 1080p user. I know the Fury X keeps up with or beats a 980 Ti @ 4K resolution.

What do you guys think? Is a Fury X with the latest drivers as good as an overclocked 980 Ti?

I'm using a 144Hz 1080p monitor btw. The reason I'm asking is because I can't really find good benchmark results with the latest AMD drivers; the only one that's legit is benchmarked @ 4K, which I have no interest in.


----------



## Bojamijams

Quote:


> Originally Posted by *B'Fish*
> 
> I'm curious. I'm currently using a 980 Ti (Hybrid) overclocked to 1500/4000MHz, but I'm just in love with AMD. I would be happier if I owned a Fury X, I suppose. I can trade this GPU for a Fury X + get some extra cash, but I wanna know if this would be a downgrade for a 1080p user. I know the Fury X keeps up with or beats a 980 Ti @ 4K resolution.
> 
> What do you guys think? Is a Fury X with the latest drivers as good as an overclocked 980 Ti?
> 
> I'm using a 144Hz 1080p monitor btw. The reason I'm asking is because I can't really find good benchmark results with the latest AMD drivers; the only one that's legit is benchmarked @ 4K, which I have no interest in.


Stick with what you have. When Vega comes out, then switch


----------



## Thoth420

Quote:


> Originally Posted by *B'Fish*
> 
> I'm curious. I'm currently using a 980 Ti (Hybrid) overclocked to 1500/4000MHz, but I'm just in love with AMD. I would be happier if I owned a Fury X, I suppose. I can trade this GPU for a Fury X + get some extra cash, but I wanna know if this would be a downgrade for a 1080p user. I know the Fury X keeps up with or beats a 980 Ti @ 4K resolution.
> 
> What do you guys think? Is a Fury X with the latest drivers as good as an overclocked 980 Ti?
> 
> I'm using a 144Hz 1080p monitor btw. The reason I'm asking is because I can't really find good benchmark results with the latest AMD drivers; the only one that's legit is benchmarked @ 4K, which I have no interest in.


If you are interested in a trade, PM me. My Fury X is blocked though, so you would have to put it in a loop. I am pulling my system apart and starting fresh with an AM4 build in a Praxis WetBench, so I need a normal air-cooled GPU (a hybrid does the trick as well... I can mount that rad to the bench). No pressure, just figured I would toss the option out there since I plan on tearing this system down and parting it out either way.

I cannot speak for 1080p as I run 1440p @ 144Hz; however, the Fury X shines more at higher resolutions, so it may not be the best choice for you. I would think any performance increases in DX11 games would be minimal at best. You may see marked improvements in DX12.


----------



## dagget3450

Quote:


> Originally Posted by *AngryLobster*
> 
> Doing some experimenting with my Nitro Fury and man is overclocking completely pointless with these cards.
> 
> My card does -96mv @ 1050 out the box but even going up to 1125 with very little additional voltage nets literally 3 FPS while adding substantial heat and noise. +500RPM and +30w for 1-3 FPS.


Pretty much how Fiji is. If you want overclocking, Fiji is not going to be the choice. Right now, until AMD gets Vega out, Nvidia has the best options for OC.


----------



## neurotix

Best Fury single card score. Might submit to HWBOT


----------



## B'Fish

Anyone willing to post his/her Valley benchmarks? Extreme HD preset (1080p).


----------



## dagget3450

Quote:


> Originally Posted by *B'Fish*
> 
> Anyone willing to post his/her Valley benchmarks? Extreme HD preset (1080p).


Valley is more CPU/RAM oriented; might be better to try something like Time Spy or FS?


----------



## B'Fish

Sure, try those and tell me what clocks you used. I am very interested in the latest driver performance + 1080p performance. I can sell my current 980 Ti for 500 euros and re-buy a Fury X for 300 euros. I am trying to see if that is a good deal. Sounds like it, unless the performance numbers are just too far off.


----------



## bluezone

For those contemplating a TIM replacement on their card, here is a good read before undertaking the project.

"How To: Optimizing Your Graphics Card's Cooling"

http://www.tomshardware.com/reviews/optimizing-graphics-cooling,4838.html

EDIT: I would suggest reading the comments that follow the article as well.


----------



## supermiguel

What's the best water block for the Fury X?

I guess there are 3? Aqua Computer, XSPC and EK.

Edit: So right now I've got 2 Fury Xs and I want to add them to my Super Monsta loop, but between the water block and the backplate it's about $300, which is how much a third Fury X would cost....

So add both of my Fury Xs to my loop, or get a 3rd one? I have 3 Samsung 4K 60Hz monitors; I mostly play Overwatch and EVE.


----------



## diggiddi

Quote:


> Originally Posted by *Alastair*
> 
> Odd; Battlefield 4 and pretty much every other game I play give me pretty good scaling.
> 
> Battlefield 4 in the test range. Standing on top of the building overlooking the island. 2560x1440 150% res scaling and ultra settings (no msaa)
> 1000/500 = 110 fps.
> 1000/550 = 115 fps.
> 1050/550 = 122-125fps
> 1100/550 = 130-133 fps.


Could you run the same test with gpu side first keeping memory stock freq, then keeping gpu at fixed freq and increasing memory
Thx


----------



## DedEmbryonicCe1

Quote:


> Originally Posted by *supermiguel*
> 
> whats the best water block for the Fury X?
> 
> I guess there are 3? Aqua Computer, XSPC and EK
> 
> Edit: So right now i got 2 Fury X and i want to add them to my Super Monsta loop, but between the water block and the backplate, its about $300, thats how much a third Fury X would cost....
> 
> So add both of my Fury X to my loop or get a 3rd one? I have 3 samsung 4k 60hz monitor, mostly Play Overwatch and eve


http://www.swiftech.com/KOMODO-R9FuryXEco.aspx
{Not saying it's the best just another option}


----------



## supermiguel

Quote:


> Originally Posted by *DedEmbryonicCe1*
> 
> http://www.swiftech.com/KOMODO-R9FuryXEco.aspx
> {Not saying it's the best just another option}


Nice anyone using this block? that can give me their opinion?

i like this "AMD's original back plate and Swiftech's optional back plate are fully compatible with the block."


----------



## Aretak

Does the white Asus Nano have any benefits over the reference model? It has an Asus-branded PCB and a different, wider fan grille on the port end, but is everything else identical under the hood? No upgraded components or anything?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> Very nice. Made me want to retest with the newest drivers and my MOD Bios.
> 
> @1050
> 
> http://www.3dmark.com/3dm/17372626
> 
> @1100
> 
> http://www.3dmark.com/3dm/17372898
> 
> Your Xeon X5675 beats the pants off my old I7 2600 physics and combined score.
> 
> Makes me wish had these drivers for the Fan boy comp last year.
> 
> @1100 No Tess.
> 
> http://www.3dmark.com/3dm/17373348
> 
> Vs old score on I5 3570k (200 point higher scores than I7 2600 likely due to DDR3 RAM clocks)
> 
> http://www.3dmark.com/3dm/11424658


That is some truly sweet scaling on your GS score; it puts my 1145/545 OC to shame. Spurred by your result, I did 33 runs of 3DM FS.

1050MHz undervolted x3 (1175mV)
1100MHz same VID as above testing x3

1050MHz @ 1212mV (stock VID) x3
1100MHz @ 1212mV (stock VID) x3
1125MHz @ 1212mV (stock VID) x3

1050MHz @ 1243mV x3 (this is the VID I need to have 1135MHz OC 24/7 stable for crazy hrs of benching/folding, etc)
1100MHz @ 1243mV x3
1125MHz @ 1243mV x3
1135MHz @ 1243mV x3
1145MHz @ 1243mV x3

1145MHz @ 1268mV (this is the VID I need to have 1145MHz OC 24/7 stable for crazy hrs of benching/folding, etc)

I must still do the 1268mV x3 runs at 1050MHz, 1100MHz, 1125MHz and 1135MHz, and will see if I can present it in a good way in the BIOS mod thread. In a nutshell, though: would you believe 1050MHz undervolted consistently scores above 1050MHz at the stock 1212mV VID? Virtually every step up of VID from my undervolt setting of 1175mV has a performance hit of up to ~1%, and it increments in steps; hopefully it will make more sense when I collate the data and post.

The test card is my Fury X with 64.4% ASIC. I also did some quick compares of the same settings with a ~70%+ ASIC Fury X I have (stock VID 1.2V); that also shows a similar scaling issue with VID increase, so higher ASIC does not negate it. I'll also be testing another Fury Nitro OC+ I received which is OC'ing better than the 1st card I had, ~60% ASIC (stock VID 1.25V).

Quote:


> Originally Posted by *Alastair*
> 
> Anyone know how to afterburner to take my memory OC's? Ive extended unofficial limits but still everytime I hit the apply button it doesn't work.


Quote:


> Originally Posted by *Performer81*
> 
> YOu have to extend official overclocking limits and set inofficial overclocking mode to " with powerplay support". Then reboot.


I use Win 10 rarely, but found that even if the ROM had the OverDrive RAM limit extended from 500MHz to 600MHz, MSI AB would not OC the HBM on the slider without meddling with the settings. Whereas in Win 7, that same mod equals HBM OC with MSI AB default settings.

Quote:


> Originally Posted by *diggiddi*
> 
> What about the HBM?


Not a huge amount, but it depends on what you load the GPU with, going by some off-and-on testing I did. TPU did some tests in this article, and so did HardwareLuxx.

I also posted some data from my 3DM runs in this post and about the clock steps of HBM in this post.


----------



## bluezone

All Nanos are the same, but I did not know about the grille being different on the white Nano.

Crimson Relive driver 17.1.1

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.1.1-Release-Notes.aspx


----------



## steadly2004

I just got my Fury Nitro OC+ from Newegg for like $260, though I still have to send in the $20 rebate. Just from reading on here, it looks like it's a good overclocker. I have only tried 3DMark, but the results are pretty good. Settled on 1150MHz core and 550MHz mem. Can't go any higher on memory, and on the core, 1175 crashed. I did increase the power limiter to 50%, and it gets slightly better numbers at those settings than without. It does work without the power limit increase, but throttles a few MHz.



http://www.3dmark.com/3dm/17447900?

BTW - this card is a placeholder until Vega comes out. Sold my 2x Titan X Maxwell cards and pre-ordered a Microboard ultrawide with FreeSync. I just couldn't stand spending >1k on a monitor just to have G-Sync + ultrawide. I had to redo my loop since the Fury is like 12.5 inches; the Titans were only 10.5 and my pump/res were in the way.


*EDIT* that wasn't game stable, just dropped it to 1100 and played a bit of BF1. I'll have to see if more is reasonable, but for now, not without added voltage at least.


----------



## FlawleZ

Quote:


> Originally Posted by *diggiddi*
> 
> Which Model fury?


Sapphire Nitro Tri X OC+
Quote:


> Originally Posted by *bluezone*
> 
> Very nice. Made me want to retest with the newest drivers and my MOD Bios.
> 
> @1050
> 
> http://www.3dmark.com/3dm/17372626
> 
> @1100
> 
> http://www.3dmark.com/3dm/17372898
> 
> Your Xeon X5675 beats the pants off my old I7 2600 physics and combined score.
> 
> Makes me wish had these drivers for the Fan boy comp last year.
> 
> @1100 No Tess.
> 
> http://www.3dmark.com/3dm/17373348
> 
> Vs old score on I5 3570k (200 point higher scores than I7 2600 likely due to DDR3 RAM clocks)
> 
> http://www.3dmark.com/3dm/11424658


Nice scores. The good thing about Firestrike is how well the physics test scales with more threads.


----------



## gupsterg

I was wondering: any owners of Fiji who can get 545MHz to 600MHz HBM able to share some 3DM FS runs where graphics test 2 is getting close to 80FPS?

I've been scratching my head over how some of the top valid results for 3DM FS have up to ~+10% higher graphics test 2 FPS than any runs I've done on various drivers (and that's a lot of runs). For example, fnZx has several runs where 585MHz HBM has nailed a nice gain in GT2, and then there's Unremember3d. They both have an i5 4690K / Z87 or Z97 mobo / etc., so the only conclusion I can come to is that the HBM clock is helping!? fnZx has several valid results, which to me means it wasn't a one-off high-scoring GT2 result.

3DM results compare.


----------



## Johan45

Quote:


> Originally Posted by *gupsterg*
> 
> I was wondering: any owners of Fiji who can get 545MHz to 600MHz HBM able to share some 3DM FS runs where graphics test 2 is getting close to 80FPS?
> 
> I've been scratching my head over how some of the top valid results for 3DM FS have up to ~+10% higher graphics test 2 FPS than any runs I've done on various drivers (and that's a lot of runs). For example, fnZx has several runs where 585MHz HBM has nailed a nice gain in GT2, and then there's Unremember3d. They both have an i5 4690K / Z87 or Z97 mobo / etc., so the only conclusion I can come to is that the HBM clock is helping!? fnZx has several valid results, which to me means it wasn't a one-off high-scoring GT2 result.
> 
> 3DM results compare.


----------



## gupsterg

Cheers. I was interested in valid results, i.e. without the tess tweak.


----------



## Johan45

I doubt I have any, TBH.
One thing I can tell you, though, is that GT2 can and does bug out and will still give a valid result.
Also, the FS tests do benefit from higher VRAM clocks across all the tests.


----------



## gupsterg

Cheers for the info; if you get any valid results, please do share.

I managed 1185MHz @ 1.275V core / 545MHz @ 1.3V HBM on my Fury X no. 8 (ASIC quality 70.4%, DPM 7 stock VID 1.2V), enough to get me past fnZx, until he notices.



3DM FS compare: gupsterg vs fnZx.

Gotta admit the stock AIO is nuts for cooling. I've got all monitoring off whilst benching, but straight after a run (and I've been doing them all afternoon) I see ~26-30°C (fan 100%). May do a twin-fan mod and risk a TIM/pad swap from the factory stuff to see if I get any more out of the AIO; otherwise I'll start opening windows in the room!


----------



## Johan45

In time maybe, don't have that system up and running currently. Playing with Kaby


----------



## supermiguel

So I need to pay to run Firestrike???


----------



## Johan45

I don't think so, but you'll have to watch the demo and likely get your score from the FM website.


----------



## ressonantia

@gupsterg: Here's my lowly NANO

[email protected]/500MHz/+25%


[email protected]/545MHz/+25%


Comparison


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers, I was interested in valid result, ie without tess.tweak


Here is a run with a tess tweak (and more), yet still valid. Hint: look at the clock speeds, for instance.
http://www.3dmark.com/3dm/17471759

The system can be fooled, but it's kind of pointless if you are trying to improve your frame rate, not to mention dishonest. To do this I tried the 1st things that came to mind. Gupsterg, PM me if you're interested, but it's not important info.
As a matter of fact, here is an apparently valid backup no-tess run to my earlier invalid run.
http://www.3dmark.com/3dm/17473428

But note the errors in the settings.
Quote:


> Originally Posted by *Johan45*
> 
> Also the FS test do benefit from higher Vram across all the tests.


Interesting, I had long suspected that. How did you validate this?


----------



## supermiguel

what kind of results are you guys getting with 2 Fury X?


----------



## bluedevil

Quote:


> Originally Posted by *supermiguel*
> 
> what kind of results are you guys getting with 2 Fury X?


I get about 22k in FS.

http://www.3dmark.com/fs/11443776


----------



## neurotix

Quote:


> Originally Posted by *supermiguel*
> 
> what kind of results are you guys getting with 2 Fury X?


34k graphics score and 22k Fire Strike with 2x Fury Nitros:

http://hwbot.org/submission/3337159_neurotix_3dmark___fire_strike_2x_radeon_r9_fury_22379_marks

Yes tess is off, yes it's allowed on HWBOT and no I'm not willing to argue about it again.

(If HWBOT will take it it's valid. Check the cups)


----------



## dagget3450

On my old X5650 Xeons, with tess on, I scored 32,462 GPU on 2x Fury X. Weirdly enough, I had to turn off HT to get a better GPU score. Had these slightly OC'd at 1150 on the core for the bench, 500 HBM. This isn't really set up for benching, but I suppose I could try with air cooling.

I'll be throwing them back on my X99 soon; I just want to bench out my 390X quadfire before I do. It's pretty fun atm to play with.



Edit:

With HT: lower GPU score, stock GPU clocks, same CPU settings as above.

http://www.3dmark.com/fs/11460197

31k gpu score


----------



## supermiguel

How good is the scaling of these cards, like from 2 to 3 and from 3 to 4? Also, with custom water cooling, have you guys been able to overclock them more?


----------



## dagget3450

Quote:


> Originally Posted by *supermiguel*
> 
> How good is the scaling of these cards, like from 2 to 3 and from 3 to 4? Also, with custom water cooling, have you guys been able to overclock them more?


I have 4 Fury Xs that I'll be putting back on X99 soon. Scaling past 2 cards is highly dependent on CPU speed/IPC/overhead. The bigger issue is that if you're thinking of 3 or 4 for gaming, it's a bust: you will get shoddy performance due to CPU overhead, DX11 and poor game engines. Only a few examples would probably bypass this and run well. And for DX12, right now mGPU is all but a myth.

For benchmarking, if you're looking for scaling on Fiji beyond 2 cards, you're gonna need a massive OC on the CPU and the fastest IPC/CPU you can get.

... I hope I'm replying properly to your post; I think that is what you're asking?


----------



## Johan45

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> Also the FS test do benefit from higher Vram across all the tests.
> 
> 
> 
> Interesting, I had long suspected that. How did you validate this.
Click to expand...

Just experience


----------



## steadly2004

Just got another slightly higher score with my Nitro.
17,153 graphics score at 1165 core and 550 memory. Also changed the Radeon graphics settings to prefer performance instead of balanced/quality. Seems to help a bit.
http://www.3dmark.com/fs/11466130


----------



## supermiguel

Quote:


> Originally Posted by *steadly2004*
> 
> Just got another slightly higher score with my Nitro.
> 17,153 graphics score. 1165 core and 550 memory. Also changed radeon graphics settings to prefer performance instead of balanced/quality. Seems to help a bit.
> http://www.3dmark.com/fs/11466130


Does CPU and RAM influence this test?


----------



## Alastair

Quote:


> Originally Posted by *supermiguel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *steadly2004*
> 
> Just got another slightly higher score with my Nitro.
> 17,153 graphics score. 1165 core and 550 memory. Also changed radeon graphics settings to prefer performance instead of balanced/quality. Seems to help a bit.
> http://www.3dmark.com/fs/11466130
> 
> 
> 
> 
> Does CPU and RAM influence this test?
Click to expand...

Yes, greatly. Especially for AMD owners, since it essentially nerfs our FX CPUs. Futuremark refuses to acknowledge and accept that the FX is a true 8-core processor, and so when the combined test is run it only runs on what Futuremark has identified to be physical cores. Which is probably to be expected, as an overclocked FX would probably knock on the door of an i7 in this test if it got to flex all of its muscle.


----------



## supermiguel

Quote:


> Originally Posted by *Alastair*
> 
> Yes greatly. Especially for AMD owners since it essentially nerfs our FX CPU's. Futuremark refuses to acknowledge and accept that the FX is a true 8 core processor. And so when the combined test is run it only runs on what Futuremark has identified to be physical cores. Which is probably to be expected as an overclocked FX would probably knock on the door of an I7 in this test if it got to flex all of its muscle.


So is it fair to compare scores between cards if you all have different CPUs? Wouldn't it be better to run a more graphics-card-specific test? Or is this as good as it gets?


----------



## neurotix

Quote:


> Originally Posted by *supermiguel*
> 
> So is it fair to compare scores between cards if you all have different CPUS? wouldnt it be better to run a more graphics card specific tests? Or is this as good as it gets?


AMD CPUs vs AMD CPUs with the same amount of threads.

Intel CPUs vs Intel CPUs with the same amount of threads.

I had a guy claim his 980ti was better than my 2x 290s, because his score was around the same with one 980ti, however he had a 5960x (8 core 16 thread) and I only have a 4790k. Of course we know 2x 290s will be better than a single 980ti (or maybe even a GTX 1080) in the games that support it. I tried to explain this to him, and tell him that he needed to disable HT and re run the test (so it'd be 8 threads vs 8 threads) but he wouldn't listen to me.









The graphics score is weighted more heavily than the CPU physics score, but CPU physics still affects the final result enough that it can skew the score and make a card look much better than it really is. Even a 660ti or 7870 would look amazing when paired with a i7-6950x (10 core 20 thread) because that physics score would up the total score enough that those cards would look much better.

The best scenario is to simply compare graphics scores, unless you want to compare a very similar setup, e.g. a 4-core mainstream i7 with the same GPU vs someone else with the same CPU and GPU series. So two i7s and the same GPU. In this case you can compare the total scores of the systems, and this is also a very common configuration (4-core i7s are like 95% of the builds here).

However, the rest of the time the graphics score is all that matters, especially if comparing Intel vs AMD CPUs: looking at the total score, the AMD system will look bad, but the graphics scores might be very similar.
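The weighting being described can be sketched numerically. Here is a minimal Python sketch, assuming illustrative weights (the real Fire Strike formula is a weighted harmonic mean of the sub-scores, but the exact published weights differ per preset, so the numbers below are assumptions, not Futuremark's values). Two rigs with the same graphics score but very different physics scores end up with noticeably different totals:

```python
# Illustrative sketch of how a 3DMark-style overall score combines sub-scores
# via a weighted harmonic mean. Weights here are assumptions for illustration.

def overall_score(graphics, physics, combined,
                  w_gfx=0.75, w_phys=0.15, w_comb=0.10):
    """Weighted harmonic mean of the three sub-scores."""
    return (w_gfx + w_phys + w_comb) / (
        w_gfx / graphics + w_phys / physics + w_comb / combined
    )

# Same GPU (same graphics score), very different CPUs (physics score):
quad_core = overall_score(graphics=15000, physics=9000, combined=6500)
many_core = overall_score(graphics=15000, physics=20000, combined=7500)
print(round(quad_core), round(many_core))  # many_core rig "looks" faster
```

This is why comparing the graphics sub-score directly is the fairer GPU-vs-GPU comparison: the harmonic mean always lands between the sub-scores, so a big physics score drags the total upward even when the GPU is identical.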

It should also be noted that even in DirectX 11 benches, in general, AMD CPUs (e.g. FX-8xxx) perform poorly at lower resolutions but perform very similarly to Intel i7s the higher the resolution goes. I noticed this when running Valley after moving to Intel: 60-some FPS in Valley at 5760x1080 with both CPUs, but at 1080p the Intel got 20 more FPS at stock compared to the FX at 5GHz. How this applies to 3DMark is that if you run basic Fire Strike on Intel vs AMD CPUs, the score on Intel will be much higher, but if you run Fire Strike Ultra I'm pretty sure the scores will be much more similar.

Hope this helps.


----------



## gupsterg

@ressonantia

+ rep for info, but I'm looking for results which are valid and have somewhere near 80FPS in GT2 of 3DM FS.

@bluezone

+ rep for info. I'll be honest, I bench "straight", ie *nothing to bend the result* other than hardware/OC giving gains.

I'm noticing Win 10 is really freaky about how it shows clocks (fully updated). I haven't used it much TBH, even though I've had it for months sitting on a spare HDD. Win 10 definitely gives a straight 200 points more for a 3DM FS bench.

I've been benching the nuts off Fury X no 8 the past few days and still plan on some more; a few posts back I posted an image of 500+ benches, well it's now a lot more.

Fury X no 8 is surpassing my Fury X no 3 in some ways. I nailed some HBM 600MHz benches; here is a compare of 1145MHz with HBM 500MHz vs 545MHz vs 600MHz (disregard the GPU clocks shown in the FM DB). I'm sorta doing a mega bench run of various GPU clocks/HBM clocks plus FS vs FSE vs FSU.

I also meddled with TS some more than just the 1 or 2 runs here and there. One thing I noted is that it seems to show instability in an OC more than FS does. I need at least 1300mV for 1175MHz on Fury X no 8 for a run to be bench stable; FS on the same card needs only 1275mV. Fury X no 3 only needs 1268mV @ 1175MHz for TS.

You know we've chatted about ASIC quality/leakage; well, it was very interesting to do these benches.

You see, my Fury X no 3 is ASIC quality 64.4%; this will do 1145 @ 1.268V without tripping OCP in the IR3567B when I lower it to 216A. Fury X no 8, with ASIC quality 70.4%, will trip 216A at 1050MHz @ 1.262V. I set the stock 240A due to lack of time but will test whether 228A is sufficient. I've also been taking some wall plug meter readings. Another interesting piece of testing was comparing scaling, performance, etc of 1145/545 on the 70% ASIC @ 1262mV vs the 64% ASIC @ 1268mV.

Hopefully I'll be able to do a write-up if and when I finish all the benches.

@Johan45

I've had one glitch so far on 3DM FS GT2; true clocks in the compare are 1100/600 vs 1175/600, and the 3 runs of 1100/600 are as they should be. After all the benches I did the past few days I'd concur with your info.


----------



## supermiguel

Quote:


> Originally Posted by *gupsterg*
> 
> @ressonantia
> 
> + rep for info, but I'm looking for results which are valid and have somewhere near 80FPS in GT2 of 3DM FS.
> 
> @bluezone
> 
> + rep for info. I'll be honest, I bench "straight", ie *nothing to bend the result* other than hardware/OC giving gains.
> 
> I'm noticing Win 10 is really freaky about how it shows clocks (fully updated). I haven't used it much TBH, even though I've had it for months sitting on a spare HDD. Win 10 definitely gives a straight 200 points more for a 3DM FS bench.
> 
> I've been benching the nuts off Fury X no 8 the past few days and still plan on some more; a few posts back I posted an image of 500+ benches, well it's now a lot more.
> 
> Fury X no 8 is surpassing my Fury X no 3 in some ways. I nailed some HBM 600MHz benches; here is a compare of 1145MHz with HBM 500MHz vs 545MHz vs 600MHz. I'm sorta doing a mega bench run of various GPU clocks/HBM clocks and FS vs FSE vs FSU.
> 
> I also meddled with TS some more than just the 1 or 2 runs. One thing I noted is that it seems to show instability in an OC more than FS does. I need at least 1300mV for 1175MHz on Fury X no 8 for a run to be bench stable; FS on the same card needs only 1275mV. Fury X no 3 only needs 1268mV @ 1175MHz for TS.
> 
> You know we've chatted about ASIC quality/leakage; well, it was very interesting to do these benches.
> 
> You see, my Fury X no 3 is ASIC quality 64.4%; this will do 1145 @ 1.268V without tripping OCP in the IR3567B when I lower it to 216A. Fury X no 8, with ASIC quality 70.4%, will trip 216A at 1050MHz @ 1.262V. I set the stock 240A due to lack of time but will test whether 228A is sufficient. I've also been taking some wall plug meter readings. Another interesting piece of testing was comparing scaling, performance, etc of 1145/545 on the 70% ASIC @ 1262mV vs the 64% ASIC @ 1268mV.
> 
> Hopefully I'll be able to do a write-up if and when I finish all the benches.
> 
> @Johan45
> 
> I've had one glitch so far on 3DM FS GT2; true clocks in the compare are 1100/600 vs 1175/600, and the 3 runs of 1100/600 are as they should be.


Fury x no 8?? U have 8 fury x?


----------



## steadly2004

Quote:


> Originally Posted by *supermiguel*
> 
> Does CPU and RAM influence this test?


I see it's been answered, but it mostly affects the physics/total score. That's why I specifically mentioned the graphics score when I posted. The total score is weighted between the graphics/physics/combined scores.


----------



## neurotix

delete


----------



## gupsterg

Quote:


> Originally Posted by *supermiguel*
> 
> Fury x no 8?? U have 8 fury x?


I have had 11 Fiji cards in total.

1x Fury Tri-X
2x Fury Nitro OC+
8x Fury X

I currently have only the 2x Nitro and 2x Fury X left, the others disposed of.


----------



## supermiguel

Quote:


> Originally Posted by *gupsterg*
> 
> I have had 11 Fiji cards in total.
> 
> 1x Fury Tri-X
> 2x Fury Nitro OC+
> 8x Fury X
> 
> I currently have only the 2x Nitro and 2x Fury X left, the others disposed of.


Dam i want to be like u when i grow up


----------



## gupsterg

LOL, who said I've grown up!


----------



## supermiguel

Quote:


> Originally Posted by *gupsterg*
> 
> LOL, who said I've grown up!


did u ever test 3-way crossfire with the fury x? any better or worse than 2-way? especially in gaming, i know people say its bad past 2 since your processor gets hit pretty hard


----------



## Johan45

Quote:


> Originally Posted by *steadly2004*
> 
> Quote:
> 
> 
> 
> Originally Posted by *supermiguel*
> 
> Does CPU and RAM influence this test?
> 
> 
> 
> I see it's been answered, but it mostly effects physics/total score. That's why I specifically mentioned the graphics score when I posted. ? Total score is weighted between graphics/physics/combined scores.
Click to expand...

Short answer is yes; as to how much, I don't know. When testing for max GPU clock I run my CPU at stock to save heat in my cold loop. Once I have the GPU dialed in and a profile set, I restart and set the CPU/cache speed, and there is a definite bump in the thousands (FS) for the GFX score. This is with a 5960X, which at stock is 3.0GHz with 3.5 turbo, and I bench at 5.0GHz.


----------



## Johan45

Quote:


> Originally Posted by *gupsterg*
> 
> @Johan45
> 
> I've had one glitch so far on 3DM FS GT2; true clocks in compare 1100/600 vs 1175/600 and 3 runs of 1100/600 as they should be. After all the benches I did past few days I'd concur on your info.


I didn't think I was the only one who would see this.


----------



## gupsterg

@supermiguel

Even though I've had the GPUs, I haven't done any CF usage or benching.

My main rig was built with the intent of single-GPU usage only; it has just an 850W PSU (the i5 rig in sig), as it was on promo at the time. I've seen up to 510W on the wall plug meter for the total system including display, ~90W idle. The 510W measurement is the max W when 3DM FS runs the combined test, so CPU and GPU are both loaded. As I only got the wall plug meter recently, I've not had time to do other measurements, as I've been busy meddling with the GPU, etc.

I could plug the 650W PSU from my Q6600 rig into the 2nd GPU if installed in my i5 rig, but I use that rig a lot as well. Usually what happens is when I get a GPU it goes into the Q6600 rig to get all the preliminary data (ie stock ROM / ASIC quality / stock VID per DPM), and I set the card to run [email protected] plus OC it in stages to see how it responds to OC'ing. Only if it's a good clocker does it get to my i5 rig.

I do believe my i5 would be a bottleneck for CF; I may at one point do CF just to see.

@Johan45

I reckon that glitch happened due to not having a fully stable GPU/HBM OC, as I've never seen it on other Fiji cards.

I had plenty of info on what Fury X no 8 needed for a given OC, but it had not had as thorough a stability testing as my Fury X no 3. I've had Fury X no 3 since Mar '16, so it's had a lot of usage/testing, etc. I reckon the 600MHz HBM was not fully stable at 1.4V; Fury X no 8 has been the first card that OC'd to that HBM clock and was bench stable for lengthy tests. As Fury X no 3 has been one of my best cards, I've never risked running 600MHz/1.4V HBM on it.

Some of the Fiji cards didn't even manage 545MHz with increased MVDDC; some managed 545MHz at stock MVDDC or only needed up to 1.35V. 600MHz has been rare for me. Every card has had stock cooling/TIM/pads and been tested with max fan to see its limit.


----------



## supermiguel

how would you check if your CPU is the bottleneck of your CF? like i have a 3930k will it handle 2? 3? 4? how can i test/check on this?


----------



## Johan45

@gupsterg
I felt it was something to do with HBM, since I have never noticed this on any other card before. The first time I noticed, I was in a competition and running FSE. It started glitching at the beginning of test 2 with an unusually high FPS. The test finished but I couldn't submit the result; no way test 2 should be 50+ FPS in that bench. Could be that I had just never noticed with other cards, but I don't think so. Typically, if it's not stable enough, GT2 will drop FPS.


----------



## Alastair

What can I use to stress test my Furys for stability? I am finding the Heaven bench to be inadequate, and I can't play BF4 (my next best hope) as it just crashes all the time.


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> What can I use to stress test my Fury's for stability? I am finding heaven bench to be inadequate and I can't play BF4 (my next best hope) as it just crashes all the time


Try Fire Strike Ultra in loop mode under advanced, if you have it. This requires that you buy 3DMark. (Click Custom at the top, click Fire Strike, check "loop", and set the resolution to 3840x2160. Try Graphics Test 2 for the GPU, or the Combined Test for stressing the whole system.) It makes the cards nice and hot, and they will crash pretty quickly if they aren't stable.

Heaven/Valley aren't good stressors simply because they are pretty tolerant of unstable OCs. I had a 270X that would do 1250MHz in games (on air); it could complete Valley at 1300MHz, but at that speed it would artifact and crash nearly instantly in any game.

I am sure there are other ways of stressing your cards; I would recommend you avoid Furmark (the fuzzy donut), but you probably know this already.


----------



## Bojamijams

Are you guys finding the HBM overclocks worthwhile? They seem to do almost nothing in 1080p Firestrike. I am wondering if Firestrike Ultra would see more benefit.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What can I use to stress test my Fury's for stability? I am finding heaven bench to be inadequate and I can't play BF4 (my next best hope) as it just crashes all the time
> 
> 
> 
> Try Fire Strike Ultra in loop mode under advanced if you have it. This requires that you buy 3dmark. (Click Custom at the top, click Fire Strike, check "loop", and set the resolution to 3840x2160. Try Graphics Test 2 for GPU or Combined Test for stressing the whole system). Makes the cards nice and hot and they will crash pretty quickly if they aren't stable.
> 
> Heaven/Valley aren't good stressors simply because they are pretty tolerant of unstable OCs. I had a 270X that would do 1250mhz in games (on air); it could complete Valley at 1300mhz, but then at this speed it would artifact and crash nearly instantly in any game.
> 
> I am sure there are other ways of stressing your cards, I would recommend you avoid Furmark (e.g. fuzzy donut) but you probably know this already.
Click to expand...

Indeed, I wish to avoid Furmark. I also do not plan on paying R349 for a benchmark I might use sporadically; I can buy a game for that price. And two, I most certainly do not intend to fund lazy and biased developers who couldn't be even half arsed about implementing proper asynchronous compute into a supposedly true DX12 bench that shows off "all the best" the API has to offer.

But thanks anyway. Besides Furmark, BF4 was my best bet. I could sit on the building on the test range overlooking the island, get 100% use from both cards and still have a decent CPU load, so the combined heat and stress would be indicative of gaming. And I intended to run it at least 24 hours. But I can't. And I do not know what else might be as useful to me.


----------



## Alastair

Quote:


> Originally Posted by *Bojamijams*
> 
> Are you guys finding the HBM overclocks worthwhile? Seems like they do almost nothing in the 1080p FIrestrike. I am wondering if the FIrestrike Ultra would see more benefit.


I actually find some pretty impressive improvements in game with the HBM OC, regardless of what the synthetics tell me. I gain an FPS in Heaven going from 500 to 550 (I'm pretty sure it's actually 545, but if it isn't going in steps, 550 is a nice round number).

In BF4 I can get quite a tangible bump, in fact: 4 fps.
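For context on why an HBM bump can matter at all, the bandwidth arithmetic for Fiji is straightforward: a 4096-bit interface transferring twice per clock. A quick sketch (theoretical peak figures only; real-world gains depend on whether bandwidth is actually the bottleneck at your resolution):

```python
# Rough theoretical bandwidth arithmetic for Fiji's HBM (4096-bit bus,
# double data rate). Shows what the 500 -> 545/600 MHz overclocks discussed
# in this thread mean in GB/s terms.

BUS_WIDTH_BITS = 4096   # Fiji's HBM1 interface width
DDR_FACTOR = 2          # two transfers per clock

def bandwidth_gbps(hbm_mhz):
    """Theoretical peak bandwidth in GB/s for a given HBM clock."""
    return hbm_mhz * 1e6 * DDR_FACTOR * BUS_WIDTH_BITS / 8 / 1e9

for clk in (500, 545, 600):
    print(f"{clk} MHz -> {bandwidth_gbps(clk):.0f} GB/s")
# 500 MHz is the stock 512 GB/s; 545 and 600 MHz scale proportionally.
```

With 512 GB/s already on tap at stock, 1080p synthetics are rarely bandwidth-bound, which fits the observation that HBM OC shows up more in some games and higher resolutions than in 1080p Firestrike.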


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> indeed I wish to avoid furmark. I also do not plan on paying R349 for a benchmark I might use sporadically. I can pay a game for that price. And two I most certainly do not intend to fund lazy and biased developers who couldn't be even half arsed about implementing proper asynchronous compute into a supposedly true DX12 bench that shows off "all the best" the api has to offer.
> 
> But thanks anyway. Besides furmark. BF4 was me best bet. I could sit on the building on the test range overlooking the island,get 100% use from both cards and still have a decent cpu load that the combined heat and stress would be indicative of gaming. And I intended to run it at least 24 hours. But I can't. But I do not know what else might be as useful to me.


Gaming on the games you actually play is generally the best test anyway.

Cheers man.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> indeed I wish to avoid furmark. I also do not plan on paying R349 for a benchmark I might use sporadically. I can pay a game for that price. And two I most certainly do not intend to fund lazy and biased developers who couldn't be even half arsed about implementing proper asynchronous compute into a supposedly true DX12 bench that shows off "all the best" the api has to offer.
> 
> But thanks anyway. Besides furmark. BF4 was me best bet. I could sit on the building on the test range overlooking the island,get 100% use from both cards and still have a decent cpu load that the combined heat and stress would be indicative of gaming. And I intended to run it at least 24 hours. But I can't. But I do not know what else might be as useful to me.
> 
> 
> 
> Gaming on the games you actually play is generally the best test anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers man.
Click to expand...

I'll have to keep looking. But those were some pretty impressive results you posted earlier BTW


----------



## Alastair

Have you guys tried 17.1.1? It seems to be a lot slower on my system. My Heaven FPS at 1440p extreme has dropped by 5.


----------



## Fediuld

Maybe something else was running in the background on the system?

My Nano got a 1% boost in 3DMark Time Spy at 1140 core, while now I can clock it to 1150 core and it's been stable.

However, I have rolled back to 16.11.4 because The Division's FreeSync has been broken since 16.12.


----------



## Alastair

Quote:


> Originally Posted by *Fediuld*
> 
> Maybe something else was running on the system background?
> 
> My Nano got 1% boost in 3d Mark Spy at 1140 core, while now I can clock it to 1150 core and been stable.
> 
> However have rolled back to 16.11.4 because The Division freesync is broken since 16.12


Nope. I definitely see a drop in Heaven. 1100/550 = 90.4FPS at 1440p extreme in 16.12.1; it goes to 85.6 in 17.1.1.


----------



## gupsterg

Quote:


> Originally Posted by *Alastair*
> 
> odd Battlefield 4 and pretty much every other game I play gives me pretty good scaling.
> 
> Battlefield 4 in the test range. Standing on top of the building overlooking the island. 2560x1440 150% res scaling and ultra settings (no msaa)
> 1000/500 = 110 fps.
> 1000/550 = 115 fps.
> 1050/550 = 122-125fps
> 1100/550 = 130-133 fps.


So in 1000/550 vs 1000/500 you've netted 5FPS; as a % that's a 4.5% boost for a 10% OC on HBM. In 1050/550 vs 1000/550 you've netted 7-10FPS; as a % that's a ~6-9% boost for a 5% OC on GPU. With GPU clock, from what I recall of my tests in synthetics/games, you gain pretty much all the time. HBM, on the other hand, is poor gains for the OC %, and depending on use will sometimes give gains and other times not. Don't get me wrong, I've used 545MHz @ 1.325V 24/7 for many months.
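The scaling arithmetic above can be spelled out. A small sketch using the BF4 test-range figures quoted in this exchange (FPS gain as a percentage versus the clock increase that produced it):

```python
# Scaling-efficiency arithmetic: how much FPS % did each clock % buy?
# Numbers are the BF4 test-range figures quoted above.

def scaling(fps_base, fps_oc, clk_base, clk_oc):
    """Return (FPS gain %, clock gain %) for an overclock step."""
    fps_gain = (fps_oc - fps_base) / fps_base * 100
    clk_gain = (clk_oc - clk_base) / clk_base * 100
    return fps_gain, clk_gain

hbm = scaling(110, 115, 500, 550)    # HBM 500 -> 550, GPU fixed at 1000
gpu = scaling(115, 122, 1000, 1050)  # GPU 1000 -> 1050, HBM fixed at 550
print(f"HBM OC: {hbm[0]:.1f}% FPS for {hbm[1]:.0f}% clock")
print(f"GPU OC: {gpu[0]:.1f}% FPS for {gpu[1]:.0f}% clock")
```

The GPU step returns more FPS per percent of clock than the HBM step, which is the point being made: core clock scales almost linearly, HBM only partially.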
Quote:


> Originally Posted by *Alastair*
> 
> What can I use to stress test my Fury's for stability? I am finding heaven bench to be inadequate and I can't play BF4 (my next best hope) as it just crashes all the time


Hmmm ...

I found Heaven handy TBH; for example, on several Fiji cards, if HBM clock / HBM voltage was the issue, the trees went black within minutes. In Valley it used to be the textures on the ground going. Every card I've tested will usually show instability more in one app vs another, so I first try to get it stable in that app, then usually the others fall in place after that.

My usual is [email protected] first, then 3DM GT1 loop, Heaven, Valley and then gaming. I never use OCCT/Furmark/Kombustor on a GPU.
Quote:


> Originally Posted by *Bojamijams*
> 
> Are you guys finding the HBM overclocks worthwhile? Seems like they do almost nothing in the 1080p FIrestrike. I am wondering if the FIrestrike Ultra would see more benefit.


Similar situation. I should finish the 3DM FS / FSE / FSU mega bench this weekend; the thread will hopefully be live Monday. Those 3 benches, 3 runs each at each GPU/HBM clock; max W at the wall plug for the rig, and idle W, will be stated. I will hopefully add the same set of data for a differing ASIC quality (LeakageID) Fury X as well.

1st set of data will be:-

1050 @ Undervolt 1137mV

1050 @ Stock VID 1200mV
1100 @ Stock VID 1200mV
1125 @ Stock VID 1200mV

1050 @ 1262mV
1100 @ 1262mV
1125 @ 1262mV
1145 @ 1262mV

1175 @ 1275mV

Those above clocks/VIDs @ 500MHz / 545MHz / 600MHz HBM. This comes to 27 benches for the above set of GPU clocks and 1 HBM clock, so for 3 HBM clocks = 81 benches; then throw in FS vs FSE vs FSU = 243 benches (only got 54 to go now). Once I do Fury X ASIC quality 64.4% vs 70% it will be a lotta benches.
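The bench-count arithmetic above, spelled out (9 GPU clock/VID combinations from the list, 3 runs each, 3 HBM clocks, 3 presets):

```python
# Bench-count arithmetic for the mega bench run described above.
gpu_vid_combos = 9   # the nine clock/VID settings listed (1050@1137 ... 1175@1275)
runs_each = 3        # 3 runs per setting
hbm_clocks = 3       # 500 / 545 / 600 MHz
presets = 3          # FS, FSE, FSU

per_hbm = gpu_vid_combos * runs_each   # benches per HBM clock
per_preset = per_hbm * hbm_clocks      # benches per 3DMark preset
total = per_preset * presets
print(per_hbm, per_preset, total)      # 27 81 243
```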
Quote:


> Originally Posted by *supermiguel*
> 
> So is it fair to compare scores between cards if you all have different CPUS? wouldnt it be better to run a more graphics card specific tests? Or is this as good as it gets?


Unless the CPU is bottlenecking the GPU, then regardless of differing CPUs, from what I've seen the graphics score, which is made up of graphics tests 1 & 2, is a fair compare of GPU performance. There is some good info in this thread on the 3DM13 score calc. I will throw in some benches of a stock i5 4690K vs an OC up to CPU 4.9GHz / cache 4.4GHz in the thread I post. Check out the HWBot subs; there are a few subs I match on GS where the sub had a better CPU. Also bear in mind that where you see a sub using Win 10, the user is getting a slight boost from that OS driver; I know I am, after comparing the same driver version on Win 7 vs Win 10.
Quote:


> Originally Posted by *supermiguel*
> 
> how would you check if your CPU is the bottleneck of your CF? like i have a 3930k will it handle 2? 3? 4? how can i test/check on this?


I've never done those kinds of checks, as I haven't really been in that position.
Quote:


> Originally Posted by *Johan45*
> 
> @gupsterg
> I felt it was something to do with HBM since I have never noticed this on any other card before. The first I noticed I was in a competition and running FSE . Started glitching at the beginning of test 2 with an unusally high FPS. The test finished but I couldn't submit the result. No way test 2 should be +50 FPS in that bench. Could be that I had never noticed with other cards but I don't think so. Typically if it's not stable enough GT2 will drop FPS.


+50FPS is valid for GT2 IMO; it's just ~80FPS depending on "situation".

3x 3DM FS stock clocks Fury X with my undervolt ROM on Win 7 with Crimson Relive v16.12.2 WHQL, GT2 ~66FPS.

4x 3DM FS my 24/7 OC ROM on Fury X no 3, 4th result is with latest 3DM, Win 7 Crimson Relive v16.12.2 WHQL, GT2 ~71FPS.

2x 3DM FS my 24/7 OC ROM on Fury X no 3, Win 10 Crimson Relive v16.12.2 WHQL, GT2 ~72FPS.

My HWBot sub on the 24/7 OC ROM, Win 7, older driver (will be redoing with Win 10), GT2 ~78FPS due to a tessellation load tweak.

How I see it, a stable OC like my 1145/545, which has had hundreds of hours of usage since I set it up months ago, won't surpass the figures above for GT2. The 3DM FS subs where the sub had ~80FPS as a *valid result*, which I was talking about before, I truly believe are glitched results, like I saw on the 1100/600 I posted in post 10414.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> odd Battlefield 4 and pretty much every other game I play gives me pretty good scaling.
> 
> Battlefield 4 in the test range. Standing on top of the building overlooking the island. 2560x1440 150% res scaling and ultra settings (no msaa)
> 1000/500 = 110 fps.
> 1000/550 = 115 fps.
> 1050/550 = 122-125fps
> 1100/550 = 130-133 fps.
> 
> 
> 
> So in 1000/550 vs 1000/500 you've netted 5FPS, as % = 4.5% boost for 10% OC on HBM. So in 1050/550 vs 1000/550 you've netted 7-10FPS, as % = ~6-9% boost for 5% OC on GPU. GPU clock from what I recall in my tests of synthetics/games you gain pretty much all the time. HBM on the other hand is poor gains for OC % and depending on use will sometimes give gains and other times not. Don't get me wrong, I've used 545MHz @ 1.325V 24/7 for many months.
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What can I use to stress test my Fury's for stability? I am finding heaven bench to be inadequate and I can't play BF4 (my next best hope) as it just crashes all the time
> 
> Click to expand...
> 
> Hmmm ...
> 
> I found Heaven handy TBH, for example on several Fiji cards if HBM clock / HBM voltage was issue the trees went black within minutes. Valley used to be textures on the ground going. Every card I've tested will usually show instability more in one app vs another, so I first try to get it stable in that app, then usually others fall in place after that.
> 
> My usual is [email protected] first, then 3DM GT1 loop, Heaven, Valley and then gaming. I never use OCCT/Furmark/Kombustor on GPU.
Click to expand...

It's funny, looking at those numbers they don't add up, right? I couldn't work it out myself. There is something strange happening when I alt+tab out of the game. In one instance I started the game, got into position and was doing 150fps; alt-tabbed and it went down to 122-ish.

I have decided to leave BF4. It isn't working; it keeps crashing regardless of OC settings on CPU/GPU. Tried re-installing the game and redoing drivers. I dunno what's up. So yeah.


----------



## gupsterg

I haven't got BF4 so I can't test the FPS myself; I don't dispute your FPS/testing data.

I don't know if it has an in-game benchmark. Games which have in-game benchmarks I usually use for comparing cards/OCs, etc, as basically each run will be the same "GPU work". I do game FRAPS tests at times, but I know in the back of my head it's a ball-park test for a number of reasons; as an example, I may do something different (ie mouse movement at a differing speed, etc) leading to differing FPS. This is where a set in-game benchmark or synthetic is better. I don't believe synthetic vs game for stability testing is any different in purpose; each has to load the GPU, each has an "engine". All I believe is each app uses the GPU in a different way, leading to different testing of the GPU/OC.

One thing I came across in my 3DM mega bench, which will also have Time Spy, is that it seems to show instability more; I don't know if it is due to the DX12 usage, etc. Both these results are from my Fury X no 3 (left older driver, right v16.12.2 WHQL), the one I've had the longest and keep pretty much all the time in my i5 rig. It needs only 1268mV to do those TS runs @ 1175MHz (545MHz HBM @ 1.325V), same as 3DM FS / FSE / FSU. Now look at Fury X no 8, 3 runs of TS back to back (v16.12.2 WHQL): it needs 1300mV for 1175MHz, compared with 1275mV for 1175MHz in 3DM FS / FSE / FSU. If I set it to 1275mV and increment +6mV at a time, this is what happens: 4 runs of TS. These 4 runs were back to back prior to the better scoring ones (all 7 runs together, all were valid results).

Why I've launched into this mega testing is that these GPUs are clocking the same and have what I believe is "good ASIC" VID (spoken about in the Fiji ASIC quality thread); Fury X no 3 is ~64% ASIC quality with 1.212V stock, and Fury X no 8 is 70% ASIC quality with 1.200V stock. Besides the voltage scenario above, Fury X no 8 needs a higher OCP than no 3, proving what The Stilt has said so many times (see the heading What is "ASIC Quality"? in the Hawaii bios mod OP).

Apologies for the bit of a ramble about the testing I'm doing; sorta hyped about it at the mo.

I also believe VR hits GPU hard, @xkm1948 has posted several times how his OC for general gaming fails for VR.


----------



## Alastair

Any one know a program that can give a really nice afterburner like OSD without it actually being afterburner?


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Any one know a program that can give a really nice afterburner like OSD without it actually being afterburner?


Playclaw5.

Try and get an older version of it. It's what I use exclusively.

Mine looks like this:



Hope this helps.

EDIT: My build version of it is 3105. Try and find that.

Also, when you first start it, it will nag you to pay; just click "Try It" every time you open it and it will work indefinitely without paying, and seemingly none of the features are locked off.


----------



## Bojamijams

My Fury is a really peculiar thing. I can do 1111/550 with stock voltage no problem, but even going +72mV in Trixx will not get me 1130 stable. Never encountered this before.

Anyway... is there a way for me to write a new BIOS that takes the memory overclock into account, so I can remove Trixx from my system and stay with my 1111/550?


----------



## jdorje

RTSS + HWiNFO


----------



## ressonantia

Quote:


> Originally Posted by *Bojamijams*
> 
> My Fury is a really peculiar thing. I can do 1111/550 with stock voltage no problem. But even going +72mV in Trixx will not get me 1130 stable. Never encountered this before.
> 
> Anyway.. is there a way for me to write a new bios that takes into account the memory overclock so I can remove Trixx from my system and stay with my 1111/550?


Yeah have a look at @gupsterg's thread here


----------



## Alastair

Does having different cards as your primary adapter in your CF setup affect your max OC? I ask because I recently switched my cards around, so card 1 became card 2 and card 2 became card 1, and I can't seem to clock as high now.


----------



## SirBubby

New Fury X owner here. Mine came yesterday; got it for $308 on Amazon. I'm upgrading from an MSI 390X. The 390X would reach 90C at stock speeds, so it's nice to have a water cooled card. I've been messing with overclocks and I didn't hit the lottery with this one: my ASIC score is only 65%. The best I can get stable, using Sapphire Trixx with the power limit at 50% and +72mV, is 1140/570. I used to use MSI Afterburner but couldn't find a way to unlock the memory clock.

I'm new to HBM and I'm trying to figure out how it compares to GDDR5 when it comes to VRAM usage. My 390X would use close to 6GB of VRAM in Battlefield 1 and Battlefront @ 1440p ultra settings. The Fury X only uses about 3.5GB of VRAM, yet I get 15-20 more FPS. Is the game or the card limiting the amount the game can use? Wouldn't I see reduced quality or speed by dropping it to 3.5GB?


----------



## supermiguel

You guys able to get better overclocking with custom cooling?


----------



## weespid

Quote:


> Originally Posted by *Alastair*
> 
> Yes greatly. Especially for AMD owners since it essentially nerfs our FX CPU's. Futuremark refuses to acknowledge and accept that the FX is a true 8 core processor. And so when the combined test is run it only runs on what Futuremark has identified to be physical cores. Which is probably to be expected as an overclocked FX would probably knock on the door of an I7 in this test if it got to flex all of its muscle.


Well then, for benching Firestrike try Windows 7 without the hotfix that changes the 8350 from 8 cores / 8 threads to 4 cores / 8 threads. That change was made to improve the Windows task scheduler because of the module design of the chips, but if all 8 threads are loaded equally it should do nothing. Or, if Futuremark uses integer compute rather than floating point, it may be better to run on just 4 threads AFAIK. Only way to find out is to test. Anyway, back on topic.

Does anyone know if changing clocks and voltage from MSI Afterburner with ReLive brings instability? Because I'm finding my 1060/545 at -35mV, which I had stable before on my Nano, is not even close to stable any more.
Quote:


> Originally Posted by *Alastair*
> 
> Does having different cards as your primary adapter in your CF set up effect your max OC? I ask this because I recently switched my cards around. So card 1 became card2 and card 2 became card 1. And I cant seem to clock as high now.


I would assume so, because if you're pushing the limit, the top card usually runs hotter and gets anywhere from 2-20% more load (based off my CF 7950s) in BF4.


----------



## Tgrove

Quote:


> Originally Posted by *SirBubby*
> 
> New Fury X owner here. Mine came yesterday; got it for $308 on Amazon. I'm upgrading from an MSI 390X. The 390X would reach 90C at stock speeds, so it's nice to have a water-cooled card. I've been messing with overclocks and I didn't hit the lottery with this one. My ASIC score is only 65%. The best I can get stable using Sapphire Trixx, with the power limit at 50% and +72mV, is 1140/570. I used to use MSI Afterburner but couldn't find a way to unlock the memory.
> 
> I'm new to HBM and I'm trying to figure out how it compares to GDDR5 when it comes to VRAM usage. My 390X would use close to 6GB of VRAM in Battlefield 1 and Battlefront @ 1440p Ultra settings. The Fury X only uses about 3.5GB of VRAM, yet I get 15-20 more FPS. Is the game or the card limiting the amount the game can use? Wouldn't I see reduced quality or speed by dropping to 3.5GB?


After using 2 Fury Xs for over a year, the only thing I can say is HBM just uses less VRAM than DDR5 for whatever reason. 4GB of HBM seems to be like 6GB of DDR5 as far as capacity goes.


----------



## Johan45

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> @gupsterg
> I felt it was something to do with HBM, since I have never noticed this on any other card before. The first time I noticed it, I was in a competition running *FSE*. It started glitching at the beginning of test 2 with an unusually high FPS. The test finished but I couldn't submit the result. There's no way test 2 should be +50 FPS in that bench. It could be that I had just never noticed with other cards, but I don't think so. Typically, if it's not stable enough, GT2 will drop FPS.
> 
> 
> 
> +50FPS is valid for GT2 IMO, it's just ~80FPS depending on "situation".
> 
> 3x 3DM FS stock clocks Fury X with my undervolt ROM on Win 7 with Crimson Relive v16.12.2 WHQL, GT2 ~66FPS.
> 
> 4x 3DM FS my 24/7 OC ROM on Fury X no 3, 4th result is with latest 3DM, Win 7 Crimson Relive v16.12.2 WHQL, GT2 ~71FPS.
> 
> 2x 3DM FS my 24/7 OC ROM on Fury X no 3, Win 10 Crimson Relive v16.12.2 WHQL, GT2 ~72FPS.
> 
> My HWBot sub on 24/7 OC ROM Win 7, older driver (will be redoing with Win 10), GT2 ~78FPS due to tessellation load tweak.
> 
> How I see it is that a stable OC like my 1145/545, which I've used for hundreds of hours since setting it up months ago, won't surpass the figures above for GT2. The 3DM FS subs where the sub had ~80FPS as a *valid result*, which I was talking about before, I truly believe are glitched results, like the one I saw on the 1100/600 I posted in post 10414.

I think you missed the "E". I was testing Extreme when I first noticed the "glitch", and 50 FPS is not right in GT2 on FSE.


----------



## gupsterg

Quote:


> Originally Posted by *Tgrove*
> 
> After using 2 fury x for over a year only thing i can say is hbm just uses less vram than ddr5 for whatever reason. 4gb hbm seems to be like 6gb ddr5 as far as capacity goes


4GB of HBM is just like 4GB of DDR5 as far as capacity goes.

Basically, what we see as VRAM usage in monitoring is an allocation of VRAM, not actual usage; see a few posts here.

For example, I was googling for performance data on Lords of the Fallen one day and noted on an RX 480 8GB a "VRAM usage" of up to 6GB @ 1080p max in-game settings, but on my Fury X I see 3.9GB @ 1440p at the same settings, so the driver or game is aware that it's a 4GB card and only allocates that much (at 1080p I see 3.5GB).
Quote:


> Originally Posted by *Johan45*
> 
> I think you Missed the "E" I was testing Extreme when I first noticed the "glitch" and 50 FPS is not right in GT2 on FSE

Ahhh yes, my mistake, I missed the E, and that answers what I was previously posting regarding the 3DM FS GT2 80FPS anomaly.
Quote:


> Originally Posted by *supermiguel*
> 
> You guys able to get better overclocking with custom cooling?


From what I've noted, members with WC don't have a much better OC for everyday use than stock AIO/air coolers, TBH. Alastair has 2x Fury CF with EKWB blocks, and ht_addict has a meaty setup and is still at 1150/550; see his post.


----------



## Bojamijams

Quote:


> Originally Posted by *gupsterg*
> 
> . Alastair has 2x Fury CF EKWB blocks and ht_addict has a meaty setup and still 1150/550, his post.


Should the mem be at 545, as per your previous discovery of the HBM clock steps?


----------



## supermiguel

Quote:


> Originally Posted by *Bojamijams*
> 
> Should the mem be at 545 as per your previous discovery of HBM and steps?


why 545?


----------



## Bojamijams

Quote:


> Originally Posted by *supermiguel*
> 
> why 545?


http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo/820#post_25241381


----------



## kfxsti

Just an odd question for you guys. Say my motherboard's PCIe lanes run at x16/x4. Would there be much of a bottleneck with two Furys? I still have my ASRock Z170 Pro4 (x16/x4) and prefer it over the Asus Maximus I have now... just wondering if it's worth the swap, or whether to just go ahead and grab a better ASRock board.


----------



## gupsterg

Quote:


> Originally Posted by *Bojamijams*
> 
> Should the mem be at 545 as per your previous discovery of HBM and steps?


Yes







.

I follow the steps since doing testing you linked







, prior to that I did not as I saw no test data confirming it. I had seen Fiji owners using the steps prior to testing as well, as they had seen the information originally posted by AMDMatt. He posted information on HBM clocking steps few months after Fiji was released (IIRC), on OCuk where he posts a lot, Guru3D and so on (on OCN he uses a different username).

As HBM OC has so little effect on FPS (ie can be lost in run to run variance depending on test), IMO only way to test is by doing VRAM memory test.


----------



## NightAntilli

Quote:


> Originally Posted by *kfxsti*
> 
> Just an odd question for you guys. Say my motherboard Pci-e lanes runs at x16-x4 . Would there be that much of a bottleneck with two fury's ? I still have my ASRock z170 Pro4 (x16-x4) and prefer it over the Asus Maximus I have now.. just wondering if it's worth the swap or just go ahead and grab a better ASRock board.


You sure it'll run in x16-x4 and not x8-x8?

In any case, there will be a difference... You can use this as a reference (although it's an extremely old test);
https://www.bit-tech.net/hardware/graphics/2007/10/12/crossfire_comparison_intel_x38_versus_p35/3119

There's also this, as another reference with an HD7970;

http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa

It's likely that the Crossfire scaling will be a more significant performance loss than the x4 bottleneck.
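For a rough sense of what the x4 slot gives up, theoretical PCIe 3.0 bandwidth (what a Z170 board provides) follows from the 8 GT/s line rate and 128b/130b encoding. A quick back-of-the-envelope sketch:

```python
# Theoretical one-way PCIe 3.0 bandwidth: 8 GT/s per lane with
# 128b/130b line coding, so each lane moves roughly 985 MB/s.
GT_PER_S = 8e9          # PCIe 3.0 transfer rate per lane
ENCODING = 128 / 130    # 128b/130b coding overhead

def lane_bw_gbps(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    return GT_PER_S * ENCODING / 8 * lanes / 1e9

for lanes in (16, 8, 4):
    print(f"x{lanes}: {lane_bw_gbps(lanes):.2f} GB/s")  # x16 ~15.75, x4 ~3.94
```

So the x4 slot has a quarter of the bandwidth on paper; whether that matters depends on how much the game shuffles across the bus, which is why the real-world tests linked above show smaller differences.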


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tgrove*
> 
> After using 2 fury x for over a year only thing i can say is hbm just uses less vram than ddr5 for whatever reason. 4gb hbm seems to be like 6gb ddr5 as far as capacity goes
> 
> 
> 
> 4GB HBM is just like 4GB DDR5 as far as capacity.
> 
> Basically what we see as VRAM usage in monitoring is an allocation of VRAM and not actual usage, see few posts here.
> 
> For example, I was googling for performance data on Lords of the fallen one day and noted on a RX 480 8GB "VRAM Usage" of upto 6GB @ 1080P max in game settings, but on my Fury X I see 3.9GB @ 1440P same settings, so the driver or game is aware that it's a 4GB card and only allocating as much (1080P I see 3.5GB).
> Quote:
> 
> 
> 
> Originally Posted by *Johan45*
> 
> I think you Missed the "E" I was testing Extreme when I first noticed the "glitch" and 50 FPS is not right in GT2 on FSE
> 
> 
> 
> Ahhh yes, my mistake, I missed the E, and that answers what I was previously posting regarding the 3DM FS GT2 80FPS anomaly.
> Quote:
> 
> 
> 
> Originally Posted by *supermiguel*
> 
> You guys able to get better overclocking with custom cooling?
> 
> 
> From what I've noted members with WC don't have much better an OC for everyday use as stock AIO/air coolers TBH. Alastair has 2x Fury CF EKWB blocks and ht_addict has a meaty setup and still 1150/550, his post.

I have yet to try beyond 1150. They dump a lot of heat into the system when OC'ed and my CPU doesn't like it much. I will try more come winter.


----------



## kfxsti

Quote:


> Originally Posted by *NightAntilli*
> 
> You sure it'll run in x16-x4 and not x8-x8?
> 
> In any case, there will be a difference... You can use this as a reference (although it's an extremely old test);
> https://www.bit-tech.net/hardware/graphics/2007/10/12/crossfire_comparison_intel_x38_versus_p35/3119
> 
> There's also this, as another reference with an HD7970;
> 
> http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa
> 
> It's likely that the Crossfire scaling will be a more significant performance loss than the x4 bottleneck.


I checked with GPU-Z and it is indeed showing x16/x4.
Still had nice Firestrike scores with that board. It's hard to test with the games I like, as BF1 and The Division Crossfire performance is horrid at best. lol


----------



## microchidism

Hey guys, just joined this club. Picked up a Nano after my 970 kicked the bucket.

Going to mess around with it this week; might get a Fury X or a 1070/980 Ti based on how it does, so stay tuned.


----------



## Sburms015

Late to the party, but http://www.3dmark.com/fs/11465170. Currently running at 1170 MHz core and 550 MHz HBM; haven't benched at those clocks yet, but it's gaming stable! My ASIC is 61.1%.


----------



## Thoth420

Quote:


> Originally Posted by *Sburms015*
> 
> Late to the party but http://www.3dmark.com/fs/11465170 currently running at 1170 Mhz core and 550 Mhz HBM but haven't benched at those clocks yet but it's gaming stable! My ASIC is 61.1%


Nice score and a pretty nice OC on that GPU, on what appears to be an older system (no digs, I rebuild way more than needed) with just a Fury X added in.


----------



## supermiguel

So with 2 Fury Xs I get about a 22800 Fire Strike score; when adding a 3rd one I get 23900-24100... The Fury X is so cheap right now ($300) that it's worth getting 2 of them, but NOT so much going with a 3rd one... almost no gain, though they sure look cool inside the case. I'm currently debating whether or not to return it.
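As a sanity check on those numbers (taking 24000 as the midpoint of the reported 3-card range, which is an assumption):

```python
def gain_pct(old: float, new: float) -> float:
    """Percent improvement going from score old to score new."""
    return (new - old) / old * 100

two_cards = 22800
three_cards = 24000  # midpoint of the reported 23900-24100
print(f"3rd card adds only ~{gain_pct(two_cards, three_cards):.1f}%")
```

Bear in mind the overall Fire Strike score also folds in the CPU physics test, which doesn't scale with extra GPUs, so the graphics sub-score alone would show a somewhat larger gain than this overall figure.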


----------



## Tgrove

Quote:


> Originally Posted by *gupsterg*
> 
> 4GB HBM is just like 4GB DDR5 as far as capacity.
> 
> Basically what we see as VRAM usage in monitoring is an allocation of VRAM and not actual usage, see few posts here.
> 
> For example, I was googling for performance data on Lords of the fallen one day and noted on a RX 480 8GB "VRAM Usage" of upto 6GB @ 1080P max in game settings, but on my Fury X I see 3.9GB @ 1440P same settings, so the driver or game is aware that it's a 4GB card and only allocating as much (1080P I see 3.5GB).
> 
> Ahhh yes, my mistake, I missed the E, and that answers what I was previously posting regarding the 3DM FS GT2 80FPS anomaly.
> From what I've noted members with WC don't have much better an OC for everyday use as stock AIO/air coolers TBH. Alastair has 2x Fury CF EKWB blocks and ht_addict has a meaty setup and still 1150/550, his post.


What I mean is I can play at settings that would not be possible on 4GB of DDR5.


----------



## diggiddi

Quote:


> Originally Posted by *supermiguel*
> 
> So with 2 fury X i get about 22800 fire strike score, when adding a 3rd one i get 23900-24100... The fury X are so cheap right now ($300) is worth it getting 2 of them, but NOT so much going with a 3rd one... almost no gain, but they sure look cool inside the case, im currently debating on returning or not


Dude let me give you my address


----------



## gupsterg

@Tgrove

Ahh, cool.

@supermiguel

I was tempted to go CF with the Fury X I recently got, but then thought better of it.

I figured 2x Fury X costs about what Vega's launch price may be, and Vega as a single-card solution is going to be a better upgrade. 4GB of HBM may not be an issue now, and may not be an issue for some time if I tweak settings, but it will be at some point, so again Vega seemed the better upgrade.

So I'd probably suggest keeping the one Fury X that clocks best out of the 3; this is what I'm doing.

I may even forgo going to Vega if the Fury X suffices for my gaming needs, but after seeing the AMD Instinct slides and Vega being 12.5 TFLOPS vs the 8.6 TFLOPS of Fury X, I'd like Vega just to see Folding@Home get a boost.
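Those TFLOPS figures fall straight out of shader count × clock × 2 FLOPs per clock (one fused multiply-add per shader per cycle); a quick check against the Fury X's 4096 shaders at 1050 MHz:

```python
def tflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS: shaders x clock x 2 ops (FMA) per cycle."""
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(f"Fury X: {tflops(4096, 1050):.1f} TFLOPS")  # ~8.6, matching the quoted figure
```

Peak TFLOPS is an upper bound, of course; as the Fiji discussion in this thread shows, real gaming performance doesn't track it linearly.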


----------



## dagget3450

Quote:


> Originally Posted by *gupsterg*
> 
> @Tgrove
> 
> Ahh, cool.
> 
> @supermiguel
> 
> I was tempted to go CF with the Fury X I recently got, but then thought I'd not.
> 
> I equated that 2x Fury X is costing similar to what launch price of Vega maybe and Vega as a single card solution is gonna be a better upgrade. 4GB HBM may not be an issue now, may also not be an issue in future for some time, if I tweak settings, but it will at some point, so again Vega seemed better as upgrade.
> 
> So I'd probably suggest keep the 1 Fury X that clocks the best out of the 3, this is what I'm doing.
> 
> I may even forgo going to Vega, if Fury X suffices for my gaming needs, but after seeing the AMD Instinct slides and Vega being 12.5 TFlops vs 8.6 TFlops of Fury X, I'd like Vega just to see Folding@Home get a boost.


2-way CF seems to be decent; it's 3/4-way that's a disaster. I wish we had the ability to customize CF like SLI has with software tools. Either way, waiting for Vega is a solid option for sure. It just needs to be faster than the GTX 1080 and hopefully close to the Titan X Pascal, at least for me to consider it.


----------



## supermiguel

Quote:


> Originally Posted by *gupsterg*
> 
> @Tgrove
> 
> Ahh, cool.
> 
> @supermiguel
> 
> I was tempted to go CF with the Fury X I recently got, but then thought I'd not.
> 
> I equated that 2x Fury X is costing similar to what launch price of Vega maybe and Vega as a single card solution is gonna be a better upgrade. 4GB HBM may not be an issue now, may also not be an issue in future for some time, if I tweak settings, but it will at some point, so again Vega seemed better as upgrade.
> 
> So I'd probably suggest keep the 1 Fury X that clocks the best out of the 3, this is what I'm doing.
> 
> I may even forgo going to Vega, if Fury X suffices for my gaming needs, but after seeing the AMD Instinct slides and Vega being 12.5 TFlops vs 8.6 TFlops of Fury X, I'd like Vega just to see Folding@Home get a boost.


I think the biggest issue right now is that not many games support multi-card configurations... heck, even Doom with the new API doesn't support multi-card.

And I agree with @dagget3450: 2 cards is good, 3/4 sucks... Can you set a card to do PhysX only, like Nvidia?


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> 2 wy cf seems to be decent its 3 /4 way thats a disaster. I wish we had the ability to customize cf like sli has with software tools. Either way waiting for vega is asolid option for sure. It just needs to be faster than 1080gtx and hopefully close to titanx pascal. At least for me to consider it.


+rep for info. Yeah, I'm hoping the same as you on Vega.
Quote:


> Originally Posted by *supermiguel*
> 
> I think the biggest issue right now is that not that many games support multiple card configuration... Heck even Doom with the new api doesnt support multi card.


Yeah, I have read "game X doesn't support CF at all" and also "game X did not have day-1 CF support", etc, etc. A reason why I've always stuck with single GPU.


----------



## neurotix

I agree guys, there's no point in adding a 3rd card in Crossfire; it's been this way forever, even back in the 7970 days.

In good scenarios you can get something like 95% scaling with two cards, but adding a 3rd just isn't worth it because the additional increase will never be anywhere near that much.

Overall I'm extremely happy with my 2 Furys for the price I paid, though like others said, I do worry about the low amount of VRAM becoming an issue as time goes on. I'd like to hold on to these until sometime in 2018/2019 when Navi comes out, and skip Vega, if I can.

I disagree about Vega being twice as strong as Fury though... traditionally the new flagship offers about a 25% performance improvement over the last generation's high-end cards. It was that way from the 7970 to the 290X, and from the 290X to the Fury X. I will be very surprised if Vega is twice as powerful as the Fury X; I'd say 50% more powerful is probably the best-case scenario here.
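That rule of thumb compounds multiplicatively, which is a quick way to sanity-check "2x Fury X" claims; a small sketch (the 25% per generation is the poster's estimate, not a measurement):

```python
def projected(base: float, per_gen_gain: float, gens: int) -> float:
    """Apply a per-generation uplift multiplicatively."""
    return base * (1 + per_gen_gain) ** gens

# ~25% per flagship generation, relative to a Fury X normalized to 100
print(projected(100, 0.25, 1))  # one generation:  125.0
print(projected(100, 0.25, 2))  # two generations: 156.25, still well short of 2x
```

Even at the stated best case of 50%, a single generation only lands at 150, which is consistent with the skepticism about a clean doubling.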


----------



## gupsterg

~1/3 of my 3DM runs data has been posted in this thread. Will add at least a 1/3 more tomorrow.


----------



## diggiddi

Can the performance of Vega be extrapolated (guesstimated) at all from its compute performance vs Fiji/Hawaii compute?


----------



## supermiguel

What are the max temps you guys get while running a Fury X? And what are your idle temps?


----------



## dagget3450

Quote:


> Originally Posted by *diggiddi*
> 
> Can the performance of Vega be extrapolated(guesstimated) from Compute performance vs Fiji/Hawaii compute at all?


I want to say people did something similar, making educated guesses based on the math, in threads and videos, but it's really just guessing at this point, I would say.


----------



## gupsterg

The endnotes of the slide from the Jan preview were:



It made me go OMG; I'm so hoping Vega is a big hit.

I hope it doesn't feel like Fiji, where we got so many SPs and the performance doesn't seem enough for the count. ROPs are the other aspect I'm hoping they go big on, and another item on the wish list is no negative scaling with voltage!

I do hope they make an AIO card as well; I really like the build quality/total package of the Fury X, TBH.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Endnotes of slide from Jan preview was:-
> 
> 
> 
> It made me go OMG, I'm so hoping Vega is a big hit.
> 
> I hope it doesn't feel like Fiji where we got so many SP and it doesn't seem enough performance for the count, ROPs is the other aspect I'm hoping they go big on and another one on the wish list is no negative scaling with voltage!
> 
> I do hope they make an AIO card as well, really like the build quality/total package of Fury X TBH.


Well Gupsterg, AdoredTV did a nice breakdown of the potential of Vega. He did not feel it was as good as he had hoped, but his estimates look very good from my point of view.

Nice work on the benches.

The short version is he figures +10% better performance-wise than the GTX 1080.
Quote:


> Originally Posted by *Johan45*
> 
> Short answer is yes; as to how much, I don't know. When testing for max GPU clock I run my CPU at stock to save heat in my cold loop.


I looked through your earlier posts. That is quite the cold loop.


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> Endnotes of slide from Jan preview was:-
> 
> 
> 
> It made me go OMG, I'm so hoping Vega is a big hit.
> 
> I hope it doesn't feel like Fiji where we got so many SP and it doesn't seem enough performance for the count, ROPs is the other aspect I'm hoping they go big on and another one on the wish list is no negative scaling with voltage!
> 
> I do hope they make an AIO card as well, really like the build quality/total package of Fury X TBH.


That AIO is really what made me get the Fury X over the 980 Ti when I was looking... the price was the same... the 980 Ti seemed to win out in some aspects, the Fury X in others... so the only thing that was seriously different was the AIO... really love that cooling solution... but I suppose I could get a water block and add it to my loop... it's coming time for me to rebuild my system soon anyway... maybe a Zen + Vega build. But in the interim, getting a cheap Fury X wouldn't be a totally bad solution if one needed it... Love the temps; even folding for hours on end it never reached 60C overclocked.


----------



## LionS7

On Guru3D I saw that 4GB is not enough for Resident Evil 7. The problem setting is "shadow cache", they say. Has anybody checked the game yet? For now, there are 2 games that really cannot be played maxed out with 4GB, even HBM: Rise of the Tomb Raider (Very High textures) and Mirror's Edge: Catalyst (Hyper textures). Is the R9 Fury X coming to the end of the road? All of those are Nvidia games, sure. AMD really killed Fiji with these 4GB.


----------



## dagget3450

Quote:


> Originally Posted by *LionS7*
> 
> In guru3d i saw that 4GB is not enough for Resident Evil 7. The problem setting is "shadow cache" they say. Did somebody checked the game yet ? For now, there is 2 games that really cannot be played with 4GB max, even HBM. Rise of the Tomb Raider (Very high tex) and Mirror's Edge: Catalyst (Hyper tex). Is R9 Fury X coming to the end of the road... All of those are Nvidia games sure. AMD really good killed Fiji with these 4GB.


Yeah, DX12 was supposed to do things like share memory. Too bad that was all BS. AMD might be doing more trickery with Vega, like texture streaming, so we will see. On one hand I find it ironic that not long after Fury came out, games started "requiring" more than 4GB of VRAM, and yet according to Steam stats, last I checked, not many people are using 4GB cards. Makes you wonder why that happened.

Anyways, time will tell.


----------



## kondziowy

Looks like 4GB is not good enough for 4K maxed in Resident Evil 7. But... you need to lower details to get 60 fps anyway. The teaser was running great for me (45-70fps) at 4K Very High, Shadows High, SSAO, Reflections Variable (this effect is broken visually anyway, so nothing lost honestly).

HBAO+ at 4K cuts the framerate from 60 to 35fps while power draw goes down from 300W to 250W, so utilisation is bad. Turning off the Nvidia stuff is the first thing I do in every game, and I recommend doing the same.
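The utilisation point can be put in numbers using the figures above: the framerate falls far more than the power draw, so the card is stalling rather than doing more work.

```python
def pct_drop(before: float, after: float) -> float:
    """Percentage lost going from before to after."""
    return (before - after) / before * 100

fps_cost = pct_drop(60, 35)      # HBAO+ at 4K costs ~41.7% of the framerate...
power_drop = pct_drop(300, 250)  # ...while power draw only falls ~16.7%
print(f"fps cost {fps_cost:.1f}% vs power drop {power_drop:.1f}%")
```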

At Guru3D 1440p the regular Fury beats the GTX 1070 and 980 Ti, so a big win for the 4GB card.

At PCGamesHardware the Fury X is on par with the 980 Ti, but they test with forced max AF in the drivers.

Anyway, 4GB is not enough, but it still beats the crap out of 8GB cards, so...


----------



## LionS7

Quote:


> Originally Posted by *dagget3450*
> 
> Yeah dx12 was supposed to do things like share memory. Too bad that was all bs. AMD might be doing mlre trickery with the vega like texture streaming so we will see. On one hand i find it ironic that not long after fury comes games "requiring" 4gb vram and yet according to steam stats last i checked not many using 4gb cards. Makes you wonder why that happened.
> 
> Anyways time will tell


Well, Nvidia happened... like in Mirror's Edge: Catalyst, with the 8GB VRAM marketing around GP104.
Quote:


> Originally Posted by *kondziowy*
> 
> Looks like 4GB is not good enough for 4K maxed in Resident Evil 7. But... you need to lower details to get 60 fps. Teaser was running great for me (45-70fps) at 4K Very High, Shadows High, SSAO, Reflections Variable(this effect is broken visually anyway so noting lost honestly).
> 
> HBAO+ at 4K is cutting framerate from 60 -> to 35fps and power draw is going down from 300W -> 250W so utilisation is bad. Turning off Nvidia stuff is the first thing I do in every game and I recommend to do so.
> 
> At Guru3d 1440p regular Fury beats GTX1070 and 980Ti so big win for 4GB card
> 
> At pcgamershardware Fury X on par with 980Ti but they test with forced max AF in drivers.
> 
> Anyway, 4GB is not enoug, but still beats the crap out of 8GB cards so


Yeah, I'm not using anything from GameWorks anymore, even if it's supported on the Radeon platform. Recently I stopped using HBAO+ too. I'm really tired of this "underground" marketing policy.


----------



## Alastair

Quote:


> Originally Posted by *kondziowy*
> 
> Looks like 4GB is not good enough for 4K maxed in Resident Evil 7. But... you need to lower details to get 60 fps. Teaser was running great for me (45-70fps) at 4K Very High, Shadows High, SSAO, Reflections Variable(this effect is broken visually anyway so noting lost honestly).
> 
> HBAO+ at 4K is cutting framerate from 60 -> to 35fps and power draw is going down from 300W -> 250W so utilisation is bad. Turning off Nvidia stuff is the first thing I do in every game and I recommend to do so.
> 
> At Guru3d 1440p regular Fury beats GTX1070 and 980Ti so big win for 4GB card
> 
> At pcgamershardware Fury X on par with 980Ti but they test with forced max AF in drivers.
> 
> Anyway, 4GB is not enoug, but still beats the crap out of 8GB cards so


Does someone have a comprehensive list of which "Nvidia" settings need to be turned off in games?


----------



## LionS7

Quote:


> Originally Posted by *Alastair*
> 
> Does someone have like a comprehensive list on what "nvidia" settings need to be turned off in games?


The ones supported on Radeon are few: HBAO+, PCSS (very unoptimized), and HairWorks. Those are the main ones. If you are playing Mirror's Edge: Catalyst, don't bother with the Hyper settings.


----------



## Bojamijams

Is HBAO+ different from regular HBAO?


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Does someone have like a comprehensive list on what "nvidia" settings need to be turned off in games?


Quote:


> Originally Posted by *LionS7*
> 
> Those who are supported by Radeon are few. HBAO+, PCSS (very unoptimize), Hairworks. Those are the main ones. If you are playing Mirror's Edge: Catalyst, don't bother with Hyper settings.


TXAA (Temporal AA, like in SC Blacklist) is Nvidia as well, right?
Quote:


> Originally Posted by *Bojamijams*
> 
> Is HBAO+ different from regular HBAO?


Also, HBAO and HBAO+ are both Nvidia. It's their fancy-named version of SSAO. I can't tell you the specifics of what is different about the +, but HBAO is Nvidia terminology for their ambient occlusion.

If I recall, AMD had their own special AO as well, which Deus Ex: Human Revolution uses, I believe...


----------



## kondziowy

https://www.overclock3d.net/reviews/software/resident_evil_7_biohazard_pc_performance_review/12
Fury X only 4% behind the GTX 1080. I knew that Fury would beat it out eventually in non-VRAM-bottlenecked scenarios. Just wait for Volta to release and it will be done.


----------



## supermiguel

For custom cooling do you guys recommend installing water blocks in series or in parallel?


----------



## battleaxe

Quote:


> Originally Posted by *supermiguel*
> 
> For custom cooling do you guys recommend installing water blocks in series or in parallel?


Mine are in series. Parallel only really has an advantage if your blocks are restrictive. Otherwise, just do it the way it works best for your loop. For me, it was series. But my blocks are not restrictive.
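The series-vs-parallel trade-off above is mostly about flow per block; a minimal sketch under an even-split assumption (the 300 L/h pump figure is hypothetical, and real parallel branches only split evenly if their restriction matches):

```python
# In series, every block sees the full pump flow in turn; in parallel,
# the pump's flow divides across the branches. Less flow per block means
# less cooling per block, unless the blocks themselves are restrictive
# enough that the parallel arrangement raises total loop flow.
pump_flow_lph = 300.0  # assumed pump output, litres/hour (hypothetical)

def per_block_flow(total_lph: float, branches: int, series: bool) -> float:
    """Coolant flow each block sees, assuming an even split in parallel."""
    return total_lph if series else total_lph / branches

print(per_block_flow(pump_flow_lph, 2, series=True))   # 300.0
print(per_block_flow(pump_flow_lph, 2, series=False))  # 150.0
```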


----------



## supermiguel

Quote:


> Originally Posted by *battleaxe*
> 
> Mine are in series. Parallel only really has an advantage if your blocks are restrictive. Otherwise, just do it the way it works best for your loop. For me, it was series. But my blocks are not restrictive.


I'm getting a 10C difference between stock and custom cooling; not sure if the problem is my blocks or my thermal paste application.


----------



## LionS7

@Thoth420
Yes, TXAA is an Nvidia thing, but it's not supported on Radeon GPUs, so don't think about it. It's a blurry version of temporal AA.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> @Thoth420
> Yes, TXAA is Nvidia thing, but it's not supported by Radeon gpu, so don't think about it. It's blurry version of Temporal AA.


Yeah, I never used it in the few games where it was listed in the options, even when I was using an Nvidia GPU, as I saw no IQ improvement over FXAA, and unlike FXAA it had a performance impact.


----------



## Performer81

In RE7, shadow cache has to be turned off for cards with 4GB or less.


----------



## Charcharo

HBAO+ generally runs better on AMD than Nvidia, at least from my personal experience. I really like it.

Most NV tech works with AMD, it just may have performance issues. Though things like HairWorks are OK on Fiji and quite good on Polaris (due to the tessellation improvements in Fiji and the PDA in Polaris).

These days I have less of a problem with Gamewreks. It is OK all in all. All that I am missing is the fine tuning for older games:
http://www.overclock.net/t/1608196/how-to-improve-image-quality-in-pre-dx10-games-with-nvidia-inspector

And hardware PhysX (also for older games, as newer ones use well-optimized, multi-threaded PhysX versions and can run OK on the CPU. Not great, but good enough).


----------



## Kana-Maru

Quote:


> Originally Posted by *Charcharo*
> 
> HBAO+ generally runs better on AMD than Nvidia, at least from my personal experience. I really like it.


From my tests in Rise of the Tomb Raider, Nvidia's "HBAO+" ran like straight garbage on my Fury X compared to the GTX 980 Ti or other NV Maxwell GPUs. Using the built-in AO is much better, since Nvidia's HBAO+ actually hurts the quality. It's noticeable.

And when I say it ran like garbage on my Fury X: during my benchmark tests I saw an 11.63% difference in overall FPS score using the built-in benchmark @ 1440p, and a 20.04% difference at 4K. I say no to HBAO+ after that experience. Plus, the built-in AO looks much better anyway.


----------



## gupsterg

@supermiguel

Did you get to the bottom of why there is a 10C difference between custom cooling and stock?

I'm eyeing up an EKWB block/backplate going cheap, as a backup if the AIO ever packed up. A bit hesitant, as I have no other WC parts, or I would have jumped at dipping my toe into custom WC'ing for the experience.

@bluezone

Cheers for the video links; I'd seen them though. I think until Vega is out, the jury will be out on where it will sit.

Yeah, it's been sweet doing the benches and getting some power readings. Adding more data each day.

@all members

Has anyone else recently experienced display corruption during regular OS usage on Fiji?

I've had this issue very rarely on all my Fiji cards, like once or twice only, but this morning I was just browsing OCuk in one tab and the AMD community forum in another and got the display corruption.

Known-good screen/cable (MG279Q via DP) on Win 7 Pro x64 with Crimson ReLive v16.12.2 WHQL, known-good OC via ROM. The card was sitting at idle clocks/voltage (300MHz/900mV/25C), with power efficiency enabled in the driver as per my usual daily setup.


----------



## Charcharo

Quote:


> Originally Posted by *Kana-Maru*
> 
> From my test in Rise of the Tomb Raider with Nvidia's "HBAO+" ran like straight garbage on my Fury X compared to the GTX 980 Ti or NV Maxwell GPUs. Using the built in AO is much better since Nvidia's HBAO+ actually kills the quality. It's noticeable.
> 
> Oh and when I say it ran like garbage on my Fury X, during my benchmark tests I saw a 11.63% difference in overall FPS score using the built in benchmark @ 1440p. I saw a 20.04% difference at 4K. I say no to HBAO+ after that experience. Plus the built in AO looks much better anyways.


I have experience with HBAO+ in other games; I won't ever bother installing ROTTR, so I cannot test it.

But in The Witcher 3 it is worth it.


----------



## supermiguel

@gupsterg I will take it apart next week, apply more thermal paste, and change it from series to parallel.

Currently getting:

Card 1 idle 33
Card 2 idle 31
Card 3 idle 30

Card 1 load 60
Card 2 load 45
Card 3 load 44

I think most people here get temps in the 20s at idle and the 40s under load, so mine is running a bit hot.


----------



## Kana-Maru

Quote:


> Originally Posted by *Charcharo*
> 
> I have experience with HBAO+ in other games, wont bother installing ROTTR ever so I can not test it.
> 
> But in WItcher 3 it is worth it.


I see. I no longer use Nvidia GameWorks tech, since it normally kills performance for no good reason. I can test The Witcher 3 as well [actually I already have], but during my benchmarks I turn off all Nvidia black-box tech.


----------



## Charcharo

Quote:


> Originally Posted by *Kana-Maru*
> 
> I see. I no longer use Nvidia Gamework tech since it normally kills performance for no good reason. I can test out the The Witcher 3 as well [actually I have already], but during my benchmarks I turn off all Nvidia blackbox tech.


Meh, I have nothing against the tech's effects. I can deal with somewhat worse performance if it gives good results (note: killing performance is bad, like old PhysX).

At least in the games I play, HBAO+ is a good option. It is barely costlier than SSAO and has better quality.


----------



## budgetgamer120

Quote:


> Originally Posted by *Performer81*
> 
> In RE7, shadow cache has to be turned off for cards with 4GB or less.


I kept it on and played the demo fine on a Fury X.


----------



## Sonikku13

Getting a Fury X on Monday - it'll be my stopgap until Vega 10 rolls around, hopefully I can keep from spending too much money though.


----------



## steadly2004

Quote:


> Originally Posted by *Sonikku13*
> 
> Getting a Fury X on Monday - it'll be my stopgap until Vega 10 rolls around, hopefully I can keep from spending too much money though.


That's what I did, only with just a fury.


----------



## damarad21

Great card!! And for a good price. Only worried about the 4GB VRAM; it seems that from now on it will not be enough. In any case I will stay with my Nitro Fury.
I'm pretty sure cards like the 1060 or 480 won't have enough muscle to use 8GB, and the current GTX 10XX Nvidia cards will fall short in DX12.


----------



## Minotaurtoo

I know this has been covered before... but I can't find it... if you have two Fury Xs, will the 4GB each combine to give you 8GB of VRAM? When I ask questions like this I feel such a noob.


----------



## budgetgamer120

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I know this has been covered before... but I can't find it... if you have two Fury Xs, will the 4GB each combine to give you 8GB of VRAM? When I ask questions like this I feel such a noob.


No, it is still 4GB of VRAM. DX12 has the ability to combine the memory, but that is up to the devs.


----------



## dagget3450

Yep, DX12 can supposedly combine VRAM, but so far it hasn't happened... we keep hoping... I don't see it happening even where it could; my guess is it might hurt future sales.
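To illustrate why the VRAM doesn't simply add up: under the default AFR-style multi-GPU rendering, every card keeps its own full copy of the working set, so the effective pool is the smallest card's VRAM, not the sum. Only DX12-style explicit multi-adapter, where a developer manually splits resources between GPUs, could expose the combined total. A toy model (not real driver or D3D12 code, just the accounting):

```python
def effective_vram_afr(cards_gb):
    # AFR/CrossFire mirrors all resources on every GPU,
    # so the budget is limited by the smallest card.
    return min(cards_gb)

def explicit_multi_adapter_pool(cards_gb):
    # What DX12 explicit multi-adapter *could* expose,
    # if the game splits its resources per GPU itself.
    return sum(cards_gb)

two_fury_x = [4, 4]
print(effective_vram_afr(two_fury_x))         # 4 -- still a 4GB budget
print(explicit_multi_adapter_pool(two_fury_x))  # 8 -- only with per-game dev work
```

That per-game developer effort is exactly why "up to the devs" is the operative phrase.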


----------



## Minotaurtoo

Quote:


> Originally Posted by *budgetgamer120*
> 
> No, it is still 4GB of VRAM. DX12 has the ability to combine the memory, but that is up to the devs.


I was thinking that's the way it was... but couldn't remember one way or the other... not that it matters now... with these 12-hour shifts and working weekends, the only action my GPU has seen is folding... it seems to do pretty well at it.


----------



## barracudax01

Hi guys,
I need some advice... I'm currently running 2 HD 7950s in CF. I'm not a serious gamer and only run at 1080p. Looking to get rid of the CF setup, so I'm thinking of replacing them with a single R9 Fury.
The only concern I have is the 4GB going into the next few years.
So basically I'm asking... should I, or should I not?


----------



## steadly2004

Quote:


> Originally Posted by *barracudax01*
> 
> Hi guys
> I need some advice... I'm currently running 2 HD 7950s in CF. I'm not a serious gamer and only run at 1080p. Looking to get rid of the CF setup, so I'm thinking of replacing them with a single R9 Fury.
> The only concern I have is the 4GB going into the next few years.
> So basically I'm asking... should I, or should I not?


I would wait for Vega if you can hold out a couple of months. Otherwise, the Fury is a good price/performance buy regardless of the 4GB VRAM. It is for sure enough for 1080p.


----------



## barracudax01

Thanks.
I would love to get a Vega, but I think it will be a bit too costly for my pocket. I was thinking a Fury now, and then a Vega in 1.5-2 years; since it would only be released around June, a Vega at the end of next year :thumb:


----------



## steadly2004

Quote:


> Originally Posted by *barracudax01*
> 
> Thanks
> I would love to get a VEGA, but i think it will be a bit to costly for my pocket,. i was thinking a Fury now, and then a VEGA in 1 1/2 - 2 years, Since it would only be released around june, so A VEGA at the end of next year:thumb:


Yeah, Newegg has the Fury OC+ for $260 plus a $20 rebate. Cheaper than anywhere else I could find except used on eBay, and cheaper than 480s that perform worse, unless the Fury's 4GB becomes a VRAM limit.


----------



## supermiguel

So I'm having an issue: I can game and run 3DMark, all good, but if I try running Heaven or FurMark, after a few seconds my monitor loses its signal and all the little LED lights on the card go off. I tried 3 different PSUs, same thing. The only thing that changed was that I installed a new water block on my motherboard, but temps seem normal. Also, this is a stock card and it was working a few days ago.


----------



## barracudax01

You guys in America are lucky; you don't pay as much as the rest of the world for GPUs.
Prices here in South Africa are mad....

RX 480 - $360
GTX 1060 - $360
GTX 1070 - $560
GTX 1080 - $900
R9 FURY - $510 (new)
R9 FURY - $340 (used)

So I imagine a Vega will cost around the $800 mark when it arrives....
So that's why I'm thinking of buying a used Fury now....


----------



## Alastair

Quote:


> Originally Posted by *barracudax01*
> 
> You guys in America are lucky; you don't pay as much as the rest of the world for GPUs.
> Prices here in South Africa are mad....
> 
> RX 480 - $360
> GTX 1060 - $360
> GTX 1070 - $560
> GTX 1080 - $900
> R9 FURY - $ 510....NEW
> R9 FURY - $ 340...USED
> 
> So i imagine a VEGA will cost around the $800 mark when it arrives....
> so that's why i'm thinking of buying a used fury now....


I feel ya


----------



## barracudax01

Hey Alastair... how's life in the east today?
What GPUs are you running, and what do you think of me getting a Fury? Worth it, or not?


----------



## Alastair

Quote:


> Originally Posted by *barracudax01*
> 
> Hey Alastair... how's life in the east today?
> What GPUs are you running, and what do you think of me getting a Fury? Worth it, or not?


I have a pair of them, mate, with 3840 shaders unlocked on both as well. Fantastic cards, I think. Managed to pull 1150MHz on them under a custom loop with a bit of voltage. They make 2560x1440 look like a piece of cake. I look forward to seeing how they handle 3440x1440 shortly.


----------



## barracudax01

Quote:


> Originally Posted by *Alastair*
> 
> I have a pair of them, mate, with 3840 shaders unlocked on both as well. Fantastic cards, I think. Managed to pull 1150MHz on them under a custom loop with a bit of voltage. They make 2560x1440 look like a piece of cake. I look forward to seeing how they handle 3440x1440 shortly.


Awesome, mate...
I think I'll pull the trigger and get one. I doubt that 1080p will max out the 4GB of VRAM anytime soon.
I'll save up for a Vega and a waterblock :thumb:


----------



## supermiguel

Quote:


> Originally Posted by *supermiguel*
> 
> So I'm having an issue: I can game and run 3DMark, all good, but if I try running Heaven or FurMark, after a few seconds my monitor loses its signal and all the little LED lights on the card go off. I tried 3 different PSUs, same thing. The only thing that changed was that I installed a new water block on my motherboard, but temps seem normal. Also, this is a stock card and it was working a few days ago.


This is exactly the problem I'm having: https://community.amd.com/thread/183786 :|


----------



## domrockt

Quote:


> Originally Posted by *Alastair*
> 
> I have a pair of them mate. 3840 unlocked on both as well. Fantastic cards I think. Managed to pull 1150mhz on them under a custom loop with a bit of voltage. They make 2560x1440 look like a peice of cake. I look forward to seeing how they handle 3440x1440 shortly.


I can speak for one Fury @ 3440x1440: there are games that hit the 4GB wall, but almost always with "ultra shadow settings".
I found massive FPS hits in Doom (2016) and Shadow of Mordor with shadows on Ultra.
Both games run all maxed out at between 50 and 90 fps. So until Vega, even one Fury fills my needs.


----------



## gupsterg

Quote:


> Originally Posted by *supermiguel*
> 
> So I'm having an issue: I can game and run 3DMark, all good, but if I try running Heaven or FurMark, after a few seconds my monitor loses its signal and all the little LED lights on the card go off. I tried 3 different PSUs, same thing. The only thing that changed was that I installed a new water block on my motherboard, but temps seem normal. Also, this is a stock card and it was working a few days ago.


Quote:


> Originally Posted by *supermiguel*
> 
> This is exactly the problem im having https://community.amd.com/thread/183786 :|


I think your issue is different from the linked thread; the main difference is they're having the issue at idle/low loads.

To me it seems Heaven/FurMark is tripping OCP on the card; when OCP is violated, the display will go blank and the card will power down. I have never used FurMark / Kombustor / OCCT on a GPU; usually the driver is supposed to recognize it as a "power virus" and intervene, not allowing the GPU to go to max.

Every Fury X I have had did not have the latest ROM out of the box; you may wish to try the updated ROM from AMD.

I'm willing to mod a ROM with slightly higher OCP for you, but I would not recommend running FurMark on such a ROM.


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> Every Fury X I have had has not had the latest ROM out of box.


Yes, that's true. Even if the BIOS number is the same, the actual BIOS is not. So yeah, download the latest BIOS from the AMD website; it is super stable. With the default BIOS on my card, after crashing I was getting "no video input" from the monitor. With the latest BIOS from AMD I'm actually seeing the crash details, like in Star Wars: Battlefront, which is just asking for more core voltage.

Apr 5, 2016 is the latest on the AMD forums for the R9 Fury X and R9 Nano.


----------



## budgetgamer120

Quote:


> Originally Posted by *LionS7*
> 
> Yes, that's true. Even if the BIOS number is the same, the actual BIOS is not. So yeah, download the latest BIOS from the AMD website; it is super stable. With the default BIOS on my card, after crashing I was getting "no video input" from the monitor. With the latest BIOS from AMD I'm actually seeing the crash details, like in Star Wars: Battlefront, which is just asking for more core voltage.
> 
> Apr 5, 2016 is the latest on the AMD forums for the R9 Fury X and R9 Nano.


I did not know I had to update bios for a GPU.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> Yes, that's true. Even if the BIOS number is the same, the actual BIOS is not. So yeah, download the latest BIOS from the AMD website; it is super stable. With the default BIOS on my card, after crashing I was getting "no video input" from the monitor. With the latest BIOS from AMD I'm actually seeing the crash details, like in Star Wars: Battlefront, which is just asking for more core voltage.
> 
> Apr 5, 2016 is the latest on the AMD forums for the R9 Fury X and R9 Nano.


I think I will give this a shot as well... maybe it will fix my BF4 issue. Not that it matters much; I have one day a week to game at best, until well after Vega is on the market, so I am probably upgrading anyway.


----------



## LionS7

Quote:


> Originally Posted by *Thoth420*
> 
> I think I will give this a shot as well...maybe it will fix my BF4 issue. Not that it matters much...I have one day to game at best a week until well after Vega is on the market so I am probably upgrading anyways.


What is your issue with Battlefield 4?
Yeah everyone, update your BIOS with those versions if you have an R9 Fury X or R9 Nano: https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware.


----------



## budgetgamer120

Quote:


> Originally Posted by *LionS7*
> 
> What is your issue with Battlefield 4 ?
> Yeah everyone, update your bios with those versions, if you have R9 Fury X or R9 Nano - https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware.


There is no information on what the BIOS fixes or what it adds...


----------



## HagbardCeline

Quote:


> Originally Posted by *budgetgamer120*
> 
> There is no information on what the BIOS fixes or what it adds...


The main thing I remember from that BIOS update is that UEFI support was added. (I needed this in order to install Windows from a thumb drive at the time.)


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> What is your issue with Battlefield 4 ?
> Yeah everyone, update your bios with those versions, if you have R9 Fury X or R9 Nano - https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware.


I get random major framerate drops, from my 144 max and ~90 average down to 2-5 fps, for 15-second intervals. I even get this with all settings on low or off. A few other games also have microstutter or hard-freezing issues, but I have never seen a driver crash.


----------



## Bojamijams

Quote:


> Originally Posted by *LionS7*
> 
> What is your issue with Battlefield 4 ?
> Yeah everyone, update your bios with those versions, if you have R9 Fury X or R9 Nano - https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware.


So the regular Furys are OK then and need no update? I have a Sapphire Fury Nitro and see nothing on Sapphire's website about a BIOS.


----------



## Alastair

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LionS7*
> 
> What is your issue with Battlefield 4 ?
> Yeah everyone, update your bios with those versions, if you have R9 Fury X or R9 Nano - https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware.
> 
> 
> 
> There is no information on what the BIOS fixes or what it adds...

It also ups the voltage, from whatever the stock VID is, to 1.25V for better overclocking.


----------



## supermiguel

Quote:


> Originally Posted by *Thoth420*
> 
> I get random major framerate drops from my max 144 and avg of 90ish down to between 2-5 fps for 15 second intervals. I even get this with all settings to low or off. A few other games also have microstutter or hard freezing issues but I have never seen a driver crash.


What sound card do you have? Or what headset? I used to get this issue with an Astro headset; normally unplugging and replugging them fixed it (weird, I know).
Quote:


> Originally Posted by *gupsterg*
> 
> I think your issue is different than linked thread, main difference their having issue at idle/low loads.
> 
> To me it seems Heaven/Furmark is violating OCP on card, when OCP is violated the display will go blank and card will power down. I have never used Furmark / Kombustor / OCCT on a GPU, usually the driver is supposed to see it as a "power virus" and intervene and not allow GPU to go to MAX.
> 
> Every Fury X I have had has not had the latest ROM out of box, you may wish to try this updated ROM from AMD.
> 
> I'm willing to mod a ROM with slightly higher OCP for you, but I would not recommend running Furmark on such ROM.


Yeah, all of mine are on the latest BIOS. What is weird is that if I have that specific card in on its own, the issue happens, but if I have all 3 in with CrossFire enabled, the issue goes away... weird.

But since you talked about modding the BIOS, I have a question: is there a way to disable ULPS in the video card BIOS???


----------



## Thoth420

Quote:


> Originally Posted by *supermiguel*
> 
> What sound card do you have? Or what headset? I used to get this issue with an Astro headset; normally unplugging and replugging them fixed it (weird, I know).
> Yeah, all of mine are on the latest BIOS. What is weird is that if I have that specific card in on its own, the issue happens, but if I have all 3 in with CrossFire enabled, the issue goes away... weird.
> 
> But since you talked about modding the BIOS, I have a question: is there a way to disable ULPS in the video card BIOS??? Also, after I find a stable OC, what is the procedure to write those values into the BIOS of the card so I don't have to deal with software?


Just a Fury X with an EK block and backplate on it. I don't have a working headset, nor has one ever been plugged into this system, so the sound goes out the back I/O to a pair of Bose satellite speakers.


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> Just a Fury X with an EK block on it and backplate. I don't have a working headset nor has one ever been plugged into the system as of yet so the sound is piped out of the back I/O to a pair of Bose Satellite speakers.


Have you been monitoring what processes are running on your computer? Anything stick out? It sounds to me like something is running, or trying to run, and tying up system resources...


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> Have you been monitoring what processes are running on your computer? Anything stick out? It sounds to me like something running or trying to run and tying up system resources...


That was my first suspicion, but unless it is LCore.exe, which my mouse needs to function with the binds stored in the device's onboard memory, then no. (I have the LGS software set not to start at boot and I never run it; just a single profile off the mouse itself, which requires the aforementioned process to run. But that is such a common piece of software that I doubt it would be an issue even if the entire suite were running.) I have never had any issues running it in the past, and everything else that could be disabled is.

I have gone through a pretty long list of things and none had any effect. The issue only occurs occasionally, so it isn't the easiest to troubleshoot. I am not too concerned... I'm building a whole new system once the new line of GPUs is out. It could be some aspect of W10, since I am still getting used to everything they added to the OS package. Talk about a ton of unnecessary junk...


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> My first suspicion but unless it is the LCore.exe which my mouse uses to function with the binds it has via the devices onboard memory then no. (I have the LGS software set to not start with boot and it I never run just a single profile off the mouse itself which requires the aforementioned to run but that is such a common piece of software that I doubt it would be an issue even if the entire suite was running) I have never had any issues running that in the past but everything else that could be disabled is.
> 
> I have gone through a pretty long list of things and none have any effect. The issue only occurs on occasion so it isn't the easiest to troubleshoot. I am not too concerned....building a whole new system once the new line of GPUs are out. It could be some aspect of w10 however since I am still getting used to everything they added to the OS package. Talk about a ton of unnecessary junk...


So weird... I wonder if it is some random junk from W10. Well, hopefully you'll be able to get rid of the issue with the new build, at the very least!


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> So weird... I wonder if it is some random junk from W10. Well, hopefully you'll be able to get rid of the issue with the new build, at the very least!


Yeah, the only other thing I haven't tried is increasing my RAM frequency past 2133, but I don't see how being on the lower end of DDR4 speeds would have that much effect on performance. This is the second mobo I have tried with this CPU, so it could be a problem with the chip, but those tend to be very rare and it doesn't show any physical signs of defect (bent pins, etc.). The first mobo would not stabilize at all, even with everything at stock clocks or any tweaks thereof (that was an ASUS Z170-Deluxe, however). This MSI board has never seen a single crash, yet this issue still occurs.


----------



## LazarusIV

Quote:


> Originally Posted by *Thoth420*
> 
> Yeah the only other thing I haven't tried is increase my RAM frequency past 2133 but I don't see how it being on the lower end of speed for DDR4 would have that much effect on performance. This is the second mobo I have had to try with this CPU so it could be a problem with the chip but those tend to be very rare and it doesn't show any physical signs of defect(bent pins etc.). The first mobo would not stabilize at all even with everything at stock clocks or any tweaks thereof. (That was an ASUS Deluxe Z170 however). This MSI board has never seen a single crash at all but this issue occurs.


You're making me think it's the chip... any way to RMA it?


----------



## Thoth420

Quote:


> Originally Posted by *LazarusIV*
> 
> You're making me think it's the chip... any way to RMA it?


I am not sure, as I have never had to RMA a CPU before. I guess even if the process is long and arduous it is worth a shot, since my plan is to snag an SR7 Black and an AM4 board as soon as I can and toss them on a wet bench to play with until Vega and the 1080 Ti come out, to decide which GPU path to take. I haven't been impressed by Intel since the 2600K. If Ryzen can even just compete, that is good enough for me.


----------



## gupsterg

Quote:


> Originally Posted by *Bojamijams*
> 
> So the regular Fury's are OK then and need no update? I have a Sapphire Fury Nitro and see nothing on Sapphire's website about a BIOS


The Fury Tri-X and Nitro have had ROM updates, IIRC; in the Fiji BIOS mod thread, search for a post by Szaby59. You may also get a newer ROM by contacting Sapphire via their support ticket system; I have in the past for Hawaii and Fiji.
Quote:


> Originally Posted by *Alastair*
> 
> Also upps the voltage from whatever the stock ViD is to 1.25V for better overclocking.


If you flash a Fury with a Fury X ROM you will gain increased VID. This is due to the increased GPU clock sending the ASIC profiling loopy, plus how the ROM has a DB for setting VID based on ASIC quality.

Likewise, if you flash say a Fury with a Fury OC-clocks ROM, you will increase VID. For example, the Tri-X STD is 1000MHz; if you flash to the Tri-X OC, which is 1040MHz, you will see increased VID. In this post are screenies of me lowering the GPU clock on a Nitro and the effect it has on VID.
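As a rough illustration of the kind of ASIC-quality-to-VID lookup described above (the breakpoints and voltages below are entirely made up for the sketch; the real values live in the ROM's voltage tables and differ per SKU and per clock):

```python
# Hypothetical ASIC-quality -> VID table: better-binned silicon
# gets assigned a lower stock voltage. Illustrative values only.
ASIC_VID_TABLE = [
    (70.0, 1.250),  # ASIC below 70%  -> 1.250 V
    (80.0, 1.212),  # 70% to <80%     -> 1.212 V
    (90.0, 1.175),  # 80% to <90%     -> 1.175 V
]
DEFAULT_VID = 1.150  # 90%+ silicon

def vid_for_asic(asic_pct: float) -> float:
    """Return the stock VID for a given ASIC quality percentage."""
    for upper_bound, vid in ASIC_VID_TABLE:
        if asic_pct < upper_bound:
            return vid
    return DEFAULT_VID

print(vid_for_asic(75.3))  # 1.212
```

The point of the sketch is only the mechanism: a higher-clocked ROM consults the same kind of table with different entries, which is why flashing an OC ROM shifts the VID the card picks.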
Quote:


> Originally Posted by *supermiguel*
> 
> But since you talked about modding the BIOS, i have a question is there a way to disable ULPS in the video card bios???


I haven't come across one. I used the Linux driver to translate the ROM and saw no clear reference to it.


----------



## bluezone

If anyone is studying up on Vega, here is an interesting read; there's a 26-minute video as well.

http://www.techarp.com/articles/amd-vega-memory-architecture-qa/


----------



## steadly2004

Quote:


> Originally Posted by *bluezone*
> 
> If anyone is studying up on Vega, here is an interesting read; there's a 26-minute video as well.
> 
> http://www.techarp.com/articles/amd-vega-memory-architecture-qa/


Nice link! I have been following Vega, but hadn't come across this specific video.

I very much enjoy these videos from the employees at AMD, and not just the public showings for the press.

I really think this type of interview reveals more about Vega and its new features than most of the simplified slides. Hopefully these improvements actually translate into improvements in real usage scenarios. At 16:10 he actually talks about how it applies to gaming.


----------



## angelsalam

Fiji is really a strange beast O__o'
But most importantly, Fiji likes to be COOL.
I was doing some undervolting tests with my Fury Nitro OC+ in WattMan. I managed a voltage of 1075mV at a 1050MHz core clock, but to maintain that low voltage my card needs to stay below 55 degrees.
I can manage to stay below 55 degrees by setting my fan speed at 45-47% (still quiet, with low noise).
Then I looked at my card's power consumption in HWiNFO: I was BELOW 200W the entire time, between 140 and 190W during my whole gaming session (Overwatch with an uncapped framerate, around 130-150 fps at epic settings).
I'm more and more impressed by Fiji's power consumption; with my undervolt I get WAY BETTER performance than a GTX 980, and even a GTX 980 Ti in newer games, while consuming LESS power than a 980 :O
I'm so annoyed at AMD; if they had optimized each sample's voltage better, the Fiji cards could have been a real success :/
I really don't regret my Fury purchase instead of a more expensive GTX 1070.
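Those numbers line up with first-order CMOS scaling: dynamic power goes roughly with V² at a fixed clock, so dropping from an assumed ~1.25V stock VID to 1.075V should cut dynamic draw by about a quarter. A back-of-the-envelope sketch (it ignores static/leakage power and VRM efficiency, so treat the saving as an upper bound):

```python
def dynamic_power_ratio(v_new: float, v_old: float,
                        f_new: float = 1.0, f_old: float = 1.0) -> float:
    """First-order CMOS model: P_dyn ~ C * V^2 * f."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Undervolt from an assumed 1.25 V stock VID to 1.075 V at the same clock:
ratio = dynamic_power_ratio(1.075, 1.25)
print(round(ratio, 2))     # 0.74 -> roughly 26% less dynamic power
print(round(275 * ratio))  # 203 -- e.g. a 275 W board-power figure drops to ~203 W
```

It also hints at why Fiji "likes to be cool": leakage rises with temperature, so keeping the card under 55°C reduces the voltage needed for stability in the first place.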


----------



## Minotaurtoo

Quote:


> Originally Posted by *angelsalam*
> 
> Fiji is really a strange beast O__o'
> but most importantly fiji like to be COOL.
> I was doing some undervolting test with my fury nitro OC+ with wattman. I managed a voltage of 1075mv at 1050mhz core clock, but to maintain that low voltage, my card need to be below 55 degrees.
> I can manage to stay below 55 degrees by setting my fan speed at 45-47% ( still quite with low noise).
> Then i looked at the power consumption of my card with hwinfo, i was BELOW 200w the entire time, with my power consumption between 140 to 190w during my entire gaming session ( Overwatch with uncapped framerate, around 130-150 fps at epic settings).
> I'm more and more impressed by Fiji power consumption, i get WAY BETTER performance than a gtx 980and even a GTX 980 ti in newer games while consuming LESS power than a 980 with my undervolt :O
> I'm so pissed at AMD, if they optimized sample's voltage better, the Fiji cards could have been a real success for AMD :/
> I really don't regret my fury purchase instead of a more expensive gtx 1070


Yes, it is a strange beast... mine at stock can't top 1075MHz... but if I mod the BIOS, even keeping 1.25V, suddenly I can hit 1100. Gup helped me with a BIOS to top that by keeping it cooler, but without adding ridiculous volts I can't hit 1150. Memory hits 545 easily... 600 is barely stable... I've thought about adding volts to the memory.


----------



## Alastair

How are your Furys running the Wildlands beta?


----------



## angelsalam

Really badly, indeed. At 1080p with every option maxed out I get 44 fps in the integrated benchmark.
In game I get around 40-45 fps, but when driving my fps tanks to 30-35.
It's really weird. Just because of that I haven't played much of the beta; really bad performance and an unoptimized game a month from release.
Disabling HBAO+ or enhanced god rays only adds 1 fps each; dropping shadows from Very High to High gives you 4-5 fps.
And that is with my Fury at 1156MHz.


----------



## Thoth420

Quote:


> Originally Posted by *angelsalam*
> 
> really bad indeed, at 1080p with every option maxed out, i get 44 fps on the integrated benchmark.
> In game i get around 40-45 fps, but when driving, my fps tanks to 30-35.
> It's really weird. Just because of that i haven't played much of the beta, really bad performance and an unoptimized game a month from release.
> Disabling HBAO+ or enhanced god rays only add 1 fps each. putting Shadows from very high to high gives you 4-5 fps.
> and that's is with my fury at 1156mhz


Saved me a download. Shame to hear about the poor performance....how does it look though?


----------



## angelsalam

It looks average; not ugly, not beautiful.
For me, the game's requirements don't match the visuals.
The gameplay is weird; it seems like it was made primarily for a gamepad. Playing with a keyboard isn't accurate for driving or moving around; the only accurate thing is aiming with the mouse.
The most troubling thing is the lack of a real cover system: the character goes into cover when he's near a wall, but moving around can leave you uncovered. It's not a locked cover system like in The Division.
The game is too arcadey while being advertised as a tactical shooter. Don't get me wrong, you have a lot of ways to coordinate your attacks, but the gameplay is too arcadey for my taste.
And finally, there are A LOT of bugs.
I was really looking forward to this game since the first gameplay footage at E3 (btw, the visuals have been really downgraded since that trailer!!), but this beta made me worry about the game; there are too many things the devs have to fix (performance and bugs) and they can't do it in one month. So I'll pass my turn and won't buy this game.
I'll stick with The Division, which looks FAR BETTER, runs better (a locked 75 fps on my Fury) and is more fun (I got really bored playing this beta :/ )


----------



## Thoth420

Quote:


> Originally Posted by *angelsalam*
> 
> Tt looks average, not ugly and not beautiful.
> For me, the game requirement doesn't match the visuals.
> The gameplay is weird, it seems like it was made primary for a gamepad, playing with keyboard isn't accurate for driving, moving around. the only accurate thing is aiming with the mouse.
> The most troubling thing, is the lack of real cover system, the character go to cover when he's near a wall, but moving around can make you uncovered. it's not a locked cover system like in the division.
> The game is too arcade while advertised like a tactical shooter. Don't get me wrong, you have a lot of ways to coordonate your attacks, but the gameplay is too arcadish for my taste.
> And finally there is A LOT of bugs.
> I was really looking forward this game since the first gameplay footage at E3 ( btw the visuals has been really downgraded since that trailer!!) but this beta made me worry about this game, there is too much things the devs have to fix ( performance and bugs) and they can't do it in one month. So i'll pass my turn and won't buy this game.
> I'll stick with the division who looks FAR BETTER and runs better (looked 75fps on my fury) and more fun ( i got really bored playing this beta :/ )


That is pretty sad, especially about the cover system. It seems Ubi has killed the last of the 3 major Clancy franchises now... such a shame.


----------



## Notarnicola

Does the BIOS mod still work with the new ReLive driver (17.1.2)?


----------



## Sonikku13

Back to a Fury X (I had a Nano before, but ended up with a Fury X because of that Amazon sale, though I got mine on eBay). Gonna love being able to run 1080p/1440p max settings again in the games I play, such as Final Fantasy XIV, Civilization VI, and Battlefield 1.


----------



## budgetgamer120

Quote:


> Originally Posted by *Sonikku13*
> 
> Back to a Fury X (had a Nano before, but I ended up with a Fury X cause of that Amazon sale, though I got it on eBay). Gonna love being able to run 1080p/1440p max settings again in games I play, such as Final Fantasy XIV, Civilization VI, and Battlefield 1.


You were not able to max games with the Nano?


----------



## HyeVltg3

I'm finding it really hard to find Rise of the Tomb Raider and Witcher 3 benchmarks for the R9 Fury on the latest, or close-to-latest, drivers. Does anyone have a good source?

I'm contemplating between jumping on the amazing R9 Fury deal that's been on Amazon and Newegg, or buying a 480 and CrossFiring it later down the line.

I'd pair it with an i7 and a 144Hz 1080p FreeSync monitor. I don't really NEED to game at 144fps, but the monitor has FreeSync, so I would love to finally see what all the fuss is about. So really, Fury vs. 480 (now, and then maybe another 480 when Vega releases if prices drop; I cannot afford both 480s now, and cannot afford a Nano/Fury X now at ~$600 CAD+; currently running my i7-4790K's Intel HD 4600... haha).


----------



## Sonikku13

Quote:


> Originally Posted by *budgetgamer120*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> Back to a Fury X (had a Nano before, but I ended up with a Fury X cause of that Amazon sale, though I got it on eBay). Gonna love being able to run 1080p/1440p max settings again in games I play, such as Final Fantasy XIV, Civilization VI, and Battlefield 1.
> 
> 
> 
> You were not able to max games with the Nano?

Not that... it's because I wanted a 480 for its 8GB of VRAM, but found out it was worse performance-wise than the Nano. Then the Fury X turned out to be cheaper than the Nano when I bought it.


----------



## Bojamijams

So is the Fury exhibiting negative scaling due to voltage increases at every level? Or just at a certain +mV offset?

Because according to Buildzoid, any amount of voltage offset gives negative scaling

http://cxzoid.blogspot.ca/2016/02/fury-x-testing-results.html

It seems like the best way to do it is just to flash a modified BIOS with a 1.3V VID, never touch voltage offsets, set the power limit to +50% and see what overclock you can get with that, right?


----------



## Wuest3nFuchs

Hello guys and girls !

Bought an R9 Fury Nitro from Sapphire for ~300 euros on Caseking, nice price!

The Fury Tri-X I had last year had a reproducible issue with monitors: I'd watch a movie and then boom, artifacts in a three-line row; sometimes it even artifacted while logging back into Windows, doing nothing at all... I hope they fixed this!
The Tri-X was the coolest GPU I've had in 15 years of gaming. The frametimes felt much more stable and smoother, the image quality was better, the colours, omg... all this over a GTX 980.









Fury should arrive tomorrow









What would you guys say about an i7-2700K @ 4 GHz, or should I give the CPU more speed due to CPU overhead?


----------



## steadly2004

Quote:


> Originally Posted by *Wuest3nFuchs*
> 
> Hello guys and girls !
> 
> Bought a r9 fury nitro from sapphire ~300 euros on caseking, nice price !
> 
> The Fury Tri-X I had last year had a reproducible issue with monitors: I'd watch a movie and then boom, artifacts in a three-line row; sometimes it even artifacted while logging back into Windows, doing nothing at all... I hope they fixed this!
> The Tri-X was the coolest GPU I've had in 15 years of gaming; the frametimes felt much more stable than on my GTX 980
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fury should arrive tomorrow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What would you guys say about an i7-2700K @ 4 GHz, or should I give the CPU more speed due to CPU overhead?


Doesn't have to be any more OC, but it doesn't hurt. If it's stable at a higher clock and you have adequate cooling, run it. If you have instability, back it down.


----------



## Wuest3nFuchs

If I run into CPU overhead, I hope to achieve ~4.8 GHz minimum.

I already maxed out my memory from 1866 to 2133; maybe 2400 MHz is too much... I tried, but Windows won't boot. Maybe I have to loosen the timings, but no time for that now.

So is there anything about the new drivers I should know? The last one I used back then was the final Catalyst driver, before they came out with Radeon Software Crimson.


----------



## kondziowy

Quote:


> Originally Posted by *Bojamijams*
> 
> So is the Fury exhibiting negative scaling due to voltage increases at every level? Or just at a certain +mV offset?
> 
> Because according to Buildzoid, any amount of voltage offset gives negative scaling
> 
> http://cxzoid.blogspot.ca/2016/02/fury-x-testing-results.html
> 
> It seems like the best way to do it is just to flash a modified BIOS with a 1.3V VID, never touch voltage offsets, set the power limit to +50% and see what overclock you can get with that, right?


This is news to me. I tested only in 3DMark though. I get better scores with higher core clock and voltage. If it's not stable, adding voltage makes it stable, up to 1150 MHz, my last stable overclock, which is only possible with +96 mV; that gives the best score.


----------



## MAMOLII

Quote:


> Originally Posted by *Minotaurtoo*
> 
> yes it is a strange beast... mine at stock can't top 1075... but if I mod the bios, even while maintaining the 1.25v, suddenly I can hit 1100... gup helped me with a bios to top that by keeping it cooler, but without adding ridiculous volts I can't hit 1150... mem hits 545 easy... 600 barely stable... thought about adding volts to mem


Hi, can you please upload your modded BIOS? Is only the fan profile modded, or the TDP and GPU voltage too? Vishera and Sabertooth owner here.


----------



## Notarnicola

Does anyone know what the max safe temp is for "GPU VR VDDC Temperature"? In-game I see 76°C max.
I run my Fury X at 1090/600 MHz 24/7.


----------



## Minotaurtoo

Quote:


> Originally Posted by *MAMOLII*
> 
> Hi please can u upload your modded bios? its only fan profile modded? or tdp and gpu votlage?vishera and sabertooth owner here


I tried to upload, but obviously I didn't know what I was doing, as it kept giving me errors... but the gist of it: I set the core clock to 1100 MHz, set the memory to 545, and set the OverDrive limits to 2000 and 600 respectively. Then I set the temp target to 55°C and left the throttle temp alone. TDP I raised to 300, TDC A is 325, and MPDL W is 350; not much change there, but enough that I never see any downclocking due to power usage. Edit: I forgot to mention I set the voltage for the DPM7 state to 1.275... this was just enough to make my card maintain 1.25 V under load.

I used the fiji bios editor from the bios editing thread and ati win flash from amd.
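For anyone collecting these tweaks before trying them, here is an illustrative sketch in Python. The numbers come straight from the post above; the field names are informal labels (not the Fiji BIOS editor's actual identifiers), and the 1.3 V ceiling in the check is a commonly cited community limit for Fiji on stock cooling, not an AMD spec.

```python
# PowerPlay values described in the post, gathered in one place.
# Field names are informal labels, NOT the Fiji BIOS editor's identifiers.
mod = {
    "core_clock_mhz": 1100,
    "mem_clock_mhz": 545,
    "overdrive_core_limit_mhz": 2000,
    "overdrive_mem_limit_mhz": 600,
    "temp_target_c": 55,
    "tdp_w": 300,
    "tdc_a": 325,
    "mpdl_w": 350,
    "dpm7_voltage_v": 1.275,
}

def sanity_check(m):
    """Basic consistency checks before flashing anything."""
    # Target clocks must sit inside the OverDrive limits.
    assert m["core_clock_mhz"] <= m["overdrive_core_limit_mhz"]
    assert m["mem_clock_mhz"] <= m["overdrive_mem_limit_mhz"]
    # Sustained power limit should not exceed the maximum power delivery limit.
    assert m["tdp_w"] <= m["mpdl_w"]
    # Commonly cited community ceiling for Fiji DPM7 voltage (assumption).
    assert m["dpm7_voltage_v"] <= 1.3
    return True

sanity_check(mod)
```

Nothing here touches hardware; it is only a checklist of the relationships between the values (clock vs. limit, TDP vs. MPDL) that are easy to get wrong in the editor.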


----------



## LionS7

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I tried to upload, but obviously I didn't know what I was doing, as it kept giving me errors... but the gist of it: I set the core clock to 1100 MHz, set the memory to 545, and set the OverDrive limits to 2000 and 600 respectively. Then I set the temp target to 55°C and left the throttle temp alone. TDP I raised to 300, TDC A is 325, and MPDL W is 350; not much change there, but enough that I never see any downclocking due to power usage. Edit: I forgot to mention I set the voltage for the DPM7 state to 1.275... this was just enough to make my card maintain 1.25 V under load.
> 
> I used the fiji bios editor from the bios editing thread and ati win flash from amd.


Can you tell me which BIOS you used for editing on the R9 Fury X?


----------



## Minotaurtoo

I used the latest from amd's own site... https://community.amd.com/external-link.jspa?url=http%3A%2F%2Fsupport.amd.com%2Fen-us%2Fdownload%2Fgpu-firmware-download


----------



## Wuest3nFuchs

The R9 Fury seems to handle my games better than my 980 did; only in some OpenGL games can the Fury sometimes not beat the 980, e.g. Farming Simulator... Does anyone know a fix for the 30 FPS restriction?

ASIC: 59.8%









The new drivers are great compared to the ones I used with my first Fury two years ago!

*Little question regarding OC tools:*
Which tool should I try: MSI Afterburner, Sapphire TriXX, or WattMan?


----------



## Notarnicola

Use Sapphire TriXX.


----------



## Flamingo

Well then, finally sold my R9 Nano. Great card, great performance - never disappointed me.

Sold it because of the lack of AMD support around rendering apps. Especially sad I couldn't test the Nano with AMD's upcoming Blender plugin. But AMD's been taking too long in this area, whereas nvidia owners already enjoy mature CUDA support for such applications.

I could go Vega, but unlikely.

Here's to waiting for Vega and how nvidia responds to it \o/


----------



## MrKoala

Quote:


> Originally Posted by *Flamingo*
> 
> Sold it because of lack of AMD support around rendering apps. Especially sad I couldnt test out the Nano with AMD's upcoming blender plugin. But AMD's been taking too long in this area, whereas nvidia owners already enjoy matured CUDA support for such applications


Haven't played with Blender for a while, but isn't Blender's AMD GPU support fixed since the kernel-split patch?


----------



## Flamingo

Quote:


> Originally Posted by *MrKoala*
> 
> Haven't played with Blender for a while, but isn't Blender's AMD GPU support fixed since the kernel-split patch?


They do have OpenCL support under Cycles, and GPU rendering was better than CPU, but only just. I compared my temperatures against other rendering apps such as LuxRender, and my temps wouldn't go as high under Blender, so it seemed like a stopgap, like the GPU wasn't being utilized to its max. And knowing that CUDA support was there, and much much faster, was always a nagging thought.

Just to give an idea:

https://community.amd.com/thread/208616

AMD cards are crazy strong in compute and hence OpenCL based rendering, but they need the underlying software support which is meh (or improving at its own pace).


----------



## MrKoala

Are CUDA devices actually faster than comparable AMD GPUs these days? I loosely remember that back when Blender Cycles couldn't compile well on AMD, people were testing both the CUDA and OpenCL (1.1) implementations on the same NVIDIA hardware and often got identical performance, down to the exact second of rendering time.


----------



## Flamingo

http://blenchmark.com/gpu-benchmarks

A 750 Ti is faster than Fiji there.

Where there is proper NVIDIA support and poor AMD support, yep, NVIDIA cards are faster.


----------



## Minotaurtoo

The biggest reason I love my Fury X really has very little to do with the performance... it's the cooling system. That AIO: no fuss, no mess, and so far no problems; even after 3 days of folding at full power it stayed below 60°C. And using it as exhaust, it keeps my case cooler than any card I've ever had. I actually thought of going with nvidia many times, because they seem to have the driver advantage and many times game developers seem to favor them as well... but to be honest, it's strong enough that even with that disadvantage it does quite well... still loving my Fury X... oh, that nice red Radeon logo looks cool too


----------



## ht_addict

Quote:


> Originally Posted by *Minotaurtoo*
> 
> the biggest reason I love my fury x really has very little to do with the performance... but it's the cooling system... That AIO.. no fuss, no mess and so far no problems and even after 3 days of folding at full power it kept below 60C. and using it as exhaust it keeps my case cooler than any card I've ever had. I actually thought of going with nvidia many times because they seem to have the driver advantage and many times game developers seem to favor them as well... but to be honest, is strong enough that even with that disadvantage it does quite well... still loving my fury x... oh... that nice red radeon looks cool too


I loved the AIO, but ended up going with EKWB blocks. Currently folding at 100% with temps at 32-34°C. I'll always be an AMD fanboy. Got my Fury Xs, loving the fact that I got a piece of new tech with HBM memory. And it's plenty fast for my gaming at 4K.


----------



## weespid

Quote:


> Originally Posted by *Wuest3nFuchs*
> 
> Hello guys and girls !
> 
> Bought a r9 fury nitro from sapphire ~300 euros on caseking, nice price !
> 
> The Fury Tri-X I had last year had a reproducible issue with monitors: I'd watch a movie and then boom, artifacts in a three-line row; sometimes it even artifacted while logging back into Windows, doing nothing at all... I hope they fixed this!
> The Tri-X was the coolest GPU I've had in 15 years of gaming. The frametimes felt much more stable and smoother, the image quality was better, the colours, omg... all this over a GTX 980.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fury should arrive tomorrow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What would you guys say about an i7-2700K @ 4 GHz, or should I give the CPU more speed due to CPU overhead?


I run a 2600 at 4.1 with a Nano clocked at 1060/545. GTA V was my biggest issue with CPU overhead. Mind you, I haven't played The Witcher 3 or AC Syndicate.


----------



## bluezone

PCPER had a video interview with David Kanter, very smart guy







(assume Wayne and Garth worship position).











Sorry, too lazy to look for the GIF.

Dissecting AMD Zen Architecture - Interview with David Kanter

https://www.pcper.com/reviews/Processors/Dissecting-AMD-Zen-Architecture-Interview-David-Kanter






For those who are interested.

Oh, also hot off the presses.

Radeon Software Crimson ReLive Edition 17.2.1 Release Notes.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.2.1-Release-Notes.aspx

Win 10 64

http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Win 7 64

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Wuest3nFuchs

Quote:


> Originally Posted by *weespid*
> 
> I run a 2600 at 4.1 with an nano clocked 1060/545 GTA v was my biggest issue with CPU overhead. Mind you I have not played the Witcher 3 or AC syndicate


How much memory do you have installed on the motherboard?
I didn't have problems while testing my Fury on [email protected], mostly VSR 1440p, in GTA V, BF1, SW Battlefront, DOOM, CoD MW2, Ark, Arma 3, Insurgency, Elite Dangerous, Project Cars, CoH2.

The good part: better IQ and better, more stable frametimes than I ever had on my 980 @ 1443 MHz!
I also never played W3 or AC Syndicate...









So, back to GTA V: what resolution did you choose to play, 1080p or 1440p?

cheers, fox


----------



## Performer81

Somehow my HBM memory improved over time.








In the past I got frequent red graphics glitches at 545 MHz, but recently none at all. It's run flawlessly for over a week now. Or maybe it's the newer drivers.


----------



## weespid

Quote:


> Originally Posted by *Wuest3nFuchs*
> 
> How much memory do yo u have installed on the motherboard?
> I didn't had problems while testing my fury on [email protected] mostly vsr 1440p in GTA V,BF 1, SW BF,D00M ,Cod mw2,Ark ,Arma3 ,Insurgency ,Elite Dangerous ,Project Cars ,Coh2 .
> 
> The goods were better IQ ,better and more stable frametimes than i ever had on my 980 @1443mhz!
> I also never played W3 or AC Syndicate....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So back to GTA V what resolution did you choose to play 1080 or 1440p?
> 
> cheers fox


I have 16 GB of RAM at 1685 MHz, 8-8-8-22. I play at 1080p ultrawide, or 1080p ultrawide with two normal 1080p monitors in Eyefinity. I'm a bit of a weird one








You could be right about the RAM speed being the bottleneck, as it increases with CPU frequency (I run a 105.3 BCLK on a non-K chip). Then again, I take my frame rates from driving full speed through the city: 55-60 FPS, and if I stand still it's like 75-80. No advanced graphics, 4x AA, and everything else maxed, going off the top of my head. The game is also installed on an SSD.

I don't have SW Battlefront or BF1, but BF4 ran great, DOOM too; both Tomb Raider and Rise of the Tomb Raider run well, all while using less power than my old 290.


----------



## xkm1948

Sniper Elite review from TPU. It seems that with properly implemented DX12 and async compute, the Fury X is finally shining. Given that Vulkan and DX12 are being used more and more, I feel our Fiji can keep up with future games pretty well. Fine wine is true!


----------



## gupsterg

Quote:


> Originally Posted by *Minotaurtoo*
> 
> the biggest reason I love my fury x really has very little to do with the performance... but it's the cooling system... That AIO.. no fuss, no mess and so far no problems and even after 3 days of folding at full power it kept below 60C. and using it as exhaust it keeps my case cooler than any card I've ever had. I actually thought of going with nvidia many times because they seem to have the driver advantage and many times game developers seem to favor them as well... but to be honest, is strong enough that even with that disadvantage it does quite well... still loving my fury x... oh... that nice red radeon looks cool too


At first I didn't like the AIO, just due to seeming like a mare to handle card for installation compared with air cooled. Now I don't think I'd want a GPU without AIO







. It handles overclocking well; at stock clocks with an undervolt and an 89 FPS FRTC cap I find it extremely quiet. Recently I had an urge to go full custom WC on my rig, but after pricing up the setup and what I might gain on temps, it seemed very uneconomical, especially as I may not gain any more performance.

FreeSync has been another great bit of tech with this GPU, that I'm enjoying using and no premium for it compared to G-Sync.

Thoroughly enjoying the 1440P gaming experience on Fury X with FreeSync. Frankly best ever GPU as total package in my PC gaming experience since 1996.


----------



## Wuest3nFuchs

Quote:


> Originally Posted by *weespid*
> 
> I have 16 GB 1685mhz 8-8-8-22 for ram. I play at 1080p ultra wide or 1080p ultra wide with two normal 1080p monitors in eyefinity. I'm an bit of an weird one
> 
> 
> 
> 
> 
> 
> 
> 
> you could be right with the ram (speed) being the bottleneck as it increases with CPU frequency since I run an 105.3 bclk (non k chip) but then again I also take my frame rates from driving full speed through the city 55-60 if I stand still it's like 75 to 80 no advanced graphics 4x AA and everything else maxed going off of the top of my head. The game is also installed to an ssd.
> 
> I don't have as sw bf or bf1 but bf4 ran great doom also both tomb Raider and rise of the tomb Raider run well also all while using less power than my old 290


OK, nice info man, thx!

Forgot to mention: I'm using the 17.1.2 driver









Let me test my memory downclocked to 1600 MHz; then I'll tell you if I noticed anything, and I'll record it... if you have Afterburner or use HWiNFO, I'd say turn it on and record it too... I'll post my results here *if allowed* ^^

*Results:*



The lowest low I saw was 54, and I only run a 60 Hz monitor; I need one of those FreeSync WQHD ones









ocn_1600mhzmemory_gtav_fury.CSV 369k .CSV file


So what happens if you play on only one monitor?

At the moment I only have issues with DOOM on Vulkan, not on OpenGL... and that's the next thing: it shows OpenGL 4.3 when I enable the in-game info????

> www.amd.com/en-gb/products/graphics/desktop/r9

*Footnote: OpenGL® 4.5 support available in the AMD Catalyst™ 15.30 WHQL driver.*

The bad part is that some games I play use OpenGL 4.5 and they run really badly, especially Farming Simulator 2015, but it could also be the engine... I'm a bit lost here


----------



## LionS7

Quote:


> Originally Posted by *xkm1948*
> 
> Sniper Elite review from TPU. It seems with proper implemented DX12 and Async Compute FuryX is finally shining. Given the fact that Vulkan and DX12 are used more and more I feel like our Fiji can keep up with future games pretty well. Fine wine is true!


Imagine if it had 8 GB of HBM.







The test on gamegpu.com is interesting too. The card handles For Honor beautifully as well.

For Honor GPU test (updated with a release version)
http://gamegpu.com/action-/-fps-/-tps/for-honor-beta-test-gpu


----------



## Sonikku13

The wait for Ryzen, and therefore the end of my CPU bottlenecking my Fury X, is stressing me out. I know the 8-core Ryzen should outperform even an FX-9590. What I'm stressed about is being able to snatch one up at launch, where Ryzen's performance will fall, etc...

I'm currently on an A10-7850K, and I refuse to go from quad to quad... nor pay an exorbitant amount for an 8-core, nor pay for a platform that will be outdated in a month or already is. Pretty much the only option left is Ryzen.

I get 20 FPS in Idyllshire in FFXIV... with my Fury X and A10-7850K. With a Radeon RX 480 and a Core i5-4460, I got 40 FPS in Idyllshire. That screams CPU bottleneck, as the Fury X is more powerful than the 480; hopefully Ryzen will fix it.

The only new games I want this year, so far, are FFXIV: Stormblood and maybe ESO: Morrowind.

Having my Fury X truly flex its muscle when I get my hands on Ryzen will be nice... it's hamstrung right now.


----------



## Skyl3r

Quote:


> Originally Posted by *xkm1948*
> 
> Sniper Elite review from TPU. It seems with proper implemented DX12 and Async Compute FuryX is finally shining. Given the fact that Vulkan and DX12 are used more and more I feel like our Fiji can keep up with future games pretty well. Fine wine is true!


That's cool to see! That's one thing I love about AMD: somehow my hardware just gets better the longer I let it sit


----------



## LionS7

Has anybody else had graphics artifacts on the Battlefront Tatooine map with Crimson 17.2.1? They appear on the snow maps too.


----------



## steadly2004

Quote:


> Originally Posted by *LionS7*
> 
> Has anybody else had graphics artifacts on the Battlefront Tatooine map with Crimson 17.2.1? They appear on the snow maps too.


I would test it and reply, but I don't have the game.


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> Has anybody else had graphics artifacts on the Battlefront Tatooine map with Crimson 17.2.1? They appear on the snow maps too.


Yet to upgrade to 17.2.1; still using the 16.12.2 WHQL 3rd Jan release on Win 7 x64, and no issues in SWBF for me. I play quite a bit (~117 hrs shown in Origin for it).


----------



## Radox-0

Dumb question: using a 4K TV and the Nano in an HTPC. Is there any way, when using the HDMI 1.4 on the Nano (damn, why did AMD not put 2.0 on this card? It was perfect for it!), to have titles running at 2560x1440 @ 60 Hz? It seems I'm always stuck with the 30 Hz option no matter what, or maybe I missed something.

If I don't turn V-Sync on in-game I can have it running higher than 30 Hz, but then I get a lot of tearing.


----------



## Acoma_Andy

Quote:


> Originally Posted by *Radox-0*
> 
> Dumb question, using a 4k TV and the Nano in a HTPC. Is there any way when using the HDMI 1.4 on the Nano (damm why did AMD not put 2.0 on this card, was perfect for it!!!!) I can have titles running at 2560*1440 @ 60hz? Seems I am always stuck with the 30hz option no matter what, or maby I missed something.
> 
> I can if I do not turn V-Sync on in game have it running higher then 30hz, but then get a lot of tearing.


If the TV has a DisplayPort input, you can play games at 60 Hz without issues.


----------



## dagget3450

Quote:


> Originally Posted by *Radox-0*
> 
> Dumb question, using a 4k TV and the Nano in a HTPC. Is there any way when using the HDMI 1.4 on the Nano (damm why did AMD not put 2.0 on this card, was perfect for it!!!!) I can have titles running at 2560*1440 @ 60hz? Seems I am always stuck with the 30hz option no matter what, or maby I missed something.
> 
> I can if I do not turn V-Sync on in game have it running higher then 30hz, but then get a lot of tearing.


If you don't have DisplayPort and only HDMI, check out CRU and the AMD Pixel Clock Patcher by ToastyX. They may allow you to do what you want.
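For context on why a custom resolution can work here at all: HDMI 1.4's TMDS clock is nominally capped at 340 MHz, and for 8-bit RGB the pixel clock equals the TMDS clock, so 1440p60 can fit where 4K60 (~594 MHz) cannot. A rough sketch of the arithmetic, assuming the standard CVT reduced-blanking raster of 2720 x 1481 total for a 2560x1440 mode (those timing totals are taken from the CVT-RB tables, not measured on the Nano):

```python
# Pixel clock needed for a video mode: total raster size times refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

HDMI_14_TMDS_LIMIT_MHZ = 340  # nominal HDMI 1.4 single-link ceiling

# 2560x1440 with CVT reduced blanking: 2720 x 1481 total raster (assumed).
qhd60 = pixel_clock_mhz(2720, 1481, 60)
print(round(qhd60, 1), qhd60 < HDMI_14_TMDS_LIMIT_MHZ)
```

So a reduced-blanking 1440p60 mode needs roughly 242 MHz, comfortably inside the HDMI 1.4 limit, which is why adding it via CRU (with the pixel clock patcher to lift driver-side caps) can succeed where the TV's default EDID only advertises 4K30/1440p30.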


----------



## Radox-0

Quote:


> Originally Posted by *Acoma_Andy*
> 
> If the TV has a displayport you can play games at 60hz without issues.


Sadly it does not, though I guess most TVs don't in general
Quote:


> Originally Posted by *dagget3450*
> 
> If you dont have diplayport and only hdmi, check out CRU and AMD Pixel Clock patcher by Toasty X. It may allow you to do what you want.


Brilliant, will give these a shot. Thanks


----------



## Acoma_Andy

Modded the cooling on my R9 Nano, the thread can be found here.


----------



## microchidism

For anyone wondering, the Fury / Fury X and Nano mounting-hole spacing is 64mm x 64mm. I'm guessing AMD's future HBM cards will be the same. The HBM and GPU die seem to be about the same height (I had read somewhere that the HBM sat a lot higher).

Also, out of curiosity, what kind of hit do you guys think the Fury X will take when Vega comes out? Right now Fury Xs go for ~$300-340 used


----------



## domrockt

Quote:


> Originally Posted by *microchidism*
> 
> For anyone wondeirng, The Fury / Fury x and Nano mounting hole spacing are 64mm x 64mm. I'm guessing the other HBM cards in the future from AMD are going to be the same. The HBM and GPU die seem to be about the same height (I had read somewhere that the HBM was a lot higher)
> 
> Also, out of curiosity what kind of hit you guys think the Fury X will take when Vega comes out? Right now the Fury X go for ~300-340 used


I think they will go cheap, like a regular Fury, about 200€, because a lot of users will want to sell their Furys for Vega money









that is my plan for my fury ;P


----------



## microchidism

I ask as I have a Fury X and a 1070... I need to figure out which one I'm going to sell now, and which to keep and sell when Vega comes out


----------



## Thoth420

Quote:


> Originally Posted by *microchidism*
> 
> For anyone wondeirng, The Fury / Fury x and Nano mounting hole spacing are 64mm x 64mm. I'm guessing the other HBM cards in the future from AMD are going to be the same. The HBM and GPU die seem to be about the same height (I had read somewhere that the HBM was a lot higher)
> 
> Also, out of curiosity what kind of hit you guys think the Fury X will take when Vega comes out? Right now the Fury X go for ~300-340 used


Link to people still selling Fury Xs for that much? About to leap... mine is blocked already as well. I'll use a pleb card till Vega.


----------



## looncraz

Quote:


> Originally Posted by *microchidism*
> 
> Also, out of curiosity what kind of hit you guys think the Fury X will take when Vega comes out? Right now the Fury X go for ~300-340 used


If rumors of the RX 580 being Vega 11 and a replacement for the RX 480 are true, then Furys may become $150-or-under cards on the used market in no time.

AMD is trying to remake themselves and that would be a really good way of doing it (1070 performance for $250).


----------



## microchidism

Quote:


> Originally Posted by *looncraz*
> 
> If rumors of the RX 580 being Vega 11 and a replacement for RX 480 are true, then Fury* may become $150 cards or under on the used market in no time, maybe less.
> 
> AMD is trying to remake themselves and that would be a really good way of doing it (1070 performance for $250).


If that's the case it's time to put the Fury X on CL or something and keep the 1070

Thanks!


----------



## Thoth420

Quote:


> Originally Posted by *microchidism*
> 
> If that's the case it's time to put the Fury X on CL or something and keep the 1070
> 
> Thanks!


Thinking something along the same lines as well.


----------



## microchidism

Quote:


> Originally Posted by *Thoth420*
> 
> Thinking something along the same lines as well.


Overall it is a hell of a card though; looking at some of the games on my list to play, such as Doom, Mankind Divided, and Quantum Break, its performance seems better than the 1070's, sometimes creeping up on the 1080.


----------



## budgetgamer120

Quote:


> Originally Posted by *microchidism*
> 
> I ask as I have a Fury X and 1070.......I need too figure out which one i'm going to sell now and which to keep and sell when Vega comes out


Sell the one that is worth more money now, and keep the cheaper one.


----------



## ht_addict

Quote:


> Originally Posted by *gupsterg*
> 
> At first I didn't like the AIO, just due to seeming like a mare to handle card for installation compared with air cooled. Now I don't think I'd want a GPU without AIO
> 
> 
> 
> 
> 
> 
> 
> . Handles overclocking well, stock clocks with undervolt and 89 FPS FRTC finding it extremely quiet. Recently had an urge to go full custom WC on my rig but after pricing up setup and what I may gain on temps it seemed very uneconomical, especially as I may not gain anymore performance.
> 
> FreeSync has been another great bit of tech with this GPU, that I'm enjoying using and no premium for it compared to G-Sync.
> 
> Thoroughly enjoying the 1440P gaming experience on Fury X with FreeSync. Frankly best ever GPU as total package in my PC gaming experience since 1996.


Custom WC looks so cool though. Performance-wise I saw no increase in GPU/HBM clocks; temperature-wise, though, I saw a huge difference. It idles in the mid-to-upper 20s°C, and games at 1150/550 in the mid 30s and lower. Where you also see, or should I say hear, the difference is fan noise. There is none.


----------



## xkm1948

Turned on "VEGA" mode of my FuryX.


----------



## Thoth420

Quote:


> Originally Posted by *microchidism*
> 
> overall it is a hell of a card though, looking at some of the games on my list to play such Doom, Mankind Divided and Quantum Break the performance seems better than the 1070, sometimes creeping on the 1080.


Don't get me wrong I loved the Fury X but personally I don't have much time to game lately so I am more interested in playing with an 1800x with just whatever GPU for now until we see the next gen of cards.


----------



## gupsterg

Quote:


> Originally Posted by *microchidism*
> 
> Also, out of curiosity what kind of hit you guys think the Fury X will take when Vega comes out? Right now the Fury X go for ~300-340 used


I doubt there will be a huge shift in selling prices. I would be very surprised if AMD did what they have done with Ryzen pricing vs. the Intel equivalent; I reckon Vega will be priced according to how it stacks up against nVidia's offerings. WhyCry from VideoCardz said this in a recent post on his site:-
Quote:


> What we do know is that Vega will not be cheap (at least not as cheap as Polaris series). We were told it should cost between 599 to 699 USD, but yet again, no final decisions were made and I'm certain that upcoming launch of GTX 1080 Ti will be a huge factor for this decision.


Quote:


> Originally Posted by *ht_addict*
> 
> Custom WC looks so cool though. Performance wise i saw no increase in GPU/HBM clocks. Temperature wise though i saw a huge difference. Idles in mid to upper 20's, gaming at 1150/550 in mid 30's and lower. Where you also see or should i say hear the difference is in fan noise. There is none.


I agree WC look swish







, but as I'd gain very little in performance I can't justify it, though I appreciate why those who already have WC loops would get a block.

My Fury X idles at ~27°C and loads at 45-50°C on my undervolt ROM. The OC ROM idles the same; load is 50-55°C. With a room ambient of 21°C to 24°C, it has seemed very unobtrusive and quiet. As the AIO comes pretty much free of charge with the GPU







, pretty good "bang for $", especially considering I paid ~£250 for card in March 16. Do also like the look of card shroud/panels as well.


----------



## Alastair

Guys. So I just ran Tomb Raider for the first time since I got my Furys over a year ago. Holy heck, drivers have come a long way. Same settings as way back when: Ultimate, 1440p


Spoiler: Old result 2 Fury's @ 1000/500









Spoiler: New Result 17.2.1 @ 1000/500









Spoiler: Now at 1150/550







Now I wanna see how this machine runs ROTTR. But every time I try running the bench it crashes out. Sometimes I get a low-memory error in Windows, yet I'm only using around 11-12 GB of memory. I am high up there on VRAM though, around 3930 MB at 2560x1440 on Very High. But surely that won't cause the game to crash out?

Edit: also, with 17.2.1, ATIKMDAG Patcher and CRU.exe no longer work with the drivers. It doesn't seem to detect my CRU settings. And it irritates me that Crimson only loads my custom resolution when I tell it to; it doesn't do it automatically.


----------



## battleaxe

Quote:


> Originally Posted by *microchidism*
> 
> I ask as I have a Fury X and 1070.......I need too figure out which one i'm going to sell now and which to keep and sell when Vega comes out


Sell the 1070. The Fury will only improve, while the 1070 will get progressively worse over time. Plus, the 1070 is likely worth more because Nvidia fans will buy them up regardless.


----------



## domrockt

Quote:


> Originally Posted by *battleaxe*
> 
> Sell the 1070. The fury will only improve while the 1070 will get progressively worse over time. Plus the 1070 is likely worth more because Nvidia fans will buy them up regardless.


Doooont!!! Just sell the Fury; it will fall rapidly in price... The 1070 will sell higher when Vega arrives!!!


----------



## Minotaurtoo

Quote:


> Originally Posted by *domrockt*
> 
> Doooont;!!!; just sell the fury it will fall rapidly in price ... The 1070 will sell higher when vega arrives!!!


I was thinking that... since he's planning on ditching it anyway when Vega arrives, future performance gains of the Fury are an invalid argument.


----------



## domrockt

Exactly. I bought my Fury Tri-X for 250€ and plan (and hope) to sell it for 200€ together with my EKWB water block.


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Guys, so I just ran Tomb Raider for the first time since I got my Furys over a year ago. Holy heck, drivers have come a long way. Same settings as way back when: Ultimate, 1440p.
> 
> 
> Spoiler: Old result 2 Fury's @ 1000/500
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: New Result 17.2.1 @ 1000/500
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Now at 1150/550
> 
> 
> 
> 
> 
> 
> 
> Now I wanna see how this machine runs ROTR. But every time I try running the bench it crashes out. Sometimes I get a low memory error in Windows, yet I'm only using around 11GB-12GB of memory. I am high up there on my VRAM though, around 3930MB at 2560x1440 at Very High. But surely that won't cause the game to crash out?
> 
> Edit: also, with 17.2.1, do the ATIKMDAG patcher and CRU.exe no longer work? The driver doesn't seem to detect my CRU settings, and it's irritating that Crimson only loads my custom resolution when I tell it to; it doesn't do it automatically.


I had a great experience recently with Tomb Raider 2013. Played at 5760x1080 with everything maxed out, TressFX off. It's a Gaming Evolved game as far as I know, so it worked great with my Eyefinity without having to use Flawless Widescreen etc. Capped 60fps and used power efficiency mode in drivers: no stutter. Top card maxed out at 45C. Just awesome.

I played it even though it's "old" because I've had it for years but never finished it.

I also have ROTTR and it seems to run great with the same settings. No stutter. Cards run relatively cool as well.

Have you tried increasing the size of the Windows page file? I had out of memory problems with Dragon Age Inquisition until I increased the page file size to 16GB.
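Alastair's symptom — out-of-memory errors with physical RAM to spare — matches how the Windows commit limit works: the limit is roughly physical RAM plus page file, and allocations fail once total committed memory hits that ceiling even if RAM itself isn't full. A toy sketch of the arithmetic (the 16GB RAM figure is an assumption, and the ~22GB commit is the figure Alastair reports later in the thread):

```python
# Windows fails allocations once total committed memory exceeds the
# commit limit (roughly RAM + page file), even with physical RAM free.

def commit_limit_gb(ram_gb: int, pagefile_gb: int) -> int:
    return ram_gb + pagefile_gb

# Small page file per the Samsung SSD recommendation (200MB-1GB):
small = commit_limit_gb(16, 1)    # 17 GB ceiling
# After raising the page file to 16GB, as suggested above:
large = commit_limit_gb(16, 16)   # 32 GB ceiling

rottr_commit = 22  # ROTTR reportedly committing ~22GB in total
print(rottr_commit > small, rottr_commit > large)  # True False
```

So a ~22GB commit blows past the small-page-file ceiling but fits comfortably under the larger one, which is consistent with the crash disappearing after the change.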


----------



## Alastair

I run my page file really low. I use the Samsung recommended settings of 200MB-1GB for my SSD. Is it really necessary to have a larger page file for ROTR?


----------



## gupsterg

@Alastair








Are those not the same screenshots in "New Result 17.2.1 @ 1000/500" and "Now at 1150/550"?


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> @Alastair
> 
> 
> 
> 
> 
> 
> 
> 
> Is it not the same screenshots in New Result 17.2.1 @ 1000/500 and Now at 1150/550?


Yes, thank you! I did not pick that up. I fixed it. The old result was 123.7 fps at stock, and I think that was on 15.8 or something along those lines.

The new result is 144-ish, and the OC result was 162.2.

In other news, I seem to actually get better performance in ROTR with DX11 vs DX12. Am I doing something wrong? Also, damn, the game is a resource hog! 22GB of memory used if you include my now 16GB pagefile.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Yes thank you! Did not pick that up. I fixed it. Old result was 123.7 fps at stock and I think it was 15.8 or something along those lines.
> 
> New at 144 ish. and OC result was 162.2.
> 
> In other news, I seem to actually get better performance in ROTR with DX11 vs DX12. Am I doing something wrong? Also, damn, the game is a resource hog! 22GB of memory used if you include my now 16GB pagefile.


16GB pagefile to make a game run right means the dev team should be dragged out in the street and shot one at a time.


----------



## neurotix

Quote:


> Originally Posted by *Alastair*
> 
> Yes thank you! Did not pick that up. I fixed it. Old result was 123.7 fps at stock and I think it was 15.8 or something along those lines.
> 
> New at 144 ish. and OC result was 162.2.
> 
> In other news, I seem to actually get better performance in ROTR with DX11 vs DX12. Am I doing something wrong? Also, damn, the game is a resource hog! 22GB of memory used if you include my now 16GB pagefile.


I think you're probably just running out of memory, yeah.

There's probably other games that also need a large page file to function correctly with Fury's.

It seems my advice was correct; glad you got it fixed (seemingly).

When monitoring my cards in DA:I it says the memory usage is generally 7-8GB. Obviously overrunning the limited HBM of the Fury. Regardless, probably thanks to all the driver tricks from AMD, I don't get any stutter or framerate dips that I can perceive. The reason for the high memory usage is probably my Eyefinity resolution. Although, I've heard that the game is a memory hog anyway, even at 1080p.

I haven't checked the memory usage of ROTTR because I haven't played the game much, but as you say, it is probably also high. It would probably be even higher for me because of resolution. I think you said you run 1440p.


----------



## nadja92

Here is my Sapphire Fury X. I've yet to finish my build, but will I have any problems with current or upcoming games if I skip Vega? My CPU is a 5820K as well.

Or could I hope I can find another nice cheap one to xfire to help in the meantime?


----------



## domrockt

2x Furys in CrossFire can do a lot for 1440p gaming for a good while.

Even one single Fury will do its job at 1440p for a good while.

So my answer is yes, it is possible to skip Vega. The processing power is there even with a Haswell i5.

The only problem I face is at 3440x1440 on Nightmare settings in Doom 4, where I get about 40ish fps... but when I set shadows to Ultra instead of Nightmare I reach 70-90ish fps.

If you own a good 1440p monitor there is no need for anti-aliasing, which is what causes the memory problems.

Furys are amazing chips.


----------



## TwirlyWhirly555

My newly homed Pro Duo :D. CPU is a 5820K.


----------



## domrockt

niiiiiiceeeee







!!!! I have the same case








I use a 200mm radiator in the front for my CPU and Fury Tri-X.


----------



## diggiddi

Yeah, I want to join the club; seriously thinking about picking up a couple of Fury Nitros.


----------



## Ehsteve

So I'm decommissioning the custom loop on my system due to maintenance concerns (and a mobo/CPU upgrade) and am reinstating the stock coolers on both of the Fury Xs. I was thinking it would be a piece of cake until I noticed that one of the thermal pads had somehow peeled off the stock cooler at some point since the waterblock was installed. Does anyone know the thickness of the thermal padding on the stock cooler? I still have some 0.5mm and 1mm thermal padding left over from the EK blocks and was hoping that by some stretch of luck this might just work.


----------



## TwirlyWhirly555

Quote:


> Originally Posted by *domrockt*
> 
> 
> 
> 
> 
> 
> 
> 
> niiiiiiceeeee
> 
> 
> 
> 
> 
> 
> 
> !!!! I have the same case
> 
> 
> 
> 
> 
> 
> 
> 
> I use an 200mm radiator in front for my CPU and fury tri-X.


Thanks :D.

Nice, it's a good case, though I turned the front 200mm fan around to blow out the front. The GPU fan is pushing air towards the front, with two 80mm fans at the back and one 120mm on the side pushing air into the case.


----------



## domrockt

Quote:


> Originally Posted by *TwirlyWhirly555*
> 
> Thanks :D.
> 
> Nice, it's a good case, though I turned the front 200mm fan around to blow out the front. The GPU fan is pushing air towards the front, with two 80mm fans at the back and one 120mm on the side pushing air into the case.


(Photo attachments: IMG_20170227_210105.jpg, IMG_20170227_210054.jpg)


----------



## diggiddi

Quote:


> Originally Posted by *looncraz*
> 
> If rumors of the RX 580 being Vega 11 and a replacement for RX 480 are true, then Fury* may become $150 cards or under on the used market in no time, maybe less.
> 
> AMD is trying to remake themselves and that would be a really good way of doing it (1070 performance for $250).


So not good idea to pick up a Nitro?


----------



## looncraz

Quote:


> Originally Posted by *diggiddi*
> 
> So not good idea to pick up a Nitro?


Really hard to tell. You will lose value, but that's always the case. How much is the question.

If the rumors of the RX 580 being Vega 11 are true - and at RX 480 prices with GTX 1070 performance... then a Fury will become a very cheap card (and RX 480 even more so).

It seems AMD wants to steal market and mind-share, so undercutting is about the only option they have. With that, they don't need to be the fastest to be the best choice.


----------



## Ceadderman

580 should beat 1080. Won't beat Titan X but should give 1080 fits.









~Ceadder


----------



## diggiddi

Quote:


> Originally Posted by *looncraz*
> 
> Really hard to tell. You will lose value, but that's always the case. How much is the question.
> 
> If the rumors of the RX 580 being Vega 11 are true - and at RX 480 prices with GTX 1070 performance... then a Fury will become a very cheap card (and RX 480 even more so).
> 
> It seems AMD wants to steal market and mind-share, so undercutting is about the only option they have. With that, they don't need to be the fastest to be the best choice.


Too Late







Twins on the way! Oh, BTW, how is power consumption compared to my 290X Lightnings?
I need a new PSU soon; would I be able to get a 50MHz+ OC?


----------



## Ceadderman

6-pin = 75W

8-pin = 150W

So it should remain equal to or better than what you're running now. The 290X Lightning IIRC is a 300W card.









~Ceadder
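To put numbers on the connector arithmetic above: per the PCIe spec, the slot supplies up to 75W, a 6-pin up to 75W, and an 8-pin up to 150W. A quick sketch (the Lightning's connector count is from memory, and these are spec ceilings, not actual draw):

```python
# PCIe power-delivery ceilings (spec limits, not measured draw).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget_w(six_pins: int, eight_pins: int) -> int:
    """Maximum power a card may draw within the PCIe spec."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget_w(0, 2))  # Fury X (2x 8-pin): 375
print(board_budget_w(1, 2))  # 290X Lightning (2x 8-pin + 1x 6-pin): 450
```

By connector budget the Fury X actually has less headroom than the Lightning, but its typical board power is lower, which is consistent with "equal or better" on the same PSU.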


----------



## Alastair

Guys, I am having a problem with one of my Furys. On Sunday while gaming I noticed through Afterburner that my voltages were running a bit higher than usual, around 1.25V-ish, yet I was not overclocking anything; I was at stock 1000/500. I went into Wattman to see if anything was conflicting with my MSI Afterburner settings, only to see that EVERYTHING in Wattman was greyed out. I decided that maybe the BIOS I had put on it (015.049.000.004.000000, for the XFX liquid-cooled Fury Pro) was wonky. I started to reflash it to the stock ROM (015.049.000.003.000000) when it BSOD'ed during the flash. I figured no worries, I'll just switch to the backup BIOS. I did that, easy enough, then in Windows switched back to the now-bricked BIOS and reflashed it. Simple AF, right?

Well, now I have a strange issue. With the clean-installed drivers (17.2.1), I still have NO OPTIONS in Wattman except for DPM7, which is set by default in Wattman to 500MHz @ 0.975V. My cards are defaulting to 500/250 core/HBM; they're either in DPM1 or DPM7, with no in-between. And obviously, due to the reduced clocks, everything in Windows and my games is choppy. What do you suppose the issue is?

I have tried reflashing the BIOS.
Clean installed the drivers.
Don't know what's up.

I am worried my card is damaged, and am figuring out whether I should go through the effort of an RMA, or just scrap it, claim insurance, and downsize to some 480s.









EDIT: Ok both cards are doing it, but I only bricked one card's BIOS. So both cards could not have been damaged by a single bricked BIOS isolated to one card. So it must be a software issue! Time to roll back to 16.12.1!

EDIT2: 16.12.1 did not fix it. But toggling crossfire off and on again temporarily fixes the issue until the next reboot. What the heck?

Here is a screenshot of Wattman, and the cards do get set to those clocks of 500/250 @ 0.925V.


----------



## Bojamijams

I'd do a Reset PC option and start from scratch.. you have some corruption somewhere


----------



## kondziowy

No, there is no damage to the card. I get this greyed-out screen sometimes with my Asus Fury unlocked in the BIOS to 3840 cores on the 17.2.1 driver; the Reset option in the right corner always fixes it for me.
MSI Afterburner doesn't work, so don't use it. The only setting that works in Afterburner for Fury is fan control under the "User Define" button, where you can set a custom fan curve.
Never click "Apply" in the Afterburner main window, as it conflicts with Wattman.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Guys I am having a problem with one of my Fury's. On Sunday while gaming I noticed through afterburner my voltages were running a bit higher than usual. Around 1.25ish yet I was not overclocking anything I was stock 1000/500. I went into wattman to see if anything was conflicting with my MSI afterburner settings only to see that EVERYTHING in wattman was greyed out. I decided that maybe the BIOS I had put on it (015.049.000.004.000000 for the xfx liquid cooled Fury Pro) was wonky. I started to reflash it to the stock rom (015.049.000.003.000000) when it BSOD'ed during the flash. I figured no worries. I will just switch to the back up BIOS. I did that, easy enough. In windows switched to the now bricked BIOS and flashed it. Simple AF right?
> 
> Well now I have a strange issue. With the clean installed drivers(17.2.1), I still have NO OPTIONS in wattman. Except for DPM7. Which is default set in wattman to 500MHz @ 0.975V? My cards defaulting to 500/250 core/hbm. It is either in DPM1 or DPM7. There is no inbetween. And obviously due to the reduced clocks everything in windows and my games are choppy. What do you suppose the issue is?
> 
> I have tried reflashing the BIOS.
> Clean installed the drivers.
> Don't know whats up.
> 
> I am worried my card is damaged. And figuring should I go through the effort of RMA or should I just scrap it, claim insurance and downsize to some 480's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Ok both cards are doing it, but I only bricked one card's BIOS. So both cards could not have been damaged by a single bricked BIOS isolated to one card. So it must be a software issue! Time to roll back to 16.12.1!
> 
> EDIT2: 16.12.1 did not fix it. But toggling crossfire off and on again temporarily fixes the issue until the next reboot. What the heck?
> 
> Here is a screenshot of wattman. And the cards get set to those clocks. of 500/250 @ .925V


Sounds like a driver problem. DDU or Bradley's guide is the best first step; weed out software/driver issues before worrying about the hardware. If you already did that, then yes, you should consider an RMA.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Guys I am having a problem with one of my Fury's. On Sunday while gaming I noticed through afterburner my voltages were running a bit higher than usual. Around 1.25ish yet I was not overclocking anything I was stock 1000/500. I went into wattman to see if anything was conflicting with my MSI afterburner settings only to see that EVERYTHING in wattman was greyed out. I decided that maybe the BIOS I had put on it (015.049.000.004.000000 for the xfx liquid cooled Fury Pro) was wonky. I started to reflash it to the stock rom (015.049.000.003.000000) when it BSOD'ed during the flash. I figured no worries. I will just switch to the back up BIOS. I did that, easy enough. In windows switched to the now bricked BIOS and flashed it. Simple AF right?
> 
> Well now I have a strange issue. With the clean installed drivers(17.2.1), I still have NO OPTIONS in wattman. Except for DPM7. Which is default set in wattman to 500MHz @ 0.975V? My cards defaulting to 500/250 core/hbm. It is either in DPM1 or DPM7. There is no inbetween. And obviously due to the reduced clocks everything in windows and my games are choppy. What do you suppose the issue is?
> 
> I have tried reflashing the BIOS.
> Clean installed the drivers.
> Don't know whats up.
> 
> I am worried my card is damaged. And figuring should I go through the effort of RMA or should I just scrap it, claim insurance and downsize to some 480's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Ok both cards are doing it, but I only bricked one card's BIOS. So both cards could not have been damaged by a single bricked BIOS isolated to one card. So it must be a software issue! Time to roll back to 16.12.1!
> 
> EDIT2: 16.12.1 did not fix it. But toggling crossfire off and on again temporarily fixes the issue until the next reboot. What the heck?
> 
> Here is a screenshot of wattman. And the cards get set to those clocks. of 500/250 @ .925V
> 
> 
> 
> 
> Sounds like a problem DDU or Bradley's guide is the best first step. Weed out software/driver issue before worrying about the hardware. If you already did that then yes you should consider RMA.

I already DDU'd and clean installed. Didn't help.

The thing is, everything seems fine until the driver loads up; the card behaves normally until then. I don't know what's up. Maybe a system restore will help. I hope.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> I already DDU'd and clean installed. Didn't help.
> 
> The thing is, everything seems fine until the driver loads up; the card behaves normally until then. I don't know what's up. Maybe a system restore will help. I hope.


Damn, that bites... I guess try a system restore and cross those fingers. I have never had a half-working GPU; they either work or black screen. But I tend to avoid doing some of the more advanced stuff, like a lot of you all do (BIOS swaps etc.). I have never even messed with the Fury X at all yet, not even a light OC.


----------



## Alastair

Looks like I fixed it. Turns out that even though MSI Afterburner was told NOT to, it was still loading my horribly unstable benching clocks, and that was causing Wattman to go haywire!


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Looks like I fixed it. Turns out that even though MSI Afterburner was told NOT to, it was still loading my horribly unstable benching clocks, and that was causing Wattman to go haywire!


Ah, nice, glad to see it was just a software glitch. I always completely wipe that stuff (profiles included) before I even swap drivers, because I have had very odd stuff happen if I don't. Perhaps the BIOS flash + AB caused the problem in a similar fashion.


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Looks like I fixed it. Turns out even though MSI afterburner was told NOT To it was still loading my horribly unstable benching clocks. And was causing watt man to go haywire!
> 
> 
> 
> Ah nice glad to see it was just a software glitch. I always completely wipe that stuff(profiles included) before I swap drivers even because I have had very odd stuff happen if I don't. Perhaps the BIOS flash + AB caused the problem in a similar fashion.

Probably. Now I am just redoing my overclock. For some reason Tomb Raider 2013 falls on its face on settings that can run Heaven bench 24/7. :/


----------



## u3a6

So it seems that the new top end AMD card will be called RX Vega!


----------



## 99belle99

Quote:


> Originally Posted by *u3a6*
> 
> So it seems that the new top end AMD card will be called RX Vega!


And then the Next card after that will be RX Vega X.

Possibly anyway.


----------



## Alastair

Guys, why do the voltages vary so much under load on these cards? Back in the Cypress/Cayman days, when you set a voltage on the graphics card it would maintain that voltage, or at least it told me it was maintaining that voltage. When I told my 6850s to eat 1.35V, they would hold that voltage. With my Furys I am basically trying to get them to run 1250mV, but the voltage is constantly changing under load, sometimes so much that stability (during OC) is lost. Is there any way to keep the voltage more constant under load?


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Guys, why do the voltages vary so much under load on these cards? Back in the Cypress/Cayman days, when you set a voltage on the graphics card it would maintain that voltage, or at least it told me it was maintaining that voltage. When I told my 6850s to eat 1.35V, they would hold that voltage. With my Furys I am basically trying to get them to run 1250mV, but the voltage is constantly changing under load, sometimes so much that stability (during OC) is lost. Is there any way to keep the voltage more constant under load?


It looks like two things are fighting (terribly worded, I know). I only have one GPU, but GPU1 looks like it tried to recover the voltage after the drop, nearly did, then went back down. Mine does the same thing, down, back up, then down again, and at stock clocks too. Love the Fury X, but it does some weird stuff every now and again for sure.


----------



## Ceadderman

IIRC, when not in use, GPUs go into power saver and revert back to stock settings until you open an app that would benefit from the OC. That's how it's been since the 6xxx or 7xxx series, so far as I recall.









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> iirc, when not in use GPU go power saver and revert back to stock settings til you open an app that would benefit from the OC. That's how it's been since 6*** or 7*** series so far as I recall.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yes, but it's the vdroop under load that's killing me. Sometimes the voltage dips as low as 1.218V for what should be 1.25V, and this causes my overclocks to fall on their face. And sometimes the voltage can boost right up to 1.3V, as per the picture I posted.

Thoth420, at the end of the Afterburner graph I posted, the voltages are varying like crazy because I alt-tabbed out of Tomb Raider and the card started rapidly cycling through its DPM states.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> Thoth420, at the end of the Afterburner graph I posted, the voltages are varying like crazy because I alt-tabbed out of Tomb Raider and the card started rapidly cycling through its DPM states.


Ah gotcha


----------



## Ceadderman

Then it's possible that Afterburner is the problem?

Question: not sure if it's been asked, but did you go from Nvidia to AMD?

If yes, did you scrub those drivers before installing the AMD drivers?

Cause if I had to guess, there seems to be a driver conflict going on.









~Ceadder


----------



## Alastair

Quote:


> Originally Posted by *Ceadderman*
> 
> Then it's possible that Afterburner is the problem?
> 
> Question: Not sure if it's been asked but did you go from nVidia to AMD?
> 
> If yes did you scrub those drivers before installing AMD drivers?
> 
> Cause if I hadda guess there seems to be driver conflict going on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I haven't owned an nVidia card since Fermi. I've been sitting pretty with team red since Evergreen.


----------



## Thoth420

Quote:


> Originally Posted by *Alastair*
> 
> I haven't owned an nVidia card since Fermi. I've been sitting pretty with team red since Evergreen.


Same here; I've never had an Nvidia GPU in this build. Fury X since its birth.
One thing I noticed is that my GPU tach never goes to a single green light anymore, but it did when I first owned it. Now it just sits at a single red when idle. The card is very strange...


----------



## Alastair

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I haven't owned an nVidia card since Fermi. I've been sitting pretty with team red since Evergreen.
> 
> 
> 
> Same here for never having Nvidia GPU in this hardware. Fury X since it's birth.
> One thing I noticed is my GPU tach never goes single green light anymore but did when I first owned it. Now it just sits at single red when it is in idle. Card is very strange...

ULPS has been disabled somehow? My cards do it all the time. I have to disable it in Afterburner when I want to overclock.


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> ULPS has been disabled somehow? My cards do it all the time. I have to disable it in Afterburner when I want to overclock.


Out of curiosity what are your temperatures like under load?


----------



## Alastair

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> ULPS has been disabled somehow? My cards do it all the time. I have to disable it in Afterburner when I want to overclock.
> 
> 
> 
> Out of curiosity what are your temperatures like under load?

I'm actually a bit puzzled about that right now, because it seems to be what's holding me back. Tomb Raider seems to be the most graphically stressful thing for my Furys so far.

I'm pushing 50/45C on the cards respectively at 1125/550, and it seems constant at that temp right up to 1150/550. That's at 100% fan speed on my custom loop (Jetflo 120s and ML140s) in a 27C ambient, so I'm looking at around 20C-ish delta temps. I'm not sure if I should be getting better temps on card one; wondering if I screwed up that mount. I'm also trying to figure out whether it would be worth buying a higher-performing paste than my MX-4 (Kryonaut or Cooler Master MasterGel Maker Nano), or whether I should just drop some cash on a portable air con for my room to get the ambient down.
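The delta-T figure quoted above is just load temperature minus room ambient, and on a loop whose delta is fixed by radiator capacity, GPU temperature tracks ambient almost one-for-one. A quick sketch (the 20C target ambient is a hypothetical air-con scenario):

```python
# Loop delta-T = component temp minus room ambient; for a given
# radiator/fan setup the delta stays roughly constant, so GPU temp
# tracks ambient almost one-for-one.
def delta_t(gpu_c: int, ambient_c: int) -> int:
    return gpu_c - ambient_c

d1 = delta_t(50, 27)  # card one: 23C delta
d2 = delta_t(45, 27)  # card two: 18C delta
# Dropping ambient from 27C to 20C would project to roughly:
print(20 + d1, 20 + d2)  # 43 38
```

That projection is why lowering room ambient can matter as much as a better paste: the paste only shaves a degree or two off the delta itself.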


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> I'm actually a bit puzzled about that right now. Because it seems to be what's holding me back. Tomb Raider seems to be the most graphically stressful thing for my Fury's so far.
> 
> Pushing 50/45C on the cards respectively at 1125/550 but it seems constant at that temp right to 1150/550. That's 100% fan speed on my custom loop (Jetflo 120's and ML140's) in a 27C ambient so I'm looking at around 20c ish Delta temps. I'm not sure if I should be getting better temps on card one. Wondering if I screwed up that mount. And trying to figure out if it would be worth it to buy a higher performing paste over my MX-4. (kryonaut or Master maker Nano gel) or if I should just pop some cash on a portable air con to set up in my room to try get the ambient down.


I might have an idea then. Depending on the individual GPU silicon, there is a toggle (a tipping point with Fiji) where the voltage leakage increases due to temperature, somewhere in the neighborhood of 20-30% higher current draw. One of our board posters noticed this a while ago; his tipping point was 48C. When I checked my GPU, the tipping point was 50C.
When reached, this tipping point can make a stable clock unstable due to voltage droop/draw.

Can you increase your cooler fan speed?
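The tipping-point behavior described above can be sketched as a step in current draw at a threshold temperature. The 48C threshold and flat 25% step below are illustrative stand-ins for the reported 20-30% range, not measurements:

```python
# Toy model: below the tipping point, draw is nominal; above it,
# temperature-driven leakage adds roughly 20-30% (here a flat 25%).
def estimated_draw_w(base_w: float, temp_c: float,
                     tip_c: float = 48.0, leak: float = 1.25) -> float:
    return base_w if temp_c < tip_c else base_w * leak

print(estimated_draw_w(275, 45))  # 275 -- within budget, clock stays stable
print(estimated_draw_w(275, 50))  # 343.75 -- extra draw drags voltage down
```

A step like this would explain why the same clock/voltage combo is stable at 45C but crashes a few degrees higher: the sudden extra current pulls the supplied voltage below what the overclock needs.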


----------



## Alastair

Quote:


> Originally Posted by *bluezone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I'm actually a bit puzzled about that right now. Because it seems to be what's holding me back. Tomb Raider seems to be the most graphically stressful thing for my Fury's so far.
> 
> Pushing 50/45C on the cards respectively at 1125/550 but it seems constant at that temp right to 1150/550. That's 100% fan speed on my custom loop (Jetflo 120's and ML140's) in a 27C ambient so I'm looking at around 20c ish Delta temps. I'm not sure if I should be getting better temps on card one. Wondering if I screwed up that mount. And trying to figure out if it would be worth it to buy a higher performing paste over my MX-4. (kryonaut or Master maker Nano gel) or if I should just pop some cash on a portable air con to set up in my room to try get the ambient down.
> 
> 
> 
> I might have an idea then. Depending on the individual GPU silicon, there is a toggle (tipping point with Fiji) where the voltage leakage increases due to temperature. Somewhere in the neighborhood of 20-30% higher current draw. One our board posters noticed this awhile ago. His tipping point was 48c. When I checked my GPU, the tipping point was 50c.
> This tipping point when reached could make a stable clock, unstable, due to voltage droop/draw.
> 
> Can you increase your cooler fan speed?

Well, I would reckon the tipping point is in the 45C range then, because if I keep the temps below that I can keep 1150/550 happy, but above that even 1125/550 is a stretch.

I'm on a 640mm loop (EK PE360 and CE280) and those are the sort of temps I am hitting at 100% fan speed (around 2200rpm on all my fans) in a 27C ambient. If I could drop my ambient I'm sure I could do better.


----------



## domrockt

So it is true... I have a stable overclock when I open my windows and the room keeps a cool ambient in the 20C-ish range, but when I close my windows and play for a while, sometimes my GPU crashes.

So I hooked a second monitor to my rig and did a quick Futuremark stress test with my radiator fan off, watching my temps in real time... the stress test crashed just above 50C.

Just before the crash, the VDDC fluctuated rapidly by about 0.06V, between 1.30V and 1.36V.


----------



## kondziowy

Yes







I'm always crashing at 47C with 1140MHz +96mV. It's stable at lower temps.


----------



## domrockt

CPU 100%, GPU 100%: power draw from the wall.

My PSU is rated 650W, 80 Plus Silver...
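The wall reading and the PSU's DC rating aren't the same number: wall draw is DC load divided by efficiency. 80 Plus Silver sits roughly in the 85-88% range at typical loads (the exact figures below are assumptions), so a 650W unit at full DC load pulls well over 700W at the wall:

```python
# Wall power = DC load / PSU efficiency.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    return dc_load_w / efficiency

print(round(wall_draw_w(650, 0.88), 1))  # 738.6 -- full rated DC load
print(round(wall_draw_w(500, 0.85), 1))  # 588.2 -- a more typical gaming load
```

So a wall meter reading above 650W doesn't necessarily mean the PSU is overloaded; the comparison that matters is DC load against the 650W rating.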


----------



## Thoth420

Why is the card buckling at such low temps?


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> well I would reckon the tipping point is in the 45C range then. Because if I keep the temps below that I can keep 1150/550 happy. But above that 1125/550 is a stretch.
> 
> I'm on a 640mm loop (EK PE360 and CE280) and that's the sort of temps I am hitting at 100% fan speed (around 2200rpm on all my fans) in a 27C ambient. If I could drop my ambient I'm sure I could do better.


I forgot that you are in the summer season where you are. The snow is still flying for me.

When people have contacted me about Nanos, my mantra has been roughly "cold air, cold air, cold air."
For them:
1st: Get the coolest air possible entering the card. Ducting can help.
2nd: Have a fan blow on the back of the card. It's not very efficient at displacing heat, but it is a large surface area.
3rd: Increase the airflow through the heatsink with a second fan/better fan.

IIRC you already have a good case plus your water blocks. Could you add a secondary fan to pull air through the rad?


----------



## domrockt

Quote:


> Originally Posted by *Thoth420*
> 
> Why is the card buckling at such low temps?


It's a relatively high overclock; Furys don't go much higher, AFAIK.


----------



## Gdourado

If buying today, how much of an issue is the 4GB of VRAM on the Fury for 1080p gaming?
I read that 4GB will not run Doom on Nightmare.
Are there any more games that don't run on 4GB?

Cheers


----------



## domrockt

I don't know of any games other than Doom 4... but you can start Doom 4 with Nightmare settings! I recommend setting the shadows to the second-highest level; shadows on Nightmare give a massive fps drop.

With that set, Doom 4 runs smooth with Nightmare settings.

I play all my games at 3440x1440 and have no problems whatsoever.


----------



## Thoth420

Quote:


> Originally Posted by *domrockt*
> 
> It's a relatively high overclock; Furys don't go much higher, AFAIK.


Ah ok so just the wall. I haven't bothered trying to OC mine at all yet.


----------



## budgetgamer120

Quote:


> Originally Posted by *Gdourado*
> 
> If buying today, how much of an issue is the 4gb of vram on the fury?
> For 1080p gaming.
> I read that 4gb will not run doom in nightmare.
> Are there any more games that don't run on 4gb?
> 
> Cheers


I do not play Doom single-player. I play multiplayer at the highest settings and have no issues. I also play Crysis 3 multiplayer at the highest settings with no issues.


----------



## jdip

Does MSI Afterburner still work with the R9 Fury (Sapphire Nitro)? I am using Windows 7 Pro.

Ever since ReLive and Wattman were implemented, I haven't been able to use Afterburner properly. Using 16.9 (before ReLive and Wattman implementation), I had a very healthy -84mV undervolt at stock clocks using Afterburner.

When I updated the driver to 16.12 (ReLive and Wattman were implemented), I was unable to undervolt in Afterburner at all without the display driver crashing. However, changing clock speed in Afterburner still worked.

Today I installed 17.2.1 and any undervolt still results in crashing. But now the card doesn't respond to clock changes in Afterburner at all. I tried removing the display driver using DDU and reinstalling only the driver so there would be no Wattman. But my results in Afterburner were the same.

I want to undervolt to save power and generate less heat. Is it still possible to do this with Afterburner with the current drivers? I don't like using Wattman. I feel like just rolling back to 16.9 when Afterburner and my undervolt worked flawlessly.
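As a rough sanity check on why undervolting pays off (numbers here are illustrative, not measured): dynamic power scales roughly with voltage squared at a fixed clock, so the -84mV mentioned above is worth a double-digit percentage of dynamic power if stock voltage is somewhere around 1.2V:

```python
# Rough estimate of dynamic-power savings from an undervolt.
# Dynamic power scales roughly with V^2 at a fixed clock; the
# 1.2 V stock voltage used below is an assumption, not a spec.
def dynamic_power_ratio(v_stock: float, undervolt_mv: float) -> float:
    v_new = v_stock - undervolt_mv / 1000.0
    return (v_new / v_stock) ** 2

ratio = dynamic_power_ratio(1.2, 84)
print(f"relative dynamic power: {ratio:.3f}")  # ~0.865, i.e. roughly 13% less
```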


----------



## Minotaurtoo

personally I would ditch afterburner and go with bios modding for a more permanent fix that will live through driver updates. You may have to do a clean driver install though; because you tried with software first, it may override the bios changes if you don't... may not... but since modding my bios I've updated drivers twice with no changes to clocks/volts

Directions and tools can be found here: http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo


----------



## Skyl3r

http://www.3dmark.com/fs/11900321

Decently sure I've reached the max my GPUs will do.
1200 Core Clock
640 Mem Clock

I'm tempted to say that a CPU bottleneck is the cause of my apparent stability. Guess we'll have to wait on Ryzen to find out


----------



## Minotaurtoo

Quote:


> Originally Posted by *Skyl3r*
> 
> http://www.3dmark.com/fs/11900321
> 
> Decently sure I've reached the max my GPU's will do.
> 1200 Core Clock
> 640 Mem Clock
> 
> I'm tempted to say that a CPU bottleneck is the cause of my apparent stability. Guess we'll have to wait on Ryzen to find out


with two fury x's it's kinda hard to not be cpu bound with your cpu at 1080p, you should try using time spy on them or fire strike ultra... that'll give your cards a workout.


----------



## Skyl3r

Quote:


> Originally Posted by *Minotaurtoo*
> 
> with two fury x's it's kinda hard to not be cpu bound with your cpu at 1080p, you should try using time spy on them or fire strike ultra... that'll give your cards a workout.


I can't get Time Spy to work anymore. I've tried everything in my arsenal to make it happen and it refuses.
"Oops, an error has occurred"
A nice and useful error, thanks FutureMark









I did try running FireStrike at 4k to see if I could create a GPU bottleneck instead, but I'm still seeing users with better CPUs and the same GPU configuration outscoring me substantially on graphics tests. I'll try FireStrike Ultra though! I have an 1800x sitting right next to me as well, so I'm extremely excited to see what my Fury X's can do with that thing feeding them.


----------



## Minotaurtoo

usually that "oops an error occurred" is either an unstable oc or something like Raptr interfering with it.... anyway good luck... btw I assume you are paying no attention to the final score but only the graphics score when comparing between cpus


----------



## Skyl3r

Quote:


> Originally Posted by *Minotaurtoo*
> 
> usually that "oops an error occurred" is either unstable oc or something like raptor is interfering with it.... anyway good luck... btw I assume you are paying no attention to the final score but only the graphics score when comparing between cpu's


Yeah, I am comparing graphics scores. It would seem weird to me if it's an OC issue when Firestrike ultra can run fine.

Anyhow, increased my OC and tested on Firestrike Ultra:
http://www.3dmark.com/3dm/18407944?
Core clock 1,210 MHz
Memory bus clock 640 MHz

Do I have golden cards?


----------



## steadly2004

Quote:


> Originally Posted by *Skyl3r*
> 
> Yeah, I am comparing graphics scores. It would seem weird to me if it's an OC issue when Firestrike ultra can run fine.
> 
> Anyhow, increased my OC and tested on Firestrike Ultra:
> http://www.3dmark.com/3dm/18407944?
> Core clock 1,210 MHz
> Memory bus clock 640 MHz
> 
> Do I have golden cards?


I know I can pass 3dmark but not game (BF1) at the same clock


----------



## Minotaurtoo

Quote:


> Originally Posted by *Skyl3r*
> 
> Yeah, I am comparing graphics scores. It would seem weird to me if it's an OC issue when Firestrike ultra can run fine.
> 
> Anyhow, increased my OC and tested on Firestrike Ultra:
> http://www.3dmark.com/3dm/18407944?
> Core clock 1,210 MHz
> Memory bus clock 640 MHz
> 
> Do I have golden cards?


I know I can't hit those memory clocks.... and my card is a poor clocker so I can't easily hit those core clocks either... maybe not golden, but still good... and btw... at 4k your system seems well balanced with gpu vs cpu power.... Something will always be the "bottleneck" in a system, but for now yours seems balanced, at least at 4k.... now 1080p is different lol...


----------



## Skyl3r

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I know I can't hit those memory clocks.... and my card is a poor clocker so I can't easily hit those clocks either... maybe not golden, but still good... and btw... at 4k your system seems well balanced with gpu vs cpu power.... Something will always be the "bottle neck" in a system, but for now yours seems balanced at least at 4k.... now 1080p is different lol...


Quote:


> Originally Posted by *steadly2004*
> 
> I know I can pass 3dmark but not game (BF1) at the same clock


Okay, this is a good clock on a Fury X but not a great clock then. All I was really going on was the other scores I was looking at in FireStrike and TimeSpy, where I've only seen one other score running at a 1210MHz GPU clock.
Once I get my cooling setup complete, hopefully I'll be able to increase this by a bit









Thanks for the input!


----------



## steadly2004

Quote:


> Originally Posted by *Skyl3r*
> 
> Okay, this is a good clock on a Fury X but not a great clock then. All I was really going on was the other scores I was looking it in FireStrike and TimeSpy where I've only seen one other score running at 1210MHz GPU clock.
> Once I get my cooling setup complete, hopefully I'll be able to increase this by a bit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the input!


I just realized I misspoke. I meant at the same clocks for my system, not the same as your clocks. My bad. I can bench at 1200/575 or so, but drop it to 1150/550 for gaming.


----------



## LionS7

Quote:


> Originally Posted by *steadly2004*
> 
> I just realized I miss spoke. I meant at same clocks for my system, not same as your clocks. My bad. I can bench at 1200/575 or so, but drop it to 1150/550 for gaming.


What voltage for 1150/550 in games with unlocked framerates, like Battlefield 1 ?


----------



## steadly2004

Quote:


> Originally Posted by *LionS7*
> 
> What voltage for 1150/550 in games with unlocked framerates, like Battlefield 1 ?


I believe it runs at 1.25v with an occasional spike to 1.28v, but I do have a Fury Nitro OC, not a Fury X. I can't remember at the moment if that's with just the power limit raised or with a +.25v in afterburner. It's been a while since I've messed with any settings. This is also with a custom fan curve.


----------



## Skyl3r

Whoo, got my clocks up a little more:
http://www.3dmark.com/3dm/18472752?

1230 GPU Clock
630 HBM


----------



## LionS7

@Skyl3r, nice. Can you put result with one card on 1230/630 ?


----------



## bluezone

OK, Play3r TV recently did a nice roundup and test of thermal pastes (TIMs).






Looks like I'm sticking with my Gelid GC Extreme.


----------



## dagget3450

Quote:


> Originally Posted by *Skyl3r*
> 
> Whoo, got my clocks up a little more:
> http://www.3dmark.com/3dm/18472752?
> 
> 1230 GPU Clock
> 630 HBM


This is the type of information i am looking for on doing a Ryzen CF Fiji build.

I really don't know why it says the base clock is 4GHz on the 1800X, it's actually 3.6 or something, right?

What ram speed were you running?


----------



## Skyl3r

Quote:


> Originally Posted by *LionS7*
> 
> @Skyl3r, nice. Can you put result with one card on 1230/630 ?


I'm gonna be pretty slammed today, but I'll try to get to that for you.
Quote:


> Originally Posted by *dagget3450*
> 
> This is the type of information i am looking for on doing a ryzen cf fiji build.
> 
> I really dont know why it says base clock is 4ghz on 1800x its actually like 3.6 or something right?
> 
> What ram speed were you running?


Overall I'm scoring pretty well, in my opinion.

The boost clock is 4GHz iirc. I have it clocked up to 4.05 though. That was my max stable clock. I'm running 2400MHz RAM because I thought that was the max supported when I ordered it.


----------



## ramos29

good evening/morning everyone. well, i used to have a 295x2; it broke for no reason: black screen crash + high speed fan rotation + pc shutting down...
so i bought 2 radeon fury nitros
i have an msi z97 gaming 5 with 3 pci express 3.0 slots: 16x, 8x/8x, or 8x/4x/4x
the problem is that my second pci express slot is broken
so i put both cards in the first and third slots
gpu-z is showing that the first card runs at 8x and the second at 4x 3.0
can i fix that?
is there any performance impact?
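For a rough sense of what x8 vs x4 means in raw bandwidth (PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding; this sketch ignores protocol overhead, and the real-world hit in games at x4 3.0 is usually much smaller than the raw numbers suggest):

```python
# Rough PCIe 3.0 throughput per link width: 8 GT/s per lane,
# 128b/130b encoding, protocol overhead ignored.
def pcie3_gbps(lanes: int) -> float:
    per_lane = 8e9 * (128 / 130) / 8  # bytes/s per lane
    return lanes * per_lane / 1e9     # GB/s

# x8: ~7.9 GB/s, x4: ~3.9 GB/s
print(f"x8: {pcie3_gbps(8):.1f} GB/s, x4: {pcie3_gbps(4):.1f} GB/s")
```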


----------



## steadly2004

Quote:


> Originally Posted by *ramos29*
> 
> good evening/mornig every one.well i used to have a 295x2, it is broken for no reason: the black screen crash + high speed fan rotaion + pc shutting down...
> so i bought 2 radeon fury nitro
> i have an msi z97 gaming 5 with 3 pci express 3.0, 16x,8x8x or 8x 4x 4x
> the prblem is that my second pci express is broken
> so i put both cards on the first and third slot
> the gpu z is showing that the first card runs at 8x and the second at 4x 3.0
> can i fix that?
> is there any performance impact?


It's either something you can fix in the BIOS, or not possible with your board. Have you tried to go into the BIOS and fiddle with your PCI express slot settings? Maybe if you limit the first to 8x the second will get 8x? Or possibly (worst case scenario) the 3rd slot will not support anything greater than 4x. Do you have your user's manual?

*edit*
go into your bios and look for this







from your user manual


----------



## ramos29

the bios gave me the possibility to use 16x/0x/0x or 8x/8x/0x or 8x/4x/4x
if i choose 8x/8x/0x the crossfire is disabled as the second fury is on the third slot; i can't select which slots get the 8x
i think i am screwed -_- i did not have that issue with the 295x2 as it was a single card


----------



## steadly2004

Quote:


> Originally Posted by *ramos29*
> 
> the bios gave me the possibility to use 16x /0x/0x or 8x/8x/0x or 8x/4x/4x
> if i choose the 8x/8x/0x the crossfire is disabled as the second fury is on the third slot, i cant select which slot to get the 8x
> i think i am screwed -_- i did not have that issu with the 295x2 as it was a single card


Ah, I see. You may not be able to use it in that slot at full speed. Why don't you run a few benchmarks and see how you stack up at comparable clock speeds? Like on our 3dmark pages, or other benchmark leaderboards. Just look at the bottom under the Titan and 1080 entries to find a Fury, LOL.

If you're happy with your gaming results, I wouldn't worry about it too much, unless you want to go get another motherboard. My OCD might drive me to get a different mobo, even if the gaming is fine.


----------



## ramos29

in the 3dmark charts i found only single gpu scores, not dual ones; maybe i will compare to the radeon pro duo and see if i am far behind


----------



## diggiddi

Quote:


> Originally Posted by *Skyl3r*
> 
> Whoo, got my clocks up a little more:
> http://www.3dmark.com/3dm/18472752?
> 
> 1230 GPU Clock
> 630 HBM


Dude, were you running this setup and the one in your sig on a 1000w PSU? cos I'm looking at getting 1300w of power


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Skyl3r*
> 
> Whoo, got my clocks up a little more:
> http://www.3dmark.com/3dm/18472752?
> 
> 1230 GPU Clock
> 630 HBM
> 
> 
> 
> 
> Dude were you running this setup and the one in your sig on 1000w PSU? cos I'm looking at getting 1300w power

I hit around 1K from the wall when gaming on my rig. 2X Fury's OC'ed and 4.95GHz on my FX.


----------



## Alastair

It's actually damned irritating. Wildlands and For Honor get crossfire profiles, but Siege, which runs on the same freaking engine, is still left in the dark. And Siege hits a single Fury hard even at 1080P.


----------



## diggiddi

Quote:


> Originally Posted by *Alastair*
> 
> I hit around 1K from the wall when gaming on my rig. 2X Fury's OC'ed and 4.95GHz on my FX.


Is that right? So i can rock a 1000w PSU and be safe? btw, repped up


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I hit around 1K from the wall when gaming on my rig. 2X Fury's OC'ed and 4.95GHz on my FX.
> 
> 
> 
> Is that right, so i can rock with a 1000w PSU and be safe, btw repped up

I would say 1.2K and you're golden. I think a 1K PSU might be cutting it close. Assuming 90% efficiency you're still pulling 900 watts through it. You only have 100 watts of wiggle room.
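The headroom math above can be sketched in one line: wall draw times efficiency gives the DC load the PSU actually has to supply (the 90% efficiency figure is an assumption, as in the post):

```python
# Back-of-envelope PSU headroom: wall draw * efficiency = DC load,
# and headroom is the PSU's DC rating minus that load.
def psu_headroom(wall_watts: float, efficiency: float, rating_watts: float) -> float:
    dc_load = wall_watts * efficiency
    return rating_watts - dc_load

print(psu_headroom(1000, 0.90, 1000))  # 100 W left on a 1 kW unit
print(psu_headroom(1000, 0.90, 1200))  # 300 W left on a 1.2 kW unit
```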


----------



## weespid

Quote:


> Originally Posted by *weespid*
> 
> -snip- looking to get all my displays connected to my card. -snip-
> There is a lot of misinformation going around regarding this subject and i'm just looking to try and help clear it up (as well as connect my displays)
> TLDR
> does the r9 nano support 2 legacy connections through 1 passive adapter and the HDMI port?


well, life always has to get in the way of things, but i have finally received the flea bay adapters for a grand total of $3.50 cad and i am happy to say they work







this is just a proof of concept:
ultrawide acting as two monitors (pbp) via dvi and hdmi to the card's native ports
syncmaster on the displayport to vga adapter
dell on the passive displayport to hdmi adapter.


Spoiler: pics for proof









now to take down this setup and move it across the room to the bigger desk....


----------



## diggiddi

I hope you like this portrait its called 2 for 2


----------



## Ceadderman

Decided last night to pull the trigger on an RX 480 8gb. Just couldn't resist the price to performance ratio. Picked it up for $210 for the XFX OC model with a $30 rebate and Doom. So I will be skipping Fury. Been fun fellas.









~Ceadder


----------



## ramos29

i got a couple of them a week ago (the picture with the two nitros), but did not find an hdmi cable inside either box. i don't need them, i am just surprised by their absence


----------



## Alastair

Maybe you need this?


----------



## ramos29

i am already using a displayport cable so no real need for an hdmi one,
the one you posted looks like a hosepipe XD


----------



## Thoth420

Good old HDMI to H2O....classic


----------



## Johan45

Just had to post this here. Do you know how to make your FX score like Intel haha. Run it at 6.2GHz


----------



## kondziowy

Guys, is there a way to check VRM temperature on the Fury? My Asus Fury displays 34-40°C VRM VDD temperature at full load using the newest HWiNFO. It's been like that for a year now.

There is also "GPU Thermal Diode" and "VR VDDC Temperature" and both show the same temperature (60-75°C depending on load).

Shouldn't the VRM be like 80°C or something?


----------



## Ceadderman

Quote:


> Originally Posted by *ramos29*
> 
> i got a couple of them a week ago( the picture with the two nitro ), but did not found hdmi cable inside both cases, i dont need them i am just surprised of their abscence


If I get just the card, Doom and the disc in my box, I'm gonna be pissed. I have an HDMI cable so I should be able to connect it straight to 1 monitor. But I'm considering connecting to a widescreen TV and don't know if the HDMI on the card can power that large of a desktop. If it can't and I don't have a DP to HDMI adapter, that would be Fail.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> If I get just the card, Doom and the disc in my box am gonna be pissed. I have an HDMI cable so I should be able to connect it straight to 1 monitor. But I'm considering connecting to a wide screen TV and don't know if the HDMI on the card can power that large of a desktop. If it can't and I don't have the DP to HDMI that would be Fail.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Hey Ceadder!

The HDMI connection is just a connection, it will connect to a TV just fine!


----------



## Ceadderman

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> If I get just the card, Doom and the disc in my box am gonna be pissed. I have an HDMI cable so I should be able to connect it straight to 1 monitor. But I'm considering connecting to a wide screen TV and don't know if the HDMI on the card can power that large of a desktop. If it can't and I don't have the DP to HDMI that would be Fail.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey Ceadder!
> 
> The HDMI connection is just a connection, it will connect to a TV just fine!

Sweet. Thanks Laz, I knew I could count on my OCN brethren to set things straight.









Next splurge is a 4k TV. But that won't be for a while since I still need CVIHero and RAM cooling solutions worked out. Picked up a Ryzen 7 1800x with the 480. Will be here Wednesday at the latest.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> Sweet. Thanks Laz, I knew I cud count on my OCN brethren to set things straight.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Next splurge is a 4k TV. But that won't be for awhile since I still need CVIHero and RAM cooling solutions worked out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


No worries Brohemius Maximus! Unless it's something funky like MHL, all the HDMI does is pass the signal through. Introducing 4K kinda messed things up with HDMI 1.4a and HDMI 2.0 and whatnot, but whatever. It looks like the RX 480 is HDMI 2.0 compliant so it shouldn't be an issue!
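The 4K caveat above comes down to link bandwidth. A quick sketch of the uncompressed pixel data rate for 4K60 shows why HDMI 1.4 falls short and 2.0 doesn't (active pixels only; blanking intervals add a bit more on the real link):

```python
# Uncompressed video data rate: why 4K60 needs HDMI 2.0.
# Counts active pixels only, ignoring blanking overhead.
def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

rate = video_gbps(3840, 2160, 60, 24)
# ~11.9 Gbit/s -- past HDMI 1.4's budget, within HDMI 2.0's 18 Gbit/s
print(f"4K60 @ 24bpp: {rate:.1f} Gbit/s")
```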


----------



## Ceadderman

Yup 2.0 compliant. The price was too good to pass on considering the MSI and Visiontek cards were 4gb and the XFX was 8gb at about the same price. So hades yes I went with the XFX even though I may not be able to drop an EK block on it.









Hopefully I can. Though it may take some modification. I have two 6870s that have Thermosphere blocks but no 1440p or 4k capability, so even if I can't it's not like I will be broken-hearted, since I also have a 5770 and all are reference cards. I could have gone water on the latter but it was not EK so I chose to stick with the stock cooling option.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> Yup 2.0 compliant. The price was too good to pass on considering the MSI and Visiontek cards were 4gb and the XFX was 8gb at about the same price. So hades yes I went with the XFX even though I may not be able to drop an EK block on it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully I can. Though it may take some modification. I have two 6870s that have Thermosphere blocks but no 1440p or 4k capability so even if I can't it's not like I will be broken hearted since I also have 5770 and all are Reference cards. I could gone water on the latter but it was not EK so chose to stick with stock cooling option.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah, this card will demolish your old ones for sure! lol, I moved from custom water (selling my stuff off here on OCN) to big air, now to AIO for my Ryzen 7 1700... a quality AIO is good enough for me. Heck, if I'm not happy with it I'll just put better fans on it and deal.


----------



## 99belle99

My Fury X is not a good overclocker. I can run benchmarks at 1150/550; any higher and it will crash. But even at those same settings it will crash in game. So I just leave it not overclocked anymore.

My old R9 290 reference was a brilliant overclocker but I used to crank up the fan speed so it was seriously loud.


----------



## dagget3450

Quote:


> Originally Posted by *99belle99*
> 
> My Fury X is not a good overclocker. I can run benchmarks at 550 and 1150 any higher and it will crash. But with the same settings in game it will crash. So I just leave it not overclocked anymore.
> 
> My old R9 290 reference was a brilliant overclocker but I used to crank up the fan speed so it was seriously loud.


Two different beasts; the Fury X has almost double the shaders, plus HBM. With a few other issues like the node process, they just aren't good overclockers per se. Will be interesting to see how Vega works out and if it comes with HBM. Not many HBM-equipped cards to compare and test HBM clocks on.


----------



## Ceadderman

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Yup 2.0 compliant. The price was too good to pass on considering the MSI and Visiontek cards were 4gb and the XFX was 8gb at about the same price. So hades yes I went with the XFX even though I may not be able to drop an EK block on it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully I can. Though it may take some modification. I have two 6870s that have Thermosphere blocks but no 1440p or 4k capability so even if I can't it's not like I will be broken hearted since I also have 5770 and all are Reference cards. I could gone water on the latter but it was not EK so chose to stick with stock cooling option.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, this card will demolish your old ones for sure! lol, I moved from custom water (selling my stuff off here on OCN) to big air, now to AIO for my Ryzen 7 1700... a quality AIO is good enough for me. Heck, if I'm not happy with it I'll just put better fans on it and deal.

Fer sure! I knew that going in. 100fps will murder anything 60fps or less in terms of speed and rendering.









My systems all have custom loops (cept the inherited Dell) so it's a need in terms of silence and temps at this point. An 8gb OC'ed card will run hot, so I hope the 480 block will work or can be modded to.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> Fer sure! I knew that going in. 100fps will murder anything 60fps or less in terms of speed and rendering.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My systems all have custom loops (cept inherited Dell) so it's a need in terms of silence and Temps at this point. An 8gb OC'ed card will be hot so I hope the 480 block will work or can be model to.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Did you get the XFX RX 480 RS? or GTR? I think the RS is reference pcb, not sure about the GTR though...


----------



## Ceadderman

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Fer sure! I knew that going in. 100fps will murder anything 60fps or less in terms of speed and rendering.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My systems all have custom loops (cept inherited Dell) so it's a need in terms of silence and Temps at this point. An 8gb OC'ed card will be hot so I hope the 480 block will work or can be model to.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you get the XFX RX 480 RS? or GTR? I think the RS is reference pcb, not sure about the GTR though...

GTR isn't Reference. Mine is the Reference cooler. Not sure whether the OC edition will interfere with the block or not since it may have extra caps on it for the bios.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> GTR isn't Reference. Mine is the Reference cooler. Not sure whether the OC edition will interfere with the block or not since it may have extra caps on it for the bios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Ahhhh, ok gotcha. Well here's to hoping your blocks work just fine! I'm curious to hear about your OCs on water with that card.


----------



## Ceadderman

Stock clock on this card is 1265. If I can get it under water I will try for better. If not, then 1265 is where it will remain.









~Ceadder


----------



## LazarusIV

Quote:


> Originally Posted by *Ceadderman*
> 
> Stock clock on this card is 1265. If, I can get it under water I will try for better. If not, then 1265 is where it will remain.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I've been under the impression even at stock clocks these cards are great performers... Seems like a win-win!


----------



## Skyl3r

I was able to bench again at 1250/640 on my Fury X's and 4.2GHz on my 1800x.
http://www.3dmark.com/3dm/18571858?


... It's stock







...

I'll try to get Time Spy to work so I can get a run on that, but honestly I have no faith that it will work. I don't know why it's so troublesome for me.

*EDIT:*

Real men run 595v per MHz. Haters will say AMD runs hot.


But really, what's going on here lol


----------



## 99belle99

Quote:


> Originally Posted by *Skyl3r*
> 
> I was able to bench again at *1250/640 on my Fury X's* and 4.2GHz on my 1800x.


How are you getting such great numbers on your Fury X's? I can barely get 1150; any higher and it just crashes. I only tried 550 on the memory.


----------



## Skyl3r

Quote:


> Originally Posted by *99belle99*
> 
> How are you getting such great numbers on your Fury X's I can barely get 1150, any higher and it just crashes. I only tried 550 on the memory.


A couple factors. First of all, before doing anything crazy, I was able to run 1200/560. So, I may have pretty decent cards to begin with.

1.

__
https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/
 - this is what I had to do to get up to 1200/560 the first time.

2. Custom BIOS - Sounds very intimidating, but it worked flawlessly for me. The BIOS here is what I'm running.

3. Chillbox - With the BIOS, I was soundly at 1230/630; but it took the chillbox to get me to 1250/640.

Keep in mind, each of these things comes with its own series of risks. Only do what you are comfortable doing.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> A couple factors. First of all, before doing anything crazy, I was able to run 1200/560. So, I may have pretty decent cards to begin with.
> 
> 1.
> 
> __
> https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/
> - this is what I had to do to get up to 1200/560 the first time.
> 
> 2. Custom BIOS - Sounds very intimidating, but it worked flawlessly for me. The BIOS here is what I'm running.
> 
> 3. Chillbox - With the BIOS, I was soundly at 1230/630; but it took the chillbox to get me to 1250/640.
> 
> Keep in mind, each of these things comes with its own series of risks. Only do what you are comfortable doing.


Ah, you have built your own personal meat locker for your PC.


----------



## Alastair

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *99belle99*
> 
> How are you getting such great numbers on your Fury X's I can barely get 1150, any higher and it just crashes. I only tried 550 on the memory.
> 
> 
> 
> A couple factors. First of all, before doing anything crazy, I was able to run 1200/560. So, I may have pretty decent cards to begin with.
> 
> 1.
> 
> __
> https://www.reddit.com/r/3tljrf/sapphire_trixx_how_to_remove_voltage_limit_0075mv/
> - this is what I had to do to get up to 1200/560 the first time.
> 
> 2. Custom BIOS - Sounds very intimidating, but it worked flawlessly for me. The BIOS here is what I'm running.
> 
> 3. Chillbox - With the BIOS, I was soundly at 1230/630; but it took the chillbox to get me to 1250/640.
> 
> Keep in mind, each of these things comes with its own series of risks. Only do what you are comfortable doing.

is there any sort of negative performance scaling due to the high voltage levels? This is a problem that seems to plague all Fiji based cards at anything above the 1250mv level.

What sort of cooling you running? Stock AIO in the chill box or do you have a block on it as well? What are your temps like under load at your game stable settings?

Will this work on a reference PCB Sapphire Tri-x? Can I use this rom to still get my unlocked shaders?


----------



## dagget3450

I have 4 Fury Xs on the stock coolers; bench testing OCs, i was able to get only 2 of them to 1200+/560+ and they were obviously not game stable. The other two would barely make it to 1100/560 and weren't stable. I did not bother with bios mods/exotic cooling because at the time others were running water blocks and getting roughly the same clocks. I game on them at stock clocks because i got tired of trying to OC them and keep it stable.

Of the 4 i have and tested, they all sit right above each other on the bench clocks i used: 1100/1100/1200/1220, roughly, give or take a few MHz


----------



## Skyl3r

Quote:


> Originally Posted by *Alastair*
> 
> is there any sort of negative performance scaling due to the high voltage levels? This is a problem that seems to plague all Fiji based cards at anything above the 1250mv level.


There doesn't seem to be negative performance as much as diminishing returns.
Consider the following image:


The BIOS I linked to is built by Buildzoid and will increase your load voltage to 1.3v as well as your power limit to I believe 400w.
The last two runs in my image above are done at +44mV..

Here's another example where I went from +44mV to +75mV


Focusing just on the graphics scores, it seems that I am still getting gains even after a sizeable increase in voltage.
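To put rough numbers on that voltage-for-clocks tradeoff: dynamic power scales about as f·V², so small voltage bumps are expensive relative to the clock gained. The 1.3V baseline below matches the modded BIOS mentioned above; the clock and offset figures are illustrative:

```python
# Rough dynamic-power scaling when chasing clocks with voltage:
# power goes roughly as frequency * voltage^2.
def power_scale(f0: float, v0: float, f1: float, v1: float) -> float:
    return (f1 / f0) * (v1 / v0) ** 2

# e.g. 1230 -> 1250 MHz with a +75 mV bump over a 1.3 V baseline:
# about a 14% power increase for under 2% more clock.
print(f"{power_scale(1230, 1.300, 1250, 1.375):.3f}x power")
```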
Quote:


> Originally Posted by *Alastair*
> 
> What sort of cooling you running? Stock AIO in the chill box or do you have a block on it as well? What are your temps like under load at your game stable settings?


It's a full custom loop. Two GPU blocks and a CPU block. I'm not sure what coolant they put in those AIO's, but I wouldn't want to bet on it working fine under 0c.
If I have the chillbox down to temp, my GPU's will peak at 10-12c. When I did the run at 1250/640 the peak temperature was 14c.

Just so we're on the same page, it should be colder than that. The AC in my chillbox has clogged capillary tubes (which deliver the refrigerant). So theoretically, once I get this fixed, I should see much colder temps. That's the theory anyways.

Quote:


> Originally Posted by *Alastair*
> 
> Will this work on a reference PCB Sapphire Tri-x? Can I use this rom to still get my unlocked shaders?


In the link I sent, the download contains Fury BIOS for the Nitro, STRIX, Tri X and Windforce. There's 4 rom files in each of the Fury folders. They'll look like:

3584FuryTriXBIOSMAX.rom
3776FuryTriXBIOSMAX.rom
3840FuryTriXBIOSMAX.rom
4096FuryTriXBIOSMAX.rom
Those will be for unlocked shaders.


----------



## Alastair

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> is there any sort of negative performance scaling due to the high voltage levels? This is a problem that seems to plague all Fiji based cards at anything above the 1250mv level.
> 
> 
> 
> There doesn't seem to be negative performance as much as diminishing returns.
> Consider the following image:
> 
> 
> The BIOS I linked to is built by Buildzoid and will increase your load voltage to 1.3v as well as your power limit to I believe 400w.
> The last two runs in my image above are done at +44mV..
> 
> Here's another example where I went from +44mV to +75mV
> 
> 
> Focusing just on the graphics scores, it seems that I am still getting gains even after a sizeable increase in voltage.
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> What sort of cooling you running? Stock AIO in the chill box or do you have a block on it as well? What are your temps like under load at your game stable settings?
> 
> 
> It's a full custom loop. Two GPU blocks and a CPU block. I'm not sure what coolant they put in those AIO's, but I wouldn't want to bet on it working fine under 0c.
> If I have the chillbox down to temp, my GPU's will peak at 10-12c. When I did the run at 1250/640 the peak temperature was 14c.
> 
> Just so we're on the same page, it should be colder than that. The AC in my chillbox has clogged capillary tubes (which delivers refrigerant). So theoretically, once I get this fixed, I should see much colder. That's the theory anyways.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Will this work on a reference PCB Sapphire Tri-x? Can I use this rom to still get my unlocked shaders?
> 
> Click to expand...
> 
> In the link I sent, the download contains Fury BIOS for the Nitro, STRIX, Tri X and Windforce. There's 4 rom files in each of the Fury folders. They'll look like:
> 
> 3584FuryTriXBIOSMAX.rom
> 3776FuryTriXBIOSMAX.rom
> 3840FuryTriXBIOSMAX.rom
> 4096FuryTriXBIOSMAX.rom
> Those will be for unlocked shaders.
Click to expand...

That is pretty cool. So you are not seeing any negative scaling in 3DMark; what about Heaven, and games with a benchmark feature?

I downloaded the BIOS. It basically does what it says on the box: it wanted to crank me to 1.260V-ish, so I set Afterburner to -6 and it's giving me around 1.25V. I only tested from 1175mV up to 1250mV, but 1250mV seems to give me the same results in Heaven as 1175mV at the same clocks.

I unfortunately do not have access to a chill box. Ambient in my room is 27C (still pretty toasty in Africa). I'm seeing 47C and 45C on my cards on my 640mm loop, and the best I can maintain at these ambients is 1140 core. Still working on memory.

Heaven Bench 4.0 at the extreme preset @ 4K and 8x AA: 37.5 FPS at stock 1000/500,
41.7 FPS at 1250mV 1140/550.

Both runs 3840 shaders.


----------



## Skyl3r

Quote:


> Originally Posted by *Alastair*
> 
> That is pretty cool. You are not seeing any negative scaling in 3D mark. What about heaven and in games with a benchmark feature?


To be honest, I somehow fried my 240GB SSD again and I'm running on a 60GB SSD (which was halfway full after just the Windows installation).
So, I'm sorta limited in how much I can run right now, as a lot of popular games nowadays are larger than the free space on my SSD.
I don't know if these Intel SSDs are supposed to be great and I'm somehow using them wrong, or if they really have a life expectancy of several months.

I'll test out Heaven though and see what I can get to happen.
I'll actually run a test increasing voltage and increasing clock and see if there's a tipping point where voltage starts hurting performance. Worth a test.


----------



## Alastair

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> That is pretty cool. You are not seeing any negative scaling in 3D mark. What about heaven and in games with a benchmark feature?
> 
> 
> 
> To be honest, I somehow fried my 240GB SSD again and I'm running on a 60GB SSD (which was halfway full after just the Windows installation).
> So, I'm sorta limited in how much I can run right now, as a lot of popular games nowadays are larger than the free space on my SSD.
> I don't know if these Intel SSDs are supposed to be great and I'm somehow using them wrong, or if they really have a life expectancy of several months.
> 
> I'll test out Heaven though and see what I can get to happen.
> I'll actually run a test increasing voltage and increasing clock and see if there's a tipping point where voltage starts hurting performance. Worth a test.
Click to expand...

Awesome, thanks. Yeah, I posted some of my initial results in my previous post. I hope to get an aircon sometime so I don't have to try running +300W monsters in 27C ambient temperatures. :smile:


----------



## Skyl3r

Quote:


> Originally Posted by *Alastair*
> 
> awesome thanks. Yeah I posted some of my initial results in my previous post. I hope to get an aircon sometime so I don't have to try run +300w monsters in 27C ambient temperatures. :smile:


Ah, I did miss that you posted results.
I'll more or less copy your tests and see if I get different results. It would be very interesting if I did.

Build a phase change cooler!


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> To be honest, I somehow fried my 240GB SSD again and I'm running on a 60GB SSD (which was halfway full after just the Windows installation).
> So, I'm sorta limited in how much I can run right now, as a lot of popular games nowadays are larger than the free space on my SSD.
> I don't know if these Intel SSDs are supposed to be great and I'm somehow using them wrong, or if they really have a life expectancy of several months.
> 
> I'll test out Heaven though and see what I can get to happen.
> I'll actually run a test increasing voltage and increasing clock and see if there's a tipping point where voltage starts hurting performance. Worth a test.


Perhaps the chill box is adversely affecting your SSDs. Could you mount them outside the box? I'm thinking condensation.

Nice to see you're getting good results with the chill box. I've already tried hybrid cooling with an added Peltier device, with poor results.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> Perhaps the chill box is adversely effecting your SSDs. could you mount them outside the box? I'm thinking condensation.
> 
> Nice to see your getting good results with the chill box. I've already tried Hybrid cooling with an added Peltier device with poor results.


I didn't mention this earlier, but I actually do have the SSDs outside of the box and have never put them in it. I heard SSDs don't like the cold.

I tried Peltiers first too, actually. I thought it'd be a great idea, but the cost-to-performance ended up outweighing any benefits. Now I'm speccing out a system with a roommate that'll run on R-404A: we're building a centralized phase-change cooling system to use on both of our PCs. If we build it correctly, we should see as low as -40C.


----------



## Alastair

I wish I could do things like phase change and stuff, but water is as "exotic" as I will go. My PC moves around a lot: LANs and stuff. So yeah, water it will have to stay.


----------



## LionS7

How many of you have coil whine even at low voltage? On my GPU, the coil whine is loud even at 1.03V on the core.


----------



## ramos29

What do you think of my score? Both Furies are running at 1050/500.
I'm asking to check whether the 8x/4x CrossFire is holding me back.


----------



## Skyl3r

Quote:


> Originally Posted by *LionS7*
> 
> How many of you have coil whine even on low voltage ? On my GPU, the coil whine is loud even on 1.03V on the core.


I do not. My roommate with Nanos doesn't either. Not sure how normal that is.
Quote:


> Originally Posted by *ramos29*
> 
> what do you think of my score ? both furies are running at 1050/500
> i am asking to check if the 8x/4x crossfire is holding me back


At stock frequencies, my dual Fury X's score 10,148 in graphics on Time Spy. So, at least to me, it wouldn't seem like you're being restricted by bandwidth.


----------



## Thoth420

Quote:


> Originally Posted by *LionS7*
> 
> How many of you have coil whine even on low voltage ? On my GPU, the coil whine is loud even on 1.03V on the core.


Only under load, and it is not too loud (imagine if it weren't under a block). In my near-silent system I can't hear it at all with the case door on. I haven't found a config without coil whine in years, to some degree or other, in some scenario or other, physical location be damned. My GPU is running stock clocks on the latest BIOS, and ClockBlocker is running as well; before I added that, I was getting coil whine at 2D clocks just from moving my mouse on the desktop. Are you using ClockBlocker?


----------



## ramos29

Quote:


> Originally Posted by *Skyl3r*
> 
> I do not. My roommate with Nanos doesn't either. Not sure how normal that is.
> At stock frequencies, my dual Fury X's score 10,148 in graphics on Time Spy. So, at least to me, it wouldn't seem like you're being restricted by bandwidth.


thx bro


----------



## LionS7

Quote:


> Originally Posted by *Thoth420*
> 
> Only under load and it is not too loud...imagine if it wasn't blocked in a near silent system I wouldn't hear it all with the case door on. I haven't found a config that didn't have coil whine(physical location be damned) in years to some degree or other in some scenario or other. My GPU is running stock clocks latest BIOS and clockblocker is running as well. I was getting coil whine at 2d clocks when moving my mouse on desktop before I added that in. Are you using clockblocker?


No, I'm getting whine only in 3D, and a loud one at 100% core load. My VID is 1.20V and I'm on 1.26V+ for an 1100MHz core, so the whine is a little louder, but not much more than at 1.20V. I don't want to swap my stable PSU or motherboard for an uncertain coil-whine fix, since of the three, the card will probably be the first to go anyway. I just wanted to ask whether it is this loud for others too.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> I didn't mention this earlier, but I actually do have the SSDs outside of the box and have not ever put them in the box. I heard SSDs don't like the cold.
> 
> I tried peltiers first too actually
> 
> 
> 
> 
> 
> 
> 
> I thought it'd be a great idea but the cost to performance ended up outweighing any benefits. Now I'm speccing out a system that'll run on R-404a with a roommate. We're building a centralized phase change cooling system to use on both of our PCs. If we build it correctly, we should see as low as -40C.


Very weird that they are dying, then.

Ya, I found that the Peltier drew far too much current for the cooling supplied, and it pumped way too much heat into the case. I tried externally mounting the Peltier as well, using it to cool the air drawn into the GPU cooler. Same results. Made for a nice space heater, though.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> Very weird they are dying then.


Well, the one that died yesterday or two days ago is really the most perplexing. It seems to have just gone for no reason at all, and it was less than 6 months old.
The previous one was definitely my fault, though.
Quote:


> Originally Posted by *bluezone*
> 
> Ya, I found that the Peltier drew far to much current for the cooling supplied and it pumped way too much heat into the case. I tried externally mounting the Peltier as well, using it to cool the air drawn into the GPU cooler. Same results. Made for a nice space heater though.


Yeah, on my attempt I found the same thing. The Peltier just kept getting hotter and hotter and I couldn't figure out why. A thought that has occurred to me recently is using a phase-change cooler running on R-404A to cool a Peltier array, to achieve a theoretical -70 to -80C.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> Yeah, on my attempt I found the same thing. The Peltier just kept getting hotter and hotter and I couldn't figure out why. A thought that has occurred to me recently is using a phase-change cooler running on R-404A to cool a Peltier array, to achieve a theoretical -70 to -80C.


Peltier coolers lose their efficacy if the temperature delta is too great: if the differential between the two plates is too high, they end up doing very little other than consuming power.

That's why I tried to cool the intake air, but there ended up being too much ambient heat for it to be effective.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> Peliter coolers lose all of their efficacy if their temperature delta is too great. Meaning that if the temperature deferential between the two plates is too high, they end up doing very little. Other than consuming power.
> 
> That's why I tried to cool the intake air, but ended up being too much ambient heat to be effective.


Well, seeing as a lot of them are rated at a 50-60C dT, they must be able to work at least up to that dT. If you are referring to the phase-change idea, the TECs would never see a dT higher than 40C.

I may not have understood you right.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> Well, seeing as a lot of them are rated at 50-60c dT, they must be able to work at least to that dT. If you are referring to the phase change idea, the tecs would never see a higher dT than 40c.
> 
> I may not have understood you right.


Yes, you're following correctly. But you bring up an important point: most people assume a larger heat sink simply dissipates heat, when it actually does two things, transporting heat and dissipating it. Peltier coolers do this as well, but they are limited by how much heat (BTU/h, or watts) they can pass through the junction. So if you place a 100 W thermal load on a junction only capable of, say, 75 W, it will not keep up. If you follow my drift.
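To make that concrete, here's a rough back-of-the-envelope check in Python. This is a minimal sketch with made-up illustrative numbers (not from any real TEC datasheet): a Peltier's pumping capacity falls off roughly linearly as the plate-to-plate delta-T grows, so you can quickly see whether a given heat load fits.

```python
# Rough TEC sizing check (illustrative numbers only, not from a datasheet).
# A Peltier pumps Qc_max at dT = 0 and roughly nothing at dT_max, so a
# linear derating gives a quick feel for whether a heat load will fit.

def tec_capacity(qc_max_w: float, dt_max_c: float, dt_c: float) -> float:
    """Approximate heat (in watts) a TEC can pump at a given plate delta-T."""
    if dt_c >= dt_max_c:
        return 0.0
    return qc_max_w * (1.0 - dt_c / dt_max_c)

def can_keep_up(load_w: float, qc_max_w: float, dt_max_c: float, dt_c: float) -> bool:
    return tec_capacity(qc_max_w, dt_max_c, dt_c) >= load_w

# Example: a ~100 W GPU load on a junction rated 150 W at dT = 0, dT_max = 60C.
print(tec_capacity(150, 60, 30))      # 75.0 W pumped at a 30C delta
print(can_keep_up(100, 150, 60, 30))  # False: the junction can't keep up
print(can_keep_up(100, 150, 60, 10))  # True: at a small delta it copes
```

This is also why an array helps: the capacities add, so two such junctions at a 30C delta pump roughly 150 W together.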


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> Yes your following correctly. But you bring up an important point. Most people assume a larger heat sink simply dissipates heat. It actually does two things. It transports and dissipates heat. Peliter coolers do this as well. But they are limited by how many BTU's they can pass through the junction. So if you place a 100 BTU thermal (of Watts) load on a junction only capable of say 75 BTUs, well it will not keep up. If you follow my drift.


Yes, I understand this point. This is why it would need to be an array of Peltiers instead of simply one or two. I have done the math, but the biggest thing holding me back is the cost of the Peltiers and water blocks.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> Yes I understand this point. This is why it would need to be an array of peltiers instead of simply one or two. I have done the math, but the biggest thing holding me back is the cost associated with getting the peltiers and waterblocks.


TECs are not cheap. If I were to try this again, it would be with a water block and TECs mounted on the return from the radiator, with a separate loop for the TEC set. The power cost to run them is a bit high, though.

Regardless of my personal fantasy water loop, I'm interested to see how your chill loop turns out.


----------



## xBastek

So I just got myself a Sapphire R9 Fury Nitro and I was wondering if I can still get the unlocked cores, even though the CU check shows: "Sorry, all 8 disabled CUs can't be unlocked by BIOS replacement."
Or is that it for my GPU?


----------



## rubenlol2

The Nitro cards from Sapphire are almost always hard-locked, i.e. the silicon itself is locked, so no matter what you write to it you can't unlock it.
Soft locks were on the earlier Fury cards, which is why those can have their cores unlocked.


----------



## Wuest3nFuchs

Quote:


> Originally Posted by *xBastek*
> 
> So i just got myself a sapphire r9 fury nitro and i was wondering if i can still get the unlocked cores. Even though th CU shows: Sorry, all 8 disabled CUs can't be unlocked by BIOS replacement.
> Or is that it for my GPU?


Same here, but don't worry, this card does a good job, even on my old Sandy Bridge i7-2700K.


----------



## diggiddi

Guys, I am getting negative scaling with pCARS in CrossFire. I disabled ULPS, enabled High Performance in Windows, and disabled frame pacing and Frame Rate Target Control in CCC, but it's pushing the same number of frames at 1080p as at 4K VSR, and 20 FPS more in a single-card config. What gives?

Or is the FX-8350 @ 4.2GHz the issue here?


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Guys I am getting negative scaling with Pcars in crossfire. I disabled ULPS, enabled High Performance in windows, Disabled frame pacing and frame rate target control in CCC but its pushing same number of frames in 1080 as in 4k VSR and 20 fps more in single config what gives????
> 
> Or is the FX 8350 @4.2ghz the issue here?


Well, how many frames are you pushing? The Fury has never been a great performer at 1080p; it has always done better at higher resolutions.

And why so low on the FX?


----------



## domrockt

Your AMD CPU is pretty much the limit at 1080p.
A Fury comes in handy at resolutions of 1440p and higher.


----------



## Skyl3r

Well, I'm about done with 3DMark at this point. It started running extremely slowly (i.e., 20 FPS when normal is 80 FPS). I looked at the GPU clock and it was pegged at 300MHz. Thought I might have fried my Fury X's (given what I've put them through, I wouldn't be surprised).
Ran FurMark; both GPUs shot up to the specified clock with no problems. Tested a few other benchmarks and the GPUs operated completely normally.

I've had a ridiculous amount of problems with 3DMark every time I've tried to use it:

Time Spy will work for several benchmarks, then stop working, forcing me to uninstall, manually delete the files it left behind, and reinstall. This happens nearly every time I bench
When Time Spy does run, I have about a 50/50 shot of it saying "Drivers not supported", which makes my score "private"
Now it's trying to trick me into replacing my GPUs?...


----------



## xzamples

http://image.prntscr.com/image/8dbfe71249e04e0eb62379fa3a1996cd.png

What is this called, and what does it do?


----------



## Skyl3r

Quote:


> Originally Posted by *xzamples*
> 
> http://image.prntscr.com/image/8dbfe71249e04e0eb62379fa3a1996cd.png
> 
> what is this called and what does it do?


From Guru3D
Quote:


> Now, if you looked closely (and as a Guru3D reader we assume you already did), you would have noticed a set of micro switches on the back side as well. This card has phase load LEDs, full load and a series of LEDs will activate. These switches allow you to configure the colors. 8+1 LEDs located above the PCIe power connectors that indicate the intensity level of the GPU operation. For example, during a typical gaming session all 8 LEDs will be lit, and while typically idling on the desktop a single LED will be lit. These 8 LEDs are user configurable to either red and/or blue by the physical dip-switch located on the back side of the graphics card. The 1 green LED located alongside the 8 LEDs indicates when the graphics card is in ZeroCore Power operation.


http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,4.html


----------



## Alastair

So have you guys seen the stats for Greenland XT in TechPowerUp's VGA database? They're assuming 4096 shaders, 64 ROPs and 256 TMUs @ 1000MHz base and 1200MHz boost for 225 watts, with 8GB of HBM2 on a 2048-bit-wide bus at a 1000MHz clock speed.

Any idea how accurate this is or may be? Sounds like a shrunk Fiji, and that's that.


----------



## diggiddi

Quote:


> Originally Posted by *Alastair*
> 
> well how many frames are you pushing? The Fury has never been a great player at 1080P. It's always performed above those resolutions.
> 
> And why so low on the FX?


Night Thunderstorm + heavy fog with rain @ 1080
pCARS64 Dubai Intl 20 cars

2017-03-12 16:14:10 - pCARS64
Frames: 17630 - Time: 280000ms - Avg: 62.964 - Min: 38 - Max: 85

2017-03-12 16:21:39 - pCARS64
Frames: 16959 - Time: 280000ms - Avg: 60.568 - Min: 40 - Max: 80

2017-03-12 16:29:03 - pCARS64
Frames: 17508 - Time: 280000ms - Avg: 62.529 - Min: 41 - Max: 83

Crossfire
2017-03-15 14:07:07 - pCARS64
Frames: 11958 - Time: 280000ms - Avg: 42.707 - Min: 28 - Max: 63

2017-03-15 14:36:18 - pCARS64
Frames: 12368 - Time: 280000ms - Avg: 44.171 - Min: 29 - Max: 66

The PSU needs replacement, which is why I have it clocked so low.

Quote:


> Originally Posted by *domrockt*
> 
> Your AMD is pretty much the limit on 1080p.
> An Fury comes in Handy with an res of 1440p and higher.


See 4K crossfire figures below

4k VSR Crossfire
2017-03-15 14:53:25 - pCARS64
Frames: 12456 - Time: 280000ms - Avg: 44.486 - Min: 29 - Max: 62
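For what it's worth, logs like these are easy to sanity-check, since the average should just be frames divided by seconds. A minimal Python sketch using the numbers quoted above:

```python
# Sanity-check FRAPS-style logs: Avg FPS = Frames / (Time in seconds).
# Frame counts below are from the pCARS runs quoted above.

runs = {
    "single card, 1080p": (17630, 280_000),  # (frames, duration in ms)
    "CrossFire, 1080p":   (11958, 280_000),
    "CrossFire, 4K VSR":  (12456, 280_000),
}

for name, (frames, ms) in runs.items():
    print(f"{name}: {frames / (ms / 1000):.3f} FPS avg")

# single card, 1080p: 62.964 FPS avg
# CrossFire, 1080p:   42.707 FPS avg
# CrossFire, 4K VSR:  44.486 FPS avg
```

The CrossFire frame totals themselves are lower than the single-card totals over the same 280 s, so this really is negative scaling rather than a reporting quirk.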


----------



## Alastair

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> well how many frames are you pushing? The Fury has never been a great player at 1080P. It's always performed above those resolutions.
> 
> And why so low on the FX?
> 
> 
> 
> Night Thunderstorm + heavy fog with rain @ 1080
> pCARS64 Dubai Intl 20 cars
> 
> 2017-03-12 16:14:10 - pCARS64
> Frames: 17630 - Time: 280000ms - Avg: 62.964 - Min: 38 - Max: 85
> 
> 2017-03-12 16:21:39 - pCARS64
> Frames: 16959 - Time: 280000ms - Avg: 60.568 - Min: 40 - Max: 80
> 
> 2017-03-12 16:29:03 - pCARS64
> Frames: 17508 - Time: 280000ms - Avg: 62.529 - Min: 41 - Max: 83
> 
> Crossfire
> 2017-03-15 14:07:07 - pCARS64
> Frames: 11958 - Time: 280000ms - Avg: 42.707 - Min: 28 - Max: 63
> 
> 2017-03-15 14:36:18 - pCARS64
> Frames: 12368 - Time: 280000ms - Avg: 44.171 - Min: 29 - Max: 66
> 
> PSU needs replacement is why I have it so low
> 
> Quote:
> 
> 
> 
> Originally Posted by *domrockt*
> 
> Your AMD is pretty much the limit on 1080p.
> An Fury comes in Handy with an res of 1440p and higher.
> 
> Click to expand...
> 
> See 4K crossfire figures below
> 
> 4k VSR Crossfire
> 2017-03-15 14:53:25 - pCARS64
> Frames: 12456 - Time: 280000ms - Avg: 44.486 - Min: 29 - Max: 62
Click to expand...

Honestly, it just looks like CrossFire for Project CARS is broken.


----------



## diggiddi

OK, so I'm looking to OC; what do I need to know? Any guides out there?


----------



## Skyl3r

Quote:


> Originally Posted by *diggiddi*
> 
> Ok so looking to OC what do I need to know? any guides out there?


There are lots of GPU overclocking guides online. Here's a decent one:
http://www.tomshardware.com/faq/id-2749337/safe-gpu-overclocking-guide-2016.html

It includes some good material, like images to help identify which part of your GPU is causing instability when overclocked.
On top of that, though, you really want to be careful about temperatures: find out the maximum safe temperature for your card and don't go past it.

I would use Heaven or Valley; I can no longer recommend 3DMark.

Normal stable clocks for the Furys seem to be 1150MHz to 1200MHz on the GPU and 540MHz to 560MHz on the HBM.

I'm getting another Fury X. Nothing says AMD like a stack of GPUs drawing as much as two full GTX 1080 builds


----------



## LeadbyFaith21

So I just got a Fury X to CrossFire with my unlocked Fury and ran some tests in BF1, but got worse frame rates than with a single card. Any ideas as to why?

Or do you guys have any references for fixing/optimizing CrossFire setups in general?


----------



## dagget3450

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> So I just got a Fury X to crossfire with my unlocked Fury and ran some tests in BF1, but got worse frame rates than using a single card, any ideas as too why?
> 
> Or do you guys have any references for fixing/optimizing crossfire setups in general?


I don't have the game; does it support CrossFire? Also, are you running it in DX11 or DX12?


----------



## Skyl3r

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> So I just got a Fury X to crossfire with my unlocked Fury and ran some tests in BF1, but got worse frame rates than using a single card, any ideas as too why?
> 
> Or do you guys have any references for fixing/optimizing crossfire setups in general?


From checking online, it sounds like it only works in DX11 mode, and it might default to DX12.
I don't have the game, but it's worth checking that on your setup.

https://forums.battlefield.com/en-us/discussion/45365/is-crossfire-working


----------



## LeadbyFaith21

Quote:


> Originally Posted by *dagget3450*
> 
> I dont have the game, does it support crossfire? Also you run it in dx11 or 12


It does in DX11 mode, which is what I'm using. Monitoring utilization, one card will run almost maxed, while the other is practically idle for a few seconds, then they'll switch for a few seconds, and back again. It's really odd, unless there's a CPU bottleneck happening, though none of the threads of my CPU are maxed out.
Quote:


> Originally Posted by *Skyl3r*
> 
> From checking online it sounds like it only works in DX11 mode and it might default to DX12?
> I don't have the game; but it's worth checking that on your setup.
> 
> https://forums.battlefield.com/en-us/discussion/45365/is-crossfire-working


I know I'm running DX11, that was the first thing I checked, but I'll look on that link and see if there's anything helpful there. Thanks!!


----------



## Skyl3r

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> I know I'm running DX11, that was the first thing I checked, but I'll look on that link and see if there's anything helpful there. Thanks!!


Yeah, no problem. In addition, you might want to check with something like FurMark to make sure CrossFire is working at all before worrying about BF1 so much.

If it works, I'd play with the CrossFire mode in Radeon Settings.


----------



## dagget3450

Maybe test each CF setting under the game profile and see: Default, AFR, 1x1 and AFR-compatible?


----------



## LeadbyFaith21

I've done Default, AFR, and 1x1 so far. I get some screen flickering with AFR and 1x1 though, so those are out lol

I'll test with the others, and also try disabling frame pacing to see if that helps.


----------



## Minotaurtoo

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> I've done Default, afr, and 1x1 so far, I get some screen flickering with afr and 1x1 though, so those are out lol
> 
> I'll test with the others, and also try disabling frame pacing to see if that helps.


I actually bought a Fury X to get away from CrossFire lol... I had two super-clocking 7950s that together actually beat this Fury X by just a little, but the CrossFire problems annoyed the heck out of me. I don't know if this helps, but I set up custom profiles for problem games that simply disable CrossFire because, like you, I actually had worse frame rates with both cards running than with only one in those games.


----------



## 99belle99

I went to the toilet and came back and noticed that the red Radeon logo on my Fury X was flashing. My computer is under my desk, so I do not see it while I am on the desktop. Anyone know what the problem is? I'm pretty sure it is not supposed to do that.


----------



## LeadbyFaith21

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I actually bought a fury x to get away from crossfire lol... I had two super clocking 7950's that together actually beat out this fury x by just a little... but the crossfire problems annoyed the heck out of me... I don't' know if this helps, but I set up custom profiles for problem games that simply disable crossfire because, like you, I actually had worse frame rates with both running than with only one with the problem games.


Were you able to get better performance that way? Might be fun to mess with and be able to get performance at the same time lol
Quote:


> Originally Posted by *Skyl3r*
> 
> From checking online it sounds like it only works in DX11 mode and it might default to DX12?
> I don't have the game; but it's worth checking that on your setup.
> 
> https://forums.battlefield.com/en-us/discussion/45365/is-crossfire-working


So after checking that forum, it would seem that CrossFire is just broken in BF1 at the moment. But I have a new question now: in Titanfall 2 I max out at around 95-110 FPS in CrossFire regardless of settings, unless I try maxed-out 4K, which drops it down to the 50-60 range; basically anything else has it pegged at 95-110. To me that sounds like a CPU bottleneck and that I need to overclock my CPU. Does that sound right to anyone else?


----------



## Alastair

So I have discovered that DX12 in ROTR does not necessarily give me higher frame rates, but it is a smoother and more consistent overall experience.


----------



## Ceadderman

Quote:


> Originally Posted by *Alastair*
> 
> So I have discovered that DX12 in ROTR does not necessarily provide me with more framerates under DirectX 12. But it is a smoother and more consistent overall experience.


IIRC, the difference between DX versions will rarely be seen in frame rates, but rather in the clarity of pixel shaders. I have zero experience with 12 because I don't use Win10, and neither do any of the clients I build for. The last one I put on an R9 390, even though I also put him on Windows 7; his card will do fine with DX11 even though it's a DX12-generation card. With the current lack of DX12 games, it doesn't make sense to jump into DX12.

~Ceadder


----------



## lanofsong

Hey AMD R9 Radeon Fury/Nano/X/Pro DUO Fiji owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## NBrock

@lanofsong you know my Fury X will be folding


----------



## xkm1948

@gupsterg You got a Ryzen platform? How is it doing with the Fury X? I am tempted to go AMD once their Naples-based X399 platform drops this summer. 16 cores / 32 threads sounds sexy as hell


----------



## Ceadderman

Quote:


> Originally Posted by *xkm1948*
> 
> @gupsterg You got a RyZen platform? How is it doing with FuryX? I am tempted to go AMD once their Napless X399 platform drops this summer. *16core32threads* sound sexy as hell



Anyone wanna purchase an 1800X NIB? lol

~Ceadder


----------



## Minotaurtoo

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> Were you able to get better performance that way? Might be fun to mess with and be able to get performance at the same time lol
> So after checking that forum, it would seem that crossfire is just broke on BF1 at the moment. But I have a new question now, I max out with frame rates in Titanfall 2 at around 95-110 in crossfire regardless of settings, unless I try to push maxed out at 4k, which drops them down to the 50-60 range, basically anything else has it pegged at 95-110. To me that sounds like a CPU bottleneck and that I need to overclock my CPU, does that sound right to anyone else?


Yeah, in ways I got better performance, like being able to disable stuff that didn't work well with specific games instead of doing it universally. But the big thing was keeping CrossFire from killing certain games... Spintires was the worst for me with CrossFire.


----------



## Ceadderman

Not a lot of games are set up to run SLI/CrossFire right out of the gate. The developers would rather the manufacturers do it, and vice versa. Both manufacturers do what they can to support multi-card setups, but there is no money in it for either the developers or the manufacturers, so it's slow going where SLI/CrossFire is concerned, IMHO.
It's up to us Gamers to voice our frustrations to both. Maybe if they hear enough of us, they will both work together to knock it out of the park.









~Ceadder


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ceadderman*
> 
> Not a lot of games are set up to run SLi/xFire right out of the gate. The developers would rather the manufacturers do it and vice versa. Both manufacturers do what they can to support multi card setups but there is no money in it for both the developers or the manufacturers, so it's slow going where SLi/xFire is concerned IMHO.
> 
> 
> It's up to us Gamers to voice our frustrations to both. Maybe if they hear enough of us, they will both work together to knock it out of the park.
> 
> ~Ceadder


That is the reason I wanted out of CrossFire... yeah, one Fury X isn't going to play all games at 4K with max settings while maintaining 60 FPS, but believe it or not, in most games I play it actually does; for the others I usually drop to 1440p... One day I hope for a better solution, but I'm looking for about double the Fury X's performance before I bail out... that, and I'm absolutely spoiled by the AIO liquid cooling.


----------



## MrKoala

Quote:


> Originally Posted by *Minotaurtoo*
> 
> that is the reason I wanted out of crossfire... yeah one fury x isn't going to play all games at 4k with max settings while maintaining 60 fps... but believe it or not, in most games I play it does actually, the others I usually drop to 1440p.... one day I hope for a better solution, but I'm looking for about double the fury x performance before I bail out.... that and I'm absolutely spoiled by the AIO liquid cooling.


We should see a much faster (than PCIe) interconnect that allows multi-GPU without per-engine support when Navi arrives; at least that's what AMD's roadmap says. The two GPUs will likely still sit on the same PCB though, so you couldn't buy one card now and add another later.
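To put numbers on why multi-GPU needs per-engine support today: anything a GPU has to fetch across PCIe arrives an order of magnitude slower than from its local HBM. A back-of-envelope sketch (the ~0.985 GB/s-per-lane PCIe 3.0 figure and Fiji's 512 GB/s HBM bandwidth are the commonly quoted numbers, not measurements of mine):

```python
# Rough bandwidth gap motivating a faster-than-PCIe GPU interconnect.
PCIE3_PER_LANE_GBPS = 0.985                 # effective GB/s per PCIe 3.0 lane

pcie3_x16_gbps = 16 * PCIE3_PER_LANE_GBPS   # ~15.8 GB/s for a full x16 slot
fiji_hbm_gbps = 512.0                       # Fury X local HBM bandwidth

# Local memory is roughly 32x faster than the slot, which is why
# transparent remote-GPU memory access needs a much fatter link.
gap = fiji_hbm_gbps / pcie3_x16_gbps
```

That ~32x gap is the problem any "looks like one GPU" interconnect has to close.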


----------



## Minotaurtoo

If they do that, I suppose it would essentially be like having two GPUs acting as one single GPU... if there were no need for Crossfire software support and the two GPUs showed up as one, then games would most likely make better use of the two. But call me skeptical; I'd have to see it to believe it... it would be great if it worked that way.


----------



## MrKoala

Quote:


> Originally Posted by *Minotaurtoo*
> 
> If they do that I suppose it would be essentially like having two gpu's acting as one single gpu then... if there was no need for crossfire software support and the two gpu's showed as one then most likely games would make better use of the two, but call me skeptical, I'd have to see it to believe it.... would be great if it did that.


Yes, it will (should) be exposed to the application as one device, and legacy software will just run that way. Programs written with it in mind will still be able to detect and adapt to the multi-GPU configuration through some API extension. It's similar to how consumer multi-threaded software scales across cores on different sockets/clusters with no modification when running on a multi-socket server, while software written with such a platform in mind detects the CPU topology for better optimization.

What they build for the GPU setup should be something similar to the HyperTransport/InfinityFabric interconnect used on Opterons (and now consumer Ryzen), just scaled for higher bandwidth. I think there's zero doubt they can get it working. Whether it works well is anyone's guess.


----------



## Alastair

DirectX 12 is supposed to be doing that for us anyway. From what I remember, DX12 is supposed to be able to see two cards as one unit, regardless of whether it's two GPUs on a single PCB or two separate cards.


----------



## Ceadderman

Quote:


> Originally Posted by *Alastair*
> 
> Direct X 12 is supposed to be doing that for us anyway. From what I remember DX12 is supposed to see two cards as one unit. Regardless if its 2GPU's on a single PCB or two separate cards.


Not everyone is on Win10, nor wants to be. So DX12 is fine on that OS, but for Win8.1 and earlier it's a non-starter.

I myself am on Win7 Ultimate 64-bit and won't be going to Win10. Until someone breaks DX12 out of the Win10 shell, DX12 won't be available to those of us who run previous Windows releases.

~Ceadder


----------



## rubenlol2

Vulkan has shown how good it is with DOOM; DX12 has yet to show such a change.
Of course it's mostly up to the devs with these newer APIs, but I haven't seen any game run as well as DOOM does, both GPU-wise and CPU-wise.
I was blown away by how well DOOM ran under Vulkan with a 1.5GHz 12-core Broadwell-EP engineering sample: not a single core was ever pinned at 100%, and I noticed no difference between it and an overclocked 5820K (4.4GHz) with an R9 Fury.
Literally every other game I tried would choke on the pathetic single-threaded performance, but not DOOM.


----------



## Alastair

I can't seem to FORCE Crossfire on for games that don't have a profile any more. Can anyone confirm?


----------



## Skyl3r

I'm thinking it's possible to see an AMD-only machine lead the Ryzen leaderboards on 3DMark.
Two Fury Xs and a Pro Duo would beat out every other score in the Ryzen section. I don't think Nvidia has a way of getting three-plus GPUs onto any of the Ryzen motherboards yet, so this could be the chance to see AMD at the top.

Might have to get a Pro Duo...


----------



## faizreds

Is R9 Fury a good card for 1440p resolution? Is it a good upgrade from r9 290?


----------



## rubenlol2

The Fury can play 1440p fine, and it's a fun card to tweak.
Might not be worth it over a 290 though.


----------



## sydefekt

Quote:


> Originally Posted by *faizreds*
> 
> Is R9 Fury a good card for 1440p resolution? Is it a good upgrade from r9 290?


Yes. The Fury and Fury X are currently the best 1440p-and-above consumer GPUs in the AMD lineup. I think it's a good upgrade from a 290 if you are targeting a specific higher FPS to better match your monitor. The drivers are also mature and stable, so performance has kept improving. The price can't be beat either: you can buy brand new in the $250+ range, or under $200 used. I just bought a second Fury Tri-X on eBay for $174.


----------



## 99belle99

My last card was an R9 290; I now have an R9 Fury X. I was just glad I didn't have to put up with loud fan noise any more, as I had the 290 overclocked to its limits with a custom fan profile that kept the card in the 80s (°C). Bear in mind that card runs at 90+ at stock, so you can nearly guess how loud the fans were with it overclocked and all.


----------



## LionS7

Quote:


> Originally Posted by *faizreds*
> 
> Is R9 Fury a good card for 1440p resolution? Is it a good upgrade from r9 290?


Well, gamegpu.ru's review of 15 games from 2016 shows a 59% gain at 2560x1440 for the R9 Fury X vs the R9 290 (reference cards). I upgraded from an R9 290. The only thing I hate about Fiji chips is the bad coil whine! I owned 2 reference R9 290s with no coil whine, so... but do the upgrade if you find an R9 Fury X at a good price.
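To make that 59% concrete: multiply whatever the 290 does in a given title by ~1.59 for a ballpark Fury X figure (a rough sketch; the scaling factor is just that review's 1440p average, so individual games will land above or below it):

```python
def projected_fury_x_fps(r9_290_fps, avg_gain=0.59):
    """Project a Fury X frame rate from a measured R9 290 one,
    using the ~59% average 1440p gain quoted above."""
    return r9_290_fps * (1.0 + avg_gain)

# A title where the 290 manages 45 fps at 1440p projects to ~71.5 fps.
```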


----------



## faizreds

Thanks for the advice. Will try to find a good deal on used fury first.


----------



## Alastair

I have posted this around in a few places. Looking for some advice.

Guys, in my current PC I have my 360mm rad set up with 3 JetFlo 120s pushing out and two Storm Force 200s pulling out. Would I see a benefit if I switched the two 200s out for another three JetFlos, or would it be marginal at best?


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> @gupsterg You got a RyZen platform? How is it doing with FuryX? I am tempted to go AMD once their Napless X399 platform drops this summer. 16core32threads sound sexy as hell


Hi ya mate,

Yep went Ryzen, 8C/16T is big step for me TBH.

I pre-ordered 1st March, but the mobo didn't come till 10th March. My Trident Z 3200MHz C14 were DOA, so to get the rig going I had to go with some slower stuff locally available. Amazon ran out of Tri-Z stock, so the RMA is with the manufacturer.

Bear in mind on Ryzen, RAM effective speed / 2 = NorthBridge speed, aka the "Data Fabric". The Stilt has highlighted that the "abnormal" performance gains from increased RAM speed are down to the DF clocking higher rather than the RAM itself. So my benches may improve later; also bear in mind the i5/Z97 is "polished", while R7/X370 is "immature".
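That relation is trivial but worth keeping on hand when comparing kits (a one-line sketch; the 2:1 ratio is Zen 1's fixed DRAM-to-fabric coupling, ignoring any dividers a later AGESA might add):

```python
def data_fabric_clock_mhz(ram_effective_mts):
    """Zen 1's Data Fabric ("NorthBridge") runs at half the effective DRAM
    transfer rate, i.e. at the actual DRAM clock (DDR = double data rate)."""
    return ram_effective_mts / 2

# A 2400MT/s kit -> 1200MHz fabric; a 3200MT/s kit would lift it to 1600MHz.
```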

3x R7 1700 @ CPU: 3.8GHz Cache: 3.8GHz DF: 1200MHz RAM: 2400MHz C14

3x i5 4690K @ CPU: 4.9GHz Cache:4.4GHz RAM: 2400MHz C11

F@H PPD on the R7 is vastly better than on the i5. Power consumption seems very good for an 8C/16T CPU IMO.

In 3DM FSE the gap in the combined test closes compared with FS; GS/GT1/GT2 remain within run-to-run variance and show a similar gap in both cases.

FSE R7 1700 vs i5 4690K

FS R7 1700 vs i5 4690K

At the mo I'm on Win 7; the above results are all R7 on Win 7 and i5 on Win 10, and I will be doing some Win 10 testing soon. From a post The Stilt made in his "Ryzen: Strictly technical" thread, it seemed to me Win 7 was the better route to go with Ryzen.

Mainly I've just been getting the OC stability-tested, as I did with my i5 setup, before getting into bench mode, so that I can say the results have no variance due to an error in setup.



Spoiler: Warning: Spoiler!






Spoiler: Setup of OC



i) measure voltages for SB, 1.8V PLL, VDDP, DRAM, NB SOC on DMM when CPU at stock, loaded with x264.

ii) fix all measured voltages in UEFI manually to gain same as testing in step i

iii) alter PState 0 for 3800MHz, set CPU voltage offset to +137.5mV so final VCORE measured on DMM was ~1.350V when under load. I also set LLC LVL1 as wanted to make sure if left on "Auto" UEFI "Auto Rules" were not adjusting it to higher levels.

iv) Global C-State Control [Enabled] in AMD CBS menu as wanted to make sure if left on "Auto" UEFI "Auto Rules" were not changing it.





Spoiler: x264 48 loops pass, ~5.25hrs.









Spoiler: Y-Cruncher 6hrs pass, all tests except FFT (Disabled)









Spoiler: x264 / Y-Cruncher HWiNFO logs



3.8GHz_OC_x264_Y-Crun.zip 1822k .zip file






Spoiler: 3DM FSE set to combined test loop only, ~1.25hrs.



The attached 3DM save file should, IIRC, open on another's system for viewing.





3DM_FSE_CT_Log.zip 423k .zip file






Spoiler: SWBF HML file



Next I ran SWBF, 1440p "Ultra" preset, with the Crimson driver set to FRTC: 89FPS, FreeSync: On, Power Efficiency: On. With these settings the GPU will not stick to max clock in game, but I don't find performance lacking or see any stutter/issues. Once stability testing of the CPU OC finishes I'll revert to my Fury X OC ROM at 1145/545. HML file attached with all the monitoring data that MSI AB supports.

SWBF.zip 81k .zip file






Spoiler: F@H ~20.5hrs







Total rig uptime since last reboot: ~35hrs: ~5.25hrs x264 > ~1hr idle > ~6hrs Y-Cruncher > 3DM FSE CT loop ~1.25hrs > SWBF 0.25hrs > F@H 20.5hrs.

Once F@H finishes I will be checking the log for when tCTL reached >~76°C and how many times it happened, as looking at screenshots of its first occurrence (~9hrs into F@H), the fans on the front intake/CPU did not ramp up above ~750 RPM when they should have. Viewing x264 tCTL over 48 loops: MAX ~70°C, AVE ~61°C; and Y-Cruncher over 6hrs: MAX ~78°C, AVE ~70°C; in those runs fan ramping did occur, to ~1200 RPM.
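The log check I have in mind is basically the following (a hypothetical helper, not anything shipped with HWiNFO; it assumes the tCTL column has already been pulled out of the CSV into a list of floats):

```python
def count_tctl_excursions(tctl_readings, threshold=76.0):
    """Count distinct excursions above the threshold: one excursion starts
    when a reading crosses above it and ends once readings drop back to
    or below it, so a sustained spike counts once rather than per sample."""
    excursions, above = 0, False
    for t in tctl_readings:
        if t > threshold and not above:
            excursions += 1
            above = True
        elif t <= threshold:
            above = False
    return excursions

# e.g. [70.5, 75.0, 77.2, 78.1, 74.0, 76.5, 73.0] -> 2 excursions
```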

I will be repeating the above tests, one at a time, leisurely overnight, in preparation for when the other RAM is back for some tweaking on that element of the OC.



The few hours I have played were SWBF / Crysis 3, and the games feel "smoother". I did wonder if this was a placebo effect of the new setup, but other owners think the same. It's like FreeSync+ IMO, even if some of the data I viewed for frame times/fps doesn't show this "smoothness", if you get what I mean.


----------



## LazarusIV

Quote:


> Originally Posted by *Alastair*
> 
> I have posted this around in a few places. Looking for some advice.
> 
> Guys in my current pc I have my 360mm rad set up with 3 Jetflo 120's pushing out and two Storm Force 200's pulling out. Would I see a benefit if I switched out the two 200's for another three Jetflo's? Or would it be marginal at best?


My answer remains the same as in the 8350 thread!

So I got my Ryzen Win7 platform all updated, check the sig rig. I played BF4 with CPU and GPU at stock and, good gravy, what a beastly setup; I'm very happy! I was able to play Ultra settings at 2560x1440, easily staying above 60fps, even the minimums. Really sweet, exceedingly smooth gameplay. I have yet to OC my monitor; I was able to get to 110Hz before, but I'll probably just do 96Hz, as that seems to be a sweet spot for my monitor. Will report again!

Also played these games with fantastic results:
Diablo III
Heroes of the Storm
Grim Dawn


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> Yep went Ryzen, 8C/16T is big step for me TBH.
> 
> *snip*


I'm thinking of skipping the first gen of Zen... not sure it will really be worth getting a new motherboard and RAM to get the CPU... maybe I'm wrong, but here are my reasons:

1. I game at 4K and am highly GPU-bound atm... still, I only aim for 60fps.

2. My 9590 easily hits 5GHz at stock base-clock voltage... runs nice and cool with my custom loop.

3. As already mentioned, I'd have to buy a new motherboard and new RAM... currently my RAM is running at 2400MHz and my CPU's NB easily clocks up to 2700MHz, so even the dismal RAM performance of Piledriver isn't really killing me.

Right now I'm looking at the Zen 1800X, and it offers 120% to 200% of the performance I'm getting now, with Cinebench and 3DMark showing the biggest increases...

The question I have is simple: should I go ahead and drop the cash (which I don't have right now), or go with my gut, skip a generation, and put my savings into another Fury X or a new GPU, keeping in mind I am aiming for 4K?


----------



## domrockt

In my opinion (and I'm no fanboy), AMD did not do a good job with Ryzen; they do well with GPUs. I love my Fury, but I would not recommend an AMD mobo-and-CPU setup to anybody. If you do drop money on a new mobo and CPU combo, consider Haswell-E and X99 (high core counts for the future); they are quite cheap on the second-hand market.

But definitely skip Ryzen gen 1, and sell your Fury and put the saved money into Vega.


----------



## Ceadderman

I dropped the credit on Ryzen and a 480. IMHO it is worth the cash. Sure, I still have to get the board and the RAM, but if you piecemeal it, it takes some of the sting out.

I say do it and don't look back. Just hold onto that GPU, as you will need it to keep Micro$haft from locking down your updates. Those d#@ks are lying to Ryzen owners to get them to adopt Win10 for absolutely no reason; Ryzen is an x86 architecture after all, and Win7 supports that, so Ryzen should still be supported for auto-updates. Hopefully one of my watercooled 6870s will keep the hounds at bay.

~Ceadder


----------



## battleaxe

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I'm thinking of skipping the first gen of zen.... not sure it will really be worth getting new motherboard and ram to get the cpu.... maybe I'm wrong, but here are my reasons
> 
> 1. I game at 4k and a highly gpu bound atm... still I only aim for 60fps
> 
> 2. My 9590 easily hits 5 ghz with stock base clock voltage... runs nice and cool with my custom loop
> 
> 3. As already mentioned, I'd have to buy new motherboard and new ram... currently my ram is running 2400mhz and my cpu's nb easily clocks up to 2700mhz so even the dismal ram perf of piledriver isn't really killing me.
> 
> Right now I'm looking at the zen 1800x and it offers 120% to 200% of the performance I'm getting now with cinebench and 3dmark being the biggest increases...
> 
> question I have... simple: Should I go ahead and drop the cash (I don't have right now) or go with my gut and skip a generation and put my savings into another fury x or upgrade to new gpu keeping in mind I am aiming for 4k?


I'm actually waiting a bit too. I think Ryzen is great and I can't wait to use it in my rendering rig. I will also use it for 4K gaming, as you do. I think the performance is pretty fantastic overall, and I don't really care that it's a 'hair' less FPS-crazy on a 1080p setup; I don't run 1080p, nor will I in the future. It's a fantastic bargain really, even as is.

But my guess is that it will get better still, and the boards will improve too, so I plan to wait and get one that is actually up to date and has all the improvements I'm after. Should be about 12 months or less if my guess is correct. I can wait that long.


----------



## diggiddi

Quote:


> Originally Posted by *Ceadderman*
> 
> I dropped the credit on Ryzen and a 480. IMHO, it is worth the cash for it. Sure I still have to get the board and the RAM, but if you piecemeal it, it takes some of the sting out.
> 
> I say do it and don't look back. *Just hold onto that GPU as you will need it to keep Micro$haft from locking down your updates*. Those d#@ks are lying to Ryzen owners to get them to adopt Win10 for absolutely no reason Ryzen is an x86 architecture after all and Win7 supports that. So Ryzen should still be supported for autoupdates. Hopefully one of my watercooled 6870s will keep the hounds at bay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Huh! Whaddaya mean? How is the GPU going to prevent this?


----------



## gupsterg

@Minotaurtoo

I jumped to Ryzen as:

i) I can offload my i5/Z97 without making a loss, due to savvy buying.

ii) It seemed a better bang-for-$ route than Intel.

Perhaps if I had an i7 I may not have gone for the R7. If I had paid for an i7 4790K when I built my Z97 rig, I reckon at the R7 launch I would have thought: what? AMD offering 8C/16T for a similar price to what an i7 4790K 4C/8T would have cost me then.

I'm amazed at some of the power readings at the wall. Having just finalized an OC this week, I really plan to start using it and pulling together some data on benches, power use, etc. Yeah, there are "teething" issues with the UEFI of the mobo and the SMU FW of Ryzen; hopefully they get sussed in the coming weeks.

@domrockt

Yeah, I may or may not go Vega. If an opportunity comes along like the scenario above, then yeah, I will go Vega. I hope it has an AIO like the Fury X.


----------



## ht_addict

Anyone know the warranty on the Sapphire Fury X? I'm hoping it's 2yrs, as one of my cards went belly up and I need to send it in.


----------



## gupsterg

Their site used to say 2yrs in the warranty section, then it changed to stating "country dependent". I have not checked recently; also, a mod on the Sapphire forum said "country dependent" a few months ago.


----------



## huzzug

I hate when things go country-dependent. The local seller here does not honour the warranty. Sapphire, Asus, Gigabyte etc. used to offer 5-year warranties, because of which the local seller offered only a nominal limited warranty for a year. Two years back this cost me the warranty on my Z68 after the board started doing belly flops with the processor: Asus denied the RMA because they had changed their warranty page online from 5 years to 2 and said I was out of warranty. This is a headache and has pushed me to go used with my system; if I'm not offered a seatbelt, I might as well drive blind.


----------



## gupsterg

Yeah, sucks.

I've switched to buying entirely from Amazon UK now, as manufacturer warranties pretty much suck and some etailers' policies suck.

For example, I bought an MG279Q and went through 3 of them. On the 3rd one I thought I'd ring Asus UK to highlight that QC is poor and explain that the FW has an issue: the MG279Q will not display an image from my Sapphire HD5850, whereas an Eizo FG2421 / Dell U2515H / Panasonic PDP & LG PDP have no issues displaying an image. Their basic stance was "we will take your feedback and report it to a department". I explained I didn't think my 3rd screen was up to scratch, with some BLB plus dead pixels. They approved the RMA but said I might get a refurb and it might have cosmetic marks. I thought: I pay ~£400 for a screen and you say this! They said RMA with Amazon instead, as then I would get new.

I had a somewhat pants experience with a warranty claim with Eizo as well, and Dell has a similar you-get-a-refurb policy. To me, warranties are pretty much worthless across the industry.

Then you have some etailers here that charge a usage fee of sorts: if your item goes wrong in x months, they'll knock x value off the purchase price. At least with Amazon the 1st year is a full refund in £, with no sodding testing fees either, which I've seen some charge.

Amazon has been so good TBH. When they had no stock of, say, the Eizo, or recently some G.Skill RAM I bought that had an issue, they paid the courier fees for me to send it to the manufacturer. They truly go above and beyond in my experience.


----------



## Ceadderman

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> I dropped the credit on Ryzen and a 480. IMHO, it is worth the cash for it. Sure I still have to get the board and the RAM, but if you piecemeal it, it takes some of the sting out.
> 
> I say do it and don't look back. *Just hold onto that GPU as you will need it to keep Micro$haft from locking down your updates*. Those d#@ks are lying to Ryzen owners to get them to adopt Win10 for absolutely no reason Ryzen is an x86 architecture after all and Win7 supports that. So Ryzen should still be supported for autoupdates. Hopefully one of my watercooled 6870s will keep the hounds at bay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Huh! whaddaya mean?? How is the GPU going to prevent this?

Windows 7-compatible parts *may* prevent Micro$haft from blocking future security updates for the Ryzen and Kaby Lake platforms. Never mind that both are x86-architecture-bound anyway.

~Ceadder


----------



## diggiddi

Quote:


> Originally Posted by *Ceadderman*
> 
> Windows7 compatability parts *may* prevent Micro$haft from blocking future security updates for Ryzen and Kabylake platforms. Nevermind that both are x86 architecture bound anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Yeah, but how does the GPU stop this from happening is what I'm asking?


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> It used say 2yrs on their site in warranty section then it changed to stating country dependent. Have not checked recently, also a mod few months on the Sapphire forum said "country dependent".


They just replaced a Fury of mine that had a memory error. Purchased through Amazon; Amazon refused to help me, Sapphire did. It took a little while, the best part of two months, but it's done. So at least in South Africa we seem to be covered.


----------



## Gdourado

How is the performance of a single fury Nitro in 1080p, compared to a pair of 290x cards in crossfire?


----------



## steadly2004

Quote:


> Originally Posted by *Gdourado*
> 
> How is the performance of a single fury Nitro in 1080p, compared to a pair of 290x cards in crossfire?


290X Crossfire will be faster where Crossfire is supported; the Fury will be faster where it's not.


----------



## Gdourado

With current drivers, is micro stutter still an issue with crossfire?


----------



## huzzug

It hasn't been since the Omega drivers 3 years ago.


----------



## Ceadderman

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> I dropped the credit on Ryzen and a 480. IMHO, it is worth the cash for it. Sure I still have to get the board and the RAM, but if you piecemeal it, it takes some of the sting out.
> 
> I say do it and don't look back. *Just hold onto that GPU as you will need it to keep Micro$haft from locking down your updates*. Those d#@ks are lying to Ryzen owners to get them to adopt Win10 for absolutely no reason Ryzen is an x86 architecture after all and Win7 supports that. So Ryzen should still be supported for autoupdates. Hopefully one of my watercooled 6870s will keep the hounds at bay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Huh! whaddaya mean?? How is the GPU going to prevent this?

Old hardware being Win7-capable should keep the door open for Win7 updates.

~Ceadder


----------



## Alastair

How good are the Firestrike stress tests vs Heaven? I seem to be pulling 950W at the wall in 3DMark vs 850-930W (it varies a lot) in Heaven.


----------



## HagbardCeline

I've finally saved up some extra cash to upgrade the 1080p monitors I've been using. What is the best ultrawide that supports AMD FreeSync? I've been studying reviews on six different sites, and it's frustrating that people seem to have no consistent agreement. Is QC that bad with these large IPS panels? Or is the price so high that people are just extra picky? Thanks!


----------



## diggiddi

Quote:


> Originally Posted by *HagbardCeline*
> 
> I've finally saved up some extra cash to upgrade the 1080 monitors I've been using. What is the best ultradwide that supports the AMD freesync? I've been studying reviews on six different sites and it's frustrating that people seem to have no consistent agreement. Is QC that bad with these large IPS panels? Or is the price so high that people are just extra picky? Thanks!


Best place to get your answer
http://www.overclock.net/t/1541528/official-21-9-owners-appreciation-thread-post-anything-related-to-21-9/980_20#post_25937473


----------



## Alastair

Quote:


> Originally Posted by *HagbardCeline*
> 
> I've finally saved up some extra cash to upgrade the 1080 monitors I've been using. What is the best ultradwide that supports the AMD freesync? I've been studying reviews on six different sites and it's frustrating that people seem to have no consistent agreement. Is QC that bad with these large IPS panels? Or is the price so high that people are just extra picky? Thanks!


Yes, the QC is that bad. You drop 1K on a CF791 only to have possible quality-control issues like BLB and dust and hair between the panel and the glass, and also a fold down the center of the screen on some models.

A lot of people agree that the Microboard M340CLZ and Crossover 34U100 are some of the best budget high-res FreeSync ultrawides available: a Samsung PVA panel at 100Hz @ 3440x1440. Of course, as with most Korean monitors, you have to deal with the panel lottery on top of the QC lottery, dead pixels and the like, as these are second-rate factory-reject panels from Samsung.

If you don't want to play the panel lottery, the Asus MX34VQ is probably the best bet I would say: the same 100Hz panel as the Korean screens above, but these aren't reject panels. The only disadvantage of the Asus is the lack of VESA mount support, but a VESA mount can be fairly easily modded on from what I have seen.

If you're wanting something IPS, the Acer XR341CK gets an honorable mention; it's also 3440x1440, at 75Hz. There is also the XR342CK, similar to its older brother in most aspects except it isn't FreeSync-certified. It still has adaptive sync, but its range is a bit lower than its older brother's.

I haven't been looking at the 1080p ultrawide monitors; I feel 1080p is a peasant resolution for a Fury.


----------



## h2323

I'm finding some good deals on Sapphire Nitro R9 Fury cards. It makes me want to switch cards, as I want to stay Radeon and also run a 4K FreeSync display. I am currently running an MSI Gaming 390X 8GB. All the reviews have the Fury out ahead, especially as resolution goes up, and they also have it pulling fewer watts and running quieter and cooler. I found one for $375 Canadian and can get nearly that for my 390X. Thanks for any thoughts.


----------



## Alastair

Quote:


> Originally Posted by *h2323*
> 
> Finding some good deals on Sapphire Nitro R9 FURY cards. Makes me want to switch up cards as I want to stay Radeon and also run 4k freesync display. I am currently running a 390X 8GB MSI Gaming. All the reviews have the fury out ahead especially as resolution goes up. All the reviews also have this card pulling less watts and running quieter and cooler. Found it for $375 Canadian and I can nearly get that for my 390x. Thanks for any theory's.


Well if the cost to move is minimal then I say go ahead. Nothing wrong with extra performance at a great price.


----------



## Tcoppock

So what's the average memory OC on a Fury? Isn't it supposed to artifact if OC'd too high?


----------



## domrockt

The average is 550-600MHz; if that 650MHz is stable, then yours is top notch.


----------



## rubenlol2

One thing to note is that after the ReLive update, OCing the memory via software can be tricky: it might display a higher frequency than the card actually runs at.
That is, it may report what you set it to without actually applying it.


----------



## ramos29

Has anyone tried to OC their monitor? I have a 28" 4K 60Hz monitor, and the lag/input lag is too damn high for me; some games run better when I disable vsync (like BF4), some don't.
Borderless gaming does not work well with every game, as it's not friendly with Crossfire. CCC shows me that my max refresh rate is 75, but only lets me choose 60Hz; does this mean I can make a custom resolution of 4K at 75Hz? I am away from home, which is why I can't verify it myself. Any advice/help will be more than welcome.
PS: I am not willing to get a FreeSync/G-Sync monitor.


----------



## steadly2004

Quote:


> Originally Posted by *ramos29*
> 
> any one tried to oc his/her monitor? i have a 28" 4k monitor 60hz, the lag/input lag is too dam high for me, some games run better when i disale the vsync : like bf4, some other no
> borderless gaming does not work fine with all game as its not freindly with crossfire . the CCC shows me that my max refresh rate is 75, but only lets me choose 60hz, does this mean i can make a custom resolution 4k 75hz? i am away from home that s why i cant verify it by myself, any advise/help will be more than welcome
> ps: i am not willing to get a freesync/gsync monitor


I personally was unable to overclock my 4k monitor comma even 1hz. Even tried overclocking it with a lower resolution comma and was unsuccessful. I think my Qnix reported a higher Max refresh rate somewhere comma but I was never able to get to work past 60 hertz


----------



## ramos29

Quote:


> Originally Posted by *steadly2004*
> 
> I personally was unable to overclock my 4k monitor comma even 1hz. Even tried overclocking it with a lower resolution comma and was unsuccessful. I think my Qnix reported a higher Max refresh rate somewhere comma but I was never able to get to work past 60 hertz


Did you get black screens/lines/artifacts?


----------



## Ceadderman

Quote:


> Originally Posted by *steadly2004*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ramos29*
> 
> any one tried to oc his/her monitor? i have a 28" 4k monitor 60hz, the lag/input lag is too dam high for me, some games run better when i disale the vsync : like bf4, some other no
> borderless gaming does not work fine with all game as its not freindly with crossfire . the CCC shows me that my max refresh rate is 75, but only lets me choose 60hz, does this mean i can make a custom resolution 4k 75hz? i am away from home that s why i cant verify it by myself, any advise/help will be more than welcome
> ps: i am not willing to get a freesync/gsync monitor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I personally was unable to overclock my 4k monitor comma even 1hz. Even tried overclocking it with a lower resolution comma and was unsuccessful. I think my Qnix reported a higher Max refresh rate somewhere comma but I was never able to get to work past 60 hertz

Posting and driving is a dangerous endeavor. Glad to see you using Google speak.

For a minute there I thought there was some new software called "comma".

~Ceadder


----------



## bluezone

Apparently AdoredTV is stirring the pot again (in a good way) over Ryzen benchmarks not using RTG GPUs.

Ryzen of the Tomb Raider -

Interesting results.

Go team Ryzen.

What do you think, Gupsterg?


----------



## Silent Scone

preteh lawh.


----------



## PontiacGTX

Are there any owners with an air-cooled card from Sapphire?


----------



## sydefekt

Quote:


> Originally Posted by *PontiacGTX*
> 
> is there any owner with air cooling from Sapphire ?


Yes. What's up?


----------



## PontiacGTX

I am wondering what the temps of the card are under load in a closed case.


----------



## miklkit

Here is what mine is doing in an air-cooled case. All I did was give it a slightly more aggressive fan profile in Afterburner. The game is The Witcher 3. Only the right-hand column is relevant to this subject.


----------



## looncraz

Quote:


> Originally Posted by *PontiacGTX*
> 
> I am wondering how are the temps of the card under load on a closed case


My Sapphire Nitro R9 Fury OC can hit > 75C in my case pretty easily, despite three Noctua Industrial fans feeding fresh air into the case and case temps being used to control those fans with a thermal probe.

This is entirely because of the default fan curve. I usually leave this card running with just a 900MHz frequency at 1.2V - and despite pulling ~50W less power, it still hits the same temperature (but it does take longer).


----------



## Sleazybigfoot

Quote:


> Originally Posted by *PontiacGTX*
> 
> I am wondering how are the temps of the card under load on a closed case


Was playing Rainbow Six Siege yesterday for 3+ hours straight. Running at 1100Mhz with a bit more aggressive fan profile. Never passed 50 Celsius. Ambient probably around 17 Celsius or something.



Running my 5820k at 1.175v 4.4Ghz as well cooled with a Noctua NH-U14s or something and the 3 default case fans that come with the Corsair Obsidian 700D.

Sapphire R9 Fury NITRO version

EDIT

I do want to add: this was during the evening and it was quite cool outside. I just tried again with the sun shining directly on my windows and the windows closed.
After a round of Terrorist Hunt I hit 51 Celsius (and very briefly 52 Celsius). Ambient is currently probably something like 24 Celsius.

So to be safe, I'll say mine will very rarely get hotter than 60 Celsius.
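Incidentally, runs at different ambients are easiest to compare as delta-over-ambient, since that's the figure that carries between rooms and seasons. A trivial sketch using the numbers from this post:

```python
def delta_over_ambient(gpu_c, ambient_c):
    """Temperature rise above room temperature -- the comparable figure."""
    return gpu_c - ambient_c

# (GPU temp, ambient) pairs from the two sessions described above
runs = [(50, 17), (52, 24)]
for gpu, amb in runs:
    print(f"{gpu}C at {amb}C ambient -> delta {delta_over_ambient(gpu, amb)}C")
```

The cooler evening run actually has the larger delta, so the warmer-room result is, if anything, the better one.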


----------



## PontiacGTX

How much was the ambient temperature?


----------



## loktite78

Quote:


> Originally Posted by *miklkit*
> 
> Here is what mine is doing in an air cooled case. All I did was give it a little more aggressive fan profile in Afterburner. The game is The Witcher 3. Only the right hand column is relevant to this subject.


Why would you be using such old drivers? Are they more stable for the Fury?


----------



## LazarusIV

I sent a PM but I'll post up here. Sig rig info is up to date, highest temps I've seen on my NITRO+ is about 60°C to maybe at highest 67°C. Games I've been playing lately are Diablo III, HotS, Planetside 2, Grim Dawn, BF4, Titanfall 2 and I think that's about it. I don't have a lot of time to game but I gotta say this Ryzen / Fury combo is just fantastic.


----------



## steadly2004

Quote:


> Originally Posted by *Ceadderman*
> 
> Posting and driving is a dangerous endeavor. Glad to see you using Google speak.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For a minute there I thought there was some new software called "comma".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


Hahahaha, yea. I definitely don't type while driving but I still read and reply. LOL my bad. I didn't read my post well before clicking reply.
Quote:


> Originally Posted by *ramos29*
> 
> you got black screen/lines/artefact?


Just a black screen, and I have to wait for the timer to revert the settings.


----------



## miklkit

Quote:


> Originally Posted by *loktite78*
> 
> Why would you be using such old drivers? Are they more stable for the Fury?


Those screenies are from last year. It's currently on the 17.2.1 drivers with no noticeable change.


----------



## u3a6

Quote:


> Originally Posted by *PontiacGTX*
> 
> I am wondering how are the temps of the card under load on a closed case


Quote:


> Originally Posted by *looncraz*
> 
> My Sapphire Nitro R9 Fury OC can hit > 75C in my case pretty easily, despite three Noctua Industrial fans feeding fresh air into the case and case temps being used to control those fans with a thermal probe.
> 
> This is entirely because of the default fan curve. I usually leave this card running with just a 900MHz frequency at 1.2V - and despite pulling ~50W less power, it still hits the same temperature (but it does take longer).


As far as I know, the card adjusts the fan speed to hit whatever target temp is set in the BIOS. That is why you get the same temperature despite lowering the voltage and clock. I bet the fans spin at a lower speed when the vcore and core clock are reduced.
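The behaviour described here (same steady-state temperature, slower fans at lower power) is exactly what a target-temperature controller produces. A toy sketch; the thermal model, gains, and base speed are invented for illustration and are in no way AMD's actual firmware:

```python
def fan_speed(temp_c, target_c=75, base_pct=20, gain=5.0):
    """Proportional response: ramp the fan above base speed as temp exceeds target."""
    return max(base_pct, min(100, base_pct + gain * (temp_c - target_c)))

def simulate(power_w, steps=200):
    """Crude equilibrium model: heat input vs. fan-dependent cooling."""
    temp = 30.0
    for _ in range(steps):
        cooling = 0.05 * fan_speed(temp) * (temp - 25)  # cooling scales with fan and delta-T
        temp += 0.05 * (power_w - cooling)
    return temp

# A lower-power (underclocked) card settles near the same temperature,
# just with the fans spinning slower -- matching the observation above.
print(round(simulate(150), 1), round(simulate(200), 1))
```

Both loads converge within a few degrees of each other because the controller spends fan speed, not temperature, to absorb the extra heat.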


----------



## looncraz

Quote:


> Originally Posted by *u3a6*
> 
> As far as I know, the card adjusts the fan speed to get whatever the target temp is set on the bios. That is why despite lowering the voltage and clock you get the same temperature. I bet that the fans are spinning at lower speed when the vcore and core clock are reduced.


You are correct. The default fan curve is a dynamic curve created by the driver based on a target temperature.

I modified that base fan speed to be 33%, though, since it's still dead silent through my case at that speed.


----------



## loktite78

Quote:


> Originally Posted by *miklkit*
> 
> Those screenies are from last year. It's currently on the 17.2.1 drivers with no noticeable change.


good to know, thanx.


----------



## faizreds

What is the difference between the Nitro R9 Fury and the Tri-X R9 Fury?


----------



## sydefekt

Quote:


> Originally Posted by *faizreds*
> 
> What is the difference between Nitro R9 Fury and Tri-X R9 Fury ?


Tri-X is Sapphire's first version of the Fury. It has the Tri-X orange cooler and uses the standard PCB from AMD. Some of these early versions can also be BIOS-unlocked to a full Fury X. EK water blocks will fit the Tri-X.

The Nitro is a newer version with the black-and-white Nitro cooler. I believe it has a custom PCB from Sapphire, so power delivery and OC may be better; however, it cannot be unlocked, and the EK Fury water block does not fit.


----------



## u3a6

Quote:


> Originally Posted by *faizreds*
> 
> What is the difference between Nitro R9 Fury and Tri-X R9 Fury ?


AFAIK, the Nitro is a custom PCB and the Tri-X is built on the Fury X reference PCB.


----------



## ramos29

My Sapphire Nitro OC reaches 70° quite easily, and sometimes goes up to 80-81°.


----------



## ramos29

Quote:


> Originally Posted by *steadly2004*
> 
> Just black screen and I have to wait for the timer to revert settings..


The AMD drivers are blocking the resolution/refresh rate override.
17.2.1-17.3.3 have an issue with EDID overrides not loading on startup.

Workaround: toggling GPU scaling will load the EDID override, but this has to be done after every reboot.

Now I can get my 4K screen from 60 to 65Hz at 4K. At a lower resolution (between 1440p and 2160p) I can hit 75Hz, but at 75Hz I think I am getting frame skipping.
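One objective way to check for frame skipping is to look at the gaps between frame timestamps from a capture/benchmark tool: a gap of roughly two refresh periods means a frame was dropped. A sketch with synthetic timestamps (real ones would come from whatever tool you use):

```python
def skipped_frames(timestamps_ms, refresh_hz):
    """Count intervals that span roughly two or more refresh periods."""
    period = 1000.0 / refresh_hz
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return sum(1 for g in gaps if g > 1.5 * period)

smooth = [i * (1000 / 75) for i in range(10)]               # clean 75Hz pacing
skippy = smooth[:5] + [t + 1000 / 75 for t in smooth[5:]]   # one dropped frame
print(skipped_frames(smooth, 75), skipped_frames(skippy, 75))
```

If the count stays at zero during a capture at 75Hz, the panel is genuinely delivering the overclocked rate.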


----------



## PontiacGTX

Quote:


> Originally Posted by *ramos29*
> 
> -my nitro saphire oc reaches 70 quit easily, and some times goes up to 80 81°


What is your ambient temperature? And is this under benchmarks or games?


----------



## ramos29

Quote:


> Originally Posted by *PontiacGTX*
> 
> what is your ambient temperature? and is this under benchmarks or games?


Ambient temp 15-20. I put two 12cm fans in front of each GPU (4 in total). I wonder how hot those cards will get in the middle of summer.
Sometimes GPU 1 is 60 and the second one 71; sometimes it is the opposite.
Those temps are in-game at max load.


----------



## Alastair

I am really not sure about these temps on my 640mm loop. 28C ambient. All fans maxed at around 2200RPM.
1100MHz both cards at 1.2V
CPU @ 4.9
Firestrike Ultra stress test.


----------



## PontiacGTX

Quote:


> Originally Posted by *ramos29*
> 
> ambiant temp 15-20, i put 2 12cm fans in front of each gpu ( 4 in total) , i wounder in the middle of the summer how hot will those cards will be
> some times gpu 1 is 60 the second one 71
> some times it is the opposite
> those temp are ingame at max load


It seems that these cards run hotter than the ones from the other owners. Probably your ambient temp is higher.


----------



## ramos29

Quote:


> Originally Posted by *PontiacGTX*
> 
> it seems that these card run hotter than th ones form the other owners probably your ambient tmep is higher


Maybe I should play with the fan speed curve, but this will surely result in louder fan noise.
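A custom fan curve of the kind Afterburner/Trixx apply is just piecewise-linear interpolation between (temperature, fan%) points. A sketch with made-up points, not anyone's actual profile:

```python
# Illustrative curve: (temperature C, fan %) control points
CURVE = [(30, 20), (50, 30), (65, 45), (75, 70), (85, 100)]

def fan_pct(temp_c, curve=CURVE):
    """Fan duty for a temperature, interpolated along the curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_pct(70))  # halfway between 65C/45% and 75C/70% -> 57.5
```

The noise trade-off lives entirely in where you put the points: shifting them left cools harder at the cost of speed (and noise) at every temperature.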


----------



## Alastair

Quote:


> Originally Posted by *ramos29*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PontiacGTX*
> 
> it seems that these card run hotter than th ones form the other owners probably your ambient tmep is higher
> 
> 
> 
> maybe i shall play with the fan speed curve but this will surely result in louder fan noise

The fans aren't generally loud to begin with, unless you have a Nano.


----------



## ramos29

Quote:


> Originally Posted by *Alastair*
> 
> the fans aren't generally loud to begin with unless you have a nano.


The main problem is that my second GPU is too close to the PSU.
I used to live in Sweden (no heat issues then); now I've moved to Tunisia.
And why did I move here? I guess it was the weather. XD (GTA V trailer)


----------



## shadowxaero

Anyone with CrossFire Furys crashing in DX11 titles with the new 17.4.1 drivers?


----------



## Alastair

Quote:


> Originally Posted by *shadowxaero*
> 
> Anyone with crossfire furys crashing in dx11 titles with the new drivers 17.4.1?


I haven't tried yet.


----------



## shadowxaero

Quote:


> Originally Posted by *Alastair*
> 
> I haven't tried yet.


Lol well let me know.


----------



## Alastair

Quote:


> Originally Posted by *shadowxaero*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I haven't tried yet.
> 
> 
> 
> Lol well let me know.

I didn't even get that far. 17.4.1 breaks MSI Afterburner, or Afterburner breaks the driver for me. I apply my clocks and the cards get stuck in state 1. So meh. Rolled back to 17.3.1 and it's working again.

Edit: I know CrossFire isn't really working for ARK right now, flickering textures and all. But what sort of FPS are you guys getting in ARK?


----------



## shadowxaero

Quote:


> Originally Posted by *Alastair*
> 
> I didn't even get that far. 17.4.1 breaks MSI afterburner. Or afterburner breaks the driver for me. I apply my clocks and the cards get stuck in state 1. So meh. Rolled back to 17.3.1 and its working again.
> 
> Edit: I know crossfire isn't really working for ark right now flickering textures and all. But what sort of FPS are you guys getting in ark?


What kind of FPS am I trying to get? Lol, anything. As soon as I launch any DX11 title, CTD. What is strange is that DX12 titles work really great.


----------



## bluezone

Apparently NEW Relive drivers are available.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.4.1-Release-Notes.aspx

Be careful: OC software is causing BSODs with this driver, so probably uninstall or deactivate it to be safe.


----------



## diggiddi

I'm having no issues except for the regular flickering in BF4 and negative scaling in PCARS, which started before the current driver.
Strangely enough, I noticed there's no option in MSI AB to display the monitor on the game screen.


----------



## Alastair

Quote:


> Originally Posted by *bluezone*
> 
> Apparently NEW Relive drivers are available.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.4.1-Release-Notes.aspx
> 
> Be careful, OC software is causing BSODs with this driver. So probably uninstall or deactivate it to be safe.


Software OC wasn't even applying for me. It just set the cards to their lowest DPM state and that was that. I already went back to 17.1.3 and 1135/600 is back on the cards.


----------



## Alastair

Managed to spit this out today. Pretty easy OC on the GPUs. Could probably bench them higher. But this looks like the fastest dual-Fury run posted with an FX processor from my initial searches, and only about 200-300 points behind the Intels.
CPU @ 4.95/2700/3000
GPU's 1150/600
on 17.1.3


----------



## Bojamijams

I've never had any issues with Trixx and any of the 17.x drivers and my Trixx hasn't been updated since 16.11. Afterburner just looks like a finicky piece of software


----------



## Skyl3r

Quote:


> Originally Posted by *Alastair*
> 
> CPU @ 4.95/2700/3000
> GPU's 1150/600
> on 17.1.3


That's around the best I could get after fine-tuning my overclocks for quite a while.
My result says "Timing inconsistencies", so I don't know exactly what that means or whether I should trust it. But I ran it like normal and it looked normal while running.
http://www.3dmark.com/fs/11909600

My guess would be that this is about the best performance you can squeeze out of this setup without going more extreme on the cooling side.

*EDIT:*
Considering that those are Furys and not Fury Xs, that's actually quite good. Occasionally this makes me regret not just getting Furys instead


----------



## Alastair

Quote:


> Originally Posted by *Bojamijams*
> 
> I've never had any issues with Trixx and any of the 17.x drivers and my Trixx hasn't been updated since 16.11. Afterburner just looks like a finicky piece of software


17.4.1 broke all software OC including Trixx.


----------



## LionS7

With Trixx it breaks fan speed and core clock in 3D; the clock stays at 300MHz. But I don't have any problems with Afterburner 4.3.0, and I'm using a modded BIOS.


----------



## Bojamijams

Quote:


> Originally Posted by *Alastair*
> 
> 17.4.1 broke all software OC including Trixx.


Not mine. Core clock is at 1100 and it is only stable if I give a +30mV overvolt. Since my core clock is at 1100 and it is stable, I can assume my Trixx overclock is working just fine.


----------



## Alastair

Anyone else having issues with ReLive recording and the new Win 10 Creators Update? I've got no recording, no matter what combination of buttons I press.


----------



## shadowxaero

Quote:


> Originally Posted by *Alastair*
> 
> Anyone else having issues with relive recording and the new creators win 10 update? I've got no recording. Doesn't matter what combination of buttons I press.


Which drivers are you on? I personally haven't had issues recording with ReLive. I do have issues with screenshots on the Windows CU, however. This is on 17.3.3.


----------



## Alastair

Quote:


> Originally Posted by *shadowxaero*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Anyone else having issues with relive recording and the new creators win 10 update? I've got no recording. Doesn't matter what combination of buttons I press.
> 
> 
> 
> Which drivers are you on? I personally haven't had issues recorded with relive. I do have issues with screenshots on Windows CU however. This is on 17.3.3

17.3.1 here.


----------



## Thoth420

Seems the update has caused GPU issues in both camps. Most people who have solved it so far have done a clean OS reinstall, but if that is too much to bear, perhaps wait on a fix via MS or a GPU driver update.


----------



## NBrock

The Windows 10 update killed both my rigs (1070 and Fury X); it looks like it removed some important DLLs. I fixed it on both machines by running the driver installer to repair everything, then doing an uninstall and DDU, then a clean install of the drivers. Didn't need a clean OS install.


----------



## LionS7

Well, I'm uninstalling the sound and video drivers after every big MS update of Windows 10.


----------



## Himo5

I haven't done the W10 update, but I've just updated to 17.4.1 and suddenly, out of the blue, I get max fan on the Fury with just Firefox and a text editor running.


----------



## CptAsian

That's odd; I think I had the opposite of you guys. I was on an older version on my folding rig with 2 Furys, probably 17.2 or something, and I got the crazy fan activity. It would go up to 100%, then down to about 40 or so, then back up, seemingly at random. I did a clean uninstall with DDU and a fresh installation of 17.4.1, and all seems to be well.
I do, however, get odd flickering even on my desktop when directly plugged in via my DisplayPort cable, but I don't see it over the remote connection, so I assume that's something with the cable, or at least a minor issue that's not a concern for me.


----------



## Alastair

Quote:


> Originally Posted by *Himo5*
> 
> I haven't done the W10 update but I've just updated to 17.4.1 and suddenly out of the blue I get Fury max Fan with just Firefox and a text editor running.


At least you know it's running cool


----------



## xkm1948

17.4.2 is out folks.


----------



## bluezone

Quote:


> Originally Posted by *xkm1948*
> 
> 17.4.2 is out folks.


REP: +1

Unigine has released the Superposition graphics benchmark with 8K and VR tests.

https://unigine.com/en/products/benchmarks/superposition

I'm downloading this right now.


----------



## xkm1948

Don't know whether it is 17.4.2 or the Win10 Creators Update, but I am actually seeing performance improvements in most benchmarks.


----------



## shadowxaero

Superposition Benchmarks Crossfire Fury's


----------



## Alastair

Quote:


> Originally Posted by *shadowxaero*
> 
> Superposition Benchmarks Crossfire Fury's


May I ask how you got it to run with 2 cards? CrossFire is enabled on my system, but per the tach LEDs, only 1 card is working.


----------



## dagget3450

Quote:


> Originally Posted by *Alastair*
> 
> May I ask how you got it to run with 2 cards. Crossfire is enabled on my system but as per the tach LED's in my system only 1 card is working.


It's somewhat flaky; try setting the profile on superposition.exe in the bin folder. You may have to toy with the CrossFire settings as well, like AFR-friendly/1x1/etc.

for fury stock clocks from 2x to 4x for fun:


Spoiler: Warning: Spoiler!



2x [email protected] stock clocks
1080p ext


4k opt


[email protected] 1080p extreme -


[email protected] 4k opt


Also for funsies 4way furyx stock clocks
1080p extreme

4kopt


----------



## Skyl3r

Well, I succumbed to temptation. My 3rd Fury X is on the way

Found someone selling one on eBay for $250 and had to pull the trigger. Much too good of a deal to pass up.


----------



## Tcoppock

I may pick up another if Vega doesn't hurry up and launch


----------



## Skyl3r

Quote:


> Originally Posted by *Tcoppock*
> 
> I may pick up another if vega doesn't hurry and launch


What do you have right now? I don't see it in your signature builds.


----------



## Tcoppock

XFX R9 Fury


----------



## diggiddi

Quote:


> Originally Posted by *dagget3450*
> 
> its somewhat flakey, *try setting the profile on the superposition.exe in the bin folder*. may have to toy with crossfire setting as well like afr friendly/1x1/ etc.
> 
> for fury stock clocks from 2x to 4x for fun:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 2x [email protected] stock clocks
> 1080p ext
> 
> 
> 4k opt
> 
> 
> [email protected] 1080p extreme -
> 
> 
> [email protected] 4k opt
> 
> 
> Also for funsies 4way furyx stock clocks
> 1080p extreme
> 
> 4kopt


How do you do that?


----------



## diggiddi

Guys, how do you increase the voltage on HBM? I've already got the core to 1150 at +78mV with a +50 power limit and a custom fan curve.


----------



## shadowxaero

Quote:


> Originally Posted by *diggiddi*
> 
> Guys how do you increase voltage on HBM? Already got core to 1150 @78mV w +50 power limit + custom fan curve


You have to do it with a custom BIOS. If you check out the Fiji BIOS editing thread, you will find ROMs with the offsets you need to adjust to increase HBM voltage.
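Mechanically, "adjusting an offset" in a ROM dump just means rewriting a byte at a known position. A deliberately minimal sketch; the offset and value below are placeholders (the real ones come from the BIOS editing thread), and a real flash also requires fixing the ROM checksum, which this omits entirely:

```python
OFFSET = 0x100     # hypothetical location of a voltage byte -- NOT a real Fiji offset
NEW_VALUE = 0x8C   # hypothetical encoded voltage value

def patch_rom(data: bytes, offset: int, value: int) -> bytes:
    """Return a copy of the ROM with one byte rewritten at the given offset."""
    rom = bytearray(data)
    old = rom[offset]
    rom[offset] = value
    print(f"offset {offset:#x}: {old:#04x} -> {value:#04x}")
    return bytes(rom)

rom = bytes(512)   # stand-in for a real ROM dump read from disk
patched = patch_rom(rom, OFFSET, NEW_VALUE)
```

Always keep the original dump and use the card's second BIOS switch position as a fallback; a wrong byte here can brick the card.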


----------



## diggiddi

^^^TY Repped up!


----------



## divinity666

Weird thing about memory OC with my Fury Nitro: at any clock from 401 to 500MHz, bandwidth is the same. Tested in oclmembench, AIDA64 GPU bench, even downloaded a damn Ethereum miner: no difference. If I OC the memory to 501+ (the max I've tried was 610MHz), results were the same for any frequency in that range, and they were lower. E.g. [email protected] - 350GB/s, @401-500MHz - 386GB/s, @501-610MHz - 378GB/s. Results in 3DMark confirm the performance drop as well.


----------



## shadowxaero

Quote:


> Originally Posted by *divinity666*
> 
> Weird thing about memory OC with my fury nitro: at any clocks from 401 to 500mhz brandwidth is the same: tested in oclmembench, aida64 gpu bench, even downloaded damn ethereum miner - no difference. If I OC mem to 501+(max I've tried was 610MHz) - results were the same for any frequency in between that clocks and they were lower. E.g. [email protected] - 350Gb\s, @401-500Mhz - 386Gb/s, 501-610Mhz - 378Gb\s. Results in 3dmark confirm perfomance drop as well.


It is because HBM clocks in steps of 500, 545, 600, and 666. If you use the GPGPU bench in AIDA64 you will notice performance increases at each step. In previous testing, HBM would clock to the step at or below whatever you set. For example, set the clock to 525 and HBM would actually run at 500; set 570 and you would be running at 545.

HBM gains are marginal at best in gaming benchmarks; I mostly credit them to margin of error. Maybe a 40 or 50 point increase in Time Spy at 600MHz.
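The step behaviour described above can be sketched as flooring the requested clock to the highest supported step at or below it. The step list is taken from this post as observed behaviour, not a documented spec:

```python
# Steps observed for Fiji HBM in the post above (MHz)
HBM_STEPS = [500, 545, 600, 666]

def effective_hbm_clock(requested_mhz):
    """Floor the requested clock to the highest step at or below it."""
    eligible = [s for s in HBM_STEPS if s <= requested_mhz]
    return eligible[-1] if eligible else HBM_STEPS[0]

print(effective_hbm_clock(525), effective_hbm_clock(570), effective_hbm_clock(610))
# matches the examples above: 525 -> 500, 570 -> 545
```

This is why intermediate settings like 525 or 575 show identical benchmark numbers: the card is quietly running the step below.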


----------



## MrKoala

Quote:


> Originally Posted by *shadowxaero*
> 
> It is because HBM clocks in steps of 500, 545, 600, and 666. If you use the GPBench in AIda64 you will notice performance increases at every step. In previous testing HBM will clock to what ever step you set the clocks closest to. For example, setting clocks to 525, HBM would actually be running at 500, clock at 570 and you would be running at 545.
> 
> HBM gains are marginal at best with gaming benchmarks, most of which I just credit to margin of error. Maybe 40 or 50 point increase in TimeSpy at 600Mhz.


There is speculation that Fiji is latency-limited, so bumping up the HBM clock doesn't help, but decreasing CL does.


----------



## lanofsong

Hey AMD Radeon Fury / NANO / X / Pro Duo FIJI owners,

We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

April 2017 Foldathon

BTW - make sure you sign up

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## divinity666

Quote:


> Originally Posted by *shadowxaero*
> 
> It is because HBM clocks in steps of 500, 545, 600, and 666. If you use the GPBench in AIda64 you will notice performance increases at every step. In previous testing HBM will clock to what ever step you set the clocks closest to. For example, setting clocks to 525, HBM would actually be running at 500, clock at 570 and you would be running at 545.
> 
> HBM gains are marginal at best with gaming benchmarks, most of which I just credit to margin of error. Maybe 40 or 50 point increase in TimeSpy at 600Mhz.


I tried clocks like 525, 545, 575, 600, 610, 625, even 650 - same result for all of them.

These are the AIDA64 results for 401, 500 and 600 respectively.

Though I have to say mem read/write may vary from 5700-6000 anyway, you can see that there is no improvement either way.


----------



## divinity666

Quote:


> Originally Posted by *MrKoala*
> 
> There is speculation that Fuji is latency limited and bumping up the clock of the HBM doesn't help, decreasing CL does.


Is there a way to change the timings strap? I did so with my RX 470 using Polaris BIOS Editor, but I don't know of any software for Fury.

PS: sorry for double-posting, thought they would auto-stack.


----------



## shadowxaero

Quote:


> Originally Posted by *divinity666*
> 
> I tried clocks like 525, 545, 575, 600, 610, 625, even 650 - same result for them
> 
> Theese are aida64 results for 401, 500 and 600 respectively
> 
> though I have to say mem read\write may vary from 5700-6000 anyway, but you can notice that there is no improvement anyway


Hmm, maybe AMD changed something with recent drivers, as I am noticing the same thing.

http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo/830#post_25241806

That is from a while back at 600MHz timings, and there was definitely a difference.


----------



## 99belle99

Anyone else notice more game crashes with a Fury X? Admittedly it is rare, but I never got any crashes with my old R9 290, which was highly overclocked. Over a few months I twice got crashes in GTA V: once in a graveyard at night, looking at gravestones with a torch, and then once recently in GTA Online.

Battleborn crashed on me once near the end of a level; I had just beaten all the enemies and was gathering up loot when it crashed. I don't know if it crashes again, as I haven't played it since: I was annoyed at having killed so many enemies and couldn't be bothered to do it all over again.


----------



## bluezone

Quote:


> Originally Posted by *divinity666*
> 
> I tried clocks like 525, 545, 575, 600, 610, 625, even 650 - same result for them
> 
> Theese are aida64 results for 401, 500 and 600 respectively
> 
> 
> 
> 
> 
> though I have to say mem read\write may vary from 5700-6000 anyway, but you can notice that there is no improvement anyway


It's possible your memory clocks are stuck in software. You may need to do a clean uninstall, a full shutdown and restart, then reinstall the drivers. It happens sometimes.


----------



## divinity666

Quote:


> Originally Posted by *bluezone*
> 
> It's possible your memory clocks are stuck in software. You may need to clean uninstall and do a full shutdown and restart. Then reinstall the drivers. It happens sometimes.


I already did a clean install when I updated the drivers from 17.2.1 to 17.3.3. Nothing changed. All the monitoring software I use (AB, HWiNFO, Trixx) shows the clock change when I adjust it.


----------



## bluezone

Quote:


> Originally Posted by *divinity666*
> 
> I've made a clean install already when I updated drivers from 17.2.1 to 17.3.3. Nothing changed. All monitoring software I use (AB, hwinfo, trixx) show clock change when I adjust it


OK, that's not the problem then. You would have noticed in Trixx or HWiNFO after benchmarking if the clocks had reverted.


----------



## Skyl3r


*3 x Fury X*
*Ryzen 1800x*

First test on TimeSpy (all stock clocks)
http://www.3dmark.com/3dm/19353521?


I'm having a weird problem now where every time I try to set an overclock via TriXX, CrossFire stops working... At the very least, I can run Crysis now

*EDIT:*

So, I can recreate the problem with a fresh install of TriXX.
On a fresh boot, I first launch FurMark and see my FPS is at 260 and CrossFire is enabled.
Then I close out of the stress test, open TriXX and hit apply without changing anything.
Now I go back to FurMark, start the stress test, and my FPS is at 80 (it still shows CrossFire enabled, but I can see my second and third cards are showing no load).


----------



## angelsalam

Yup, I noticed it too with my Fury Nitro OC. HBM doesn't overclock at all despite MSI Afterburner saying it's at 550-600MHz. AIDA64 results are the same when you try to overclock the VRAM, but are lower when you underclock HBM. Seems like it's software-locked. :/


----------



## budgetgamer120

Quote:


> Originally Posted by *Skyl3r*
> 
> 
> 
> *
> *
> 
> *3 x Fury X*
> *Ryzen 1800x*
> 
> First test on TimeSpy (all stock clocks)
> http://www.3dmark.com/3dm/19353521?
> 
> 
> I'm having a weird problem now where everytime I try to set an overclock via TriXX, Xfire stops working... At the very least, I can run Crysis now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *EDIT:*
> 
> So, I can recreate the problem with a fresh install of TriXX.
> On a new boot, first I launch Furmark and see my FPS is at 260 and Crossfire is enabled.
> Then I close out of the stress test, open TriXX and hit apply without changing anything.
> Now I go back to Furmark and start the stress test and my FPS is now at 80 (still shows crossfire enabled, but I can see my second and third cards are showing no load)


3 x Fury X

That poor PSU


----------



## Skyl3r

Quote:


> Originally Posted by *budgetgamer120*
> 
> 3 x FuryX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That poor psu


Hahaha don't worry, I got a couple. Silver one is powering pumps and fan controller.
Black one underneath it is powering mobo and dual Fury X's.
Black EVGA up front is powering the temporary Fury X.


Pretty messy setup, but I'm going to tear down the chillbox relatively soon to build a new one. The Fury X without a custom waterblock isn't staying in this build either; I just wanted to benchmark with 3 on Ryzen.

*I Bleed Red*

Is there anything more *AMD* than over 2000w worth of power supplies and a 12,000BTU air conditioner to cool it? Maybe, but we're definitely closing in on the pinnacle of AMD_ism_ here.


----------



## EEFREETH

Regards. Sorry for my bad English, I'm from Mexico. I hope the community finds this helpful. This guide is my final conclusion after two years with the Radeon Fury X, so here it is...

I've owned a Sapphire Radeon Fury X since it came out, and until now I was looking for a decent overclock for this hardware. Finally, thanks to the community and reading some of the posts, I have what I wanted. That's why I'm replying today: to help anyone like me who wants a decent OC without killing the card or running into issues, OK?
Honestly the hardware is a bit disappointing, because we all thought HBM would be better for overclocking than GDDR5, or that Fiji would beat Hawaii and the previous GPU generation. But no, the whole thing was released and then abandoned by AMD. Fine, I thought, they'll come back with Vega, but that's not the point now. I love this GPU, I'm really happy with it, but like everyone here I felt its potential was sealed away. So after reading around, and after nearly wrecking the whole thing on my own, THANK GOD for the dual BIOS!!!
I finally have a stable OC of 1140/570 at +72 mV, running all my games and several benchmarks. Thanks a lot to the whole community, and especially page 1045 of this thread.
Here are the 3 steps I took to get there:

1.- After reading here and there over these 2 years, I upgraded my BIOS. This makes the OC more efficient and stable; believe me, without this step you can't get a better OC. Here are the pages I read to achieve this:
https://www.techpowerup.com/forums/threads/amd-official-uefi-vbios-for-fury-x-released-apr-5th.221766/
https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware
https://www.techpowerup.com/download/ati-atiflash/ ============> This is the tool to upgrade the BIOS
http://support.amd.com/en-us/download/gpu-firmware-download ====> This is the latest official BIOS for this card (part number 113-C8800100-107)
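For reference, here is a minimal command-line sketch of the backup-then-flash sequence with ATIFlash (my own summary, not from the links above; the adapter index `0` and the file names are assumptions you must adapt to your system, and flashing the wrong ROM can brick a card, so always save a backup first):

```shell
# Hypothetical sketch only: adapter index and file names are assumptions.
atiflash -i                      # list adapters; note the Fury X's index
atiflash -s 0 furyx-backup.rom   # save the current vBIOS from adapter 0
atiflash -p 0 new-uefi-bios.rom  # program the new vBIOS, then reboot
```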

2.- With the BIOS done, upgrade your drivers to Crimson 17.4.2, the current release. I strongly recommend removing all the drivers on your PC with the Display Driver Uninstaller tool first, then installing the Crimson drivers, OK?
http://www.wagnardsoft.com/content/display-driver-uninstaller-ddu-v17063-released =====> Display Driver Uninstaller
http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64 ====> AMD Latest Drivers

3.- Download and install Sapphire TriXX and apply these settings:

========> http://www.sapphiretech.com/catapage_tech.asp?cataid=291&lang=en

Power Limit to 10%

GPU Voltage to + 72 mV

Fan Speed to 80%. I actually set mine to 90%, but it's a personal choice; I think the heat will be fine from 60% up, but I prefer not to run any risk, so I recommend 75% to balance noise and performance.

GPU CLOCK 1140. ======> Stable for me with all the fixes I described. I ran several benchmarks (Heaven, Metro Redux, AIDA64 and Resident Evil 6) and played ARK: Survival Evolved, PlayerUnknown's Battlegrounds, Titanfall 2, Star Wars: Battlefront and Alien: Isolation, and it's stable enough. I tried a 1200/600 OC with power limit +14% and +74 mV but it crashed, so I've stuck with this; everything runs well and the performance is good.

MEMORY CLOCK 570 (1140 effective). ====> Stable, and it matches the GPU clock: 1140 is the effective memory clock value, by the way, for those who don't know the Sapphire TriXX settings.

To automatically apply all the OC settings at PC startup, save them in the first profile of Sapphire TriXX, then go to the settings panel and check "Effective memory clock", "Save fan settings with profile", "Load on Windows startup", "Minimized" and "Restore clocks" (without that last one you will have to open TriXX every time the PC starts). And that's it... enjoy!!!
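The "570/1140" arithmetic above can be sanity-checked with a tiny sketch (my own illustration, assuming Fiji's HBM is double data rate, so the effective clock is twice the base clock):

```python
# Assumption: HBM on Fiji is double data rate (DDR), so the
# effective memory clock is 2x the base clock shown by tools.
def effective_clock_mhz(base_mhz, data_rate=2):
    return base_mhz * data_rate

print(effective_clock_mhz(500))  # stock Fiji HBM: 1000 MHz effective
print(effective_clock_mhz(570))  # the overclock above: 1140 MHz effective
```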

GPU-Z validation =======> https://www.techpowerup.com/gpuz/details/9yz9





metro_report.doc 41k .doc file


----------



## looncraz

Quote:


> Originally Posted by *angelsalam*
> 
> yup, i noticed it too with my fury nitro OC, HBM don't overclock at all despite msi afterburner saying it's at 550-600mhz. Aida64 results are the same when you try to overclock the vram but are lower when you underclock HBM. seems like it's software locked :/


I noticed this problem long ago:

http://www.overclock.net/t/1609366/amd-16-8-1-16-8-2-memory-clocks-not-working

Not yet resolved, because AMD can't reproduce it... though I can't avoid it happening on any system I've tested.


----------



## gupsterg

Only speaking from my own experience: no issues with HBM clock / performance.

This is W10 Creators Update installed a few days ago, OC via ROM, Crimson v16.12.2 WHQL.



After benching I restored a Macrium backup to get a screenie of the Fury X on the i5 4690K.



Will find some more / same clocks.


----------



## Skyl3r

I'll ask again so it's not buried in pictures and joking.

Has anyone had any problems with TriXX where applying settings caused crossfire to stop working?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> Apparently AdoredTV is stirring the pot again (in a good way) over Ryzen bench marks not using RTG GPU's.
> 
> Ryzen of the Tomb Raider -
> 
> 
> 
> 
> Interesting results.
> 
> Go team Ryzen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think Gupsterg.


Please accept my apologies mate







, been just engrossed in Ryzen meddling, not seen video and dunno if I'm the person to comment on it. Will try to watch it at some point.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Please accept my apologies mate
> 
> 
> 
> 
> 
> 
> 
> , been just engrossed in Ryzen meddling, not seen video and dunno if I'm the person to comment on it. Will try to watch it at some point.


No problem mate. New toys are always the most fun.









New driver Relive 17.4.3. Release notes and downloads.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.4.3-Release-Notes.aspx

Hmmm. It's half a Gig in size.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> No problem mate. New toys are always the most fun.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> New driver Relive 17.4.3. Release notes and downloads.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.4.3-Release-Notes.aspx
> 
> Hmmm. It's half a Gig in size.


Maybe this will finally be the answer to getting my scores to successfully validate in 3dMark


----------



## xkm1948

With VEGA possibly being another disappointment from AMD, would you guys say it is a good idea to invest in an EKWB block for the Fury X? I am thinking of going full liquid cooling. That way I can get the maximum performance out of the Fury X.


----------



## gupsterg

Not worth it IMO.

Pretty much every full-WC result I've seen shared by members, on what they use 24/7, doesn't differ vastly from what the stock AIO achieves, so I couldn't even justify the cost of just the block. I've had the itch a few times, but then sense prevails. I once even found a block/backplate cheap from a member on OCuk whose card had died. As I didn't have the parts for the rest of a loop, I decided to leave custom WC until my next GPU.

Have you heard news that Vega will be a flop?


----------



## xkm1948

Quote:


> Originally Posted by *gupsterg*
> 
> Not worth it IMO.
> 
> Pretty much all the full WC shares I've seen from members, on what they use 24/7 is not vastly differing from what stock AIO gains, so I couldn't even warrant cost of just block even. I've had the itch a few times, but then sense prevails. I even once found a block/backplate cheap from a member on OCuk whose card died. As I didn't have parts for rest of loop I thought leave going custom WC until next GPU.
> 
> You gained news on VEGA to be flop?


I figured the EKWB wouldn't be too much of an improvement.

As for Vega: with the same 4096 SP count as Fiji and some design improvements, it may match 1080 performance. The 1080 Ti may be out of reach for Vega. If the card were that competitive, AMD would have released it already.


----------



## gupsterg

No idea on VEGA.

I reckon everything we've seen unofficially is guesses. What we've seen officially is too little.

God knows when it will be out. If it ever sees the light of day, then perhaps they have another ace up their sleeve.

A real shame we've been waiting so long while Nvidia, in a way, marches on.


----------



## xkm1948

I blame the slow and poor adoption of Vulkan/DX12. In those APIs the Fury X really shines.


----------



## geriatricpollywog

Has anybody gotten a Fury X to work in OSX? I know OSX has the drivers for it, but I'm not familiar with kext editing.


----------



## bluezone

A while back I linked to an article on optimizing cooling.

http://www.tomshardware.com/reviews/optimizing-graphics-cooling,4838-4.html

Recently I decided to try a different thermal compound on my Nano and, as suggested, to use TIM on all the thermal pads (including adding one that was not included).

Long story short: my VRMs run 2-3°C cooler. The GPU seems about 1-2°C cooler over the long run as well. I'm not sure if the cooler-running GPU is due to the different TIM or better heat dissipation through the thermal pads.
Anyhow, I'm pleased with the Deepcool Z5 TIM. Maybe I should have picked up some of the Z9 to compare.

Cheers


----------



## Skyl3r

Just figured I'd update on my overclocking issue.
Quick rehash: with my third Fury X, if I try to apply an overclock in TriXX, the cards quit crossfiring.

I've now removed TriXX and tried Afterburner, which did not work. I reinstalled the GPU drivers, which did not work.
Then I tried overclocking from Radeon Settings. This worked: I could see the overclock applied in FurMark and the cards were still crossfiring. So maybe something to do with Afterburner and TriXX?


----------



## Offler

Hello guys.

1. I found a few AMD Fury X / Nano cards available at a local seller.
2. I am interested mainly because of the "rarity" factor of these cards.
3. Using one would not provide much boost to my current system (as the Phenom II is really old); the maximum 3D score for the CPU+GPU combination is expected to be around 10k in Fire Strike.
4. My current Sapphire R9 290X is excellent (ASIC over 80%) yet it's noisy and produces far too much heat.

In the future the Phenom II might get replaced by a Ryzen 1800X.

Benefits expected with an R9 Fury or R9 Nano:

a) 10k 3DMark score without any OC (a 10-20% increase)
b) Less noise?
c) Less heat (mainly around the card)?

TDP for my R9 290X is 250 W; the R9 Fury X is 275 W but with a watercooler; the R9 Nano is 175 W but with an air cooler?

Performance-wise, all the cards seem equal with my current CPU. What I am looking for is less heat coming from the GPU, and the Nano seems to achieve that (somehow).


----------



## gupsterg

The Nano stock cooler can be noisy. Fury air coolers are better. The Fury X AIO is probably the best setup IMO.

I have good airflow in my case. Hawaii cards with aftermarket coolers (i.e. Vapor-X, Tri-X, DCUII) dump hot air in the case vs the Fury X AIO. The difference I saw was ~5°C for CPU/mobo/internal case temps, etc.

You can see photos in my profile of rig and setup.


----------



## Offler

I found out some info about the Fury X and Nano cards available here...

The Fury X, one of the first with a water cooler, is from Sapphire. The Nano is from Asus (but looks like the reference design).

It seems Nano cards need better ASIC chips just to work properly. Some coil whine was reported, but also incredible heat on the back of the card (92 degrees). That's something I have to avoid, because it's clear my current graphics card is making the heat on the CPU VRM worse.


----------



## budgetgamer120

Quote:


> Originally Posted by *Offler*
> 
> I found out some info about the FuryX and Nano cards i have here...
> 
> FuryX is one of the first Water Cooler from Sapphire. Nano is from Asus (but looks like reference design).
> 
> Seems like Nano cards have better ASIC chips in order to even work properly. Some coil whine was reported, but also incredible heat on the back of the card (92 degrees). Thats something I have to avoid, because its clear that current graphic card is making heat on CPU PWM worse.


All FuryXs are reference.

I think all nanos are reference also.


----------



## Offler

Quote:


> Originally Posted by *budgetgamer120*
> 
> All FuryXs are reference.
> 
> I think all nanos are reference also.


Thx. Good to know...

So... I checked some shots taken with a thermal camera. For both the R9 Nano and R9 Fury X, most of the heat was coming out of the VRM. If the images can be trusted, the Nano hits 92 degrees Celsius at the back of the card, the Fury X 58 (the R9 290X 87).

That's exactly what I am looking for, so the Fury X is the way to go.


----------



## gupsterg

All Fury X cards, as budgetgamer120 said, have the same PCB, etc. AIBs just slap on their sticker, box and whatever they bundle. Same with the Nano.

I posted some VRM temps taken with a probe from the back of the PCB at the VRM in this thread; you should find them with the search option.

Air-cooled Furys are also sweet. I owned 2x Nitro and 1x Tri-X. On the Fury X the coolant flows rad > block/pump > GPU VRM > rad, so I found GPU VRM temps were a tad higher on the Fury X vs the Fury Tri-X. HBM VRM temps are similar between the two.

I didn't like the Nitro. I felt the Fury Tri-X, which uses the AMD reference PCB (the same one as the Fury X), was nicer. It was not as wide, and they didn't even put bigger fans on the Nitro vs the Tri-X. Water block compatibility was also better on the Tri-X; if that interests you, it can use a Fury X block. Again, my thoughts on WC were posted just recently. So the Fury X with its AIO again gets a win in my books.


----------



## Offler

Well, I can choose between the Asus R9 Nano and the Sapphire R9 Fury X. Nothing else with a Fiji chip is available here.

Temperature at the back of the card is important, as my current card heats up so much that I had to downclock my CPU by 100 MHz; the CPU VRM was getting too much heat from it.


----------



## gupsterg

You are best off getting the Fury X IMO.

Even though the Nano has the full Fiji GPU, in its default state it throttles the GPU to stay within the "limits" of its "power/thermal envelope". Its VRM has 4 GPU phases, so at like-for-like settings a Fury X has a cooler VRM, as it has 6 GPU phases to spread the load.

In PowerPlay there is a VRM temperature at which the card throttles, and the Nano reaches it pretty easily from what I recall.

Some Nano owners see less throttling, for some it is an issue, OC or not, from what I recall. I once tried to help a guy on OCuk get his water-cooled Nano to hold a flat clock speed like my Fury X does, and we didn't achieve it. Others here who own the Nano may share their experience, but from what I recall you are better off with the Fury X.

Like I said before, the AIO improved all my internal case temps, as all the heat is expelled out of the case where I have it and none is dumped inside.


----------



## Skyl3r

Quote:


> Originally Posted by *Offler*
> 
> Well I can select between Asus R9-Nano or Sapphire R9 FuryX. Nothing else is available with Fiji chip here.
> 
> Temperature at the back of the card is important, as the current card is heating so much, that I had to downclock my CPU by 100 Mhz... CPU VRM was getting too much heat from it.


I concur with gupsterg, definitely go Fury X over Nano. The only reason I see to get a Nano is space constraints. They are clocked lower, throttle hard and in my experience don't overclock nearly as well. Also, the Fury X AIO cooler is really pretty good. I saw essentially no performance gains moving to a custom loop.


----------



## Offler

R9 FuryX ordered.

Just a small update: it's Gigabyte, not Sapphire as I initially thought.


----------



## Skyl3r

Quote:


> Originally Posted by *Offler*
> 
> R9 FuryX ordered.
> 
> Just a small update, its Gigabyte, not Sapphire as I initially thought.


I've actually never seen a Gigabyte Fury X. Hope it treats you well!


----------



## TwirlyWhirly555

The AIO Cooler definitely makes all the difference , works wonders on my pro duo : D


----------



## budgetgamer120

Quote:


> Originally Posted by *TwirlyWhirly555*
> 
> The AIO Cooler definitely makes all the difference , works wonders on my pro duo : D


True that... No throttling at all. I hope Vega has AIO liquid option.


----------



## bluezone

Quote:


> Originally Posted by *Offler*
> 
> R9 FuryX ordered.
> 
> Just a small update, its Gigabyte, not Sapphire as I initially thought.


It's just the reseller (AIB) name on the card, Gigabyte or Sapphire. Other than pump revisions and maybe BIOS versions, they are the same.

As a Nano owner: get the Fury X. I've tamed a lot of my issues with the Nano, but it took a lot of work. Still a noisy card.

You get more stable headroom with the Fury X.









IIRC one of the AIBs offered a Fury with an AIO just like the Fury X.

Found it: XFX.

http://xfxforce.com/en-us/products/amd-radeon-r9-300-series/amd-radeon-r9-fury-liquid-cooled-r9-fury-4wfa


----------



## budgetgamer120

Quote:


> Originally Posted by *bluezone*
> 
> It just the reseller name (AIB) on the card. Gigabyte or Sapphire. Other than pump revisions and maybe Bios versions they are the same.
> 
> As a Nano owner, get the Fury X. I've tamed a lot of my issues with the Nano, but it took a lot of work. Still a noisy card.
> 
> You get more stable headroom with the Fury X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> iirc one of the AIB's offered a Fury with an AIO just like the Fury X.


I have a Fury X and would have gotten a Nano if the Fury X had not been cheaper. I just like the form factor


----------



## bluezone

Quote:


> Originally Posted by *budgetgamer120*
> 
> I have a FuryX and would have gotten a Nano if the FuryX was not cheaper. I just like the form factor


Whatever you paid, my Nano was cheaper: free.


----------



## Jflisk

Question: GTX 1080 vs 2x Fury X, which would have better performance? Considering buying a water-cooled GTX 1080 and giving up on my Furies. This is not an AMD or NVIDIA fan question. Any help greatly appreciated. Thanks


----------



## steadly2004

Quote:


> Originally Posted by *Jflisk*
> 
> Question gtx1080 vs 2X fury X what one would have better performance, Considering buying a water cooled GTX 1080 and giving up on my Furies. This is not an AMD or NVIDIA fan question. Any help greatly appreciated Thanks


I think if Crossfire is supported then the Fury Xs win; if not, then the 1080.

Without Crossfire support the 1080 wins by a large margin, but with Crossfire support it loses by a small margin.

This is what I think, but if you want to be sure, Google some benchmarks. Much of the performance may be game-specific, so also look at the games you play. If you want a sure upgrade either way, get a 1080 Ti or wait for Vega to see what it brings to the table.


----------



## dagget3450

Quote:


> Originally Posted by *Jflisk*
> 
> Question gtx1080 vs 2X fury X what one would have better performance, Considering buying a water cooled GTX 1080 and giving up on my Furies. This is not an AMD or NVIDIA fan question. Any help greatly appreciated Thanks


If you hang in there, Vega is around the corner... May/June or so... I am waiting myself, but there is a good chance I end up going green if Vega disappoints...


----------



## gupsterg

Quote:


> Originally Posted by *TwirlyWhirly555*
> 
> The AIO Cooler definitely makes all the difference , works wonders on my pro duo : D
> 
> Quote:
> 
> 
> 
> Originally Posted by *budgetgamer120*
> 
> True that... No throttling at all. I hope Vega has AIO liquid option.

I wasn't a fan of the AIO on first purchasing Fury X, relatively quickly my view changed from usage experience. I too hope that Vega has AIO.


----------



## Skyl3r

Quote:


> Originally Posted by *steadly2004*
> 
> I think if Crossfire is supported then the Fury Xs win; if not, then the 1080.
> 
> Without Crossfire support the 1080 wins by a large margin, but with Crossfire support it loses by a small margin.


This is what I've seen in benchmarks. There are some oddballs that crossfire but don't see the same kinds of gains as other games - so you definitely wanna check what you wanna play, like steadly2004 said.


----------



## Offler

Quote:


> Originally Posted by *Skyl3r*
> 
> I've actually never seen a Gigabyte Fury X. Hope it treats you well!


http://geektech.ie/review-amd-fury-x/2/

The Gigabyte version has no sticker on the fan and "Gigabyte" written at the top. The reviewers reported water pump whine, but did not provide any picture of the pump.

I guess the first thing I will do with the card is walk to the RMA office and ask for a screwdriver to check the pump.









Edit:
The review was written on 5 August 2015, so it's one of the early ones. And the label isn't visible either.


----------



## gupsterg

On the back of the card you should see a label like Z1545xxxxxxxx: the green digits are the production year, the red digits the production week.

Would be interesting to know that.


----------



## Offler

It's Z1544 and the whine is present.

1st result from 3DMark:
http://www.3dmark.com/3dm/19459389

Just a random 3DMark run, and I immediately set the best result for the combination of a Phenom II and Fury X. I just need to update 3DMark.


----------



## Jflisk

Quote:


> Originally Posted by *steadly2004*
> 
> I think if Crossfire is supported then the Fury Xs win; if not, then the 1080.
> 
> Without Crossfire support the 1080 wins by a large margin, but with Crossfire support it loses by a small margin.
> 
> This is what I think, but if you want to be sure, Google some benchmarks. Much of the performance may be game-specific, so also look at the games you play. If you want a sure upgrade either way, get a 1080 Ti or wait for Vega to see what it brings to the table.


Think I'll wait for Vega and see what the specs are.
Quote:


> Originally Posted by *dagget3450*
> 
> if you hang in there Vega is around the corner... may/june or so.... I am waiting myself but there is a good chance i end up going green if vega disappoints...


I think I may wait for Vega and then make a decision
Quote:


> Originally Posted by *gupsterg*
> 
> I wasn't a fan of the AIO on first purchasing Fury X, relatively quickly my view changed from usage experience. I too hope that Vega has AIO.


I am not too worried about the AIO; I have enough rads to hold back 3x 290X and a 9590 at 4.9 GHz. At least that's what was in my build a long time ago.

Almost forgot, Thanks everybody


----------



## gupsterg

Quote:


> Originally Posted by *Offler*
> 
> Its Z1544 and the whine is present.
> 
> 1st result' from 3dMark:
> http://www.3dmark.com/3dm/19459389
> 
> Just a random 3dmark run and I immediatelly made best result for combination of Phenom II and FuryX. I just need to update 3dmar,.


To me that's a nice score.

Your score vs an i5 4690K @ 4.9 GHz with a stock Fury X: the graphics score elements are so close. You're not even doing too badly vs a Ryzen R7 1700 on the graphics elements.


----------



## Offler

OK.

1. Are there any technical reasons to replace the pump if the sound isn't a problem? The CPU is air-cooled.
2. The 3DMark results are a nice surprise for me. I didn't expect anything above 9500, and certainly didn't expect to beat my current high score before even trying any overclock.


----------



## gupsterg

I see no reason to change the pump if all is well.

IIRC only 1 of the 8 Fury X cards I've had had coil whine bad enough to hear while benching, when no sound is playing. In normal gaming use, if speakers/headphones weren't in use, it wasn't an issue. The 2x Fury Nitro I had were worse for coil whine; the 1x Fury Tri-X had no issue.


----------



## Ceadderman

Vega is being held back until the AIBs run down their stocks of 5** series cards, I imagine. I just got (March) an RX 480 and an EK block for it. Then sometime last week I got an email from my vendor about the 580, which was unavailable when I purchased my 480.

So this is what leads me to believe that Vega is intentionally being held back. Personally, my only dog in this fight is that the speculation is too inconsistent and only worthy of roundfiling. The reviewers always tell us whether new hardware is a dog or a diamond, so I will wait for a credible review before deciding whether Vega is worthy. I believe it will be, because Ryzen knocked it outta the park. But who really knows if AMD will get two outta two right.









My only issue is that now my brand new GPU will be two gens behind... well, one actually, since the 580 is Polaris. But danged it all.









~Ceadder


----------



## xkm1948

Quick question, do you guys usually leave the power efficiency toggle on or off for your Fury?

On a different note, I'm starting to realize that the Fury X REALLY likes cooler temperatures. Did some spring cleaning, and the card is MUCH more stable when overclocking


----------



## gupsterg

For 24/7 use, PE: On; otherwise just in the OS I see clock bounces for light things like an app window opening, etc. I also use FRTC at 89. When benching, neither is used.

When I gamed at 1080p, having PE on did cause some erratic FPS in some games; once I went 1440p it's been AOK with it on.


----------



## xkm1948

Looking at the RX 580, I was thinking a 14nm 1500 MHz Fury X on HBM2 would be badass. Then I realized Vega is just that, except with a better core design.

I hope Vega won't be another overclocker's nightmare.


----------



## gupsterg

My opinion only: AMD seems to clock silicon as high as they can at stock. For me Fury / X has been like that, Ryzen is like that, so I expect Vega to be like that.

Read this post by The Stilt, then this one.

Now what happened when Hawaii was refreshed and Grenada rolled out? link.

So the CPU post I linked shows that for Ryzen to be clocked higher out of the box they used higher-leakage CPUs; they did the same with Hawaii > Grenada. Now, I have not read a single RX 580 review, but I would not be surprised if all they have done is use higher-leakage GPUs for higher clocks.

So my expectations on OC ability are low TBH. As long as it has a great reference PCB like Hawaii/Fiji, I'm cool with dat. The Polaris reference PCB was pants IMO for power sourcing and I wouldn't touch it with a barge pole. Performance-wise, dunno where it's gonna be; I'm hoping it's close to the 1080 Ti, as it's been a long wait for many.


----------



## Offler

http://www.3dmark.com/3dm/19464538

I don't know what the problem with the driver is... but the score is far better than I ever expected with such an old CPU.


----------



## gupsterg

The driver you used is not FutureMark-approved; 3DM shows that message. Usually WHQL drivers are FM-approved; on that page you see 17.4.3 WHQL, but even a past WHQL driver will be fine.

Your rig is not bottlenecking GPU performance IMO, so GPU-oriented tests are all AOK IMO, link.


----------



## Offler

I know that only WHQL drivers should work, but I did have a WHQL driver. Nevertheless, I updated to the current one.

http://www.3dmark.com/3dm/19465538

Still far beyond my expectations.


----------



## Tgrove

Quote:


> Originally Posted by *xkm1948*
> 
> Quick question, do you guys usually leave the power efficiency toggle on or off for your Fury?
> 
> On a different note, I start to realize that FuryX REALLY likes cooler temperature. Did some spring cleaning, and the card is MUCH more stable with overclocking


I leave Power Efficiency off because it adds latency and lag


----------



## Kana-Maru

Quote:


> Originally Posted by *xkm1948*
> 
> Quick question, do you guys usually leave the power efficiency toggle on or off for your Fury?
> 
> On a different note, I start to realize that FuryX REALLY likes cooler temperature. Did some spring cleaning, and the card is MUCH more stable with overclocking


I have a Fury X. I normally leave Power Efficiency on if I don't really need the extra horsepower for gaming, especially outside of gaming for YT vids and Netflix.

I haven't really overclocked my Fury X in a while, but I did undervolt it. I dropped the stock voltage from 1.23 V to around 1.10 V, and it maintains 1050 MHz in 3DMark Ultra stress tests.

I've been thinking about overclocking my Fury X, but I still have no real need to. I almost reached 1200 MHz in the past.
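A rough back-of-envelope way to see why such an undervolt helps (my own estimate, not from the post; it assumes dynamic power scales roughly with voltage squared at a fixed clock, ignoring leakage):

```python
# Assumption: dynamic power ~ V^2 at fixed clock (leakage ignored),
# so a 1.23 V -> 1.10 V undervolt cuts dynamic power by roughly 20%.
def relative_dynamic_power(v_new, v_stock):
    return (v_new / v_stock) ** 2

print(round(relative_dynamic_power(1.10, 1.23), 2))  # -> 0.8, i.e. ~20% less
```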


----------



## kondziowy

Power Efficiency keeps the clock at 300 MHz when browsing the web and watching movies/YouTube; the result is 20-40 W less power draw. But for gaming it's bad: I tried it about a year ago and it downclocked whenever the GPU wasn't at 100% load, causing stuttering. So you are much better off setting Power Efficiency off, lowering the clock by 100 MHz and undervolting slightly to shave off 100 W. Better result, and no stuttering.


----------



## geriatricpollywog

Where is the power efficiency switch?


----------



## Kana-Maru

Quote:


> Originally Posted by *kondziowy*
> 
> Power Efficiency keeps clock at 300mhz when browsing the web and watching movies/youtube. The result is 20-40W less power draw. But for gaming it's bad, tried it like a year ago and it was downclocking when not all 100% of gpu was being used - causing stuttering. So you are much better of setting power efficiency off, and lowering the clock -100MHz and undervolt slightly to shave of 100W of power. Better result, and no stuttering.


I guess it depends on the rig and the game. I don't have issues when running Power Saving features while playing games. At least not in the games I play.

Quote:


> Originally Posted by *0451*
> 
> Where is the power efficiency switch?


AMD Radeon Settings > Gaming Tab > Global Setting > Power Efficiency

It's right before FRTC and after Tessellation Mode.


----------



## gupsterg

Quote:


> Originally Posted by *Offler*
> 
> I know that only WHQL drivers should work, but I had WHQL driver. nevertheless I did update for the current one.
> 
> http://www.3dmark.com/3dm/19465538
> 
> Still far beyond my expectations.


Which/when/how FM decides to "approve" a driver I have no idea, other than what I said previously







.
Quote:


> Originally Posted by *Kana-Maru*
> 
> I guess it depends on the rig and the game. I don't have issues when running Power Saving features while playing games. At least not in the games I play.


Agree







.

Same rig/games, plus the same user, so the same potential for user error







. At 1080p, PE caused issues for me when using FRTC. Without FRTC, fewer issues, as the card was then running at high FPS. At 1440p, PE has been perfect, and combined with FRTC it's sweet IMO







.


----------



## Offler

Quote:


> Originally Posted by *gupsterg*
> 
> To me that's a nice score.
> 
> Your score vs i5 4690K 4.9GHz stock Fury X. Graphics score elements are so close. Your not even doing too badly vs Ryzen R7 1700 for graphics elements.


http://www.3dmark.com/compare/fs/12415999/fs/2873251/fs/12415545#

11153 is result with current driver, FuryX still on stock.
10199 is fully overclocked R9-290x (1180Mhz, with fan on 94%)
11334 is non WHQL driver

Graphic results for FuryX looks quite constant, but there is just some "hiccup:" on combined test.

Also on 3900Mhz of CPU frequency were some stability issues, supposedly due heat coming from R9-290x. Tried some CPU overclocking tp 4000Mhz yesterday, while the heat problem is fixed with watercooler of FuryX. However no success, as it seemed that CPU simply failed LinX test when peak performance was above 74Gflops (problem size 9992), regadless voltage or temperature. On test with R9-290x i had memory set to 1600 7-7-7-18, effect of tight latencies in LinX test is almost same as extra 100Mhz on cpu.

Edit: and here is comparison with Ryzen

http://www.3dmark.com/compare/fs/12415999/fs/12271989

Graphics scores seem to be about +5% while that guy's card is clocked 50 MHz higher. For a CPU released in 2010 it's still impressive.
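For what it's worth, the quick comparisons being made in this post can be sketched in a few lines of Python. The scores are the ones quoted above; `pct_diff` is just an illustrative helper, not anything from 3DMark itself:

```python
def pct_diff(new, base):
    """Percentage change of `new` relative to `base` (illustrative helper)."""
    return (new - base) / base * 100.0

# Fire Strike overall scores quoted in the post above
scores = {
    "Fury X, current WHQL driver": 11153,
    "R9 290X @ 1180 MHz, fully OC'd": 10199,
    "Fury X, non-WHQL driver": 11334,
}
base = scores["R9 290X @ 1180 MHz, fully OC'd"]
for name, score in scores.items():
    print(f"{name}: {pct_diff(score, base):+.1f}% vs the overclocked 290X")
```

Same arithmetic as comparing runs on the 3dmark.com compare pages, just done locally.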


----------



## gupsterg

Combined is where the CPU and GPU are both loaded. The last link in post 10949 has a guide to how the tests work and how scores are calculated.

So I reckon that in a "load" where the CPU and GPU are both being stressed you see "platform" bottlenecking; in the quoted post you can see one of your benches vs my Ryzen, a ~50% difference on the combined test.


----------



## Offler

Quote:


> Originally Posted by *gupsterg*
> 
> Combined is where CPU and GPU are loaded. Post 10949 last link has guide to how tests work/score calculation.
> 
> So I reckon in a "load" where CPU/GPU are both being loaded you see "platform" bottlenecking, in the quoted post you can see one of your benches vs my Ryzen, ~50% difference on combined test.


Which brings me to the fact that physics and such "funny" stuff faced heavy criticism. Mainly, the hardware requirements went up greatly, for a benefit that could have been achieved with more optimized animation instead of physics simulations. The only game where I even found physics in action was Witcher 2 with ubersampling ON, in the opening scenes with the trebuchets.

The other side of platform bottlenecking is that my CPU might be a bottleneck for physics-related calculations, but I have a bugged version of the OCCT stress test (basically FurMark) and I found that the CPU is loaded up to 16% (one core) while the R9 Fury X isn't even reaching 100% utilization. And this is pretty much common for 90% of all 3D games I play.

Should a better CPU increase performance (in FPS) and GPU utilization? Yes, but up to 84% of my CPU might be going unused even now.

The 10% of apps that are better optimized for more draw calls is mainly represented by Thief with its Mantle support, and by CryEngine 3-based games with support for multiple render threads (not even close to the possibilities of Mantle).

I certainly expected that the CPU/platform would cause issues with rendering in the graphics tests when comparing the same GPUs, but it didn't. Physics and combined tests are of course expected to be lower, but I am no longer convinced it's as much of a factor as it seemed to be 5 years ago.


----------



## kfxsti

@gupsterg

http://www.3dmark.com/fs/12439944

After following your guide for unlocking to an absolute T, and reading some of your knowledge, I finally have my Fury where I want it. Will try for higher tomorrow.









here is the temp graph.


----------



## diggiddi

Quote:


> Originally Posted by *kfxsti*
> 
> @gupsterg
> 
> http://www.3dmark.com/fs/12439944
> 
> After *following your guide* for unlocking to an absolute T, and reading some of your knowledge, I finally have my Fury where I want it. Will try for higher tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is the temp graph.


Which guide, could you link?


----------



## kfxsti

Quote:


> Originally Posted by *diggiddi*
> 
> Which guide, could you link?


It was supposed to have been "guidance", lol, not "guide".
http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo
And he also replied to a few of my questions about which rows I should unlock.


----------



## gupsterg

I have no idea how I helped, but I am glad to read you have your card as you wish.


----------



## kfxsti

Quote:


> Originally Posted by *gupsterg*
> 
> I have no idea how I helped, but I am glad to read you have your card as you wish.


All of your knowledge and testing, answering my questions and other members' questions.
About to sit down and see for sure the max OC I can get, given how cool this card is running.


----------



## Skyl3r

I find it really weird that people are selling Fury X's on eBay for like $50 less than Pro Duo... Why? ... lol

Moar Fury X's!! Three is not enough!


----------



## Starbomba

Well, after some months, I finally have a good GPU. I bought an R9 Nano, and I've been testing it this week. It is way better than I expected. I am kinda regretting not getting a Fury X, but this card will go to my HTPC whenever Vega comes out, and I am a little more space-limited on that rig, so the rad had to go.

http://www.3dmark.com/fs/12420989

Now I need to flash the UEFI BIOS and an underclocked BIOS, plus mod the card to add a fan so it can survive the Pentathlon.


----------



## Skyl3r

Quote:


> Originally Posted by *Starbomba*
> 
> Well, after some months, i finally have a good GPU. I bought a R9 Nano, and i've been testing it this week. This is way better than i expected. I am kinda regretting not getting a Fury X, but this card will go to my HTPC whenever Vega comes out, and i am a little bit more space limited on that rig, so the rad had to go.
> 
> http://www.3dmark.com/fs/12420989
> 
> Now, i need to flash the UEFI BIOS and an underclocked BIOS, plus mod the card to add a fan to it so it can survive the Pentathlon


What did you have the Nano clocked at in that test?
I'm interested to know if you find the Nano constantly underclocking itself.

Good score though! And I hope you enjoy it


----------



## Starbomba

Quote:


> Originally Posted by *Skyl3r*
> 
> What did you have the nano clocked at in that test?
> I'm interested to know if you find the Nano constantly is underclocking itself.
> 
> Good score though! And I hope you enjoy it


I really dunno why it appears as 1800 MHz (imagine how fast Fiji would be at those clocks!). I keep it at the stock 1000 MHz core and 500 MHz HBM; all I did for this score was increase the power limit to +50% and boost the fan speed to 100%. I still need to tweak my card, that was just a test run in base mode with no changes.


----------



## Skyl3r

Quote:


> Originally Posted by *Starbomba*
> 
> I really dunno why it appears as 1800 MHz (imagine how fast Fiji would be with those clocks!). I keep it still at 1000 MHz core and 500 MHz HBM, all i did for this score was increase power limit to +50% and boost fan speed to 100%. I still need to tweak my card, that was just a test run in base mode with no changes.


Yeah, that's what was confusing me.
Great! Like I said, I'll be interested to know if you can see the Nano's clock bouncing down to around 800-900MHz while playing games or benchmarks.

Also, I'll share here... I was helping @DigMan prepare to put together his hardline water loop. It's gonna be a mean system.









Nanos!


----------



## gupsterg

^^^^ with some.


----------



## Starbomba

Quote:


> Originally Posted by *Skyl3r*
> 
> Yeah, that's what was confusing me.
> Great! Like I said, I'll be interested to know if you can see the Nano's clock bouncing down to around 800-900MHz while playing games or benchmarks.


It does bounce. A lot. In Unreal 2016, DOOM, and some regular 3DMark runs I've made, it barely hits 900 MHz. Even running BOINC, which is more shader-oriented than ROP-oriented, it hits roughly 900-920 MHz. Unless you raise the power limit, you're guaranteed to never hit that 1000 MHz rating.
Quote:


> Originally Posted by *Skyl3r*
> 
> Also, I'll share here... I was helping @DigMan prepare to put together his hardline water loop. It's gonna be a mean system
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nanos!


I am doing a mod on my Nano (pictured below) so it can survive this BOINC Pentathlon. Thing is, I cannot get it to display anything unless a fan is connected to the card's fan connector. Is there any way to fix that? I highly doubt the card can feed my Panaflo fan as it deserves at full tilt, since it needs roughly 5-6 W (the other cable is a temp probe on the VRMs). I could get it to run with the fan plugged in, but it was awkwardly hanging inside my case.

The mod works though: from 82°C core and 95°C VRM, I'm now down to 71°C core and 80°C VRM with a +35% power limit. In BOINC that is like 95% guaranteed to stay at 100% usage; there are some dips, but those may be due to how the task runs on BOINC rather than power limitations.
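If anyone wants to quantify the clock dips rather than eyeball them, here is a minimal sketch that summarizes a logged core-clock trace. It assumes you have exported a CSV from your monitoring tool of choice; the `GPU Clock [MHz]` column name is an assumption and will need matching to whatever your logger actually writes:

```python
import csv

RATED_MHZ = 1000.0  # the Nano's rated boost clock

def throttle_stats(path, column="GPU Clock [MHz]"):
    """Summarize how often a logged core clock sits below the rated boost.

    `column` is an assumed header name; change it to match your logger's CSV.
    """
    clocks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clocks.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip malformed rows and footer lines
    below = sum(1 for c in clocks if c < RATED_MHZ)
    return {
        "samples": len(clocks),
        "avg_mhz": sum(clocks) / len(clocks),
        "pct_below_rated": 100.0 * below / len(clocks),
    }
```

Run it against a log captured during a benchmark; a high `pct_below_rated` at the stock power limit is exactly the bouncing described here.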


----------



## Skyl3r

Quote:


> Originally Posted by *Starbomba*
> 
> Mod works tho, from 82c core and 95c VRM, i'm now down to 71c core and 80c VRM, with 35% power limit. In BOINC, that much is like 95% guaranteed to stay at 100%, there are some dips in usage, but those may be due how the task is being done on BOINC rather than power limitations.


I like it! What does your core clock do at 35% power limit with the extra cooling?


----------



## Starbomba

Quote:


> Originally Posted by *Skyl3r*
> 
> I like it! What does your core clock do at 35% power limit with the extra cooling?


I haven't tried OCing, and I'm reluctant to OC this card due to the VRM limitations. I actually need to get an undervolted BIOS (I have a couple to try, which are meant for Bitcoin mining) so I can continue lowering temps.

At a +35% power limit I still see some throttling while upscaling videos with madVR (using NNEDI3, 32 neurons), and I do not flatline at 1000 MHz in gaming, though it does get there. I will be running all those tests this weekend, though I'm more focused on setting everything up for the Pentathlon in a week.


----------



## Ceadderman

Quote:


> Originally Posted by *Starbomba*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Skyl3r*
> 
> Yeah that's what was confusing me
> 
> 
> 
> 
> 
> 
> 
> 
> Great! Like I said, I'll be interested to know if you can see the Nano's clock bouncing down to around 800-900MHz while playing games or benchmarks.
> 
> 
> 
> It does bounce. A lot. On both Unreal 2016, DOOM, and some regular 3DMark runs i've made it barely hits 900 MHz. Even running BOINC, which is more shader-oriented than ROP-oriented, it hits roughly 900-920 MHz. Unless you raise the power limit, you're guaranteed to never hit that 1000 MHz rating.
> Quote:
> 
> 
> 
> Originally Posted by *Skyl3r*
> 
> Also, I'll share here... I was helping @DigMan prepare to put together his hardline water loop. It's gonna be a mean system
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nanos!
> 
> Click to expand...
> 
> I am doing a mod to survive this BOINC Pentathlon on my Nano (pictured below). Thing is, i cannot get it to display anything unless a fan is connected to the card's connector. Is there any way to fix it? I highly doubt the card will feed my Panaflo fan as it deserves at full tilt, since it needs roughly 5-6w (the other cable is a temp probe to the VRMs). I could get it to run with the fan plugged in, however it was awkwardly hanging inside my case.
> 
> Mod works tho, from 82c core and 95c VRM, i'm now down to 71c core and 80c VRM, with 35% power limit. In BOINC, that much is like 95% guaranteed to stay at 100%, there are some dips in usage, but those may be due how the task is being done on BOINC rather than power limitations.
Click to expand...

Have you considered running a splitter off a fan header to the mb *and* GPU, using a jumper from the GPU to one of the splitter legs? This may solve your issue and you may be able to run it without the fan mod.









~Ceadder


----------



## Starbomba

Quote:


> Originally Posted by *Ceadderman*
> 
> Have you considered running a splitter off a fan header to the mb *and* GPU, using a jumper from the GPU to one of the splitter legs? This may solve your issue and you may be able to run it without the fan mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


You, sir, are a genius. That really did the trick. I used a spare PWM splitter I had, since I ordered two of them very recently (one to use on my X58 to drive the PWM signal on two Delta 38 mm fans without burning the power connector on my mobo, and one spare).

However, I'm still convinced there should be a less MacGyver-y way of doing this; otherwise waterblocks wouldn't work on their own.


----------



## Ceadderman

Quote:


> Originally Posted by *Starbomba*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> Have you considered running a splitter off a fan header to the mb *and* GPU, using a jumper from the GPU to one of the splitter legs? This may solve your issue and you may be able to run it without the fan mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You, sir, are a genius. That really did the trick. Used a spare PWM splitter i had, since i ordered two of them very recently (one to use on my X58 to drive the PWM signal on two Delta 38mm fans without burning the power connector on my mobo and one spare)
> 
> However, i'm still convinced there should be a less MacGyver-y way of doing this, otherwise waterblocks wouldn't work alone.
Click to expand...

I am all about the quick and dirty paper-clip and duct-tape fixes when that's the logical trick to make things work. I've been known to use bubble gum also.









~Ceadder


----------



## Starbomba

Well, I found a perfect BIOS for BOINC, courtesy of Eliovp. Who would've thought BIOSes tuned for Bitcoin mining would be decent for BOINC. The BIOS has 1000 MHz @ -100 mV on the core and 575 MHz @ +50 mV on the HBM. The HBM OC allowed a small increase in Fire Strike Ultra, from a 3777 graphics score to 3832. Plus, the core undervolt let me go from a +35% power limit to just +15% and still stay within spitting distance of 1000 MHz (it goes from 995 to 1000 MHz). And I've gone down from 71°C core and 80°C VRM to a measly 63°C core and 71°C VRM with my Panaflo fan.

If I ever had any regrets about getting a Nano over a Fury X, they are gone now.
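As a back-of-the-envelope check on why a -100 mV offset frees up so much power headroom: dynamic CMOS power scales roughly with f × V², so at the same core clock the undervolt alone cuts core power to around 84% of stock. The ~1.20 V stock voltage below is an assumed figure (Fiji VIDs vary per ASIC), so treat this as a rough sketch:

```python
# Rough dynamic-power estimate: P ~ f * V^2 (CMOS approximation).
# The 1.20 V stock core voltage is an assumed figure; real Fiji VIDs vary.
STOCK_V = 1.20
UNDERVOLTED_V = STOCK_V - 0.100  # the -100 mV offset from the modded BIOS

# Clock is unchanged (1000 MHz), so frequency cancels out of the ratio.
power_ratio = (UNDERVOLTED_V / STOCK_V) ** 2
print(f"Estimated core power at -100 mV: {power_ratio:.0%} of stock")
```

That ~16% reduction lines up qualitatively with being able to drop the power limit from +35% to +15% and still hold the rated clock.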


----------



## xkm1948

RX Vega is shaping up to be a huge failure. No wonder AMD has been hiding it since the beginning. According to Guru3D's newest leak, it is a slightly overclocked Fury X on 14 nm.









http://www.guru3d.com/news-story/possible-radeon-rx-vega-3dmark-time-spy-benchmark-result.html



Seriously good job AMD, you managed to **** up the high end for 2 years in a row.


----------



## dagget3450

Quote:


> Originally Posted by *xkm1948*
> 
> RX Vega is shaping up to be a huge failure. No wonder AMD has been hiding it since the beginning. According to Guru3D's newest leak, it is slightly overclocked FuryX with 14nm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/news-story/possible-radeon-rx-vega-3dmark-time-spy-benchmark-result.html
> 
> 
> 
> Seriously good job AMD, you managed to **** up the high end for 2 years in a row.


If you read what people have said and look into it, this is just made up based on nerd math and guessing...


----------



## Alastair

Quote:


> Originally Posted by *xkm1948*
> 
> RX Vega is shaping up to be a huge failure. No wonder AMD has been hiding it since the beginning. According to Guru3D's newest leak, it is slightly overclocked FuryX with 14nm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/news-story/possible-radeon-rx-vega-3dmark-time-spy-benchmark-result.html
> 
> 
> 
> Seriously good job AMD, you managed to **** up the high end for 2 years in a row.


TBH I call BS on this, since my 3840-shader Furys can push out 6K on a single card. AMD isn't that stupid.


----------



## HexagonRabbit

I have the Nano now and for the most part I'm fairly pleased. My gaming requirements aren't all that demanding either: 2560x1080, so that isn't much.
With that being said, I'm thinking about getting a Fury (non-X) to pair with the Nano, but that just drops my Nano to Fury speeds. Is there any workaround for this? I want another card to augment my performance, not gimp it.
Is anyone happy with their Fury's OC?
I'd like another Nano, but there is no amount of propaganda I could spread to get my wife to agree to spending another $500 on one.


----------



## Skyl3r

Quote:


> Originally Posted by *HexagonRabbit*
> 
> I have the Nano now and for the most part, I'm fairly pleased. My gaming requirements aren't all that demanding either. 2560x1080 so that isn't much.
> With that being said, I'm thinking about getting a fury (non X) to pair with the Nano but that just drops my Nano to Fury speeds. Is there any work around past this? I want another card to augment my performance, not gimp it.
> Is anyone happy with their Fury's OC?
> I'd like another nano but there is no amount of propaganda I could spread to get my wife to agree to spending another $500 on another Nano.


How opposed to buying used are you?
You can get pretty good deals on eBay. I just scraped up a Fury X in "Like New" condition for $250 a week or two ago. They did pull a fast one on me, but the Fury X works fantastically.

Check this out. They advertised literally, word-for-word, as:
"*like new with original box.*"

Imagine my surprise when I unpackaged it and this is what they meant by "with original box":


I can't say the description wasn't "honest", but that's certainly not what I expected when they said "with original box" lol


----------



## Simmons572

That is actually kind of hilarious! Great price though!


----------



## Ceadderman

So they couldn't be bothered with packaging in the antistatic baggy the card comes with?









Meh, for $250 what do we expect.









I always use my antistatic baggies. Blocked my 480 and it now sits in the AS baggy in my EK RX480 box waiting to be used.







lol

~Ceadder


----------



## Skyl3r

Quote:


> Originally Posted by *Ceadderman*
> 
> So they couldn't be bothered with packaging in the antistatic baggy the card comes with?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meh, for $250 what do we expect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I always use my antistatic baggies. Blocked my 480 and it now sits in the AS baggy in my EK RX480 box waiting to be used.
> 
> 
> 
> 
> 
> 
> 
> lol
> 
> ~Ceadder


What's in the picture is all it came with. I'm not sure what antistatic bag you're referring to, but if it's not there then he didn't send it lol.
For a second I considered messaging the guy and asking why he just crumpled the box up and tossed it in there; but then I remembered that he sold it for $250. I should just be thankful I got a good working card for that cheap.


----------



## Ceadderman

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> So they couldn't be bothered with packaging in the antistatic baggy the card comes with?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meh, for $250 what do we expect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I always use my antistatic baggies. Blocked my 480 and it now sits in the AS baggy in my EK RX480 box waiting to be used.
> 
> 
> 
> 
> 
> 
> 
> lol
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's in the picture is all it came with. I'm not sure what antistatic bag you're referring to but if it's not there then he didn't send it lol
> For a second I considered messaging the guy and asking about why he just crumpled the box up and tossed it in there; but then I considered for a second that he sold it for $250. I should just be thankful I got a good working card for that cheap
Click to expand...

Drat! My mobile device won't get enough WiFi to upload my pic. :S

Anyway...

If you've ever purchased new gear, it's the baggy that electronic parts come inside when you unbox them: MBs, GPUs, sound cards, typically. I could see your card didn't come with it because there were zero markings on your baggies; my XFX RX 480 came with a baggy that had squares printed on its entire surface. So did my XFX 5770 and my 6870s from Sapphire, which had a similar pattern on them. You should always repackage components in the AS baggy imho. I always do when I sell a component.









~Ceadder


----------



## 99belle99

Quote:


> Originally Posted by *xkm1948*
> 
> According to Guru3D's newest leak, it is a slightly overclocked Fury X on 14 nm.


If it was a slightly overclocked Fury X, why wait two years to release it? Surely they could have released it last year.


----------



## prom

Quote:


> Originally Posted by *99belle99*
> 
> If it was a slightly overclocked Fury X, why wait two years to release it? Surely they could have released it last year.


It's a fake. The reported clock speed doesn't even add up to what Vega will be.


----------



## geriatricpollywog

"AMD's "Vega" GPU architecture is on track to launch in Q2, and has been designed from scratch to address the most data- and visually-intensive next-generation workloads with key architecture advancements including: a differentiated memory subsystem, next-generation geometry pipeline, new compute engine, and a new pixel engine."

From AMD press release today.


----------



## HexagonRabbit

I would like to see a new GPU from AMD that is designed to work with the new Ryzen CPUs. Not only because I have one, but because that kind of marketing strategy would be unbelievably valuable to AMD as a company. And I'm not talking a little boost; I'm talking 25 FPS or better in ALL games. If they designed something that could compete with Nvidia out of the box, the icing would be a boost for Ryzen owners.
That would be exactly what AMD needs.


----------



## Kuivamaa

Quote:


> Originally Posted by *HexagonRabbit*
> 
> I would like to see a new GPU from AMD that is designed to work with the new Ryzen CPUs. Not only because I have one but that kind of marketing strategy would be unbelievably valuable to AMD as a company. And I'm not talking a little boost. I'm talking 25FPS or better in ALL games. If they designed something that could compete with nvidia out of the box, the icing would be a boost from ryzen owners.
> That would be exactly what AMD needs.


It is a tricky topic. Lisa Su was asked about such a synergy and said "our CPUs are often getting paired with Nvidia GPUs, and our GPUs with Intel CPUs". AMD has to make sure their products work perfectly with every combo and not get pigeonholed. I still get questions from otherwise tech-savvy people in the vein of "if I get an AMD CPU, will Nvidia work with it fine?". AMD can't afford to be seen as a vendor whose parts pair only with its own.


----------



## Ceadderman

Quote:


> Originally Posted by *Kuivamaa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HexagonRabbit*
> 
> I would like to see a new GPU from AMD that is designed to work with the new Ryzen CPUs. Not only because I have one but that kind of marketing strategy would be unbelievably valuable to AMD as a company. And I'm not talking a little boost. I'm talking 25FPS or better in ALL games. If they designed something that could compete with nvidia out of the box, the icing would be a boost from ryzen owners.
> That would be exactly what AMD needs.
> 
> 
> 
> It is a tricky topic. Lisa Su was asked about such a synergy and said "our CPUs are often getting paired with nvidia GPUs, and our GPUs with intel CPUs". AMD has to make sure their products work perfectly with every combo and not get into pigeonholes. I still get questions by otherwise tech savvy people in the vein of "if i get an AMD cpu, will nvidia work with it fine?". AMD can't afford to be seen as a vendor to be paired with itself only.
Click to expand...

Quote:


> Originally Posted by *Skyl3r*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ceadderman*
> 
> So they couldn't be bothered with packaging in the antistatic baggy the card comes with?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meh, for $250 what do we expect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I always use my antistatic baggies. Blocked my 480 and it now sits in the AS baggy in my EK RX480 box waiting to be used.
> 
> 
> 
> 
> 
> 
> 
> lol
> 
> ~Ceadder
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's in the picture is all it came with. I'm not sure what antistatic bag you're referring to but if it's not there then he didn't send it lol
> For a second I considered messaging the guy and asking about why he just crumpled the box up and tossed it in there; but then I considered for a second that he sold it for $250. I should just be thankful I got a good working card for that cheap
Click to expand...

Exactly! It would be nice, but there are a lot of fanbois who run nVidia. Imagine the backlash if AMD were to do that. Remember the Crysis/nVidia blowup? This would be that to the 10th power, x100, if AMD were to concentrate on their own gear to the detriment of the competition.









~Ceadder


----------



## bluezone

Well, just to wade in here. AMD stated it would try to make back some of its investment in HBM1. Would it be too much to guess that the Fury line was planned to get a refresh with new, higher-clocked cards in order to recoup costs? Maybe these strange benchmarks at higher clocks are for a potential Fiji refresh that may or may not see the light of day. Remember the speculation over the RX 390/RX 490. It now looks an awful lot like the announced Xbox Scorpio GPU.

On another note, I got bored and tried a little experimenting today. I was wondering what a Dual-X Nano would look and work like. I'll let the pictures speak for themselves.


Spoiler: Warning: Spoiler!








Much quieter, but it runs 9°C hotter due to the heatsink being optimized for a single fan. Too bad.

Correction: I meant RX 490.


----------



## dagget3450

Quote:


> Originally Posted by *bluezone*
> 
> Well just to wade in here. AMD stated it would try to make back some of its investment in HBM 1. Would it be too much to guess that maybe the Fury line was planed for getting a refresh with new higher clocked cards in order to recoup costs. Maybe these strange benchmarks at higher clocks are for a potential Fiji refresh that may or may not see the light of day. Remember the speculation over RX 390. It now looks an awful lot like the announced Xbox Scorpio GPU.
> 
> On another note. I got bored and tried a little experimenting today. I was wonder what a Duel-X Nano would look and work like. I'll let the pictures speak for themselves.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Much quieter, but 9C hotter running due to heat sink optimization for single fan. Too bad.


You bring a very interesting idea to light. Given AMD's bizarre refreshes lately, this could be a possibility. I will say I would be very sad if they rehash Fiji with 8 GB of HBM, given my Fury X investments. However, if Vega comes out against Nvidia's upper tier then I won't care.


----------



## ressonantia

Quote:


> Originally Posted by *bluezone*
> 
> Well just to wade in here. AMD stated it would try to make back some of its investment in HBM 1. Would it be too much to guess that maybe the Fury line was planed for getting a refresh with new higher clocked cards in order to recoup costs. Maybe these strange benchmarks at higher clocks are for a potential Fiji refresh that may or may not see the light of day. Remember the speculation over RX 390. It now looks an awful lot like the announced Xbox Scorpio GPU.
> 
> On another note. I got bored and tried a little experimenting today. I was wonder what a Duel-X Nano would look and work like. I'll let the pictures speak for themselves.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Much quieter, but 9C hotter running due to heat sink optimization for single fan. Too bad.


Kinda looks like my ugly hack, but at least it's quiet now.

Still maxes out at 82°C, and the VRMs go to 90+°C.









Spoiler: Warning: Spoiler!










How'd you make that shroud fit?


----------



## bluezone

Quote:


> Originally Posted by *dagget3450*
> 
> You bring a very interesting idea to light. Given AMD's bizarre refreshes lately this could be a possibility. I will say i would be very sad if the rehash fiji with 8gb hbm due to my furyx investments. However if Vega comes out against upper tier of nvidia then i wont care.


Ya I'd be a little irritated too. Still crossing my fingers for Vega.

Quote:


> Originally Posted by *ressonantia*
> 
> Kinda looks like my ugly hack, but at least its quiet now
> 
> 
> 
> 
> 
> 
> 
> Still maxes out at 82C and the VRMs go to 90+C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How'd you make that shroud fit?


The fans and brackets are from an old MSI GTX 560 Ti 2GB; it has the same fan header plug as the Nano. The shroud is just a piece of plastic sheeting for testing and template purposes. If it had worked out well, the sheeting would have served as a template for a sheet-aluminum shroud, but I was not happy with the temperature results, so I am already back to the stock fan.

Here is a HwiNFO capture after a 3DM FS run with the stock fan.


Spoiler: Warning: Spoiler!







http://www.3dmark.com/3dm/19671286

Notice the peak temperatures I'm getting at 1050 MHz, +50% PL and 550 MHz HBM.

I can clock higher (+50 MHz), but then I run about 6°C hotter on the GPU and 8-9°C hotter on the VRMs. I'd rather run cooler but noisier.

I'm only giving up about 600-650 points in FS at the lower clocks, so a fair trade-off in my books.


----------



## LionS7

Something strange is happening. I'm using 17.4.4 and the final official BIOS from AMD for the R9 Fury X, from April. I have unlocked unofficial OC in MSI Afterburner with PowerPlay. Now, the strange thing: I can go up to 600 MHz HBM and be stable in everything. Before, my stable frequency was 520 MHz at the stock 1.30 V. Has anybody encountered the same situation? It doesn't seem normal to me.


----------



## Starbomba

Quote:


> Originally Posted by *LionS7*
> 
> Something strange is happening. Im using 17.4.4, final official bios from AMD for R9 Fury X - from april. I have unlocked the unofficial OC in MSI Afterburner with PowerPlay. Now, the strange thing. I can put up to 600Mhz HBM, and be stable in everything. Before my stable frequency was 520Mhz on 1.30V stock voltage. Did somebody encounter the same situation ? It's not normal for me.


Well, my Nano is currently running @ 575 MHz HBM, but that is with a modded BIOS that adds some extra voltage. HBM can happily go to 600 MHz with very small voltage bumps (not that it draws loads of power either). Not sure why you're seeing such an improvement, though.


----------



## Skyl3r

Quote:


> Originally Posted by *Starbomba*
> 
> Well, my Nano is currently running @ 575 MHz HBM, however that is with a modded BIOS which has added some extra voltage. HBM can happily go to 600 MHz with very little voltage jumps (not that it consumes loads of voltage either). Not sure why you're seeing such an improvement though.


I found a similar increase when going to a custom BIOS. I started out capped at 1175/540, but with the custom BIOS I got up to 1200/630. I can bench higher than that, but it's terrifying.
No idea what exact change raised the cap so much, but I'm not complaining.


----------



## LionS7

But it is super odd... I was seeing artifacts above 520 MHz, and now 600 MHz runs with no problems. Stock BIOS, stock voltage of 1.30 V. I really don't know. Is there a possibility that AMD optimized something in the drivers (for HBM stability) for, say, the coming of Vega? Hm...
Quote:


> Originally Posted by *Skyl3r*
> 
> I found a similar increase when going to custom BIOS. I started capped at 1175/540 but with custom BIOS I got up to 1200/630. I can bench higher than that, but it's terrifying.
> No idea what the exact change that increased the cap so much was, but I'm not complaining


How much voltage did it take to go from 540 to 630 MHz?


----------



## Skyl3r

Quote:


> Originally Posted by *LionS7*
> 
> But it is super odd... I was seeing artefacts above 520Mhz, and now - 600Mhz no problems. Stock bios, stock voltage 1.30V. I don't know really. Is there are a passability that AMD optimize something in the drivers (for HBM stability) for say..., coming of Vega ? Hm...
> How much voltage from 540 to 630Mhz ?


I believe it was 1.3 V. I had to bump the voltage to go higher than a 1200 MHz GPU clock, though.
Returns diminished quickly after that.


----------



## geriatricpollywog

Was a new bios released last month?


----------



## LionS7

Quote:


> Originally Posted by *Skyl3r*
> 
> I found a similar increase when going to a custom BIOS. I started out capped at 1175/540, but with a custom BIOS I got up to 1200/630. I can bench higher than that, but it's terrifying.
> No idea which exact change raised the cap so much, but I'm not complaining.


Quote:


> Originally Posted by *Skyl3r*
> 
> I believe it was 1.3 V. I had to bump the voltage to go higher than a 1200 MHz GPU clock, though.
> Returns diminished quickly after that.


I mean the voltage of the HBM ?
Quote:


> Originally Posted by *0451*
> 
> Was a new bios released last month?


The last official one from AMD, as far as I know, is from April 2016. I'm using it. It is 1.30 V on the HBM.
https://community.amd.com/community/gaming/blog/2016/04/05/radeon-r9-fury-nano-uefi-firmware


----------



## Skyl3r

Quote:


> Originally Posted by *LionS7*
> 
> I mean the voltage of the HBM ?


Are you asking me to get a multimeter out and hit the HBM VRMs? lol
I think I'm missing something here







I didn't volt mod, if that's the question.

*EDIT:*
Is there a way to adjust the HBM voltage from software?


----------



## Starbomba

Quote:


> Originally Posted by *Skyl3r*
> 
> Is there a way to adjust the HBM voltage from software?


Not that I know of; only hard modding and BIOS hex mods.


----------



## Tcoppock

Are there any new BIOSes for the Fury (non-X)?


----------



## LionS7

Quote:


> Originally Posted by *Skyl3r*
> 
> Are you asking me to get a multimeter out and hit the HBM VRMs? lol
> I think I'm missing something here
> 
> 
> 
> 
> 
> 
> 
> I didn't volt mod, if that's the question.
> 
> *EDIT:*
> Is there a way to adjust the HBM voltage from software?


Yeah, BIOS hex mods. I have tried up to 1.35 V for HBM. But with a modded BIOS, you lose UEFI.
Look http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo/1580
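Since a hex edit changes the byte sum of the ROM image, the checksum has to be recomputed before flashing or the card may reject the file. As a minimal sketch of that arithmetic (assuming the standard PCI option-ROM convention that all bytes of the image sum to zero mod 256; the checksum offset and image size are passed in as parameters here, not known Fiji constants):

```python
def fix_checksum(rom: bytes, csum_offset: int, image_size: int) -> bytearray:
    """Recompute an option-ROM checksum byte after a hex edit.

    Assumes the PCI option-ROM rule that every byte of the image,
    including the checksum byte itself, must sum to 0 modulo 256.
    `csum_offset` and `image_size` depend on the specific BIOS layout
    and are treated as inputs, not known constants.
    """
    patched = bytearray(rom)
    patched[csum_offset] = 0                       # zero it before summing
    total = sum(patched[:image_size]) & 0xFF       # sum of remaining bytes
    patched[csum_offset] = (-total) & 0xFF         # byte that cancels the sum
    return patched
```

The Fiji BIOS editing tools linked above handle this step for you; this only illustrates why a raw hex edit alone isn't enough.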


----------



## HexagonRabbit

Anyone else have an issue where Radeon Settings disappears and gets uninstalled?


----------



## Skyl3r

Quote:


> Originally Posted by *LionS7*
> 
> Yeah, BIOS hex mods. I have tried up to 1.35 V for HBM. But with a modded BIOS, you lose UEFI.
> Look http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo/1580


I've played with modded BIOSes, but I never messed with HBM voltage. I wasn't actually aware of that. I'll read up on it. Thanks!








Quote:


> Originally Posted by *HexagonRabbit*
> 
> Anyone else have an issue where Radeon Settings disappears and gets uninstalled?


No o.o That's really weird. Problem with autoupdate maybe?
Can you recreate it?


----------



## bluezone

New drivers. Relive 17.5.1 is out.

Notes with download links: http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.5.1-Release-Notes.aspx


----------



## prom

I've noticed better Nano stability with the newer drivers as well.
I was getting artifacts when idling, and that has disappeared (knock on wood) entirely.

Additionally, my undervolts used to be constantly unstable, yet now I'm running -50 or -60 mV and _only_ crashing if I run the Fire Strike stress test.
This is using WattMan of all things, so I might switch to MSI AB or TriXX and see how it goes now.
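The trial-and-error loop described above (drop the offset, stress, back off at the first crash) can be sketched as a simple downward search. `stress_test` here is a hypothetical stand-in for applying the offset in WattMan or Afterburner and running your benchmark of choice:

```python
def find_stable_undervolt(stress_test, start_mv=0, floor_mv=-96, step_mv=6):
    """Walk the core-voltage offset down until the stress test fails,
    then keep the last passing offset.

    `stress_test(offset_mv)` is a stand-in for applying the offset and
    running a stress run; it must return True when the run completes
    cleanly. Returns the deepest offset proven stable, or None if even
    the starting offset fails.
    """
    best = None
    offset = start_mv
    while offset >= floor_mv:
        if stress_test(offset):
            best = offset          # deepest offset proven stable so far
            offset -= step_mv      # try a deeper undervolt
        else:
            break                  # first failure: stop here
    return best
```

In practice you would re-run the passing offset a few times before trusting it, since a single clean run is weak evidence of stability.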


----------



## HexagonRabbit

Quote:


> Originally Posted by *Skyl3r*
> 
> I've played with modded BIOSes, but I never messed with HBM voltage. I wasn't actually aware of that. I'll read up on it. Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> No o.o That's really weird. Problem with autoupdate maybe?
> Can you recreate it?


The Radeon Settings app either uninstalls itself or doesn't install at all. Both occur with overlapping installs and clean installs.


----------



## xkm1948

Finally went back to try @gupsterg's new CSM-modified UEFI BIOS overclock. I applied +40 mV and adjusted the fan curve as well as the target GPU temperature. Max OC is 1125, but that would fail after 3 hours of the Valley benchmark. I dialed it down to 1105 and it has been 24 hours stable so far. Pretty happy that I can now use CSM=off with a customized UEFI BIOS.

One interesting thing I noticed: after increasing the HBM overclock limit in the Fiji BIOS editor, there are still HBM overclock options in Radeon WattMan. Does anyone have any ideas regarding this?


----------



## Minotaurtoo

Ok, I may have missed this if it was already discussed as I have been gone quite a bit... as the construction crews have dug up the internet cables like 12 times in the last month...







you'd think they would have figured out by now where those cables were....

Why are the prices for Fury X's getting so high again? I got mine for well under $500... I think it was around $420... now I see them getting back up to the $650 mark... the cheapest I saw was $550 on a quick Google search... I'm glad I bought when I did then, lol... I was actually considering getting one for a build I may be working on for a gamer/engineer who was looking for a dual-purpose PC, but the RX 480 may work for him... he only plays at 1440p anyway.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Ok, I may have missed this if it was already discussed as I have been gone quite a bit... as the construction crews have dug up the internet cables like 12 times in the last month...
> 
> 
> 
> 
> 
> 
> 
> you'd think they would have figured out by now where those cables were....
> 
> Why are the prices for Fury X's getting so high again? I got mine for well under $500... I think it was around $420... now I see them getting back up to the $650 mark... the cheapest I saw was $550 on a quick Google search... I'm glad I bought when I did then, lol... I was actually considering getting one for a build I may be working on for a gamer/engineer who was looking for a dual-purpose PC, but the RX 480 may work for him... he only plays at 1440p anyway.


Sounds like they are done with production and retailers no longer have to adhere to MSRP;
the same thing happened to 290X prices when the Fury X came out,
though that was after release, not before, IIRC.


----------



## Medusa666

Has anyone changed the fan on a Radeon Pro Duo, or does anyone know someone who has?

I'm very happy with this card, but I would prefer a quieter fan; tinkering with such an expensive card is not something I'm too keen on, though.

Anyone got ideas?

Thanks


----------



## geriatricpollywog

Quote:


> Originally Posted by *Medusa666*
> 
> Has anyone changed the fan on a Radeon Pro Duo, or does anyone know someone who has?
> 
> I'm very happy with this card, but I would prefer a quieter fan; tinkering with such an expensive card is not something I'm too keen on, though.
> 
> Anyone got ideas?
> 
> Thanks


Doesn't it already have a Gentle Typhoon? You can certainly find something quieter, but the stock fan is as quiet or quieter than the next best fan at the same RPM. You are better off limiting the max RPM and taking a temperature hit.


----------



## lanofsong

Hey there R9 Fury/Nano/X/PRo DUO FIJI owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is so much time left and we really could use your help.

This event is truly a GLOBAL battle, with Team OCN going up against many teams from across the world, and while we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware that you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project - This starts 8pm EST 5/8/17 :

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you crunch on, you will be asked if you want to join a team - type in overclock.net (enter), then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful:









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects:

This really is an exciting and fun event; I look forward to it every year, and I am hoping that you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate









lanofsong

OCN - FTW


----------



## jearly410

Quote:


> Originally Posted by *Medusa666*
> 
> Has anyone changed the fan on a Radeon Pro Duo, or does anyone know someone who has?
> 
> I'm very happy with this card, but I would prefer a quieter fan; tinkering with such an expensive card is not something I'm too keen on, though.
> 
> Anyone got ideas?
> 
> Thanks


Looking at pictures of the Pro Duo leads me to believe the disassembly would be almost the same as on a Fury X. Replacing the radiator fan is pretty easy. See the third cord zip-tied to the water hoses? That plugs into a small 4-pin on the card. All that holds everything in place is a few zip ties (scissors), Phillips screws, and some very small hex screws on the enclosure. It doesn't require removing the waterblock or really anything dangerous.

After replacing the fan you can choose to either control it through an external fan hub, or plug into the card itself with an adapter (4-pin to mini 4-pin). There's no need to replace any TIM on the card, as it won't give you any temp benefits (I replaced mine with Antec diamond compound and thick Fuji pads). Removing the enclosure plates will drop the temps a bit, probably more so for the Duo.

I personally am very happy replacing the stock fan with a Phanteks 120mm PWM. You won't get much of a temp drop, if any; however, I don't notice the noise even at full speed (mine runs at 90% 24/7, connected to a Grid+). Sure, the Typhoons push a lot of air, but they get LOUD and I don't like the way they sound.


----------



## Offler

My particular Gigabyte R9 Fury X:

a) Coil whine.
Rarely in game menus, but it's something I found in many other games.

b) High-pitched noise from the water pump.
I don't even need to dismantle the card to be sure it's the old pump version, as it's from an early production batch.

c) It's not even a win in the chip lottery, as it's only slightly above 62% ASIC quality, but... it has all cores enabled.

Questions:
1. I got a suggestion to replace the pump ASAP.
General opinion on that? Was there any real issue aside from the sound?

2. RMA could be a problem.
The card costs 450 euro. There could be a problem with replacement within 30 days because the manufacturer has no more pumps, so they would take this card and offer a different one in that price range - which is something I don't want. Any suggestions?


----------



## huzzug

Wait until Vega launches. There's bound to be a card in that price bracket, and it may well perform better than the Fury X. That is, if you can wait and the warranty carries over.


----------



## Offler

Quote:


> Originally Posted by *huzzug*
> 
> Wait until Vega launches. There's bound to be a card in that price bracket, and it may well perform better than the Fury X. That is, if you can wait and the warranty carries over.


The warranty will last one more year, but the current releases of CPUs and GPUs by AMD were the reason I decided to go for the Fury X specifically.









Edit:
To be clear... I consider 64 ROPs enough, but compared to the overkill done by Nvidia it's not.


----------



## DigMan

Does anyone here know of any known-good custom Nano BIOSes out there? I think I solved my cooling issue; now it's just time to fix the power limit issue. Picture for reference.


----------



## drm8627

How long is the actual PCB on the R9 Fury Nitro by itself?


----------



## Ceadderman

Quote:


> Originally Posted by *drm8627*
> 
> How long is the actual PCB on the R9 Fury Nitro by itself?


Roughly 7". No longer. I have a 480, which is based on the Fury PCB, so they should be the same length. I just measured it, and it came to 7".









~Ceadder


----------



## drm8627

Well, the Nitro has a custom PCB, so I was just wondering. The only watercooling option I could find for the R9 Fury Nitro was roughly 13 inches long, the same as the stock heatsink. I was trying to figure out a way to get the R9 Fury Nitro to fit in a SilverStone FT03-Mini tower by swapping out the heatsink, but I'm not having any luck there.

Any suggestions?


----------



## diggiddi

Only option I know of is Alphacool


----------



## drm8627

Yeah, and that cooler is roughly 13 inches, isn't it? That's the dimension I came across.


----------



## Ceadderman

There is no waterblock solution for the Nitro other than Alphacool's plastic/disc insert covers, to my knowledge. I hafta say this again...

If you're wanting to watercool a GPU, stay away from multi-fan cards. They are the last option the block makers consider building blocks for. EK has a Fury and a Fury X block, but they simply ignore the Nitro. Never mind that there are so many multi-fan cards on the market with each generational release.









~Ceadder


----------



## drm8627

Nah, Alphacool makes a full-cover block for the Nitro specifically, but it's just as big as the fan cooler, from what I could find. It's surprisingly hard to find dimensions for it.

https://modmymods.com/alphacool-nexxxos-gpx-ati-r9-fury-m04-incl-backplate-black-11323.html


----------



## Skyl3r

So, applying any overclock in TriXX locks my cards' frequency to 300/500.
Afterburner worked until I enabled unofficial overclocking mode to be able to change the HBM; then it locked my cards' frequency to 300/500 as well. This is repeatable: I can disable unofficial overclocking and it begins working, then re-enable it and it doesn't.

I noticed that 300/500 seems to be the settings for power state 0 in Radeon Settings. This leads me to believe that somehow trying to apply an HBM overclock is preventing the card from changing power states.

I can remove this "lock" by either disabling and re-enabling crossfire or by rebooting my computer.

I first noticed this problem after installing my 3rd Fury X. I've repeated this problem on 1, 2 and 3 cards.
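The "stuck in power state 0" theory above is easy to check against logged clock samples. A small sketch; the 300/500 values are the P0 clocks reported in this post, not universal constants, and the sampling itself (e.g. from GPU-Z logs) is left out:

```python
def stuck_in_p0(samples, p0_core=300, p0_mem=500, tol=5):
    """Heuristic for the symptom described above: if every
    (core_mhz, mem_mhz) sample sits at the power-state-0 clocks
    while the card should be under load, the OC tool has likely
    wedged the card in P0. `samples` is a list of (core, mem)
    readings taken during a stress run; `tol` absorbs sensor jitter.
    """
    return all(abs(core - p0_core) <= tol and abs(mem - p0_mem) <= tol
               for core, mem in samples)
```

If even one sample climbs to a higher state, the card is switching states and the problem lies elsewhere.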


----------



## gupsterg

Quote:


> Originally Posted by *Skyl3r*
> 
> Are you asking me to get a multimeter out and hit the HBM VRMs? lol
> I think I'm missing something here
> 
> 
> 
> 
> 
> 
> 
> I didn't volt mod, if that's the question.
> 
> *EDIT:*
> Is there a way to adjust the HBM voltage from software?














.


----------



## Skyl3r

Well, the problem I mentioned might be related... One of my GPUs is definitely dead. The cards had been running fine; then I launched a game of StarCraft and the screen immediately started artifacting badly enough that nothing was intelligible anymore. The graphics driver crashed a few times.
So I closed SC2 and launched FurMark, and it ran at normal FPS with no artifacting. So I ran Fire Strike Ultra, and it artifacted.
Then I disabled one card and the artifacting immediately disappeared. I then disabled the other card instead and... still no artifacting...

So one of my cards is definitely damaged, but I can't seem to figure out which one because the problem goes away as soon as I disable crossfire...


----------



## Ceadderman

Likely it's your xFire and not your card(s). Try rolling back your driver or waiting til you can update your testing software.









~Ceadder


----------



## Skyl3r

Quote:


> Originally Posted by *Ceadderman*
> 
> Likely it's your xFire and not your card(s). Try rolling back your driver or waiting til you can update your testing software.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I tried uninstalling, running DDU then reinstalling and the problem persists.
I'll try rolling back drivers. That's a good idea.

*EDIT:*

I installed drivers from last month (17.4.1) and the problem persists. I'm relatively sure there's something wrong with at least one of the cards. I've got another test to run just to be sure though.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> I tried uninstalling, running DDU then reinstalling and the problem persists.
> I'll try rolling back drivers. That's a good idea.
> 
> *EDIT:*
> 
> I installed drivers from last month (17.4.1) and the problem persists. I'm relatively sure there's something wrong with at least one of the cards. I've got another test to run just to be sure though.


IIRC you're running Fury X's, so it's not likely to be an overheating primary GPU in crossfire. Off the top of my head, I'm thinking you have a crossfire overclock that's unstable on one of the cards. I ran into that problem with my HD 7950's. Either that, or both cards have the same overclock applied and one doesn't like it. I've had that happen too.

Hope it's one of these problems rather than a bad card.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> IIRC you're running Fury X's, so it's not likely to be an overheating primary GPU in crossfire. Off the top of my head, I'm thinking you have a crossfire overclock that's unstable on one of the cards. I ran into that problem with my HD 7950's. Either that, or both cards have the same overclock applied and one doesn't like it. I've had that happen too.
> 
> Hope it's one of these problems rather than a bad card.


I have a custom loop on my cards, and I don't think SC2 would load my GPU heavily enough at 1080p to fry it...
Also, I haven't been able to overclock these cards for the past few months, because applying an overclock locks both of the cards to a 300 MHz GPU clock.

I think my next bet is to test them on another motherboard, but that's going to be a real challenge.


----------



## drm8627

Hey guys question:
If I decided to watercool my Fury crossfire setup and do a very minimal overclock, would I get better minimum frames? I'm looking at upgrading to the 38UC99, and the resolution it uses is 3840x1600, almost 4K. The main issue these cards have at 4K is the variation from minimum to maximum framerate, and I'm afraid it'll be choppy.

Any suggestions?

Running a 4790k as well.


----------



## Tgrove

My Fury X crossfire setup isn't choppy. Just make sure you leave power efficiency off. Also monitor your clock speeds in game; if they fluctuate, then you'll need ClockBlocker.


----------



## drm8627

I meant in more graphically demanding games, like Metro or Crysis 3, etc.


----------



## Skyl3r

Quote:


> Originally Posted by *gupsterg*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .


Very interesting.
I was browsing the TriXX binary and saw this:


I was deeply hoping that if I changed that to 0, a slider would become available for MVDDC







Of course nothing is that easy







But I haven't played around with it much yet.

Also, quickly diverging back to my bad ("bad") GPUs: I have repeatedly stressed each individual card and cannot recreate any artifacting on individual cards.
Anyone following the 1700, 1700X, and 1800X owners' thread might recall that I ripped a PCIe slot off my motherboard. I wonder if it's possible that I somehow damaged the bus, causing crossfire to go crazy?

Because I have the GPUs on a custom loop, it's sort of difficult to test on another motherboard, but I get the sneaking suspicion it'll work fine.
I'll swap out my Fury X's for some super quick Radeon HD 4870 X2's and put the Fury X's on another board and see what happens.


----------



## Tgrove

Quote:


> Originally Posted by *drm8627*
> 
> I meant in more graphically demanding games, like Metro or Crysis 3, etc.


I meant in all games, those included. Only like 3 games I've played in like 2 years with this setup have completely blown out the VRAM with tweaked settings. 4GB of HBM for VRAM is actually pretty amazing.


----------



## Ceadderman

@Skyl3r Yes I remember that. I thought you replaced that board though. But since it looks like you didn't, it is certainly the best possibility that something was damaged and is causing your issue.









~Ceadder


----------



## diggiddi

Quote:


> Originally Posted by *Tgrove*
> 
> My Fury X crossfire setup isn't choppy. Just make sure you leave power efficiency off. Also monitor your clock speeds in game; if they fluctuate, then you'll need ClockBlocker.


errm I forget things, where is the power efficiency button again?


----------



## Skyl3r

Quote:


> Originally Posted by *diggiddi*
> 
> errm I forget things, where is the power efficiency button again?


----------



## diggiddi

Thx my man, repped you up. Does anyone adjust the texture filtering quality (top right)?


----------



## Tgrove

Yes, I leave that on high quality at all times. I also use 16x anisotropic filtering on a per-game basis, depending on whether the game has AF settings or not.


----------



## Skyl3r

Quote:


> Originally Posted by *Ceadderman*
> 
> @Skyl3r Yes I remember that. I thought you replaced that board though. But since it looks like you didn't, it is certainly the best possibility that something was damaged and is causing your issue.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Ceadder


I spent an hour or so last night playing a game on my first GPU with no issues.
I thought I'd be able to get away with just losing a PCIe slot, but it's looking pretty likely that it's not the cards.

To be honest, those GPUs have every right to be dead. I put them through hell, haha. But if they keep chugging along, I won't complain.


----------



## diggiddi

Quote:


> Originally Posted by *Tgrove*
> 
> Yes, I leave that on high quality at all times. I also use 16x anisotropic filtering on a per-game basis, depending on whether the game has AF settings or not.


Thanx, repped up, does it make a difference tho?


----------



## Tgrove

Quote:


> Originally Posted by *diggiddi*
> 
> Thanx, repped up, does it make a difference tho?


It makes a pretty big difference, actually. One of the most recent games I played was Dying Light, which has no AF settings, so I forced it through the Crimson control panel, and it makes a pretty noticeable difference.


----------



## bluezone

New drivers. Just a minor update for new games mainly.

Release notes and download links. http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.5.2-Release-Notes.aspx


----------



## gupsterg

It's borked for HBM performance







.



Spoiler: v16.12.2 WHQL 1050/500









Spoiler: v16.12.2 WHQL 1145/545









Spoiler: v17.5.1 non WHQL 1050/500









Spoiler: v17.5.1 non WHQL 1145/545









Spoiler: v17.5.2 non WHQL 1145/545







I also did some 3DM FS runs.

*1145/500* 3x v16.12.2 vs v17.5.2

*1145/545* 3x v16.12.2 vs v17.5.2.

Has AMD FineWine™ not been applied to Fiji? Are we seeing AMD GimpWorks™?
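For comparisons like these, a small helper that turns two sets of benchmark scores into percent deltas makes driver regressions easy to spot. A sketch only; the test names and scores below are placeholders for whatever your 3DMark Fire Strike or AIDA64 runs report:

```python
def pct_delta(old_score: float, new_score: float) -> float:
    """Percent change of a benchmark score between two driver
    versions; negative means a regression on the newer driver."""
    return (new_score - old_score) / old_score * 100.0

def compare_runs(baseline: dict, candidate: dict) -> dict:
    """baseline/candidate map test names to scores for two drivers.
    Returns {test_name: pct_change} for tests present in both runs."""
    return {name: pct_delta(baseline[name], candidate[name])
            for name in baseline.keys() & candidate.keys()}
```

Averaging three runs per driver before comparing, as done with the 3x runs above, keeps run-to-run variance from masquerading as a driver change.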


----------



## Starbomba

Quote:


> Originally Posted by *gupsterg*
> 
> It's borked for HBM performance
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: v16.12.2 WHQL 1050/500
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: v16.12.2 WHQL 1145/545
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: v17.5.1 non WHQL 1050/500
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: v17.5.1 non WHQL 1145/545
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: v17.5.2 non WHQL 1145/545
> 
> 
> 
> 
> 
> 
> 
> I also did some 3DM FS runs.
> 
> *1145/500* 3x v16.12.2 vs v17.5.2
> 
> *1145/545* 3x v16.12.2 vs v17.5.2.
> 
> 
> 
> Has AMD FineWine™ not been applied to Fiji? Are we seeing AMD GimpWorks™?


----------



## Alastair

So which drivers are the best? I can't view your graphs as I am not at my pc. Images are blocked.


----------



## gupsterg

For me, v16.12.2 WHQL. Every v17.x.x tested so far reports the HBM at xyz MHz but does not show the performance gain.


----------



## lanofsong

Hey R9 Fury/Nano/X/Pro DUO FIJI owners,

We are having our monthly Foldathon from Monday the 22nd to Wednesday the 24th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

May 2017 Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Mega Man

No, please stop spamming our thread monthly


----------



## huzzug

Hey man. It's just a monthly reminder to something OCN participates in and takes seriously. You can ignore it if you don't like it. Stop being mean.


----------



## geriatricpollywog

Quote:


> Originally Posted by *Mega Man*
> 
> No, please stop spamming our thread monthly


He's the editor. It's his thread.


----------



## Mega Man

If we did what they do every month, we would get infractions and/or be moderated.


----------



## domrockt

This AMD driver madness... oO, this is the only thing I regret about going from NV to AMD.
But nevertheless, this powerhouse is amazing for just 250€.


----------



## 99belle99

Quote:


> Originally Posted by *domrockt*
> 
> This AMD driver madness... oO, this is the only thing I regret about going from NV to AMD.
> But nevertheless, this powerhouse is amazing for just 250€.


250 euro? What card did you buy?


----------



## domrockt

I bought a used Fury Tri-X for 250€ and a used EKWB waterblock to cool it.
It is supported by an [email protected]

3440x1440 with everything maxed except shadows (a VRAM eater); 50-90 FPS across AAA titles. I love my Fury.








I bet I can sell it for 250€-300€ with my waterblock included when Vega releases.


----------



## geriatricpollywog

Quote:


> Originally Posted by *domrockt*
> 
> I bought a used Fury Tri-X for 250€ and a used EKWB waterblock to cool it.
> It is supported by an [email protected]
> 
> 3440x1440 with everything maxed except shadows (a VRAM eater); 50-90 FPS across AAA titles. I love my Fury.
> 
> 
> 
> 
> 
> 
> 
> 
> I bet I can sell it for 250€-300€ with my waterblock included when Vega releases.


Really? I get about 45 FPS in Watch Dogs 2 on high settings (not very high or ultra), 55 FPS in The Witcher 3 at max settings, and 60 FPS in Deus Ex on high (not very high or ultra). The only games over 60 FPS are either not AAA or older than the card, which is a Fury X.

In what AAA titles does a regular Fury get over 60 FPS at 3440x1440?


----------



## diggiddi

Quote:


> Originally Posted by *0451*
> 
> Really? I get about 45 FPS in Watch Dogs 2 on high settings (not very high or ultra), 55 FPS in The Witcher 3 at max settings, and 60 FPS in Deus Ex on high (not very high or ultra). The only games over 60 FPS are either not AAA or older than the card, which is a Fury X.
> 
> In what AAA titles does a regular Fury get over 60 FPS at 3440x1440?


I believe BF3/4 and Crysis 3, but you have to factor in your CPU too, e.g. in CPU-intensive titles like PCars.


----------



## geriatricpollywog

Quote:


> Originally Posted by *diggiddi*
> 
> I believe BF3/4 and Crysis 3, but you have to factor in your CPU too, e.g. in CPU-intensive titles like PCars.


BF3/4? Crysis 3? Again: titles older than the card. PCars is Nvidia-intensive, not CPU-intensive. And there is no way PCars can run at max settings over 60 FPS on a Fiji.


----------



## diggiddi

Quote:


> Originally Posted by *0451*
> 
> BF 3/4? Cry3? Again. *Titles older than the card*. Pcars is Nvidia intensive, not CPU intensive. And there is no way Pcars can run *at max settings* over 60fps on a Fiji.


Sounds like shifting goal posts to me. Take a chill pill dude, why so angry? I'm out


----------



## Skyl3r

Quote:


> Originally Posted by *0451*
> 
> Really? I get about 45 FPS in Watchdogs 2 in High settings (not very high or ultra), 55 FPS in Witcher 3 max settings, and 60 FPS in Deus Ex in High (not very high ot ultra). The only games over 60 FPS are either not AAA or older than the card, which is a Fury X.
> 
> In what AAA titles does a regular Fury get over 60 FPS at 3440x1440?


The only game that matters - StarCraft II!!


----------



## Charcharo

Quote:


> Originally Posted by *Skyl3r*
> 
> The only game that matters - StarCraft II!!


Well, it is a AAA game: PCMR to the core, popular, moddable...

Yeah, it matters more than Watch Dogs 2 IMHO (and SC > WD 2 as a game by a huge amount).

BTW guys, Half-Life 2 is also a AAA game. A weak RX 550 can do 4K at 100 FPS in it.


----------



## Skyl3r

Quote:


> Originally Posted by *Charcharo*
> 
> Well, it is a AAA game: PCMR to the core, popular, moddable...
> 
> Yeah, it matters more than Watch Dogs 2 IMHO (and SC > WD 2 as a game by a huge amount).
> 
> BTW guys, Half-Life 2 is also a AAA game. A weak RX 550 can do 4K at 100 FPS in it.


Something completely off topic: every once in a while I'll hear my roommate complain about his Nano not being fast enough, or another friend complain about needing SLI 1080 Ti's. I've got two Radeon HD 4870 X2's, and games like Call of Duty: Black Ops 2 and Fallout 4 run flawlessly on them. They cost a mere $20, lol.

But yeah, SC2 and SC:BW are the only games I play. I've bought a fair number of other games because I always think "Yeah, what I really need right now is a good RPG." Then I play it, and after listening to the lady giving you a quest talk for more than 30 seconds, my patience goes out the door.
I couldn't even beat the tutorial in DOTA 2. But StarCraft... that's a game that makes sense.

Anyways, yes, crossfire Fury X's and an 1800X were absolutely necessary for SC2, because of how well SC2 scales with core count and the excellent crossfire support (much sarcasm implied).


----------



## Starbomba

Quote:


> Originally Posted by *Skyl3r*
> 
> Something completely off topic: every once in a while I'll hear my roommate complain about his Nano not being fast enough, or another friend complain about needing SLI 1080 Ti's.


Solution: Get 2 Nanos







I'm seriously considering that, after Vega of course, just for BOINC and some lulz.

I dunno what many of these people are playing, or what false ideas they had, but for 4K gaming my Nano has been nothing short of stellar. In the new DOOM I average 50-55 FPS on Ultra (I have not tried Nightmare); in Total War: Warhammer I easily get over 40 FPS with everything maxed out but shadows; the new Unreal Tournament runs at 50 FPS; the new HITMAN averages 47 FPS on DX12; Wolfenstein: The New Order and The Old Blood do not dip below 40 FPS; Company of Heroes 2 runs butter smooth at 60+ FPS all maxed out; and SCII runs stellar as well. The only changes I've made are a 15% higher power limit, a small undervolt on the core, and an HBM overclock to 575 MHz. It is not even at the full 50% power limit (so it is not pegged at 1000 MHz), nor is the core overclocked, so if I wanted I could still get more juice out of it.

This is what gives me hope for Vega: if they manage to sort out their memory-controller glitches (considering Fury is the first chip using HBM, and they do not have Nvidia's deep R&D pockets, it's somewhat forgivable) and the clock speed can really be cranked up to 1500 MHz+, Vega will be a beast.

Quote:


> Originally Posted by *Skyl3r*
> 
> Anyways, yes, crossfire Fury X's and an 1800X were absolutely necessary for SC2, because of how well SC2 scales with core count and the excellent crossfire support (much sarcasm implied).


Indeed, SCII was the reason I got my E5-2670 and ran 290X crossfire at 1080p before my Nano, and it will be the reason I get Threadripper and a Vega Xfire setup. /sarcasm


----------



## Skyl3r

Quote:


> Originally Posted by *Starbomba*
> 
> Solution: Get 2 Nanos
> 
> 
> 
> 
> 
> 
> 
> I'm seriously considering that, after Vega of course, just for BOINC and some lulz.


He actually does have 2 Nanos.








From what I've been told (I have never experienced it), the only time these Fiji cards become a problem is when more than 4GB of VRAM is needed.
I've seen that issues can occur in games like Need for Speed (I don't know which one).
Quote:


> Originally Posted by *Starbomba*
> 
> This is what gives me hope for Vega, if they do manage to sort out their memory controller glitches (considering Fury is the first chip using HBM and they do not have the deep pockets of Nvidia for R&D it's somewhat forgivable) and the clock speed can really be cranked up to 1500 MHz+, Vega will be a beast.
> Indeed, SCII was the reason i got my E5 2670 and had a 290X Crossfire on 1080p before my Nano, and will be the reason i get Threadripper and a Vega Xfire setup /sarcasm


Vega will be pretty interesting. I am not sure if I will invest in it personally, but I have high hopes for it being real competition for Nvidia.
If you're planning on playing SC2, you might wanna wait for that Starship 48c/96t processor and dual-socket boards. You don't want to risk not having one thread per unit!


----------



## xkm1948

Anyone tried to use their Fury to mine Crypto coins?


----------



## neurotix

Quote:


> Originally Posted by *Starbomba*
> 
> Solution: Get 2 Nanos
> 
> I'm seriously considering that, after Vega of course, just for BOINC and some lulz.
> 
> I dunno what is many of these people playing, or what false ideas they had, but for 4K gaming, my Nano has been nothing less but stellar. On the new DOOM, i can average 50-55 FPS's on Ultra (have not tried Nightmare), on Total War: Warhammer i can easily get over 40 FPS with everything maxed out but shadows, the new Unreal Tournament runs at 50 FPS, the new HITMAN averages 47 FPS on DX12, Wolfenstein New Order and Old Blood do not dip below 40 FPS, Company of Heroes 2 runs butter smooth at 60+ FPS all maxed out, SCII runs stellar as well, and the only change i've made is 15% more power limit, a small undervolt on the core, and overclocked HBM to 575 MHz. It is not even at full 50% power limit (so it is not pegged at 1000 MHz) nor it is overclocked so if i wanted i could still have more juice to get out of it.
> 
> This is what gives me hope for Vega, if they do manage to sort out their memory controller glitches (considering Fury is the first chip using HBM and they do not have the deep pockets of Nvidia for R&D it's somewhat forgivable) and the clock speed can really be cranked up to 1500 MHz+, Vega will be a beast.
> Indeed, SCII was the reason i got my E5 2670 and had a 290X Crossfire on 1080p before my Nano, and will be the reason i get Threadripper and a Vega Xfire setup /sarcasm


How are you getting 40+ FPS in the Wolfenstein games? What settings and resolution?

I have both of them and want to play them but they run like... dog turds.

I was trying to run them at 5760x1080 (well, duh, because they support it and I can) and no matter what settings I changed, I could not get above 24 fps or so, with the average being 20 fps. I reduced everything to low and was still only getting 20 fps. I have no clue if those games support Crossfire or not but I don't think they should run that badly, even on a single card, at my resolution. Hot garbage I tell you.

DOOM 2016 plays fine on my system at that res, even in OpenGL 4.2, and with Vulkan it's like butter. Different engine, I know. Am I missing something here (a performance mod or driver settings or something)?


----------



## diggiddi

Quote:


> Originally Posted by *xkm1948*
> 
> Anyone tried to use their Fury to mine Crypto coins?


Yeah, I was running NiceHash for the past day; rates were decent initially but they seem to have tanked.


----------



## wege12

Quote:


> Originally Posted by *xkm1948*
> 
> Anyone tried to use their Fury to mine Crypto coins?


I've been mining Ethereum with my Fury X for over a year now and it does a great job. I've made more money from this than most people make in a year!


----------



## Minotaurtoo

Quote:


> Originally Posted by *wege12*
> 
> I've been mining Ethereum with my Fury X for a over a year now and it does a great job. I've made more money from this then most people make in a year!


I've been thinking of mining myself... but according to the estimates I've seen, a Fury X would barely pay for itself in a year... how are you making so much?


----------



## wege12

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I've been thinking of mining myself... but according to the estimates I've seen a fury x would barely pay for itself in a year... how are you making so much?


I started mining Ethereum when it was worth less than $3, so I got in pretty early. Back then the difficulty wasn't nearly as high, so I earned ether much quicker than you can today. I also built a couple of dedicated miners with a total of 16 graphics cards.

If you're going to get into mining now, don't get a Fury X. It's way too expensive and power hungry. I would recommend a 470/570 or 480/580.


----------



## Minotaurtoo

Quote:


> Originally Posted by *wege12*
> 
> I started mining ethereum when it was worth less than $3, so I got in pretty early. Back then, the difficulty wasn't nearly as high so I earned ether much quicker than you can today. I also built a couple of dedicated miners with a total of 16 graphics cards.
> 
> If you're going to get into mining now, don't get a fury X. It's way too expensive and power hungry. I would recommend 470/570 480/580.


I might... thanks for the info... +rep


----------



## xkm1948

If I already have a Fury X and would like to earn some extra bucks by mining, would it be worth it now?


----------



## diggiddi

Give it a shot, you never know. I dabbled a little in 2013 with my $300 7950 I bought on eBay, then forgot all about it. I made a little less than 1 bitcoin; 4 years later it has paid for itself and then some.
You can start out with NiceHash; they just released 2.0.


----------



## Ceadderman

You won't get much, but if you can live with modest gains (~$200/yr) then why not.

The gains could be less than that by a substantial margin; that figure is just what I last read someone pulling from a 480 in the Bitcoin mining thread.

Results should skew positive for a Fury, but I wouldn't expect a healthy increase in gains.

Soon as I can I will throw up a link to the thread here so you can see for yourself what the more experienced members are doing as far as mining goes.

~Ceadder
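That kind of ballpark is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch; every figure below is a hypothetical placeholder, not a measured Fury X number, so plug in your own:

```python
# Back-of-envelope mining profitability sketch. Every figure below is a
# hypothetical placeholder, not a measured Fury X number -- plug in your own.
hashrate_mhs = 28.0          # assumed Ethash hashrate in MH/s
power_draw_w = 220.0         # assumed wall power draw in watts
revenue_per_mhs_day = 0.07   # assumed USD earned per MH/s per day
electricity_per_kwh = 0.12   # assumed USD per kWh

daily_revenue = hashrate_mhs * revenue_per_mhs_day
daily_power_cost = (power_draw_w / 1000.0) * 24 * electricity_per_kwh
daily_profit = daily_revenue - daily_power_cost

print(f"revenue ${daily_revenue:.2f}/day, power ${daily_power_cost:.2f}/day")
print(f"net ~${daily_profit * 30:.2f}/month")
```

The sign of the result flips quickly as difficulty, coin price, and your electricity rate move, which is why estimates in this thread vary so much.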


----------



## xkm1948

Looks like it ain't worth the trouble. I was hoping for $50 per month net gain so it could at least help pay my phone bill. Thanks anyway!


----------



## Starbomba

Quote:


> Originally Posted by *neurotix*
> 
> How are you getting 40+ FPS in the Wolfenstein games? What settings and resolution?
> 
> I have both of them and want to play them but they run like... dog turds.
> 
> I was trying to run them at 5760x1080 (well, duh, because they support it and I can) and no matter what settings I changed, I could not get above 24 fps or so, with the average being 20 fps. I reduced everything to low and was still only getting 20 fps. I have no clue if those games support Crossfire or not but I don't think they should run that badly, even on a single card, at my resolution. Hot garbage I tell you.
> 
> DOOM 2016 plays fine on my system at that res, even in OpenGL4.2, and with Vulkan it's like butter. Different engine, I know. Am I missing something here (performance mod or driver settings or something???)


I had that issue with older drivers, but after installing the latest one (17.5.2) all issues went away. I'll see if I can install New Order again, since I deleted it due to lack of space.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Starbomba*
> 
> I had that issue with older drivers, but after installing the latest one (17.5.2) all issues went away. I'll see if i can install New order again since i got it deleted due to *lack of space*.


yeah I've had that same issue... 8TB just isn't enough any more

.... seriously though... I thought I would never use all of 8TB, but the other day I caught myself cleaning off my storage drives... btw, even my slowest drives work well for most games... I only install really harsh games on my SSD.


----------



## ramos29

Hi everyone, I'm facing random PC reboots; the error log shows Kernel-Power 41.
I have an Antec 80+ 1200W PSU and two Radeon Furys.
Some people say it's a faulty PSU, others say it's related to the "fast boot" option.
Can someone tell me what may cause this?
If I use a good inverter, can it prevent my PC from shutting down if the PSU fails to deliver at some point?


----------



## diggiddi

Quote:


> Originally Posted by *ramos29*
> 
> hi every one, i am facing random pc reboot, in the error log it shows me kernel power 41
> i have an antec 80+ 1200w, two radeon fury
> some people say it is a faulty psu, others say it is related to the "fast boot" option..
> can someone tell me what may cause this
> if i use a good inverter can it prevent my pc from shutting down if the psu failed to deliver at some point?


Sounds like a power issue. Make sure all your connections are tight; unseat and reseat connectors etc. to eliminate them as the source.
Do the restarts happen when you are under high load? You might want to disconnect one GPU, or undervolt both of them, to see if the issue continues.
I had the same problem under high loads; I undervolted by -30mV and am now waiting on my new PSU to get here.
If the PSU is bad, connecting it to an inverter will not fix what is wrong with it, but I could be wrong about that.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> yeah I've had that same issue... *8TB just isn't enough any more*
> 
> 
> .... seriously though... I thought I would never use all of 8TB, but the other day I caught myself cleaning off my storage drives... btw even my slowest drives works well for most games... I only install really harsh games on my SSD.


Whoaaa, not quite there yet. You must have a bunch of movies stored on your drives, or every single Steam game. Which is it?


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> Whoaaa not quite there yet, you must have a buunch of movies stored on your drive??
> or every single steam game , which is it ??


long story actually... but the short of it is, me and the wife both had extensive music and DVD collections when we got married... well, we moved from a huge place to a smaller house... had to get rid of a lot of stuff... so we ripped all our DVDs and music and put the discs in storage boxes under our current house (it has a sorta large space under the floor). Now combine that with over 3000 games installed and yeah... space gets cramped... only about 15 of the games are over 1GB in size though... a few top 7GB... also all our home videos are on here too...


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> long story actually... but the short of it is me and the wife both had extensive music and dvd collections when we got married... well... we moved from a huge place to a smaller house.... had to get rid of a lot of stuff... soooo we ripped all our dvd's and music and put the disks in storage boxes under our current house (it has a sorta large space under the floor) Now combine that with over 3000 games installed and yeah... space gets cramped... only about 15 of the games are over 1GB in size though... a few top 7GB.... also all our home videos are on here too...


I figured, but 3000 games... how do your eyes still work?


----------



## prom

Quote:


> Originally Posted by *neurotix*
> 
> How are you getting 40+ FPS in the Wolfenstein games


The trick to high framerates in Bethesda's Wolfenstein games is threefold:

Make sure you've made the *MachineGames\Wolfenstein The New Order* storage folders in your *AppData\Local* directory.
The game doesn't make them by default.

Next, crank everything to max but *keep the PPF at 16*, iirc. Bumping that up doesn't seem to make a visual change, but the performance hit is huge.

And if that doesn't cut it, you can change screenspace reflections to _medium_ instead of _high_ without a noticeable visual difference.
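For reference, the folder-creation step can be scripted. A minimal sketch, assuming the path named in the tip above (`%LOCALAPPDATA%\MachineGames\Wolfenstein The New Order`), with a home-directory fallback so it also runs outside Windows:

```python
import os

# Create the storage folders the game expects but doesn't make by default.
# Path assumed from the tip above; falls back to the home directory when
# %LOCALAPPDATA% is not defined (i.e. outside Windows).
base = os.environ.get("LOCALAPPDATA", os.path.expanduser("~"))
target = os.path.join(base, "MachineGames", "Wolfenstein The New Order")
os.makedirs(target, exist_ok=True)  # safe to re-run; no-op if it exists
print(target)
```

`exist_ok=True` means running it twice does no harm, so you can fire it off before every launch if you like.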

I just played through it on my single Nano (1080p), and it was smoooooth as butter.
There were occasional momentary dips, but I credit those to the ****tily optimized game engine (id Tech 5) and my tired 6600K.

Anyway, a really fun game!


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> I figured, but 3000 games... how do your eyes still work?


Keep in mind I've been a pc gamer since the early 90's...


----------



## Ceadderman

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> I figured, but 3000 games... how do your eyes still work?
> 
> 
> 
> Keep in mind I've been a pc gamer since the early 90's...
Click to expand...

Same here! Still need to get and rip a copy of Phantasmagoria to a drive. For some reason, I never did finish that one. Probably due to D00M, King's Quest and Leisure Suit Larry taking much of my time when I had to return my friend's copy of it (he soon became not a friend, for unfriendlike reasons), and I never found a copy for sale anywhere afterward. Lotta love for the game though. Should look ridiculously smooth on modern hardware.

~Ceadder


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ceadderman*
> 
> Same here! Still need to get and rip a copy of Phantasmagoria to a drive. For some reason, I never did finish that one. Probably due to D00M, King's Quest and Leisuresuit Larry taking much of my time when I had to return my friend's(who soon became not a friend for unfriendlike reasons) copy of it and I never found a copy for sale anywhere afterward. Lotta love for the game though. Should look ridiculously smooth on modern hardware.
> 
> ~Ceadder


I'm starting to want to dig out my Leisure Suit Larry... you are the second person in as many days to bring that game up... My son was playing Doom last week... and yes, the old version of Doom at that... apparently you can get a cell phone version... I had to LOL a bit since he thought it was a new game :)


----------



## Ceadderman

I am around teens all the time. As you can imagine, I am... much of the time.

Today, for instance, there was a teen in the store with a hairdo worse than a Hair Bear. I said something to one of my coworkers about it, and he was... "a what?" I had to take him on a trip down memory lane (Google) to explain it.

lol

~Ceadder


----------



## ramos29

It happened while playing games like PES 17, NieR: Automata, Battlefield... and I was running all those games with Crossfire disabled. I think I have a problem in the 12-pin motherboard connector; I will try to fix it and see if that sorts out the problem.
I can't afford another PSU, as it would cost me a kidney in my country, and I already lost a 295X2 to this Kernel-Power stuff. For the first months with the Furys I had no issues, and now the Kernel-Power error has surfaced again.


----------



## xkm1948

This crypto mining is getting insane. Fury X on fleabay is reaching 700 bucks. I'm seriously thinking of selling mine and getting a 1080 Ti instead.

What is the best crypto coin to mine now?


----------



## Skyl3r

Quote:


> Originally Posted by *xkm1948*
> 
> This crypto mining is getting insane.FyryX on fleabay is reaching 700 bucks. Im seriously thinking of selling mine and get a 1080ti instead.
> 
> What is the best crypto coin to mine now?


Ethereum, Ethereum Classic, Zcash and Monero. Those seem to be the biggest bang for the buck right now.
Fury Xs do pretty well at ETH, but if you could get $700 for yours, you could pick up two 1070s, which will do better in the hash/watt department and similar in raw hash.

I have been checking eBay every once in a while looking for a Fury X and noticed that the prices on these things have soared through the roof. Glad I got mine for $250 on eBay a month or two ago.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Skyl3r*
> 
> Ethereum, Ethereum classic, Zcash and Monero. Those seem to be the biggest bang for the buck right now.
> Fury X's do pretty good at eth, but if you could get $700 for it, you could pick up 2 1070's which will do better in the hash/watt department and similar in hash.
> 
> I have been checking ebay every once in a while looking for a Fury X and noticed that the prices on these things have soared through the roof. Glad I got mine for $250 on ebay a month or two ago


Well, that does explain the soaring prices... I got mine new for $380 a bit back and was thinking of getting another Fury X if they were still that price... then I looked and my heart sank... I was like OMG WTH?!?! I was expecting a price drop with the new generation of Nvidia cards and the upcoming AMD Vega... but nope.


----------



## Skyl3r

Quote:


> Originally Posted by *Minotaurtoo*
> 
> well that does explain the soaring prices... I got mine for 380 new a bit back and was thinking of getting another fury x if they were still that price..... then I looked and my heart sunk.... I was like OMG WTH?!?!?!?! I was expecting a price drop with the new generations of Nvidia and the upcoming amd vegas.... but nope.


Had the same exact idea. Crazy how much these prices can change in the course of a few weeks.


----------



## gupsterg

I'm getting tempted to place the Fury X on eBay if an FVF promo comes up. I had thought I'd keep it as a memento of the 1st HBM card to market. Dunno.

Don't wanna go Nvidia for a GPU. Would I be able to hold out for Vega...?

So I may just keep going with the Fury X.


----------



## Starbomba

I am not into mining, however when I tried to look for a second Nano to Crossfire, I was like "What in the world happened?!"

Glad I bought my first Nano for $245 tho. Hopefully Vega greatly outperforms these (Fiji should sit between a GTX 1060 and a 1070) so they'll come down in price and I can have a Crossfire Nano setup.

Either that, or AMD releases a Nano successor. I'd buy two of those in a heartbeat.


----------



## Minotaurtoo

I've seen a couple of used Fury Xs going for around $400 on eBay this morning.


----------



## Digitalwolf

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I've seen a couple used fury x's going for around 400$ on ebay this morning.


Something happened to my Fury X's... not sure if it was a motherboard malfunction or the cards. I still have my EK blocks etc., so I had been looking off and on... at one point I remember seeing them on either Newegg or Amazon for $3xx for brand-new cards... then when I decided to actually go and buy some, they were no longer listed... I see them here or there for more than I paid at launch...

and used on eBay for $400-ish... like you said.

It does make me wonder... what the hell... I should have pulled the trigger when I saw them for 370-something on Newegg.

Oh, and in regards to mine... I came home one day and turned on my computer. There was a flash and some smoke, so I shut everything off... Both PCIe slots on my motherboard were partially apart and melted around the power part of the slot. Both GPUs literally had the metal fingers that go into the slot melted into a slag-like look... this was a setup I had been running daily for a bit over a year when it happened. Stock BIOS on the cards... didn't OC them, because to me it didn't seem worth it, and there were no leaks (the water level was still full after the event), etc.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Digitalwolf*
> 
> Something happened to my Fury X's... not sure if it was a motherboard malfunction or the cards. I still have my EK Blocks etc so I had been looking off and on... at one point I remember seeing them on either Newegg or Amazon for $3xx for brand new cards... then when I decided to actually go and buy some... they were no longer listed... I see them here or there for more than I paid at launch...
> 
> and used on Ebay for $400'ish.... like you said.
> 
> It does make me wonder... what the hell... I should have pulled the trigger when I saw them for like 370 something on Newegg.
> 
> Oh and in regards to mine... I came home one day and turned on my computer. There was a flash and some smoke so I shut everything off... Both PCIE slots on my motherboard were partially apart and melted around the power part of the slot. Both the GPU's literally had the metal fingers that go into the slot melted into a slag like look... this was a setup I was running daily for a bit over a year when this happened. Stock bios on the cards.. didn't oc them because to me it didn't seem worth it and didn't have leaks (water level was still full after the event) etc


Kinda scary when something unknown happens like that... Had a mainboard go up once... took the better part of a day to trace the problem down to a bad VRM... but really it was my fault, I was pushing a cheapo board to the limit with a Bulldozer chip.


----------



## neurotix

So, I seem to have fixed Wolfenstein: The New Order (and probably The Old Blood) on my setup.

Before, I would barely manage 24 fps in Eyefinity, with dips to below 5 fps, and this was with everything on low.

Now the game seems to run at 50+ fps with most everything on Ultra and 16 PPF. I can't say exactly what the fps is (you'll see why later), but it's definitely *much, much* smoother with a stable framerate.

I did what prom suggested earlier in this thread,
Quote:


> The trick to high framerates in Bethesda's Wolfenstein games is 3 fold:
> 
> Make sure you've made the MachineGames\Wolfenstein The New Order storage folders in your AppData\Local directory.
> The game doesn't make them by default.
> 
> Next crank everything to max, but keep the PPF to 16 iirc. Bumping that up doesn't seem to have a visual change, but the performance hit is huge.
> 
> And if that doesn't cut it, you can change screenspace reflections to medium instead of high without a noticeable visual difference


On top of that, the big trick I learned (from a reddit thread, I think) is *don't use MSI Afterburner or any other monitoring software with an OpenGL plugin, because it halves FPS*. Once I stopped using PlayClaw 5 (similar to the Afterburner OSD), my low-FPS problem went away totally.

The game isn't even really that demanding on a Fury/Fury X, even in Eyefinity; I'm fairly sure I'm getting a steady 60 fps, and my top card isn't even heating up. The fans barely even come on.

I'm pretty sure Crossfire does *NOT* work, because the fans on my bottom card never start spinning (I have them set to turn off totally unless the cards pass 45C or so). Apparently the OpenGL engine they used for these two games does not support Crossfire, which makes sense because DOOM 2016 didn't support Crossfire in OpenGL mode either. If you want to try to get Crossfire working, try this thread on Hardocp. You can still set it up this way with Crimson; the interface to do so is just a little different, but the profile is still there. Unfortunately, I tried this and it didn't seem to work, considering my bottom card's fans never came on.

It doesn't matter much anyway; as I said, a single Fury seems to be enough to run this game on Ultra even at 5760x1080, and probably at 4K too.

Hope this helps anyone having the same problems.


----------



## ABACABB

As I saw mentioned earlier in the thread, I am considering selling my Fury X and buying a 1080 Ti, or hopefully holding out for Vega. I am not getting tremendous performance out of this card at 2K 144Hz. I will not screw around with Crossfire, because in my experience the cost/benefit isn't there, and certainly not the support.

Quote:


> Originally Posted by *neurotix*
> 
> The game isn't even really that demanding on a Fury/Fury X, even in Eyefinity, I'm fairly sure I'm getting a steady 60 fps, and my top card isn't even heating up. Doesn't matter much anyway, as I said, a single Fury even seems to be enough to run this game on Ultra even at 5760x1080 and probably at 4K too.


I am not sure why you are shilling a two-year-old card so hard. I have a Fury X and it is most certainly the bottleneck in my computer, even with an i7-2600K.

I have a 4K monitor, and at 60fps it is mediocre at best. It artifacts consistently in Battlefront. On a 2K 144Hz monitor it can barely push over 100fps in For Honor on medium settings, while my CPU utilization is only at 20%.


----------



## Skyl3r

Quote:


> Originally Posted by *ABACABB*
> 
> I will not screw around with Crossfire because in my experience the cost benefit isn't there and certainly not the support.


Could you name some graphically demanding games that don't support Crossfire or have problems with it? Where are you getting this information from?
Quote:


> Originally Posted by *ABACABB*
> 
> I am not sure why you are shilling a 2 year old card so hard. I have a Fury X and it is most certainly the bottleneck in my computer even with an i7 2600K.


I've been preaching for a while that spending a lot on CPUs is normally pointless (as I parade around with my 1800x). You can get an i5 for $80 that will not bottleneck most modern games. So saying that your GPU is the bottleneck, therefore it's bad (especially at 2K 144Hz), is ridiculous. This is expected and normal.
Quote:


> Originally Posted by *ABACABB*
> 
> I have a 4K monitor, and at 60fps it is mediocre at best. It artifacts consistently in BattleFront. On a 2k 144hz monitor it barely can push over 100fps in For Honor on medium settings while my CPU utilization is only at 20%.


"_at 60fps it is mediocre at best_" - what?
If you're getting artifacting, I think you have other problems to look into. I don't see the point in complaining about "barely pushing over 100fps" either; I don't know what you're expecting.

Also, it's silly to accuse neurotix of "shilling". I don't think he was wrong about anything he said. I think he is being realistic, and you're expecting wild and unrealistic results.


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> Could you name some graphically demanding games that don't support xfire or have problems with it? Where are you getting this information from?
> I've been preaching for a while that spending a lot on CPUs is normally pointless (as I parade around with my 1800x). You can get an i5 for $80 that will not bottleneck most modern games. So trying to say that your GPU is the bottleneck therefore it's bad (especially at 2k 144hz) is ridiculous. This is expected and normal.
> "_at 60fps it is mediocre at best_" - What?
> If your getting artifacting I think you have other problems to look into. I don't see the point in complaining about "barely pushing over 100fps" either. I don't know what you're expecting.
> 
> Also, it's silly to accuse neurotix of "shilling". I don't think he was wrong about anything he said. I think he is being realistic and you're expecting wild and unrealistic results.


REP+1. well said.


----------



## neurotix

Thank you Skyler, repped+

I never said the card was the perfect 4K card, everyone here knows me and knows I run two of them. In the majority of the games I play, it's enough for my resolution (5760x1080), and a good majority of popular games from 2010 on support Crossfire. I could rattle off a list of just how many games support it, and how many I can max out with this setup, but I won't. In fact, members here can tell you that many times I have questioned the long-term viability of my Furys, given the low amount of VRAM and ever increasing demands of games.

I was trying to offer HELPFUL ADVICE, simply stating that with the right configuration and no overlay software, Wolfenstein: The New Order runs very fluidly in Eyefinity and doesn't seem very taxing. That's not the case with every game, or even most games; there are games I have that are even older that utilize both cards well and heat them up a lot, for example Far Cry 3 (2012). Far Cry 3 actually uses both my cards and they get up into the high 50s.

If anything, YOU could be suspected of being a shill, as the first thing you said was "buy a 1080 Ti". And you joined yesterday and have no rep, whereas most people on this site know me. Why should anyone listen to what you have to say? Anyway, I'll drop it; I don't personally think you really are a shill, but hopefully you see my point. That kind of attitude is not going to make you very popular here.

Please continue guys.


----------



## bluezone

New drivers. Crimson Relive 17.6.1.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.6.1-Release-Notes.aspx


----------



## gupsterg

Have you by chance installed it and seen whether HBM performance is still being gimped?


----------



## Ceadderman

Nice Genesis reference with your SN, Aba!

~Ceadder


----------



## xkm1948

17.6.1 seems to give better performance in DX12. My Time Spy default score went from ~5400 to 5550.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Have you by chance installed and seen if HBM performance being gimp'd?


I just got around to installing it and testing.

Here's a comparison, with 16.12.2 on the left.


Spoiler: Warning: Spoiler!


Still gimp'd on memory read.


----------



## gupsterg

+rep. Memory copy: 378869 vs 336307.

Toying with the idea of placing the Fury X on eBay; tonight ends a £1 FVF promo. But then I currently don't know what I'd replace it with.

Green team is out for me, it's red or dead.

Polaris doesn't float my boat. I had Hawaii before and Fiji was a step up for me (a no-cost upgrade at the time). I was viewing this RX 580 review, where no results are recycled and all cards get fresh tests; that's Polaris at 1450MHz vs Fiji at 1050MHz, and still a 15% performance lead for Fiji. I really love the quietness and cooling of the Fury X's AIO unit. Perhaps I should keep it till Navi and forgo Vega.

Opinions, fellow Fiji owners?


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> +rep. Memory copy: 378869 vs 336307.
> 
> Toying with the idea of placing the Fury X on eBay; tonight ends a £1 FVF promo. But then I currently don't know what I'd replace it with.
> 
> Green team is out for me, it's red or dead.
> 
> Polaris doesn't float my boat. I had Hawaii before and Fiji was a step up for me (a no-cost upgrade at the time). I was viewing this RX 580 review, where no results are recycled and all cards get fresh tests; that's Polaris at 1450MHz vs Fiji at 1050MHz, and still a 15% performance lead for Fiji. I really love the quietness and cooling of the Fury X's AIO unit. Perhaps I should keep it till Navi and forgo Vega.
> 
> Opinions, fellow Fiji owners?


Well, it looks like I was more wrong than right about the leaked benchmarks that could've been Vega (1200 MHz). I had thought the benches were something along the lines of an updated Fiji that would never much see the light of day. But I was sort of partially right too: it seems the new Mac Pro will likely be using this version of Vega, so not a regular PC part per se.

Speculation I've read also suggests the long ramp-up to Vega has ruled out a Vega 2, and that RTG will move directly to Navi after Vega. So the wait for Navi might be relatively short.

Just my two cents.


----------



## xkm1948

Quote:


> Originally Posted by *gupsterg*
> 
> +rep. Memory copy: 378869 vs 336307.
> 
> Toying with the idea of placing the Fury X on eBay; tonight ends a £1 FVF promo. But then I currently don't know what I'd replace it with.
> 
> Green team is out for me, it's red or dead.
> 
> Polaris doesn't float my boat. I had Hawaii before and Fiji was a step up for me (a no-cost upgrade at the time). I was viewing this RX 580 review, where no results are recycled and all cards get fresh tests; that's Polaris at 1450MHz vs Fiji at 1050MHz, and still a 15% performance lead for Fiji. I really love the quietness and cooling of the Fury X's AIO unit. Perhaps I should keep it till Navi and forgo Vega.
> 
> Opinions, fellow Fiji owners?


Depends on how much you can fetch for your Fury X.

If I can sell mine for around $500 I would go for a 1080 Ti for sure. The performance improvement is good enough.

I've also had my Fury X since launch. My feelings are mixed, both positive and negative. This will probably be my last RTG GPU for a very long time; performance-wise RTG is just so far behind nVidia.

So I say if you can sell it for a good price, go team green. Going by their track record so far, RTG won't be competitive for quite a long time.


----------



## CptAsian

Quote:


> Originally Posted by *gupsterg*
> 
> +rep. Memory copy ..... 378869 vs 336307 .....
> 
> Toying with the idea of placing the Fury X on ebay; tonight ends a £1 FVF promo. But then I currently don't know what I'd replace it with.
> 
> Green team is out for me, it's red or dead.
> 
> Polaris doesn't float my boat. Had Hawaii before and Fiji was a step up for me (a no-cost upgrade at the time). I was viewing this RX 580 review, where no results are recycled and all cards were freshly tested; that's Polaris at 1450MHz vs Fiji at 1050MHz and Fiji still holds a ~15% performance lead. I really love the quietness and cooling of the Fury X's AIO unit. Perhaps I should keep it now til Navi and forgo Vega.
> 
> Opinions, fellow Fiji owners?


I'm in the exact same boat as you. Not seriously considering it, just toying with the idea of selling my two R9 Furys and trying to make a profit (I bought each for about $230). But I just don't know where to go from there. I'd rather have AMD in my gaming rig (that's where they're gonna be transferred soon), and no other options look appealing.


----------



## gupsterg

@xkm1948

Fury X is going for ~£300-£350. Compared with what I paid for it, I'd make a tidy return even after ~1yr 3mths of use. Now a GTX 1080 Ti is ~£630; that's the kind of money I've never spent on a GPU, TBH. I'd plough in ~110% more for a ~72% performance jump, plus I lose variable refresh rate tech; not gonna pay the nVidia tax for G-Sync.

GTX 1080 is ~£390, which is about the utter limit I'd pay for a GPU after flogging the Fury X and adding to the pot (outright I'd never pay that kind of £). Then again, not gonna pay the nVidia tax for G-Sync, so again it seems a false economy/swap-out.

The GTX 1070 isn't on my radar; just ref'ing the RX 580 review I linked before, it just isn't a worthwhile change over the Fury X, plus the nVidia tax on G-Sync.

So I'm back to using the Fury X / waiting for AMD to release a successor.
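The upgrade arithmetic above works out as a quick sketch; all figures are the ballpark numbers from this post (resale ~£300, GTX 1080 Ti ~£630, ~72% faster per the linked review), not measured data:

```python
# Quick sanity check on the cost/performance trade-off discussed above.
# All figures are the rough numbers quoted in the post, not measurements.

def extra_outlay_pct(card_price, resale_value):
    """Extra money ploughed in, as a percentage of the resale value recovered."""
    return (card_price - resale_value) / resale_value * 100

fury_x_resale = 300.0   # GBP, low end of the quoted ~300-350 range
gtx_1080_ti = 630.0     # GBP, quoted street price
perf_gain_pct = 72.0    # % faster than Fury X per the linked review

outlay = extra_outlay_pct(gtx_1080_ti, fury_x_resale)
print(f"~{outlay:.0f}% extra outlay for ~{perf_gain_pct:.0f}% more performance")
```

Which lines up with the "~110% in for ~72% out" figure quoted.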

@CptAsian

I did not place it on ebay. I can sell it later and not lose money; I just may not make a profit, but the decision to change to "whatever" will be a more informed one.


----------



## xkm1948

So you have already made your decision; you just want to hear it from someone else. I would not call that "asking for fellow owners' opinions".

Whatever suits you, then. I will be waiting for Volta. I'm not counting on RTG to be competitive in the high end now.


----------



## geriatricpollywog

Quote:


> Originally Posted by *gupsterg*
> 
> @xkm1948
> 
> Fury X is going for ~£300-£350. Let's say I compared it with what I paid for it, I make a tidy return even after ~1yr 3mths use. Now a GTX 1080 TI is ~£630, that kinda money I've never spent on a GPU TBH. I plough in ~110% for ~72% performance jump plus I lose variable refresh rate tech, not gonna pay nVidia tax for G-Sync.
> 
> GTX 1080 ~£390, that is about the utter limit I'd pay for a GPU once flogging the Fury X and adding to pot (out right I'd never pay that kinda £). Then again not gonna pay nVidia tax for G-Sync, so again seems a false economy/swap out.
> 
> GTX 1070 isn't on my radar, just ref'ing the RX 580 review I linked before it just isn't a worthwhile change over Fury X plus nVida tax on G-Sync.
> 
> So I'm back to using the Fury X / waiting for AMD to release a successor.
> 
> @CptAsian
> 
> I did not place it on ebay. I can sell it later and not lose money. I just may not make a profit, but the decision to change to "whatever" will be a more informed one.


Have you considered mining on your Furys? Return is about $6 a day before power cost.


----------



## miklkit

This has nothing to do with DX12, but I tried to play Bioshock today for the first time in 5 months and it was stuck in windowed mode. So I installed the 17.6.1 drivers and it went straight to fullscreen. Fewer bugs is always a good thing.


----------



## xkm1948

I got someone lowballing me on the forum for the Fury X. $250? Yeah right; that was a price from before the mining boom.


----------



## steadly2004

Quote:


> Originally Posted by *xkm1948*
> 
> I got someone low balling me on forum on the FuryX. $250, yeah right, that was before the mining boom.


Yeah, our forum prices usually don't reflect crazy fluctuations in the market, like those caused by mining. Check out eBay if you want to ride that wave. I usually keep prices pretty respectable for selling things on the forum, like you're selling to a family member or something.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> So you have already made your decision; you just want to hear it from someone else. I would not call that "asking for fellow owners' opinions".
> 
> Whatever suits you, then. I will be waiting for Volta. Not counting on RTG to be competitive in the high end now.


Well, it may read that way, but when post 111116 was done I was looking for opinions. By the time I made post 11120 I'd sort of worked some things out, but was still looking for a viewpoint.

Now I think the urge to sell has been quashed.

Quote:


> Originally Posted by *0451*
> 
> Have you considered mining on your Furys? Return is about $6 a day before power cost.


I may, but the issue is that, with Ryzen being a developing platform, I keep tweaking/testing things on differing UEFIs and/or trying other CPU samples I may have on the rig.
Quote:


> Originally Posted by *miklkit*
> 
> This has nothing to do with DX12 but I tried to play Bioshock today for the first time in 5 months and it was stuck in windowed mode. So I installed the 17.6.1 drivers and it went straight to fullscreen. Less bugs is always a good thing.


Hmmm, I've not had this issue even on older drivers. One thing that has always been broken for me, besides HBM performance on v17.x, is that SWBF will always artifact, even on the stock ROM. If I roll back to v16.12.2 WHQL, stock is fine, plus OC and HBM clock increases show a good GPGPU bench / very minor gaming performance boost.

And I've done new installs/images of W7/W10C, etc. Reported the bug but no response/resolution.


----------



## miklkit

My issue was most likely caused by my off-brand 27" 1440p monitor and its buggy drivers. I hoped that the newer drivers might solve a compatibility problem, and so far that seems to be true. This Nitro is still bone stock except for a more aggressive fan profile.


----------



## gupsterg

Nitros are overvolted like anything out of the box. I had 2 of them; from here onwards you'll see some info.

The Fury Tri-X (non-OC edition) wasn't, plus I preferred the ref PCB; another reason the Fury X was what I kept after trying all those. None of the Nitros unlocked, but the Tri-X did, to 3840SP. In some benches the Tri-X with the SP unlock matched a genuine Fury X.


----------



## neurotix

My Sapphire Fury Nitros don't unlock; they also have a stock VID of 1.26V for 1050MHz. (Way more voltage than they need for that.)


----------



## gupsterg

Do a BIOS mod.


----------



## LionS7

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Why is the prices for Fury X's getting so high again? I got mine for well under 500$... I think it was like 420$ or something... now I see they are getting back up to the 650$ mark... cheapest I saw was 550 on a quick google search... I'm glad I bought when I did then lol... I was actually considering getting one for a build I may be working on for a gamer/engineer who was looking for a dual purpose PC, but the RX 480 may work for him.... he only plays at 1440P anyway.


Quote:


> Originally Posted by *neurotix*
> 
> My Sapphire Fury Nitros don't unlock; they also have a stock VID of 1.26v for 1050Mhz. (Way more voltage than they need for that.)


Well, my R9 Fury X wants 1.26V+ for stable operation at 1100MHz core, so... your card is not unusual.


----------



## gupsterg

Firstly, the Fury Nitro is a Fiji Pro ASIC.

The ROM/driver does ASIC profiling for the VID setting based on AMD reference clocks for an ASIC.

So as the GPU clock has been increased relative to the ASIC used, the ASIC profiling increases the VID. As Sapphire have set 1050MHz, it defaults to a 1.25V VID for DPM 7; Fiji Pro is supposed to be 1000MHz.

So on the stock ROM at 1050MHz my sample was 1.25V.


Spoiler: Warning: Spoiler!

Now at AMD reference clocks it was 1.206V


Spoiler: Warning: Spoiler!

Then that ROM has a VDDC offset from the factory, which the AIDA64 dump does not take into account; the offset was +38mV. So the GPU was technically getting ~+82mV more than it needs. The Fury X has no VDDC offset on the stock ROM, nor does the Fury Tri-X non-OC edition, i.e. 1000MHz using the AMD ref PCB.

My Fury X needs a DPM7 VID of 1.268V for 1145MHz, and has been at that OC for ~1yr+.
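As a quick sanity check on the ~+82mV figure, using only the VID and offset values quoted above:

```python
# Sanity check on the "~+82mV" figure: the Nitro's 1050MHz DPM 7 VID minus
# the VID at AMD reference clocks, plus the +38mV factory VDDC offset baked
# into the ROM. All three values are the ones quoted in the post.

vid_at_1050mhz = 1.250    # V, DPM 7 VID on the stock Sapphire ROM
vid_at_ref_clock = 1.206  # V, DPM 7 VID at AMD reference clocks (1000MHz)
factory_offset_mv = 38    # mV, VDDC offset set from the factory

excess_mv = (vid_at_1050mhz - vid_at_ref_clock) * 1000 + factory_offset_mv
print(f"Excess voltage vs reference: ~{excess_mv:.0f}mV")  # ~82mV
```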


----------



## xkm1948

Hey @gupsterg, the VEGA FE is coming out soon, with lots of sites opening up for pre-orders. Are you in for one? Only $1799 for the water-cooled version. Bear in mind, though, that it comes with a 375W TDP. Since you are AMD GPU or bust, I am looking forward to your review of VEGA.


----------



## gupsterg

LOL, yeah that is.

Let's just see what RX Vega is.


----------



## xkm1948

VEGA has been disappointing so far. The good thing is RyZen is not. I am looking forward to Threadripper; this way I can finally try the good old AMD CPU + nVidia GPU combination. Ah, the AthlonXP days, those were glorious for the AMD+NV combination.


----------



## lanofsong

Hello R9 Radeon Fury/NANO/X/Pro DUO FIJI owners,

We are having our monthly Foldathon from Monday the 19th to Wednesday the 21st, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us; see the attached link.

June 2017 Foldathon

To get started:

1. Get a passkey (allows for speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## HyeVltg3

Does anyone have R9 Nano undervolt results?

I've been trying to undervolt my R9 Nano while keeping it at Fury X clocks (1050) for the past 2 weeks.

3DMark runs fine, but sometimes while gaming the display driver crashes and resets all my Wattman changes. It's getting a bit exhausting.

Does anyone have good resources for this?

Please and thank you.


----------



## bluezone

Quote:


> Originally Posted by *HyeVltg3*
> 
> Does anyone have R9 Nano Undervolt results.
> 
> Been trying to undervolt my R9 Nano while keep it at Fury X clocks (1050) for the past 2 weeks.
> 
> 3DMark runs fine, but some times while gaming Display Driver crashes and resets all Wattman changes. getting a bit exhausting.
> 
> Does anyone have good resources for this?
> 
> please and thank you.


While undervolting can be a little hit or miss, the most successful strategy I have used is to lower temperatures first and then lower voltage. This in turn makes for lower temperatures, to a point.
Depending on what your specific piece of silicon likes, there is a crossover point in current usage (power) at around 48-52C; IIRC, amps consumed can go up 30-40% above that toggle temperature.
The short version is that the cooler Fiji runs, the lower the applied voltage can potentially be.

The explanation above is very short (it could run paragraphs), but it is relatively accurate and concise.

Results vary card to card. Have you tweaked the fan profile yet, and what are your temperatures like? Fiji (the Fury series) likes to run cool. If you have already increased the fan speed, might I suggest a TIM replacement with Gelid GC-Extreme or Deepcool Z5 or Z9. If you do a repaste, don't forget to apply a small amount of thermal paste to the thermal pads on the VRMs as well (improves temperatures 2-3C).
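The temperature/voltage relationship described above can be sketched with a toy power model. Every constant here is an illustrative assumption chosen only for shape (none are Fiji measurements): leakage grows steeply with temperature, which is why a cooler card can tolerate a deeper undervolt at the same clock:

```python
# Toy model: total power is roughly dynamic switching power (C*V^2*f) plus
# leakage, and leakage grows steeply with temperature. All constants are
# illustrative assumptions, NOT Fiji measurements.

def gpu_power_w(v_core, freq_mhz, temp_c,
                c_eff=0.15, leak_base_w=20.0, leak_doubling_c=25.0):
    """Rough power estimate: dynamic C*V^2*f plus temperature-scaled leakage.

    c_eff           -- effective switched capacitance (assumed constant)
    leak_base_w     -- leakage at 25C and 1.2V (assumed)
    leak_doubling_c -- degrees C per doubling of leakage (assumed)
    """
    dynamic = c_eff * v_core ** 2 * freq_mhz
    leakage = leak_base_w * (v_core / 1.2) ** 2 * 2 ** ((temp_c - 25) / leak_doubling_c)
    return dynamic + leakage

hot = gpu_power_w(1.225, 1000, 85)    # warm card at a mild undervolt
cool = gpu_power_w(1.175, 1000, 55)   # cooler card holding a deeper undervolt
print(f"hot: {hot:.0f} W, cool: {cool:.0f} W")
```

The exact watt figures are meaningless; the point is the direction: lowering temperature cuts the leakage term, which then leaves headroom to drop the voltage further.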


----------



## HyeVltg3

Temps were about 72-74C max while gaming, on the diode.
VRM temps (if they can be trusted in HWiNFO) were around 89-93C.

It would always crash at random points, so I don't have any "last" temp results, just maximums.

In Wattman, I have the last 4 steps as
[email protected]
[email protected]
[email protected]
[email protected] - 1200mV (tried all)

Stock was at 1225mV and temps would rise to 80-86C.

I've tried power limits from 0% to 50%; still crashing.

I can leave clocks at Auto and just fiddle with the voltage, but that gives 1000MHz, and it still crashes at some random point, never the same spot.


----------



## bluezone

Quote:


> Originally Posted by *HyeVltg3*
> 
> Temps were about 72-74c max while gaming on the Diode.
> VRM temps (if they can be trusted in HWInfo) were around 89-93c.
> 
> It would always crash at random points so I dont have any "last" temp results. Just Maximums.
> 
> In Wattman, I have the last 4 steps as
> [email protected]
> [email protected]
> [email protected]
> [email protected] - 1200mV (tried all)
> 
> Stock was at 1225mV and temps would rise to 80c-86c
> 
> I've tried Power Limits 0%-50% still crashing.
> 
> I can leave clocks at Auto and just fiddle with the Voltage, but that gives 1000Mhz, but still crashes a some random point. never the same spot.


That's way up there for the VRMs. IIRC the cut-off temperature is 95C on the Nano. For one thing, peak allowable operating current drops as temperature rises, so just when you would have the highest current draw, you have the least available.

OCP (Over-Current Protection) is triggered when the temperature remains too high for too long, roughly 62 cycles of the PWM working frequency; I do not recall the actual PWM working frequency.

You need to work on your cooling. Maybe more fans, higher fan speeds, piping in cooler outside-case air, and/or a TIM replacement. Can I assume your room temperature is a little high?

I see your sig says you're from Canada too. I'm on the eastern end of Lake Ontario.


----------



## Minotaurtoo

Really loving the Fury X cooler now... hours of gaming in a hot room (78F) and only 61C on the core... didn't look at the VRM... in summer the poor AC in this room just can't keep up with the space heater... I mean computer... lol


----------



## bluezone

https://videocardz.com/70465/msi-damn-rx-vega-needs-a-lot-of-power


----------



## neurotix

AMD's flagship cards (since the 290X, anyway) have always used a lot of power, but offered similar performance to the high-end Nvidia offerings at the time, usually at a lower price point.

I know I don't just speak for myself when I say that I don't care about power usage as long as the card is powerful enough, especially if it's beating the 1080 Ti. A lot of people agree that they don't care about power usage if they can get a powerful card at a lower price.

Though if mining continues it will definitely NOT be the cheaper option: considering people can sell a Fury X for $900 on eBay, and this thing is rumored to have supply shortages because of HBM2, good luck finding one for less than $1000 when it comes out, as it will be the single most powerful mining GPU ever created.


----------



## Kana-Maru

Quote:


> Originally Posted by *neurotix*
> 
> AMD's flagship cards (since the 290X anyway) have always used a lot of power, but offered similar performance to the high end Nvidia offerings at the time, usually at a lower price point.
> 
> I know I don't just speak for myself when I say that I don't care about power usage as long as it is powerful enough, especially if it's beating the 1080ti. I know a lot of people agree that they don't care about the power usage if they can get a powerful card at a lower price.
> 
> Though, if mining continues it will definitely NOT be the cheaper option, considering people can sell a Fury X for $900 on Ebay, and this thing is rumored to have supply shortages because of HBM2, good luck finding one for less than $1000 when it comes out, as it will be the single most powerful mining GPU ever created.


I undervolted the crap out of my Fury X and still get great performance. Most people eventually undervolt their AMD GPUs if possible.


----------



## neurotix

It depends on what it's going to be used for. For mining or folding or BOINC it makes sense.

I've been around here since 2010 and used exclusively AMD GPUs, and 99% of users overclock and overvolt to get better performance.

I had numerous friends with the 290X, and they overclocked and overvolted and really just didn't care about power use.

For the Fury and Fury X, undervolting makes more sense, as they often have a high stock VID and don't need so much voltage to maintain factory clocks. The performance improvement from overclocking is minimal and the headroom is very low, so it makes sense to leave them stock and lower the (inordinately high) voltage. This is the case for me; especially since I have two cards, I have yet to find a game I can't max out at 5760x1080. I really don't need any extra performance because two cards is enough.

Historically, though, most people overclock and overvolt; undervolting was usually the minority, done by people with things like low-profile cards, passively cooled cards and so on, in SFF builds.


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> It depends on what it's going to be used for. For mining or folding or BOINC it makes sense.
> 
> I've been around here since 2010 and used exclusively AMD GPUs and 99% of users overclock and overvolt to get better performance.
> 
> Had numerous friends with the 290X and they overclocked and overvolted and really just didn't care about power use.
> 
> For the Fury and Fury X, it makes more sense, as often, they have a high stock VID and don't need so much voltage to maintain factory clocks. The performance improvement from overclocking is minimal and the headroom is very low, so it makes sense to leave it stock and lower the (inordinately high) voltage. This is the case for me, especially since I have two cards, I am yet to find a game I can't max out at 5760x1080. I really don't need any extra performance because two cards is enough.
> 
> Historically, though, most people overclock and overvolt, undervolting was always usually in the minority and done by people with things like low profile cards, passively cooled cards and so on, in SFF builds.


What voltage/speed are you running your Fury on?


----------



## neurotix

I have dual (CrossFireX) Sapphire R9 Fury Nitro.

They come at 1050MHz/500MHz stock. VID is 1.25V.

Generally I just leave them at that, yes they do get somewhat hot in certain games, but I don't see temps above 65C on the top card and 55C on the bottom. My fan profile is set to hit 100% fan at 70C. In some games, with Power Efficiency on, Vsync on, and Frame Rate Target Control on, my temps don't pass 50C on the top and 45C on the bottom (Tomb Raider 2013 is like this- with everything Ultra at 5760x1080, top card hardly reaches 50C).

I had front fans in my case but I recently removed them, I didn't think they made a difference in GPU temps, apparently I was wrong. With games like Witcher 3 and Crysis 3 (Ultra- 5760x1080), and the front fans on, my top card would peak at 58C. Now without the fans that's up to about 63C. That's okay though, I got tired of them blowing dust into my case and all over my stuff (couldn't turn them off all the way, only to low). Hard drives would get really dusty. My front fan filter would get covered in dust and need to be cleaned on a weekly basis. I got tired of it.

I've overclocked the cards to 1100/545MHz with +35mV and they would be stable, but they produce much more heat and probably only give 3% more performance. Not worth it to me. (For benching I can do 1125/545MHz but that's it, and in some stuff they crash; not usable for games.)

I've also undervolted them by lowering the voltage by -10mV and the power limit by -10%; this seemed to give me less heat and power usage, but only slightly. I think the voltage went from 1.25V to 1.225V. This only resulted in about -3C under heavy load, though. I tried undervolting further, but even -5mV more and my cards were unstable and would crash.

One has 63% ASIC and the other has 60% ASIC, some of the lowest I've had or seen from any AMD card, and I've benched around 15 cards for HWBOT. I've heard that Fiji just has low ASIC, though, or even that it totally doesn't apply. Either way, it is basically not worth the trouble to either overclock or undervolt my cards: in both cases it does not provide enough performance, or enough of a power/heat reduction, to be worth it. So I just leave them. The plus side is that they are rock solid stable at stock, and the two together have no problems maxing out any game I've tried at my resolution.

I hope this helps and is thorough enough.


----------



## gupsterg

Quote:


> Originally Posted by *Kana-Maru*
> 
> I undervolted the crap out of my Fury X and still get great performance. Most people eventually undervolt their AMD GPUs if possible.


I've run my Fury X at 1145MHz 1.268V / 545MHz 1.325V as more of a long-term experiment than a necessity. In recent months I use the custom undervolt ROM more and more over the OC one. As I usually have FRTC / FreeSync on and the gaming experience is smooth, the FPS is to an extent irrelevant. And the AIO has been sweet; considering the hours and hours I have used the card, I'm shocked it still functions.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Recent few months more and more I use the custom undervolt ROM than OC one..


Ah, at last I've turned you over to the dark side.

Off and on, I've been overclocking and undervolting my cards since my HD 7950s.

Right now I'm at 1050/1162 on my Nano, with 64-65C (GPU/VRM) normal gaming peaks, last I checked anyhow.


----------



## gupsterg

Indeed you have swayed me.

For F@H I lose nothing IMO being at those clocks, but gain efficiency.

The R7 1700 is nutty PPD vs the i5 4690K. I'll check logs and compare; it is a vast difference.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Indeed you have swayed me.
> 
> For F@H I lose nothing IMO being at those clocks, but gain efficiency.
> 
> The R7 1700 is nutty PPD vs the i5 4690K. I'll check logs and compare; it is a vast difference.


Cool, could you post or PM some links to what you have been up to lately? Always good reading.


----------



## gupsterg

No worries.

I'm hoping VEGA isn't priced crazy; the rig would be an F@H monster.

IMO/IIRC, when loosely comparing 2x Hawaii vs 1x Fiji, Fiji rocked for PPD/Watt.


----------



## neurotix

Quote:


> Originally Posted by *gupsterg*
> 
> Indeed you have swayed me.
> 
> For F@H I lose nothing IMO being at those clocks, but gain efficiency.
> 
> The R7 1700 is nutty PPD vs the i5 4690K. I'll check logs and compare; it is a vast difference.


Could you share your PPD on the Ryzen vs the i5? Check my folding badges. I don't even recall what my 4790K does, but I don't think it was anything more than 30k. I don't know that I've ever heard PPD for Ryzen, considering GPU folding gives you vastly more points nowadays than CPU folding.


----------



## LeadbyFaith21

This may be the wrong thread for this, but I've got two Fury X GPUs in my computer with a custom loop, and their temps are normally the same (the bottom one will go to the next degree before the top one, normally) except for the "GPU VR VDDC Temp." (HWiNFO label), where the bottom card gets anywhere from 5-10C higher. Does anyone have any idea what could be the cause of that?


----------



## xkm1948

Alright folks, looks like the benchmark is out for the VEGA FE air-cooled version using Gaming Mode. Base clock is 13XX MHz; the highest possible boost is around 1600MHz. So, not really impressive.

https://disqus.com/by/klaudiuszkaczmarzyk/comments/

IMO it is quite a disappointment considering RTG has had two years since the Fury X launch to get this right. Let's see how Navi turns out now.


----------



## Simmons572

IMO, we need to hold off until we have more review samples. I am still excited for Vega personally, just as I was for the Fury cards when they first came out.


----------



## xkm1948

Well, good for you then. Logically, RTG really doesn't have many resources to pour into VEGA development, so I have already given up on VEGA. Best-case scenario, it will be better than the 1080, but with a much higher power draw.

AMD shifted its focus to CPUs, and that is why VEGA is too little, too late.


----------



## neurotix

I agree, it would be better to wait for the gaming cards to launch to really see how good Vega is. Though I'm not optimistic either.

It will be ridiculously difficult and expensive to get one, though, with the mining craze.


----------



## gupsterg

Hmmm, waiting for some reviews. I may still hang on to the Fury X for a while; not going green, that's for sure.

neurotix, sorry for the delay. From a log of 15hrs, on FS00 (i.e. CPU) I saw credits of:

9934, 5564, 2516, 2509, 5625, 17932, 2046, 2518, 2513

log-20170625-211916.txt 137k .txt file
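A rough PPD figure can be extrapolated from those credits with simple credits-per-hour scaling; this ignores QRB bonus timing, so treat it as a ballpark only:

```python
# Rough points-per-day estimate from the CPU credits in the 15-hour log
# quoted above. Simple credits / hours * 24 extrapolation (ballpark only).

credits = [9934, 5564, 2516, 2509, 5625, 17932, 2046, 2518, 2513]
log_hours = 15

ppd_estimate = sum(credits) / log_hours * 24
print(f"~{ppd_estimate:,.0f} PPD")  # roughly 82k PPD from this log
```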


----------



## Minotaurtoo

I'm going to skip Vega for sure unless... they have an AIO water cooler. I may make the switch, but only if they have an AIO water cooler : )


----------



## gupsterg

Yeah, defo likey AIO.

Besides gaming, F@H performance may swing my purchase, but then Navi may not be that far off, unless AMD do the usual delay.


----------



## neurotix

Traditionally, the delay between releases for flagship AMD GPUs has been about a year and a half.

At least: the 5870 was released in Sept 2009, the 6970 in Dec 2010, the 7970 in Dec 2011 (one year between releases!), the 290X in Oct 2013, and the Fury X in June 2015. It will soon be July 2017, meaning it has been 2 years, 1 month since the Fury X launched, and still no Vega, making this the longest development cycle and longest time without a flagship release in recent history for AMD.
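Those launch dates can be turned into gaps with a few lines (month-level dates as listed in the post):

```python
# Month-level gaps between the flagship launches listed above.
from datetime import date

launches = [
    ("HD 5870", date(2009, 9, 1)),
    ("HD 6970", date(2010, 12, 1)),
    ("HD 7970", date(2011, 12, 1)),
    ("R9 290X", date(2013, 10, 1)),
    ("Fury X",  date(2015, 6, 1)),
]

def months_between(a, b):
    """Whole-month difference between two dates (month granularity only)."""
    return (b.year - a.year) * 12 + (b.month - a.month)

for (name_a, d_a), (name_b, d_b) in zip(launches, launches[1:]):
    print(f"{name_a} -> {name_b}: {months_between(d_a, d_b)} months")

# Fury X (June 2015) to July 2017, still with no Vega on shelves:
print(f"Fury X -> July 2017: {months_between(date(2015, 6, 1), date(2017, 7, 1))} months")
```

The gaps come out to 15, 12, 22, and 20 months, against 25 months and counting since the Fury X, which backs up the "longest in recent history" point.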

If the rumors are true and Navi will come faster, I'll be really happy, because that's what I'm saving my money for, but it better be competitive with Nvidia's high end at the time or I may be forced to jump ship. In any case, I'm happy with the performance of my dual Fury's for the time being.

Gupsterg, thanks for the log; it really doesn't tell me much, though. I'm just looking for the PPD estimate on the most common work unit from either FAHControl or HFM.NET (e.g. 40k PPD, as an example).


----------



## Ceadderman

Quote:


> Originally Posted by *xkm1948*
> 
> Well good for you then. Logically RTG really doesn't have much resource to pour into VEGA development so I have already given up on VEGA. Best case scenerio it will be better than 1080, but then with a much higher power draw.
> 
> AMD shifted its focus on CPU and that is why VEGA is too little too late.


ATi enthusiast with no patience for AMD reviews to be compiled with working cards...

Personally, I don't care what you like or think about Vega. It's too early to say things like "too little, too late" and be taken seriously.

Just so you're aware, CPU and GPU are two very different divisions of the AMD brand. They do *not* rely on the same personnel in most respects. So there is zero credibility to your stance that AMD focused on the CPU side of things at the expense of Vega. That's ridiculous.

I will wait and see what the reference cards are doing and going for before making my decision. I watercool, so it looks like a block will be in order if I pull the trigger on an XFX HD Radeon. Sapphire pretty much lost me when they chose to drop the HD Radeon from their lineup of OC cards.

~Ceadder


----------



## gupsterg

Quote:


> Originally Posted by *neurotix*
> 
> Gupsterg, thanks for the log, it really doesn't tell me much though. I'm just looking for the PPD estimate on the most common work unit from either fahcontrol or HFM.NET. (e.g. 40k PPD as an example)


NP. 60K-90K is what I have noted in FAHControl depending on the unit. I gave the credits as a rough guide on "production"; my i5 4690K would never get that rate of credit.


----------



## xkm1948

Quote:


> Originally Posted by *Ceadderman*
> 
> ATi enthusiast with no patience for AMD reviews to be compiled with working cards...
> 
> Personally, I don't care what you like or think about Vega. It's too early to say things like "too little, too late" and be taken seriously.
> 
> Just so you're aware, CPU and GPU are two very different divisions of the AMD brand. They do *not* rely on the same personnel in most respects. So there is zero credibility to your stance that AMD focused on the CPU side of things at the expense of Vega. That's ridiculous.
> 
> I will wait and see what the reference cards are doing and going for before making my decision. I watercool, so it looks like a block will be in order if I pull the trigger on an XFX HD Radeon. Sapphire pretty much lost me when they chose to drop the HD Radeon from their lineup of OC cards.
> 
> ~Ceadder


Yes, they do not rely on the same people to develop CPUs and GPUs. But all of AMD's financial support comes from the same source, be it RTG or the RyZen development group. AMD clearly shifted most of its money into developing RyZen/EPYC. AMD didn't have tons of cash, so most of it was probably poured into the CPU division, and that is how we got a successful RyZen design.

As long as RTG is still part of AMD, they share the same income. So my statement stands.


----------



## Bojamijams

Quote:


> Originally Posted by *xkm1948*
> 
> Yes, they do not rely on the same people to develop CPUs and GPUs. But all of AMD's financial support comes from the same source, be it RTG or the RyZen development group. AMD clearly shifted most of its money into developing RyZen/EPYC. AMD didn't have tons of cash, so most of it was probably poured into the CPU division, and that is how we got a successful RyZen design.
> 
> As long as RTG still is part of AMD, they share the same income. So my statement stands.


It CLEARLY shifted most of the money, eh? You're speaking as if you know the facts, when in fact it's all your opinion.

The Ryzen design is successful because they used Jim Keller (a legendary CPU architect).


----------



## xkm1948

Yeah, right: designing a successful CPU good for desktops, supercomputers and laptops doesn't need a lot of money. And according to you, only one man contributed to the success of Zen. Perfect logic.


----------



## Minotaurtoo

One thing about this conversation is clear... no one here knows all the facts... it's all guesses and maybes...

Fact 1: it's been an unusually long time between flagship products for AMD on both the CPU and GPU sides (Vishera 8 cores to Zen 8 cores was a very long time).

Fact 2: they have had some financial trouble in recent years.

Let the assumptions begin.


----------



## bluezone

PCPER benching of Radeon Vega FE.

Enjoy.

For reference later in the video: 5 (Vega) vs 3 (Fiji) vertical triangle tile stripes.

http://www.overclock.net/t/1592384/fiji-bios-editing-fury-fury-x-nano-radeon-pro-duo/1050#post_25412915


----------



## xkm1948

And I am not surprised. Vega is DOA.


----------



## steadly2004

Quote:


> Originally Posted by *xkm1948*
> 
> And I am not surprised. Vega is DOA.


Sucks, but I'll still buy if it's $600 or less. Maybe it'll get better. Either way I still saved $300 off a gsync monitor going with a freesync alternative (ultra-wide). So if it competes with a $300 card I broke even.....


----------



## xkm1948

Quote:


> Originally Posted by *steadly2004*
> 
> Sucks, but I'll still buy if it's $600 or less. Maybe it'll get better. Either way I still saved $300 off a gsync monitor going with a freesync alternative (ultra-wide). So if it competes with a $300 card I broke even.....


Good for you!

Vega's gaming variant should be priced at $399 at most. It is hot, slow and power hungry compared to the 1080, so AMD should really position it accordingly.


----------



## faizreds

Quote:


> Originally Posted by *xkm1948*
> 
> Good for you!
> 
> Vega's gaming variant should be priced at $399 at most. *It is hot, slow and power hungry compared to the 1080*, so AMD should really position it accordingly.


How do you know that Vega will be hot, slow and power hungry compared to the 1080?


----------



## budgetgamer120

Quote:


> Originally Posted by *xkm1948*
> 
> Good for you!
> 
> Vega's gaming variant should be priced at $399 at most. It is hot, slow and power hungry compared to the 1080, so AMD should really position it accordingly.


Some proof please...


----------



## huzzug

Guys, he's a well known future reader (I know what someone like this is called, but I can't recall it). He successfully predicted the end of the world in 2012. You guys should watch yourselves.


----------



## bluezone

Wow,







(to whom it may concern)
Vega FE is a card meant for prosumer/professional workloads and game development, not a gaming card, which is mainly what they did with it in the video.
Its merits at its price point are actually very good compared to the competing Team Green offering.
If it doesn't meet expectations, then it's likely not targeted at you.

EDIT: Quote from the now published article, supporting the video.
Quote:


> On the following pages, you will see a collection of tests and benchmarks that range from 3DMark to The Witcher 3 to SPECviewperf to LuxMark, attempting to give as wide a viewpoint of the Vega FE product as I can in a rather short time window. The card is sexy (maybe the best looking I have yet seen), but will disappoint many on the gaming front. For professional users that are okay not having certified drivers, performance there is more likely to raise some impressed eyebrows.


https://www.pcper.com/reviews/Graphics-Cards/Radeon-Vega-Frontier-Edition-16GB-Air-Cooled-Review


----------



## miklkit

Quote:


> Originally Posted by *huzzug*
> 
> Guys, he's a well known future reader (I know what someone like this is called, but I can't recall it). He successfully predicted the end of the world in 2012. You guys should watch yourselves.


Rather than keeping an eye on meself, it sounds like the better idea would be to keep a weather eye on him. And a hand on me purse.


----------



## 1mpurity

I was really looking at Vega for an upgrade path, but I can't really justify it now, especially after seeing RX Vega performance. I just got dangerously into "BIOS tuning" my Fury X, and I can most definitely say the Fury was a horribly tuned card from AMD. I got my card down from 275 TDP / 300 max TDP to 170 TDP / 250 max TDP with BIOS mods, with no performance decrease, and actually a slight but proven performance increase. I think I'll just stay with my Fury X until I can't run my games any more. It's a great card, and after my modifications even a power efficient card, with total PC consumption hovering around the 250 watt mark. I just need AMD to release a Fury X successor, not what Vega is now made out to be. I feel that if all they did was die-shrink the Fury and give it the latest GCN, it would've been a much better card, seeing how the performance is.


----------



## shadowxaero

Quote:


> Originally Posted by *1mpurity*
> 
> I was really looking at Vega for an upgrade path, but I can't really justify it now, especially after seeing RX Vega performance. I just got dangerously into "BIOS tuning" my Fury X, and I can most definitely say the Fury was a horribly tuned card from AMD. I got my card down from 275 TDP / 300 max TDP to 170 TDP / 250 max TDP with BIOS mods, with no performance decrease, and actually a slight but proven performance increase. I think I'll just stay with my Fury X until I can't run my games any more. It's a great card, and after my modifications even a power efficient card, with total PC consumption hovering around the 250 watt mark. I just need AMD to release a Fury X successor, not what Vega is now made out to be. I feel that if all they did was die-shrink the Fury and give it the latest GCN, it would've been a much better card, seeing how the performance is.


You know, I never thought to lower the TDP for some reason... actually going to try that now lol


----------



## 1mpurity

I lowered the TDP to 170 and max TDP to 250, added voltage to the memory and lowered voltage to the core, and man! My temps never go over 70C, ever. These are my "stock settings" here


----------



## 1mpurity

Also, I lowered my core voltage from 1.212 down to 1.174, which is a 38mV drop on the core, and that helped a lot as well.
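Back-of-the-envelope on why a small undervolt helps so much: dynamic power in CMOS logic scales roughly with frequency times voltage squared. This is only a rough approximation, not a measured figure for Fiji:

```python
# Rough dynamic-power scaling estimate: P ~ f * V^2 (CMOS approximation).
def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Return new dynamic power as a fraction of old."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# 1.212 V -> 1.174 V at the same core clock, as in the post above:
saving = 1 - relative_power(1.174, 1.212)
print(f"~{saving:.1%} less dynamic core power")  # roughly a 6% drop from 38 mV alone
```

Leakage power drops too (often faster than V²), so the real-world saving is usually a bit larger than this sketch suggests.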


----------



## shadowxaero

Quote:


> Originally Posted by *1mpurity*
> 
> Also, I lowered my core voltage from 1.212 down to 1.174, which is a 38mV drop on the core, and that helped a lot as well.


I recently lowered my OC to 1100 from 1150 just so I could lower voltage. Recently started doing a bit of mining with my two Furys and a few other cards, so I've been trying to make the Furys as efficient as possible.

I am running 1.193v on both cards. I'm water cooled, so temps never go past 56C honestly lol.


----------



## 1mpurity

I'm using the stock cooling and I never go above 65C while at 1050/650, or even overclocked at 1100/650.


----------



## shadowxaero

Quote:


> Originally Posted by *1mpurity*
> 
> I'm using the stock cooling and
> 
> i never go above 65c while at 1050/650 or even overclocked at 1100/650


You get 650 at stable ***, that is impressive haha


----------



## LionS7

Quote:


> Originally Posted by *shadowxaero*
> 
> You get 650 at stable ***, that is impressive haha


It became possible after some driver release, I don't know which one. My R9 Fury X went from 520 to 600 stable. And I don't know what AMD did with the drivers to achieve this difference with the HBM clocks.


----------



## bluezone

Speak of the devil, new drivers:

Crimson Relive 17.7.1.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.7.1-Release-Notes.aspx


----------



## gupsterg

@1mpurity

Can you do AIDA64 GPGPU bench?

17.x.x drivers for me and other members that tested, give no performance gains on HBM with clock increase, link.


----------



## uramaru

Just got a used Fury X. I'm currently using it for mining. So far, the settings below are stable for me (using MSI AB):

-96mV
Power limit - stock
CC - 940
MC - 500
Fan - Auto

I'd like to undervolt it further, though I haven't tried modding the BIOS yet. I know these aren't the best settings, but is anyone here kind enough to share theirs? Maybe also share a modded BIOS ROM?









TIA!


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> @1mpurity
> 
> Can you do AIDA64 GPGPU bench?
> 
> 17.x.x drivers for me and other members that tested, give no performance gains on HBM with clock increase, link.


No change for me from 500 to 600MHz on the HBM. Still around 367-370GB/s on copy. Maybe the chip cannot push above 380GB/s; I read something like that somewhere before, maybe on some French website.
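For context, a quick sketch of Fiji's theoretical HBM bandwidth against those AIDA64 copy numbers. The bus width and double-data-rate figures are the publicly known Fiji HBM1 specs; the efficiency math is only illustrative:

```python
# Theoretical peak bandwidth of Fiji's HBM1: 4096-bit bus, DDR (2 transfers/clock).
def hbm_bandwidth_gbs(clock_mhz, bus_bits=4096, transfers_per_clock=2):
    bytes_per_transfer = bus_bits / 8            # 512 bytes across the full bus
    return bytes_per_transfer * transfers_per_clock * clock_mhz * 1e6 / 1e9

stock = hbm_bandwidth_gbs(500)   # 512.0 GB/s theoretical at stock 500 MHz
oc    = hbm_bandwidth_gbs(600)   # 614.4 GB/s theoretical at 600 MHz

# A measured ~370 GB/s copy is ~72% of the stock theoretical peak; staying at
# ~370 GB/s after a 20% clock bump points at a driver/ROM cap, not the memory.
print(stock, oc, 370 / stock)
```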


----------



## diggiddi

Quote:


> Originally Posted by *uramaru*
> 
> Just got a used Fury X. I'm currently using it for mining. So far, the settings below are stable for me (using MSI AB):
> 
> -96mV
> Power limit - stock
> CC - 940
> MC - 500
> Fan - Auto
> 
> I'd like to undervolt it further, though I haven't tried modding the BIOS yet. I know these aren't the best settings, but is anyone here kind enough to share theirs? Maybe also share a modded BIOS ROM?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TIA!


These are my current Fury Nitro mining settings:
-0mV
+50
1108/500




----------



## angelsalam

There is a bug in the drivers: the HBM overclock isn't applied even if software monitoring says it is. That's why you don't see gains from 500 to 600, and that's why you managed to "apply" 600MHz when 520 was your stable limit.
So while it's showing 600MHz in Afterburner, your card is still at 500MHz HBM.


----------



## gupsterg

Quote:


> Originally Posted by *LionS7*
> 
> No change for me from 500 to 600MHz on the HBM. Still around 367-370GB/s on copy. Maybe the chip cannot push above 380GB/s; I read something like that somewhere before, maybe on some French website.


This is my OC ROM based on the last release of Fury X ROM on AMD Community site.



I just wish AMD would sort the issue with v17.x.x drivers.


----------



## dagget3450

I have one furyx left, lol selling off the others. I already miss them. I know the AIO got some hate but man do i love it. Going back to an air cooled card is a drag.

Gonna keep the last furyx in my backup rig for now.


----------



## 1mpurity

No problem man. I'm not working tomorrow so I'll be able to have those numbers in for you tomorrow(well today later on).
Quote:


> Originally Posted by *gupsterg*
> 
> @1mpurity
> 
> Can you do AIDA64 GPGPU bench?
> 
> 17.x.x drivers for me and other members that tested, give no performance gains on HBM with clock increase, link.


----------



## 1mpurity

I can guarantee you that my HBM clock is at 650, because I set that as the default clock in the BIOS, and I can even do 700 in a few games.
Quote:


> Originally Posted by *angelsalam*
> 
> There is a bug in the drivers: the HBM overclock isn't applied even if software monitoring says it is. That's why you don't see gains from 500 to 600, and that's why you managed to "apply" 600MHz when 520 was your stable limit.
> So while it's showing 600MHz in Afterburner, your card is still at 500MHz HBM.


----------



## gupsterg

Quote:


> Originally Posted by *dagget3450*
> 
> I have one furyx left, lol selling off the others. I already miss them. I know the AIO got some hate but man do i love it. Going back to an air cooled card is a drag.
> 
> Gonna keep the last furyx in my backup rig for now.


Other than installation/handling, I'd concur, luv the Fury X AIO. Yeah, finding it hard to part with mine ...
Quote:


> Originally Posted by *1mpurity*
> 
> I can guarantee you that my HBM clock is at 650, because I set that as the default clock in the BIOS, and I can even do 700 in a few games.


These numbers on air really defy what most members have experienced. I went through 8 Fury X, 2 Fury Nitro and 1 Fury Tri-X. Most reached 545MHz, only a few reached 600MHz, and none went above that. All my testing was with stock coolers, etc.
Quote:


> Originally Posted by *1mpurity*
> 
> No problem man. I'm not working tomorrow so I'll be able to have those numbers in for you tomorrow(well today later on).


Thanks, I would appreciate it.


----------



## rubenlol2

Managed to grab a Pro Duo for 500 during a local clearance sale; slapped an EK block on it once I saw them dropping the price.
Mining with them along with a Fury Strix that is out of view on a USB riser; temps are in the 40s and VRMs sit at around 45-50C.

The Nano's and Pro Duo's ports are at different lengths and slightly different heights, so I think the 45x2 fittings are a really neat solution, fits perfectly.

http://www.3dmark.com/spy/1978403 Pro Duo + Nano crossfire, nice scaling.


----------



## Wuest3nFuchs

Hey guys!

What does this entry mean?

My question: would it be possible to push the HBM to 600MHz?

I have some custom vBIOSes on my HDD, should I flash one?

What can I expect, and what should I know before flashing it?

The reason I ask: I'm not familiar with flashing on AMD, and I also read somewhere there's an issue with HDMI or the driver or something.

I don't need the performance, but for my next OC stability project I want to see how far I can go in *winter* on some benches with my Fury Nitro+ @ 59.8 ASIC.

BTW, core clock OC via Afterburner software won't work no matter what I do!


----------



## bluezone

Vega looking better on water.






Glad I pointed out that it has the same screw spacing as Fiji (he should have noticed that).


----------



## 1mpurity

Your conclusion was 100% accurate. I am getting stock HBM scores; downloading 16.12.2 right now to verify.

Quote:


> Originally Posted by *gupsterg*
> 
> @1mpurity
> 
> Can you do AIDA64 GPGPU bench?
> 
> 17.x.x drivers for me and other members that tested, give no performance gains on HBM with clock increase, link.


----------



## 1mpurity

This is my improvement just from dropping drivers to 16.12.2. Wow, from 370 to 445, a decent jump in memory copy speed. Will be back with my 650MHz memory speed results.

Quote:


> Originally Posted by *gupsterg*
> 
> Other than installation/handling, I'd concur, luv the Fury X AIO. Yeah, finding it hard to part with mine ...
> These numbers on air really defy what most members have experienced. I went through 8 Fury X, 2 Fury Nitro and 1 Fury Tri-X. Most reached 545MHz, only a few reached 600MHz, and none went above that. All my testing was with stock coolers, etc.
> Thanks, I would appreciate it
> 
> 
> 
> 
> 
> 
> 
> .


----------



## 1mpurity

Also, this is with the Fury X's stock AIO cooler; I could upload shots of my PC build if you need to verify.
Quote:


> Originally Posted by *gupsterg*
> 
> Other than installation/handling, I'd concur, luv the Fury X AIO. Yeah, finding it hard to part with mine ...
> These numbers on air really defy what most members have experienced. I went through 8 Fury X, 2 Fury Nitro and 1 Fury Tri-X. Most reached 545MHz, only a few reached 600MHz, and none went above that. All my testing was with stock coolers, etc.
> Thanks, I would appreciate it
> 
> 
> 
> 
> 
> 
> 
> .


----------



## 1mpurity

Here is my 650MHz strap with my Fury X: highly unstable now that the drivers aren't limiting me. This is the point of diminishing returns; I don't know if it is because of the voltage, but I can say it's unstable. This hurt my heart, but hey, I got 625MHz stable. Still not satisfying, but I prefer 625 over 600 anyway.

.
Quote:


> Originally Posted by *gupsterg*
> 
> Other than installation/handling, I'd concur, luv the Fury X AIO. Yeah, finding it hard to part with mine ...
> These numbers on air really defy what most members have experienced. I went through 8 Fury X, 2 Fury Nitro and 1 Fury Tri-X. Most reached 545MHz, only a few reached 600MHz, and none went above that. All my testing was with stock coolers, etc.
> Thanks, I would appreciate it
> 
> 
> 
> 
> 
> 
> 
> .


----------



## 1mpurity

Sorry for flooding the thread with my results; I should have just done them all and then posted them in one post.


----------



## diggiddi

1mpurity, are your memory results improving gaming FPS in any appreciable way?


----------



## 1mpurity

Yes, I see a performance increase equivalent to overclocking my core to 1100-1125. That is what caused me to start overclocking my HBM instead of my core clock. I mean, I can do both, but with an undervolted core clock and an overclocked HBM, the power requirements for this card drop. I am now using a BIOS that's 100% "stable" (can confidently say that now) and gives a good jump in performance while also lowering power consumption; my card's TDP is now 185 with a max TDP of 225, and I don't notice any performance decreases. Sorry for the long statement.






















Quote:


> Originally Posted by *diggiddi*
> 
> 1mpurity Are your memory results improving gaming fps in any appreciable way?


----------



## gupsterg

@1mpurity

+rep, thank you for the test data. No need for a photo of your card. My comment about air was not in reference to your cooler; it was in the context that extreme cooling would be needed for "gaming at 700MHz HBM" as you stated in one post.

We Fiji owners really need to club together and highlight to the AMD driver team the gimping of HBM performance with clock increase in the v17.x.x drivers. I did manage to get floated to the top of a reddit thread where this was discussed:


https://www.reddit.com/r/6lkxf3/r9_fury_fury_x_owners_how_did_performance_change/

As AMD employees do seem to view that sub, hopefully someone will sort it. I have also reported the issue via the driver page, and I have PM'd/mentioned an AMD techie on here. I plan to do a thread on AMD Community.


----------



## Offler

So after roughly 2-3 months since I purchased one of the very first Fiji cards (probably a piece used for reviews), the high pitched noise from the original pump is gone. I am not sure if that's a good or a bad sign...

Anyway it runs pretty well.


----------



## xkm1948

Pretty sure the entire RTG driver team is working on improving Vega. Fiji lost their interest a long time ago.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xkm1948*
> 
> Pretty sure the entire RTG driver team is working on improving Vega. Fiji lost their interest a long time ago.


Not only was Fury not impressive at launch, it really only had like 1 year life spam. The HD 7970 got 2 years, the 290X got 2 years, Fury got 1 year. At the same time, the 7970 and 290X got an extra year thanks to the 280X and 390X.


----------



## dagget3450

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not only Fury was not impressive at launch it really only had like 1 year life spam. HD 7970 got 2 years, 290X got 2 years, Fury got 1 years. At the same time 7970 and 290X got extra 1 year thanks to 280X and 390X.


You sure? I didn't see any Spam with my fury.

Going to assume you mean Life span, what does that even mean? Did you run out of locked AMD threads in news section and needed to come bash AMD in owners threads now?


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not only Fury was not impressive at launch it really only had like 1 year life spam. HD 7970 got 2 years, 290X got 2 years, Fury got 1 years. At the same time 7970 and 290X got extra 1 year thanks to 280X and 390X.


The Fury non-X was quite good for its time; it outperformed the 980 by a good margin while being the same price, and really it got 2 years, from 2015 to 2017. Not even the 480 was an equivalent.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PontiacGTX*
> 
> The Fury non-X was quite good for its time; it outperformed the 980 by a good margin while being the same price, and really it got 2 years, from 2015 to 2017. Not even the 480 was an equivalent.


After Polaris was out most of the driver focus was for that.


----------



## PontiacGTX

Quote:


> Originally Posted by *ZealotKi11er*
> 
> After Polaris was out most of the driver focus was for that.


Still, a stock 480 couldn't come close to a Fury.


----------



## rubenlol2

My friend has recently been playing with an RX 580; OCed to 1500MHz, it is still behind a 56CU Fiji at 1000MHz.
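Raw shader throughput backs that up. A quick FP32 TFLOPS comparison (the RX 580 shader count is the standard published Polaris 10 config; 56 CU × 64 shaders is an assumption about the cut-down Fiji in question):

```python
# FP32 throughput: shaders * 2 ops/clock (fused multiply-add) * clock.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

rx580  = tflops(2304, 1.5)        # Polaris 10: 2304 shaders at a 1500 MHz OC
fiji56 = tflops(56 * 64, 1.0)     # 56 CU Fiji: 3584 shaders at 1000 MHz

# ~6.9 vs ~7.2 TFLOPS, before Fiji's much wider memory bus is even counted.
print(rx580, fiji56)
```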


----------



## dagget3450

So we discovered that a new architecture gets more driver focus after it's released than previous ones. Who would have guessed... I still don't see how that determines a GPU's "life span". Seriously, you could literally go through all the owners threads and claim this, because it's mostly subjective.

Some people upgrade constantly, others keep a card for a long time. I still have a friend with GTX 470 SLI; I don't go knock on his door and say "hey man, your GPUs' life span is over". He is happy with them and they do what he wants.


----------



## Mega Man

Quote:


> Originally Posted by *bluezone*
> 
> Vega looking better on water.
> 
> 
> 
> 
> 
> 
> Glad I pointed out that it has the same screw spacing as Fiji (he should of noticed that).


Gn is a joke.
Quote:


> Originally Posted by *dagget3450*
> 
> So we discovered that a new architecture gets more driver focus after it's released than previous ones. Who would have guessed... I still don't see how that determines a gpu's "Life span". Seriously you could literally go through all Owners threads and claim this because its mostly subjective.
> 
> Some people upgrade constantly, others keep it for a long time. I still have a friend with GTX 470 SLI, i don't go knock on his door and say " hey man your gpu's life span is over". He is happy with them and they do what he wants.


Try going to the green side where they start to nerf perf, to get you to upgrade.


----------



## bluezone

Quote:


> Originally Posted by *Mega Man*
> 
> Gn is a joke..


Yes, a little inconsistent.
But I think the testing he did is valid until we see more and/or the Vega gaming SKUs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rubenlol2*
> 
> My friend has recently been playing with an RX 580; OCed to 1500MHz, it is still behind a 56CU Fiji at 1000MHz.


For sure, because it's a 256-bit, 32 ROP card. The 290X can come very close to a stock Fury. What I mean about Fury is that it has not gained much performance like the 290X and 7970 did ("FineWine").


----------



## PontiacGTX

The Fury's performance gains probably haven't been as well documented as the 290X's or 7970's because it is newer; there are no detailed benchmarks on Fury drivers and the bottlenecks within the uarch or the DX11 pipeline.


----------



## PontiacGTX

does anyone here play Doom?


----------



## 1mpurity

I play Doom from time to time. What's up?
Quote:


> Originally Posted by *PontiacGTX*
> 
> does anyone here play Doom?


----------



## PontiacGTX

Quote:


> Originally Posted by *1mpurity*
> 
> I play Doom from time to time. What's up?


do you get graphics artifacts in multiplayer?


----------



## Kana-Maru

I play Doom from time to time as well and I haven't noticed any artifacts in MP/SP.


----------



## gupsterg

I don't play Doom.

But I get artifacts in SWBF from time to time when using a v17.x.x driver, even on the stock ROM. And out of the games I play, only that game does it.

I change over to v16.12.2 WHQL and all is well: stock ROM, undervolt ROM with stock clocks, and overvolt ROM with OC clocks.

The cases above are the same for W7 or W10C.

So for me, v16.12.2 WHQL does not gimp HBM performance with clock increase, and I have no issues in SWBF on either OS.


----------



## PontiacGTX

maybe it is due to the undervolt?


----------



## 1mpurity

In his case it would just be driver related, because nothing other than his driver changed when he started to experience those problems. So it couldn't be the undervolt.
Quote:


> Originally Posted by *PontiacGTX*
> 
> maybe it is due to the undervolt?


----------



## LionS7

Quote:


> Originally Posted by *gupsterg*
> 
> @1mpurity
> 
> Can you do AIDA64 GPGPU bench?
> 
> 17.x.x drivers for me and other members that tested, give no performance gains on HBM with clock increase, link.


Quote:


> Originally Posted by *gupsterg*
> 
> I don't play Doom.
> 
> But I get artifacts in SWBF from time to time when using a v17.x.x driver, even on the stock ROM. And out of the games I play, only that game does it.


This is from, I think, 17.1.2, with the artifacts. I have a thread in the official forums.


----------



## PontiacGTX

Quote:


> Originally Posted by *1mpurity*
> 
> In his case it would just be driver related, because nothing other than his driver changed when he started to experience those problems. So it couldn't be the undervolt.


TSSAA 8x causes the artifacts, SMAA has no artifacts


----------



## bluezone

New drivers 17.7.2

Release notes and downloads. http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.7.2-Release-Notes.aspx

EDIT: Just read through the release notes. Lots of changes, and we finally have colour control back, but it is now in the main ReLive settings rather than an additional settings menu.


----------



## rubenlol2

The biggest changes I noticed: I can now change which GPUs are used in the CrossFire configuration with 3 or 4 cards, higher bitrate for ReLive, and some other stuff like colour depth in the normal menu instead of an additional settings menu.


----------



## xkm1948

Most new features don't support Fiji, so yeah, here is our FineWine from Raja


----------



## rubenlol2

What features?


----------



## dagget3450

I have Vega and these drivers don't work for it yet... so it has to be for Fiji/Polaris or something else.


----------



## rubenlol2

I don't think they will drop drivers for vega until launch of RX vega.
Drivers that enable the more advanced render stuff at least, tiled raster and whatnot.


----------



## LionS7

GPU frequency, power and temp limits do not work in Afterburner 4.3. It is a global problem, I see. Did somebody find a solution?


----------



## Ne01 OnnA

The new WattMan is working, no problem, OC and HBM OC! :banana:

All you need to do is:

Delete all OC software (TriXX in my case)
Reboot + reset WattMan in 17.7.1 WHQL (or other)
Then reboot and install the new and shiny 17.7.2


----------



## LionS7

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> New Wattman working NP, OC and HBM OC ! :banana:
> 
> All you need to do is:
> 
> Delete all OC software (TriXX in my case)
> Reboot + Reset Wattman in 17.7.1 whql (or other)
> Then Reboot and Install New and shiny 17.7.2


I don't want to use WattMan, cos every time I reinstall drivers with DDU I need to set up WattMan again. And I need software for monitoring. This is a hardware/enthusiast forum, not a casual one. I don't know why everybody wants me to remove 3rd party OC software; I've been using this kind of software for 12 years. The same happened in the AMD forums. Anyway... Ne01 OnnA, you are not helping.


----------



## PontiacGTX

A question: how do you set up the power states' voltages in WattMan?
http://www.overclock.net/t/1635241/wattman-p-ower-states-voltage-and-msi-afterburner-voltages-per-power-states


----------



## Ne01 OnnA

Quote:


> Originally Posted by *LionS7*
> 
> I don't want to use WattMan, cos every time I reinstall drivers with DDU I need to set up WattMan again. And I need software for monitoring. This is a hardware/enthusiast forum, not a casual one. I don't know why everybody wants me to remove 3rd party OC software; I've been using this kind of software for 12 years. The same happened in the AMD forums. Anyway... Ne01 OnnA, you are not helping.


It is working.

AB & TriXX will be available in 1-2 weeks for the new 17.7.2 (yeah, new drivers batch ATI-17.30. ...)

I'm on 17.7.1 WHQL now


----------



## bluezone

And there is a newer version of 17.7.2 out already (July 27), 10 MB smaller?

EDIT: no noticeable difference in benches.


----------



## LionS7

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> It is working
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AB & TriXX will be available in 1-2 weeks for the new 17.7.2 (yeah, new drivers batch ATI-17.30. ...)


Where did you find this info ?


----------



## Ne01 OnnA

Quote:


> Originally Posted by *LionS7*
> 
> Where did you find this info ?


Driver Packaging Version:
17.10.3211.1031-170704a-316027C-CrimsonReLive -> this is for 17.7.1 WHQL; look in the ReLive tab: software info

As for a new update for the OC software, it might be the case, but it might also be some fu.k up







(like in EDID 17.2.1 lol no CRU for this driver)


----------



## PontiacGTX

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> Driver Packaging Version
> 17.10.3211.1031-170704a-316027C-CrimsonReLive -> this is for 17.7.1 WHQL look into TAB in Relive: soft info
> 
> As for new upd. for OC soft, it might be the case but it might be some fu.k up also
> 
> 
> 
> 
> 
> 
> 
> (like in EDID 17.2.1 lol no CRU for this driver)


then TRIXX is working with 17.7.2?


----------



## Ne01 OnnA

Quote:


> Originally Posted by *PontiacGTX*
> 
> then TRIXX is working with 17.7.2?


17.7.2: no.
Everything prior to 17.7.1 WHQL (the 2nd one) works great







(on Fiji with tMOD + HBM V-Mod + CU-nlock.)
No other soft can do that, even AB









Here my TriXX 5.2.1 with MOD & HEX edited Limits + #better hook
-> https://mega.nz/#!Ac82AB7T!zsxYESeVStmCSvjFCoVJO8epkAeBbsWkk3baV7yTonc


----------



## hellm

Try the latest beta of Afterburner; it works with Polaris, at least for clock rates/voltage; power limit only via WattMan.
http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


----------



## PontiacGTX

Quote:


> Originally Posted by *hellm*
> 
> try the latest beta of Afterburner; works with Polaris, at least clockrates/voltage; powerlimit only with wattman.
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html


It doesn't work for 17.7.2


----------



## neurotix

I have dual Fury Nitros. I installed 17.7.2 on Win7 x64 SP1.

It seems to work fine with the stock clocks of my cards, and most of the new features are available. I use TriXX 5.2.1 with it, and overclocking is broken, but the fan speed control seems to work. I generally leave my cards at stock anyway unless I'm benching, because two of these together are still powerful enough for any game at my resolution (5760x1080).

I've had to use TriXX 5.2.1 with my Furys for some time anyway because any newer version of TriXX breaks the fan speed control. Then, my fans are always at 100%. So I've been using the older version anyway. Fan control still works and that's mostly all I care about.

I don't mess with WattMan but Chill seems to work for both cards, it's fairly useless though at my resolution in any demanding game because it makes the fps tank unless you're moving, which I dislike. (35-40 fps standing still in Witcher 3)


----------



## bluezone

lively discussion on Vega.

http://www.overclock.net/t/1635439/wccf-amd-rx-vega-64-56-pricing-leaked-499-399-respectively-liquid-cooled-model-to-cost-599


----------



## Bojamijams

So there's just no way to overclock on 17.7.2 until a new version of Afterburner/Trixx is released?

I'm finding even the fan control with regular WattMan on 17.7.2 is bad. It just doesn't want to go past 33% no matter how hot my Fury gets. I had to manually raise the minimum to 50% to keep it from hitting 80C, but that just pins the minimum at 50%. This is even after I uninstalled TriXX and 17.7.2 with DDU and reinstalled again.

So annoying


----------



## xkm1948

Quote:


> Originally Posted by *Bojamijams*
> 
> So there's just no way to overclock on 17.7.2 until a new version of Afterburner/Trixx is released?
> 
> I'm finding even the FAN with regular Wattman on 17.7.2 is bad. It just doesn't want to go past 33% no matter how hot my Fury gets. I had to manually raise the minimum to 50% to be able to keep it from hitting 80C but that just makes the minimum at 50%. This is even after I uninstalled Trixx and 17.7.2 with DDU and reinstalled it again.
> 
> So annoying


Seeing similar things here as well. Reverting back to 17.7.1 soon.


----------



## gupsterg

Still on v16.12.2 WHQL here. As I don't use software to OC or undervolt the card, that is not my reason to avoid v17.x.x drivers. Why I avoid them is that there is no performance gain from an HBM clock change on v17.x.x drivers; has anyone checked whether v17.7.2 has that issue? Cheers.


----------



## xkm1948

Well, FineWine is gone. These new drivers from RTG seem to be a big FU to people who bought their now just 2-year-old flagship cards. No new software features added for Fiji, and a lot more bugs. This is really stupid.

I hope no one buys their overpriced Vega. RTG under Raja's control is no longer the beloved ATi brand any more.


----------



## Alastair

Anyone having issues with the 17.7.2 and even 17.6.1 drivers and software OC? Mine keeps dropping to DPM 1 when I try to OC.


----------



## diggiddi

Went back to 17.7.1. The new drivers let you move the memory clock slider but seem to bork everything, even memory.


----------



## neurotix

With mine, it locks the cards at idle clocks if you attempt to OC, so my core clock is 300MHz or something. Obviously that makes games and benches totally unusable. I believe the HBM stayed at the default 500MHz.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> With mine it locks the cards at idle clocks if you attempt to OC. So my core clock is 300mhz or something. Obviously makes games + benches totally unusable. I believe the HBM stayed at the default 500MHz.


This is exactly what I am experiencing.


----------



## Ish416

Anyone else having massively fluctuating clock speeds and black screens after trying the 17.7.2 driver that won't go away even after reverting to older drivers?

After installing the 17.7.2 driver, I started having horrible performance in nearly every game (including black-screen lock-ups) with my Fury X, and the core clock seems to randomly jump between 500-900MHz, never going to its normal speed of 1050MHz. Also, according to GPU-Z and AB, my VDDC basically stays under 1.000V at all times. Even when running the GPU-Z render test, load stays at 99-100%, core clocks jump between 500-900MHz (usually staying between 776-874MHz), and VDDC is nearly locked at 0.9750V under load. Nearly the same thing happens when gaming. Temps never exceed 43C with the Swiftech block on my Fury X.

Before this driver install, clocks were solid at 1050MHz on the core and VDDC was always around 1.22V at that speed.

I tried running DDU and reverting to 16.12.2 and still have the same issues with the random clocks, and I have lost the ability to manually set speeds and voltages with AB and Trixx. I tried flipping the BIOS switch (both were stock BIOSes): same issue. I also tried this Fury in my X58 system, and it has the same issues there, even though that system has never had drivers newer than 16.12.2.

Also, WattMan does absolutely nothing when trying to manually set clocks. Regardless of the settings I put into it, the card seems to do whatever it wants with clock speeds and voltages.


----------



## huzzug

Have all of you who are having issues tried not running any third-party tools and just using WattMan? There was talk that the developers of these third-party applications need to update their programs.


----------



## Alastair

Quote:


> Originally Posted by *huzzug*
> 
> Have all you guys who're having issues tried to not run any third party tools and using Wattman? There were talks about the developers of these third party applications needing to update their programs.


I don't like wattman. No HBM clocks.


----------



## LionS7

Check my thread on the AMD forum. At the bottom there is MSI Afterburner 4.4.0 Beta 15 for 17.7.2.
https://community.amd.com/thread/218516


----------



## Ne01 OnnA

New beta is available:
-> office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta15.rar

Changes:

- Active bus clock monitoring for Intel CPUs. The previous version measured the Intel bclk just once on startup, so if you use software CPU overclocking via adjusting the bus clock on the fly, the wrong CPU clock speed could be displayed. Now the bus clock is measured on each hardware polling period.
- Added OverdriveN X2 overclocking API support for 17.7.2 AMD display drivers.
- The unofficial overclocking API is currently broken in the 17.7.2 drivers (applying clocks with it leaves the GPU stuck in the lowest P-state until reboot), so MSI Afterburner forcibly disables unofficial overclocking mode and always uses the official ADL overclocking codepath on 17.7.2 and newer drivers. However, unofficial overclocking mode can still be manually unlocked via the configuration file on 17.7.2 and newer drivers if AMD decides to provide a fix for unofficial overclocking mode in the future.
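For anyone who later wants to re-enable unofficial overclocking once AMD fixes it, that last changelog item refers to editing Afterburner's configuration file. To the best of my knowledge the relevant keys look like the snippet below, but treat the exact section and key names as an assumption and check the readme shipped with your Afterburner version before editing:

```ini
; In MSIAfterburner.cfg (section/key names per older Afterburner readmes; verify in your install)
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Leave it disabled on 17.7.2 for now, since the changelog says the unofficial path currently wedges the GPU in its lowest P-state.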


----------



## Minotaurtoo

Quote:


> Originally Posted by *Alastair*
> 
> I don't like wattman. No HBM clocks.


I just used BIOS editing for HBM clocking... it just worked better. I say "worked" because the driver update seems to have borked even BIOS-clocked HBM improvements. But once you figure out what you want to run OC-wise, it's a little less trouble in the long run to just put it in the BIOS and leave it; then even if you swap PCs the changes stay. Of course there are risks... that's what the dual BIOS is for.


----------



## weespid

For Trixx on 17.7.1 I could apply an OC by clicking apply and restarting the computer, though any change of settings afterwards required another restart.
Once I get back home, gupsterg, I will run HBM benches on 17.7.2, though I doubt it is fixed; I have 550MHz HBM shown as my clock in WattMan and GPU-Z.

Back on to Doom: anyone figure out the white squares issue in MP with Vulkan? Apparently older drivers (16.9.x) are supposed to fix it, but I can't get the game to load with them.


----------



## gupsterg

Thanks. If you have AIDA64 (even the evaluation version will work), run a GPGPU bench.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> Thanks. If you have AIDA64 (even the evaluation version will work), run a GPGPU bench.


Do you have an HBM timing table?


----------



## Alastair

Anyone know which was the latest AMD driver before they broke 3rd-party OC support? I'm on 17.5.2 but still seem to be having issues.


----------



## gupsterg

Quote:


> Originally Posted by *PontiacGTX*
> 
> do you have a HBM timming table?


Nope.

I have used the tighter 400MHz strap timings in the 500MHz and 600MHz straps; this leads to instability at times, and also yields very little performance gain.

HBM clock increase just gets me back some scaling loss from voltage increase I need for 1145MHz on GPU to be stable.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> Nope.
> 
> I have used the tighter 400MHz strap timings in 500MHz and 600MHz strap, this leads to instability at times. These also yield very little performance gain.
> 
> HBM clock increase just gets me back some scaling loss from voltage increase I need for 1145MHz on GPU to be stable.


Is there a link from the IMC to the core clock voltage like Hawaii, then?

Also, how exactly do you know the highest stable clock for HBM without increasing VDDCI?


----------



## LionS7

Quote:


> Originally Posted by *PontiacGTX*
> 
> is there a link from IMC to core clock voltage liek Hawaii then?
> 
> also exactly how do you know what is the highest stable clock for HBM without increasing VDDCI


No, there's no link. You increase the memory clock until you see artifacts, like red/green dots. Highly unstable HBM will freeze the system, e.g. a black screen.


----------



## gupsterg

VDDCI can only be modified by hard volt mod. IR3567B is controlling GPU/HBM voltage only on Fiji. The controller on Fiji for VDDCI has no data interface for SW/BIOS to manipulate it.


----------



## 1mpurity

Quote:


> Originally Posted by *gupsterg*
> 
> VDDCI can only be modified by hard volt mod. IR3567B is controlling GPU/HBM voltage only on Fiji. The controller on Fiji for VDDCI has no data interface for SW/BIOS to manipulate it.


Hey, me again.

The HBM performance is still broken on this driver as well, so I'm still avoiding it with a 10ft pole at least.


----------



## gupsterg

Cheers.

I noted that, and I still get artifacts in SWBF on the stock VBIOS on any v17.x.x driver (1440p Ultra using FreeSync); revert to 16.12.2 WHQL and zero issues. Tested in both W7 Pro x64 and W10C Pro x64.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> VDDCI can only be modified by hard volt mod. IR3567B is controlling GPU/HBM voltage only on Fiji. The controller on Fiji for VDDCI has no data interface for SW/BIOS to manipulate it.


I wonder how someone who told me they had an R9 Fury Nitro undervolted the HBM, then.


----------



## gupsterg

HBM voltage can be adjusted by VBIOS. I added ROMs with the relevant editable register in OP of my thread ages ago.


----------



## PontiacGTX

Well, it's not worth the hassle of editing the BIOS to undervolt for ~10W.


----------



## gupsterg

I was only able to knock 25mV off the HBM at stock clocks, and in some use cases I had issues. For example, [email protected] on a 12hr run would give GPU bad states.

I believe the HBM was not as good quality as it should be. SK Hynix's early talk was 1.2V; what we have is 1.3V.

HBM2 on Vega is 1.35V, again not as low as it should be. Dunno why it is that way for both gens of HBM.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> I was able to only knock 25mV on HBM at stock clocks and in some case usage I had issues. For example [email protected] on a 12hr run I'd get GPU bad states.
> 
> I believe HBM was not as good quality as it should be. SKHynix / early talk was 1.2V, what we have is 1.3V.
> 
> HBM2 on VEGA is 1.35V, again not as low as it should be, dunno why it is that way for both gens of HBM.


Probably to ensure stability under all scenarios; the voltage won't make a big difference in overall video card power draw.


----------



## Semel

Is anyone here who plays at 1080p/1440p going to sell their Fury and get a Vega (56 or 64)? TBH I wasn't impressed at all, to say the least, when watching duderandom84's videos.

Besides, selling it now I would get just a fraction of the price I paid for it.


----------



## dagget3450

Quote:


> Originally Posted by *Semel*
> 
> Is anyone here who plays at 1080p\1440p going to sell their fury and get vega(56 or 64)? Tbh I wasn't impressed at all to say the least when watching duderandom84 videos.
> 
> Besides selling it now I would get just a fraction of the price I paid for it.


Not saying Vega is fast, because right now it's not. That said, duderandom84's videos of his Vega FE are gimped somewhere... his fps is way too low when I compare against my Vega FE. He was using a 7700K if I recall, and I am using Haswell-E or Ryzen... so his fps should easily have been higher than mine.


----------



## LionS7

Quote:


> Originally Posted by *dagget3450*
> 
> not saying Vega is fast because right now its not. That said duderandom84's videos of his vega fe are gimped somewhere.... his fps is way to low when i compared against my vega fe. He was using a 7700k if i recall and i am using haswell-e or ryzen... so his fps should have been higher than mine easily.


His Vega Frontier Edition is dropping frequency in almost every test. Only in Battlefield 1 DX12 does the core stay stable at 1600MHz.


----------



## Alastair

So what do you guys think about Vega so far? I haven't looked much into it yet. But are you seeing a ~65% improvement in performance over Fiji? Because it's clocked about that much faster.

Also, which drivers worked before AMD broke 3rd-party OC support?


----------



## rubenlol2

The reviews for Vega seem to be all over the place in regards to how it stacks up against the competition, and a lot of benchmarks show little to no improvement over the Vega FE at launch.


----------



## Alastair

Quote:


> Originally Posted by *rubenlol2*
> 
> The reviews for vega seems like they're all over the place in regards to how they stack up against the competition.
> And a lot of benchmarks show little to no improvements over Vega FE at launch.


I am trying to work out what the near-70% improvement in clocks has translated to.


----------



## u3a6

Quote:


> Originally Posted by *Alastair*
> 
> I am trying to work out what the near 70% improvement in clocks has translated to?


Sadly, something like 550W under load when overclocked at +50% PL on the Vega 64 LC edition... I'm wondering what drivers will do... Is tile-based rendering even working right now?


----------



## Alastair

Quote:


> Originally Posted by *u3a6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> I am trying to work out what the near 70% improvement in clocks has translated to?
> 
> 
> 
> Sadly, something like 550W under load on overclocked +50% PL on Vega 64 LC edition... I'm wondering what drivers will do... Is tiled based rendering even working right now?

Something is clearly going wrong SOMEWHERE!


----------



## miklkit

Ok, I have been reading things going on with this thread for some time and know about the driver issues, but have been lucky to have no problems. Until today.

My Sapphire Fury now seems to be stuck at a 300MHz core clock. When I first updated to the 17.7.2 drivers I didn't notice any difference, but today things are awful. The only other thing I have done is update the motherboard PCI drivers yesterday.

So, which drivers are most likely the culprit, or is there some setting somewhere I am unaware of? I could not find anything in MSI Afterburner.


----------



## u3a6

Quote:


> Originally Posted by *Alastair*
> 
> Something is clearly going wrong SOMEWHERE!


I wanted to believe that it was the drivers, the tile-based renderer not working, etc... For me it looks like AMD transformed GCN into something like NetBurst... The card draws so much power that it can't even properly stretch its legs, even with +50% PL... IIRC Buildzoid already reported that frequency scaling is really affected by the power limit... This might shed some light on the issues:

http://cxzoid.blogspot.pt/2017/08/first-impressions-of-vega-fe-on-ln2.html


----------



## Alastair

Quote:


> Originally Posted by *miklkit*
> 
> Ok, I have been reading things going on with this thread for some time and know about the driver issues, but have been lucky to have no problems. Until today.
> 
> My Sapphire Fury now seems to be stuck at 300 mhz core clocks. When I first updated to the 17.7.2 drivers I didn't notice any difference, but today things are awful. The only other thing I have done is update the motherboard PCI drivers yesterday.
> 
> So, which drivers are most likely the culprit, or is there some setting somewhere I am unaware of? I could not find anything in MSI Afterburner.


AMD broke OC support in their drivers unless you do it through wattman. I want to roll back to the last known good driver where 3rd party OC support works. But no-one as of yet has been able to tell me which driver it is.


----------



## miklkit

Well, I couldn't wait and used the AMD Cleanup Utility to clean out all AMD software (which it didn't fully do). Then I reinstalled the 17.6.1 drivers and all is back to normal so far. 8 fps in modern games is beyond bad.

This Sapphire Nitro Fury is, and always has been, running at stock clocks after I tried to OC it right after buying it and failed totally.


----------



## gupsterg

@Alastair

I've been on v16.12.2 WHQL in W7/W10C, no issues in games I use, no issues for MSI AB, no issues on HBM OC, etc, etc.


----------



## diggiddi

Last good driver is 17.7.1 AFAIK


----------



## gupsterg

Putting aside OC tool compatibility, etc., every v17.x.x driver artifacts for me in SWBF regardless of OS, even on the stock ROM. Other games I have no issue with. I revert to v16.12.2 WHQL and SWBF is spot on, regardless of OS and regardless of whether I use the stock or modded ROM.

For me, since the v17.x.x drivers it's become like the Fiji cards are not getting any real support.


----------



## diggiddi

Yeah no mo' fine wine


----------



## bluezone

Gup, have you tried this beta driver for Vega?

http://www.guru3d.com/news_story/amd_radeon_vega_17_8_1_beta_6_driver_download.html

Custom install only, installing over the top of the old driver.

I'm just trying them out now.


----------



## gupsterg

I wasn't aware of it; I've been up to my neck in Ryzen meddling and another little side project.

Will try it now, +rep for the share.
Quote:


> Originally Posted by *diggiddi*
> 
> Yeah no mo' fine wine


Yeah, doesn't look like it at all.
Quote:


> Originally Posted by *Alastair*
> 
> I am trying to work out what the near 70% improvement in clocks has translated to?


Dunno.

OCUK have highlighted limited quantities of Vega 56 going for £349 on 28th Aug; I may grab one then. For me, Vega at current prices is not the right price-to-performance to jump from a Fury X, TBH.


----------



## ManofGod1000

Well, looks like I could use some help. I have the X370 Taichi and was using the 2.40 BIOS. I have 2x Furies, a Sapphire Tri-X and a Sapphire Nitro+. I decided to bench Gears of War 4 to see what my results were compared to Vega 56 and 64. Well, the game is causing my system to black out: the USB bus shuts off, the video out shuts off, and only the power on the computer stays on until I force it off and back on.

I removed the drivers and tried 17.5.2 and 17.7.2 with the same results. Sometimes I can be in the game for a while before it goes out; other times it just goes out. I tried each card individually and only the Tri-X has the issue. I switched the toggle switch to the port direction, but that did not help. Also, when CrossFire is enabled and the Tri-X is the second card from the top, it still does it (it does not if I disable CrossFire, though). I played Crysis 3, which worked fine with CrossFire enabled, and also used 3DMark without issue.

I went ahead and flashed the 3.00 BIOS, but that did not help. I also made sure everything was at stock, but that did not help. I am running a 1700X with 2x8GB of G.Skill RAM and a 1kW Seasonic power supply. Any ideas? Thanks.


----------



## diggiddi

What are your cpu temps looking like?


----------



## ManofGod1000

Quote:


> Originally Posted by *diggiddi*
> 
> What are your cpu temps looking like?


Last time I checked, somewhere in the 60's at load.


----------



## diggiddi

Quote:


> Originally Posted by *ManofGod1000*
> 
> Last time I checked, somewhere in the 60's at load.


Hmm, try the 17.7.1 drivers and see


----------



## jearly410

Hey @ManofGod1000, I'm just commenting to say that I am running the Taichi 3.0 BIOS with a Fury X and a Fury Nitro, so we have very similar systems. The only time I've had a system shut off like that (everything black, etc., but power still there) is when my CPU overclock is bad. Do you have anything overclocked?


----------



## bluezone

Vega BETA drivers.

With these new drivers, graphics scores:
FS seems about the same, maybe a hair lower (200 pts).
3DM11 seems to have picked up about 2000 points in graphics score.
Metro 2033 (no Redux) has no dip in the middle third of the run (exploding barrels to tank), but still a very variable frame rate like always. Maybe a 25 fps average increase (DX11).
RotTR is smoother and faster at my old best settings. I will have to experiment with higher settings.

I will have to play around some more.

----------



## diggiddi

Quote:


> Originally Posted by *bluezone*
> 
> Vega BETA drivers
> 
> With these new drivers, graphics scores.
> OK. FS seems about the same maybe a hair lower (200 pts).
> 3DM11 seems to of picked up about 2000 points graphics score.
> Metro 2033 (no redux) has no dip in the middle 1/3 of the run (exploding barrels to tank), but still very variable frame rate like always. Maybe 25 fps average increase (DX11).
> RotTR is smoother and faster @ my old best settings. I will have to experiment with higher settings.
> 
> I will have to play around some more.


Where dey at?
Edit : nevamind, so what gpu do you have?


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Putting aside OC tool compatibility, etc, every v17.x.x driver for me artifacts in SWBF regardless of OS used, even on stock ROM. Other games I have no issue. I revert to v16.12.2 WHQL and SWBF is spot on, regardless of OS, regardless if I use stock/mod ROM, etc, etc.
> 
> For me since v17.x.x drivers it's become like the Fiji cards are not getting any real support
> 
> 
> 
> 
> 
> 
> 
> .


The last good driver for me was 17-something, but I am not sure which. I don't want to go downloading a bunch of drivers on a limited data cap; I would rather find the right one the first time and go from there.

You would think that Vega (besides the architectural changes) would trickle down improvements to Fiji, since the core isn't massively different, is it?


----------



## Alastair

Also, is there a cryptocurrency mining club on OCN where one can ask mining-related questions? I don't want to ask mining questions regarding Polaris in the Fiji club.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Alastair*
> 
> AMD broke OC support in their drivers unless you do it through wattman. I want to roll back to the last known good driver where 3rd party OC support works. But no-one as of yet has been able to tell me which driver it is.


For me it's 17.7.1 WHQL.

Forza -> OK
BF1 -> OK
TF2 -> OK
ME:A -> OK
NFS Ug3 -> OK
NFS Rivals -> OK

Hope this helps.

17.7.1 -> best for Fiji IMO.
Best stable OC; additional mV not needed like in 17.1.2 WHQL.
Best stable HBM tMOD OC, up to 565-569 in CE3 games (normally 550).

1120/569 @ 1.218V, +12% power limit.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> For me it's 17.7.1 WHQL.
> 
> Forza -> OK
> BF1 -> OK
> TF2 -> OK
> ME:A -> OK
> NFS Ug3 -> OK
> NFS Rivals -> OK
> 
> Hope this helps.
> 
> 17.7.1 -> best for Fiji IMO.
> Best stable OC; additional mV not needed like in 17.1.2 WHQL.
> Best stable HBM tMOD OC, up to 565-569 in CE3 games (normally 550).
> 
> 1120/569 @ 1.218V, +12% power limit.


I was under the distinct impression that HBM1 clocked in steps, i.e. 500-545-600, and that somewhere around 17.x.x OC'ing was effectively disabled, making a 569 setting run at 500 instead of 600. Not an expert here by any means, but some here who are have made these assertions. If that's not the case, I would like to see memory benchmark proof so I can pass it along.
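If the step theory above is right, the driver would be quantizing whatever clock you request to a strap rather than honoring it exactly. A small illustrative sketch of that idea (the strap list, nearest-strap rounding, and the "broken driver" behavior are assumptions drawn from this discussion, not confirmed driver internals):

```python
# Hypothetical HBM strap snapping: a requested clock is quantized to a known
# strap; a broken driver ignores the request entirely and stays at stock.
STRAPS = (500, 545, 600)

def effective_hbm_clock(requested_mhz, straps=STRAPS, allow_oc=True):
    if not allow_oc:
        # OC path broken (as reported on some 17.x.x drivers): stuck at stock.
        return straps[0]
    # Otherwise snap the request to the nearest strap.
    return min(straps, key=lambda s: abs(s - requested_mhz))
```

Under nearest-strap snapping a 569MHz request would land on the 545 strap; with OC broken it would sit at 500, which would explain benchmarks showing no gain from the slider.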


----------



## miklkit

I've been going to PassMark to compare my new rig to others and have noticed that it really doesn't rate the Fury well. It is well down the list, and they have the Fury and Fury X combined. Then I found that my stock Fury scores way better than their baseline, which is nuts.

So where can I find a place that ranks the Fury accurately?


----------



## gupsterg

Quote:


> Originally Posted by *diggiddi*
> 
> so what gpu do you have?


bluezone has a Nano; he has done some mods to the stock cooling on the card.
Quote:


> Originally Posted by *Minotaurtoo*
> 
> I was under the distinct impression that HBM 1 clocked in steps ie 500-545-600 and somewhere around 17.x.x the OC'ing was effectively disabled making 569 be 500 instead of 600.... not an expert here by any means, but some here that are experts have made these assertions.... if that's not the case I would like to see memory benchmarks proof of such please so I can pass them along


You don't wanna ask Ne01 OnnA for benchmarks on HBM, you'll be bewildered.


----------



## Minotaurtoo

Unfortunately the Fury and Fury X didn't really make that much of an impact. I don't know why, because AMD had trouble keeping up with sales, they sold so fast. But for whatever reason (rigged, maybe) many benchmarks and websites just ignored them or pushed them aside. Even Steam numbers show them way down the list. I have a hard time believing there are so few being used for gaming, but that's the way they make it look. Maybe they're all in crypto mining, who knows. Good luck finding a site or bench that has them properly ranked.


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> bluezone has a Nano, he has done some mods to stock cooling on card
> 
> 
> 
> 
> 
> 
> 
> .
> You don't wanna ask Ne01 OnnA for benchmarks on HBM, you'll be bewildered
> 
> 
> 
> 
> 
> 
> 
> .


I always ask people for benchmark proofs... btw... I have a pretty good BS-o-meter going on... I just act like I don't to see what they have first... sometimes I've actually been surprised at the claims that turned out to be true... others... well.. yeah.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> You don't wanna ask Ne01 OnnA for benchmarks on HBM, you'll be bewildered.


Does HBM get any performance increase at 600MHz?


----------



## gupsterg

@Minotaurtoo

@PontiacGTX

For me it's minimal at best.

Using 545MHz HBM gets me back the ~1-2% performance scaling in gaming that I lose from increasing the VID for DPM 7. This thread has lots of Fury X 3DM 13 results. Even at stock clocks, lowering the VID actually increases performance by a very small, consistent amount.


----------



## neurotix

Guys, if you want to overclock Fury on 17.7.2, just use this Afterburner Beta. I tried it and it works.

17.7.2 adds Radeon Chill for Fury, Chill works in Crossfire, and a host of other improvements. You should use it.

Hope this helps.


----------



## Alastair

Quote:


> Originally Posted by *neurotix*
> 
> Guys, if you want to overclock Fury on 17.7.2, just use this Afterburner Beta. I tried it and it works.
> 
> 17.7.2 adds Radeon Chill for Fury, Chill works in Crossfire, and a host of other improvements. You should use it.
> 
> Hope this helps.


Chill for crossfire say WHAT xD Imma give this a shot!


----------



## Ne01 OnnA

Please try this new tool for GCN OC:

-> http://forums.guru3d.com/showthread.php?t=416116

Maybe this one will work with our modded Fiji.

I will try it out when the 17.8.1 non-beta hits us.


----------



## jearly410

Quote:


> Originally Posted by *Alastair*
> 
> Chill for crossfire say WHAT xD Imma give this a shot!


Lol let me know how it goes.


----------



## PontiacGTX

I just found that the driver from Aug 11 improved the graphics score slightly in 3DMark. Has anyone compared its performance vs the 17.7.2 drivers?


----------



## Bojamijams

Quote:


> Originally Posted by *neurotix*
> 
> Guys, if you want to overclock Fury on 17.7.2, just use this Afterburner Beta. I tried it and it works.
> 
> 17.7.2 adds Radeon Chill for Fury, Chill works in Crossfire, and a host of other improvements. You should use it.
> 
> Hope this helps.


Every time I used Chill, it made things sluggish. Not to mention it makes FreeSync act crazy because of how low the fps gets.

I've never seen a point to Chill myself, and I don't believe any AMD marketing since the BS they pulled with the failure called Vega.


----------



## neurotix

Quote:


> Originally Posted by *Bojamijams*
> 
> Every time I used CHILL, it always made it sluggish. Not to mention, makes Freesync act crazy due to how low it gets.
> 
> Never seen a point to CHILL myself, and I don't believe any AMD marketing since the BS they pulled with the failure called Vega


Can't really argue with this. It's probably better to turn on "Power Efficiency" mode in the drivers for certain games than Chill. Power Efficiency mode can reduce temps drastically in certain games. In others, it induces massive stuttering and performance loss so it is better to leave it off for some games (especially recent ones). Games it works well with are Tomb Raider/ROTTR, Skyrim, Bulletstorm (or any Unreal Engine game, especially older Unreal Engine). Games it works poorly with and causes stuttering, Dragon Age Inquisition and Witcher 3 are the biggest ones I play. Best to leave it off in those games.

The frame rate drop using Chill is pretty unacceptable; of course it only happens when you're not moving, but it's still really distracting, and yeah, I'd imagine it would ruin something like FreeSync.

Word on the Vega thing. I'm really disappointed too. I think we probably all are.


----------



## bluezone

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I always ask people for benchmark proofs... btw... I have a pretty good BS-o-meter going on... I just act like I don't to see what they have first... sometimes I've actually been surprised at the claims that turned out to be true... others... well.. yeah.


Yes, as you have been told so kindly by gupsterg, I have a Nano.

It's easy to claim BS and do nothing to check things out on your own. I see this all the time in other threads; it's not constructive and starts flame wars. But maybe you should have noted this at the end of my post before throwing the BS card.
Quote:


> Originally Posted by *bluezone*
> 
> Vega BETA drivers
> I will have to play around some more.


It indicates that this is of a "tentative nature", with no reference to this situation. I generally look at investigative posts at face value and ignore the card-throwers, to see if there is any value in whatever statement has been made on its own. Then I make my own judgement and maybe try things out on my own.

Now, I will not do leg work for you for something that was tentative, but I will do it for myself, and more than willingly for others if it seems warranted. I'm kind of busy right now and ended up staying way after midnight trying to figure out what was going on with the driver, because I do not like open questions for myself.

As for the BETA driver, after playing around with it, at this point I would not recommend it. It's buggy. I had to DDU it and reinstall it; for instance, my Nano doesn't have 8GB of HBM. LOL
I didn't have my answer for my 3DM11 scores until I saw this.

The short version is, it appears the driver is dropping some of the geometry on my test runs, thus likely why it has not been released for other cards. I had done my original test runs unsupervised while taking care of other priorities. A system restore to the previous install of the driver and a rerun of the bench showed missing assets during the 3DM11 run. This was likely the cause of my higher score.









But before you go off again on this.
Quote:


> Originally Posted by *bluezone*
> 
> Vega BETA drivers
> I will have to play around some more.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I was under the distinct impression that HBM 1 clocked in steps ie 500-545-600 and somewhere around 17.x.x the OC'ing was effectively disabled making 569 be 500 instead of 600.... not an expert here by any means, but some here that are experts have made these assertions.... if that's not the case I would like to see memory benchmarks proof of such please so I can pass them along


No benchmark software (right now) is capable of showing us the real HBM benefits.
In 3DMark 13 I get +300pts when HBM is OCed.

AIDA shows exactly the same numbers every time (yes, the same, no differences at all, lol).
We need to wait for nV to adopt HBM; then every benchmark will have real data. For now, somebody with Fiji/Vega would have to write appropriate code to show it off.

Because the trend now is that GDDR5X is "faster" than HBM --->

HBM_1 & HBM_2 have:

1. Ultra-fast access times & ultra-low latency
2. 8x to 16x wider bus width
3. Short physical distance, ultra-fast GPU connection
4. 1GT/s to >2GT/s (GDDR6 will have ~512GT/s, lol)

The fact is that HBM is really the 2nd wonder for Ultra HD & VR gaming (right after the 1st wonder, GDDR). But GDDR is old & obsolete now; it's cheap & easy to produce, etc.

It's IMO only.
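Point 2 in the list above is where HBM's headline numbers come from, and the arithmetic is easy to check: peak bandwidth is just bus width times transfer rate. A quick sketch using public Fiji HBM1 figures (4096-bit bus at 500MHz DDR, i.e. 1GT/s), with a 256-bit GDDR5X card as a comparison point:

```python
# Peak memory bandwidth: (bus width in bits / 8 bits per byte) * transfer rate in GT/s -> GB/s.
def peak_bandwidth_gbs(bus_bits, transfer_gts):
    return bus_bits / 8 * transfer_gts

fiji_stock = peak_bandwidth_gbs(4096, 1.0)   # Fiji HBM1 at stock 500MHz DDR: 512.0 GB/s
fiji_545 = peak_bandwidth_gbs(4096, 1.09)    # 545MHz strap: ~558 GB/s
gddr5x_256 = peak_bandwidth_gbs(256, 10.0)   # e.g. 10GT/s GDDR5X on a 256-bit bus: 320.0 GB/s
```

The per-pin rate of GDDR5X is much higher, but the 4096-bit bus is why HBM1 still comes out ahead on raw bandwidth.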


----------



## PontiacGTX

Ne01 OnnA, have you tried using the Vega drivers on Fiji?


----------



## Minotaurtoo

Quote:


> Originally Posted by *bluezone*
> 
> Yes as you have been told, so kindly by Gupsterg, I have a NANO.
> 
> 
> 
> 
> 
> 
> 
> 
> It's easy to claim BS and do nothing to check things out on your own. I see this all the time in other threads, its not constructive and starts flame wars. But maybe you should of noted this at the end of my post before throwing the BS card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It indicates that this is of a "tentative nature".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With no reference to this situation. I generally take investigative posts at face value and ignore the card throwers, to see if there is any value in whatever statement has been made on its own. Then I make my own judgement and maybe try things out on my own.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I will not do legwork for you for something that was tentative, but I will do it for myself, and more than willingly for others if it seems warranted. I'm kind of busy right now and ended up staying well past midnight trying to figure out what was going on with the driver, because I do not like open questions for myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the BETA driver, after playing around with it, at this point I would not recommend it. It's buggy. I had to DDU it and reinstall; for instance, my Nano doesn't have 8GB of HBM. LOL
> I didn't have my answer for my 3DM 11 scores until I saw this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The short version is that the driver appears to be dropping some of the geometry in my test runs, which is likely why it has not been released for other cards. I ran my original tests unsupervised while taking care of other priorities. After a system restore to the previous driver install and a re-run of the benchmark, assets were visibly missing during the 3DMark 11 run. This was likely the cause of my higher score.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But before you go off again on this.


Uh, blue, I wasn't talking to you. I only say this in case you thought I was picking on you; you're welcome to reply to my comments, of course, I just want to make sure you know I wasn't talking to you. That comment was made because I was warned against asking for benchmark results confirming what Ne01 OnnA was claiming. I was making that statement to assure the person warning me that I was capable of weeding out fake or otherwise wrongly generated results. I fully intended to do my own checking once I knew which benchmarks were used.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *PontiacGTX*
> 
> Ne01 OnnA have you tried using VEGA drivers on Fiji?


Yup -> 17.8.1 beta 6, but OC is not working, so I'm on 17.7.1 WHQL now.

There's a new tool on Guru3D for Polaris and the other GCN cards -> try it.


----------



## Minotaurtoo

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> No benchmark software (right now) is capable of showing us the real HBM benefits.
> In 3DMark (2013) I gain about +300 points when HBM is OCed.
> 
> AIDA shows exactly the same numbers every time (yes, the same, no differences at all lol).
> We may need to wait for nV to adopt HBM; then any benchmark will have real data (it's the 4th wonder). For now, somebody with Fiji/Vega would have to write appropriate code to show it off.
> 
> Because the current trend is to claim that GDDR5X is faster than HBM --->
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HBM_1 & HBM_2 have:
> 
> 1. Ultra-fast access times & ultra-low latency
> 2. 8x to 16x wider Bit Bus width
> 3. Short physical distance, Ultra fast GPU connect
> 4. 1GT/s to >2GT/s (GDDR6 will have ~512GT/s lol)
> 
> The fact is that HBM really is the 2nd wonder for Ultra HD & VR gaming (right after the 1st wonder, GDDR).
> But GDDR is old & obsolete now -> it's for cheap GPUs -> cheap & easy to produce (yes, it makes mo' $).
> etc.
> 
> It's only my opinion.


I have tried various numbers in the past and couldn't disprove the stepping claims, though. It seems accurate that it goes in steps (500-545-600), but then I haven't played with the tMODs either.
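The stepping hypothesis discussed above (unconfirmed; the 500/545/600 values come from this thread) amounts to the driver flooring a requested HBM clock to the nearest supported state. A minimal sketch of that model:

```python
# Sketch of the "HBM clocks in steps" hypothesis (unconfirmed, per this thread):
# a driver that only honours a few discrete memory states floors any requested
# clock to the highest supported step at or below it.

HBM_STEPS_MHZ = [500, 545, 600]  # step values claimed in this thread

def effective_hbm_clock(requested_mhz, steps=HBM_STEPS_MHZ):
    """Return the step a requested HBM clock would actually land on."""
    eligible = [s for s in steps if s <= requested_mhz]
    return max(eligible) if eligible else min(steps)

print(effective_hbm_clock(569))               # 545: a 569 MHz request lands on 545
# If a driver dropped the upper steps (as claimed of the 17.x drivers),
# the same request would fall back to 500:
print(effective_hbm_clock(569, steps=[500]))  # 500
```

Under this model, a benchmark comparison at 569 vs 545 would show no difference, which is one way to test the claim.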


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Minotaurtoo*
> 
> I have tried various numbers in the past and couldn't disprove the statements of HBM stepping though... seems like they are accurate that it goes in steps... 500-545-600... but then I haven't played with Tmods either.


It's all about latency, GT/s & throughput.
My own research shows me it's great combined with a FreeSync monitor.

I have better frame times when the tMOD is in use (the FPS gain is small, 2-5 FPS max, because HBM is already fast as light).
It's all about personal experience -> if you can use *@Gupsterg*'s tMOD, just do it and don't look back (also give HBM a small bump in voltage; I use 1.337V).

If you ask me about stepping in HBM, I can't tell you if it's really true. IMO it is like any other graphics RAM.
In one game I can go up to 570, but in another I need to lower it to 565, etc.









Hope this helps, take care Bratan'


----------



## bluezone

Quote:


> Originally Posted by *Minotaurtoo*
> 
> Uh, blue, I wasn't talking to you. I only say this in case you thought I was picking on you; you're welcome to reply to my comments, of course, I just want to make sure you know I wasn't talking to you. That comment was made because I was warned against asking for benchmark results confirming what Ne01 OnnA was claiming. I was making that statement to assure the person warning me that I was capable of weeding out fake or otherwise wrongly generated results. I fully intended to do my own checking once I knew which benchmarks were used.


I had to reread the post of yours I was referring to several times. Not to be a grammar Nazi (I highly dislike them), but your umbrella quote of Gupsterg included me, which had me scratching my head as to your intent. Thank you for enlightening me as to what you were trying to say.









As for OnnA, I do not know him well at all, but English isn't his 1st language. Once in a while he has found a few interesting and useful things.


----------



## Ne01 OnnA

THX @bluezone
Trying to be useful


----------



## gupsterg

@bluezone

I think some confusion has occurred.

i) Post 11303 diggiddi asked what card you have. I replied to him in post 11309. Minotaurtoo never asked about your card.

ii) Post 11306 has Ne01 OnnA's info on his experience with driver v17.7.1. In the next post Minotaurtoo states that HBM clocks in steps and requests some memory benchmarks. In post 11309 I suggest he may not wish to ask Ne01 OnnA, as he may get bewildered by the info. In post 11311 Minotaurtoo IMO made a mistake by not editing the quote of my post to exclude the reply to diggiddi about your card. Since Minotaurtoo never asked about your card, leaving that in the quote makes it look like his comments about Ne01 OnnA's posts are also directed at you, but IMO they are not.


----------



## Nassenoff

Quote:


> Originally Posted by *Bojamijams*
> 
> Every time I used CHILL, it always made it sluggish. Not to mention, makes Freesync act crazy due to how low it gets.
> 
> Never seen a point to CHILL myself, and I don't believe any AMD marketing since the BS they pulled with the failure called Vega


What is the freesync range of the monitor you are using?


----------



## Semel

I've sold my unlocked Fury Tri-X. For the time being I'm going to use my i7's built-in GPU. I'll wait for Vega 56 custom cards to be released, but that doesn't mean I'll blindly buy one of them.

Several factors will be crucial in this regard:

1) price
2) cooling
3) whether you can use a custom BIOS
4) availability

If I'm not satisfied with one or two items from this list, I'll just get a custom 1080, or wait a bit, save more money and buy a 1080 Ti.

AMD seriously fraked up this time.


----------



## gupsterg

A custom BIOS is very unlikely to happen. VEGA has hardware which checks signatures with UEFI.

I have only read one mention on the web of the digital signatures being cracked, and AFAIK AMD then responded with an update. That was also on a GPU with no hardware checking the signatures; they probably have it pretty locked up now.

Where there was a possibility of VEGA 56 unlocks prior to this info, IMO that no longer exists either.


----------



## Semel

I'm more interested in modifying voltages, limits, etc. than in unlocks.








Quote:


> VEGA has HW which is checking signatures with UEFI.


Yeah, I know. Maybe third-party manufacturers can switch it on/off.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> A custom BIOS is very unlikely to happen. VEGA has hardware which checks signatures with UEFI.
> 
> I have only read one mention on the web of the digital signatures being cracked, and AFAIK AMD then responded with an update. That was also on a GPU with no hardware checking the signatures; they probably have it pretty locked up now.
> 
> Where there was a possibility of VEGA 56 unlocks prior to this info, IMO that no longer exists either.


There is no signature checking in Linux, and they have had some success flashing the BIOS there. Also, driver registry hacks are being used to modify the power limits in Windows.


----------



## CptAsian

Quote:


> Originally Posted by *neurotix*
> 
> Guys, if you want to overclock Fury on 17.7.2, just use this Afterburner Beta. I tried it and it works.
> 
> 17.7.2 adds Radeon Chill for Fury, Chill works in Crossfire, and a host of other improvements. You should use it.
> 
> Hope this helps.


The link seems to be broken (redirects me). What version is that? 4.4.0? If so, that's the one I have, and it's not working for me. Any ideas?


----------



## Bojamijams

Quote:


> Originally Posted by *Nassenoff*
> 
> What is the freesync range of the monitor you are using?


42-100


----------



## Nassenoff

Quote:


> Originally Posted by *Bojamijams*
> 
> 42-100


Have you tried raising the lowest allowed Chill fps to a number higher than 42? Chill is set to a 40 fps floor by default, right?
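A minimal sketch of the interaction being asked about, assuming the numbers in this exchange (a 42-100 Hz FreeSync window and Chill's reported 40 fps default floor); the model and function names are mine, not AMD's:

```python
# Toy model of the Chill / FreeSync interaction (my own model, not AMD code):
# Chill lets the frame rate fall toward its floor, and any frame rate below
# the monitor's FreeSync window leaves the variable-refresh range unless
# Low Framerate Compensation (LFC) can multiply frames back into it.

def in_freesync_window(fps, lo=42, hi=100):
    """True when a delivered frame rate sits inside the variable-refresh window."""
    return lo <= fps <= hi

def supports_lfc(lo=42, hi=100):
    # LFC is generally reported to need the window max to be >= ~2x its min.
    return hi >= 2 * lo

chill_floor = 40  # reported default Chill minimum
print(in_freesync_window(chill_floor))  # False: 40 fps sits below the 42-100 window
print(in_freesync_window(45))           # True: a raised floor keeps frames inside
```

Raising the Chill floor above the window's lower bound, as suggested above, keeps every delivered frame inside the variable-refresh range.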


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> A custom BIOS is very unlikely to happen. VEGA has hardware which checks signatures with UEFI.
> 
> I have only read one mention on the web of the digital signatures being cracked, and AFAIK AMD then responded with an update. That was also on a GPU with no hardware checking the signatures; they probably have it pretty locked up now.
> 
> Where there was a possibility of VEGA 56 unlocks prior to this info, IMO that no longer exists either.


Buildzoid mentioned in his latest video that someone on OCN got it to boot on Linux and updated the BIOS while running; I wonder who did that.


----------



## Bojamijams

Quote:


> Originally Posted by *Nassenoff*
> 
> Have you tried raising the lowest allowed Chill fps to a number higher than 42? Chill is set to a 40 fps floor by default, right?


No I haven't, but I still don't see a point to Chill. I'd rather my video card didn't make giant FPS swings every couple of seconds. I've just never seen it work positively for me. Like I said, it was always sluggish, and I don't know when I ever stay in one spot for more than a few hundred milliseconds for the thing to even kick in.

Honestly, if Chill had any real merit to it, I would expect nVidia to have already implemented something like it, since their R&D budget is at least 10x what AMD/RTG's is.


----------



## Drake87

Quote:


> Originally Posted by *neurotix*
> 
> Guys, if you want to overclock Fury on 17.7.2, just use this Afterburner Beta. I tried it and it works.
> 
> 17.7.2 adds Radeon Chill for Fury, Chill works in Crossfire, and a host of other improvements. You should use it.
> 
> Hope this helps.


I'm using the Afterburner beta and I can't get my overclock to work. Everything is grayed out except fan control. When I enable unofficial overclocking I get the options back, but the card never goes above 300MHz core. Am I missing something?


----------



## LionS7

Quote:


> Originally Posted by *Drake87*
> 
> I'm using the Afterburner beta and I can't get my overclock to work. Everything is grayed out except fan control. When I enable unofficial overclocking I get the options back, but the card never goes above 300MHz core. Am I missing something?


The unofficial overclock did not work with 17.7.2. Everything else works fine with Afterburner 4.4.0 beta 15 for me.


----------



## Drake87

Quote:


> Originally Posted by *LionS7*
> 
> The unofficial overclock did not work with 17.7.2. Everything else works fine with Afterburner 4.4.0 beta 15 for me.


Beta 15? The newest I can find is 4.4.0 beta 12.

Edit: never mind, I found it.


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> There is no signature checking in Linux and they have somewhat been able to flash Bios. Also driver registry hacks are being used to modify the power Limits in Windows.


Read from here, chap.

The security feature means the VBIOS is checked at POST, so the user does not get to the OS at all, regardless.

Then this post clarifies what the member did for Linux.

The Linux route is very similar to the WinOS registry hack, in that when the WinOS registry holds a copy of PowerPlay it gets priority over the firmware copy. This is still limiting. For example, i2c comms with the voltage-controller chip are disabled on VEGA. Say on Fiji we were limited to a max of 1.3V as the VID for a DPM state in PowerPlay, with anything above making the driver bomb at OS load; we could still use an offset on the IR3567B, implemented via VBIOS, to go higher if we wanted. You won't have that route open on VEGA, and it is unknown at present what the max VID for a DPM is.

Even the registry loophole could be closed by AMD at any time IMO, by changing how the driver behaves.
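The WinOS registry trick referred to above is commonly reported as placing a copy of the PowerPlay table in a binary registry value (often named `PP_PhmSoftPowerPlayTable`) under the display adapter's class key, which the driver then prefers over the VBIOS copy. A minimal sketch of the byte-patching half of that workflow; the table layout and offset below are invented for illustration, since real offsets differ per VBIOS and must be read from a dump:

```python
import struct

# Hypothetical offset of a little-endian u16 power-limit field (percent).
# Real PowerPlay offsets vary per VBIOS -- this one is made up for the sketch.
PL_OFFSET = 0x26

def patch_power_limit(table, new_limit_pct):
    """Return a copy of a PowerPlay blob with the power-limit field replaced."""
    buf = bytearray(table)
    struct.pack_into("<H", buf, PL_OFFSET, new_limit_pct)
    return bytes(buf)

# A fake 64-byte table standing in for a VBIOS PowerPlay dump.
original = bytes(64)
modded = patch_power_limit(original, 150)
print(struct.unpack_from("<H", modded, PL_OFFSET)[0])  # 150
```

The patched bytes would then be written as the registry binary value (e.g. via regedit). As the post above notes, this only overrides what PowerPlay exposes; it cannot re-enable the disabled i2c voltage-controller path.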
Quote:


> Originally Posted by *PontiacGTX*
> 
> buildzoid mention in his latest video that someone on OCN got it to boot on Linux and update the BIOS on linux while running, I wonder who made that


@buildzoid is subbed to the linked OCN thread. I think he means wolf9466 as the person who did it, but it did not happen as stated in the video.


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> Read from here, chap.
> 
> The security feature means the VBIOS is checked at POST, so the user does not get to the OS at all, regardless.
> 
> Then this post clarifies what the member did for Linux.
> 
> The Linux route is very similar to the WinOS registry hack, in that when the WinOS registry holds a copy of PowerPlay it gets priority over the firmware copy. This is still limiting. For example, i2c comms with the voltage-controller chip are disabled on VEGA. Say on Fiji we were limited to a max of 1.3V as the VID for a DPM state in PowerPlay, with anything above making the driver bomb at OS load; we could still use an offset on the IR3567B, implemented via VBIOS, to go higher if we wanted. You won't have that route open on VEGA, and it is unknown at present what the max VID for a DPM is.
> 
> Even the registry loophole could be closed by AMD at any time IMO, by changing how the driver behaves.
> @Buildzoid is subbed to the linked OCN thread. I think he means wolf9466 as the person who did it, but it did not happen as stated in the video.


Yes, thanks, I just found that.


----------



## gupsterg

Sweet.


----------



## Alastair

Why would AMD do that, though? On one hand they claim to be all "pro enthusiast" and "our enthusiasts mean so much to us", and with the other hand they slap us with a backhand and prevent those very enthusiasts from modifying BIOSes.









I've used AMD cards since the 5770. Nothing from the green team interested me until Maxwell; then the 970 fiasco happened. Now I'm thinking that if Volta comes with decent async support, I'll jump over to them.


----------



## PontiacGTX

Quote:


> Originally Posted by *Alastair*
> 
> Why would AMD do that, though? On one hand they claim to be all "pro enthusiast" and "our enthusiasts mean so much to us", and with the other hand they slap us with a backhand and prevent those very enthusiasts from modifying BIOSes.
> 
> I've used AMD cards since the 5770. Nothing from the green team interested me until Maxwell; then the 970 fiasco happened. Now I'm thinking that if Volta comes with decent async support, I'll jump over to them.


People who would unlock a VEGA 56 into a VEGA 64, people who could OC a VEGA 56 to high clocks and beat a VEGA 64, people who could try the VEGA FE BIOS on a VEGA 64.

Why would someone spend $500+ on a VEGA FE when you could get similar compute performance with half the VRAM (even with a driver mod)? Or spend $100 more on the water-cooled VEGA 64 when you could raise the max allowed TDP and get the same performance with your own waterblock? Or spend $100 more on the air-cooled VEGA 64 when an overclocked VEGA 56 gets you the same performance? And then there are those who want to push more power than the card currently allows.


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> Why would AMD do that, though? On one hand they claim to be all "pro enthusiast" and "our enthusiasts mean so much to us", and with the other hand they slap us with a backhand and prevent those very enthusiasts from modifying BIOSes.
> 
> I've used AMD cards since the 5770. Nothing from the green team interested me until Maxwell; then the 970 fiasco happened. Now I'm thinking that if Volta comes with decent async support, I'll jump over to them.


I'm thinking part of the Windows Secure Boot requirements is to blame.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> Read from here, chap.
> 
> The security feature means the VBIOS is checked at POST, so the user does not get to the OS at all, regardless.
> 
> Then this post clarifies what the member did for Linux.
> 
> The Linux route is very similar to the WinOS registry hack, in that when the WinOS registry holds a copy of PowerPlay it gets priority over the firmware copy. This is still limiting. For example, i2c comms with the voltage-controller chip are disabled on VEGA. Say on Fiji we were limited to a max of 1.3V as the VID for a DPM state in PowerPlay, with anything above making the driver bomb at OS load; we could still use an offset on the IR3567B, implemented via VBIOS, to go higher if we wanted. You won't have that route open on VEGA, and it is unknown at present what the max VID for a DPM is.
> 
> @Buildzoid is subbed to the linked OCN thread. I think he means wolf9466 as the person who did it, but it did not happen as stated in the video.


Thanks for the links.









REP +1

I read about 4-5 pages of the Preliminary view of AMD VEGA Frontier Edition Bios thread. No wonder we haven't seen much of you; you've been busy.









Any chance of fooling the security by showing it the official BIOS while injecting/replacing the data tables it queries with ones from a modded BIOS, or of expanding PCI-E control through the CrossFire connection?


----------



## gupsterg

Quote:


> Originally Posted by *bluezone*
> 
> I'm thinking part of windows secure boot requirements is to blame.


I really don't think it is part of this TBH.

Let's ignore what VEGA has/does for a moment. Let's talk about Fiji and below.

'pure UEFI mode' is achieved by setting CSM to Off/Disabled in UEFI of motherboard.

CSM = Compatibility Support Module; if a GPU does not have a UEFI/GOP module in its VBIOS, it is not usable with CSM Off/Disabled.

When CSM is off with Secure Boot off and the VBIOS has a UEFI/GOP module, the motherboard UEFI is not concerned with validating the UEFI/GOP module. The UEFI/GOP module in the VBIOS won't validate itself either. The UEFI/GOP module must validate the Legacy section of the VBIOS, as it needs that info to make the card function. The UEFI/GOP module references the same digital signature that the drivers do at OS load. So here we have a feature to secure UEFI mode.

Next when CSM is off with Secure Boot *On* and VBIOS has UEFI/GOP module, the motherboard UEFI *will validate* UEFI/GOP module in VBIOS. The UEFI/GOP module in VBIOS has a digital signature for itself. So now the process is motherboard validates VBIOS UEFI/GOP module, VBIOS UEFI/GOP module validates Legacy section of VBIOS. So we have two features to secure UEFI mode.

IMO all is covered, from an MS POV, for a secure UEFI POST mode.

So as you can see we had pretty much 3 features to secure VBIOS.

i) The digital signature in the Legacy section of the VBIOS, which the OS driver references to know it is unmodified. AMD can decide to have this check on or off.

ii) When CSM: Off, SB: Off, UEFI/GOP module of VBIOS validates Legacy section.

iii) When CSM: Off, SB: On, mobo UEFI validates VBIOS UEFI/GOP module, VBIOS UEFI/GOP module validates Legacy section.

So on VEGA we have all of above, plus a security processor checking VBIOS to make sure it has not been modified.

You will recall that in the Fiji BIOS mod there was discussion of how a modified VBIOS black screens/does not POST when using CSM: Off. This was resolved by using a custom UEFI/GOP module in the VBIOS, modified so that the UEFI/GOP module does not validate the digital signature in the Legacy section. We just cannot use Secure Boot, as the signature in the UEFI/GOP module is not updated to reflect the changes within it. And as you can guess, when Secure Boot is on, the mobo UEFI must get validated as well, so it can't be modified to ignore the signature on the VBIOS's UEFI/GOP module.

So there were several layers of security already to make UEFI org/MS happy.

Now on VEGA due to this security processor we can't :-

i) modify the UEFI/GOP module to ignore the digital signature in the Legacy section, so we could have 'pure UEFI mode' with CSM: Off.

ii) modify Legacy section to have changes we want.

Also, AMD have disabled i2c comms on VEGA FE for sure via VBIOS; RX VEGA seems to be the same, so we can't do certain things we did on cards before.
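The layered checks described above can be summarised as a toy model (my own paraphrase of the post, not AMD code), showing which signature checks apply for a given configuration:

```python
# Toy model of the VBIOS validation chain: which signature checks apply for
# a given CSM / Secure Boot / GPU-generation configuration, per the post above.

def vbios_checks(csm_off, secure_boot_on, is_vega):
    # Driver-side check of the Legacy-section signature (AMD can toggle this).
    checks = ["OS driver validates Legacy-section signature"]
    if csm_off:
        checks.append("UEFI/GOP module validates Legacy section")
        if secure_boot_on:
            checks.append("motherboard UEFI validates UEFI/GOP module")
    if is_vega:
        checks.append("security processor validates whole VBIOS at POST")
    return checks

# Fiji, CSM on, Secure Boot off: only the driver-side check applies.
print(len(vbios_checks(False, False, False)))  # 1
# Vega, pure UEFI with Secure Boot on: every layer applies.
print(len(vbios_checks(True, True, True)))     # 4
```

The Fiji mod described in the same post corresponds to defeating the second check with a custom GOP module; the Vega security processor adds the fourth, which runs at POST and is why that route is closed there.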
Quote:


> Originally Posted by *bluezone*
> 
> Thanks for the links.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> REP +1
> 
> I read about 4-5 pages of the Preliminary view of AMD VEGA Frontier Edition Bios thread. No wonder we haven't seen much of you. you've been busy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any chance of fooling the security by showing it official Bios, while injecting/replacing data table bios queries to modded bios or expanding PCI-E control through the crossfire connection?


No idea on your suggestions.

The methods that have been successful so far are pretty much similar, IMO. The member who made mods on Linux did so by loading a modified copy of the BIOS when the OS loads. The WinOS registry trick is much the same: the driver gives priority to the copy of PowerPlay in the registry when it exists at OS load.


----------



## bluezone

Quote:


> Originally Posted by *gupsterg*
> 
> I really don't think it is part of this TBH.
> 
> Let's ignore what VEGA has/does for a moment. Let's talk about Fiji and below.
> 
> 'pure UEFI mode' is achieved by setting CSM to Off/Disabled in UEFI of motherboard.
> 
> CSM = Compatibility Support Module; if a GPU does not have a UEFI/GOP module in its VBIOS, it is not usable with CSM Off/Disabled.
> 
> When CSM is off with Secure Boot Off and VBIOS has UEFI/GOP module, the motherboard UEFI is not concerned with validating UEFI/GOP module. The UEFI/GOP module in VBIOS won't validate itself either. The UEFI/GOP module must validate Legacy section of VBIOS as it needs that info to make card function. The UEFI/GOP module references the same digital signature that drivers do at OS load . So here we have a feature to secure UEFI mode.
> 
> Next when CSM is off with Secure Boot *On* and VBIOS has UEFI/GOP module, the motherboard UEFI *will validate* UEFI/GOP module in VBIOS. The UEFI/GOP module in VBIOS has a digital signature for itself. So now the process is motherboard validates VBIOS UEFI/GOP module, VBIOS UEFI/GOP module validates Legacy section of VBIOS. So we have two features to secure UEFI mode.
> 
> IMO all is covered on a MS POV for secure UEFI post mode.
> 
> So as you can see we had pretty much 3 features to secure VBIOS.
> 
> i) Digital signature in Legacy Section of VBIOS, which OS driver refs to know it is unmodified. Which AMD can decide to have on or off.
> 
> ii) When CSM: Off, SB: Off, UEFI/GOP module of VBIOS validates Legacy section.
> 
> iii) When CSM: Off, SB: On, mobo UEFI validates VBIOS UEFI/GOP module, VBIOS UEFI/GOP module validates Legacy section.
> 
> So on VEGA we have all of above, plus a security processor checking VBIOS to make sure it has not been modified.
> 
> You will recall that in the Fiji BIOS mod there was discussion of how a modified VBIOS black screens/does not POST when using CSM: Off. This was resolved by using a custom UEFI/GOP module in the VBIOS, modified so that the UEFI/GOP module does not validate the digital signature in the Legacy section. We just cannot use Secure Boot, as the signature in the UEFI/GOP module is not updated to reflect the changes within it. And as you can guess, when Secure Boot is on, the mobo UEFI must get validated as well, so it can't be modified to ignore the signature on the VBIOS's UEFI/GOP module.
> 
> So there were several layers of security already to make UEFI org/MS happy.
> 
> Now on VEGA due to this security processor we can't :-
> 
> i) modify the UEFI/GOP module to ignore the digital signature in the Legacy section, so we could have 'pure UEFI mode' with CSM: Off.
> 
> ii) modify Legacy section to have changes we want.
> 
> Also, AMD have disabled i2c comms on VEGA FE for sure via VBIOS; RX VEGA seems to be the same, so we can't do certain things we did on cards before.


So it's heavy-handed, overreaching and of questionable purpose. You have given an excellent explanation, by the way. It leaves me with a slight taste of Apple's proprietary influence, and with them as a customer it makes me go hmmmm...








Quote:


> Originally Posted by *gupsterg*
> 
> Also, AMD have disabled i2c comms on VEGA FE for sure via VBIOS; RX VEGA seems to be the same, so we can't do certain things we did on cards before.
> No idea on your suggestions.
> 
> The methods that have been successful so far are pretty much similar, IMO. The member who made mods on Linux did so by loading a modified copy of the BIOS when the OS loads. The WinOS registry trick is much the same: the driver gives priority to the copy of PowerPlay in the registry when it exists at OS load.


Well, it looks like some success is being had. I finally finished reading the AMD VEGA Frontier Edition Bios thread.
The suggestions I made were ideas for dealing with the security system: namely convincing it you're supposed to be there, blinding it to your actions, or working under the nose of security and acting independently. I suggested the CrossFire connection because it would by nature have to be an active connection to work, and it's high speed, so there is likely little to no security or encryption on it, that being necessary for high-speed operation. The handshake to initiate the connection might be a problem, though.








Of course, if you can turn it off, as you mentioned, none of these would be needed.


----------



## gupsterg

I believe they have done what they have pretty much to protect their IP.

I had a Fury Tri-X and it unlocked to 3840SP. In pretty much everything I tried at the time, the difference between a Fury and a Fury X at the same clock speeds was within the error margin of the runs. As AMD has had several bouts of people unlocking things on cards, it pretty much meant someone who was not fazed by this would buy the cheaper product.

I only opted to keep the Fury X as:-

i) it cost me a price very close to that of the Fury Tri-X.

ii) I felt the resale value of the Fury X would be better than the Fury's when it came time to sell.

iii) I liked the short 'package' of the card and how the AIO exhausted hot air out of the case.

CF is done over the PCI-E bus, so any 'hacking/cracking' that way is going to need some real pros, IMO; it's beyond me. IMO the digital signature needs cracking, and that is difficult from what I have seen mentioned: it was cracked at one point and then changed. Without cracking the digital sig you can't have code in the VBIOS that would circumvent the security. There is only one mention of it I have ever seen in all the trawling I have done since 2015 while searching for info on the Hawaii BIOS.

It is sad IMO.

Why not make enthusiast-grade cards with a VBIOS that is changeable via a menu system? Motherboards have had this for so long; have they and the CPU makers suffered badly because of it?


----------



## Alastair

Rainbow Six Siege players: are you having issues with near-constant game crashes?


----------



## Nassenoff

Quote:


> Originally Posted by *Bojamijams*
> 
> No I haven't, but I still don't see a point to Chill. I'd rather my video card didn't make giant FPS swings every couple of seconds. I've just never seen it work positively for me. Like I said, it was always sluggish, and I don't know when I ever stay in one spot for more than a few hundred milliseconds for the thing to even kick in.
> 
> Honestly, if Chill had any real merit to it, I would expect nVidia to have already implemented something like it, since their R&D budget is at least 10x what AMD/RTG's is.


Well, the point is probably to get power consumption down, which AMD is constantly being criticized for. And they claim lower input lag.
Why would Nvidia spend any R&D on this? Their performance per watt is much better ATM, and Vega did not change that (it maybe made it worse). Maybe AMD saw the power draw of Vega during its development and realized something had to be done.


----------



## Nassenoff

R9 290 (290X BIOS) removed, Fury X installed.
Love the Aquacomputer waterblock design


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Alastair*
> 
> Rainbow Six Siege players are you having issues with near constant game crashes?


No, I can play normally, NP.

I'm on 17.7.1 WHQL.

Shortcut commands:

-useallavailablecores -malloc=system -force-driver-type-warp -sm6 -dx11mt -pthreads 12 -maxvram=4072

(-pthreads 12 rather than -pthreads 16 on my ZEN, because the game only utilises 6-12 cores well)

Run CMD with administrator rights and enter:

bcdedit /set useplatformclock true
bcdedit /set disabledynamictick yes

or

bcdedit /set useplatformclock true
bcdedit /set tscsyncpolicy Enhanced
bcdedit /set disabledynamictick yes

If you want to disable the enhanced TSC sync policy:

bcdedit /deletevalue tscsyncpolicy

Then log off (or reboot) for the changes to take effect.


----------



## bluezone

Another new driver. 17.8.1

Release notes:

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.8.1-Release-Notes.aspx

Anyone have any idea why the "Bethesda Launcher" is in the driver package? It's unchecked by default.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *bluezone*
> 
> Another new driver. 17.8.1
> 
> Release notes:
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.8.1-Release-Notes.aspx
> 
> Anyone have any idea why "Bethesda Launcher" is in the driver package? it's unchecked by default?


Another big f-up: OC is still broken on Fiji.
etc.

16.12.2 is not working OK for ME:Andromeda lolZ.
On 17.7.1 I have some font corruption in Opera & can't use full-screen videos in Twitter & Firefox!

UPD:
Now I'm testing my favorite: 17.1.2 WHQL (Feb 6).
So far it's OK; ME:A is working great, and FH3 also has good performance.

I'm thinking about installing 17.8.2 (Aug 18) and OCing via Wattman -> any insights?


----------



## gupsterg

Fiji members: as I have highlighted, I have given AMD driver feedback on the HBM OC issue with the v17.x.x drivers, and I have contacted AMD Matt via mentions here (@LtMatt).

Today on OCUK a discussion occurred regarding Fiji in a VEGA thread; read from this post. As AMD Matt posts there as well as on the AMD Community site, I have tried to engage him in further discussion via that thread.

I hope you guys will take the time to share your thoughts there.


----------



## prom

I've got a couple basic questions that I'm hoping some folks can shed some light on:

*Air Nano owners:*
What are your VRM temps like, and what kind of temperatures/fan speeds are you seeing?
When playing some games my VRM temps bounce around 95-100C, which causes the card to throttle.

DPM7 is currently undervolted to 1193mV, and PT is at +30%. I can loop Heaven for 15 min and stay around 80C on the core.
My current ambient is ~25C; fan speed is maxed at 3500rpm (I wear headphones). Do you think I need to upgrade the VRM pads?

As a followup, if you're using MSI AB, what does your fan profile look like?

*General Fury owners:*
Can I use WattTool? I'm just looking to try new software for fiddling, and all the recent Vega coverage has shown me this software which looks pretty neat.


----------



## Ne01 OnnA

So far so good








Drivers are Solid as f.unk







-> MSI AB v.16 works, and our guru's tool does too (it's better for 17.8.x):
OverdriveNTool.exe -> strongly recommended.

Tested so far:
ME:A = great
FH3 = great (IMO the best driver for Forza)
AC:S = great, 'somehow'
R6 Siege = great (now I can bump some shadows up to High, and no more cluster f.uk in Brazil)
Heroes VII = great as well

Overall, thanks to Unwinder (MSI AB v.16) I can set my fans, and with OverdriveNTool I can edit exactly the same values as I had in 17.7.1 with TriXX!








Everything is OK.

PS. As for HBM OC, I have a BIOS with 1050/550 at 1.212V/1.337V.


----------



## prom

Sounds good; OverdriveNTool doesn't break my VRM temp readings like WattTool did.
That said, it doesn't look like HBM overclocking is working.

Did 17.8.1 break it again?
_*Edit:* It's been broken for a while apparently, no wonder._


----------



## Ne01 OnnA

Updated my last post


----------



## PontiacGTX

Quote:


> Originally Posted by *gupsterg*
> 
> Fiji members I have highlighted I have given driver feedback on HBM OC issue with v17.x.x drivers to AMD and I have contacted AMD Matt via mentions here (@LtMatt).
> 
> Today on OCuk a discussion occurred regarding Fiji in a VEGA thread, read from this post. As AMD Matt posts there as well as AMD Community site the most I have tried to engage him in further discussion via this thread.
> 
> I hope you guys will take time to share you thoughts there.


Maybe the reason it only gets that much performance gain from HBM OC on the Fury (X) at 545MHz is that the timings were never tuned for going much beyond 500MHz?


----------



## gupsterg

The timing straps in the stock VBIOS are 100MHz, 400MHz, 500MHz and 600MHz.

When you go to, say, 501MHz you enter the 600MHz strap; on Fiji the 500MHz and 600MHz straps have identical timings, only the 400MHz and 100MHz straps differ.

Then let's say you go to 601MHz: what happens? The last strap's timings are used. This was the way on Hawaii and Grenada, and some VBIOSes of those cards lacked straps and timings for RAM speeds the ICs could achieve. Did they not OC well? Nope. Did they not gain performance when EDC errors were not occurring on the RAM ICs, etc.? Nope.

Trust me, straps and timings are not the issue.

It is purely that AMD have blocked the performance gain from an HBM clock increase.

I sat for ~30 min in the OS last night using the card at 700MHz HBM on a v17.x.x driver, whereas on any v16.x.x driver I would get artifacts at the desktop at 600MHz.
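To illustrate the strap selection described above, here's a minimal sketch; the strap list and the "clamp to the last strap" rule are taken from this post, while the function itself is just my illustration, not anything from the VBIOS:

```python
import bisect

# Timing straps present in a stock Fiji VBIOS, per the post above (MHz)
STRAPS = [100, 400, 500, 600]

def strap_for(mem_clock_mhz: int) -> int:
    """Return the strap whose timings apply at a given memory clock.

    A clock falls into the lowest strap that is >= the clock;
    anything beyond the last strap keeps using the last strap's timings.
    """
    i = bisect.bisect_left(STRAPS, mem_clock_mhz)
    return STRAPS[min(i, len(STRAPS) - 1)]
```

So a 545MHz OC lands on the 600MHz strap, and since Fiji's 500MHz and 600MHz straps carry identical timings there is no timing penalty, which supports the point that straps are not the bottleneck.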


----------



## prom

Can anyone with an aircooled Nano take a look at these numbers and tell me what you think?
Lately I've felt like the card is throttling a lot more in games, and the VRM temps are looking a bit high.

15 minutes of Heaven is no problem, and there's no throttling whatsoever.
15 minutes of Valley is a little more game-like and results in the VRMs hitting 98-99 degrees, at which point the card throttles back.



As you can see my core temps are totally fine, so I'm wondering: *Should I replace the stock thermal pads?*









If you'd REALLY like to help out, maybe someone with a Nano (still on air) would be interested in mimicking my test?
HWiNFO64 running, cleared at the start of the 15-minute cycle, and screenshotted when the 15 min have elapsed.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Fiji members I have highlighted I have given driver feedback on HBM OC issue with v17.x.x drivers to AMD and I have contacted AMD Matt via mentions here (@LtMatt).
> 
> Today on OCuk a discussion occurred regarding Fiji in a VEGA thread, read from this post. As AMD Matt posts there as well as AMD Community site the most I have tried to engage him in further discussion via this thread.
> 
> I hope you guys will take time to share you thoughts there.


After reading that, you know what, I am done. I was so chuffed with these cards. I loved running them at 545; they ran like a bomb. I recommended team red cards to friends for async compute and FreeSync etc., but after this I think I am done. No more. I'll buy a GTX as my next card. I've been purchasing their cards since the HD5XXX series. But no more.


----------



## PontiacGTX

Quote:


> Originally Posted by *Alastair*
> 
> After reading that, you know what. I am done. I was so chuffed with these cards. I loved running them at 545 they ran like a bomb, I recommended team red cards to friends for ASYNC compute and fressync etc etc. but after this I think I am done. No more. Ill buy a GTX as my next card. Been purchasing cards since HD5XXX. But no more.


I don't know if AMD would want to take the risk: either HBM failures due to overclocking, or Hynix showing that HBM1 might be just as good as HBM2 when OC'd.


----------



## Alastair

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> After reading that, you know what. I am done. I was so chuffed with these cards. I loved running them at 545 they ran like a bomb, I recommended team red cards to friends for ASYNC compute and fressync etc etc. but after this I think I am done. No more. Ill buy a GTX as my next card. Been purchasing cards since HD5XXX. But no more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I dont know if AMD would like to take the risk either HBM failures due to overclock or Hynix showing that HBM1 might be just as good as HBM2 with OC
Click to expand...

I don't give two hoots what the hell AMD is trying to show. Nerfing my damned Furys is not the way to do it. OC has ALWAYS been at the user's risk, and removing any OC functionality from Fiji, of which there is precious little to begin with, goes against their enthusiasts. May I remind all present: "We love gaming. We love all you enthusiasts and overclockers that keep pushing our hardware to the limits" - Lisa Su.

This goes against all of that. Which actually triggers me to the next level.


----------



## xkm1948

Quote:


> Originally Posted by *Alastair*
> 
> I don't give two hoots what the hell AMD is trying to show. Nerfing my damned Fury's is not the way to do it. OC has ALWAYS been at the risk of the user, and removing any OC functionality to Fiji, of which there is already precious little to begin with, goes against their enthusiasts. Which may I remind all present, "We love gaming. We love all you enthusiasts and overclockers that keep pushing our hardware to the limits" - Lisa Su.
> 
> This goes against all of that. Which actually triggers me to the next level.


The one you should be angry at is Raja, not Lisa. AMD has basically spun off RTG, and Fiji happened right before the spin-off. So go figure why Raja never gave Fiji any love.


----------



## bluezone

Quote:


> Originally Posted by *prom*
> 
> Can anyone with an aircooled Nano take a look at these numbers and tell me what you think?
> Lately I've felt like the card is throttling a lot more in games, and the VRM temps are looking a bit high.
> 
> 15 Minutes of Heaven is no problem, and there's no throttling whatsoever.
> 15 Minutes of Valley is a little more game-like and will result in the VRMs hitting 98-99 degrees and the card throttles back.
> 
> 
> 
> As you can see my core temps are totally fine, so I'm wondering: *Should replace the stock thermal pads?*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you'd REALLY like to help out, maybe someone with a Nano (that's still on air) would be interested in mimicking my test?
> HWiNFO64 running, cleared at the start of the 15minute cycle, and screen shotted when the 15min have elapsed


I have a Nano (still on air cooling), but I've done a lot of cooling mods plus BIOS modifications, so comparison runs wouldn't be useful. Here is a 15 min loop of Valley for reference anyhow.



Spoiler: Warning: Spoiler!







The easiest thing to recommend is to place a fan directly on the back of the Nano, closest to the PCI-E power plug. This will help with VRM and GPU temps.
Also, if you're feeling ambitious, fabricate some ductwork to draw outside air directly into the Nano cooling system. Even a case with excellent airflow can benefit from this. I suspect it might even help those with water loop installations and higher-than-wanted temps.

If none of this helps, then consider a TIM replacement, and adding TIM to both sides of the VRM pads. These steps do two things: 1) lower GPU temps, so that cooler air flows out of the cooler and across the VRM heatsink; 2) increase the heat flow out of the VRMs by providing a better heat path via TIM to pad to TIM to heat pipe (better contact).

Cheers


----------



## prom

*@bluezone*
This is the setup (side panel w/ dual fans not shown)


At first I thought perhaps the fin direction of the cooler was at fault, but I doubt it since my GPU temperatures weren't always this poor.
As you can see, there isn't really room to put a fan on top of the card.

On the bottom is a 140mm Noctua P14, and temperatures remain within a couple degrees no matter if the fan is removed, off, or cranked to maximum.

My CPU temperatures don't get excessive (under 60 during extended gaming sessions), and 100% GPU load doesn't necessarily result in high VRM temps (ie: Heaven vs Valley benchmarks).

So I'm leaning towards replacing the TIM & heatpads with some higher quality bits.
The card was RMAd when I first got it (due to artifacting), so I imagine it's possible that whoever fixed the card did a pretty poor job.

0.5mm Fujipoly pads and some GC-Extreme TIM should do the trick, yea?


----------



## bluezone

Quote:


> Originally Posted by *prom*
> 
> *@bluezone*
> This is the setup (side panel w/ dual fans not shown)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> At first I thought perhaps the fin direction of the cooler was at fault, but I doubt it since my GPU temperatures weren't always this poor.
> As you can see, there isn't really room to put a fan on top of the card.
> 
> 
> 
> On the bottom is a 140mm Noctua P14, and temperatures remain within a couple degrees no matter if the fan is removed, off, or cranked to maximum.
> 
> My CPU temperatures don't get excessive (under 60 during extended gaming sessions), and 100% GPU load doesn't necessarily result in high VRM temps (ie: Heaven vs Valley benchmarks).
> 
> So I'm leaning towards replacing the TIM & heatpads with some higher quality bits.
> The card was RMAd when I first got it (due to artifacting), so I imagine it's possible that whoever fixed the card did a pretty poor job.
> 
> 0.5mm Fujipoly pads and some GC-Extreme TIM should do the trick, yea?


WOW!! That's beyond excellent cooling air coming into the GPU.








On the other hand, I suspect warm CPU cooler air might be spilling onto the back of your Nano, but yikes, definitely no room for a fan on the back of the card. You might try a plastic or cardboard divider between the Nano and the CPU cooler, though I suspect it won't help. The hottest point on the back of the card is under the white sticker. Any chance you have thermal pads and a small, thin heatsink you could stick over that spot? It needs to be affixed so it doesn't move. Careful, there are live connections on the back, but I suspect you know that already.

So yes, GC-Extreme, or my next favorite Deepcool Z5 or Z9 (easier to spread). Fujipoly pads with a touch of TIM are helpful; I found that adding the TIM is worth 1-2 deg C cooler temps. I didn't believe it would work until I tried it myself.

PS. A quick thought occurred to me as I was finishing. I am going to assume all air vents out the small fan at the back (plus the GPU). You might have a touch too much positive air pressure, unless the side panel w/ dual fans is a vent.







Follow me on this: while GPU air can vent out the back panel, airflow out the other end (and over the VRM cooler as well) may be restricted by high case pressure and turbulence. How are temperatures with the side cover off?

EDIT: Is the case's bottom fan hub motor blocking the air intake to the Nano?


----------



## prom

All fans are set to intake, except for the rear 92mm fan.
All the panels are drilled for ventilation (NCase M1).
This system is dust free inside, so a clogged heatsink is also pretty well out of the question.

I could flip one of the side fans to exhaust, but I don't think it'll matter.
Having the side panel off doesn't noticeably change the card's performance.

I don't think the CPU heatsink is having much of an effect either since VRM temps vary depending on the game/benchmark.

*For example:*

While playing PUBG, some rounds the card runs hotter and will throttle, and other times the card does not. Rounds are ~15-30min long.
Heaven bench will not throttle, despite running the extreme preset with 8x AA.
Valley bench will start to throttle after 8-9ish minutes into a 15min cycle.
3dMark won't VRM throttle when benching, but will throttle during the stress test.
Also, how is anyone running these Nanos WITHOUT +50% power and _not_ power throttling?
I'm running 1170mV (-61mV from stock) with +35% power, and it will still power throttle on occasion.

At any rate, where the heck does someone buy Fujipoly pads nowadays?
And what thickness should I be getting for a Nano? I'm only _guessing_ it's 0.5mm.


----------



## bluezone

Quote:


> Originally Posted by *prom*
> 
> All fans are set to intake, except for the rear 92mm fan.
> All the panels are drilled for ventilation (NCase M1).
> This system is dust free inside, so a clogged heatsink is also pretty well out of the question.
> 
> I could flip one of the side fans to exhaust, but I don't think it'll matter.
> Having the side panel off doesn't noticeably change the cards performance.
> 
> I don't think the CPU heatsink is having much of an effect either since VRM temps vary depending on the game/benchmark.
> 
> *For example:*
> 
> While playing PUBG, some rounds the card runs hotter and will throttle, and other times the card does not. Rounds are ~15-30min long.
> Heaven bench will not throttle, despite running the extreme preset with 8x AA.
> Valley bench will start to throttle after 8-9ish minutes into a 15min cycle.
> 3dMark won't VRM throttle when benching, but will throttle during the stress test.
> Also, how is anyone running these Nanos WITHOUT +50% power and _not_ power throttling?
> I'm running 1170mv (-61mv from stock) with 35% power, and it will still power throttle on occasion.
> 
> At any rate, where the heck does someone buy Fujipoly pads nowadays?
> And what thickness should I be getting for a Nano? I'm only _guessing_ it's 0.5mm.


Now that I have seen a picture of the full case, too-high positive air pressure isn't likely.

I'm running BIOS mods equivalent to +50%, plus other changes.

If that bottom fan hub is blocking any air intake into the Nano at all, any positive air pressure or velocity will be negated by the restriction (it looks close from what I can make out). Can you delete the bottom fan (under the GPU), or mount a thinner or smaller-diameter fan that won't block the GPU air intake?

I have never ordered any Fujipoly pads; they're not easy to come by where I am. But if I recall earlier discussions, it was 0.5mm thickness.

Excellent undervolting, by the way.


----------



## prom

The bottom fan doesn't seem to make much of a performance difference.
Stopped or maxed out, it doesn't _noticeably_ change the card's temps.

Other NCase owners have noted that bottom fans benefit the small case, unless the fan pulls less air than the GPU demands.
It's a 140mm fan; it's not moving less air than the GPU fan.









I'll experiment tomorrow in removing the 140mm fan and flipping one of the side fans to exhaust.

It just doesn't make sense to me that the VRMs can be up to 98C while the core sits around 78C at the exact same time.


----------



## Rootax

Quote:


> Originally Posted by *prom*
> 
> The bottom fan doesn't seem to make much of a performance difference.
> Stopped, or maxed out it doesn't _noticeably_ change the cards temps.
> 
> Other NCase owners have noted that bottom fans are beneficial to the small case, unless the fan pulls less air than the GPU demands.
> It's a 140mm fan, it's not moving less air than the GPU fan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll experiment tomorrow in removing the 140mm fan and flipping one of the side fans to exhaust.
> 
> It just doesn't make sense to me that the VRMs can be up to 98c whereas the core will be sitting around 78 at the exact same time


Hi.

I believe it's my first post here; I lurk a lot... Anyway, I'm not surprised at your VRMs being hotter than the core. My Fury X is under water (custom loop, XSPC waterblock), and the VRMs are always 7-8C hotter than my core. Right now, for example, with my Fury X mining for days, my core is at 43-44C (it's hot in France today, which doesn't help), and my VRMs are around 51C.


----------



## miklkit

I have a Sapphire Fury and it is running fine at stock clocks, but it has one very irritating "feature": built-in high-definition audio. I am constantly disabling it and deleting its drivers, but Win10 keeps reinstalling it and disabling my Creative X-Fi sound card. The last time it did this, it had the sound coming through the monitor speakers!

Is there any way to make 100% sure that the on board sound is disabled and stays disabled?


----------



## PontiacGTX

I wonder if anyone has tried using liquid thermal compound on the Fiji GPU, and if no one has, are there precautions that should be taken?


----------



## Alastair

Quote:


> Originally Posted by *PontiacGTX*
> 
> I wonder if someone has tried to use liquid thermal compound on the Fiji GPU and if no one has tried there is some precaution should be taken?


What do you mean liquid compound? Liquid metal?


----------



## PontiacGTX

Quote:


> Originally Posted by *Alastair*
> 
> What do you mean liquid compound? Liquid metal?


yes


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> yes


Yes, and it damaged the die upon removal. Someone used IC Diamond as well and scratched the die.

Anyone else read this?

https://www.hardocp.com/article/2017/08/25/amd_radeon_rx_vega_56_versus_r9_fury

Edited: got interrupted while typing.


----------



## Gdourado

With the cheapest RX Vega 56 costing 450 euros, being out of stock and being a reference air cooled model, is a Fury X for 290 euros a good buy?
Looking at getting a 144hz 1440p freesync display.

Cheers!


----------



## 99belle99

Quote:


> Originally Posted by *Gdourado*
> 
> With the cheapest RX Vega 56 costing 450 euros, being out of stock and being a reference air cooled model, is a Fury X for 290 euros a good buy?
> Looking at getting a 144hz 1440p freesync display.
> 
> Cheers!


That is an excellent price for a Fury X. You will hit 144Hz in some games, but in the likes of GTA V you will hit 70 FPS.


----------



## LeadbyFaith21

Quote:


> Originally Posted by *Gdourado*
> 
> With the cheapest RX Vega 56 costing 450 euros, being out of stock and being a reference air cooled model, is a Fury X for 290 euros a good buy?
> Looking at getting a 144hz 1440p freesync display.
> 
> Cheers!


I would consider that a good buy. You probably won't hit 144 Hz in modern AAA games at max settings, but you'll certainly be above 60, and at that price (considering current GPU pricing as a whole) I don't think you'll find a better GPU for 144Hz 1440p.


----------



## Gdourado

How does the 4GB of memory handle 1440p?
Also, when getting a used card with some hours on it already, is there a risk of the cooling AIO having had some liquid evaporate and no longer being effective?


----------



## LeadbyFaith21

Quote:


> Originally Posted by *Gdourado*
> 
> how does the 4gb of memory handle 1440p?
> Also, getting a used card with already some usage time, is there the risk of the cooling AIO having some liquid evaporate and not be effective?


I can't speak to the AIO failing; I've got mine on waterblocks, so hopefully someone else can answer that part. As far as VRAM usage at 1440p (and even 4K), I haven't run into any issues except with Titanfall 2 on insane textures (which requires 6GB), but it will depend on what games you plan on playing.

If you want, I can test-run some games and give you the recorded VRAM usage, if that would be helpful.


----------



## 99belle99

I have never run into any problems with memory usage in the games I play at 1440p.

I'd say you should be alright regarding the AIO, as I have had no issues nor read of any. I did read about coil whine though.


----------



## LionS7

Quote:


> Originally Posted by *99belle99*
> 
> I have never ran into any problems with memory usage on the games I play at 1440p.
> 
> I'd say you should be alright regarding the aio as I have had no issues or read of any issues. I did read about coil wine though.


Well, you are not playing on max then. Try Rise of the Tomb Raider with Very High textures or Resident Evil 7 on max settings. Or maybe Mirror's Edge: Catalyst on EPIC.


----------



## 99belle99

Quote:


> Originally Posted by *LionS7*
> 
> Well, you are not playing on max then. Try Rise of the Tomb Raider Very high tex or Resident Evil 7 on max settings. Or maybe Mirror's Edge: Catalyst on EPIC.


I never played any of them. I did say I never came across any issues in the games I play.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> I would consider that a good buy, you probably won't hit 144 Hz on modern AAA games at max settings, but you'll be certainly be above 60, and at that price (and considering current GPU pricing as a whole), I don't think you'll find a better GPU for 144Hz 1440p


The problem with "some ppl" today is that they need Ultra quality even when there's no visible difference between Ultra and High.

They are -Modern Benchmarks People- or -nV Children-.

All in all -> you have settings in games = use them.

Fury is the best 1440p GPU (besides Vega) on the market.
When I get HBCC working for Fury (RadeonMOD is my invention) we will have a hell of a GPU.


----------



## LeadbyFaith21

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> The problem of "some ppl" today is that they need Ultra Qualtiy even if is not visible Visual difference between Ultra Vs High
> 
> 
> 
> 
> 
> 
> 
> 
> They are -Modern Benchmarks People- or -nV Childrens-
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All in all -> you have Settings in games = Use it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fury is the best 1440p GPU (besides Vega) on the market.
> When i get HBCC working for Fury (RadeonMOD is my Invention) we will have hell of a GPU


Very nice! I'll have to give your software a try!


----------



## Sleazybigfoot

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> The problem of "some ppl" today is that they need Ultra Qualtiy even if is not visible Visual difference between Ultra Vs High
> 
> 
> 
> 
> 
> 
> 
> 
> They are -Modern Benchmarks People- or -nV Childrens-
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All in all -> you have Settings in games = Use it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fury is the best 1440p GPU (besides Vega) on the market.
> When i get HBCC working for Fury (RadeonMOD is my Invention) we will have hell of a GPU


I've got the same GPU as you; do you have any recommended changes you made in your RadeonMOD program?

(Also, what's HBCC?)


----------



## Ne01 OnnA

Quote:


> Originally Posted by *Sleazybigfoot*
> 
> I've got the same GPU as you do, do you have any recommended changes you made in your RadeonMOD program?
> 
> (Also, what's HBCC?)


HBCC is HBM cache, or the High Bandwidth Cache Controller.

If the HBM cache is software and the drivers allow it on Fury -> we will have it.
But maybe it's a hardware feature, in which case we will not...

===


----------



## Alastair

Quote:


> Originally Posted by *LionS7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *99belle99*
> 
> I have never ran into any problems with memory usage on the games I play at 1440p.
> 
> I'd say you should be alright regarding the aio as I have had no issues or read of any issues. I did read about coil wine though.
> 
> 
> 
> Well, you are not playing on max then. Try Rise of the Tomb Raider Very high tex or Resident Evil 7 on max settings. Or maybe Mirror's Edge: Catalyst on EPIC.
Click to expand...

I've played ROTR and it seems to perform alright. It munches away 24GB worth of pagefile though.


----------



## Alastair

High Bandwidth Cache is a hardware feature of Vega as far as I am aware, so Fury owners will not get it.


----------



## Rootax

XCOM 2 with the highest texture setting eats more than 4GB too. But honestly, in most games the quality drop between max and the next setting down is not THAT huge, so it's OK to drop a few settings here and there. It's still a high-end card IMO. Plus the Fury X is still a beast compute-wise. And if you have a FreeSync screen, it's even better.


----------



## Minotaurtoo

Until Vega Liquid, or whatever comes next with a liquid AIO, is cheaper, I have no intention of dropping my Fury X... Even overclocked and on stock fan settings it will run Folding@home with average temps near 50C; with my custom fan profile it never tops 47C...


----------



## PontiacGTX

Quote:


> Originally Posted by *99belle99*
> 
> I have never ran into any problems with memory usage on the games I play at 1440p.
> 
> I'd say you should be alright regarding the aio as I have had no issues or read of any issues. I did read about coil wine though.


You can't benchmark Doom on Nightmare settings at 4K.
Quote:


> Originally Posted by *Ne01 OnnA*
> 
> The problem of "some ppl" today is that they need Ultra Qualtiy even if is not visible Visual difference between Ultra Vs High
> 
> 
> 
> 
> 
> 
> 
> 
> They are -Modern Benchmarks People- or -nV Childrens-
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All in all -> you have Settings in games = Use it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fury is the best 1440p GPU (besides Vega) on the market.
> When i get HBCC working for Fury (RadeonMOD is my Invention) we will have hell of a GPU


It can't be done; HBCC is hardware-based.


----------



## AliNT77

Is there any way to increase MVDDC above 1.3V on my Nano without hex editing? I can't even get 545 stable on my card.


----------



## gupsterg

You can use an i2c command, but if other tools are accessing the bus at the same time it can cause issues.

If you are using a driver newer than v16.12.2 WHQL, then an HBM clock increase gives no performance gain. AMD have given us FineWine, corked edition.


----------



## xkm1948

Quote:


> Originally Posted by *gupsterg*
> 
> You can use i2c command, but if you have other tools accessing bus at the time can cause issue.
> 
> If you ate using a driver newer than v16.12.2 WHQL then HBM clock increase gives no performance gain. AMD have given us FineWine cork'd edition
> 
> 
> 
> 
> 
> 
> 
> .


Someone mentioned you moved on to the green side? 1080 or 1080 Ti? How big of a difference compared to the Fury X?


----------



## AliNT77

Quote:


> Originally Posted by *gupsterg*
> 
> You can use i2c command, but if you have other tools accessing bus at the time can cause issue.
> 
> If you ate using a driver newer than v16.12.2 WHQL then HBM clock increase gives no performance gain. AMD have given us FineWine cork'd edition
> 
> 
> 
> 
> 
> 
> 
> .


Yes, I know about the driver thing... I tried HBM OC on 16.12.2 and found that 545 is not stable, but I think with a bit of extra juice it's achievable...

I searched for the i2c command and all I could find was that video you uploaded to YT teasing the memory voltage offset coming to Afterburner.


----------



## gupsterg

Your Fiji should be on bus 6, device 30.

You can check by doing an i2c dump.

Example of how the "Target" box in your shortcut's properties should look:-

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -i2cdump

*Note:* the space between the closing " and the -

Next the command:-

/wi6,30,8e,01

Your shortcut's target box will then look like:-

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8e,01

*Note:* the space between the closing " and the /

In the command "we" have done, the 01 is the data value denoting the MVDDC offset step count; one step is 6.25mV, and the value is stored as hexadecimal.

I have used 1.325V on my Fury X since ~March '16 to run 545MHz, so I would use 04. There is a little anomaly on this register, so you may need 05 for +25mV; it was touched on in the Fiji BIOS thread when I discussed it with The Stilt.

A restart will hold the value you set; a shutdown and re-POST will not.

To remove VDDC offset the command is:-

/wi6,30,8d,00
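Since the data byte in these commands is just the offset step count written as hex (6.25mV per step, as described above), here's a tiny sketch of the conversion; the helper name is mine, not part of Afterburner, and it doesn't account for the 04/05 register anomaly mentioned:

```python
def mvddc_offset_byte(offset_mv: float, step_mv: float = 6.25) -> str:
    """Convert a desired MVDDC offset (in mV) into the hex data byte.

    One step is 6.25mV; the byte is the step count in hexadecimal.
    """
    steps = round(offset_mv / step_mv)
    if not 0 <= steps <= 0xFF:
        raise ValueError("offset does not fit in a single byte")
    return f"{steps:02x}"
```

So +25mV computes to `04`, matching the example in the post (though, per the anomaly, `05` may be needed in practice).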
Quote:


> Originally Posted by *xkm1948*
> 
> Someone mentioned you moved on to the green side? 1080 or 1080Ti? How big of a different comparing to FuryX


Mine gets delivered today. Bit sad dumping AMD, as I lose the FreeSync function the MG279Q has; not swapping the monitor yet, G-Sync is too pricey.

Should be a 23-33% performance gain. I won't be able to run it yet, as I'm waiting on parts to finish the custom loop. It's an MSI GTX 1080 Sea Hawk EK X; going by reviews on TPU it's ~10% faster than an FE, as it's factory OC'd. Should reach ~2GHz+, so another ~5% IMO.


----------



## AliNT77

Quote:


> Originally Posted by *gupsterg*
> 
> Your Fiji should be on bus 6 device 30.
> 
> You can check by doing a i2c dump.
> 
> Example of how your shortcut property for "Target" box should be:-
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -i2cdump
> 
> *Note:* the space between " -
> 
> Next the command:-
> 
> /wi6,30,8e,01
> 
> Your shortcut target box properties will look like:-
> 
> "C:\Program Files\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8e,01
> 
> Note:- The space between " /
> 
> In the command that "we" have done the 01 is the data value denoting what the MVDDC offset step is, a step is 6.25mV, this is stored as a hexadecimal.
> 
> I have used 1.325V on my Fury X since ~March 16 to run 545MHz. So I would use 04, there is a little anomaly on this register so you may need 05 for +25mV, it was touched on in the Fiji Bios thread when I discussed it with The Stilt.
> 
> A restart will hold value you set a shutdown and repost does not.
> 
> To remove VDDC offset the command is:-
> 
> /wi6,30,8d,00


You are a legend!

That's exactly what I was looking for.


----------



## gupsterg

No problem, enjoy.

Just make sure you run the command when nothing is actively monitoring the system: no OS OC software loaded in the background, and no monitoring tools.


----------



## AliNT77

@gupsterg
Okay, so I tried it at 1.337V and 545 is still not stable... I think I need better cooling on my Nano...

Is it possible to undervolt the HBM? I gave up on OCing, so I'm going to try undervolting for lower temps...


----------



## AliNT77

Nevermind... adding a minus (-) did the trick.


----------



## AliNT77

1.2V seems 100% stable.
Core temp is 1-2C lower.

1.1V results in an instant crash, so it's definitely working...


----------



## monza1412

Quote:


> Originally Posted by *bluezone*
> 
> Yes it damaged the die upon removal. Someone used IC Diamond as well and scratched the die.
> 
> Anyone else read this?
> 
> https://www.hardocp.com/article/2017/08/25/amd_radeon_rx_vega_56_versus_r9_fury
> 
> Edited: got interrupted while typing.


Interesting read; it left me wondering about the performance difference between Vega and Fiji clock for clock...



https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/7/#diagramm-aots-escalation-rx-vega-56-vs-vega-64-vs-fury-x


----------



## Ne01 OnnA

Quote:


> Originally Posted by *monza1412*
> 
> interesting read, let me wondering about the performance difference between vega and fiji clock for clock..
> 
> 
> 
> https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/7/#diagramm-aots-escalation-rx-vega-56-vs-vega-64-vs-fury-x


Doesn't matter right now (16.8.2017 10:08, date of the tests).

Now Vega owners have the 17.8.2 driver, and this one is the best to date for gaming.

All in all those tests are obsolete right now, and in a month or two we will have a situation where Vega 56 is a 1080 rival, the way Vega 64 is now.

I will pick a Vega AIB from XFX, Sapphire, or PowerColor, with WC or triple fan; it will be big Vega time then with the new drivers (November/December).


----------



## monza1412

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> Doesn't matter right now (16.8.2017 10:08 Date of the tests)
> 
> Now Vega owners have 17.8.2 driver and this one is best to date for Gaming [snip]

I don't know; they clearly stated that they used two drivers, one being an alpha of 17.8.2, so I cannot agree that the test is obsolete. Maybe in the coming months they will update the results with more mature drivers.


----------



## Ne01 OnnA

Quote:


> Originally Posted by *monza1412*
> 
> I don't know, they clearly stated that they used 2 drivers, one being an alpha of the 17.8.2, so I cannot agree that the test is obsolete. Maybe in the next months they update the results with more mature drivers.


Vega is in a beta state now; I will wait for the October/November tests.
Also I'm looking into AIBs with 16GB HBM2 (4x stack)! Yes, they will exist at some point.


----------



## dagget3450

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> VEGA is in Beta state now, i will wait for October/November Tests
> Also im looking into AIBs with 16GB HBM2 4xStack ! -> Yes they will be at some point.


Sadly true about vega...


----------



## bluezone

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> VEGA is in Beta state now, i will wait for October/November Tests
> Also im looking into AIBs with 16GB HBM2 4xStack ! -> Yes they will be at some point.


Until games start to take advantage of the Vega architecture, there will likely not be much difference between Fiji and Vega, other than clock-speed-related differentiation. Even RTG/Koduri has alluded to this. Primitive culling and rasterization improvements were expected to be incremental. Vega is not even Koduri's architectural layout; it is an evolution of GCN. Navi, if I am not mistaken, will be entirely his baby.

Vega FE is 16GB of HBM2 in 2 stacks. There is not enough room on the standardized GPU/interposer package for more than 2 stacks; iirc the interposer is almost as big as it can be made at the moment. The Vega GPU is a pretty big die due to the added transistors for higher clocks/on-die memory/pipeline improvements.


----------



## LeadbyFaith21

Hey guys, I just got rid of one of my Furys and have an EK block (nickel and plexi) and a black backplate from it. Is there anyone on here who would be interested in it? I just ask that you pay whatever it costs me to ship it to you.


----------



## bluezone

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> Hey guys, I just got rid of one of my Furys and have an EK block (Nickel and plexi) and black back plate from it, is there anyone on here who would be interested in it? I just ask that you pay whatever it costs me to ship it to you


I'd take it in a minute if I didn't have a Nano instead of a Fury.


----------



## gupsterg

I am in the throes of indecision.

Fiji is a cracking card IMO; I have enjoyed it thoroughly with the AIO unit. It has played the games I use well, and it rocks [email protected] as well. Having gone 1440P FreeSync shortly after purchasing the Fury X, I also enjoyed a jump in res and smoothness vs Hawaii; even though that had FS, I didn't have the monitor then.

I seem to have purchased an MSI GTX 1080 Sea Hawk EK X to replace it.

The main reason to buy it was that it has a water block from the factory, so the warranty is defo intact. It came with a missing screw on the water block, which is made by EK. As the etailer had no stock left of the MSI GTX Sea Hawk EK X (I understand it is a discontinued card), they offered me a discount and requested I speak to MSI/EKWB for the missing screw. Via a support ticket, EK have stated they will send the missing screw FOC.

I still seem to want to keep the Fury X.

The PCB is so small that I love that aspect; even now, a year+ into owning it, I gawp at the size of it. The MSI GTX 1080 is much wider and longer, and even though the PCB is sparsely populated I can't understand why it is so big.

Sapphire lists the Fury X as 195mm (L) x 110mm (W) x 40mm (H).
The RX Vega 64 ref PCB is 272mm (L) x 112mm (W) x 40mm (H).
The MSI GTX 1080 Sea Hawk EK X is 278mm (L) x 165mm (W) x 20mm (H).

The measurements above are from the manufacturers' sites. The width includes the PCI-E fingers on the PCB, ~12mm. In the case of the stated GTX 1080 model, the width includes ~25mm for the WC ports on the side of the block. I'm a bit disappointed that they made the Vega PCB as long as they did; I may want a Vega Nano.

The only water block I can find for the Fury X is a Bykski one; brand-wise it's a non-issue IMO, but investing £80 into Fiji this late in ownership seems too much.


----------



## Minotaurtoo

Quote:


> Originally Posted by *gupsterg*
> 
> I am in the throes of indecision. [snip]

My son says he wants to give one of your Fury Xs a home lol... he doesn't have much money though, and neither do I lol.


----------



## PontiacGTX

any idea how to improve this score?
https://www.3dmark.com/compare/fs/13373277/fs/13555318#


----------



## bluezone

794 MHz vs 1,050 MHz? Why is the Aug 16 score @ 794 MHz?


----------



## PontiacGTX

Quote:


> Originally Posted by *bluezone*
> 
> 794 MHz vs 1,050 MHz? Why is the Aug 16 score @ 794 MHz?


Not sure; it probably was the driver.


----------



## bluezone

Speaking of drivers. 17.9.1.

http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.9.1-Release-Notes.aspx

Apparently they look down on overclocking.

OVERCLOCKING WARNING: AMD processors are intended to be operated only within their associated specifications and factory settings. Operating your AMD processor outside of official AMD specifications or outside of factory settings, including but not limited to the conducting of overclocking (including use of this overclocking software, even if such software has been directly or indirectly provided by AMD or otherwise affiliated in any way with AMD), may damage your processor and/or lead to other problems, including but not limited to, damage to your system components (including your motherboard and components thereon (e.g. memory)), system instabilities (e.g. data loss and corrupted images), reduction in system performance, shortened processor, system component and/or system life and in extreme cases, total system failure. AMD does not provide support or service for issues or damages related to use of an AMD processor outside of official AMD specifications or outside of factory settings. You may also not receive support or service from your board or system manufacturer. Please make sure you have saved all important data before using this overclocking software. DAMAGES CAUSED BY USE OF YOUR AMD PROCESSOR OUTSIDE OF OFFICIAL AMD SPECIFICATIONS OR OUTSIDE OF FACTORY SETTINGS ARE NOT COVERED UNDER ANY AMD PRODUCT WARRANTY AND MAY NOT BE COVERED BY YOUR BOARD OR SYSTEM MANUFACTURER'S WARRANTY.

The software that has been directly or indirectly provided by AMD or an entity otherwise affiliated with AMD may disable or alter: (1) software including features and functions in the operating system, drivers and applications, and other system settings; and (2) system services. WHEN THE SOFTWARE IS USED TO DISABLE OR ALTER THESE ITEMS IN WHOLE OR PART, YOU MAY EXPERIENCE (A) INCREASED RISKS THAT CERTAIN SECURITY FUNCTIONS DO NOT FUNCTION THEREBY EXPOSING YOUR COMPUTER SYSTEM TO POTENTIAL SECURITY THREATS INCLUDING, WITHOUT LIMITATION, HARM FROM VIRUSES, WORMS AND OTHER HARMFUL SOFTWARE; (B) PERFORMANCE AND INTEROPERABILITY ISSUES THAT MAY ADVERSELY AFFECT YOUR EXPERIENCE AND THE STABILITY OF YOUR COMPUTING SYSTEM; AND (C) OTHER EXPERIENCES RESULTING IN ADVERSE EFFECTS, INCLUDING, BUT NOT LIMITED TO, DATA CORRUPTION OR LOSS.

Nice.


----------



## Skyl3r

Quote:


> Originally Posted by *bluezone*
> 
> Speaking of drivers. 17.9.1.
> 
> http://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.9.1-Release-Notes.aspx
> 
> Apparently they look down on overclocking.
> 
> Nice.


Don't blow up your GPU








I'll look down on you too!


----------



## bluezone

Quote:


> Originally Posted by *Skyl3r*
> 
> Don't blow up your GPU
> 
> 
> 
> 
> 
> 
> 
> 
> I'll look down on you too!


LOL.


----------



## Ne01 OnnA

"Vega FE is 16GB HBM2 2 stacks. There is not enough room on the standardized GPU/interposer package for more than 2 stacks. iirc the interposer is almost as big as it can be made at the moment. The Vega GPU is a pretty big die due to the added transistors for higher clocks/on die memory/pipeline improvements."

So we need to wait for ->

*AMD Production of Vega 11:*
GlobalFoundries has been contracted to begin manufacturing AMD's next-gen Vega GPUs; Siliconware Precision Industries has also been hired as its chip packager, according to DigiTimes.

GlobalFoundries will be the company producing the wafers for AMD's new Vega 11 GPUs. They will be using the 14nm FinFET process technology to manufacture the new series of GPUs, with the sources saying they are holding "backend orders for the Vega 11 series."

Packaging specialist SPIL, which has already obtained orders for AMD's Vega 10-series chips, will continue to hold the majority of backend orders for the Vega 11 series, the sources noted.

Taiwan Semiconductor Manufacturing Company (TSMC) with its CoWoS (chip-on-wafer-on-substrate) technology has reportedly secured orders for AI chips from Nvidia and Google. TSMC has further enhanced its advanced packaging capability eyeing a bigger presence in the supercomputer field, the sources said.


----------



## PontiacGTX

The interposer opinion sounds like it comes from someone who isn't talking facts.


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> the interposer opinion sounds from someone who isnt talking with facts


Are you referring to me? There was something written about Vega being near the maximum interposer size in one of the many Vega FE/RX Vega previews/reviews.


----------



## PontiacGTX

Quote:


> Originally Posted by *bluezone*
> 
> Are you referring to me? There was something written about Vega being near maximum interposer size in one the many Vega FE/RX Vega preview/reviews.


no i mean the quote from the post above
Quote:


> "Vega FE is 16GB HBM2 2 stacks. There is not enough room on the standardized GPU/interposer package for more than 2 stacks. iirc the interposer is almost as big as it can be made at the moment. The Vega GPU is a pretty big die due to the added transistors for higher clocks/on die memory/pipeline improvements."


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> no i mean the quote from the post above


Yes that would be me being quoted by Ne01 OnnA.
Quote:


> "Vega FE is 16GB HBM2 2 stacks. There is not enough room on the standardized GPU/interposer package for more than 2 stacks. iirc the interposer is almost as big as it can be made at the moment. The Vega GPU is a pretty big die due to the added transistors for higher clocks/on die memory/pipeline improvements."




This is Vega FE with 16GB (2 stacks) of HBM2. I see no room on the package for more HBM stacks. I can quote the Fiji launch coverage regarding size limits on the interposer: size-wise it is the same, or close to the same, interposer and package size as Fiji. Vega is a big chip, and adding more HBM would increase its size due to the additional memory controllers.
Quote:


> Finally, as large as the Fiji GPU is, the silicon interposer it sits on is even larger. The interposer measures 1011mm2, nearly twice the size of Fiji. Since Fiji and its HBM stacks need to fit on top of it, the interposer must be very large to do its job, and in the process it pushes its own limits. The actual interposer die is believed to exceed the reticle limit of the 65nm process AMD is using to have it built, and as a result the interposer is carefully constructed so that only the areas that need connectivity receive metal layers. This allows AMD to put down such a large interposer without actually needing a fab capable of reaching such a large reticle limit.


http://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/3

As for Ne01 OnnA's post below his quote of me, I'm not sure how it has any bearing on what I had replied to him earlier. Thus I made no further reply to him. He sometimes includes unrelated random items in his posts.


----------



## PontiacGTX

But I mean, how do you know the interposer cannot be bigger?


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> but I mean how do you know interposer can not be bigger


In a practical sense, it is already bigger than can be photolithographically printed with circuitry. The short version is that not all of the interposer's layers contain circuit paths; some of it is just dead space, such as the metal ring area around the interposer package. This is due to the industry's process aperture size: how large an area they can actually print at the required process size. They could place the aperture farther away from the silicon being printed upon (just like moving a magnifying glass), but the process size would increase too.
On a side note, Nvidia recently paid their silicon supplier to build larger-aperture (non-standard) equipment for their own use.
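The area mismatch described above can be put in rough numbers. The 1011mm² interposer and ~596mm² Fiji die figures are from the AnandTech piece quoted earlier; the 26mm x 33mm exposure field is the commonly cited maximum single-shot lithography area and is an assumption here, not a figure from the thread:

```python
# Rough numbers behind the "interposer exceeds the reticle limit" point.
RETICLE_FIELD_MM2 = 26 * 33       # ~858 mm^2: assumed max single-exposure area
FIJI_INTERPOSER_MM2 = 1011        # from the AnandTech Fury X review
FIJI_DIE_MM2 = 596                # approximate Fiji GPU die size

# The interposer cannot be printed in one exposure, hence the partial
# metal layers AMD describes in the quoted passage.
print(FIJI_INTERPOSER_MM2 > RETICLE_FIELD_MM2)   # True
print(round(FIJI_INTERPOSER_MM2 / FIJI_DIE_MM2, 2))  # ~1.7x the GPU die
```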


----------



## PontiacGTX

I think they can move the VRM to the right and increase the interposer; then there wouldn't be such a thing as a limited interposer.


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> I think they cna mode the VRM tot he right and increase the interposer there wouldnt be such thing as limited interposer


Sorry I don't recall what "cna mode" is.


----------



## PontiacGTX

Quote:


> Originally Posted by *bluezone*
> 
> Sorry I don't recall what "cna mode" is.


Typo, I mean they can move the VRM to make more space for the interposer.


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> typo I mena they can move the VRM to make more space for interposer


I take it you mean the HBM stacks. VRMs can be placed anywhere on the video card, but the main GPU VRMs should be placed close to the GPU/interposer to avoid lag and capacitance issues (as well as other problems). The VRMs don't limit the GPU package size on Vega.
Plus there is a limit, depending on the method of component handling, on how close the package components can be placed; they cannot be touching each other either, as differing expansion rates under thermal load are the culprit.
Besides, they already have 4 and 8GB HBM stacks, and Vega would need on-die componentry to handle the interface to additional HBM stacks.
I'm not sure if you could add additional stacks to the existing interfaces.
Each HBM stack has its own memory controller on the bottom of its die stack.

EDITED: I left out a word.

Here read this.

https://www.quora.com/Why-does-a-CPU-GPU-chip-have-a-physical-size-limit


----------



## xBastek

So with the current driver, can we overclock memory?
I'm still on 17.7.1 because of that.


----------



## PontiacGTX

Quote:


> Originally Posted by *xBastek*
> 
> So with the current driver can we overclock memory?
> Im still on 17.7.1 because of that.


You can always overclock memory using MSI AB or Trixx.


----------



## AliNT77

You can't actually overclock memory with any driver past 16.12.2.


----------



## bluezone

New drivers 17.9.2

Release notes and links.

https://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.9.2-Release-Notes.aspx

I did a few quick test runs with these new drivers; I think I like them.

Here is FS @ 1050 MHz. 17,283 graphics score.

https://www.3dmark.com/3dm/22282651

FS @ 1100 MHz set in WattMan. 17,706 graphics score.

https://www.3dmark.com/3dm/22282750

FS @ 1100 MHz, no tess. 19,746 graphics score.

https://www.3dmark.com/3dm/22282847

I will have to play around some more.


----------



## gupsterg

An update as an ex Fury X owner.

Last night was 1st time I did some actual gaming on MSI GTX 1080 Sea Hawk EK X. Prior time has just been spent benching/tweaking ThreadRipper setup.

I have ~150hrs+ clocked up in SWBF on the Fury X, so I'd say I have a lot of experience of how it performs capped at 88FPS with FreeSync.

I have not OC'd the GTX 1080; due to it being on WC and how Nvidia Boost 3.0 works, I see flat clocks of ~1965MHz in benches.

So in SWBF, same Ultra 1440P setup but no V-Sync and no FreeSync, I get a range of say ~120-144FPS. The gameplay at times does not seem as smooth as the Fury X capped at 88FPS with FreeSync.

Next I tried Lords of the Fallen; I get somewhere around ~90FPS+. Again the experience was not as smooth as the Fury X IMO.

I then thought let's cap FPS in games to say 90 and use the same refresh rate with V-Sync; to my surprise I found no such option in the Nvidia driver panel.

I called it a night then and plan to try other things today to see if I can smooth out gameplay 'experience'.

On another note, my Fury X sold a day or so ago; packed and ready to ship today. Some regret has crept in on swapping GPUs, but I'm not welching on the sale.

I have a sneaky feeling that variable refresh rate tech has spoiled me. I may have to dispose of the GTX 1080 sooner rather than later.


----------



## Ne01 OnnA

gupsterg

YES, the Fiji + FreeSync combo is awesome (I hope you'll be back to ATI soon).

Also, 17.9.3 is the new best of the best for my Fury.


Spoiler: Warning: Spoiler!


and Gamer Mode, ~240W max


----------



## jearly410

@gupsterg Ain't that something, eh? I'm another ex Fury X user, but I made the jump to Vega 64, and variable refresh has become something I can't live without. Even though I reach 80+ FPS at 3440x1440 100Hz all ultra, I can instantly tell when FreeSync is disabled (Ultimate General: Civil War has had flicker problems with FreeSync since the Crimson update, and I have to disable FreeSync for it).

My friend, who has a 1080 with no G-Sync, played with my Fury X FreeSync system, said it felt better, and the next day bought a G-Sync monitor.

Get the Vega, gup! It is a noticeable improvement over my old CrossFire Fury X system.


----------



## bluezone




Quote:


> Originally Posted by *gupsterg*
> 
> An update as an ex Fury X owner. [snip]





Hi gupsterg. Looks like AdoredTV likes the upper-range Nvidia series too; he tested the GTX 1080 Ti.

Do you think you might try out a G-Sync panel? How locked down are the GTX series via BIOS mod?


----------



## gupsterg

@Ne01 OnnA

I hear you, man. I wish I had gamed on the GTX 1080 before placing the advert for the Fury X.

She be gone now to a new home.

A real fine card. Truly golden, like my i5 4690K; even when I sold that I had regret.

@jearly410

I think when you've used variable refresh rate tech it's damn hard to go back. It's defo more in-your-face than, say, going from a high refresh rate monitor to a lower one.

I have found one Vega 64 here in the UK, close to launch price at ~£479. It's open box. I may pull the trigger on it.

I have been on AMD GPUs since the HD5xxx; my last Nvidia card was a GTX 280. Imagine my shock to be greeted with a very similar driver panel as back then.

@bluezone

Nope, not going G-Sync. Besides there being far fewer G-Sync panel models than FreeSync ones, there's a premium for them (as you'd know).

BIOS modding is not possible on Nvidia.

I was just reading an OCuk thread on Nvidia drivers. Guess the recent discussion?


Spoiler: Warning: Spoiler!


Members were sharing a link to a tool that extracts just the driver from the Nvidia driver package, so they won't have the telemetry/GeForce Experience.


The grass does not seem greener on the green side to me now.


----------



## Ne01 OnnA

"I have been AMD GPU since HD5xxx, my last nVidia card was GTX 280. Imagine my shock to be greeted with very similar driver panel as back then" -> My last nV was also a GTX285 (I still have it; I put it back in when I'm changing GPUs).

.... ATI 9800 Pro 128MB -> 6600 GT 256MB -> 1950XT -> 3870 -> GTX285 -> HD 5870 -> 7970 GHz Ed. & XFX 280X Black -> Nitro Fury-X -> pending

VEGA or Navi?


----------



## Offler

I purchased an AMD Fury X by Gigabyte a few months ago. It was an early sample which had been used for testing by local hardware magazines.

As people on this forum know, the pump it had was the early one with the high-pitched noise. After a few weeks the sound was gone, and yes, the pump is definitely working fine, as temperatures are always at 40-50 degrees Celsius.

(Hm. I should change my signature.)

The change of graphics card led to one minor connectivity problem: my old 23" Samsung display had either DVI-D or HDMI, while the Fury X has either HDMI or DisplayPort. HDMI proved to have inferior image quality for the setup, so...

https://www.samsung.com/us/computing/monitors/uhd-and-wqhd/samsung-uhd-28-monitor-with-high-glossy-black-finish-lu28e590ds-za/

So far I really like the image quality, and FreeSync works really, really nicely. I have encountered a few troubles:

a) The Windows setting for 150% size of everything: it took several reboots for the system to "remember" the setting. A few times it simply happened that the system started at 100% size while claiming 150%. Note I have Windows 7.

b) Miranda "docking" usually takes up much larger space than displayed.

c) Some games have problems with proper scaling: Skyrim Legendary Edition works fine in normal fullscreen, but in borderless windowed the image is about 4x bigger than expected. Also, even when borderless windowed mode works fine, the system's Start panel remains "always on top" even though that setting is not enabled.

d) On some occasions the graphics driver reports "Stream through DisplayPort ended". I noticed it's sometimes related to a change of resolution.

Anyone here have experience with the combination of Windows 7 x64, Fury X, and a 4K display with FreeSync support?


----------



## 99belle99

Well, I've done a BIOS mod on my Fury X, thanks to that very informative first post in the Fiji BIOS editing thread by gupsterg. I am running 1150MHz core and 550MHz HBM with an extra bit of voltage. Seems stable so far; I haven't gamed on it yet though, just run synthetic benchmarks.


----------



## Offler

Ok, I resolved the issue with image size.

Somehow the AMD driver turned on the Virtual Super Resolution feature: it took the much lower resolution of 1920x1080 and stretched it to the fullscreen 3840x2160, which resulted in a pixelated image (mainly the letters in the system) instead of using the display's native resolution.

I changed the DisplayPort; the driver now claims "Not supported" and does not attempt to do that.


----------



## Alastair

Quote:


> Originally Posted by *99belle99*
> 
> Well done a bios mod on my Fury X thanks to that very informative first post in the fiji bios editing thread by gupsterg. I am running at 1150MHz and 550MHz HBM and an extra bit of voltage. Seems stable so far anyway. Haven't gamed on it yet though just done synthetic benchmarks.


Are you using any of the newer drivers? Because as far as I am aware, even on the new drivers the BIOS mods no longer change HBM clocks.


----------



## rbys

Just wondering if anyone has tried the latest beta version of Afterburner (4.4.0 beta 19). When I try to set -76mV it only accepts -75mV instead. I thought that Fiji undervolted in -12mV steps ( http://www.tomshardware.com/reviews/msi-afterburner-undervolt-radeon-r9-fury,4425.html )?

https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-42#post-5479753
Quote:


> · Improved voltage offset calculation accuracy for AMD Fiji, Ellesmere and Baffin GPU families


----------



## bluezone

Quote:


> Originally Posted by *Alastair*
> 
> Are you using any of the newer drivers because as far as I am aware even on new drivers the BIOS mods no longer change HBM clocks.


HBM clocks are still locked in the newer drivers, but I have not tested the very latest one (17.10.1) yet.
Quote:


> Originally Posted by *rbys*
> 
> Just wondering if anyone tried the latest beta version of Afterburner (4.4.0 beta 19). When I try to do -76mv it only accept -75mv instead. I thought that Fiji was undervolting in -12mv steps ( http://www.tomshardware.com/reviews/msi-afterburner-undervolt-radeon-r9-fury,4425.html )?
> 
> https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-42#post-5479753


6.25mV steps, so you will end up with odd values like -75mV showing up instead of -76mV. You still adjust in roughly 6mV increments.
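The snapping behaviour is just rounding to the nearest multiple of 6.25mV; a minimal sketch (the helper name is mine):

```python
# Fiji's voltage offset moves in 6.25 mV steps, so an arbitrary request
# snaps to the nearest multiple -- which is why -76 mV becomes -75 mV.
STEP_MV = 6.25

def snap_offset(requested_mv: float) -> float:
    """Return the nearest offset actually representable in 6.25 mV steps."""
    return round(requested_mv / STEP_MV) * STEP_MV

print(snap_offset(-76))   # -75.0 (12 steps)
print(snap_offset(-100))  # -100.0 (an exact multiple, 16 steps)
```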


----------



## 99belle99

Quote:


> Originally Posted by *Alastair*
> 
> Are you using any of the newer drivers because as far as I am aware even on new drivers the BIOS mods no longer change HBM clocks.


Really? I didn't know that. Thanks for letting me know.


----------



## OZrevhead

Guys, I have not owned an AMD video card since the 290X release, but I am looking at either a Nano or a Fury X for my SG13 WC ITX build (mostly because they are short and you can get full water blocks for them). It's been 2 years since these were released, and I noticed that the Nano and Fury X didn't score that well in release reviews, with benchmarks placing them around 980 or 980 Ti level, but more recent reviews have them scoring better; is this due to driver improvements? I am tending towards a Nano as I already have a full block for one; are there BIOS mods available for the Nano to bring it closer to Fury X performance? The only thing putting me off a Nano or Fury X is that they are quite hard to find. I am currently watching one of each for sale and they are about the same price, so no advantage to either there. What do you guys think?

Any help would be greatly appreciated.

Thanks


----------



## diggiddi

If they cost the same, just get the Fury X.


----------



## Starbomba

My own Nano behaves somewhere between a 1060 and a 1070, closer to the latter than the former. It is still good performance for the money. All games I play run at 4K, and I do not have any issues whatsoever at High details.

If you're going the full WC block route, get the Fury X. It has a better PCB overall, and both cards are already shorter than average.


----------



## FlawleZ

Anyone tried 17.10.1 yet?


----------



## xkm1948

17.10.1 was unstable for me. Constant BSOD for my FuryX.

Currently using Windows 10 FCU with 17.40 beta driver. A lot more stable than 17.10.1


----------



## Ne01 OnnA

I'm on 17.9.3 WHQL and this is the best for me on the Win10 CU update.

Also noticed all games perform a little bit better.

Good update IMO.


----------



## djsatane

The latest 17.10.2 drivers have a weird issue with games on my Fury X. If I set the video mode in game to fullscreen (meaning the GPU is not used when alt-tabbed out of the game), then whenever I alt-tab out of or back into the game there is a half-second or so input pause every time. This did not occur on previous drivers... In windowed modes it's fine, but this is not something I would consider progress.


----------



## Ne01 OnnA

All in 1440p


Spoiler: Elex & EW2


----------



## Minotaurtoo

interesting results


----------



## FlawleZ

Vega is looking better with the latest update and drivers. Our R9 Fury cards are maturing very well, most of the time now trumping the 980 Ti, which is a good feeling.


----------



## PontiacGTX

Quote:


> Originally Posted by *Ne01 OnnA*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All in 1440p
> 
> 
> 
> 
> Spoiler: Elex & EW2
> 
> 
> 
> 
> 
> 
> [/spoile]


meanwhile


----------



## CptAsian

Quote:


> Originally Posted by *PontiacGTX*
> 
> meanwhile [snip]

Is Wolfenstein II in DX12 or Vulkan or what? Impressive.


----------



## PontiacGTX

Quote:


> Originally Posted by *CptAsian*
> 
> Is Wolfenstein II in DX12 or Vulkan or what? Impressive.


Vulkan


----------



## Arizonian

Quote:


> Originally Posted by *FlawleZ*
> 
> Vega looking better with the latest update and drivers. Our R9 Fury cards are maturing very well. Most of the time now trumping 980 TI which is a good feeling ?


I agree. Best $499 I've spent on a GPU in a long time, back when the Nitro Fury released, and it's held up for my gaming just fine. The AMD driver team did a turnaround and has been day-one game ready since the 290X released. I think August 2018 will be interesting and will finally be time for me to upgrade. Fury has been like a fine wine for me.


----------



## CptAsian

Quote:


> Originally Posted by *PontiacGTX*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CptAsian*
> 
> Is Wolfenstein II in DX12 or Vulkan or what? Impressive.
> 
> 
> 
> Vulkan

Cool, thanks. I suspected as much.


----------



## PontiacGTX

@Unwinder added even better voltage control for Fiji GPUs! http://www.overclock.net/t/1641162/g3d-msi-afterburner-4-4-0-stable-final


----------



## bluezone

Quote:


> Originally Posted by *PontiacGTX*
> 
> @Unwinder added even better voltage control to fiji GPUs! http://www.overclock.net/t/1641162/g3d-msi-afterburner-4-4-0-stable-final


Very cool, good catch on the update!

+1 REP.


----------



## xkm1948

Quote:


> Originally Posted by *bluezone*
> 
> Very cool, good catch on the update!
> 
> +1 REP.


What are the actual benefits, then? Better overclocking? Restored HBM overclocking?


----------



## PontiacGTX

Well, it was very polite of him.
Quote:


> Originally Posted by *xkm1948*
> 
> What is the actual benefits then? Better overclocking? Restore of HBM overclocking?


you can undervolt further


----------



## bluezone

Another game compatibility driver. ( Call of Duty®: WWII/AMD XConnect™ Technology Vega). Relive 17.11.1

release notes.

https://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.11.1-Release-Notes.aspx

EDIT: Seems to be a decent driver.

https://www.3dmark.com/3dm/23054482 Graphics Score 17 903


----------



## 99belle99

Quote:


> Originally Posted by *bluezone*
> 
> Another game compatibility driver. ( Call of Duty®: WWII/AMD XConnect™ Technology Vega). Relive 17.11.1
> 
> release notes.
> 
> https://support.amd.com/en-us/kb-articles/Pages/Radeon-Software-Crimson-ReLive-Edition-17.11.1-Release-Notes.aspx
> 
> EDIT: Seems to be a decent driver.
> 
> https://www.3dmark.com/3dm/23054482 Graphics Score 17 903


How did you get your memory speed reported as 800MHz?


----------



## bluezone

Quote:


> Originally Posted by *99belle99*
> 
> How did you get your memory speed reported as 800MHz?


It's a memory speed reporting bug in the driver. A few people have encountered it. So I didn't do anything to make it report 800MHz.


----------



## PolluxCastor

Just saying hello with my x58 / Fury setup.

Currently running -48mV on voltage and +50MHz on core for an 1100MHz clock, drawing around 220 watts under load.
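For anyone wondering why a small undervolt can offset a clock bump like this: a common first-order approximation is that dynamic power scales with f·V². This is a generic CMOS rule of thumb, not an AMD-published model, and the 1.2 V / 1050 MHz stock point below is an illustrative assumption:

```python
def dynamic_power_ratio(v_new_mv, v_stock_mv, f_new_mhz, f_stock_mhz):
    """First-order CMOS dynamic power model: P ~ f * V^2.

    Returns the expected power of the new operating point relative to
    stock. Ignores static/leakage power, so treat it as a rough
    estimate, not a measurement.
    """
    return (f_new_mhz / f_stock_mhz) * (v_new_mv / v_stock_mv) ** 2

# Hypothetical stock point of ~1200 mV / 1050 MHz (assumed, not from
# AMD specs), versus an undervolted -48 mV point at +50 MHz:
ratio = dynamic_power_ratio(1200 - 48, 1200, 1100, 1050)
# ratio comes out just below 1.0: slightly less power despite the
# higher clock, because voltage enters the model squared.
```

So even a modest undervolt can roughly cancel the power cost of a small overclock, which matches the ~220 W load figure staying reasonable.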


----------



## diggiddi

Welcome to the group, nice looking GPU you got there


----------



## 99belle99

I just downloaded Heaven 4.0 after not using it for a long time. I had an overclock set in AMD WattMan, and at the end of each benchmark run (right before the finish) the WattMan settings would crash and revert to stock. The exact same place each time. Has anyone else seen this?


----------



## xkm1948

Have any original Fury X owners had their card's pump fail? Reading an Italian hardware site, it seems the original Fury X pumps are starting to fail after 2 years.

https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.bitsandchips.it%2F9-hardware%2F9120-raijintek-morpheus-ii-prova-su-strada-con-una-radeon-r9-furyx&edit-text=&act=url


----------



## bluezone

Intel and AMD today announced "AMD To Develop Semi-Custom Graphics Chip For Intel".. http://www.tomshardware.com/news/amd-intel-gpu-emib-soc,35852.html.

Besides this shocker, there's no need for an interposer with this solution, though I wonder if the GPU and HBM will still use one.


Spoiler: Warning: Spoiler!







EDIT: more info.

https://semiaccurate.com/2017/11/06/intel-gpus-officially-dont-work-amd-rescue/


----------



## 99belle99

I think that is great news, as it's great to see AMD get some cash to further develop new CPUs and GPUs.


----------



## Offler

Quote:


> Originally Posted by *xkm1948*
> 
> Any initial FuryX owner have their their cards' pump failed? Reading on an Italian hardware site seems to show that FuryX original pumps are starting to fail after 2 years
> 
> https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.bitsandchips.it%2F9-hardware%2F9120-raijintek-morpheus-ii-prova-su-strada-con-una-radeon-r9-furyx&edit-text=&act=url


I have one of the initial pumps. The initial high-pitched noise is not there anymore. But I haven't had the card active for more than 3 months now. After two years I would do a complete replacement anyway.


----------



## bluezone

Now it gets weirder. Not that this was unexpected after Vega, when an RTG manager suddenly took a leave of absence.

"Statement from AMD - Raja Koduri Leaves"

https://www.hardocp.com/news/2017/11/07/statement_from_amd_raja_koduri_leaves

What happens now?


----------



## xkm1948

Good riddance. Raja ****ed up Radeon. It is good he is gone


----------



## bluezone

If what I've been reading is correct, it sounds like Raja was trying to remove RTG from AMD and have it spun off to Intel. This sounds suspiciously plausible, considering that Intel just laid off a good chunk of its graphics department engineering staff just before the announcement of RTG graphics usage. AMD may have just fired the gun that Intel had pointed at its own foot: Raja is no longer there to steer the boat in their direction.
What do you guys think?


----------



## bluezone

I might have spoken too soon.

https://wccftech.com/exclusive-raja-koduri-will-seeking-new-horizons-intel/


----------



## Offler

Anyone else paired FuryX with Phenom II?

https://www.3dmark.com/fs/12415999

Weird feeling to search Fire Strike results and be No. 1 for this CPU+GPU combination with a valid result, especially when the GPU isn't overclocked at all.


----------



## u3a6

Quote:


> Originally Posted by *Offler*
> 
> Anyone else paired FuryX with Phenom II?
> 
> https://www.3dmark.com/fs/12415999
> 
> Weird feeling to search for firestrike results and be Nr.1 with CPU+GPU combination valid result. Especially when GPU is not clocked at all.


It's a shame that in the first link the combined is so abysmal...
https://www.3dmark.com/fs/11269229

https://www.3dmark.com/fs/10827632

(That is a fury unlocked to 64CU's, but it is listed as a fury non x)


----------



## Offler

Quote:


> Originally Posted by *u3a6*
> 
> It's a shame that in the first link the combined is so abysmal...
> https://www.3dmark.com/fs/11269229
> 
> https://www.3dmark.com/fs/10827632
> 
> (That is a fury unlocked to 64CU's, but it is listed as a fury non x)


thanks

So I can read from that... potential graphics performance up to 19,000. I just lack proper tools to overclock, other than the ones in the driver.

Your Phenom 1100 is at the same frequency as my 1090, so in general they should perform the same. I just have faster memory at really tight latencies (1600MHz 6-6-6-18).


----------



## u3a6

Quote:


> Originally Posted by *Offler*
> 
> thanks
> 
> So i can read from that ... potential Graphic performance up to 19 000. I just lack some proper tools to perform overclock - except the ones in driver.
> 
> Your Phenom 1100 is on same frequency as mine 1090 so ... in general they should perform same. I just have faster memory on really tight latencies (1600 Mhz 6-6-6-18)


My old Crosshair III Formula could run this bad boy at 4.1GHz; unfortunately this is all my MSI 970 Gaming allows me to do (a truly horrible board). I have my NB at 2800MHz though. Your memory is much faster than mine, yes.







If you put some effort into it you will breeze past my score easily. Let me know how it goes!

EDIT: This is how my card looks at the moment


----------



## diggiddi

Quote:


> Originally Posted by *Offler*
> 
> Anyone else paired FuryX with Phenom II?
> 
> https://www.3dmark.com/fs/12415999
> 
> Weird feeling to search for firestrike results and be Nr.1 with CPU+GPU combination valid result. Especially when GPU is not clocked at all.


Wow that's a very unbalanced system







No wonder you're the only one. How does it run, considering the CPU bottleneck?


----------



## Offler

Quote:


> Originally Posted by *diggiddi*
> 
> Wow that's a very unbalanced system
> 
> 
> 
> 
> 
> 
> 
> no wonder you are the only one, how does it run considering the cpu bottleneck


I actually never hit a CPU bottleneck based on CPU utilization (either on 1 core or across all 6 cores). It's very surprising.

That said, my settings and config contain a few tricks.

a) Memory remap feature disabled
So everything on PCI-E has a dedicated 1GB+ just for transfers (that's a requirement dictated by my RAID). The CPU uses the remaining 7GB and doesn't make concurrent reads/writes competing for the PCI-E bus. The CPU can reach higher performance peaks this way (though if the CPU or RAM is not stable, the system might crash with remap disabled).

b) CAS latency 6
Effective CPU performance is 5-10% better (based on LinX tests with small block sizes) compared to a more conservative CAS 8.

c) High HT frequencies
Currently 2800 and 2400, but I have to check. Transfers from CPU to GPU were about 5.4 Gb/s, and the short latency contributes to overall performance.

d) FullHD or 4K gaming, 60 FPS, full details
Either V-synced with triple buffering, or frame limited. This is the approach you find on consoles, where you aim for an ideal image output. Usually neither the CPU nor the GPU hits 100%, as the real bottleneck is the display.

If it hits 60 FPS (FreeSync on) and input latency is working fine... no problem.

On the other hand, I really don't play the newest games. Recently: Skyrim Special Edition (a well-balanced engine), StarCraft II, Thief (with Mantle, again a balanced engine), and The Witcher 3: Wild Hunt (and I'm really surprised, as it runs exceptionally well).

I think it has something to do with:
http://www.overclock.net/t/1641119/hardware-canucks-i7-2600k-vs-i7-8700k-is-upgrading-worthwhile
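The frame-limit approach in (d) boils down to capping frame time so neither the CPU nor the GPU has to run flat out; each only needs to finish within the frame budget. A minimal sleep-based sketch (illustrative only; in practice the driver's FRTC or an in-engine limiter does this with much better timing precision):

```python
import time

def frame_limited_loop(render_frame, target_fps=60.0, duration_s=1.0):
    """Cap a render loop at target_fps by sleeping away spare time.

    The idle time spent in sleep() is exactly the headroom that lets
    CPU and GPU stay below 100% utilization.
    """
    frame_budget = 1.0 / target_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        render_frame()  # stand-in for the real CPU/GPU work per frame
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # burn the leftover budget
        frames += 1
    return frames

# A fast "frame" (2 ms of simulated work) still gets capped near 60 FPS:
fps = frame_limited_loop(lambda: time.sleep(0.002), target_fps=60.0)
```

The same idea explains why neither component shows 100% load: the display's refresh rate, not the silicon, is the limiter.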


----------



## diggiddi

Quote:


> Originally Posted by *Offler*
> 
> I actually never hit CPU bottleneck based on CPU utilization (either for 1 core or all 6 cores). Its very surprising.
> 
> Yet my settings and config contain few tricks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> a) Memory remap feature disabled
> So everything on PCI-E has dedicated 1gb+ only for transfers. (thats a requirement dictated by my Raid) CPU use the rest 7gb while does not make concurrent read/writes competing PCI-E bus. CPU can get higher performance peaks this way (and if CPU or RAM is not stable, system might crash if remap is disabled)
> 
> b) CAS latency 6
> So effective CPU performance is 5-10% better (based on linx tests with small block sizes), when compared to more conservative CAS 8.
> 
> c) High HT Frequencies.
> Currenlty 2800 and 2400, but i have to check. Transfers from CPU to GPU were about 5.4 Gb/s while short latency participates on overall performance.
> 
> d) FullHD or 4k gaming, 60FPS, full details
> Either Vsynced+triple buffer or Frame limited. This is an approach you can find with consoles, when you have an ideal image output. Usually neither CPU or GPU hits 100% as the real bottleneck is display.
> 
> If it hits 60fps (Freesync On), and input latency is working fine... no problem.
> 
> But on other hand i really dont play the newest games. Recently: Skyrim: Special Edition (great balanced engine), Starcraft II, Thief (with Mantle, and again balanced engine), Witcher 3: Wild hunt (and i am really surprised as it runs exceptionally well).
> 
> I think it has something to do with:
> http://www.overclock.net/t/1641119/hardware-canucks-i7-2600k-vs-i7-8700k-is-upgrading-worthwhile


Cool


----------



## Starbomba

Quote:


> Originally Posted by *Offler*
> 
> Anyone else paired FuryX with Phenom II?
> 
> https://www.3dmark.com/fs/12415999
> 
> Weird feeling to search for firestrike results and be Nr.1 with CPU+GPU combination valid result. Especially when GPU is not clocked at all.


You just made me wonder how 3DMark would look on my 1400T (unlocked 840T) and Xfire Fury X + Nano. Also, on my other bench rigs too....

Too much HW to bench, too little time D:


----------



## Offler

Quote:


> Originally Posted by *Starbomba*
> 
> You just made me wonder how 3DMark would look on my 1400T (unlocked 840T) and Xfire Fury X + Nano. Also, on my other bench rigs too....
> 
> Too much HW to bench, too little time D:


I planned to have Ryzen by now (the Fury X had to be the first piece of the next build), but even though I know Ryzen/Threadripper performs much better, I don't see it as a problem that the GPU is that much overkill for the CPU.


----------



## xkm1948

FuryX, the most neglected flagship GPU put out by AMD. FineWine? More like FineVinegar.

Would never touch another AMD GPU, got burned hard this round.


----------



## bluezone

Sorry, but haven't most recent Assassin's Creed games been Nvidia GameWorks titles?


----------



## Ne01 OnnA

Yes, but on the Fury X it's strongly recommended to use Very High textures rather than Ultra (in UbiWorks games).
Now get some testing in SW BF2, BF1, NFS or UT







--> Yes Fury shines again









Look -> We can Play easily.




Spoiler: more Games


----------



## kondziowy

In AC: Origins, Fury performance is bad, but it's exactly the same as it was in Unity or Syndicate, so no big surprise here. Maybe it needs a lower tess factor?

The real question is what's going on in other games from the last 1-2 months, in particular Wolfenstein 2 on Vulkan: even at Medium details, using less than 4GB of VRAM, it's not doing great and barely beats a 390X that's running Mein Leben, not Medium. Has anyone seen more benchmarks at High/Medium?


----------



## Ne01 OnnA

Wolfenstein 2 on Vulkan? Mein Leben settings + Ultra shadows,
45-62 FPS constant (my Chill settings).
Performance is great IMO.


----------



## 99belle99

I find the Fury X decent in every game I play, anyway. Mind you, I haven't played games released in the past few months.


----------



## xkm1948

https://www.reddit.com/r/7ckjj5/furyx_aging_tremendously_bad_in_2017/

FineWine into FineVinegar.


----------



## Tgrove

It has 4GB; what are people expecting in 2017?


----------



## kondziowy

Yeah *@xkm1948*, I have no idea what you and reddit expected either









Just taking 8 random games from 2015 to show how it looked back then:
http://www.pcgameshardware.de/Just-Cause-3-Spiel-9784/Specials/Test-Benchmarks-1179397/
http://www.pcgameshardware.de/Dirt-Rally-Spiel-55539/Specials/Benchmark-Test-1182995/
http://www.pcgameshardware.de/The-Division-Spiel-37399/Specials/Beta-Benchmark-Test-1184726/
http://www.pcgameshardware.de/Rainbow-Six-Siege-Spiel-54457/Specials/Test-Benchmarks-Tuning-1179802/
http://www.pcgameshardware.de/Rise-of-the-Tomb-Raider-Spiel-54451/Specials/Grafikkarten-Benchmarks-1184288/
http://www.pcgameshardware.de/Fallout-4-Spiel-18293/Specials/Test-Benchmark-vor-Release-1177284/
http://www.pcgameshardware.de/The-Witcher-3-Spiel-38488/Specials/Test-Version-107-Windows-10-1165910/
http://www.pcgameshardware.de/Call-of-Duty-Black-Ops-3-Spiel-55478/Specials/BO3-Beta-Benchmarks-Windows-10-1169217/

So Fury was in preeeety bad shape against the 980 Ti in all of them, and I LOLed at Rise of the Tomb Raider, where the 980 Ti is at 170%.

The one weird game right now is just this Wolfenstein 2, and that can't be explained; but maybe it's just the R9 390's 8 gigs making the difference, because the R9 290 4GB is well behind Fury.

BTW, I dropped Ultra textures on Fury when Mirror's Edge Catalyst released in mid-2016. Now about 50% of games need more VRAM.

Looking at those old results, it seems Fury is actually gaining over time, and being within 6% of a 1070 Ti/1080 in one of the best-looking games at 4K looks sick af, to be honest.


----------



## rbys

Has anyone noticed a memory leak in MSI Afterburner 4.4.0 on Fiji with AMD Crimson 17.10.2 to 17.11.x?



more info here: https://forums.guru3d.com/threads/bug-memory-leak-in-msi-afterburner-4-4-0-final.417875/


----------



## B'Fish

Hello people,

got a bit of a ****ty problem at the moment. I grabbed myself a 2nd-hand R9 Fury Nitro OC from Sapphire. The card did great for about 2 months, but I tried to flash the BIOS once and bricked it. No problem, I thought, since it has dual BIOSes anyway. But now it seems the 2nd BIOS bricked itself after a few weeks of light usage.

Is it possible for a BIOS to get corrupted? We had a power surge at home; I woke up with the lights not working, etc. The PC was off at the time and started again without problems, but after I shut it down it got wrecked, I guess. The PC works on the iGPU, and the video card will not boot from another PCI-Express slot either. I really hope it isn't just dead... :<

I want to try to reflash the BIOS, but is there a quick way to check whether the GPU is really broken?


----------



## LeadbyFaith21

So I took an EK block off my Fury X and put the AIO back on, and now I get some slight corruption on the screen, with 2 bars of pixels randomly turning red, green, blue and white. It then goes into a boot loop after a few minutes. Does anyone know what could cause this and how to fix it?


----------



## u3a6

Quote:


> Originally Posted by *LeadbyFaith21*
> 
> So I took a EK block off my Fury X and put the AIO back on it and now get some slight corruption on the screen, with 2 bars of pixels randomly being red, green, blue and white. It then goes into a boot loop after a few minutes. Does anyone know what could be the cause of this and how to fix it?


Looks like a hardware issue... Maybe interposer damage?


----------



## FlawleZ

Quote:


> Originally Posted by *xkm1948*
> 
> 
> __
> https://www.reddit.com/r/7ckjj5/furyx_aging_tremendously_bad_in_2017/
> 
> FineWine into FineVinegar.


Hameeeedo posted the same troll fanboy post on hardforum. Fanboy is fanboy. Trolls will be trolls.


----------



## kondziowy

Looks like the 980 Ti gets wrecked by the Fury X in VRMark DX12. The whole Maxwell line is doing badly.
https://overclock3d.net/reviews/software/vrmark_cyan_room_dx12_benchmark_-_amd_vs_nvidia/2

That's interesting because Maxwell was doing well in Time Spy.


----------



## Alastair

Anyone tried patching the ATIkmdag files using the patcher to try to unlock memory overclocking through Afterburner? I heard rumours this might give us our HBM OCs back.


----------



## u3a6

Quote:


> Originally Posted by *Alastair*
> 
> Any one tried patching the ATIkmdag files using the patcher to try unlock memory overclock through afterburner. I heard rumours around this might give us out HBM Oc's back.


That would be sweet!


----------



## diggiddi

Quote:


> Originally Posted by *u3a6*
> 
> That would be sweet!


Where is our resident HBM expert Gupsterg to the rescue? Gups where ya at?


----------



## xkm1948

Quote:


> Originally Posted by *diggiddi*
> 
> Where is our resident HBM expert Gupsterg to the rescue? Gups where ya at?


Gupsterg has quit modding the Fury X. He moved on to a Team Green 1080. With the way AMD is treating the Fury/X lineup, I would not invest in this arch either.



As you can see, the Fury X was not limited by VRAM; usage did not go over 4GB. Yet at 1440p the Fury X runs like ****. Mind you, this is an AMD-sponsored game!!



Moving on from FuryX to green camp next round for sure.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xkm1948*
> 
> Gupsterg has quit modding FuryX. He moved on to Team Green 1080. With the way AMD treating Fury/X line up I would not investing in this arc as well.
> 
> 
> 
> As you can see the FuryX was not limited by VRAM. The usage did not go over 4GB. Yet at 1440p FuryX runs like ****. Mind you this is an AMD sponsored game!!
> 
> 
> 
> Moving on from FuryX to green camp next round for sure.


I have tested this game with a Fury X. I ran it at Medium 1440p, and yes, you are limited by memory. I was getting huge FPS drops in complex scenes. AMD has completely abandoned Fury because it did not sell that well. The same thing will happen with Vega.


----------



## xkm1948

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have testing this game with Fury X. I run it and Medium 1440p and yes you are limited by memory. Was getting huge fps drops in complex scenes. AMD has completely abandoned Fury because it did not sell that well. Same thing will happen with Vega.


Agreed. Vega is very likely to be abandoned pretty soon as well. AMD's high end is really not worth it as of now.


----------



## geriatricpollywog

I think I killed my Fury X. I replaced the stock AIO and now there are artifacts on the screen during bootup and in Windows. Has anybody seen this before and recovered? I removed the cooler and there is no visible damage to the die.


----------



## kondziowy

Quote:


> Originally Posted by *xkm1948*
> 
> As you can see the FuryX was not limited by VRAM. The usage did not go over 4GB. Yet at 1440p FuryX runs like ****. Mind you this is an AMD sponsored game!!


Why do you say this and then post the graph above? The Fury X is the fastest 4GB card, and the RX 570 4GB is much slower than the RX 470 8GB.
The 980 Ti gets beaten by prehistoric Hawaii and is much slower than the 1070 (and the 1070 was supposed to bring the power of the 980 Ti, Jen-Hsun's own words).


----------



## u3a6

Quote:


> Originally Posted by *0451*
> 
> I think I killed my Fury X. I replaced the stock AIO and now there are artifacts on the screen during bootup and in Windows. Has anybody seen this before and recovered? I removed the cooler and there is no visible damage to the die.


The interposer can be damaged...
Quote:


> Originally Posted by *kondziowy*
> 
> Why are you saying this and then proceed to post the graph above? Fury X is the fastest 4GB card, and RX570 4GB is much slower than RX470 8GB.
> 980Ti beaten by prehistoric Hawaii, and much slower than 1070 (and 1070 was supposed to bring the power of the 980Ti, Jen-Hsun own words).


Hawaii looking insane indeed, I hadn't noticed that before! :O


----------



## PolluxCastor

Fury owners on the latest drivers: can you test whether you pass the OCCT GPU test at any settings, please? My Fury won't pass no matter what clocks or voltages, even underclocked.


----------



## RotheMan

Has anyone tested undervolting with the new Afterburner? I can go up to [email protected] with my Nitro now. How far can you go?


----------



## xkm1948

Titan V is here, folks, with 12GB of HBM2 and 110 TFlops of computing power (9 times the Titan Xp). Man, the new line of Volta-based GPUs is coming. Time to finally upgrade.


----------



## FlawleZ

Quote:


> Originally Posted by *xkm1948*
> 
> Titan V is here folks, with 12GB of HBM2, 110 TFlops of computing power (9 times of TitanXp) Man the new line Volta based GPUs are coming. Time to finally upgrade


3K? No thanks


----------



## xkm1948

Quote:


> Originally Posted by *FlawleZ*
> 
> 3K? No thanks


Nah, I mean we can get the outgoing Titan Xp from rich folks trying to get the Titan V now.









If it is any indication, the upcoming Volta-based 80 and 80 Ti will be absolute beasts.


----------



## gupsterg

Quote:


> Originally Posted by *B'Fish*
> 
> Hello people,
> 
> got a bit of a ****ty problem at this moment. grabbed myself a 2nd hand r9 fury nitro OC from sapphire. The videocard did great for like 2 months but I tried to flash the bios once and bricked it, well no problem because i have 2 biosses on it anyway. But now it seems that the 2nd bios bricked itself after a few weeks of light usage.
> 
> Is it possible that a bios can get corrupted? we had a power surge in our home because i woke up with the lights not working etc. the PC was off at this time and it started again without problems. but after i shut it down it got REKKED i guess. the pc works on the iGPU. videocard will not boot from another pci-express slot etc. I rly hope it isnt just dead..... :<
> 
> I want to try and reflash the bios but is there a way to check if the GPU is like really broken in a quick way?


None that I know of.

If you use a 2nd GPU and ATiWinFlash does not detect the card for flashing, your only option is to flash using an external tool.
Quote:


> Originally Posted by *Alastair*
> 
> Any one tried patching the ATIkmdag files using the patcher to try unlock memory overclock through afterburner. I heard rumours around this might give us out HBM Oc's back.
> Quote:
> 
> 
> 
> Originally Posted by *u3a6*
> 
> That would be sweet!
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> Where is our resident HBM expert Gupsterg to the rescue? Gups where ya at?
> 
> 
> 
> 
> 

I was on a break.

I wouldn't have thought this would have any effect. ToastyX's tool is for other purposes, as stated there. I no longer have a Fiji GPU, so I can't test.
Quote:


> Originally Posted by *xkm1948*
> 
> Gupsterg has quit modding FuryX. He moved on to Team Green 1080. With the way AMD treating Fury/X line up I would not investing in this arc as well.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> As you can see the FuryX was not limited by VRAM. The usage did not go over 4GB. Yet at 1440p FuryX runs like ****. Mind you this is an AMD sponsored game!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Moving on from FuryX to green camp next round for sure.


I did get a GTX 1080. As I had no G-Sync monitor, the experience was pants IMO without variable refresh rate tech. I sold it on within a month at no loss.

The "gaming experience" comparison was an MSI GTX 1080 EK X vs a stock Fury X. Due to Nvidia's boost tech, lower temps on water, and the higher power limit in the GTX 1080's factory VBIOS, it consistently hit ~1975MHz on the GPU on its own. Even high-FPS games like SWBF seemed to hitch/stutter on the GTX 1080 at times, whereas the Fury X with FreeSync was a far superior gaming experience for me. And the driver panel on Nvidia was pants IMO; several features I use regularly, like FRTC and per-game profiles, were not there.

I then purchased a Gigabyte RX Vega 64 AIR (incl. FOC Prey/Wolf. 2) on promo and an EK-FC Radeon Vega block. I like Vega so far; shame the BIOS mod is blocked, so I'm using registry PowerPlay mods. Power usage is pants vs the GTX 1080 IMO, and as it uses more power, it affects loop water temps more.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have testing this game with Fury X. I run it and Medium 1440p and yes you are limited by memory. Was getting huge fps drops in complex scenes. AMD has completely abandoned Fury because it did not sell that well. Same thing will happen with Vega.
> Quote:
> 
> 
> 
> Originally Posted by *xkm1948*
> 
> Agreed. Vega is very likely to be abandoned down the line pretty soon. AMD's high end is really not worth it as of now.

I did feel AMD abandoned Fiji, so that was a reason to sell up. Vega may also go that way; only time will tell. For now I'll just enjoy it as it is.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xkm1948*
> 
> Titan V is here folks, with 12GB of HBM2, 110 TFlops of computing power (9 times of TitanXp) Man the new line Volta based GPUs are coming. Time to finally upgrade


That "110 TFlops" figure is not compute you can use for gaming. It's for deep learning.


----------



## xkm1948

Quote:


> Originally Posted by *gupsterg*
> 
> None that I know of.
> 
> If you use a 2nd GPU and ATiWinFlash does not detect card for flashing your only option to flash is then using external tool.
> I was on a break
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I wouldn't have thought this would have any effect. ToastyX's tool is for other purposes as stated here. I no longer have Fiji GPU, so can not test
> 
> 
> 
> 
> 
> 
> 
> .
> I did get a GTX 1080. As I had no G-Sync monitor the experience was pants IMO without variable refresh rate tech. I sold it on within a month at no loss.
> 
> The compare for "gaming experience" was a MSI GTX 1080 EK X vs stock Fury X. Due to nVidia boost tech, lower temp as on water and higher PL in VBIOS of GTX 1080 from factory, it hit consistently ~1975MHz GPU on it's own. Even high FPS games like SWBF seemed to have a hitch/stutter on GTX 1080 at times, where as Fury X with FreeSync was far superior gaming experience for me. Then the driver panel on nVidia was pants IMO. Several features I use regularly like say FRTC, per game profiles, etc were not there.
> 
> I then purchased a GigaByte RX VEGA 64 AIR (inc. FOC Prey/Wolf.2) on promo and EK-FC Radeon Vega block. I like VEGA so far, shame bios mod is blocked, so using registry PowerPlay mods. Power usage is pants vs GTX 1080 IMO. As it uses more power it affects loop water temps more.
> I did feel AMD abandoned Fiji, so this was a reason to sell up. VEGA may also go that way, only time will tell, for now I'll just enjoy it as it is
> 
> 
> 
> 
> 
> 
> 
> .


Haven't seen you in a while. I am truly surprised you even bought a Vega, especially with the way AMD treated Fiji.

In case you don't know, the current Vega may be an "alpha Vega", per rumors from HardForum. There will be a Vega refresh next year with actually functional Primitive Shaders and TBR.

https://hardforum.com/threads/amd-plans-to-release-vega-refresh-in-2018.1949215/


----------



## u3a6

Quote:


> Originally Posted by *xkm1948*
> 
> Titan V is here folks, with 12GB of HBM2, 110 TFlops of computing power (9 times of TitanXp) Man the new line Volta based GPUs are coming. Time to finally upgrade


110 tensor FLOPS, not regular teraFLOPS, I think.


----------



## gupsterg

Quote:


> Originally Posted by *xkm1948*
> 
> Haven't seen you in a while. I am truly surprised you even bought a Vega. With the way AMD treated Fiji especially.
> 
> In case you don't know, current Vega may be an alpha vega per rumors from Hardforum. There will be a Vega refresh next year with actually functional Primitive Shadering and TBRS.
> 
> https://hardforum.com/threads/amd-plans-to-release-vega-refresh-in-2018.1949215/


Been busy with the Ryzen/Threadripper/Vega threads TBH, plus had a break. Yeah, the way Fiji support had gone, I wasn't going to get Vega, but I tried the GTX 1080 and, having no G-Sync monitor, it was a failed gaming experience. Purchasing a G-Sync monitor like for like with the model I currently have would have incurred at least a ~£200 premium.

The RX Vega 64 was ~£518; I got ~£11 cashback plus games worth ~£40 in my eyes, so it netted out to £467. The water block was ~£90. So I avoided the hassle of selling my ASUS MG279Q, and saved ~£110 versus paying the premium for an ASUS PG279Q, after deducting the cost of the Vega water block.

Like I said, once you've had variable refresh rate tech, I reckon it's hard to go back. So FreeSync was the hook that kept me on AMD, plus the AMD driver panel is way better than Nvidia's IMO.

Interesting thread; I've only read the 1st page so far, and at present I'm not convinced this is the case with Vega. Anyhow, we'll all know soon enough.


----------



## xkm1948

Quote:


> Originally Posted by *gupsterg*
> 
> Been busy with Ryzen/Threadripper/VEGA threads TBH, plus had a break. Yeah, the way Fiji support had gone I wasn't going to get VEGA, but I tried the GTX 1080 and, as I had no G-Sync monitor, it was a failed gaming experience. Purchasing a like-for-like G-Sync monitor to the one I currently have would have incurred at least a ~£200 premium.
> 
> RX VEGA 64 was ~£518; I got ~£11 cashback and games worth, in my eyes, ~£40, so it netted to £467. The water block I got for ~£90. So I ended up not having the hassle of selling my ASUS MG279Q, and saved ~£110 versus taking the premium for an ASUS PG279Q after deducting the water block for VEGA.
> 
> Like I said, once you've had variable refresh rate tech I reckon it's hard to go back. So FreeSync was the hook that kept me on AMD, plus the AMD driver panel is way better than nVidia's IMO.
> 
> Interesting thread; I've only read the 1st page so far, and at present I'm not convinced this is the case with VEGA. Anyhow, we'll all know soon enough.


To each his own, I guess. AMD is not getting another dime from me any time soon. Not with the crap they pulled with Fiji and the sh*t show of Vega.

Also, I think the AIO pump on my Fury X is on the way out, conveniently after Sapphire's 2-year warranty expired. Buying cards with an AIO and a short warranty seems to be a very bad idea.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xkm1948*
> 
> To each his own, I guess. AMD is not getting another dime from me any time soon. Not with the crap they pulled with Fiji and the sh*t show of Vega.
> 
> Also, I think the AIO pump on my Fury X is on the way out, conveniently after Sapphire's 2-year warranty expired. Buying cards with an AIO and a short warranty seems to be a very bad idea.


I got a Fury X after my 1080 Ti because I wanted to see for myself what the card was about, and I feel sorry for people that gave AMD $650 for the support they got. AMD gave more support to the cheap RX 480 cards. Nvidia is much better to their users at the high end; they make you feel special about spending big $.


----------



## xkm1948

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I got a Fury X after my 1080 Ti because I wanted to see for myself what the card was about, and I feel sorry for people that gave AMD $650 for the support they got. AMD gave more support to the cheap RX 480 cards. Nvidia is much better to their users at the high end; they make you feel special about spending big $.


Well, I don't need to feel "special". I just need some features to work, as well as optimizations for newer gaming titles.

AMD promised SteamVR Async Reprojection support at the start of 2017. They provided it for all of Polaris, yet there is still no sign of support for Fury/X.

Also, I wish I had only paid $650 for my Fury X. Factoring in the initial price jack-up, tax and shipping, I paid close to $720 out of pocket.

Things would have been so much better if I had gone 980 Ti.


----------



## Alastair

Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *B'Fish*
> 
> Hello people,
> 
> got a bit of a ****ty problem at this moment. Grabbed myself a 2nd-hand R9 Fury Nitro OC from Sapphire. The video card did great for like 2 months, but I tried to flash the BIOS once and bricked it; no problem, because I have 2 BIOSes on it anyway. But now it seems that the 2nd BIOS bricked itself after a few weeks of light usage.
> 
> Is it possible that a BIOS can get corrupted? We had a power surge in our home, because I woke up with the lights not working etc. The PC was off at this time and it started again without problems, but after I shut it down it got REKKED, I guess. The PC works on the iGPU; the video card will not boot from another PCI-Express slot etc. I really hope it isn't just dead..... :<
> 
> I want to try and reflash the BIOS, but is there a quick way to check if the GPU is really broken?
> 
> 
> 
> None that I know of.
> 
> If you use a 2nd GPU and ATiWinFlash does not detect the card for flashing, your only option is then to flash using an external tool.
> Quote:
> 
> 
> 
> Originally Posted by *Alastair*
> 
> Has anyone tried patching the ATIkmdag files using the patcher to try to unlock memory overclocking through Afterburner? I heard rumours this might give us our HBM OCs back.
> Quote:
> 
> 
> 
> Originally Posted by *u3a6*
> 
> That would be sweet!
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> Where is our resident HBM expert Gupsterg to the rescue? Gups where ya at?
> 
> I was on a break.
> 
> I wouldn't have thought this would have any effect. ToastyX's tool is for other purposes, as stated here. I no longer have a Fiji GPU, so I can't test.
> Quote:
> 
> 
> 
> Originally Posted by *xkm1948*
> 
> Gupsterg has quit modding the Fury X. He moved on to a Team Green 1080. With the way AMD is treating the Fury/X lineup, I would not invest in this arc either.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> As you can see, the Fury X was not limited by VRAM; the usage did not go over 4GB. Yet at 1440p the Fury X runs like ****. Mind you, this is an AMD-sponsored game!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Moving on from FuryX to green camp next round for sure.
> 
> I did get a GTX 1080. As I had no G-Sync monitor the experience was pants IMO without variable refresh rate tech. I sold it on within a month at no loss.
> 
> The comparison for "gaming experience" was an MSI GTX 1080 EK X vs a stock Fury X. Due to nVidia's boost tech, lower temps from being on water, and the higher PL in the GTX 1080's VBIOS from the factory, it consistently hit ~1975MHz GPU on its own. Even high-FPS games like SWBF seemed to have a hitch/stutter on the GTX 1080 at times, whereas the Fury X with FreeSync was a far superior gaming experience for me. Then the driver panel on nVidia was pants IMO; several features I use regularly, like FRTC, per-game profiles, etc., were not there.
> 
> I then purchased a GigaByte RX VEGA 64 AIR (inc. FOC Prey/Wolf.2) on promo and an EK-FC Radeon Vega block. I like VEGA so far; shame BIOS modding is blocked, so I'm using registry PowerPlay mods. Power usage is pants vs the GTX 1080 IMO, and as it uses more power it affects loop water temps more.
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> I have tested this game with a Fury X. I ran it at Medium 1440p and yes, you are limited by memory; I was getting huge FPS drops in complex scenes. AMD has completely abandoned Fury because it did not sell that well. The same thing will happen with Vega.
> Quote:
> 
> 
> 
> Originally Posted by *xkm1948*
> 
> Agreed. Vega is very likely to be abandoned down the line pretty soon. AMD's high end is really not worth it as of now.
> 
> I did feel AMD abandoned Fiji, so this was a reason to sell up. VEGA may also go that way; only time will tell. For now I'll just enjoy it as it is.

I'll test it when I get back home after the holidays. Until then, we will just have to wait.


----------



## Alastair

I am thinking next time I won't spend so much money on high-end cards. Maybe next time I'll just go for x80-class cards. Support for Fury hasn't been great, while Polaris has had a ton of support. And maybe with GDDR6, mid-range cards will be really nice in the coming days.

At this point I don't really want to support AMD for what they have done to their Fiji owners and, as we have discovered more recently, to RX 560 owners. And I don't want to support Nvidia because of GameWorks, drivers, the GTX 970, etc. We need a third player in the GPU market.

It's been downhill for Radeon ever since RTG split off from AMD. Lisa Su, please look upon us with your divine grace.

Edit no. 4, I think?: Can anyone tell me what a decent RX 580 scores compared to a Fury (non-X) these days? I've seen talk of 580s hitting 1400-1450MHz, and that sounds really good. And once you have done some timing mods and OCs to help get around the bandwidth bottleneck of Polaris, I am sure it would really fly; I'm pretty sure it would be nearly as fast as a Fury as well.


----------



## XR5777

Hello.

Having real trouble with the AMD block drivers right now. What a bunch of utter scumbags the AMD team really are. Absolutely disgusting tactics, as per usual.

Even a BIOS modification gets reverted: GPU-Z states the "Original Memory Speed" is 550MHz while the "Current Speed" is 500MHz.

There has to be a way of reverting from Wattman back to the old Overdrive API parameters, which I believe would allow HBM overclocking.

So just because AMD failed at Vega with its gimped bandwidth, they purposely disabled all HBM editing on Fiji.

We need our revenge on this scam of a company by reverse engineering the new API and allowing HBM overclocking for the dedicated Red Team.


----------



## Offler

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I got a Fury X after my 1080 Ti because I wanted to see for myself what the card was about, and I feel sorry for people that gave AMD $650 for the support they got. AMD gave more support to the cheap RX 480 cards. Nvidia is much better to their users at the high end; they make you feel special about spending big $.


Quote:


> Originally Posted by *xkm1948*
> 
> Well, I don't need to feel "special". I just need some features to work, as well as optimizations for newer gaming titles.
> 
> AMD promised SteamVR Async Reprojection support at the start of 2017. They provided it for all of Polaris, yet there is still no sign of support for Fury/X.
> 
> Also, I wish I had only paid $650 for my Fury X. Factoring in the initial price jack-up, tax and shipping, I paid close to $720 out of pocket.
> 
> Things would have been so much better if I had gone 980 Ti.


I purchased my Fury X a year ago for roughly 350 euro, and later paired it with a display with DP support. I still praise the effectiveness of the water cooler: even when I run it at quite high RPM, it's nowhere near as noisy as the R9 290X I had previously.

I was disappointed at the Fury X release when it did not have more than 64 ROPs, but AMD is still not moving forward in this regard and pixel/texel fill rates are not going up much. Nvidia has plenty of fill rate (so scores in 3DMark look cool), but I have to praise AMD for the computing power in the shaders. AMD could have pushed their raw 3D performance forward simply by adding more ROPs years ago, but it's still not happening.

When it comes to feature support, the only thing I was surprised by was the lack of support for HEVC Main10 in UVD. But it's true that many new features only came with the RX 4xx series, which is kind of a shame.

Regardless of all the above, I find the Fury X to be a great card: high performance, good cooling and (relatively) silent.


----------



## Alastair

Quote:


> Originally Posted by *XR5777*
> 
> Hello.
> 
> Having real trouble with the AMD block drivers right now. What a bunch of utter scumbags the AMD team really are. Absolutely disgusting tactics, as per usual.
> 
> Even a BIOS modification gets reverted: GPU-Z states the "Original Memory Speed" is 550MHz while the "Current Speed" is 500MHz.
> 
> There has to be a way of reverting from Wattman back to the old Overdrive API parameters, which I believe would allow HBM overclocking.
> 
> So just because AMD failed at Vega with its gimped bandwidth, they purposely disabled all HBM editing on Fiji.
> 
> We need our revenge on this scam of a company by reverse engineering the new API and allowing HBM overclocking for the dedicated Red Team.


Have you tried patching the driver with the ATIKMDAG patcher?


----------



## XR5777

Hello, and thank you for the quick reply; much appreciated.

Are you referring to the ToastyX patch?

https://www.monitortests.com/forum/Thread-AMD-ATI-Pixel-Clock-Patcher?page=83

If that's correct, yes indeed I have already installed the patch; unfortunately the driver still reverts the HBM back to 500MHz.

I'm already on that forum hoping for a reply.


----------



## Alastair

Yeah, that's the one. I saw that the drivers were blocking the BIOS mods on my miner's RX cards but, when patched, allowed the BIOS mods. I figured maybe it would be the same for the Furies too.


----------



## Minotaurtoo

From what I've read, simply installing one of the older 16.x drivers would do the trick as far as re-enabling BIOS-edited memory clocks goes.


----------



## Alastair

This is true, Minotaurtoo, but that would negate any optimisations for newer titles given to us by newer drivers. If you don't need new drivers for new titles, I would say go to 16.x; if not, maybe stick to the newer drivers. As much as I hate it, we weren't gaining all that much from HBM OCs anyway, were we?


----------



## Minotaurtoo

I gained about 3-5% across the games I play... funny thing is, every GPU reader (GPU-Z, etc.) still shows my memory at 545MHz, and I can't tell that it ever changed back to the 500MHz it started at... so I don't know if I'm just immune, or if it changed back and the optimizations made up for the difference.


----------



## xkm1948

Quote:


> Originally Posted by *Alastair*
> 
> This is true, Minotaurtoo, but that would negate any optimisations for newer titles given to us by newer drivers. If you don't need new drivers for new titles, I would say go to 16.x; if not, maybe stick to the newer drivers. As much as I hate it, we weren't gaining all that much from HBM OCs anyway, were we?


Newer drivers solve a lot of glitches in games. If I use 16.xx to play NieR: Automata, I get loads of crashing-to-desktop problems.


----------



## Ne01 OnnA

*AMD Radeon Software Adrenalin Edition Features:*

New: AMD Link (mobile APP)

*Performance Monitoring*

Monitor and track your PC gaming performance.
Bar graphs to monitor FPS and system info.

Notifications

Stay informed about Radeon Software
Receive real-time notifications when Radeon Software updates are released
Keep AMD Link up-to-date with the newest features and stability updates

*Introducing Radeon Overlay*

In-game control of Radeon Settings and PC performance monitoring (Press ALT+R to enable)
Greater insight for better gaming
FPS performance monitoring
Convenient built-in performance logging
Supported on DirectX 9, 11, 12 and Vulkan

*Radeon ReLive*

Manage, organize and upload your gaming memories
Share your moments on social media
Queue uploads to multiple social platforms
View and trim your video captures

*Enhanced Sync - What's new:*

*All Radeon GCN-based products*
Vulkan
Notebook products
Multi-GPU
AMD Eyefinity Technology

*Radeon Wattman*

Easily save and load your own profiles
Share your custom profiles
Load community-driven profiles

Frame Rate Target Control Now Supporting Vulkan

*Performance early numbers:*



Results are based on November 2017 internal testing of a preliminary driver on a Radeon RX 480 at 1920×1080 with the high preset, and may vary with use of the final driver.

THX goes to WhyCry.


----------



## FlawleZ

The fact that the gains listed for the Adrenalin edition are shown as negative numbers is very confusing to me.


----------



## Offler

If the driver offers anything near FPS profiling, that would be cool.

From time to time I have to argue with game developers that there is an engine bug causing an FPS drop, while they are unable to reproduce it. Then I have to send complicated and convoluted instructions on how to reproduce it and how to spot it.


----------



## XR5777

Hello, and thank you for the information; much appreciated.

It seems 500MHz produces micro-stutter. It's hard to explain, but I use a TV, so Vsync is a must-have, and without the slight increase to 550MHz it's never a solid 60fps experience.

Granted, you can install an older driver, but I'm forced to use these new drivers for new titles; as such, it's a shame.

I hand on heart didn't even know about these "block drivers", which were released nearly a year ago. I use my own custom API, but now I just can't figure out the bugs in new titles; I'm literally bashing my head against the wall here.

I cannot believe AMD, with such limited resources, would actually put money and time into writing such a deceiving API.

Hopefully ToastyX may reply over at their site with more info.

Thank you for your time, and I hope to speak soon.


----------



## Alastair

Does the Adrenalin edition driver's borderless-windowed-mode mGPU support extend to Fiji, or is it a case of only Polaris and Vega getting the shiny new features?


----------



## PontiacGTX

Quote:


> Originally Posted by *xkm1948*
> 
> Gupsterg has quit modding the Fury X. He moved on to a Team Green 1080. With the way AMD is treating the Fury/X lineup, I would not invest in this arc either.
> 
> As you can see, the Fury X was not limited by VRAM; the usage did not go over 4GB. Yet at 1440p the Fury X runs like ****. Mind you, this is an AMD-sponsored game!!
> 
> Moving on from the Fury X to the green camp next round, for sure.


Yes, you are right, the game was not VRAM-limited. It was a lack of optimization, fixed in patch 3.

Quote:


> Originally Posted by *kondziowy*
> 
> Why are you saying this and then proceeding to post the graph above? The Fury X is the fastest 4GB card, and the RX 570 4GB is much slower than the RX 470 8GB.
> The 980 Ti is beaten by prehistoric Hawaii, and is much slower than the 1070 (and the 1070 was supposed to bring the power of the 980 Ti, Jen-Hsun's own words).


The performance in the game was the worst possible; if you didn't try it, you probably don't know how bad the performance was.


----------



## miklkit

Quote:


> Originally Posted by *XR5777*
> 
> Hello, and thank you for the information; much appreciated.
> 
> It seems 500MHz produces micro-stutter. It's hard to explain, but I use a TV, so Vsync is a must-have, and without the slight increase to 550MHz it's never a solid 60fps experience.
> 
> Granted, you can install an older driver, but I'm forced to use these new drivers for new titles; as such, it's a shame.
> 
> I hand on heart didn't even know about these "block drivers", which were released nearly a year ago. I use my own custom API, but now I just can't figure out the bugs in new titles; I'm literally bashing my head against the wall here.
> 
> I cannot believe AMD, with such limited resources, would actually put money and time into writing such a deceiving API.
> 
> Hopefully ToastyX may reply over at their site with more info.
> 
> Thank you for your time, and I hope to speak soon.


500MHz produces micro-stutters? This is something that I have noticed, along with an overall drop in performance. So I just went back to the 16.12.2 drivers. Had to unplug the puter from the internet to keep windoze from reinstalling the 17.1.1 drivers.

I set the RAM to 550MHz in Afterburner, but it is still running at 500MHz. Is this Sapphire Fury locked, or do I need to go back further to get better drivers?


----------



## diggiddi

The new drivers helped out with my CrossFire issues in PCars, and I'm getting better FPS too. I wonder if BF4 CFX is fixed. Anyone know if HBM OC works?


----------



## diggiddi

Quote:


> Originally Posted by *FlawleZ*
> 
> The fact that the gains listed for the Adrenalin edition are shown as negative numbers is very confusing to me.


Not negative; it's a tilde sign: ~


----------



## Minotaurtoo

OK... I got Enhanced Sync working, and it's great... Chill seems to be working better... but I can't get a single overlay to come up. I even tried changing the toggle key combination, and tried 3 games; so far no overlay... any ideas?


----------



## Offler

I tried enhanced sync, but it does not seem to do anything extra compared to frame limiter + freesync...


----------



## 99belle99

Quote:


> Originally Posted by *Offler*
> 
> I tried enhanced sync, but it does not seem to do anything extra compared to frame limiter + *freesync*...


It's obviously designed for people who do not have FreeSync. What were you expecting it to do differently from a frame limiter plus FreeSync?


----------



## miklkit

Well, I tried everything I could think of and made things worse instead of better. This RAM runs at 500MHz only, no matter what it is set to. But the older drivers have delivered much better frame rates and smoothness.


----------



## xkm1948

Optimization and support for Fiji, IIRC, has died. Move on, folks.


----------



## miklkit

Are you going to buy me a Vega 64?


----------



## xkm1948

Quote:


> Originally Posted by *miklkit*
> 
> Are you going to buy me a Vega 64?


If you buy me a 1080Ti first


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> ok... I got enhanced sync working ... and it's great...... chill seems to be working better... but I can't get one single overlay to come up... even tried changing the toggle key combinations... tried 3 games and so far no overlay... any ideas


Yeah, I was having the same problem until I noticed there was a button to turn on the overlay in the menu.

Quote:


> Originally Posted by *xkm1948*
> 
> Optimization and support for Fiji iirc has died. Move on folks


Not true; I am getting more FPS in Project Cars, and CrossFire is working again.


----------



## xkm1948

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah I was having the same problem until I noticed there was a button to turn on overlay in the menu
> Not true, I am getting more FPS in Project Cars and crossfire is working again


One title.

Try showing performance improvements in more 2017-released game titles. For Prey I am actually seeing a slight performance regression in 17.12.1 versus 17.11.4.


----------



## diggiddi

Quote:


> Originally Posted by *xkm1948*
> 
> One title.
> 
> Try showing performance improvements in more 2017-released game titles. For Prey I am actually seeing a slight performance regression in 17.12.1 versus 17.11.4.


Quit shifting the goalposts, man. You claimed there were no improvements; I'm showing you otherwise. I will test the PCars 2 demo and let you know how that goes.


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah I was having the same problem until I noticed there was a button to turn on overlay in the menu
> Not true, I am getting more FPS in Project Cars and crossfire is working again


Everywhere I looked, it talked about toggle combinations... where is this other button to turn it on? I've looked through so many menus I've gone blind.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> every where I found it said about toggle combinations... where is this other button to turn it on? I've looked through so many menus I've gone blind


LOL, I feel your pain.
Performance Tab -> Metrics Options -> Show Metrics


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> LOL I feel your pain
> Performance Tab-> Metrics Options -> Show Metrics


OK... I have no Performance tab... not when I bring up Radeon Settings, anyway. Here's what failed: Global Settings > Performance Monitoring shows a bunch of buttons that are all enabled. Tried various right-clicks on the tray icon to no avail... guess I'm a complete fail at this.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> ok... I have no performance tab..... not when I bring up radeon settings anyway... heres what failed... global setting>performance monitoring>shows a bunch of buttons that are all enabled... tried various right clickys on the tray icon to no avail... guess I'm a complete fail at this


In game, press Alt+R; when the Overlay opens, use the routine I quoted earlier.


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> In game Alt R when the Overlay opens use routine I quoted earlier


That's the problem... Alt+R isn't working. I even tried changing Alt+R to something else, and nope... can't get any overlay or metrics.


----------



## diggiddi

Quote:


> Originally Posted by *Minotaurtoo*
> 
> that's the problem... alt R isn't working... even tried changing the alt R to something else and nope.... can't get any overlay or metrics


DDU and reinstall in safe mode, then try again.
PCars 2 looks like it's getting better FPS too.


----------



## Minotaurtoo

Quote:


> Originally Posted by *diggiddi*
> 
> DDU and reinstall in safe mode then try again
> Pcars 2 looks like its getting better fps too


I was afraid that would be the only solution... oh well. Thanks for the help.


----------



## PontiacGTX

Quote:


> Originally Posted by *xkm1948*
> 
> One title.
> 
> Try showing performance improvements in more 2017-released game titles. For Prey I am actually seeing a slight performance regression in 17.12.1 versus 17.11.4.


One single title, which should have been optimized a LONG time ago, given that Project Cars 2 is out now.


----------



## rbys

Anyone else having issues with the Fury/Fury X refusing to idle with the Radeon Adrenalin edition after exiting a game?


----------



## huzzug

Can you try restart64.exe from the CRU utility? That should reset the GPU drivers.


----------



## bluezone

From a YouTuber who is normally NSFW (language), so be forewarned. I watch this guy for the genuinely good technical breakdowns and the humor. Not very scientific, but for those interested in how thermal compound spreads under pressure with different application methods, here it is.

He also has a love of spoonerisms.


----------



## bluezone

Merry Christmas everyone.


----------



## bluezone

Hot-fix alpha driver (DX9 games) for all the doom, gloom and veiled-agenda types out there. /s

Radeon Software Adrenalin Edition 18.1.1 Alpha

https://support.amd.com/en-us/kb-articles/Pages/RSAE-18.1.1-Alpha-Release-Notes.aspx

Cheers


----------



## Offler

Been doing some cleanup in the system, and I went through the removal of old drivers.

Among the legacy drivers I found AODDriver 4.3, which is apparently better known as ATI/AMD Overdrive or the AMD Fuel service. The directory is dated August 2015. Is this even still active in current versions of the AMD drivers (17.12 atm), where it was replaced by Wattman?

Even when the AMD Fuel service is no longer present in the system, this legacy driver is still present and active. Could somebody please check in their legacy drivers if it's there?

Also, I am looking for anything which acts as a codec (*.ax?) and utilizes the HW capabilities of Fiji's UVD...


----------



## whitrzac

How good/bad is the stock cooling on the Pro Duo?

Are the VRMs actively cooled?


----------



## diggiddi

Quote:


> Originally Posted by *bluezone*
> 
> Hot fix Alpha driver (DX9 games) for all the doom, gloom and veiled agenda types out there./s
> 
> Radeon Software Adrenalin Edition 18.1.1 Alpha
> 
> https://support.amd.com/en-us/kb-articles/Pages/RSAE-18.1.1-Alpha-Release-Notes.aspx
> 
> Cheers


How are these drivers working?


----------



## bluezone

So far no problems. Benchmark scores seem OK, but I have not gotten around to testing any DX9 games. I should have some time tomorrow for that... I've been a little busy setting up some guitars.


----------



## PolluxCastor

Having issues overclocking: games do not make use of + voltage adjustments, rendering any OC value unstable.
OCCT throttles even with the max power target.

The latest driver release is installed, via a fresh DDU.


----------



## EastCoast

Plug the computer (along with the power strip, if previously used) into the receptacle next to the circuit breaker in your home and try again.


----------



## HagbardCeline

Sorry if this is a dumb question, but I have been having game-crashing issues with Star Wars Battlefront 2 since the last couple of AMD driver updates. On the AMD forums, it was suggested that I DDU in Safe Mode and install the new 18.1.1 beta drivers cleanly. However, whenever I try to install those drivers, it still installs the 17.12.1 drivers. I deleted all downloaded drivers from my computer except 18.1.1 and it still installed 17.12.1, which makes me suspect the installer is downloading instead of using the one on my hard drive.

Either way, the crashes have continued. GPU temps are hovering in the low 60s. The game crashes to desktop without any error codes or popups, though occasionally I get a popup telling me that my AMD display driver has stopped responding and recovered. Getting pretty frustrating. I have played 100s of hours in this game without any issues until very recently. It happened before and after the Battlefront 2 patches went live, so I don't believe it's related to that.

Anyway, is there a better way to install only the 18.1.1 drivers, or do I have to install them over the 17.12.1?

Thanks!

GPU: Sapphire R9 Fury X
CPU: i7 5930k
Ram: 64gb


----------



## Ne01 OnnA

^^ When installing them:

Open the installer as Admin -> then select the local 18.1.1 package (when the installer loads up).


----------



## Offler

Regarding 18.1.1, I found out that the "Detecting hardware" phase of driver installation may cause the system to crash. The reason is a conflict with my sound card; it appears that during this phase the driver does some operations within the APIC table. That alone may mess up the whole system or trigger some latent hardware conflicts.

In my case it was expecting an AMD SATA controller, which I don't need to use and have disabled, and it messed up the Creative X-Fi's resources.

But it's highly hardware-specific.


----------



## drm8627

Has anyone seen a Fury selling for a good price? I've been trying to find one to SLI with mine, but the prices, even on eBay, are insane.


----------



## huzzug

What card do you have, because if you're sure you want to *SLI* with your card, chances are it isn't going to work.


----------



## drm8627

Quote:


> Originally Posted by *huzzug*
> 
> What card do you have, because if you're sure you want to *SLI* with your card, chances are it isn't going to work.


Derp, I meant CrossFire. I have a Fury X. I've read it can CrossFire with any of the Fury lineup, just limited to the slower card's clock speeds.


----------



## huzzug

Hang in there tight, bruv. It's going to be difficult to score a card that falls within your budget.


----------



## drm8627

Quote:


> Originally Posted by *huzzug*
> 
> Hang in there tight, bruv. It's going to be difficult to score a card that falls within your budget.


I mean, I bought my Fury X a month ago for 250, lol. Then 5 months ago I bought a Fury for 190. I was offering 300 for a base Fury and people were acting like I kicked their dog.

(I had to trade the other Fury for a 480 for my girl's mITX rig; it didn't fit, and I had a different GPU at the time.)


----------



## miklkit

Quote:


> Originally Posted by *HagbardCeline*
> 
> Sorry if this is a dumb question, but I have been having game-crashing issues with Star Wars Battlefront 2 since the last couple of AMD driver updates. On the AMD forums, it was suggested that I DDU in Safe Mode and install the new 18.1.1 beta drivers cleanly. However, whenever I try to install those drivers, it still installs the 17.12.1 drivers. I deleted all downloaded drivers from my computer except 18.1.1 and it still installed 17.12.1, which makes me suspect the installer is downloading instead of using the one on my hard drive.
> 
> Either way, the crashes have continued. GPU temps are hovering in the low 60s. The game crashes to desktop without any error codes or popups, though occasionally I get a popup telling me that my AMD display driver has stopped responding and recovered. Getting pretty frustrating. I have played 100s of hours in this game without any issues until very recently. It happened before and after the Battlefront 2 patches went live, so I don't believe it's related to that.
> 
> Anyway, is there a better way to install only the 18.1.1 drivers, or do I have to install them over the 17.12.1?
> 
> Thanks!
> 
> GPU: Sapphire R9 Fury X
> CPU: i7 5930k
> Ram: 64gb


I have been having crashing issues with the 17xxx drivers too. So I DLed the 18xxx drivers, cleaned out the old drivers with AMDCleanupUtility, then deleted the AMD folder on the C: drive. Only then did I install the 18xxx drivers, and during the install, where it asks for an express or custom install, I chose custom. There I unchecked the AMD hi-def audio drivers. After the install was complete, I went into "System", then "Device Manager", and disabled all the AMD High Definition Audio devices. No problems at all after that.

In Cinebench the 17xxx drivers would not run at all for me, while the 18xxx drivers deliver about 14fps less than the 16xxx drivers.


----------



## PolluxCastor

Switching to team green once I get the funds, and I will most likely never look back.

Games crashing and "driver stopped working" 24/7 on the R9 Fury; it seems AMD only cares about their mainstream products and not their high end.


----------



## kondziowy

Quote:


> Originally Posted by *PolluxCastor*
> 
> Switching to team green once I get the funds and will most likely never look back.
> 
> Games crashing and driver stopped working 24/7 on R9 Fury, and it seems AMD only cares for their mainstream products and not their high end.


Is it doing that at stock clocks? You have your Fury at 1120MHz, which means you're running it balls to the wall. It won't run like that forever; after a year it probably degrades.


----------



## 99belle99

Quote:


> Originally Posted by *PolluxCastor*
> 
> Switching to team green once I get the funds and will most likely never look back.
> 
> Games crashing and driver stopped working 24/7 on R9 Fury, and it seems AMD only cares for their mainstream products and not their high end.


I never had any issues. I had an R9 290 overclocked to the limit for a few years without problems, and then I moved on to a Fury X. It was not an overclocker at all, so I just ran it at stock for two years or so without issues, and at stock it was perfectly fine for gaming at 1440p 144Hz with FreeSync. Not all modern games ran at 144Hz, but a lot did, and the ones that didn't still ran perfectly fine for my needs. I sold it recently in order to get a Vega 64, but prices just shot up and there's no way I'm paying over the odds for one, so I will wait it out; I have my Xbox One X to tide me over gaming-wise.


----------



## Offler

The latest driver (18.1.1) caused me some trouble, but it's all related to an already old issue with my sound card. Once installed, it works...


----------



## Ne01 OnnA

*New Tweaks*

Some new Tweaks for everyone (Yup Vega/290 etc. included)

Make reg file (e.g. Tweak.reg):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD]

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\AMDAnalytics]
"AnalyticsAccepted"="false"

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\Chill]
"ChillLevelDefault"=dword:00000002
"GlobalEnable"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\CN]
"UA_Enabled"=dword:00000001
"PreloadDelay"=dword:000000c8
"UnloadDelay"=dword:000000c8
"MemorySizeTreshold"=dword:000000c8
"StageTen_hide"="true"
"AllowPartners"="true"

[HKEY_LOCAL_MACHINE\SOFTWARE\AMDDVR]
"AMDDVR_Tracked"=dword:00000000
```
Side note: please make a backup of those keys in regedit first.
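If you'd rather generate the file than hand-type it, here's a minimal Python sketch that renders the same tweak file from a table. The key paths and values are taken from the post above; whether each value has any effect is the poster's claim, not verified documentation.

```python
# Sketch: generate Tweak.reg from a table of registry keys and values.
# Key paths/values come from the forum post above; their effects are
# the poster's claims, not verified documentation.

TWEAKS = {
    r"HKEY_LOCAL_MACHINE\SOFTWARE\AMD": {},
    r"HKEY_LOCAL_MACHINE\SOFTWARE\AMD\AMDAnalytics": {
        "AnalyticsAccepted": "false",
    },
    r"HKEY_LOCAL_MACHINE\SOFTWARE\AMD\Chill": {
        "ChillLevelDefault": 0x2,
        "GlobalEnable": 0x0,
    },
    r"HKEY_LOCAL_MACHINE\SOFTWARE\AMD\CN": {
        "UA_Enabled": 0x1,
        "PreloadDelay": 0xC8,
        "UnloadDelay": 0xC8,
        "MemorySizeTreshold": 0xC8,  # the typo is in the actual key name
        "StageTen_hide": "true",
        "AllowPartners": "true",
    },
    r"HKEY_LOCAL_MACHINE\SOFTWARE\AMDDVR": {
        "AMDDVR_Tracked": 0x0,
    },
}

def render_reg(tweaks):
    """Render a .reg body: ints become dword:XXXXXXXX, strings stay quoted."""
    lines = ["Windows Registry Editor Version 5.00", ""]
    for path, values in tweaks.items():
        lines.append(f"[{path}]")
        for name, value in values.items():
            if isinstance(value, int):
                lines.append(f'"{name}"=dword:{value:08x}')
            else:
                lines.append(f'"{name}"="{value}"')
        lines.append("")
    return "\n".join(lines)

if __name__ == "__main__":
    # regedit's "Version 5.00" format expects UTF-16; Python's "utf-16"
    # writes the BOM regedit looks for.
    with open("Tweak.reg", "w", encoding="utf-16") as f:
        f.write(render_reg(TWEAKS))
```

Run it, back up the same keys in regedit, then merge the generated Tweak.reg.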


----------



## Alastair

kondziowy said:


> > Originally Posted by *PolluxCastor*
> >
> > Switching to team green once I get the funds and will most likely never look back.
> >
> > Games crashing and driver stopped working 24/7 on R9 Fury, and it seems AMD only cares for their mainstream products and not their high end.
> 
> Is it doing that at stock clocks? You have your Fury at 1120MHz, which means you're running it balls to the wall. It won't run like that forever; after a year it probably degrades.


Well, mine have pretty much been running at 1150 with 3840 shaders since day one; I got them a little after release. They're still going.


----------



## Offler

Ne01 OnnA said:


> Some new Tweaks for everyone (Yup Vega/290 etc. included)
> 
> Make reg file (e.g. Tweak.reg):
> 
> Windows Registry Editor Version 5.00
> 
> [HKEY_LOCAL_MACHINE\SOFTWARE\AMD]
> 
> [HKEY_LOCAL_MACHINE\SOFTWARE\AMD\AMDAnalytics]
> "AnalyticsAccepted"="false"
> 
> [HKEY_LOCAL_MACHINE\SOFTWARE\AMD\Chill]
> "ChillLevelDefault"=dword:00000002
> "GlobalEnable"=dword:00000000
> 
> [HKEY_LOCAL_MACHINE\SOFTWARE\AMD\CN]
> "UA_Enabled"=dword:00000001
> "PreloadDelay"=dword:000000c8
> "UnloadDelay"=dword:000000c8
> "MemorySizeTreshold"=dword:000000c8
> "StageTen_hide"="true"
> "AllowPartners"="true"
> 
> [HKEY_LOCAL_MACHINE\SOFTWARE\AMDDVR]
> "AMDDVR_Tracked"=dword:00000000
> 
> Side note: please make a backup of those keys in regedit first.


May I ask for a short description of those features?


----------



## Ne01 OnnA

Offler said:


> May I ask for short description of those features?


This is all IMO; I could be wrong about some options. (I'm using these tweaks daily.)

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD]

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\AMDAnalytics]
"AnalyticsAccepted"="false"

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\Chill]
"ChillLevelDefault"=dword:00000002 -> 2 means app default
"GlobalEnable"=dword:00000000 -> if this is 1 you have global Chill for every game (yes, also for Forza H3 lol, but it might not work properly)

[HKEY_LOCAL_MACHINE\SOFTWARE\AMD\CN]
"UA_Enabled"=dword:00000001 (dunno)
"PreloadDelay"=dword:000000c8 (delay in ms; default was 300)
"UnloadDelay"=dword:000000c8
"MemorySizeTreshold"=dword:000000c8 (we don't need the clutter)
"StageTen_hide"="true" (yes, I'm not using DVR at all)
"AllowPartners"="true" (Bethesda?)

[HKEY_LOCAL_MACHINE\SOFTWARE\AMDDVR]
"AMDDVR_Tracked"=dword:00000000 (no, it's not tracked lol)


----------



## EastCoast

miklkit said:


> Quote: Originally Posted by *HagbardCeline*
> 
> Sorry if this is a dumb question, but I've been having game-crashing issues with Star Wars Battlefront 2 since the last couple of AMD driver updates. On the AMD forums, it was suggested that I run DDU in Safe Mode and install the new 18.1.1 beta drivers cleanly. However, whenever I try to install those drivers, it still installs the 17.12.1 drivers. I deleted all downloaded drivers from my computer except 18.1.1 and it still installed 17.12.1, which makes me suspect the installer is downloading instead of using the one on my hard drive.
> 
> Either way, the crashes have continued. GPU temps are hovering in the low 60s. The game crashes to desktop without any error codes or popups, though occasionally I get a popup telling me that my AMD display driver has stopped responding and recovered. It's getting pretty frustrating. I have played hundreds of hours of this game without any issues until very recently. It happened both before and after the Battlefront 2 patches went live, so I don't believe it's related to that.
> 
> Anyway, is there a better way to install only the 18.1.1 drivers, or do I have to install them over the 17.12.1?
> 
> Thanks!
> 
> GPU: Sapphire R9 Fury X
> CPU: i7 5930k
> Ram: 64gb
> 
> 
> I have been having crashing issues with the 17.x.x drivers too. So I downloaded the 18.x.x drivers, cleaned out the old drivers with AMDCleanupUtility, then deleted the AMD folder on the C: drive. Only then did I install the 18.x.x drivers, and during the install, where it asks for express or custom, I chose custom. There I unchecked the AMD HD audio drivers. After the install was complete, I went into "System" and then "Device Manager" and disabled all the AMD High Definition Audio devices. No problems at all after that.
> 
> In Cinebench the 17.x.x drivers would not run at all for me, while the 18.x.x drivers deliver about 14fps less than the 16.x.x ones did.


Windows is superseding your install. When you run DDU in Safe Mode, it is supposed to make a registry change that prevents Win10 from automatically installing drivers for your video card, so it's not clear whether you have the latest version of DDU. But it does work.
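For reference, the setting usually cited for stopping Windows from fetching drivers through Windows Update is the one below. I can't confirm this is exactly what the current DDU writes, so treat it as an assumption, and export a backup of the key in regedit before merging:

```reg
Windows Registry Editor Version 5.00

; Assumption: commonly cited value for disabling driver downloads via
; Windows Update (0 = don't search). DDU's exact changes may differ.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching]
"SearchOrderConfig"=dword:00000000
```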


----------



## Centauri

Anybody else have a seemingly software-related issue that causes extreme, completely erratic throttling of the GPU clocks and stuttering in games? (The tachometer goes full strobe-light on me.) I've been playing with drivers all day and can't nail it down.


----------



## Offler

Centauri said:


> Anybody else have a seemingly software-related issue that causes extreme, completely erratic throttling of the GPU clocks and stuttering in games? (The tachometer goes full strobe-light on me.) I've been playing with drivers all day and can't nail it down.


Have you tried to set "power" slider to maximum?


----------



## Centauri

Offler said:


> Have you tried to set "power" slider to maximum?


In Afterburner? Yeah, it doesn't seem to be having any effect.

Card was running fine until... Something


----------



## Offler

Centauri said:


> In Afterburner? Yeah, it doesn't seem to be having any effect.
> 
> Card was running fine until... Something



I mean in Radeon Settings > Global Settings > WattMan. Power limit +50%.

And yes, under some circumstances it resets back to the original value.


----------



## Centauri

Yeah, she's set to max.

My GPU usage graph from Afterburner looks like a seismograph readout during games too, and it's nothing to do with voltages, clocks, or CrossFire.


----------



## Centauri

See attachment


----------



## Offler

In most cases the GPU can sleep when playing at 1080p, and GPU utilization looks like yours. At 4K, it's fully utilized.


----------



## miklkit

Yeah, that looks like an old DX8 or DX9 game.


----------



## Centauri

It's World of Warships in this case - DX11. GTA V is also giving me the same problems.

But I just cracked out Doom (w/ Vulkan) and it's smooth as butter. What should I look into as a problem?


----------



## miklkit

One thing you could try is to turn off vsync and then limit max fps in MSI Afterburner's RivaTuner section like I do. I set it to 150fps, which smooths out the loads but doesn't put too high a load on the GPU.


----------



## NightAntilli

Are you sure you're not set in mining mode? I've tried gaming in mining mode and it's a mess. So I generally have to switch the profile every time I want to play a game. Rig is mining for the rest of the time.


----------



## Centauri

Wasn't mining mode either.

Did a Windows restore and it resolved everything.


----------



## LazarusIV

Hey everyone! I just grabbed this monitor: 
https://www.amazon.com/dp/B06Y4TQSK1/_encoding=UTF8?coliid=I24Y3CMVZSSJL&colid=WFQQ29SBQ91A&psc=0
and I'm wondering which driver I should use with my R9 Fury Nitro... I heard 18.2.1 was pretty good, but the later ones were somewhat unstable with Furys. Thoughts?


----------



## BIGTom

So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now. 

I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.

Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything. 

Really can’t afford a new GPU with the current prices..

Thanks


----------



## ZealotKi11er

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks



Probably try to fix the current cooler. Maybe create a fill port on the RAD.


----------



## NightAntilli

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks


All aftermarket coolers are EOL as far as I know... Maybe you can try finding a used one on Ebay. Other than that, I think you're out of options.


----------



## Offler

According to one guide, it's possible to connect the existing cooler to another water loop; you just disconnect the pump on the card.


----------



## Ne01 OnnA

ZealotKi11er said:


> Probably try to fix the current cooler. Maybe create a fill port on the RAD.


With a syringe it can be done IMO; you can always seal the fill hole afterwards with some solder/TIM.
Just find some good coolant and fill up to ~95%, leaving some space for the pump.

And you'll end up with an even cooler Fury X.


----------



## SavantStrike

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks


Full cover blocks are still readily available. If that's not an option, then cannibalize an AIO and splice in the tubes from the card's pump onwards.


----------



## ht_addict

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks


I would go with a full water block on it. They can still be found online, and it can be the beginning of a custom loop. Where are you located?


----------



## microchidism

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks


Here's a page where someone changed the tubes on an AIO; you can use it as a guide:

http://www.overclock.net/t/990111/official-antec-k-hler-h2o-620-920-owners-club/4020#post_19653870


----------



## josephimports

BIGTom said:


> So my Fury-X started leaking coolant from the hoses. Pretty sure it’s lost most of the coolant now.
> 
> I emailed XFX to ask if they could repair it, but I am not expecting them to be able to.
> 
> Does anyone know if there are any aftermarket coolers that are compatible with Fury-X? I’ve searched quite a bit and have not been able to find anything.
> 
> Really can’t afford a new GPU with the current prices..
> 
> Thanks


Raijintek offers a compatible air cooler. XFX replaced my Fury X last year due to a functional but very loud pump. They replaced it with a new card in retail box. GL getting it sorted out.

http://www.raijintek.com/en/products_detail.php?ProductID=46

https://www.newegg.com/Product/Product.aspx?Item=9SIA66Z28H1378

https://imgur.com/a/mZNVY#oc6uwUU

https://www.reddit.com/r/Amd/comments/70nbuv/raijintek_morpheus_ii_on_xfx_r9_fury_with_2x


----------



## LazarusIV

LazarusIV said:


> Hey everyone! I just grabbed this monitor:
> https://www.amazon.com/dp/B06Y4TQSK1/_encoding=UTF8?coliid=I24Y3CMVZSSJL&colid=WFQQ29SBQ91A&psc=0
> and I'm wondering which driver I should use with my R9 Fury Nitro... I heard 18.2.1 was pretty good, but the later ones were somewhat unstable with Furys. Thoughts?


Anyone have any recommendations for which driver to use? Thanks!


----------



## Intuit

*I can't get this card to stream and play games*

Was wondering if anyone had a screenshot of some stable overclock frequencies and voltages. I'm trying to stream, and any time I start it up my visuals just start to crap themselves, my stream is laggy, and my game stutters. Once I load into a game my visuals are fine, but the stream doesn't run smoothly. I'm running an FX-8370 @ 4.1GHz with 16GB of RAM on an Asus M5A99X EVO R2.0 with an XFX R9 Fury. Everything works fine until I start the stream; then I have the lag, and even after I turn the stream off it still acts up until I restart the computer. Any help/info would be appreciated. It's hard to believe this card can't stream and play some games.


----------



## BIGTom

josephimports said:


> Raijintek offers a compatible air cooler. XFX replaced my Fury X last year due to a functional but very loud pump. They replaced it with a new card in retail box. GL getting it sorted out.
> 
> http://www.raijintek.com/en/products_detail.php?ProductID=46
> 
> https://www.newegg.com/Product/Product.aspx?Item=9SIA66Z28H1378
> 
> https://imgur.com/a/mZNVY#oc6uwUU
> 
> https://www.reddit.com/r/Amd/comments/70nbuv/raijintek_morpheus_ii_on_xfx_r9_fury_with_2x


Awesome thanks!


----------



## webhito

Anyone else all of a sudden have their tachometer and core speed stuck at 100% while idle?


----------



## Minotaurtoo

webhito said:


> Anyone else all of a sudden have their tachometer and core speed stuck at 100% while idle?


Yeah, I don't know what causes it, but it's random and a restart will fix it... at first I thought I had a virus using my GPU, but a quick look at power draw dismissed that theory... tell you what though, it helps with your PassMark score: normally the core clock never rises high enough in the 2D bench part, but if it sticks at top clocks you'll get a better score : )


----------



## webhito

Minotaurtoo said:


> Yeah, I don't know what causes it, but it's random and a restart will fix it... at first I thought I had a virus using my GPU, but a quick look at power draw dismissed that theory... tell you what though, it helps with your PassMark score: normally the core clock never rises high enough in the 2D bench part, but if it sticks at top clocks you'll get a better score : )


Thanks for the quick reply! Was not sure if my card was going nuts and needed a replacement.


----------



## Minotaurtoo

webhito said:


> Thanks for the quick reply! Was not sure if my card was going nuts and needed a replacement.


I think it's a driver glitch... not sure really


----------



## webhito

Minotaurtoo said:


> I think it's a driver glitch... not sure really


Either that or a Windows update; I had the same thing happen right after uninstalling the driver with DDU.


----------



## xkm1948

webhito said:


> Anyone else all of a sudden have their tachometer and core speed stuck at 100% while idle?


This usually happens to me when I use Firefox to watch YouTube. The UVD is broken in AMD's drivers. Disabling the hardware video acceleration in Firefox will fix it.


----------



## webhito

xkm1948 said:


> This usually happens to me when I use Firefox to watch YouTube. The UVD is broken in AMD's drivers. Disabling the hardware video acceleration in Firefox will fix it.


Hmm, on my side it can happen just being on the desktop without even opening the browser, but yes, I do use Firefox.

One thing though: I completely uninstalled the driver with DDU, updated to the latest patch offered by Windows (which is 1803, I believe), and just went with the driver offered by Microsoft, and the issue seems to be gone, at least for now. I'll update if it comes back again.


----------



## webhito

Did a fresh install and let Windows update to its own driver. For some reason, though, I have another problem: randomly when browsing it locks up and I lose video. It stays frozen for a while, and only once has it restarted to give me a bugcheck error. Not sure if it's defective or related to BIOS settings...

Mind you, this happens with the latest driver, whether from AMD or Windows.


----------



## miklkit

Since I updated to the latest drivers youtube videos lock up on me randomly. Maybe it is the same issue.


----------



## webhito

miklkit said:


> Since I updated to the latest drivers youtube videos lock up on me randomly. Maybe it is the same issue.


Yeah, mine happens with any driver whatsoever; it's just a matter of time before it crashes. It happens watching movies, browsing the web, or just idling on the desktop. Looks like I got a lemon.


----------



## miklkit

Mine is a little different. The video will lock up in the first minute or so, but after fiddling to get it going again it is good for many hours. Games never have any problems.


----------



## Minotaurtoo

I heard a rumor that there might be someone on here that has a waterblock for one of these cards for pretty cheap : ) I have a Fury X that might be needing a block soon... I just now noticed it because I was able to take many of the fans I used to have out of my rig, thanks to Ryzen running cooler and just not needing all those fans... and in doing so I noticed that my pump is slightly audible when revved up... just a dull hum... but since I have a custom loop in my PC I thought... hmm, why not just get a block for it... anyway, if you have one you want to get rid of cheap, PM me.


----------



## Starbomba

Minotaurtoo said:


> I heard a rumor that there might be someone on here that has a waterblock for one of these cards for pretty cheap : ) I have a fury x that might be needing a block soon... I just now noticed it because I was able to get many of the fans I used to have out of my rig thanks to ryzen running cooler and just not needing all those fans... and in doing so I noticed that my pump seems to be slightly audible when revved up... was just a dull hum... but since I have a custom loop in my pc I thought...hmm why not just get a block for it... anyway, if you have one you want to get rid of cheap, pm me.


I'm on the other side of your dilemma: I have a spare block but no card to use it on xD


----------



## Minotaurtoo

Starbomba said:


> I am in the other side of your dilemma, i have a spare block but no card to use xD


The bad part is that I don't really know if I need one or not... I can just hear the pump over my D5 when under extreme load... but I do have a custom loop for the CPU already, so that's what made me even ask... at the moment though, I'm a bit on the broke side... orphaned kittens and my son's wisdom teeth hitting at the same time... a little short on time and cash, so this has been put on hold for the moment... if you decide to sell, let me know : )


----------



## EastCoast

Can anyone with a Fury/Fury X get your normal 3D base clocks using WattMan alone? I'm not sure if it's 1020/1050 or something else, but for some reason I'm not seeing that with this version of Crimson, only the 800ish range. MSI AB works fine, no problem; with WattMan I cannot get it to clock to base clock even at 100% utilization.

I don't update drivers often, so my last driver was from a few months ago. Everything was working fine then.

Edit:
It's fixed.

For anyone else: make sure you reset MSI AB using that arrow-circle icon, then reset settings in WattMan. Make sure you leave Temperature at 85; lowering it lowers your clock rate in WattMan. Lowering it in MSI AB lowers it in WattMan even if it doesn't show it, so when you switch from MSI AB to WattMan, your clock rate will be lowered based on whatever temp you used.

For WattMan, just crank up the minimum fan speed to keep your GPU cool.


----------



## asdkj1740

Just got a Fury X recently, and it crashes like hell during gaming (Battlefield 3, PUBG, and Shadow of War).
Running FurMark / watching YouTube videos is all fine, but in gaming it crashes after 5 minutes.
No OC at all. Temps reported by HWiNFO64 are all fine too.
I am using dual 2000rpm fans in push-pull on the rad; the two fans are connected by a Y-cable and powered from the Fury X fan header on its PCB (via a converter).

Tried the latest 18.5.2 and the older 18.5.1 (reinstalled via DDU); same crashing.

Any suggestion on an older driver with better stability for the Fury X?

Thank you.


----------



## EastCoast

asdkj1740 said:


> Just got a Fury X recently, and it crashes like hell during gaming (Battlefield 3, PUBG, and Shadow of War).
> Running FurMark / watching YouTube videos is all fine, but in gaming it crashes after 5 minutes.
> No OC at all. Temps reported by HWiNFO64 are all fine too.
> I am using dual 2000rpm fans in push-pull on the rad; the two fans are connected by a Y-cable and powered from the Fury X fan header on its PCB (via a converter).
> 
> Tried the latest 18.5.2 and the older 18.5.1 (reinstalled via DDU); same crashing.
> 
> Any suggestion on an older driver with better stability for the Fury X?
> 
> Thank you.


Need a bit more details:
Fury X bought new or used?

OS (32 or 64-bit)?
Fury X water cooled stock or DIY?
Do you have any other apps installed, like MSI AB, Trixx, etc.?


----------



## asdkj1740

EastCoast said:


> Need a bit more details:
> Fury X bought new or used?
> 
> OS (32 or 64-bit)?
> Fury X water cooled stock or DIY?
> Do you have any other apps installed, like MSI AB, Trixx, etc.?


A used one.
Stock AIO cooling; I just replaced the stock fan with two iPPC 2000rpm Noctua F12 fans and power them from the Fury X PCB.

Win10 64, using MSI AB and HWiNFO64 only.

Using an 850W PSU; it used to power a 300W GTX 1070 (and a 200W GTX 970 & 760 as well) and no problem happened at all.


----------



## EastCoast

asdkj1740 said:


> A used one.
> Stock AIO cooling; I just replaced the stock fan with two iPPC 2000rpm Noctua F12 fans and power them from the Fury X PCB.
> 
> Win10 64, using MSI AB and HWiNFO64 only.
> 
> Using an 850W PSU; it used to power a 300W GTX 1070 (and a 200W GTX 970 & 760 as well) and no problem happened at all.



Do you get any error popup screens? If so, what do they say?

I would suggest that you start MSI AB and hit the reset button. If you are using the default skin, it's written out; with the other skins it should be the reset icon (the arrow-circle thingy). Exit out of AB, then start Radeon Settings. Go to Global Graphics and click on the word Reset. Reset your cache. Then go to WattMan and reset that as well.

Reboot.
Play a game.
Hit Left Ctrl and O while in the game to bring up the OSD.
Monitor your temps and frame rates.

See if that helps. But I would like to know if you are getting an error message.


----------



## asdkj1740

EastCoast said:


> Do you get any error popup screens? If so, what do they say?
> 
> I would suggest that you start MSI AB and hit the reset button. If you are using the default skin, it's written out; with the other skins it should be the reset icon (the arrow-circle thingy). Exit out of AB, then start Radeon Settings. Go to Global Graphics and click on the word Reset. Reset your cache. Then go to WattMan and reset that as well.
> 
> Reboot.
> Play a game.
> Hit Left Ctrl and O while in the game to bring up the OSD.
> Monitor your temps and frame rates.
> 
> See if that helps. But I would like to know if you are getting an error message.


I tried resetting them; it doesn't help.

For BF3, a DirectX error is shown stating the GPU was removed (implying the driver crashed).
For Shadow of War, no error message.
For PUBG, an error screen is shown asking me to report, with no meaningful info in it though.


----------



## neojack

I would try to unplug all my SATA / M.2 drives and USB devices except keyboard and mouse, install Windows on another drive with updates, chipset drivers, and graphics drivers,

then launch Steam and try a few games.

Also, the "Above 4G Decoding" switch in the BIOS may help. Or not.
You can also try another PCI-Express slot (check your MB manual for a 16x or 8x slot).

There is the possibility that this unit has mined for years and is degraded...


----------



## EastCoast

asdkj1740 said:


> I tried resetting them; it doesn't help.
> 
> For BF3, a DirectX error is shown stating the GPU was removed (implying the driver crashed).
> For Shadow of War, no error message.
> For PUBG, an error screen is shown asking me to report, with no meaningful info in it though.



Using HWiNFO (sensors only), post a pic of all the info regarding the GPU. You can use the Snipping Tool in Win10 to capture just that portion.


----------



## Minotaurtoo

neojack said:


> I would try to unplug all my SATA / M.2 drives and USB devices except keyboard and mouse, install Windows on another drive with updates, chipset drivers, and graphics drivers,
> 
> then launch Steam and try a few games.
> 
> 
> Also, the "Above 4G Decoding" switch in the BIOS may help. Or not.
> You can also try another PCI-Express slot (check your MB manual for a 16x or 8x slot).
> 
> 
> There is the possibility that this unit has mined for years and is degraded...


I was thinking either an unstable OC that might have been saved in the BIOS by the original user, or degradation... you may be onto something with the idea of it being a mining card, given how many of these were used for that.

If that's the case, dropping the clocks 50MHz or so might solve it... but seeing HWiNFO would help.


----------



## Terve

Hi,

does somebody know the thickness of the original thermal pads? I'm considering switching back to air cooling (as I don't have time for water cooling anymore...), but I lost the original pads...

Thank you!


----------



## Terve

Hi,

I just bought some 1mm pads for my XFX R9 Nano and it looks like it's working fine. I just realized that I have around ~95°C on the VRM during benchmarking (FurMark). I never checked the original temps - is that normal or way too high?

Thanks!

Sebastian


----------



## Ne01 OnnA

It's high; the max allowed is 95-120°C.
Try to cool it with an additional fan directed at the back of the GPU.

Hope that helped you, Bratan'.

On my new Monster I have to watch the HotSpot, which is always a little hotter than the chip itself.
Hotspot is ~65-68°C.
Chip is ~55-60°C.

Maybe you have some hotspot in the Nano also?


----------



## Terve

Thank you! Well, looks like it's not good, but at least it seems the R9 won't catch fire soon :-D

I just ordered some Fujipoly pads and hope this will help cool the VRM down a bit more. I will also try to cool the back a bit more - but that won't be easy, because there is not much space above the Nano due to the Noctua NH-D15 :-D

Could there maybe be an issue with the thickness of the thermal pad? While I was re-mounting the stock cooler it looked to me like the 0.5mm pads were too thin to make contact, so I decided to go with some cheap 1mm ones.

What do you mean exactly by "hotspot"?


----------



## Ne01 OnnA

Terve said:


> Thank you! Well, looks like it's not good, but at least it seems that the R9 won't start burn soon :-D
> 
> I just ordered some Fujipoly pad's and hope this will help to cool the VRM a bit more down. I will also try to cool the back a bit more - but that won't be that easy, because there is not much space above the nano due to the Noctua NH-D15 :-D
> 
> Could there maybe an issue with the thickness of the thermal pad? While I was re-mounting the stock cooler it looked to me like the 0.5mm pad's are to thin to make contact, so I decided to go with some cheap 1mm.
> 
> What do you mean exactly by "hotspot"?


Here's a screen from HWiNFO (look closely at the GPU side and compare some values).
AC: O gaming session.


----------



## M3TAl

Has anyone found a way to mount an EK VGA Supremacy to an R9 Fury? The hole spacing seems to be 64x64mm or 65x65mm and EK's largest bracket is 61x61mm.

Edit: looked far and wide for any universal block that will fit these cards and found nothing, except the Alphacool NexXxoS GPX - ATI R9 Fury M04. Ordered one off eBay.


----------



## Garwinski

Ne01 OnnA said:


> It's high; the max allowed is 95-120°C.
> Try to cool it with an additional fan directed at the back of the GPU.
> 
> Hope that helped you, Bratan'.
> 
> On my new Monster I have to watch the HotSpot, which is always a little hotter than the chip itself.
> Hotspot is ~65-68°C.
> Chip is ~55-60°C.
> 
> Maybe you have some hotspot in the Nano also?


120 degrees on the VRM? Isn't that way too high? I think 95 is its absolute limit. Do we have any official numbers on this?


----------



## M3TAl

Garwinski said:


> 120 degrees on the VRM? Isn't that way too high? I think 95 is its absolute limit. Do we have any official numbers on this?


VRM max temp probably depends on the specific MOSFET chips used. A lot of motherboards throttle or shut off at 125°C on the VRM; for GPUs I'm not sure.


----------



## ZealotKi11er

Here is my Fury X. The soft plastic coating was coming off, so I removed it.


----------



## Offler

xkm1948 said:


> https://www.youtube.com/watch?v=m6mQa6LOoPE
> 
> FuryX, the most neglected flagship GPU put out by AMD. FineWine? More like FineVinegar.
> 
> Would never touch another AMD GPU, got burned hard this round.


Going to react to this rather old post...

In the meantime, driver 17.12 bumped 3DMark graphics scores by 30 percent and the upgrade was measurable in games. On the "I should have purchased a 980 Ti"... well...
https://www.3dmark.com/compare/fs/16285602/fs/14680042/fs/13775565

My Fury X is still at stock frequency; however, I cannot tell if the 980 Ti is OC'd or not.


Btw, my Fury X is now paired not with a Phenom II but with a Threadripper 1900X:
https://www.3dmark.com/compare/fs/16285602/fs/14680042

As you can see, the graphics score was not limited by the older CPU. The CPU score difference was expected, and the combined score as well. I don't see any issues with the performance of the card, even in comparison with Nvidia, whatsoever.


Edit:
If you have a Fractal Design Node 804 case, you can mount the radiator in the right chamber while the card itself passes through the central partition and into the slot. The tubes flex a bit, but it fits.


----------



## Offler

Hello guys. Has anyone here tried Crossfire with 2x R9 Nano?

As mining finally seems to be declining, some R9 Nanos are becoming available (which means the cards might be in bad shape), but compared to the price of new 1080s and Vega 64s it seems to be a viable Crossfire option, with a very interesting performance-per-watt ratio.

Initial calculations show that it might be possible to run 2x Nanos plus a Threadripper 1900 on an (old but decent) 650-watt Corsair PSU, for the same or lower purchase price. Also... Crossfire on anything sub-1000W would be interesting.
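
My back-of-the-envelope check looks roughly like this; all the wattages are ballpark assumptions for illustration (Nano at ~150W stock vs ~225W with a maxed power limit, Threadripper 1900X at its 180W TDP, ~75W for the rest), not measurements:


Code:


```python
# Rough PSU headroom check for 2x R9 Nano + Threadripper 1900X.
# All figures are assumptions for illustration, not measurements.
PSU_W = 650
HEADROOM = 0.80  # stay at ~80% of rated output for comfort

def total_draw(gpu_w: int, n_gpus: int = 2, cpu_w: int = 180, rest_w: int = 75) -> int:
    """Sum worst-case draw for the GPUs, CPU and the rest of the system."""
    return gpu_w * n_gpus + cpu_w + rest_w

stock = total_draw(150)  # Nanos at stock power limit
maxed = total_draw(225)  # Nanos with power limit maxed out
budget = PSU_W * HEADROOM

print(f"stock: {stock} W, maxed: {maxed} W, comfortable budget: {budget:.0f} W")
# -> stock: 555 W, maxed: 705 W, comfortable budget: 520 W
```

So at stock power limits it should fit under the 650W unit; with both power limits maxed out it would not.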


----------



## Garwinski

Offler said:


> Hello guys. Has anyone here tried Crossfire with 2x R9 Nano?
> 
> As mining finally seems to be declining, some R9 Nanos are becoming available (which means the cards might be in bad shape), but compared to the price of new 1080s and Vega 64s it seems to be a viable Crossfire option, with a very interesting performance-per-watt ratio.
> 
> Initial calculations show that it might be possible to run 2x Nanos plus a Threadripper 1900 on an (old but decent) 650-watt Corsair PSU, for the same or lower purchase price. Also... Crossfire on anything sub-1000W would be interesting.


With that PSU it could be hit or miss, I think. I had a Fury Nano for a while, and yes, it used 150 watts... but then it downclocks heavily under load as well, unless you max out the power limit, which takes the usage to 225 watts. Again, it could be possible, but a bit on the edge I think. 

Also, of course, it depends on what games you play most. A lot of games these days don't support Crossfire. But if you play a select number of games often and they _do_ support Crossfire, then a softly-priced extra card might be tempting. I have a Fury X now, and before it I never cared much about the noise a card makes, but damn, the Fury X is barely audible even under max load with a more aggressive fan curve than the 'stock' fan profile. I would love to have a second Fury X one day. 

On the other hand, if in the future prices for a second-hand Fury X drop to around €200 and it becomes an interesting investment for me, I could also sell my Fury X and buy a Vega 56, which these days I would much prefer over two cards. When I bought my 7990 in 2013 or so, many games supported Crossfire. Today, not so much. Maybe we will see a resurgence in dual-GPU support when major engine developers add support for it in their Vulkan/DirectX 12 rendering pipelines?


----------



## Offler

This is kinda strange.

I got a new PSU (Seasonic Prime Ultra 850 Titanium), as I wanted to be sure the Threadripper is "well fed" and the PC would burn fewer watts compared to the old one.

When I was doing tests with MSI Kombustor, I noticed that there is no difference between power settings of 100% and 150% in terms of watts consumed. Assuming that Radeon Settings might be bugged, I set it to -50%, noticing FPS and power consumption went down, then I set it to -20%... and here it started to get strange...

FPS went from 340 to 360, even though the GPU frequency was not 1050MHz but 950MHz... After playing around I settled on -19% as the highest-FPS setting according to MSI Kombustor. I ran 3DMark Firestrike and there was also a small improvement of a few FPS, while, as you would expect, the power draw was significantly lower.

MSI Kombustor: 430W
(at both 100% and 150% power limit)
Tuned for max FPS at -19%: 377W

Doesn't make much sense, really. I would not expect an increase in FPS unless the GPU is throttling, and according to measurements it's not.
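
Whatever the cause, the efficiency swing from those figures is large; a quick perf-per-watt comparison using the numbers quoted above:


Code:


```python
# Perf-per-watt from the Kombustor figures quoted above: (FPS, watts).
runs = {
    "100%/150% power limit": (340, 430),
    "-19% power limit": (360, 377),
}
for name, (fps, watts) in runs.items():
    print(f"{name}: {fps / watts:.3f} FPS/W")
# The -19% setting is roughly 20% more FPS per watt,
# while also drawing about 12% less power overall.
```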


----------



## Minotaurtoo

Furys are strange beasts.... My Fury X does its level best at 1075MHz core / 545MHz RAM at stock voltages... anything more craps out, and any more voltage and performance goes down even if I can clock to 1200MHz... but for some reason magic happens at those clocks. On UserBenchmark it ranks in the 98-100th percentile most of the time. http://www.userbenchmark.com/UserRun/11139880 



There are those who say the memory overclock is disabled, but it does make some kind of difference in the benchmarks... I did all this with a basic BIOS mod


I'd be curious to see your UserBenchmark results to see how yours compares with your settings... you may have me beaten


----------



## Offler

Minotaurtoo said:


> Furys are strange beasts.... My Fury X does its level best at 1075MHz core / 545MHz RAM at stock voltages... anything more craps out, and any more voltage and performance goes down even if I can clock to 1200MHz... but for some reason magic happens at those clocks. On UserBenchmark it ranks in the 98-100th percentile most of the time. http://www.userbenchmark.com/UserRun/11139880
> 
> 
> 
> There are those who say the memory overclock is disabled, but it does make some kind of difference in the benchmarks... I did all this with a basic BIOS mod
> 
> 
> I'd be curious to see your UserBenchmark results to see how yours compares with your settings... you may have me beaten


http://www.userbenchmark.com/UserRun/11171094

Mine does not overclock at all. Also, the results on the NVMe drive seem to be wrong (too low).


----------



## Minotaurtoo

Offler said:


> http://www.userbenchmark.com/UserRun/11171094
> 
> Mine does not overclock at all. Also, the results on the NVMe drive seem to be wrong (too low).



yours is not far below average, way faster than mine.... my drive is the 970 Evo 250GB; it doesn't keep up with the Pro 512 at all... but I couldn't afford the Pro so I went with the cheaper Evo edition, and most of them score around 250% or so... your RAM is a dang sight better than mine too, until you get to the latency... I suppose the quad-channel setup has to do with that.


----------



## Offler

Minotaurtoo said:


> yours is not far below average, way faster than mine.... my drive is the 970 Evo 250GB; it doesn't keep up with the Pro 512 at all... but I couldn't afford the Pro so I went with the cheaper Evo edition, and most of them score around 250% or so... your RAM is a dang sight better than mine too, until you get to the latency... I suppose the quad-channel setup has to do with that.


I figured out what was going on with the NVMe drive. I reinstalled Samsung Magician and ran the test in it, and IOPS went wayyy down from 300,000 to 100,000. Played a bit with the configuration. Then, and for the very first time, I agreed with the people on this forum who claim that HPET may cause trouble. So I used bcdedit and disabled useplatformclock and useplatformtick. Samsung Magician then reported 300,000+ IOPS.

The new scores are here:
http://www.userbenchmark.com/UserRun/11172145


----------



## Minotaurtoo

Offler said:


> I figured out what was going on with the NVMe drive. I reinstalled Samsung Magician and ran the test in it, and IOPS went wayyy down from 300,000 to 100,000. Played a bit with the configuration. Then, and for the very first time, I agreed with the people on this forum who claim that HPET may cause trouble. So I used bcdedit and disabled useplatformclock and useplatformtick. Samsung Magician then reported 300,000+ IOPS.
> 
> The new scores are here:
> http://www.userbenchmark.com/UserRun/11172145


that's much better... guess I need to check mine too, it may be getting hit some.... but still, I don't think mine will ever keep up with yours


----------



## Offler

Minotaurtoo said:


> that's much better... guess I need to check mine too, it may be getting hit some.... but still, I don't think mine will ever keep up with yours


Well, your GPU scores seem to be a bit better. But my card can't be overclocked much.


----------



## Offler

Minotaurtoo said:


> that's much better... guess I need to check mine too, it may be getting hit some.... but still, I don't think mine will ever keep up with yours


I re-tested the power settings in Kombustor once again. It seems the optimal setting for 1280x720 was -19%; for 2560x1440 it's -8%. The rise in FPS isn't as significant there, just about a 4-5 FPS difference.

However... in both cases it seemed to be CPU-bound, as the test is single-threaded, and the GPU did not go up to 1050MHz...

Basically it seems like the OS power and performance management kicked in and limited CPU utilization, which in turn limited GPU utilization and reduced overall FPS. I wonder if something similar can be tested on an Nvidia GPU, because according to several tests, Nvidia gets higher CPU utilization...


----------



## xkm1948

Has anyone tried the 2019 Adrenalin driver with the Fury X yet? I heard the new driver messes up the fan curve and does not downclock.


----------



## Garwinski

xkm1948 said:


> Has anyone tried the 2019 Adrenalin driver with the Fury X yet? I heard the new driver messes up the fan curve and does not downclock.


No such issues here on my Fury X.


----------



## Alastair

Is HBM overclocking still blocked by the driver?


----------



## NightAntilli

Haven't tested it, but it's unlikely they would unlock it. AMD said they wouldn't put resources into unlocking it.


----------



## Alastair

NightAntilli said:


> Haven't tested it, but, it's unlikely they would unlock it. AMD said they wouldn't put resources into unlocking it.


Because I have broken through 7K in Superposition 1080p Extreme. But a little more from the HBM won't hurt at all


----------



## Kana-Maru

I know Navi is getting all the news and hype right now, but I never got around to posting my stock Fury X RE2 benchmarks here, so I guess I'll do it now. I ran some benchmarks, posted them on Reddit, and made a YouTube video showing the performance. This Fury X is still amazing me nearly 4 years after release. With Navi on the way I MIGHT finally bite the bullet and upgrade. Even with the 4GB HBM limitation, the card still performs well for the games I play. I've put a ton of time into benchmarking games over the past 4 years or so with this GPU and it has been wonderful. Nvidia was an option, but I don't think I want to pay their current prices, mostly because I see more long-term value in AMD GPUs at the moment. 

I don't know what AMD/ATI did with the Fury X and the drivers, but this thing has been a BLAST at 4K over the past few years. 

Here are the links: 

--Reddit: Resident Evil 2 - Fury X Benchmarked
https://www.reddit.com/r/Amd/comments/bervnl/resident_evil_2_fury_x_benchmarked/

--YouTube: RE2 @ 4K Fury X + High Settings






Due to my 4GB buffer limitation I had ReLive downscale the recording to 1080p, but I kept the bitrate high enough not to affect my FPS much. I CAN max the game out, but Shadows and Mesh need to be set to High instead of Max due to micro stutter at high resolutions. However, I did NOT notice a difference in image quality even after playing the game maxed out on a Vega 64.

My old website is down now, but I'll get around to making another one and posting all of the Fury X benchmarks I've run since day 1.


----------



## rsiyasena

I've been experiencing my Fury X downclocking to 300MHz randomly in games over and over again. After much research and multiple DDU/driver reinstallation attempts, I came across this Reddit post: https://old.reddit.com/r/Amd/comments/8ynepm/how_to_fix_fury_x_broken_sensor_gpu_liquid_temp/ 

I too am seeing 200C spikes in the "GPU Liquid Temperature" field in HWiNFO, and am wondering if there is a way to limit that sensor without disassembling the Fury X.

Thanks!


----------



## generaleramon

*Posting some info/tests here*

I started to analyze the BIOS to understand which bit changes which timing.
This is what I've found so far:
01 00 *21* 34 *12* *A0* 22 8A 00 *13* 63 60 44 00 00 00 00 AD 2A 38 0C 73 *28* 22 0D 08 07 0C 0C 1E 01 10 0A 00 10 42
21 = TRCDWA=8 (good BW uplift (+35GB/s) and stable if set to 0 (hex > 1C))
12 = TRC=18
A0 = TR2W=10
13 = TRP_WRA=19 (good BW uplift (+8GB/s) setting it to 16 (hex > 10))
28 = WR=8
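
For anyone wanting to experiment, the byte edits above can be scripted instead of hand-typed. This is a sketch that assumes the offsets from the reverse engineering above (TRCDWA at byte 2, TRP_WRA at byte 9) — it is illustration only, and flashing a modified BIOS is at your own risk:


Code:


```python
# Sketch: patching the two timing bytes called out above in a Fiji HBM
# memory strap. The offsets (TRCDWA at byte 2, TRP_WRA at byte 9) come
# from the post's reverse engineering, not from official documentation.

STOCK = ("01 00 21 34 12 A0 22 8A 00 13 63 60 44 00 00 00 00 AD 2A 38 "
         "0C 73 28 22 0D 08 07 0C 0C 1E 01 10 0A 00 10 42")

def parse_strap(s: str) -> bytearray:
    """Turn a space-separated hex dump into mutable bytes."""
    return bytearray(int(b, 16) for b in s.split())

def format_strap(b: bytearray) -> str:
    return " ".join(f"{x:02X}" for x in b)

strap = parse_strap(STOCK)
strap[2] = 0x1C   # TRCDWA -> 0, per the note above
strap[9] = 0x10   # TRP_WRA -> 16, per the note above
print(format_strap(strap))
# first bytes: 01 00 1C 34 12 A0 22 8A 00 10 63 ...
```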

WinAMDTweak.exe --current Results:


Spoiler



GPU 0: Fiji [Radeon R9 FURY / NANO Series] Hynix HBM

Channel 0's write command parameters:
DAT_DLY: 6 DQS_DLY: 6 DQS_XTR: 1 OEN_DLY: 6
OEN_EXT: 0 OEN_SEL: 3 CMD_DLY: 0 ADR_DLY: 0
Channel 1's write command parameters:
DAT_DLY: 6 DQS_DLY: 6 DQS_XTR: 1 OEN_DLY: 6
OEN_EXT: 0 OEN_SEL: 3 CMD_DLY: 0 ADR_DLY: 0
Power Mangement related timings:
CKSRE: 5 CKSRX: 5 CKE_PULSE: 10 CKE: 5 SEQ_IDLE: 7
RAS related timings:
RC: 18 RRD: 3 RCDRA: 8 RCDR: 7 RCDWA: 0 RCDW: 0
CAS related timings:
CL: 4 W2R: 10 R2R: 2 CCDL: 1 R2W: 10 NOPR: 0 NOPW: 0
Misc. DRAM timings:
MRD: 4 RRDL: 4 RFC: 48 TRP: 6 RP_RDA: 12 RP_WRA: 19
Misc2. DRAM timings:
WDATATR: 0 T32AW: 0 RPAR: 0 WPAR: 0 FAW: 0 PA2WDATA: 0 PA2RDATA: 0
Mode Register 0:
DBR: 1 DBW: 1 TCSR: 0 DQR: 1
DQW: 1 ADD_PAR: 1 TM: 0
Mode Register 1:
WR: 8 NDS: 1
Mode Register 2:
WL: 2 RL: 4
Mode Register 3:
APRAS: 13 BG: 0 BL: 0
Refresh Interval:
REF: 11
Thermal Throttle Control:
THRESH: 0 LEVEL: 0 PWRDOWN: 0 SHUTDOWN: 0 EN_SHUTDOWN: 0 OVERSAMPLE: 0 AVG_SAMPLE: 0
Hammer:
ENB: 1 CNT: 5 TRC: 0



Updated my custom set, now getting 450GB/s! :h34r-smi HBM is FAST!
Testing memory with this tool; not sure if it's accurate. It seems to stress the card, though not like a AAA game does. "Need for Speed - Payback" is the best game to test memory stability anyway.
https://github.com/ihaque/memtestCL


Spoiler



Test summary:
-----------------------------------------
25 iterations over 3500 MiB of memory on device Fiji
Moving inversions (ones and zeros): 0 failed iterations
(0 total incorrect bits)
Memtest86 walking 8-bit: 0 failed iterations
(0 total incorrect bits)
True walking zeros (8-bit): 0 failed iterations
(0 total incorrect bits)
True walking ones (8-bit): 0 failed iterations
(0 total incorrect bits)
Moving inversions (random): 0 failed iterations
(0 total incorrect bits)
True walking zeros (32-bit): 0 failed iterations
(0 total incorrect bits)
True walking ones (32-bit): 0 failed iterations
(0 total incorrect bits)
Random blocks: 0 failed iterations
(0 total incorrect bits)
Memtest86 Modulo-20: 0 failed iterations
(0 total incorrect bits)
Integer logic: 0 failed iterations
(0 total incorrect bits)
Integer logic (4 loops): 0 failed iterations
(0 total incorrect bits)
Integer logic (local memory): 0 failed iterations
(0 total incorrect bits)
Integer logic (4 loops, local memory): 0 failed iterations
(0 total incorrect bits)
Final error count: 0 errors



And this is my latest "UberMixHBM v0.6". It's based on the 400MHz strap but with a lot of custom stuff and some 100MHz strap bits. A lot better than the usual "TMOD" you can find online. :thumb:


Code:


01 00 1C 34 12 A0 22 8A 00 10 63 60 44 00 00 00 00 AD 2A 38 0C 73 28 22 0D 08 07 0C 0C 1E 01 10 0A 00 10 42

Done some short tests:


Code:


//The Witcher 3 1440P Ultra (NO AA) (I like how i'm losing only 1FPS enabling AA at 1440P :rolleyes:)
Stock= 54FPS
UberMixHBM v0.6= 55FPS
//Superposition Benchmark 1080P Extreme (Best of 2 Runs)
Stock= 3220pts
UberMixHBM v0.6= 3245pts +0.7%
//Superposition Benchmark 4K Optimized + Medium Textures + DOF OFF + MB OFF (Best of 2 Runs)
Stock= 4975pts
UberMixHBM v0.6= 5061pts +1.7%


----------



## taisel

Just adding a data point here for an R9 Fury Nitro overclock/undervolt that's actually fully stable, not benchmark-stable trash.

Using a voltage offset of +68.75mV to circumvent the 1.25V cap.

Voltages in parentheses are with the voltage offset added and aliased to the 6.25mV steppings the voltage controller uses. I had to subtract the offset voltage from the DPM states and then save the new voltages back into said DPM states in order to not overvolt the lower DPM states.
P1: 780mhz 900mv (968.75mv)
P2: 890mhz 968mv (1037.5mv)
P3: 965mhz 1046mv (1112.5mv)
P4: 1025mhz 1118mv (1187.5mv)
P5: 1070mhz 1181mv (1250mv)
P6: 1100mhz 1225mv (1293.75mv)
P7: 1115mhz 1250mv (1318.75mv)
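
The parenthesized voltages above fall out of simple rounding; here's a sketch assuming the controller snaps the set voltage plus offset to the nearest 6.25mV step (the variable names are mine, not any tool's API):


Code:


```python
# Reproducing the parenthesized "aliased" voltages from the table above.
# Assumption: the controller snaps (set voltage + offset) to the nearest
# 6.25 mV step; names here are illustrative.
STEP_MV = 6.25
OFFSET_MV = 68.75  # the +68.75 mV offset from the post

dpm_set_mv = [900, 968, 1046, 1118, 1181, 1225, 1250]  # P1..P7 set voltages

def aliased(mv: float) -> float:
    """Snap the requested voltage to the controller's 6.25 mV grid."""
    return round((mv + OFFSET_MV) / STEP_MV) * STEP_MV

print([aliased(v) for v in dpm_set_mv])
# -> [968.75, 1037.5, 1112.5, 1187.5, 1250.0, 1293.75, 1318.75]
```

Every value in parentheses above matches this rounding, which is consistent with a controller that only accepts 6.25mV increments.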

Often the reported voltage inside Afterburner and HWiNFO64 is 25mV above the set voltage+offset combination, sometimes peaking at +37.5mV above what was set. At the same time, numbers 37.5mV below can be seen in games that cause vdroop.

The board is an R9 Fury Nitro OC+ (stock 1050MHz @ 1250mV for P7) with hardware-locked CUs and a BIOS signed 2016-07-20.
ASIC quality: 61%
Using the default unmodified 260W/75C BIOS (the second BIOS position, with the blue light on, is 300W/80C; I'm not using that).

I've noticed that overclocks/undervolts that are Valley/Firestrike/Timespy/Superposition/Furmark stable are not BFII or OW (max settings at 200% scale, spamming ult in the training room) stable. Those two games seem to brown out a GPU where Firestrike/Timespy will not. It may take a few minutes to an hour, but there are definitely OCs that aren't stable in BFII or OW that are stable in those benches.

I've yet to experience negative perf scaling, but maybe that's because I'm using a ~297W power cap of my own, which probably holds the card back from hitting some internal throttling others have run into. The P7 state is useful and gives a performance gain here, as the card will boost to it under the set power cap in games that aren't a power virus (aka boosting logic of our own). In fact, most games I play will hit and stay at P7 despite the ~297W cap I set.


----------



## taisel

Posting the fan curve for the earlier-mentioned R9 Fury Nitro OC+ board.

Note: the "zero fan RPM" fuzzy logic engages below 28% fan speed.


For those who want to know how Sapphire's Intelligent Fan Control (IFC) II works, specifically:
On the way up it acts in this manner:
- 22% and below: fans disabled.
- Once 27% fan speed is hit, the first fan is engaged.
- Once 28% fan speed is hit, all 3 fans are engaged.

On the way down:
- From 27% down to 25%, the RPM of all 3 fans is lowered below that of 28%.
- Once 24% is reached, only two of the three fans will spin (it should be the middle fan that turns off).
- Once 23% is reached, only one fan will spin.
- Once 22% is reached, all fans are disabled.
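
The up/down asymmetry above is a small hysteresis state machine; a sketch modeling it (thresholds taken from the lists above, the function name is mine, not Sapphire's):


Code:


```python
# Model of Sapphire's IFC II fan engagement as described above.
# Thresholds come from the post; this is an illustration, not Sapphire's code.

def fans_running(speed_pct: int, rising: bool) -> int:
    """Number of fans spinning at a commanded speed (%), given direction."""
    if rising:
        if speed_pct < 27:
            return 0   # zero-RPM dead band holds until 27% on the way up
        if speed_pct == 27:
            return 1   # first fan engages
        return 3       # 28% and above: all three fans
    else:  # falling
        if speed_pct >= 25:
            return 3   # all fans, at reduced RPM between 25-27%
        if speed_pct == 24:
            return 2   # middle fan drops out
        if speed_pct == 23:
            return 1
        return 0       # 22% and below: fans off

print(fans_running(28, rising=True), fans_running(24, rising=False))
# -> 3 2
```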


----------



## p4inkill3r

I'm selling a Sapphire Nitro Fury just FYI if anyone is looking for one. Gently used, still looks brand new.


----------



## diggiddi

p4inkill3r said:


> I'm selling a Sapphire Nitro Fury just FYI if anyone is looking for one. Gently used, still looks brand new.


How much?


----------



## DedEmbryonicCe1

If you sell it outside this channel leave me a PM because I'm thinking about selling my Sapphire Fury X as well. My RX 5700 XT 50th AE meets or exceeds all my expectations.


----------



## p4inkill3r

diggiddi said:


> How much?


I'm asking $275 shipped but willing to negotiate.
I've been out of the game for a couple of years and no longer have a handle on the used market, so be gentle if the pricing is outrageous.


----------



## diggiddi

Yeah, that's too much for a 4GB card.


----------



## taisel

Attaching some tuned profiles after further testing:
frequency profiles for 1115, 1100, 1070, 890, and 508 MHz.

The 890 and 508 MHz profiles have a voltage offset of -150mV! Literally 750mV base voltage instead of 900. Apparently the P0 and P1 states are champs at running way below the P2 state's voltages.



Code:


[Startup]
Format=2
PowerLimit=15
ThermalLimit=72
CoreClk=1100000
VFCurve=00000200080000000008000000006144000096430000000000006144000039440000000000C0764400805E44000000000020804400806844000000000020854400407144000000000080894400807744000000000000964400C085440000000000409C440080894400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25
CoreVoltageBoost=50
[Defaults]
Format=2
CoreVoltageBoost=0
PowerLimit=0
ThermalLimit=
CoreClk=1050000
VFCurve=000002000800000000080000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C440040834400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25
[Settings]
CaptureDefaults=0
[Profile1]
Format=2
CoreVoltageBoost=75
PowerLimit=15
ThermalLimit=72
CoreClk=1115000
VFCurve=0000020008000000000800000000614400009643000000000000614400804044000000000080704400805E44000000000000824400407144000000000060864400C078440000000000E0924400C085440000000000209944008089440000000000409C4400608B4400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25
[PreSuspendedMode]
Format=2
CoreVoltageBoost=
PowerLimit=
ThermalLimit=
CoreClk=
VFCurve=
MemClk=
FanMode=
FanSpeed=
[Profile2]
Format=2
CoreVoltageBoost=50
PowerLimit=15
ThermalLimit=72
CoreClk=1100000
VFCurve=00000200080000000008000000006144000096430000000000006144000039440000000000C0764400805E44000000000020804400806844000000000020854400407144000000000080894400807744000000000000964400C085440000000000409C440080894400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=58
[Profile3]
Format=2
CoreVoltageBoost=0
PowerLimit=15
ThermalLimit=72
CoreClk=1070000
VFCurve=00000200080000000008000000006144000096430000000000006144000002440000000000406A44008036440000000000A0814400805E440000000000608644008068440000000000608B44004071440000000000C08F44008077440000000000409C4400C0854400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25
[Profile4]
Format=2
CoreVoltageBoost=-150
PowerLimit=15
ThermalLimit=72
CoreClk=890000
VFCurve=000002000800000000080000000061440000964300000000008065440000FE430000000000E0874400003444000000000060944400805E44000000000060944400805E44000000000060944400805E44000000000060944400805E44000000000060944400805E4400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25
[Profile5]
Format=2
CoreVoltageBoost=-150
PowerLimit=15
ThermalLimit=72
CoreClk=525000
VFCurve=000002000800000000080000000061440000964300000000008065440000FE4300000000008065440000FE4300000000008065440000FE4300000000008065440000FE4300000000008065440000FE4300000000008065440000FE4300000000008065440000FE4300000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000061440000964300000000004067440000FE430000000000406A44004033440000000000A0814400805A44000000000060864400C063440000000000608B4400006C440000000000C08F44008073440000000000409C44004083440000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
MemClk=500000
FanMode=1
FanSpeed=25

Additionally, here are the lines from MSIAfterburner.cfg for the custom fan curve:


Code:


SwAutoFanControl=1
SwAutoFanControlFlags=10000000h
SwAutoFanControlPeriod=100
SwAutoFanControlCurve=000001000600000000000000000038420000B041000038420000B841000060420000E0410000704200001842000078420000344200008C420000C84200000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
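For reference, the long SwAutoFanControlCurve string is just binary data written out as hex (the tail is zero padding). A minimal Python sketch, assuming the value is a 12-byte header followed by little-endian float32 pairs — an inference from the byte pattern, not a documented Afterburner format — decodes the non-zero prefix into what look like (temperature, fan %) points:

```python
import struct

# Non-zero prefix of the SwAutoFanControlCurve value quoted above.
# Header size and pair layout are assumptions, not a documented format.
hex_prefix = (
    "000001000600000000000000"   # header: possibly flags plus point count (0x06)
    "000038420000B041000038420000B841000060420000E041"
    "0000704200001842000078420000344200008C420000C842"
)

raw = bytes.fromhex(hex_prefix)
floats = struct.unpack("<12f", raw[12:])        # 12 little-endian float32 values
points = list(zip(floats[0::2], floats[1::2]))  # group into (x, y) pairs
print(points)
```

Decoded this way, the curve reads roughly 46°C→22-23%, 56°C→28%, 60°C→38%, 62°C→45%, 70°C→100%, which looks like a plausible custom fan curve — though which float is temperature and which is fan speed is a guess.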


----------



## taisel

Got furmark down to 75 watts on a fury nitro:


----------



## PontiacGTX

500MHz... I bet that doesn't run better than an HD 7950 in games


----------



## taisel

PontiacGTX said:


> 500MHz... I bet that doesn't run better than an HD 7950 in games


The point is 75 watts power draw to equal one.


----------



## vazza10

Hi all,

I know it's kind of an old thread, but I have an issue with one of my Pro Duo cards. One works flawlessly, but on the other, the second I try anything remotely stressful in CrossFire it black-screens and drops back to the desktop. It doesn't restart the computer, but I sometimes get errors and sometimes not. I'll post more info; for now I just want to see who is still active on the thread.

Cheers


----------



## Himo5

Sometimes it can be as simple as redoing the thermal compound. My Fury X suddenly started making it impossible to boot the system, so I took it apart, cleaned everything up, replaced all the thermal pads and compound, and now it works as good as new. Thermal pads aren't cheap, of course, and you need to get the same thickness as the old ones, but you can cut them up to suit the parts they replace, and even Arctic Silver 5 is more than good enough to replace the compound. It pays to have something like an iFixit kit and to take the disassembly slow and careful, but this is a fairly simple procedure.


----------



## vazza10

Cheers. I've done everything you could think of in that respect and tried ten different drivers, both AMD and Pro. The photos I'll post show what's going on with the voltage points and current draw in CrossFire. Funnily enough, it crashes on everything but FurMark; a render test fails, but restart FurMark and it works again. My feeling is that either the PCIe switch chip (the one between the GPUs) or one of the VRMs is not providing enough amps to GPU 2. Single-GPU cards are easier to work with. Sorry about the image quality, but most of the values can be seen. In case it's hard to figure out which value corresponds to which benchmark, they are in this order:

1. Unigine, 2 benches at 1080p Extreme
2. Standard 3DMark Time Spy with the error code
3. Fire Strike with the error code and usage at bottom right
4. FurMark at both 720p CF and 1080p CF, plus the render issue
5. GPU-Z for GPU 1 and 2, and finally a picture of the suspect VRM temperature compared to all the rest, which are about 38-40°C on the back of the PCB

System spec: R5 2600X, ASUS B350-Plus, G.Skill 2400 16GB, Great Wall 1500W PSU.

If the images are hard to see, I have better-quality ones; I wasn't sure what size it would allow me to post.

Thanks


----------



## FlawleZ

Alastair said:


> Is HBM overclocking stil blocked by driver?


Yes, but if you have a Sapphire Nitro R9 Fury and use Sapphire TriXX 6.8, it's able to get around it somehow. I'm able to run my memory at 550MHz, maybe more; I haven't gone beyond that yet.


----------



## FlawleZ

FlawleZ said:


> Yes, but if you have a Sapphire Nitro R9 Fury and use Sapphire TriXX 6.8, it's able to get around it somehow. I'm able to run my memory at 550MHz, maybe more; I haven't gone beyond that yet.


I spent a couple of hours today extensively testing clock speed changes on the core and memory. Unfortunately, although TriXX increases the clock speed at the software level, the change does not appear to carry through to the hardware. Improvements from 500-650MHz seemed negligible and within the margin of error. Every benchmark and piece of software shows the clock speed change, yet the performance just doesn't reflect it.


----------



## Dilet

Grabbing a Nitro Fury today for a friend's upcoming build. Which is more effective, overvolting or undervolting? I've mainly worked with Vega in terms of AMD cards; anything I need to know?


----------



## NightAntilli

Dilet said:


> Grabbing a Nitro Fury today for a friend's upcoming build. Which is more effective, overvolting or undervolting? I've mainly worked with Vega in terms of AMD cards; anything I need to know?


My Fury Nitro+ had very little headroom. Overvolting didn't gain me many more clocks, and undervolting automatically dropped the clock frequency. And the HBM is locked... so maybe I was unlucky.


----------



## CptAsian

Dilet said:


> Grabbing a Nitro Fury today for a friend's upcoming build. Which is more effective, overvolting or undervolting? I've mainly worked with Vega in terms of AMD cards; anything I need to know?


Overvolting/clocking yielded noticeably increased temps and marginal gains in clocks and effective speeds. I could undervolt quite a bit and maintain factory clocks with lower temps on both of my cards, so I'd say undervolting is the way to go.


----------



## taisel

Decided to chart out the known good f/v values for the Sapphire R9 Fury Nitro OC+ card that I have.

Yes, the 300 and 508 MHz data points are accurate; they differ by only 18.75 mV, at 750 mV and 768.75 mV respectively. The VBIOS has both P0 and P1 overvolted by around 150 mV out of the factory.


Essentially anything below 500 MHz has a flat voltage floor of ~750 mV.
There's a slope of around 1.3 MHz per 1 mV from a bit past 500 MHz to around 900 MHz.
Past 900 MHz the slope changes to about 0.8 MHz per 1 mV.
Past 1070 MHz there's a voltage wall where the MHz gained per mV drops off hard (around 0.6 MHz/mV at 1090, 0.4ish past 1100).
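Taking those breakpoints and slopes at face value, the whole curve can be sketched as a rough piecewise model in Python. This is illustrative only: the numbers come straight from the post above, the 0.5 MHz/mV slope past the wall is a rounded assumption, and every individual Fiji chip will differ.

```python
def est_voltage_mv(mhz: float) -> float:
    """Rough voltage estimate (mV) for a given core clock on this card's
    frequency/voltage curve. Breakpoints and slopes are the ones quoted
    above; the past-the-wall slope of 0.5 MHz/mV is an assumed average."""
    if mhz <= 500:
        return 750.0                               # flat floor below ~500 MHz
    v = 768.75 + (min(mhz, 900) - 508) / 1.3       # ~1.3 MHz per mV up to ~900 MHz
    if mhz > 900:
        v += (min(mhz, 1070) - 900) / 0.8          # ~0.8 MHz per mV up to ~1070 MHz
    if mhz > 1070:
        v += (mhz - 1070) / 0.5                    # past the wall, gains collapse
    return v

for f in (400, 508, 900, 1000, 1070, 1100):
    print(f, "MHz ->", round(est_voltage_mv(f), 1), "mV")
```

The steep last segment is why the final ~30 MHz past 1070 costs disproportionately more voltage (and heat) than everything before it.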


----------

